Sample records for advanced petascale simulations

  1. Real science at the petascale.

    PubMed

    Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V

    2009-06-28

    We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at Texas Advanced Computing Center, in the first half of 2008 when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32,768 cores for certain of our codes in the so-called 'capability computing' category as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65,536 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.

  2. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, P. (Fermilab); Cary, J.

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization.

  3. Advances in petascale kinetic plasma simulation with VPIC and Roadrunner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowers, Kevin J; Albright, Brian J; Yin, Lin

    2009-01-01

    VPIC, a first-principles 3D electromagnetic charge-conserving relativistic kinetic particle-in-cell (PIC) code, was recently adapted to run on Los Alamos's Roadrunner, the first supercomputer to break a petaflop (10^15 floating point operations per second) in the TOP500 supercomputer performance rankings. The authors give a brief overview of the modeling capabilities and optimization techniques used in VPIC and the computational characteristics of petascale supercomputers like Roadrunner. They then discuss three applications enabled by VPIC's unprecedented performance on Roadrunner: modeling laser plasma interaction in upcoming inertial confinement fusion experiments at the National Ignition Facility (NIF), modeling short pulse laser GeV ion acceleration, and modeling reconnection in magnetic confinement fusion experiments.
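    The PIC method summarized above advances particles with a leapfrog scheme: velocities are updated from the local field ("kick"), then positions from the velocities ("drift"). A minimal, illustrative 1-D electrostatic sketch, not VPIC's actual implementation (all names and parameters here are hypothetical):

    ```python
    import random

    def leapfrog_push(x, v, E_at, qm, dt, L):
        """Advance particle positions/velocities one leapfrog step.

        x, v : lists of particle positions and velocities
        E_at : function returning the electric field at a position
        qm   : charge-to-mass ratio
        dt   : time step
        L    : periodic domain length
        """
        for i in range(len(x)):
            v[i] += qm * E_at(x[i]) * dt   # kick: accelerate in the local field
            x[i] = (x[i] + v[i] * dt) % L  # drift: move, with periodic wrap
        return x, v

    # Toy usage: a uniform field pushes all particles identically.
    xs = [random.uniform(0.0, 1.0) for _ in range(4)]
    vs = [0.0] * 4
    xs, vs = leapfrog_push(xs, vs, lambda x: 1.0, qm=-1.0, dt=0.1, L=1.0)
    ```

    A full PIC step would also deposit charge to a grid and solve for the self-consistent field; the push above is only the particle-advance kernel.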

  4. Materials integrity in microsystems: a framework for a petascale predictive-science-based multiscale modeling and simulation system

    NASA Astrophysics Data System (ADS)

    To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian

    2008-09-01

    long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements and packages the information in a form suitable for UQ at various scales needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed. The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power, advanced verification, validation, and UQ methodologies.

  5. The Petascale Data Storage Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibson, Garth; Long, Darrell; Honeyman, Peter

    2013-07-01

    Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratories, Los Alamos National Laboratory, the University of Michigan, and the University of California at Santa Cruz.

  6. The grand challenge of managing the petascale facility.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, R. J.; Mathematics and Computer Science

    2007-02-28

    This report is the result of a study of networks and how they may need to evolve to support petascale leadership computing and science. As Dr. Ray Orbach, director of the Department of Energy's Office of Science, says in the spring 2006 issue of SciDAC Review, 'One remarkable example of growth in unexpected directions has been in high-end computation'. In the same article Dr. Michael Strayer states, 'Moore's law suggests that before the end of the next cycle of SciDAC, we shall see petaflop computers'. Given the Office of Science's strong leadership and support for petascale computing and facilities, we should expect to see petaflop computers in operation in support of science before the end of the decade, and DOE/SC Advanced Scientific Computing Research programs are focused on making this a reality. This study took its lead from this strong focus on petascale computing and the networks required to support such facilities, but it grew to include almost all aspects of the DOE/SC petascale computational and experimental science facilities, all of which will face daunting challenges in managing and analyzing the voluminous amounts of data expected. In addition, trends indicate the increased coupling of unique experimental facilities with computational facilities, along with the integration of multidisciplinary datasets and high-end computing with data-intensive computing; and we can expect these trends to continue at the petascale level and beyond. Coupled with recent technology trends, they clearly indicate the need for including capability petascale storage, networks, and experiments, as well as collaboration tools and programming environments, as integral components of the Office of Science's petascale capability metafacility. The objective of this report is to recommend a new cross-cutting program to support the management of petascale science and infrastructure. The appendices of the report document current and projected DOE computation facilities.

  7. Community Petascale Project for Accelerator Science And Simulation: Advancing Computational Science for Future Accelerators And Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, Panagiotis (Fermilab); Cary, John

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  8. Spiking network simulation code for petascale computers.

    PubMed

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.
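    The "double collapse" described here can be illustrated with a toy per-node store that groups synapses first by source neuron and then by synapse type, so that each homogeneous group can be held in a compact array rather than as polymorphic per-synapse objects. This is an illustrative Python sketch of the idea only, not the actual C++ metaprogramming-based data structure the paper presents:

    ```python
    from collections import defaultdict

    class NodeSynapseStore:
        """Per-compute-node synapse storage (illustrative, hypothetical names).

        On a given node, a source neuron typically has few synapses, and few
        distinct synapse types; grouping by (source, type) exploits that.
        """
        def __init__(self):
            # source gid -> synapse type -> list of (target gid, weight)
            self._table = defaultdict(lambda: defaultdict(list))

        def add(self, source, syn_type, target, weight):
            self._table[source][syn_type].append((target, weight))

        def targets(self, source):
            """Yield (type, target, weight) for every synapse of a source."""
            for syn_type, entries in self._table[source].items():
                for target, weight in entries:
                    yield syn_type, target, weight

    store = NodeSynapseStore()
    store.add(source=7, syn_type="static", target=42, weight=0.5)
    store.add(source=7, syn_type="stdp", target=43, weight=1.5)
    print(sorted(store.targets(7)))
    ```

    When a spike from neuron 7 arrives at this node, only the local target list is walked; sources with no targets here consume no memory.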

  9. Spiking network simulation code for petascale computers

    PubMed Central

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  10. Final Project Report. Scalable fault tolerance runtime technology for petascale computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnamoorthy, Sriram; Sadayappan, P

    With the massive number of components comprising the forthcoming petascale computer systems, hardware failures will be routinely encountered during execution of large-scale applications. Due to the multidisciplinary, multiresolution, and multiscale nature of scientific problems that drive the demand for high-end systems, applications place increasingly differing demands on the system resources: disk, network, memory, and CPU. In addition to MPI, future applications are expected to use advanced programming models such as those developed under the DARPA HPCS program as well as existing global address space programming models such as Global Arrays, UPC, and Co-Array Fortran. While there has been a considerable amount of work in fault-tolerant MPI, with a number of strategies and extensions for fault tolerance proposed, virtually none of the advanced models proposed for emerging petascale systems is currently fault-aware. To achieve fault tolerance, development of underlying runtime and OS technologies able to scale to the petascale level is needed. This project evaluated a range of runtime techniques for fault tolerance for advanced programming models.

  11. Foundational Tools for Petascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, "High-Performance Energy Applications and Systems", SC0004061/FG02-10ER25972, UW PRJ36WV.

  12. Mapping to Irregular Torus Topologies and Other Techniques for Petascale Biomolecular Simulation

    PubMed Central

    Phillips, James C.; Sun, Yanhua; Jain, Nikhil; Bohm, Eric J.; Kalé, Laxmikant V.

    2014-01-01

    Currently deployed petascale supercomputers typically use toroidal network topologies in three or more dimensions. While these networks perform well for topology-agnostic codes on a few thousand nodes, leadership machines with 20,000 nodes require topology awareness to avoid network contention for communication-intensive codes. Topology adaptation is complicated by irregular node allocation shapes and holes due to dedicated input/output nodes or hardware failure. In the context of the popular molecular dynamics program NAMD, we present methods for mapping a periodic 3-D grid of fixed-size spatial decomposition domains to 3-D Cray Gemini and 5-D IBM Blue Gene/Q toroidal networks to enable hundred-million atom full machine simulations, and to similarly partition node allocations into compact domains for smaller simulations using multiple-copy algorithms. Additional enabling techniques are discussed and performance is reported for NCSA Blue Waters, ORNL Titan, ANL Mira, TACC Stampede, and NERSC Edison. PMID:25594075
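    The core of the topology-aware mapping described above is keeping communicating spatial domains a small number of torus hops apart. A minimal sketch, assuming a simple proportional block mapping and a regular torus with no holes (NAMD's actual scheme handles irregular allocations and is far more involved):

    ```python
    def torus_hops(a, b, dims):
        """Minimum hop count between coordinates a and b on a torus of
        shape dims, accounting for wraparound links in each dimension."""
        return sum(min((x - y) % d, (y - x) % d)
                   for x, y, d in zip(a, b, dims))

    def block_map(domain, grid, torus):
        """Map a 3-D spatial-domain index to a torus coordinate by
        proportional blocking, so neighbouring domains land on the same
        or adjacent torus nodes."""
        return tuple((domain[i] * torus[i]) // grid[i] for i in range(3))

    torus = (8, 8, 8)      # hypothetical 3-D torus of compute nodes
    grid = (16, 16, 16)    # hypothetical grid of spatial domains
    a = block_map((0, 0, 0), grid, torus)
    b = block_map((1, 0, 0), grid, torus)
    assert torus_hops(a, b, torus) <= 1   # neighbours stay close
    ```

    The wraparound term in `torus_hops` is what makes nodes at opposite grid edges only one hop apart, which a topology-agnostic mapping would miss.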

  13. Petascale supercomputing to accelerate the design of high-temperature alloys

    NASA Astrophysics Data System (ADS)

    Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen

    2017-12-01

    Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
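    The final step described above, predicting segregation energies from materials descriptors without new DFT runs, amounts to fitting a surrogate model to the computed dataset. A deliberately tiny sketch with one hypothetical descriptor and made-up numbers (the paper's actual models and data are much richer):

    ```python
    def fit_linear(xs, ys):
        """Ordinary least-squares fit y ≈ a*x + b for a single descriptor."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
        return a, my - a * mx

    # Hypothetical training data: descriptor value (e.g. a solute size
    # measure) vs DFT-computed segregation energy in eV.
    desc = [10.0, 12.0, 14.0, 16.0]
    e_seg = [-0.20, -0.10, 0.00, 0.10]

    a, b = fit_linear(desc, e_seg)
    predict = lambda x: a * x + b   # cheap surrogate for new solutes
    ```

    Once trained, `predict` estimates the segregation energy of an unseen solute in microseconds, versus hours of supercell calculation per data point.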

  14. Harnessing the power of emerging petascale platforms

    NASA Astrophysics Data System (ADS)

    Mellor-Crummey, John

    2007-07-01

    As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratories. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.
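    Loop-nest optimization for node performance typically means transformations such as tiling, which reorder the traversal for better cache locality without changing results. An illustrative sketch of tiling a 2-D reduction (not the actual transformation the CScADS tool applied to S3D):

    ```python
    def sum2d_blocked(a, tile=2):
        """Traverse a 2-D list in tile-sized blocks, the kind of loop-nest
        transformation an optimizer applies to improve cache locality.
        The result is identical to the untiled double loop."""
        n, m = len(a), len(a[0])
        total = 0.0
        for ii in range(0, n, tile):            # outer loops walk tiles
            for jj in range(0, m, tile):
                for i in range(ii, min(ii + tile, n)):   # inner loops walk
                    for j in range(jj, min(jj + tile, m)):  # within a tile
                        total += a[i][j]
        return total

    a = [[1.0, 2.0], [3.0, 4.0]]
    assert sum2d_blocked(a, tile=1) == 10.0
    ```

    In a compiled language the payoff comes from reusing cache-resident data across the inner loops; the tile size is tuned to the cache hierarchy of the target node.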

  15. Adjusting process count on demand for petascale global optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sosonkina, Masha; Watson, Layne T.; Radcliffe, Nicholas R.

    2012-11-23

    There are many challenges that need to be met before efficient and reliable computation at the petascale is possible. Many scientific and engineering codes running at the petascale are likely to be memory intensive, which makes thrashing a serious problem for many petascale applications. One way to overcome this challenge is to use a dynamic number of processes, so that the total amount of memory available for the computation can be increased on demand. This paper describes modifications made to the massively parallel global optimization code pVTdirect in order to allow for a dynamic number of processes. In particular, the modified version of the code monitors memory use and spawns new processes if the amount of available memory is determined to be insufficient. The primary design challenges are discussed, and performance results are presented and analyzed.
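    The spawn-on-demand idea described here reduces to a simple policy: compare free memory against the computation's working set and request another process when it drops below a safety threshold. A schematic sketch of the decision logic only, with hypothetical names (pVTdirect's actual implementation works through MPI process spawning):

    ```python
    def should_spawn(available_mem, working_set, threshold=0.1):
        """Return True when free memory falls below a fraction of the
        working set, signalling that more processes (and hence more
        aggregate memory) should be requested to avoid thrashing."""
        return available_mem < threshold * working_set

    def rebalance(procs, available_mem, working_set, spawn):
        """Spawn one extra worker whenever memory runs short (sketch)."""
        if should_spawn(available_mem, working_set):
            procs.append(spawn())
        return procs

    # Toy usage: 50 units free against a 1000-unit working set
    # triggers a spawn; 500 units free does not.
    procs = rebalance([], available_mem=50, working_set=1000,
                      spawn=lambda: "worker")
    ```

    In an MPI setting, `spawn` would wrap `MPI_Comm_spawn` and the new intercommunicator would be merged into the computation; the policy above is the part this paper's monitoring loop decides.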

  16. Petascale Simulations of the Morphology and the Molecular Interface of Bulk Heterojunctions

    DOE PAGES

    Carrillo, Jan-Michael Y.; Seibers, Zach; Kumar, Rajeev; ...

    2016-07-14

    Understanding how additives interact and segregate within bulk heterojunction (BHJ) thin films is critical for exercising control over structure at multiple length scales and delivering improvements in photovoltaic performance. The morphological evolution of poly(3-hexylthiophene) (P3HT) and phenyl-C61-butyric acid methyl ester (PCBM) blends that are commensurate with the size of a BHJ thin film is examined using petascale coarse-grained molecular dynamics simulations. When comparing two-component and three-component systems containing short P3HT chains as additives undergoing thermal annealing, we demonstrate that the short chains alter the morphology in apparently useful ways: they efficiently migrate to the P3HT/PCBM interface, increasing the P3HT domain size and interfacial area. Simulation results agree with depth profiles determined from neutron reflectometry measurements that reveal PCBM enrichment near substrate and air interfaces, but a decrease in that PCBM enrichment when a small amount of short P3HT chains is integrated into the BHJ blend. Atomistic simulations of the P3HT/PCBM blend interfaces show a non-monotonic dependence of the interfacial thickness as a function of the number of repeat units in the oligomeric P3HT additive, and the thiophene rings orient parallel to the interfacial plane as they approach the PCBM domain. Using the nanoscale geometries of the P3HT oligomers, LUMO and HOMO energy levels calculated by density functional theory are found to be invariant across the donor/acceptor interface. Finally, these connections between additives, processing, and morphology at all length scales are generally useful for efforts to improve device performance.

  17. Petascale supercomputing to accelerate the design of high-temperature alloys

    DOE PAGES

    Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...

    2017-10-25

    Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.

  18. Petascale supercomputing to accelerate the design of high-temperature alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Dongwon; Lee, Sangkeun; Shyam, Amit

    Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.

  19. Understanding I/O workload characteristics of a Peta-scale storage system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Youngjae; Gunasekaran, Raghul

    2015-01-01

    Understanding workload characteristics is critical for optimizing and improving the performance of current systems and software, and architecting new storage systems based on observed workload patterns. In this paper, we characterize the I/O workloads of scientific applications on one of the world's fastest high performance computing (HPC) storage clusters, Spider, at the Oak Ridge Leadership Computing Facility (OLCF). OLCF's flagship petascale simulation platform, Titan, and other large HPC clusters, with over 250,000 compute cores in total, depend on Spider for their I/O needs. We characterize the system utilization, the demands of reads and writes, idle time, storage space utilization, and the distribution of read requests to write requests for this peta-scale storage system. From this study, we develop synthesized workloads, and we show that the read and write I/O bandwidth usage as well as the inter-arrival time of requests can be modeled as a Pareto distribution. We also study I/O load imbalance problems using I/O performance data collected from the Spider storage system.
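    Modeling request inter-arrival times as a Pareto distribution, as this abstract describes, can be illustrated with a maximum-likelihood (Hill) estimate of the shape parameter. A sketch on synthetic data only (illustrative; not the Spider trace data):

    ```python
    import math
    import random

    def hill_shape(samples, xmin):
        """Maximum-likelihood (Hill) estimate of the Pareto shape
        parameter alpha for samples bounded below by xmin."""
        return len(samples) / sum(math.log(s / xmin) for s in samples)

    random.seed(0)
    xmin, alpha = 1.0, 1.5   # hypothetical scale and heavy-tail shape
    # Synthetic inter-arrival times drawn from a Pareto distribution.
    inter_arrivals = [xmin * random.paretovariate(alpha) for _ in range(20000)]

    est = hill_shape(inter_arrivals, xmin)
    assert abs(est - alpha) < 0.1   # the fit recovers the true shape
    ```

    A heavy-tailed fit like this matters for synthesized workloads: with alpha below 2 the inter-arrival variance is unbounded, so bursts are far more common than an exponential model would predict.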

  20. Lightweight and Statistical Techniques for Petascale Debugging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which had already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root cause.
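    STAT's central observation is that thousands of MPI tasks usually share only a handful of distinct stack traces, so grouping tasks by trace shrinks the debugging problem to one representative per group. A toy sketch of that grouping step, with hypothetical traces (STAT's actual implementation merges traces into a prefix tree and gathers them over MRNet):

    ```python
    from collections import defaultdict

    def group_by_trace(task_traces):
        """Group MPI ranks by identical stack traces; a traditional
        debugger then only needs one representative rank per group."""
        groups = defaultdict(list)
        for rank, trace in sorted(task_traces.items()):
            groups[tuple(trace)].append(rank)
        return groups

    traces = {
        0: ["main", "solve", "MPI_Allreduce"],
        1: ["main", "solve", "MPI_Allreduce"],
        2: ["main", "io_write"],   # the outlier: a likely place to look
    }
    groups = group_by_trace(traces)
    ```

    On a real hang, the small group stuck somewhere unexpected (here rank 2) is typically where the root cause hides, and the search space drops from thousands of ranks to a few equivalence classes.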

  21. Petascale Simulation Initiative Tech Base: FY2007 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, J; Chen, R; Jefferson, D

    The Petascale Simulation Initiative began as an LDRD project in the middle of Fiscal Year 2004. The goal of the project was to develop techniques to allow large-scale scientific simulation applications to better exploit the massive parallelism that will come with computers running at petaflops per second. One of the major products of this work was the design and prototype implementation of a programming model and a runtime system that lets applications extend data-parallel applications to use task parallelism. By adopting task parallelism, applications can use processing resources more flexibly, exploit multiple forms of parallelism, and support more sophisticated multiscale and multiphysics models. Our programming model was originally called the Symponents Architecture but is now known as Cooperative Parallelism, and the runtime software that supports it is called Coop. (However, we sometimes refer to the programming model as Coop for brevity.) We have documented the programming model and runtime system in a submitted conference paper [1]. This report focuses on the specific accomplishments of the Cooperative Parallelism project (as we now call it) under Tech Base funding in FY2007. Development and implementation of the model under LDRD funding alone proceeded to the point of demonstrating a large-scale materials modeling application using Coop on more than 1300 processors by the end of FY2006. Beginning in FY2007, the project received funding from both LDRD and the Computation Directorate Tech Base program. Later in the year, after the three-year term of the LDRD funding ended, the ASC program supported the project with additional funds. The goal of the Tech Base effort was to bring Coop from a prototype to a production-ready system that a variety of LLNL users could work with. Specifically, the major tasks that we planned for the project were: (1) Port SARS [former name of the Coop runtime system] to another LLNL platform, probably Thunder or Peloton.

  2. Multi-petascale highly efficient parallel supercomputer

    DOEpatents

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen -Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Smith, Brian; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng

    2015-07-14

    A Multi-Petascale Highly Efficient Parallel Supercomputer delivering 100 petaOPS-scale computing at decreased cost, power, and footprint, and allowing for a maximum packaging density of processing nodes from an interconnect point of view. The Supercomputer exploits technological advances in VLSI that enable a computing model in which many processors are integrated into a single Application Specific Integrated Circuit (ASIC). Each computing node comprises a system-on-chip ASIC with four or more processors integrated into one die, each having full access to all system resources. This enables adaptive partitioning of the processors among functions such as compute or messaging I/O on an application-by-application basis and, preferably, across the algorithmic phases within an application; if I/O or other processors are underutilized, they can participate in computation or communication. Nodes are interconnected by a five-dimensional torus network with DMA that maximizes the throughput of packet communications between nodes and minimizes latency.
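
The five-dimensional torus interconnect described in this record gives each node ten links: a step of ±1 along each of the five dimensions, wrapping at the edges. A minimal sketch of neighbour enumeration follows; the 4x4x4x4x4 torus shape is an illustrative assumption (actual installations use other dimensions):

```python
# Neighbour enumeration in an n-dimensional torus: step +/-1 along each
# dimension, wrapping at the boundary. For 5 dimensions each node has 10 links.
def torus_neighbors(coord, dims):
    """Return the +/-1 wrap-around neighbours of a node coordinate."""
    result = []
    for d in range(len(dims)):
        for step in (-1, 1):
            n = list(coord)
            n[d] = (n[d] + step) % dims[d]   # wrap at the torus boundary
            result.append(tuple(n))
    return result

dims = (4, 4, 4, 4, 4)                       # illustrative torus shape
nbrs = torus_neighbors((0, 0, 0, 0, 0), dims)
print(len(nbrs))                             # 10 links per node
```

The wrap-around links are what keep the network diameter low: node (0, 0, 0, 0, 0) is directly connected to (3, 0, 0, 0, 0) as well as (1, 0, 0, 0, 0).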

  3. Towards Petascale DNS of High Reynolds-Number Turbulent Boundary Layer

    NASA Astrophysics Data System (ADS)

    Webster, Keegan R.

    In flight vehicles, a large portion of fuel consumption is due to skin-friction drag. Reducing this drag will significantly reduce the fuel consumption of flight vehicles and help our nation to reduce CO2 emissions. In order to reduce skin-friction drag, an increased understanding of wall turbulence is needed. Direct numerical simulation (DNS) of spatially developing turbulent boundary layers (SDTBL) can provide the fundamental understanding of wall turbulence needed to produce models for Reynolds-averaged Navier-Stokes (RANS) and large-eddy simulations (LES). DNS of SDTBL over a flat plate at Reθ = 1430-2900 were performed. Improvements were made to the DNS code allowing for higher-Reynolds-number simulations towards petascale DNS of turbulent boundary layers. Mesh refinement and improvements to the inflow and outflow boundary conditions have resulted in turbulence statistics that match more closely to experimental results. The Reynolds stresses and the terms of their evolution equations are reported.

  4. MOGO: Model-Oriented Global Optimization of Petascale Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Shende, Sameer S.

    The MOGO project was initiated in 2008 under the DOE Program Announcement for Software Development Tools for Improved Ease-of-Use on Petascale Systems (LAB 08-19). The MOGO team consisted of Oak Ridge National Lab, Argonne National Lab, and the University of Oregon. The overall goal of MOGO was to attack petascale performance analysis by developing a general framework in which empirical performance data could be efficiently and accurately compared with performance expectations at various levels of abstraction. This information could then be used to automatically identify and remediate performance problems. MOGO was based on performance models derived from application knowledge, performance experiments, and symbolic analysis. MOGO made a measurable impact on existing DOE applications and systems. New tools and techniques were developed, which, in turn, were used on important DOE applications on DOE LCF systems to show significant performance improvements.

  5. Petascale Diagnostic Assessment of the Global Portfolio Rainfall Space Missions' Ability to Support Flood Forecasting

    NASA Astrophysics Data System (ADS)

    Reed, P. M.; Chaney, N.; Herman, J. D.; Wood, E. F.; Ferringer, M. P.

    2015-12-01

    This research represents a multi-institutional collaboration between Cornell University, The Aerospace Corporation, and Princeton University that has completed a Petascale diagnostic assessment of the current 10 satellite missions providing rainfall observations. Our diagnostic assessment has required four core tasks: (1) formally linking high-resolution astrodynamics design and coordination of space assets with their global hydrological impacts within a Petascale "many-objective" global optimization framework, (2) developing a baseline diagnostic evaluation of a 1-degree resolution global implementation of the Variable Infiltration Capacity (VIC) model to establish the required satellite observation frequencies and coverage to maintain acceptable global flood forecasts, (3) evaluating the limitations and vulnerabilities of the full suite of current satellite precipitation missions including the recently approved Global Precipitation Measurement (GPM) mission, and (4) conceptualizing the next generation spaced-based platforms for water cycle observation. Our team exploited over 100 Million hours of computing access on the 700,000+ core Blue Waters machine to radically advance our ability to discover and visualize key system tradeoffs and sensitivities. This project represents to our knowledge the first attempt to develop a 10,000 member Monte Carlo global hydrologic simulation at one degree resolution that characterizes the uncertain effects of changing the available frequencies of satellite precipitation on drought and flood forecasts. The simulation—optimization components of the work have set a theoretical baseline for the best possible frequencies and coverages for global precipitation given unlimited investment, broad international coordination in reconfiguring existing assets, and new satellite constellation design objectives informed directly by key global hydrologic forecasting requirements. Our research poses a step towards realizing the integrated

  6. Toward Petascale Biologically Plausible Neural Networks

    NASA Astrophysics Data System (ADS)

    Long, Lyle

    This talk will describe an approach to achieving petascale neural networks. Artificial intelligence has been oversold for many decades. Computers in the beginning could only do about 16,000 operations per second. Computer processing power, however, has been doubling every two years thanks to Moore's law, and growing even faster due to massively parallel architectures. Finally, 60 years after the first AI conference we have computers on the order of the performance of the human brain (10^16 operations per second). The main issues now are algorithms, software, and learning. We have excellent models of neurons, such as the Hodgkin-Huxley model, but we do not know how human neurons are wired together. With careful attention to efficient parallel computing, event-driven programming, table lookups, and memory minimization, massive-scale simulations can be performed. The code that will be described was written in C++ and uses the Message Passing Interface (MPI). It uses the full Hodgkin-Huxley neuron model, not a simplified model. It also allows arbitrary network structures (deep, recurrent, convolutional, all-to-all, etc.). The code is scalable, and has, so far, been tested on up to 2,048 processor cores using 10^7 neurons and 10^9 synapses.
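
For context on the full Hodgkin-Huxley model mentioned above, here is a minimal single-compartment sketch using the standard squid-axon parameters and forward-Euler integration. It is illustrative only and is not the C++/MPI code described in the talk; the function names, time step, and injected current are assumptions:

```python
import math

# Classic Hodgkin-Huxley membrane model, one compartment, forward Euler.
C_M = 1.0                               # membrane capacitance, uF/cm^2
G_NA, G_K, G_L = 120.0, 36.0, 0.3       # max conductances, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.387   # reversal potentials, mV

def alpha_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def beta_m(v):  return 4.0 * math.exp(-(v + 65.0) / 18.0)
def alpha_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def beta_h(v):  return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
def alpha_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def beta_n(v):  return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate(i_ext, t_max=50.0, dt=0.01):
    """Return the membrane-potential trace (mV) for a constant injected current."""
    v, m, h, n = -65.0, 0.0529, 0.5961, 0.3177   # resting steady state
    trace = []
    for _ in range(int(t_max / dt)):
        i_na = G_NA * m**3 * h * (v - E_NA)      # sodium current
        i_k = G_K * n**4 * (v - E_K)             # potassium current
        i_l = G_L * (v - E_L)                    # leak current
        v += dt * (i_ext - i_na - i_k - i_l) / C_M
        m += dt * (alpha_m(v) * (1 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1 - h) - beta_h(v) * h)
        n += dt * (alpha_n(v) * (1 - n) - beta_n(v) * n)
        trace.append(v)
    return trace

trace = simulate(10.0)    # 10 uA/cm^2, well above the firing threshold
print(max(trace) > 0.0)   # the neuron spikes: the peak crosses 0 mV
```

The event-driven, table-lookup optimizations the talk alludes to would replace the per-step `alpha`/`beta` evaluations with precomputed rate tables; that refinement is omitted here for brevity.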

  7. Working Towards New Transformative Geoscience Analytics Enabled by Petascale Computing

    NASA Astrophysics Data System (ADS)

    Woodcock, R.; Wyborn, L.

    2012-04-01

    Currently the top 10 supercomputers in the world are petascale and already exascale computers are being planned. Cloud computing facilities are becoming mainstream either as private or commercial investments. These computational developments will provide abundant opportunities for the earth science community to tackle the data deluge which has resulted from new instrumentation enabling data to be gathered at a greater rate and at higher resolution. Combined, the new computational environments should enable the earth sciences to be transformed. However, experience in Australia and elsewhere has shown that it is not easy to scale existing earth science methods, software and analytics to take advantage of the increased computational capacity that is now available. It is not simply a matter of 'transferring' current work practices to the new facilities: they have to be extensively 'transformed'. In particular, new geoscientific methods will need to be developed using advanced data mining, assimilation, machine learning and integration algorithms. Software will have to be capable of operating in highly parallelised environments, and will also need to be able to scale as the compute systems grow. Data access will have to improve, and the earth science community needs to move from the file discovery, display and then locally download paradigm to self-describing data cubes and data arrays that are available as online resources from either major data repositories or in the cloud. In the new transformed world, rather than analysing satellite data scene by scene, sensor-agnostic data cubes of calibrated earth observation data will enable researchers to move across data from multiple sensors at varying spatial data resolutions. In using geophysics to characterise basement and cover, rather than analysing individual gridded airborne geophysical data sets and then combining the results, petascale computing will enable analysis of multiple data types, collected at varying

  8. On-line Machine Learning and Event Detection in Petascale Data Streams

    NASA Astrophysics Data System (ADS)

    Thompson, David R.; Wagstaff, K. L.

    2012-01-01

    Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real time data mining, with representative cases of astronomical applications and event detection in high volume datastreams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data
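
The "semi-supervised novelty detection" idea above can be sketched as a running statistical background model that flags large deviations while excluding known false-alarm values. The Welford-style running variance, the 5-sigma threshold, and the rounded-value false-alarm table are illustrative assumptions, not the VLBA or Parkes pipeline:

```python
import math

class NoveltyDetector:
    """Flag samples that deviate strongly from a running Gaussian background."""

    def __init__(self, threshold_sigma=5.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0                    # running sum of squared deviations
        self.threshold = threshold_sigma
        self.known_false_alarms = set()  # interference values to ignore

    def add_false_alarm(self, value):
        self.known_false_alarms.add(round(value, 3))

    def observe(self, x):
        """Update the background model; return True if x is novel."""
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            novel = (std > 0 and abs(x - self.mean) / std > self.threshold
                     and round(x, 3) not in self.known_false_alarms)
        else:
            novel = False                # too few samples to judge
        # fold the sample into the running statistics either way (Welford)
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return novel

det = NoveltyDetector(threshold_sigma=5.0)
background = [0.1 * ((i * 7) % 11 - 5) for i in range(1000)]  # bounded noise
flags = [det.observe(x) for x in background]
det.add_false_alarm(42.0)
print(any(flags), det.observe(42.0), det.observe(50.0))
```

On this synthetic stream no background sample is flagged, the registered interference value 42.0 is suppressed, and the genuine outlier 50.0 is detected.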

  9. Parcels v0.9: prototyping a Lagrangian ocean analysis framework for the petascale age

    NASA Astrophysics Data System (ADS)

    Lange, Michael; van Sebille, Erik

    2017-11-01

    As ocean general circulation models (OGCMs) move into the petascale age, where the output of single simulations exceeds petabytes of storage space, tools to analyse the output of these models will need to scale up too. Lagrangian ocean analysis, where virtual particles are tracked through hydrodynamic fields, is an increasingly popular way to analyse OGCM output, by mapping pathways and connectivity of biotic and abiotic particulates. However, the current software stack of Lagrangian ocean analysis codes is not dynamic enough to cope with the increasing complexity, scale and need for customization of use-cases. Furthermore, most community codes are developed for stand-alone use, making it a nontrivial task to integrate virtual particles at runtime of the OGCM. Here, we introduce the new Parcels code, which was designed from the ground up to be sufficiently scalable to cope with petascale computing. We highlight its API design that combines flexibility and customization with the ability to optimize for HPC workflows, following the paradigm of domain-specific languages. Parcels is primarily written in Python, utilizing the wide range of tools available in the scientific Python ecosystem, while generating low-level C code and using just-in-time compilation for performance-critical computation. We show a worked-out example of its API, and validate the accuracy of the code against seven idealized test cases. This version 0.9 of Parcels is focused on laying out the API, with future work concentrating on support for curvilinear grids, optimization, efficiency and at-runtime coupling with OGCMs.

  10. Multi-petascale highly efficient parallel supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.

    A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaflop scale includes node architectures based upon System-on-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC). The ASIC nodes are interconnected by a five-dimensional torus network that maximizes the throughput of packet communications between nodes and minimizes latency. The network implements a collective network and a global asynchronous network that provides global barrier and notification functions. The node design integrates a list-based prefetcher. The memory system implements transactional memory, thread-level speculation, and a multiversioning cache that improves the soft error rate while supporting DMA functionality for parallel message passing.

  11. Final Scientific Report: A Scalable Development Environment for Peta-Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karbach, Carsten; Frings, Wolfgang

    2013-02-22

    This document is the final scientific report of the project DE-SC000120 (A Scalable Development Environment for Peta-Scale Computing). The objective of this project is the extension of the Parallel Tools Platform (PTP) for applying it to peta-scale systems. PTP is an integrated development environment for parallel applications. It comprises code analysis, performance tuning, parallel debugging and system monitoring. The contribution of the Juelich Supercomputing Centre (JSC) aims to provide a scalable solution for system monitoring of supercomputers. This includes the development of a new communication protocol for exchanging status data between the target remote system and the client running PTP. The communication has to tolerate high latency. PTP needs to be implemented robustly and should hide the complexity of the supercomputer's architecture in order to provide transparent access to various remote systems via a uniform user interface. This simplifies the porting of applications to different systems, because PTP functions as an abstraction layer between the parallel application developer and the compute resources. The common requirement for all PTP components is that they have to interact with the remote supercomputer. For example, applications are built remotely, performance tools are attached to job submissions, and their output data resides on the remote system. Status data has to be collected by evaluating outputs of the remote job scheduler, and the parallel debugger needs to control an application executed on the supercomputer. The challenge is to provide this functionality for peta-scale systems in real time. The client-server architecture of the established monitoring application LLview, developed by the JSC, can be applied to PTP's system monitoring. LLview provides a well-arranged overview of the supercomputer's current status.
A set of statistics, a list of running and queued jobs as well as a node display mapping running jobs to their compute resources form

  12. Petascale Kinetic Simulations in Space Sciences: New Simulations and Data Discovery Techniques and Physics Results

    NASA Astrophysics Data System (ADS)

    Karimabadi, Homa

    2012-03-01

    Recent advances in simulation technology and hardware are enabling breakthrough science in which many longstanding problems can be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and the magnetic reconnection process, the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to so-called space weather, where storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk will consist of three parts: (a) an overview of a new multi-scale simulation technique where each computational grid is updated based on its own unique timestep, (b) a new approach to data analysis that we refer to as Physics Mining, which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets, and (c) several recent discoveries in studies of space plasmas, including the role of vortex formation and resulting turbulence in magnetized plasmas.

  13. Capturing Petascale Application Characteristics with the Sequoia Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M

    2005-01-01

    Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.

  14. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the

  15. A dual communicator and dual grid-resolution algorithm for petascale simulations of turbulent mixing at high Schmidt number

    NASA Astrophysics Data System (ADS)

    Clay, M. P.; Buaria, D.; Gotoh, T.; Yeung, P. K.

    2017-10-01

    A new dual-communicator algorithm with very favorable performance characteristics has been developed for direct numerical simulation (DNS) of turbulent mixing of a passive scalar governed by an advection-diffusion equation. We focus on the regime of high Schmidt number (Sc), where because of low molecular diffusivity the grid-resolution requirements for the scalar field are stricter than those for the velocity field by a factor √Sc. Computational throughput is improved by simulating the velocity field on a coarse grid of Nv^3 points with a Fourier pseudo-spectral (FPS) method, while the passive scalar is simulated on a fine grid of Nθ^3 points with a combined compact finite difference (CCD) scheme which computes first and second derivatives at eighth-order accuracy. A static three-dimensional domain decomposition and a parallel solution algorithm for the CCD scheme are used to avoid the heavy communication cost of memory transposes. A kernel is used to evaluate several approaches to optimize the performance of the CCD routines, which account for 60% of the overall simulation cost. On the petascale supercomputer Blue Waters at the University of Illinois, Urbana-Champaign, scalability is improved substantially with a hybrid MPI-OpenMP approach in which a dedicated thread per NUMA domain overlaps communication calls with computational tasks performed by a separate team of threads spawned using OpenMP nested parallelism. At a target production problem size of 8192^3 (0.5 trillion) grid points on 262,144 cores, CCD timings are reduced by 34% compared to a pure-MPI implementation. Timings for 16384^3 (4 trillion) grid points on 524,288 cores encouragingly maintain scalability greater than 90%, although the wall clock time is too high for production runs at this size. Performance monitoring with CrayPat for problem sizes up to 4096^3 shows that the CCD routines can achieve nearly 6% of the peak flop rate. The new DNS code is built upon two existing FPS and CCD codes
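
The √Sc resolution requirement above translates directly into grid sizing: the Batchelor scale shrinks as η/√Sc, so the scalar grid must be finer by that factor. A back-of-envelope sketch, where the Schmidt number and velocity-grid size are illustrative assumptions chosen so the scalar grid matches the quoted 8192^3 production size:

```python
import math

def scalar_grid_points(n_velocity, schmidt):
    """Scalar-grid resolution per direction, rounded up to a power of two."""
    n_exact = n_velocity * math.sqrt(schmidt)     # finer by a factor sqrt(Sc)
    return 2 ** math.ceil(math.log2(n_exact))

n_v = 1024            # assumed velocity grid (points per direction)
sc = 64               # assumed high Schmidt number
n_theta = scalar_grid_points(n_v, sc)
print(n_theta, n_theta ** 3)  # 8192 per direction, about 0.55 trillion points
```

This factor-of-√Sc gap is exactly why the dual-communicator design pays off: the velocity solve touches (1/√Sc)^3 as many points as the scalar solve.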

  16. Freud: a software suite for high-throughput simulation analysis

    NASA Astrophysics Data System (ADS)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
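
As an illustration of the radial distribution function named among Freud's standard methods, here is a minimal pure-Python g(r) using the minimum-image convention in a cubic periodic box. Freud's actual API and parallel C++ kernels differ; this sketch only shows the underlying calculation:

```python
import math

def rdf(positions, box_length, r_max, n_bins):
    """g(r) histogram with the minimum-image convention in a cubic periodic box."""
    n = len(positions)
    dr = r_max / n_bins
    counts = [0] * n_bins
    for i in range(n):
        for j in range(i + 1, n):
            d2 = 0.0
            for k in range(3):
                d = positions[i][k] - positions[j][k]
                d -= box_length * round(d / box_length)   # minimum image
                d2 += d * d
            r = math.sqrt(d2)
            if r < r_max:
                counts[int(r / dr)] += 2                  # count each pair twice
    rho = n / box_length ** 3                             # number density
    g = []
    for b in range(n_bins):
        shell = (4.0 / 3.0) * math.pi * (((b + 1) * dr) ** 3 - (b * dr) ** 3)
        g.append(counts[b] / (n * rho * shell))           # normalize by ideal gas
    return g

# 4x4x4 simple cubic lattice with unit spacing in a periodic box of side 4:
# the only pair distance below r_max = 1.2 is the nearest-neighbour spacing 1.0.
lattice = [(x, y, z) for x in range(4) for y in range(4) for z in range(4)]
g = rdf(lattice, box_length=4.0, r_max=1.2, n_bins=12)
peak_bin = max(range(12), key=lambda b: g[b])
print(peak_bin)   # the bin containing r = 1.0
```

The O(n^2) pair loop is what Freud's parallel routines replace with neighbour lists and C++ kernels; the normalization against the ideal-gas shell volume is the same.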

  17. Petascale computation of multi-physics seismic simulations

    NASA Astrophysics Data System (ADS)

    Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Wollherr, Stephanie; Duru, Kenneth C.

    2017-04-01

    Capturing the observed complexity of earthquake sources in concurrence with seismic wave propagation simulations is an inherently multi-scale, multi-physics problem. In this presentation, we show simulations of earthquake scenarios resolving high-detail dynamic rupture evolution and high-frequency ground motion. The simulations combine a multitude of representations of model complexity, such as non-linear fault friction, thermal and fluid effects, heterogeneous fault stress and fault strength initial conditions, fault curvature and roughness, and on- and off-fault non-elastic failure to capture dynamic rupture behavior at the source; and seismic wave attenuation, 3D subsurface structure and bathymetry impacting seismic wave propagation. Performing such scenarios at the necessary spatio-temporal resolution requires highly optimized and massively parallel simulation tools which can efficiently exploit HPC facilities. Our up to multi-PetaFLOP simulations are performed with SeisSol (www.seissol.org), an open-source software package based on an ADER-Discontinuous Galerkin (DG) scheme solving the seismic wave equations in velocity-stress formulation in elastic, viscoelastic, and viscoplastic media with high-order accuracy in time and space. Our flux-based implementation of frictional failure remains free of spurious oscillations. Tetrahedral unstructured meshes allow for complicated model geometry. SeisSol has been optimized on all software levels, including: assembler-level DG kernels which obtain 50% peak performance on some of the largest supercomputers worldwide; an overlapping MPI-OpenMP parallelization shadowing the multiphysics computations; usage of local time stepping; parallel input and output schemes; and direct interfaces to community standard data formats. All these factors help to minimise the time-to-solution.
The results presented highlight the fact that modern numerical methods and hardware-aware optimization for modern supercomputers are essential

  18. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.

  19. I/O-aware bandwidth allocation for petascale computing systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Zhou; Yang, Xu; Zhao, Dongfang

    In the Big Data era, the gap between the storage performance and an application's I/O requirement is increasing. I/O congestion caused by concurrent storage accesses from multiple applications is inevitable and severely harms the performance. Conventional approaches either focus on optimizing an application's access pattern individually or handle I/O requests on a low-level storage layer without any knowledge from the upper-level applications. In this paper, we present a novel I/O-aware bandwidth allocation framework to coordinate ongoing I/O requests on petascale computing systems. The motivation behind this innovation is that the resource management system has a holistic view of both the system state and jobs' activities and can dynamically control the jobs' status or allocate resources on the fly during their execution. We treat a job's I/O requests as periodical subjobs within its lifecycle and transform the I/O congestion issue into a classical scheduling problem. Based on this model, we propose a bandwidth management mechanism as an extension to the existing scheduling system. We design several bandwidth allocation policies with different optimization objectives either on user-oriented metrics or system performance. We conduct extensive trace-based simulations using real job traces and I/O traces from a production IBM Blue Gene/Q system at Argonne National Laboratory. Experimental results demonstrate that our new design can improve job performance by more than 30%, as well as increasing system performance.
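
The bandwidth-management idea can be sketched as a max-min fair ("water-filling") allocator that divides a fixed storage bandwidth among the jobs currently issuing I/O. The policy and the numbers below are illustrative assumptions, not the paper's actual allocation policies:

```python
# Max-min fair bandwidth allocation: repeatedly split the remaining bandwidth
# evenly among jobs still below their demand, so small requests are fully
# satisfied and large ones share the leftover equally.
def allocate(requests, total_bw):
    """Return {job: granted bandwidth}, capped at each job's demand."""
    alloc = {job: 0.0 for job in requests}
    active = dict(requests)          # jobs not yet at their full demand
    remaining = total_bw
    while active and remaining > 1e-9:
        share = remaining / len(active)
        satisfied = []
        for job, demand in active.items():
            grant = min(share, demand - alloc[job])
            alloc[job] += grant
            remaining -= grant
            if alloc[job] >= demand - 1e-9:
                satisfied.append(job)
        for job in satisfied:
            del active[job]
        if not satisfied:            # nobody hit their cap: bandwidth used up
            break
    return alloc

# Three jobs competing for 120 (say, GB/s) of storage bandwidth.
demands = {"A": 50.0, "B": 20.0, "C": 100.0}
print(allocate(demands, 120.0))   # B fully served; A and C share the rest
```

Here job B's small request is met in full, while A and C converge to equal grants of the remainder; swapping in a priority- or deadline-weighted share is a one-line change to how `share` is computed.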

  20. Large-scale large eddy simulation of nuclear reactor flows: Issues and perspectives

    DOE PAGES

    Merzari, Elia; Obabko, Aleks; Fischer, Paul; ...

    2016-11-03

    Numerical simulation has been an intrinsic part of nuclear engineering research since its inception. In recent years a transition is occurring toward predictive, first-principle-based tools such as computational fluid dynamics. Even with the advent of petascale computing, however, such tools still have significant limitations. In the present work some of these issues, and in particular the presence of massive multiscale separation, are discussed, as well as some of the research conducted to mitigate them. Petascale simulations at high fidelity (large eddy simulation/direct numerical simulation) were conducted with the massively parallel spectral element code Nek5000 on a series of representative problems. These simulations shed light on the requirements of several types of simulation: (1) axial flow around fuel rods, with particular attention to wall effects; (2) natural convection in the primary vessel; and (3) flow in a rod bundle in the presence of spacing devices. Finally, the focus of the work presented here is on the lessons learned and the requirements to perform these simulations at exascale. Additional physical insight gained from these simulations is also emphasized.

  2. Quantum Monte Carlo for large chemical systems: implementing efficient strategies for petascale platforms and beyond.

    PubMed

    Scemama, Anthony; Caffarel, Michel; Oseret, Emmanuel; Jalby, William

    2013-04-30

    Various strategies for efficiently implementing quantum Monte Carlo (QMC) simulations for large chemical systems are presented. These include: (i) an efficient algorithm for calculating the computationally expensive Slater matrices, a novel scheme based on the highly localized character of the atomic Gaussian basis functions (rather than of the molecular orbitals, as usually exploited); (ii) the possibility of keeping the memory footprint minimal; (iii) the important enhancement of single-core performance when efficient optimization tools are used; and (iv) the definition of a universal, dynamic, fault-tolerant, and load-balanced framework adapted to all kinds of computational platforms (massively parallel machines, clusters, or distributed grids). These strategies have been implemented in the QMC=Chem code developed at Toulouse and are illustrated with numerical applications on small peptides of increasing size (158, 434, 1056, and 1731 electrons). Using 10,000-80,000 computing cores of the Curie machine (GENCI-TGCC-CEA, France), QMC=Chem has been shown to be capable of running at the petascale level, thus demonstrating that for this machine a large part of the peak performance can be achieved. Implementation of large-scale QMC simulations for future exascale platforms with a comparable level of efficiency is expected to be feasible. Copyright © 2013 Wiley Periodicals, Inc.
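
    The locality argument in item (i) above can be illustrated with a toy sketch (not the QMC=Chem code): because an atomic Gaussian decays as exp(-alpha r^2), matrix entries for electron-center distances beyond a cutoff can be skipped entirely, leaving a sparse Slater-type matrix. The cutoff value, exponent, and function names are invented for illustration.

```python
import math

# Illustrative only: build a "Slater-like" matrix over atomic Gaussian
# basis functions, skipping entries screened out by locality.
CUTOFF = 4.0  # assumed distance beyond which a Gaussian is treated as zero

def gaussian(alpha, r2):
    return math.exp(-alpha * r2)

def build_sparse_slater(electrons, centers, alpha=1.0, cutoff=CUTOFF):
    """Return {(i, j): value} for non-negligible entries only."""
    matrix = {}
    c2 = cutoff * cutoff
    for i, (ex, ey, ez) in enumerate(electrons):
        for j, (cx, cy, cz) in enumerate(centers):
            r2 = (ex - cx) ** 2 + (ey - cy) ** 2 + (ez - cz) ** 2
            if r2 < c2:                 # locality screening
                matrix[(i, j)] = gaussian(alpha, r2)
    return matrix

# Two well-separated electron/center pairs: only 2 of 4 entries survive,
# so cost grows near-linearly with system size instead of quadratically.
electrons = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
centers = [(0.5, 0.0, 0.0), (10.5, 0.0, 0.0)]
m = build_sparse_slater(electrons, centers)
```

    Molecular orbitals, by contrast, are typically delocalized over the whole system, which is why screening them would not yield the same sparsity.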

  3. In situ visualization for large-scale combustion simulations.

    PubMed

    Yu, Hongfeng; Wang, Chaoli; Grout, Ray W; Chen, Jacqueline H; Ma, Kwan-Liu

    2010-01-01

    As scientific supercomputing moves toward petascale and exascale levels, in situ visualization stands out as a scalable way for scientists to view the data their simulations generate. This full picture is crucial particularly for capturing and understanding highly intermittent transient phenomena, such as ignition and extinction events in turbulent combustion.

  4. Hierarchical Petascale Simulation Framework for Stress Corrosion Cracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vashishta, Priya

    2014-12-01

    Reaction Dynamics in Energetic Materials: Detonation is a prototype of mechanochemistry, in which mechanically and thermally induced chemical reactions far from equilibrium exhibit vastly different behaviors. It is also one of the hardest multiscale physics problems, in which diverse length and time scales play important roles. The CACS group has performed multimillion-atom reactive MD simulations to reveal a novel two-stage reaction mechanism during the detonation of cyclotrimethylenetrinitramine (RDX) crystal. Rapid production of N2 and H2O within ~10 ps is followed by delayed production of CO molecules within ~1 ns. They found that further decomposition towards the final products is inhibited by the formation of large metastable C- and O-rich clusters with fractal geometry. The CACS group has also simulated the oxidation dynamics of close-packed aggregates of aluminum nanoparticles passivated by oxide shells. Their simulation results suggest an unexpectedly active role of the oxide shell as a nanoreactor.

  5. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or D...

  6. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or D...

  7. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or D...

  8. Interactive Volume Exploration of Petascale Microscopy Data Streams Using a Visualization-Driven Virtual Memory Approach.

    PubMed

    Hadwiger, M; Beyer, J; Jeong, Won-Ki; Pfister, H

    2012-12-01

    This paper presents the first volume visualization system that scales to petascale volumes imaged as a continuous stream of high-resolution electron microscopy images. Our architecture scales to dense, anisotropic petascale volumes because it: (1) decouples construction of the 3D multi-resolution representation required for visualization from data acquisition, and (2) decouples sample access time during ray-casting from the size of the multi-resolution hierarchy. Our system is designed around a scalable multi-resolution virtual memory architecture that handles missing data naturally, does not pre-compute any 3D multi-resolution representation such as an octree, and can accept a constant stream of 2D image tiles from the microscopes. A novelty of our system design is that it is visualization-driven: we restrict most computations to the visible volume data. Leveraging the virtual memory architecture, missing data are detected during volume ray-casting as cache misses, which are propagated backwards for on-demand out-of-core processing. 3D blocks of volume data are only constructed from 2D microscope image tiles when they have actually been accessed during ray-casting. We extensively evaluate our system design choices with respect to scalability and performance, compare to previous best-of-breed systems, and illustrate the effectiveness of our system for real microscopy data from neuroscience.
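
    The visualization-driven, on-demand design described above can be caricatured in a few lines: a cache miss during sampling triggers construction of a 3D block from its 2D tiles, and nothing is precomputed. Class and function names here are illustrative assumptions, not the system's API.

```python
# Toy demand-paged volume: 3D blocks are built from 2D tiles only when
# "ray-casting" actually touches them (a cache miss), mirroring the
# visualization-driven virtual memory idea above.
class VirtualVolume:
    def __init__(self, tile_source, capacity=4):
        self.tile_source = tile_source  # callable: (block_id, z) -> 2D tile
        self.cache = {}                 # block_id -> constructed 3D block
        self.capacity = capacity
        self.misses = 0

    def sample(self, block_id):
        if block_id not in self.cache:  # miss propagates backwards
            self.misses += 1
            if len(self.cache) >= self.capacity:
                self.cache.pop(next(iter(self.cache)))  # evict oldest
            # Construct the 3D block from its 2D image tiles only now.
            self.cache[block_id] = [self.tile_source(block_id, z)
                                    for z in range(4)]
        return self.cache[block_id]

def fake_microscope_tile(block_id, z):
    return [[block_id + z] * 2] * 2     # stand-in for a 2D image tile

vol = VirtualVolume(fake_microscope_tile)
vol.sample(0); vol.sample(1); vol.sample(0)  # second access of block 0 hits
```

    Restricting work to visible blocks this way is what decouples access time from the size of the full multi-resolution hierarchy.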

  9. Advanced sensor-simulation capability

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.

    1990-09-01

    This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (VISIBLE/INFRARED SENSOR TRADES, ANALYSES, AND SIMULATIONS) combines classical image processing techniques with detailed sensor models to produce static and time dependent simulations of a variety of sensor systems including imaging, tracking, and point target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring 2-dimensional array sensors which can be used for either imaging or point source detection.

  10. Active Storage with Analytics Capabilities and I/O Runtime System for Petascale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok

    Computational scientists must understand results from experimental, observational and computational simulation generated data to gain insights and perform knowledge discovery. As systems approach the petascale range, problems that were unimaginable a few years ago are within reach. With the increasing volume and complexity of data produced by ultra-scale simulations and high-throughput experiments, understanding the science is largely hampered by the lack of comprehensive I/O, storage, acceleration of data manipulation, analysis, and mining tools. Scientists require techniques, tools and infrastructure to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis, statistical analysis and knowledge discovery. The goal of this work is to enable more effective analysis of scientific datasets through the integration of enhancements in the I/O stack, from active storage support at the file system layer to the MPI-IO and high-level I/O library layers. We propose to provide software components to accelerate data analytics, mining, I/O, and knowledge discovery for large-scale scientific applications, thereby increasing the productivity of both scientists and the systems. Our approaches include: (1) designing interfaces in high-level I/O libraries, such as parallel netCDF, for applications to activate data mining operations at the lower I/O layers; (2) enhancing MPI-IO runtime systems to incorporate the functionality developed as part of the runtime system design; (3) developing parallel data mining programs as part of the runtime library for the server-side file system in the PVFS file system; and (4) prototyping an active storage cluster, which will utilize multicore CPUs, GPUs, and FPGAs to carry out the data mining workload.
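
    The active-storage idea above, running the reduction where the data lives so only small results cross the network, can be sketched as follows. The function names are hypothetical; the project's actual interfaces live in parallel netCDF, MPI-IO and PVFS.

```python
import statistics

# Hedged sketch: a per-chunk reduction runs "server-side" next to the
# data, and the client combines only the tiny partial results.
def server_side_reduce(chunk, op):
    """Stands in for a reduction executed on the storage server."""
    ops = {"min": min, "max": max, "mean": statistics.fmean}
    return ops[op](chunk)

def client_query(chunks, op):
    """Client combines partial results instead of shipping raw data."""
    partials = [server_side_reduce(c, op) for c in chunks]
    if op == "mean":  # weighted combine of per-chunk means
        total = sum(len(c) for c in chunks)
        return sum(p * len(c) for p, c in zip(partials, chunks)) / total
    return server_side_reduce(partials, op)

chunks = [[1.0, 2.0, 3.0], [10.0, 20.0]]  # data resident on two servers
overall_max = client_query(chunks, "max")
overall_mean = client_query(chunks, "mean")
```

    The same pattern underlies pushing data mining operators down through the MPI-IO layer to the file system servers.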

  11. Advances and issues from the simulation of planetary magnetospheres with recent supercomputer systems

    NASA Astrophysics Data System (ADS)

    Fukazawa, K.; Walker, R. J.; Kimura, T.; Tsuchiya, F.; Murakami, G.; Kita, H.; Tao, C.; Murata, K. T.

    2016-12-01

    will show the latest simulation results using the petascale supercomputer and problems from the use of these supercomputer systems.

  12. Tinker-HP: a massively parallel molecular dynamics package for multiscale simulations of large complex systems with advanced point dipole polarizable force fields.

    PubMed

    Lagardère, Louis; Jolly, Luc-Henri; Lipparini, Filippo; Aviat, Félix; Stamm, Benjamin; Jing, Zhifeng F; Harger, Matthew; Torabifard, Hedieh; Cisneros, G Andrés; Schnieders, Michael J; Gresh, Nohad; Maday, Yvon; Ren, Pengyu Y; Ponder, Jay W; Piquemal, Jean-Philip

    2018-01-28

    We present Tinker-HP, a massively MPI-parallel package dedicated to classical molecular dynamics (MD) and to multiscale simulations using advanced polarizable force fields (PFF) encompassing distributed multipole electrostatics. Tinker-HP is an evolution of the popular Tinker package that conserves its simplicity of use and its reference double-precision implementation for CPUs. Grounded in interdisciplinary efforts with applied mathematics, Tinker-HP allows long polarizable MD simulations on large systems of up to millions of atoms. We detail in the paper the newly developed extension of massively parallel 3D spatial decomposition to point dipole polarizable models, as well as their coupling to efficient Krylov iterative and non-iterative polarization solvers. The design of the code allows the use of various computer systems ranging from laboratory workstations to modern petascale supercomputers with thousands of cores. Tinker-HP therefore offers the first high-performance scalable CPU computing environment for the development of next-generation point dipole PFFs and for production simulations. Strategies linking Tinker-HP to quantum mechanics (QM) in the framework of multiscale polarizable self-consistent QM/MD simulations are also provided. The possibilities, performance and scalability of the software are demonstrated via benchmark calculations using the polarizable AMOEBA force field on systems ranging from large water boxes of increasing size and ionic liquids to (very) large biosystems encompassing several proteins as well as the complete satellite tobacco mosaic virus and ribosome structures. For small systems, Tinker-HP appears to be competitive with the Tinker-OpenMM GPU implementation of Tinker. As the system size grows, Tinker-HP remains operational thanks to its access to distributed memory, and its new algorithms enable stable long-timescale polarizable simulations.
Overall, a several thousand-fold acceleration over
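
    The 3D spatial decomposition mentioned above can be illustrated with a serial toy: the simulation box is cut into a grid of subdomains and each atom is assigned to the rank owning its cell. The box size, grid shape and names are invented; the real code distributes this over MPI ranks.

```python
# Toy 3D spatial decomposition: map each atom to the rank that owns the
# subdomain containing it (pure-Python stand-in for the MPI version).
def owner_rank(pos, box, grid):
    """Map a 3D position to the rank owning its grid cell."""
    ix, iy, iz = (min(int(p / (b / g)), g - 1)
                  for p, b, g in zip(pos, box, grid))
    gx, gy, gz = grid
    return ix + gx * (iy + gy * iz)  # row-major rank numbering

def decompose(atoms, box=(10.0, 10.0, 10.0), grid=(2, 2, 2)):
    """Group atom indices by owning rank."""
    domains = {}
    for i, pos in enumerate(atoms):
        domains.setdefault(owner_rank(pos, box, grid), []).append(i)
    return domains

atoms = [(1.0, 1.0, 1.0), (6.0, 1.0, 1.0), (6.0, 6.0, 6.0)]
doms = decompose(atoms)  # each rank then computes forces for its atoms
```

    Because short-range interactions involve mostly nearby atoms, each rank communicates only with neighboring subdomains, which is what lets such decompositions scale to thousands of cores.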

  13. Collaboratively Architecting a Scalable and Adaptable Petascale Infrastructure to Support Transdisciplinary Scientific Research for the Australian Earth and Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Evans, B. J. K.; Pugh, T.; Lescinsky, D. T.; Foster, C.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) at the Australian National University (ANU) is a partnership between CSIRO, ANU, Bureau of Meteorology (BoM) and Geoscience Australia. Recent investments in a 1.2 PFlop Supercomputer (Raijin), ~ 20 PB data storage using Lustre filesystems and a 3000 core high performance cloud have created a hybrid platform for higher performance computing and data-intensive science to enable large scale earth and climate systems modelling and analysis. There are > 3000 users actively logging in and > 600 projects on the NCI system. Efficiently scaling and adapting data and software systems to petascale infrastructures requires the collaborative development of an architecture that is designed, programmed and operated to enable users to interactively invoke different forms of in-situ computation over complex and large scale data collections. NCI makes available major and long tail data collections from both the government and research sectors based on six themes: 1) weather, climate and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology and 6) astronomy, bio and social. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. Collections are the operational form for data management and access. Similar data types from individual custodians are managed cohesively. Use of international standards for discovery and interoperability allow complex interactions within and between the collections. This design facilitates a transdisciplinary approach to research and enables a shift from small scale, 'stove-piped' science efforts to large scale, collaborative systems science. This new and complex infrastructure requires a move to shared, globally trusted software frameworks that can be maintained and updated. 
Workflow engines become essential and need to integrate provenance, versioning, traceability, repeatability

  14. Why the Petascale era will drive improvements in the management of the full lifecycle of earth science data.

    NASA Astrophysics Data System (ADS)

    Wyborn, L.

    2012-04-01

    The advent of the petascale era, in both storage and compute facilities, will offer new opportunities for earth scientists to transform the way they do their science and to undertake cross-disciplinary science at a global scale. No longer will data have to be averaged and subsampled: they can be analysed at their fullest resolution at national or even global scales. Much larger data volumes can be analysed in single passes and at higher resolution: large-scale cross-domain science is now feasible. However, in general, the earth sciences have been slow to capitalise on the potential of these new petascale compute facilities: many struggle even to use terascale facilities. Using these new facilities well will require a vast improvement in the management of the full life cycle of data: in reality, it will need to be transformed. Many of our current issues with earth science data are historic and stem from the limitations of early data storage systems. Because storage was so expensive, metadata were usually stored separately from the data and attached as a readme file. Likewise, attributes that defined uncertainty, reliability and traceability were recorded in lab notebooks and rarely stored with the data. Data were routinely transferred as files. These new opportunities mean that the traditional discover, display, download and locally process paradigm is too limited. For data access and assimilation to be improved, data will need to be self-describing. For heterogeneous data to be rapidly integrated, attributes such as reliability, uncertainty and traceability will need to be systematically recorded with each observation. The petascale era also requires that individual data files be transformed and aggregated into calibrated data arrays or data cubes. Standards become critical and are the enablers of integration. These changes are common to almost every science discipline.
What makes earth sciences unique is that many domains record time series data, particularly in the

  15. Experiment-scale molecular simulation study of liquid crystal thin films

    NASA Astrophysics Data System (ADS)

    Nguyen, Trung Dac; Carrillo, Jan-Michael Y.; Matheson, Michael A.; Brown, W. Michael

    2014-03-01

    Supercomputers have now reached a performance level adequate for studying thin films with molecular detail at the relevant scales. By exploiting the power of GPU accelerators on Titan, we have been able to perform simulations of characteristic liquid crystal films that provide remarkable qualitative agreement with experimental images. We have demonstrated that key features of spinodal instability can only be observed with sufficiently large system sizes, which were not accessible with previous simulation studies. Our study emphasizes the capability and significance of petascale simulations in providing molecular-level insights in thin film systems as well as other interfacial phenomena.

  16. Recent advancements in medical simulation: patient-specific virtual reality simulation.

    PubMed

    Willaert, Willem I M; Aggarwal, Rajesh; Van Herzeele, Isabelle; Cheshire, Nicholas J; Vermassen, Frank E

    2012-07-01

    Patient-specific virtual reality simulation (PSVR) is a new technological advancement that allows practice of upcoming real operations and complements the established role of VR simulation as a generic training tool. This review describes current developments in PSVR and draws parallels with other high-stakes industries, such as aviation, military, and sports. A review of the literature was performed using PubMed and Internet search engines to retrieve data relevant to PSVR in medicine. All reports pertaining to PSVR were included. Reports on simulators that did not incorporate a haptic interface device were excluded from the review. Fifteen reports described 12 simulators that enabled PSVR. Medical procedures in the field of laparoscopy, vascular surgery, orthopedics, neurosurgery, and plastic surgery were included. In all cases, the source data was two-dimensional CT or MRI data. Face validity was most commonly reported. Only one (vascular) simulator had undergone face, content, and construct validation. Of the 12 simulators, 1 is commercialized and 11 are prototypes. Five simulators have been used in conjunction with real patient procedures. PSVR is a promising technological advance within medicine. The majority of simulators are still in the prototype phase. As further developments unfold, the validity of PSVR will have to be examined much like generic VR simulation for training purposes. Nonetheless, similar to the aviation, military, and sport industries, operative performance and patient safety may be enhanced by the application of this novel technology.

  17. Petascale Many Body Methods for Complex Correlated Systems

    NASA Astrophysics Data System (ADS)

    Pruschke, Thomas

    2012-02-01

    Correlated systems constitute an important class of materials in modern condensed matter physics. Correlations among electrons are at the heart of all ordering phenomena and many intriguing novel aspects, such as quantum phase transitions or topological insulators, observed in a variety of compounds. Yet, theoretically describing these phenomena is still a formidable task, even if one restricts the models used to the smallest possible set of degrees of freedom. Here, modern computer architectures play an essential role, and the joint effort to devise efficient algorithms and implement them on state-of-the-art hardware has become an extremely active field in condensed-matter research. To tackle this task single-handed is quite obviously not possible. The NSF-OISE funded PIRE collaboration ``Graduate Education and Research in Petascale Many Body Methods for Complex Correlated Systems'' is a successful initiative to bring together leading experts around the world to form a virtual international organization for addressing these emerging challenges and educating the next generation of computational condensed matter physicists. The collaboration includes research groups developing novel theoretical tools to reliably and systematically study correlated solids, experts in efficient computational algorithms needed to solve the emerging equations, and those able to use modern heterogeneous computer architectures to make them working tools for the growing community.

  18. Simulator design for advanced ISDN satellite design and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerald R.

    1992-01-01

    This simulation design task completion report documents the simulation techniques associated with the network models of both the Interim Service ISDN (integrated services digital network) Satellite (ISIS) and the Full Service ISDN Satellite (FSIS) architectures. The ISIS network model design represents satellite systems like the Advanced Communication Technology Satellite (ACTS) orbiting switch. The FSIS architecture, the ultimate aim of this element of the Satellite Communications Applications Research (SCAR) program, moves all control and switching functions on-board the next generation ISDN communication satellite. The technical and operational parameters for the advanced ISDN communications satellite design will be obtained from the simulation of ISIS and FSIS engineering software models for their major subsystems. Discrete events simulation experiments will be performed with these models using various traffic scenarios, design parameters and operational procedures. The data from these simulations will be used to determine the engineering parameters for the advanced ISDN communications satellite.

  19. Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.

    2016-01-01

    Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.

  20. Tri-FAST Hardware-in-the-Loop Simulation. Volume I. Tri-FAST Hardware-in-the-Loop Simulation at the Advanced Simulation Center

    DTIC Science & Technology

    1979-03-28

    TECHNICAL REPORT T-79-43. Tri-FAST Hardware-in-the-Loop Simulation, Volume 1: Tri-FAST Hardware-in-the-Loop Simulation at the Advanced Simulation Center. Keywords: Tri-FAST; hardware-in-the-loop simulation; Advanced Simulation Center; ACSL; RF target models. The purpose of this report is to document the Tri-FAST missile simulation development and the seeker hardware-in-the

  1. Damaris: Addressing performance variability in data management for post-petascale simulations

    DOE PAGES

    Dorier, Matthieu; Antoniu, Gabriel; Cappello, Franck; ...

    2016-10-01

    With exascale computing on the horizon, reducing performance variability in data management tasks (storage, visualization, analysis, etc.) is becoming a key challenge in sustaining high performance. This variability significantly impacts the overall application performance at scale and its predictability over time. In this article, we present Damaris, a system that leverages dedicated cores in multicore nodes to offload data management tasks, including I/O, data compression, scheduling of data movements, in situ analysis, and visualization. We evaluate Damaris with the CM1 atmospheric simulation and the Nek5000 computational fluid dynamics simulation on four platforms, including NICS's Kraken and NCSA's Blue Waters. Our results show that (1) Damaris fully hides the I/O variability as well as all I/O-related costs, thus making simulation performance predictable; (2) it increases the sustained write throughput by a factor of up to 15 compared with standard I/O approaches; (3) it allows almost perfect scalability of the simulation up to over 9,000 cores, as opposed to state-of-the-art approaches that fail to scale; and (4) it enables a seamless connection to the VisIt visualization software to perform in situ analysis and visualization in a way that impacts neither the performance of the simulation nor its variability. In addition, we extended our implementation of Damaris to also support the use of dedicated nodes and conducted a thorough comparison of the two approaches, dedicated cores and dedicated nodes, for I/O tasks with the aforementioned applications.
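
    The dedicated-core idea above can be mimicked in miniature with a queue and a worker thread standing in for the reserved core: compute steps hand off their output and return immediately, so write latency never blocks the simulation loop. All names here are illustrative; Damaris itself uses shared memory within each multicore node.

```python
import queue
import threading

# A thread stands in for Damaris's dedicated I/O core: it drains the
# queue and performs the (slow, variable) writes off the critical path.
io_queue = queue.Queue()
bytes_written = []

def io_worker():
    total = 0
    while True:
        item = io_queue.get()
        if item is None:          # shutdown sentinel
            break
        total += len(item)        # stand-in for an actual file write
    bytes_written.append(total)

worker = threading.Thread(target=io_worker)
worker.start()

for step in range(5):             # simulation time steps
    io_queue.put(b"x" * 1024)     # non-blocking hand-off of output data

io_queue.put(None)
worker.join()
```

    Decoupling the hand-off from the write is what hides I/O variability from the simulation; the dedicated-node variant moves the same worker onto separate nodes.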

  3. Advanced ST plasma scenario simulations for NSTX

    NASA Astrophysics Data System (ADS)

    Kessel, C. E.; Synakowski, E. J.; Bell, M. E.; Gates, D. A.; Harvey, R. W.; Kaye, S. M.; Mau, T. K.; Menard, J.; Phillips, C. K.; Taylor, G.; Wilson, R.; NSTX Research Team

    2005-08-01

    Integrated scenario simulations are done for NSTX that address four primary objectives for developing advanced spherical torus (ST) configurations: high β and high βN inductive discharges to study all aspects of ST physics in the high β regime; non-inductively sustained discharges for flattop times greater than the skin time to study the various current drive techniques; non-inductively sustained discharges at high β for flattop times much greater than a skin time, which provides the integrated advanced ST target for NSTX; and non-solenoidal startup and plasma current rampup. The simulations done here use the tokamak simulation code and are based on discharge 109070. TRANSP analysis of the discharge provided the thermal diffusivities for electrons and ions, the neutral beam deposition profile and other characteristics. CURRAY is used to calculate the high harmonic fast wave (HHFW) heating depositions and current drive. GENRAY/CQL3D is used to establish the heating and CD deposition profiles for electron Bernstein waves (EBW). Analysis of the ideal MHD stability is done with JSOLVER, BALMSC and PEST2. The simulations indicate that the integrated advanced ST plasma is reachable, obtaining stable plasmas with βT ≈ 40% at βN values of 7.7-9, IP = 1.0 MA and BT = 0.35 T. The plasma is 100% non-inductive and has a flattop of four skin times. The resulting global energy confinement corresponds to a multiplier of H98(y),2 = 1.5. The simulations have demonstrated the importance of HHFW heating and CD, EBW off-axis CD, strong plasma shaping, density control and early heating/H-mode transition for producing and optimizing these plasma configurations.

  4. SciDAC GSEP: Gyrokinetic Simulation of Energetic Particle Turbulence and Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Zhihong

    Energetic particle (EP) confinement is a key physics issue for the burning plasma experiment ITER, the crucial next step in the quest for clean and abundant energy, since ignition relies on self-heating by energetic fusion products (α-particles). Due to the strong coupling of EP with burning thermal plasmas, plasma confinement in the ignition regime is one of the most uncertain factors when extrapolating from existing fusion devices to the ITER tokamak. EP populations in current tokamaks are mostly produced by auxiliary heating such as neutral beam injection (NBI) and radio frequency (RF) heating. Remarkable progress in developing comprehensive EP simulation codes and understanding basic EP physics has been made by two concurrent SciDAC EP projects, GSEP, funded by the Department of Energy (DOE) Office of Fusion Energy Science (OFES), which have successfully established gyrokinetic turbulence simulation as a necessary paradigm shift for studying EP confinement in burning plasmas. Verification and validation have rapidly advanced through close collaborations between simulation, theory, and experiment. Furthermore, productive collaborations with computational scientists have enabled EP simulation codes to effectively utilize current petascale computers and emerging exascale computers. We review here key physics progress in the GSEP projects regarding verification and validation of gyrokinetic simulations, nonlinear EP physics, EP coupling with thermal plasmas, and reduced EP transport models. Advances in high performance computing through collaborations with computational scientists that enable these large-scale electromagnetic simulations are also highlighted. These results have been widely disseminated in numerous peer-reviewed publications, including many Phys. Rev. Lett. papers, and many invited presentations at prominent fusion conferences such as the biennial International Atomic Energy Agency (IAEA) Fusion Energy Conference and the annual meeting

  5. Simulation training in neurosurgery: advances in education and practice

    PubMed Central

    Konakondla, Sanjay; Fong, Reginald; Schirmer, Clemens M

    2017-01-01

    The current simulation technology used for neurosurgical training leaves much to be desired. Significant effort has been invested in hopes of developing simulations that give learners a “real-life” feel. Though a respectable goal, this may not be necessary, as simulation in neurosurgical training may be most useful for early learners. The ultimate, uniformly agreeable endpoint of improved outcomes and patient safety drives these investments. We explore the development, availability, educational taskforces, cost burdens and simulation advancements in neurosurgical training. The technologies can be directed at achieving the early resident milestones set by the Accreditation Council for Graduate Medical Education. We discuss various aspects of neurosurgery disciplines with specific technologic advances of simulation software. An overview of the scholarly landscape of recent publications in the realm of medical simulation and virtual reality pertaining to neurologic surgery is provided. We analyze concurrent concept overlap between PubMed headings and provide a graphical overview of the associations between these terms. PMID:28765716

  6. MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.

    2016-01-01

    MADNESS (multiresolution adaptive numerical environment for scientific simulation) is a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.

  7. Final Report for "Implementation and Evaluation of Multigrid Linear Solvers into Extended Magnetohydrodynamic Codes for Petascale Computing"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinath Vadlamani; Scott Kruger; Travis Austin

    Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel-processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL, via PETSc from the DOE SciDAC TOPS project, for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We implemented the multigrid solvers on the fusion test problem that allows for real matrix systems with success, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
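
    To see why multigrid scales where simple iterative solvers stall, here is a textbook V-cycle for the 1-D Poisson model problem, with a weighted-Jacobi smoother, full-weighting restriction, and linear interpolation. This is a pedagogical sketch, not NIMROD or HYPRE code; the extended MHD operators are far harder, which is exactly why the project used HYPRE's preconditioners via PETSc:

```python
import math

def residual(u, f, h):
    """r = f - A u for A = (1/h^2) tridiag(-1, 2, -1), zero boundaries."""
    n = len(u)
    r = [0.0] * n
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        r[i] = f[i] - (-left + 2.0 * u[i] - right) / h**2
    return r

def jacobi(u, f, h, sweeps=2, omega=2.0 / 3.0):
    """Weighted Jacobi: damps the high-frequency error components."""
    for _ in range(sweeps):
        r = residual(u, f, h)
        for i in range(len(u)):
            u[i] += omega * h**2 / 2.0 * r[i]
    return u

def v_cycle(u, f, h):
    """One recursive V-cycle; len(u) must be 2^k - 1."""
    n = len(u)
    if n == 1:                            # coarsest grid: solve exactly
        u[0] = h**2 / 2.0 * f[0]
        return u
    jacobi(u, f, h)                       # pre-smooth
    r = residual(u, f, h)
    nc = (n - 1) // 2                     # full-weighting restriction
    rc = [(r[2 * j] + 2.0 * r[2 * j + 1] + r[2 * j + 2]) / 4.0
          for j in range(nc)]
    ec = v_cycle([0.0] * nc, rc, 2.0 * h)
    e = [0.0] * n                         # linear interpolation to fine grid
    for j in range(nc):
        e[2 * j + 1] = ec[j]
        e[2 * j] += 0.5 * ec[j]
        e[2 * j + 2] += 0.5 * ec[j]
    for i in range(n):
        u[i] += e[i]
    jacobi(u, f, h)                       # post-smooth
    return u
```

    Each V-cycle reduces the error by a grid-independent factor, which is the property that makes multigrid attractive as a petascale preconditioner.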

  8. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

    The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  9. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Advanced Simulation H Appendix H to Part... Simulation Link to an amendment published at 78 FR 67846, Nov. 12, 2013. This appendix provides guidelines... Simulation Training Program For an operator to conduct Level C or D training under this appendix all required...

  10. MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation

    DOE PAGES

    Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.; ...

    2016-01-01

    We present MADNESS (multiresolution adaptive numerical environment for scientific simulation), a high-level software environment for solving integral and differential equations in many dimensions using adaptive and fast harmonic analysis methods with guaranteed precision, based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.

  11. Advanced Computer Simulations of Military Incinerators

    DTIC Science & Technology

    2004-12-01

    Reaction Engineering International (REI) has developed advanced computer simulation tools for analyzing chemical demilitarization incinerators.

  12. From petascale to exascale, the future of simulated climate data (Invited)

    NASA Astrophysics Data System (ADS)

    Lawrence, B.; Juckes, M. N.

    2013-12-01

    Coleridge ought to have said: data, data, everywhere, and all the data centres groan; data, data everywhere, nor any I should clone. Except of course, he didn't say it, and we do clone data! While we've been dealing with terabytes of simulated datasets, downloading ("cloning") and analysing has been a plausible way forward. In doing so, we have set up systems that support four broad classes of activities: personal and institutional data analysis, federated data systems, and data portals. We use metadata to manage the migration of data between these (and their communities) and we have built software systems. However, our metadata and software solutions are fragile, often based on soft money and loose governance arrangements. We often download data with minimal provenance, and many of us download the same data. In the not too distant future we can imagine exabytes of data being produced, and all these problems will get worse. Arguably we have no plausible methods of effectively exploiting such data, particularly if the analysis requires intercomparison. Yet of course, we know full well that intercomparison is at the heart of climate science. In this talk, we review the current status of simulation data management, with special emphasis on accessibility and usability. We talk about file formats, bundles of files (real and virtual), and simulation metadata. We introduce the InfraStructure for the European Network for Earth Simulation (IS-ENES) and its relationship with the Earth System Grid Federation (ESGF), as well as JASMIN, the UK Joint Analysis System. There will be a small digression on parallel data analysis, locally and distributed. We then progress to the near-term problems (and solutions) for climate data before scoping out the problems of the future, both for data handling and for the models that produce the data. The way we think about data, computing, models, even ensemble design, may need to change.

  13. Simulating advanced life support systems to test integrated control approaches

    NASA Astrophysics Data System (ADS)

    Kortenkamp, D.; Bell, S.

    Simulations allow for testing of life support control approaches before hardware is designed and built. Simulations also allow for the safe exploration of alternative control strategies during life support operation. As such, they are an important component of any life support research program and testbed. This paper describes a specific advanced life support simulation being created at NASA Johnson Space Center. It is a discrete-event simulation that is dynamic and stochastic. It simulates all major components of an advanced life support system, including crew (with variable ages, weights and genders), biomass production (with scalable plantings of ten different crops), water recovery, air revitalization, food processing, solid waste recycling and energy production. Each component is modeled as a producer of certain resources and a consumer of certain resources. The control system must monitor (via sensors) and control (via actuators) the flow of resources throughout the system to provide life support functionality. The simulation is written in an object-oriented paradigm that makes it portable, extensible and reconfigurable.
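
    The producer/consumer resource model described above can be sketched in a few lines. This is an illustrative toy, not the JSC simulation; the component names, resources, and rates are invented, and a component simply fires each tick when its inputs are available:

```python
class Component:
    """A life-support element that consumes and produces resources each tick."""
    def __init__(self, name, consumes, produces):
        self.name = name
        self.consumes = consumes    # resource -> amount required per tick
        self.produces = produces    # resource -> amount generated per tick

def step(stores, components):
    """Advance one tick: a component fires only if its inputs are on hand."""
    for c in components:
        if all(stores.get(r, 0.0) >= amt for r, amt in c.consumes.items()):
            for r, amt in c.consumes.items():
                stores[r] -= amt
            for r, amt in c.produces.items():
                stores[r] = stores.get(r, 0.0) + amt
    return stores
```

    A controller in the real simulation would sit on top of this loop, reading store levels through simulated sensors and throttling component rates through simulated actuators.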

  14. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M&S environments and infrastructure.

  15. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Advanced Simulation H Appendix H to Part... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Pt. 121, App. H Appendix H to Part 121—Advanced... ensure that all instructors and check airmen used in appendix H training and checking are highly...

  16. Multi-dimensional simulations of core-collapse supernova explosions with CHIMERA

    NASA Astrophysics Data System (ADS)

    Messer, O. E. B.; Harris, J. A.; Hix, W. R.; Lentz, E. J.; Bruenn, S. W.; Mezzacappa, A.

    2018-04-01

    Unraveling the core-collapse supernova (CCSN) mechanism is a problem that remains essentially unsolved despite more than four decades of effort. Spherically symmetric models with otherwise high physical fidelity generally fail to produce explosions, and it is widely accepted that CCSNe are inherently multi-dimensional. Progress in realistic modeling has occurred recently through the availability of petascale platforms and the increasing sophistication of supernova codes. We will discuss our most recent work on understanding neutrino-driven CCSN explosions employing multi-dimensional neutrino-radiation hydrodynamics simulations with the Chimera code. We discuss the inputs and resulting outputs from these simulations, the role of neutrino radiation transport, and the importance of multi-dimensional fluid flows in shaping the explosions. We also highlight the production of 48Ca in long-running Chimera simulations.

  17. SSAGES: Software Suite for Advanced General Ensemble Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.
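
    To make "enhanced sampling" concrete, here is a minimal metadynamics-style example on a 1-D double well, in the spirit of the history-dependent bias methods SSAGES wraps. It is not SSAGES code; the potential V(x) = (x² - 1)², hill size, and time step are invented for illustration:

```python
import math
import random

def metadynamics(steps, seed=1, w=0.1, sigma=0.2, dt=0.02, kT=0.1):
    """Minimal 1-D metadynamics on the double well V(x) = (x^2 - 1)^2.

    Gaussian 'hills' are deposited along the trajectory; their repulsive
    force gradually fills the starting well until the walker escapes.
    """
    rng = random.Random(seed)
    hills = []                     # centers of deposited Gaussians

    def bias_force(x):
        # Force from the bias sum_c w * exp(-(x - c)^2 / (2 sigma^2))
        return sum(w * (x - c) / sigma**2 *
                   math.exp(-(x - c) ** 2 / (2.0 * sigma**2)) for c in hills)

    x = -1.0                       # start in the left well
    visited = [x]
    for step in range(steps):
        force = -4.0 * x * (x * x - 1.0) + bias_force(x)   # -dV/dx + bias
        x += dt * force + math.sqrt(2.0 * kT * dt) * rng.gauss(0.0, 1.0)
        if step % 25 == 0:
            hills.append(x)        # deposit a hill at the current position
        visited.append(x)
    return visited, hills
```

    An unbiased walker at this temperature would rarely cross the barrier at x = 0; the accumulated bias drives the crossing, and the deposited hills record (the negative of) the free energy surface.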

  18. Quantum Monte Carlo Endstation for Petascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lubos Mitas

    2011-01-26

    The NCSU research group has been focused on accomplishing the key goals of this initiative: establishing a new generation of quantum Monte Carlo (QMC) computational tools as part of the Endstation petaflop initiative for use at the DOE ORNL computational facilities and by the computational electronic structure community at large; carrying out high-accuracy quantum Monte Carlo demonstration projects applying these tools to forefront electronic structure problems in molecular and solid systems; expanding the impact of QMC methods and approaches; and explaining and enhancing the impact of these advanced computational approaches. In particular, we have developed the quantum Monte Carlo code QWalk (www.qwalk.org), which was significantly expanded and optimized using funds from this support and at present has become an actively used tool in the petascale regime by ORNL researchers and beyond. These developments have been built upon efforts undertaken by the PI's group and collaborators over the period of the last decade. The code was optimized and tested extensively on a number of parallel architectures including the petaflop ORNL Jaguar machine. We have developed and redesigned a number of code modules such as evaluation of wave functions and orbitals, calculation of pfaffians and introduction of backflow coordinates, together with the overall organization of the code and random-walker distribution over multicore architectures. We have addressed several bottlenecks such as load balancing and verified the efficiency and accuracy of the calculations with the other groups of the Endstation team. The QWalk package contains about 50,000 lines of high-quality object-oriented C++ and also includes interfaces to data files from other conventional electronic structure codes such as Gamess, Gaussian, Crystal and others. This grant supported the PI for one month during summers, a full-time postdoc and, partially, three graduate students over the period of the grant duration; it has resulted in
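
    The flavor of variational quantum Monte Carlo, which codes like QWalk implement at vastly larger scale, can be shown with a toy Metropolis sampler for the 1-D harmonic oscillator with trial wavefunction ψ(x) = exp(-αx²). This is a pedagogical sketch under those assumptions, not QWalk code:

```python
import math
import random

def local_energy(x, alpha):
    """E_L = -(1/2) psi''/psi + x^2/2 for the trial psi = exp(-alpha x^2)."""
    return alpha + x * x * (0.5 - 2.0 * alpha * alpha)

def vmc_energy(alpha, n_steps=20000, step=1.0, seed=7):
    """Metropolis sampling of |psi|^2, averaging the local energy."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n_steps):
        trial = x + rng.uniform(-step, step)
        # log of the acceptance ratio |psi(trial)|^2 / |psi(x)|^2
        log_ratio = -2.0 * alpha * (trial * trial - x * x)
        if log_ratio >= 0.0 or rng.random() < math.exp(log_ratio):
            x = trial
        total += local_energy(x, alpha)
    return total / n_steps
```

    At α = 0.5 the trial function is the exact ground state, the local energy is constant, and the estimator has zero variance; for other α the analytic expectation is α/2 + 1/(8α), which the sampler approaches statistically. Production QMC distributes millions of such walkers over many cores, which is why the method parallelizes so well.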

  19. Center for Advanced Modeling and Simulation Intern

    ScienceCinema

    Gertman, Vanessa

    2017-12-13

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  20. Center for Advanced Modeling and Simulation Intern

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gertman, Vanessa

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  1. SSAGES: Software Suite for Advanced General Ensemble Simulations.

    PubMed

    Sidky, Hythem; Colón, Yamil J; Helfferich, Julian; Sikora, Benjamin J; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S; Reid, Daniel R; Sevgen, Emre; Thapar, Vikram; Webb, Michael A; Whitmer, Jonathan K; de Pablo, Juan J

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques-including adaptive biasing force, string methods, and forward flux sampling-that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  2. SSAGES: Software Suite for Advanced General Ensemble Simulations

    NASA Astrophysics Data System (ADS)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  3. Multi-dimensional simulations of core-collapse supernova explosions with CHIMERA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Messer, Bronson; Harris, James Austin; Hix, William Raphael

    Unraveling the core-collapse supernova (CCSN) mechanism is a problem that remains essentially unsolved despite more than four decades of effort. Spherically symmetric models with otherwise high physical fidelity generally fail to produce explosions, and it is widely accepted that CCSNe are inherently multi-dimensional. Progress in realistic modeling has occurred recently through the availability of petascale platforms and the increasing sophistication of supernova codes. We will discuss our most recent work on understanding neutrino-driven CCSN explosions employing multi-dimensional neutrino-radiation hydrodynamics simulations with the Chimera code. We discuss the inputs and resulting outputs from these simulations, the role of neutrino radiation transport, and the importance of multi-dimensional fluid flows in shaping the explosions. We also highlight the production of 48Ca in long-running Chimera simulations.

  4. Alignment and Initial Operation of an Advanced Solar Simulator

    NASA Technical Reports Server (NTRS)

    Jaworske, Donald A.; Jefferies, Kent S.; Mason, Lee S.

    1996-01-01

    A solar simulator utilizing nine 30-kW xenon arc lamps was built to provide radiant power for testing a solar dynamic space power system in a thermal vacuum environment. The advanced solar simulator achieved the following values specific to the solar dynamic system: (1) a subtense angle of 1 deg; (2) the ability to vary solar simulator intensity up to 1.7 kW/sq m; (3) a beam diameter of 4.8 m; and (4) uniformity of illumination on the order of +/-10%. The flexibility of the solar simulator design allows for other potential uses of the facility.

  5. Advanced Helmet Mounted Display (AHMD) for simulator applications

    NASA Astrophysics Data System (ADS)

    Sisodia, Ashok; Riser, Andrew; Bayer, Michael; McGuire, James P.

    2006-05-01

    The Advanced Helmet Mounted Display (AHMD), an augmented reality visual system first presented at last year's Cockpit and Future Displays for Defense and Security conference, has now been evaluated in a number of military simulator applications and by L-3 Link Simulation and Training. This paper presents the preliminary results of these evaluations and describes current and future simulator and training applications for HMD technology. The AHMD blends computer-generated data (symbology, synthetic imagery, enhanced imagery) with the actual and simulated visible environment. The AHMD is designed specifically for highly mobile, deployable, minimum-resource-demanding, reconfigurable virtual training systems to satisfy the military's in-theater warrior readiness objective. A description of the innovative AHMD system and future enhancements will be discussed.

  6. UCLA Final Technical Report for the "Community Petascale Project for Accelerator Science and Simulation”.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mori, Warren

    The UCLA Plasma Simulation Group is a major partner of the "Community Petascale Project for Accelerator Science and Simulation". This is the final technical report. We include an overall summary, a list of publications, progress for the most recent year, and individual progress reports for each year. We have made tremendous progress during the three years. SciDAC funds have contributed to the development of a large number of skeleton codes that illustrate how to write PIC codes with a hierarchy of parallelism. These codes cover 2D and 3D as well as electrostatic solvers (which are used in beam dynamics codes and quasi-static codes) and electromagnetic solvers (which are used in plasma based accelerator codes). We also used these ideas to develop a GPU enabled version of OSIRIS. SciDAC funds also contributed to the development of strategies to eliminate the Numerical Cerenkov Instability (NCI), which is an issue when carrying out laser wakefield accelerator (LWFA) simulations in a boosted frame and when quantifying the emittance and energy spread of self-injected electron beams. This work included the development of a new code called UPIC-EMMA, which is an FFT-based electromagnetic PIC code, and of new hybrid algorithms in OSIRIS. A new hybrid (PIC in r-z and gridless in φ) algorithm was implemented into OSIRIS. In this algorithm the fields and current are expanded into azimuthal harmonics and the complex amplitude for each harmonic is calculated separately. The contributions from each harmonic are summed and then used to push the particles. This algorithm permits modeling plasma based acceleration with some 3D effects but with the computational load of a 2D r-z PIC code. We developed a rigorously charge conserving current deposit for this algorithm. Very recently, we made progress in combining the speed up from the quasi-3D algorithm with that from the Lorentz boosted frame. SciDAC funds also contributed to the improvement and speed up of the quasi

  7. Stochastic simulation of uranium migration at the Hanford 300 Area.

    PubMed

    Hammond, Glenn E; Lichtner, Peter C; Rockhold, Mark L

    2011-03-01

    This work focuses on the quantification of groundwater flow and subsequent U(VI) transport uncertainty due to heterogeneity in the sediment permeability at the Hanford 300 Area. U(VI) migration at the site is simulated with multiple realizations of stochastically-generated high resolution permeability fields and comparisons are made of cumulative water and U(VI) flux to the Columbia River. The massively parallel reactive flow and transport code PFLOTRAN is employed utilizing 40,960 processor cores on DOE's petascale Jaguar supercomputer to simultaneously execute 10 transient, variably-saturated groundwater flow and U(VI) transport simulations within 3D heterogeneous permeability fields using the code's multi-realization simulation capability. Simulation results demonstrate that the cumulative U(VI) flux to the Columbia River is less responsive to fine scale heterogeneity in permeability and more sensitive to the distribution of permeability within the river hyporheic zone and mean permeability of larger-scale geologic structures at the site. Copyright © 2010 Elsevier B.V. All rights reserved.
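
    The multi-realization idea (running many flow simulations over stochastically generated permeability fields and aggregating flux statistics) can be miniaturized as follows. This toy reduces the 3-D PFLOTRAN problem to 1-D Darcy flow through cells in series, where the effective permeability is the harmonic mean; all names and parameters are invented:

```python
import math
import random
import statistics

def realization_flux(n_cells, dp, seed, sigma=1.0):
    """One realization: 1-D Darcy flow through n_cells in series.

    Cell permeabilities are log-normal; cells in series combine as a
    harmonic mean, so low-permeability cells dominate the result.
    """
    rng = random.Random(seed)
    perm = [math.exp(rng.gauss(0.0, sigma)) for _ in range(n_cells)]
    k_eff = n_cells / sum(1.0 / k for k in perm)   # harmonic mean
    return k_eff * dp                              # flux ~ k_eff * pressure drop

def ensemble_flux(n_realizations, n_cells=100, dp=1.0):
    """Run every realization and report ensemble statistics of the flux."""
    fluxes = [realization_flux(n_cells, dp, seed)
              for seed in range(n_realizations)]
    return statistics.mean(fluxes), statistics.stdev(fluxes)
```

    The paper's multi-realization capability does essentially this at scale: each realization is an independent transient flow-and-transport simulation, so tens of them can run concurrently across tens of thousands of cores.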

  8. Petascale turbulence simulation using a highly parallel fast multipole method on GPUs

    NASA Astrophysics Data System (ADS)

    Yokota, Rio; Barba, L. A.; Narumi, Tetsu; Yasuoka, Kenji

    2013-03-01

    This paper reports large-scale direct numerical simulations of homogeneous isotropic fluid turbulence, achieving sustained performance of 1.08 petaflop/s on GPU hardware using single precision. The simulations use a vortex particle method to solve the Navier-Stokes equations, with a highly parallel fast multipole method (FMM) as the numerical engine, and match the current record in mesh size for this application, a cube of 4096³ computational points solved with a spectral method. The standard numerical approach used in this field is the pseudo-spectral method, relying on the FFT algorithm as the numerical engine. The particle-based simulations presented in this paper quantitatively match the kinetic energy spectrum obtained with a pseudo-spectral method, using a trusted code. In terms of parallel performance, weak scaling results show the FMM-based vortex method achieving 74% parallel efficiency on 4096 processes (one GPU per MPI process, 3 GPUs per node of the TSUBAME-2.0 system). The FFT-based spectral method is able to achieve just 14% parallel efficiency on the same number of MPI processes (using only CPU cores), due to the all-to-all communication pattern of the FFT algorithm. The calculation time for one time step was 108 s for the vortex method and 154 s for the spectral method, under these conditions. Computing with 69 billion particles, this work exceeds by an order of magnitude the largest vortex-method calculations to date.

  9. Optimizing STEM Education with Advanced ICTs and Simulations

    ERIC Educational Resources Information Center

    Levin, Ilya, Ed.; Tsybulsky, Dina, Ed.

    2017-01-01

    The role of technology in educational settings has become increasingly prominent in recent years. When utilized effectively, these tools provide a higher quality of learning for students. "Optimizing STEM Education With Advanced ICTs and Simulations" is an innovative reference source for the latest scholarly research on the integration…

  10. Preface to advances in numerical simulation of plasmas

    NASA Astrophysics Data System (ADS)

    Parker, Scott E.; Chacon, Luis

    2016-10-01

    This Journal of Computational Physics Special Issue, titled "Advances in Numerical Simulation of Plasmas," presents a snapshot of the international state of the art in the field of computational plasma physics. The articles herein are a subset of the topics presented as invited talks at the 24th International Conference on the Numerical Simulation of Plasmas (ICNSP), August 12-14, 2015 in Golden, Colorado. The choice of papers was highly selective. The ICNSP is held every other year and is the premier scientific meeting in the field of computational plasma physics.

  11. Hybrid and electric advanced vehicle systems (heavy) simulation

    NASA Technical Reports Server (NTRS)

    Hammond, R. A.; Mcgehee, R. K.

    1981-01-01

    A computer program to simulate hybrid and electric advanced vehicle systems (HEAVY) is described. It is intended for use early in the design process: concept evaluation, alternative comparison, preliminary design, control and management strategy development, component sizing, and sensitivity studies. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train. The user defines the system to be simulated using a library of predefined component models that may be connected to represent a wide variety of propulsion systems. As examples, the development of three component models is discussed.

  12. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.

    PubMed

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.
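    The directed-communication idea described above, in which a compute node sends spikes only to the nodes that actually host targets of its neurons, can be illustrated with a toy routing table. The names and data layout here are hypothetical, not NEST's actual connection infrastructure:

```python
from collections import defaultdict

def build_directed_routing(connections, neuron_to_rank):
    # For each rank, record only the ranks that host targets of its neurons,
    # so spike exchange can be point-to-point instead of all-to-all. With
    # brain-scale sparsity, each rank's target set is far smaller than the
    # total number of ranks.
    routing = defaultdict(set)
    for source, target in connections:
        routing[neuron_to_rank[source]].add(neuron_to_rank[target])
    return routing
```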

  13. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    PubMed Central

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems. PMID:29503613

  14. Interim Service ISDN Satellite (ISIS) simulator development for advanced satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The simulation development associated with the network models of both the Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) and the Full Service ISDN Satellite (FSIS) architectures is documented. The ISIS Network Model design represents satellite systems like the Advanced Communications Technology Satellite (ACTS) orbiting switch. The FSIS architecture, the ultimate aim of this element of the Satellite Communications Applications Research (SCAR) Program, moves all control and switching functions on-board the next generation ISDN communications satellite. The technical and operational parameters for the advanced ISDN communications satellite design will be obtained from the simulation of ISIS and FSIS engineering software models for their major subsystems. Discrete event simulation experiments will be performed with these models using various traffic scenarios, design parameters, and operational procedures. The data from these simulations will be used to determine the engineering parameters for the advanced ISDN communications satellite.

  15. Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer.

    PubMed

    Schulz, Roland; Lindner, Benjamin; Petridis, Loukas; Smith, Jeremy C

    2009-10-13

    A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three million- and five million-atom biological systems scale well up to ∼30k cores, producing ∼30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.
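    The reaction-field idea the abstract relies on can be written down in a few lines. The sketch below uses the common textbook (GROMACS-style) form of the pairwise energy, not necessarily the exact variant used in the study; note that the energy is constructed to vanish continuously at the cutoff, which is what makes a plain cutoff scheme scale so well:

```python
def reaction_field_energy(q1, q2, r, r_cut, eps_rf=78.0, f_coul=138.935458):
    # Pairwise Coulomb interaction with a reaction-field correction: beyond
    # the cutoff the solvent is treated as a dielectric continuum (eps_rf),
    # so no Ewald-style lattice sum is needed. Units follow the GROMACS
    # convention (nm, e, kJ/mol); parameters are illustrative.
    if r >= r_cut:
        return 0.0
    k_rf = (eps_rf - 1.0) / (2.0 * eps_rf + 1.0) / r_cut ** 3
    c_rf = 1.0 / r_cut + k_rf * r_cut ** 2
    return f_coul * q1 * q2 * (1.0 / r + k_rf * r ** 2 - c_rf)
```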

  16. Advances in free-energy-based simulations of protein folding and ligand binding.

    PubMed

    Perez, Alberto; Morrone, Joseph A; Simmerling, Carlos; Dill, Ken A

    2016-02-01

    Free-energy-based simulations are increasingly providing the narratives about the structures, dynamics and biological mechanisms that constitute the fabric of protein science. Here, we review two recent successes. It is becoming practical, first, to fold small proteins with free-energy methods without knowing substructures and, second, to compute ligand-protein binding affinities, not just their binding poses. Over the past 40 years, the timescales that can be simulated by atomistic MD have been doubling every 1.3 years, which is faster than Moore's law. Thus, these advances are not simply due to the availability of faster computers. Force fields, solvation models and simulation methodology have kept pace with computing advancements, and are now quite good. At the tip of the spear recently are GPU-based computing, improved fast-solvation methods, continued advances in force fields, and conformational sampling methods that harness external information. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Tinker-HP: a massively parallel molecular dynamics package for multiscale simulations of large complex systems with advanced point dipole polarizable force fields† †Electronic supplementary information (ESI) available. See DOI: 10.1039/c7sc04531j

    PubMed Central

    Lagardère, Louis; Jolly, Luc-Henri; Lipparini, Filippo; Aviat, Félix; Stamm, Benjamin; Jing, Zhifeng F.; Harger, Matthew; Torabifard, Hedieh; Cisneros, G. Andrés; Schnieders, Michael J.; Gresh, Nohad; Maday, Yvon; Ren, Pengyu Y.; Ponder, Jay W.

    2017-01-01

    We present Tinker-HP, a massively MPI-parallel package dedicated to classical molecular dynamics (MD) and to multiscale simulations, using advanced polarizable force fields (PFF) encompassing distributed multipole electrostatics. Tinker-HP is an evolution of the popular Tinker package that preserves its simplicity of use and its reference double-precision implementation for CPUs. Grounded in interdisciplinary efforts with applied mathematics, Tinker-HP allows for long polarizable MD simulations on large systems of up to millions of atoms. We detail in the paper the newly developed extension of massively parallel 3D spatial decomposition to point dipole polarizable models, as well as their coupling to efficient Krylov iterative and non-iterative polarization solvers. The design of the code allows the use of various computer systems ranging from laboratory workstations to modern petascale supercomputers with thousands of cores. Tinker-HP therefore provides the first high-performance scalable CPU computing environment for the development of next-generation point dipole PFFs and for production simulations. Strategies linking Tinker-HP to quantum mechanics (QM) in the framework of multiscale polarizable self-consistent QM/MD simulations are also provided. The possibilities, performance, and scalability of the software are demonstrated via benchmark calculations using the polarizable AMOEBA force field on systems ranging from large water boxes of increasing size and ionic liquids to (very) large biosystems encompassing several proteins as well as the complete satellite tobacco mosaic virus and ribosome structures. For small systems, Tinker-HP appears to be competitive with the Tinker-OpenMM GPU implementation of Tinker. As the system size grows, Tinker-HP remains operational thanks to its access to distributed memory and takes advantage of its new algorithms enabling stable long-timescale polarizable simulations.
Overall, a several thousand-fold acceleration over
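    The polarization solve at the heart of such point-dipole force fields is a linear system for the induced dipoles. The toy solver below uses plain fixed-point iteration on scalar polarizabilities and an arbitrary small coupling matrix; Tinker-HP itself employs efficient Krylov and non-iterative solvers, so this is only a sketch of the problem being solved:

```python
def solve_induced_dipoles(alpha, coupling, e0, tol=1e-10, max_iter=500):
    # Self-consistent induced dipoles: mu_i = alpha_i * (e0_i + sum_j T_ij mu_j),
    # solved by successive substitution. Converges when the coupling is weak;
    # all quantities here are scalar toys, purely illustrative.
    n = len(alpha)
    mu = [alpha[i] * e0[i] for i in range(n)]
    for _ in range(max_iter):
        nxt = [alpha[i] * (e0[i] + sum(coupling[i][j] * mu[j] for j in range(n)))
               for i in range(n)]
        if max(abs(nxt[i] - mu[i]) for i in range(n)) < tol:
            return nxt
        mu = nxt
    return mu
```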

  18. Probabilistic Photometric Redshifts in the Era of Petascale Astronomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carrasco Kind, Matias

    2014-01-01

    crucial to enable the development of precision cosmology in the era of petascale astronomical surveys.

  19. Monte Carlo simulation models of breeding-population advancement.

    Treesearch

    J.N. King; G.R. Johnson

    1993-01-01

    Five generations of population improvement were modeled using Monte Carlo simulations. The model was designed to address questions that are important to the development of an advanced-generation breeding population. Specifically, we addressed the effects on both gain and effective population size of different mating schemes when creating a recombinant population for...
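    A minimal version of such a breeding-population simulation can be sketched as follows. The genetic model and every parameter are illustrative, not those of the study: truncation selection picks the parents, and each progeny receives the mid-parent value plus a non-inherited deviation, so the population mean (the gain) drifts upward generation by generation:

```python
import random
import statistics

def advance_generation(pop, n_parents, rng):
    # Truncation-select the best breeding values, mate parents at random,
    # and give each progeny the mid-parent value plus environmental noise.
    parents = sorted(pop, reverse=True)[:n_parents]
    progeny = []
    for _ in range(len(pop)):
        p1, p2 = rng.sample(parents, 2)
        progeny.append((p1 + p2) / 2.0 + rng.gauss(0.0, 0.7))
    return progeny

def simulate_breeding(generations=5, pop_size=200, n_parents=20, seed=3):
    rng = random.Random(seed)
    pop = [rng.gauss(0.0, 1.0) for _ in range(pop_size)]
    mean_by_generation = [statistics.mean(pop)]
    for _ in range(generations):
        pop = advance_generation(pop, n_parents, rng)
        mean_by_generation.append(statistics.mean(pop))
    return mean_by_generation
```

Tracking effective population size under different mating schemes, the study's second question, would add bookkeeping of parental contributions on top of this loop.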

  20. State of the Art Assessment of Simulation in Advanced Materials Development

    NASA Technical Reports Server (NTRS)

    Wise, Kristopher E.

    2008-01-01

    Advances in both the underlying theory and in the practical implementation of molecular modeling techniques have increased their value in the advanced materials development process. The objective is to accelerate the maturation of emerging materials by tightly integrating modeling with the other critical processes: synthesis, processing, and characterization. The aims of this report are to summarize the state of the art of existing modeling tools and to highlight a number of areas in which additional development is required. In an effort to maintain focus and limit length, this survey is restricted to classical simulation techniques including molecular dynamics and Monte Carlo simulations.

  1. Psychometric and Evidentiary Advances, Opportunities, and Challenges for Simulation-Based Assessment

    ERIC Educational Resources Information Center

    Levy, Roy

    2013-01-01

    This article characterizes the advances, opportunities, and challenges for psychometrics of simulation-based assessments through a lens that views assessment as evidentiary reasoning. Simulation-based tasks offer the prospect for student experiences that differ from traditional assessment. Such tasks may be used to support evidentiary arguments…

  2. Evaluation of the Inertial Response of Variable-Speed Wind Turbines Using Advanced Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholbrock, Andrew K; Muljadi, Eduard; Gevorgian, Vahan

    In this paper, we focus on the temporary frequency support effect provided by wind turbine generators (WTGs) through the inertial response. With the implemented inertial control methods, the WTG is capable of increasing its active power output by releasing part of the stored kinetic energy when a frequency excursion occurs. The active power can be boosted temporarily above the maximum power point, but rotor speed deceleration follows, and an active power output deficiency occurs during the restoration of rotor kinetic energy. We evaluate and compare the inertial response induced by two distinct inertial control methods using advanced simulation. In the first stage, the proposed inertial control methods are analyzed in offline simulation. Using an advanced wind turbine simulation program, FAST with TurbSim, the response of the researched wind turbine is comprehensively evaluated under turbulent wind conditions, and the impact on the turbine mechanical components is assessed. In the second stage, the inertial control is deployed on a real 600 kW wind turbine, the 3-bladed Controls Advanced Research Turbine (CART3), which further verifies the inertial control through a hardware-in-the-loop (HIL) simulation. Various inertial control methods can be effectively evaluated based on the proposed two-stage simulation platform, which combines offline simulation and real-time HIL simulation. The simulation results also provide insights into designing inertial control for WTGs.
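    The inertial-response mechanism described above, boosting active power at the expense of rotor kinetic energy, can be sketched with a toy df/dt controller. All parameter values are illustrative, not CART3 or FAST quantities:

```python
import math

def simulate_inertial_boost(freq, dt, rotor_inertia=5.0e5, omega0=2.0, k_df=1.0e5):
    # Synthetic-inertia sketch: command extra power proportional to -df/dt and
    # draw it from rotor kinetic energy (0.5 * J * omega^2), so the rotor
    # decelerates while boosting, exactly the trade-off the abstract describes.
    omega = omega0
    history = []
    for i in range(1, len(freq)):
        dfdt = (freq[i] - freq[i - 1]) / dt
        p_boost = max(0.0, -k_df * dfdt)      # respond to under-frequency only
        omega = math.sqrt(max(0.0, omega ** 2 - 2.0 * p_boost * dt / rotor_inertia))
        history.append((p_boost, omega))
    return history
```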

  3. Numerical characterization of landing gear aeroacoustics using advanced simulation and analysis techniques

    NASA Astrophysics Data System (ADS)

    Redonnet, S.; Ben Khelil, S.; Bulté, J.; Cunha, G.

    2017-09-01

    With the objective of aircraft noise mitigation, we here address the numerical characterization of the aeroacoustics by a simplified nose landing gear (NLG), through the use of advanced simulation and signal processing techniques. To this end, the NLG noise physics is first simulated through an advanced hybrid approach, which relies on Computational Fluid Dynamics (CFD) and Computational AeroAcoustics (CAA) calculations. Compared to more traditional hybrid methods (e.g. those relying on the use of an Acoustic Analogy), and although it is used here with some approximations made (e.g. design of the CFD-CAA interface), the present approach does not rely on restrictive assumptions (e.g. equivalent noise source, homogeneous propagation medium), which allows to incorporate more realism into the prediction. In a second step, the outputs coming from such CFD-CAA hybrid calculations are processed through both traditional and advanced post-processing techniques, thus offering to further investigate the NLG's noise source mechanisms. Among other things, this work highlights how advanced computational methodologies are now mature enough to not only simulate realistic problems of airframe noise emission, but also to investigate their underlying physics.

  4. Co-Simulation for Advanced Process Design and Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephen E. Zitney

    2009-01-01

    Meeting the increasing demand for clean, affordable, and secure energy is arguably the most important challenge facing the world today. Fossil fuels can play a central role in a portfolio of carbon-neutral energy options provided CO{sub 2} emissions can be dramatically reduced by capturing CO{sub 2} and storing it safely and effectively. The fossil energy industry faces the challenge of meeting aggressive design goals for next-generation power plants with CCS. Process designs will involve large, highly integrated, and multipurpose systems with advanced equipment items with complex geometries and multiphysics. APECS is enabling software that facilitates effective integration, solution, and analysis of high-fidelity process/equipment (CFD) co-simulations. APECS helps to optimize fluid flow and related phenomena that impact overall power plant performance. APECS offers many advanced capabilities including ROMs, design optimization, parallel execution, stochastic analysis, and virtual plant co-simulations. NETL and its collaborative R&D partners are using APECS to reduce the time, cost, and technical risk of developing high-efficiency, zero-emission power plants with CCS.

  5. Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schulz, Roland; Lindner, Benjamin; Petridis, Loukas

    2009-01-01

    A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three million- and five million-atom biological systems scale well up to ~30k cores, producing ~30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.

  6. Man-vehicle systems research facility advanced aircraft flight simulator throttle mechanism

    NASA Technical Reports Server (NTRS)

    Kurasaki, S. S.; Vallotton, W. C.

    1985-01-01

    The Advanced Aircraft Flight Simulator is equipped with a motorized mechanism that simulates a two-engine throttle control system that can be operated via a computer-driven performance management system or manually by the pilots. The throttle control system incorporates features to simulate normal engine operations and thrust reverse, and varies the force feel to meet a variety of research needs. Additional integration and testing work remains, principally in software design, since the mechanical aspects function correctly. The mechanism is an important part of the flight control system and provides the capability to conduct human factors research on flight crews with advanced aircraft systems under various flight conditions, such as go-arounds, coupled instrument flight rule approaches, normal and ground operations, and emergencies that would or would not normally be experienced in actual flight.

  7. The Effects of Time Advance Mechanism on Simple Agent Behaviors in Combat Simulations

    DTIC Science & Technology

    2011-12-01

    modeling packages that illustrate the differences between discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat... DES models, often referred to as "next-event" (Law and Kelton 2000), or discrete time simulation (DTS), commonly referred to as "time-step." DTS... discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat models use DTS as their simulation time advance mechanism
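    The distinction the report draws can be made concrete with two minimal simulation drivers, a fixed-step DTS loop and an event-queue DES loop. This is a generic sketch, not the DTIC combat models themselves:

```python
import heapq

def run_dts(horizon, dt, step_fn, state):
    # Discrete-time ("time-step"): the clock advances by a fixed dt whether or
    # not anything interesting happens during the interval.
    t = 0.0
    while t < horizon:
        step_fn(state, t)
        t += dt
    return state

def run_des(initial_events, handler, state):
    # Discrete-event ("next-event"): the clock jumps directly to the earliest
    # pending event; the handler may schedule further (time, name) events.
    queue = list(initial_events)
    heapq.heapify(queue)
    while queue:
        t, name = heapq.heappop(queue)
        for ev in handler(state, t, name):
            heapq.heappush(queue, ev)
    return state
```

The choice of time-advance mechanism changes when agent behaviors are evaluated, which is exactly why the two mechanisms can produce different emergent behavior from the same simple agents.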

  8. Advanced Simulation of Coupled Earthquake and Tsunami Events

    NASA Astrophysics Data System (ADS)

    Behrens, Joern

    2013-04-01

    Tsunami-Earthquakes represent natural catastrophes threatening lives and well-being of societies in a solitary and unexpected extreme event as tragically demonstrated in Sumatra (2004), Samoa (2009), Chile (2010), or Japan (2011). Both phenomena are consequences of the complex system of interactions of tectonic stress, fracture mechanics, rock friction, rupture dynamics, fault geometry, ocean bathymetry, and coastline geometry. The ASCETE project forms an interdisciplinary research consortium that couples the most advanced simulation technologies for earthquake rupture dynamics and tsunami propagation to understand the fundamental conditions of tsunami generation. We report on the latest research results in physics-based dynamic rupture and tsunami wave propagation simulation, using unstructured and adaptive meshes with continuous and discontinuous Galerkin discretization approaches. Coupling both simulation tools - the physics-based dynamic rupture simulation and the hydrodynamic tsunami wave propagation - will give us the possibility to conduct highly realistic studies of the interaction of rupture dynamics and tsunami impact characteristics.

  9. High-Fidelity Simulation for Advanced Cardiac Life Support Training

    PubMed Central

    Davis, Lindsay E.; Storjohann, Tara D.; Spiegel, Jacqueline J.; Beiber, Kellie M.

    2013-01-01

    Objective. To determine whether a high-fidelity simulation technique compared with lecture would produce greater improvement in advanced cardiac life support (ACLS) knowledge, confidence, and overall satisfaction with the training method. Design. This sequential, parallel-group, crossover trial randomized students into 2 groups distinguished by the sequence of teaching technique delivered for ACLS instruction (ie, classroom lecture vs high-fidelity simulation exercise). Assessment. Test scores on a written examination administered at baseline and after each teaching technique improved significantly from baseline in all groups but were highest when lecture was followed by simulation. Simulation was associated with a greater degree of overall student satisfaction compared with lecture. Participation in a simulation exercise did not improve pharmacy students’ knowledge of ACLS more than attending a lecture, but it was associated with improved student confidence in skills and satisfaction with learning and application. Conclusions. College curricula should incorporate simulation to complement but not replace lecture for ACLS education. PMID:23610477

  10. High-fidelity simulation for advanced cardiac life support training.

    PubMed

    Davis, Lindsay E; Storjohann, Tara D; Spiegel, Jacqueline J; Beiber, Kellie M; Barletta, Jeffrey F

    2013-04-12

    OBJECTIVE. To determine whether a high-fidelity simulation technique compared with lecture would produce greater improvement in advanced cardiac life support (ACLS) knowledge, confidence, and overall satisfaction with the training method. DESIGN. This sequential, parallel-group, crossover trial randomized students into 2 groups distinguished by the sequence of teaching technique delivered for ACLS instruction (ie, classroom lecture vs high-fidelity simulation exercise). ASSESSMENT. Test scores on a written examination administered at baseline and after each teaching technique improved significantly from baseline in all groups but were highest when lecture was followed by simulation. Simulation was associated with a greater degree of overall student satisfaction compared with lecture. Participation in a simulation exercise did not improve pharmacy students' knowledge of ACLS more than attending a lecture, but it was associated with improved student confidence in skills and satisfaction with learning and application. CONCLUSIONS. College curricula should incorporate simulation to complement but not replace lecture for ACLS education.

  11. The Osseus platform: a prototype for advanced web-based distributed simulation

    NASA Astrophysics Data System (ADS)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.

  12. Open-Source Integrated Design-Analysis Environment For Nuclear Energy Advanced Modeling & Simulation Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The framework created through the Open-Source Integrated Design-Analysis Environment (IDAE) for Nuclear Energy Advanced Modeling & Simulation grant has simplified and democratized advanced modeling and simulation in the nuclear energy industry across a range of nuclear engineering applications. It leverages millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and the Office of Nuclear Energy's research and development. The IDAE framework enhanced Kitware’s Computational Model Builder (CMB) while leveraging existing open-source toolkits and creating a graphical end-to-end umbrella guiding end-users and developers through the nuclear energy advanced modeling and simulation lifecycle. In addition, the work delivered strategic advancements in meshing and visualization for ensembles.

  13. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meisner, Robert; McCoy, Michel; Archer, Bill

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive

  14. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  15. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  16. The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering

    NASA Technical Reports Server (NTRS)

    Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen

    2006-01-01

This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered "to identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains 5 sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-based Robotics Manufacture and Servicing Models.

  17. Technical Advancements in Simulator-Based Weapons Team Training.

    DTIC Science & Technology

    1991-04-01

Special Report 91-003. H. C. Okraski, Acting Head/Director, Advanced Simulation Concepts Research and Engineering Division, 12350 Research Parkway, Orlando, FL 32826-3224. The research and development reported here represents one phase of a broader effort to improve the

  18. Preliminary simulation of an advanced, hingless rotor XV-15 tilt-rotor aircraft

    NASA Technical Reports Server (NTRS)

    Mcveigh, M. A.

    1976-01-01

The feasibility of the tilt-rotor concept was verified through investigation of the performance, stability and handling qualities of the XV-15 tilt rotor. The rotors were replaced by advanced-technology fiberglass/composite hingless rotors of larger diameter, combined with an advanced integrated fly-by-wire control system. A parametric simulation model of the HRXV-15 was developed, and the model was used to define acceptable preliminary ranges of primary and secondary control schedules as functions of the flight parameters, to evaluate performance, flying qualities and structural loads, and to support a simulated flight test evaluation of the aircraft by a Boeing-Vertol pilot.

  19. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

This paper presents advanced computer graphic techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots and security applications. The cost of the measurement system is extremely high, therefore a simulation tool was designed. The simulation makes it possible to exercise algorithms such as obstacle avoidance[1], SLAM for robot localization[2], detection of vegetation and water obstacles in the surroundings of the robot chassis[3], and LRF measurement in a crowd of people[1]. An Axis-Aligned Bounding Box (AABB) approach and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.
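The AABB technique named in the abstract is, in its classic form, a slab-method ray/box intersection test, which is the core operation when casting a simulated LRF beam against box-shaped obstacles. The sketch below is a plain-Python illustration of that standard test, not the authors' implementation:

```python
def ray_aabb_intersect(origin, direction, box_min, box_max):
    """Slab-method ray/AABB intersection.

    Returns the entry distance along the ray (the simulated LRF range
    reading) or None if the beam misses the box.
    """
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            # Ray is parallel to this slab: it must start inside it.
            if o < lo or o > hi:
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far:
                return None
    return t_near

# One simulated beam along +x hitting a 1 m box whose near face is 2 m away:
hit = ray_aabb_intersect((0, 0, 0), (1, 0, 0), (2, -0.5, -0.5), (3, 0.5, 0.5))
# hit == 2.0 (metres)
```

A real LRF simulator would sweep this test over the sensor's angular field of view, one ray per angular step, and keep the nearest hit per ray.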

  20. A Distributed Simulation Facility to Support Human Factors Research in Advanced Air Transportation Technology

    NASA Technical Reports Server (NTRS)

    Amonlirdviman, Keith; Farley, Todd C.; Hansman, R. John, Jr.; Ladik, John F.; Sherer, Dana Z.

    1998-01-01

A distributed real-time simulation of the civil air traffic environment developed to support human factors research in advanced air transportation technology is presented. The distributed environment is based on a custom simulation architecture designed for simplicity and flexibility in human experiments. Standard Internet protocols are used to create the distributed environment, linking an advanced cockpit simulator, an Air Traffic Control simulator, and a pseudo-aircraft control and simulation management station. The pseudo-aircraft control station also functions as a scenario design tool for coordinating human factors experiments. This station incorporates a pseudo-pilot interface designed to reduce workload for human operators piloting multiple aircraft simultaneously in real time. The application of this distributed simulation facility to support a study of the effect of shared information (via air-ground datalink) on pilot/controller shared situation awareness and re-route negotiation is also presented.

  1. Using 100G Network Technology in Support of Petascale Science

    NASA Technical Reports Server (NTRS)

    Gary, James P.

    2011-01-01

NASA, in collaboration with a number of partners, conducted a set of individual experiments and demonstrations during SC 10 that collectively were titled "Using 100G Network Technology in Support of Petascale Science". The partners included iCAIR, Internet2, LAC, MAX, National LambdaRail (NLR), NOAA and the SCinet Research Sandbox (SRS), as well as the vendors Ciena, Cisco, ColorChip, cPacket, Extreme Networks, Fusion-io, HP and Panduit, who most generously allowed some of their leading-edge 40G/100G optical transport, Ethernet switch and Internet Protocol router equipment and file server technologies to be involved. The experiments and demonstrations featured different vendor-provided 40G/100G network technology solutions for full-duplex 40G and 100G LAN data flows across SRS-deployed single-mode fiber pairs among the exhibit booths of NASA, the National Center for Data Mining, NOAA and the SCinet Network Operations Center, as well as between the NASA exhibit booth in New Orleans and the StarLight communications exchange facility in Chicago across special SC 10-only 80- and 100-Gbps wide-area network links provisioned respectively by NLR and Internet2, then on to GSFC across a 40-Gbps link provisioned by the Mid-Atlantic Crossroads. The networks and vendor equipment were load-stressed by sets of NASA/GSFC High End Computer Network Team-built, relatively inexpensive net-test-workstations capable of demonstrating greater than 100-Gbps uni-directional nuttcp-enabled memory-to-memory data transfers, greater than 80-Gbps aggregate bidirectional memory-to-memory data transfers, and near 40-Gbps uni-directional disk-to-disk file copying. This paper summarizes the background context, key accomplishments and significance of these experiments and demonstrations.

  2. PoPLAR: Portal for Petascale Lifescience Applications and Research

    PubMed Central

    2013-01-01

    Background We are focusing specifically on fast data analysis and retrieval in bioinformatics that will have a direct impact on the quality of human health and the environment. The exponential growth of data generated in biology research, from small atoms to big ecosystems, necessitates an increasingly large computational component to perform analyses. Novel DNA sequencing technologies and complementary high-throughput approaches--such as proteomics, genomics, metabolomics, and meta-genomics--drive data-intensive bioinformatics. While individual research centers or universities could once provide for these applications, this is no longer the case. Today, only specialized national centers can deliver the level of computing resources required to meet the challenges posed by rapid data growth and the resulting computational demand. Consequently, we are developing massively parallel applications to analyze the growing flood of biological data and contribute to the rapid discovery of novel knowledge. Methods The efforts of previous National Science Foundation (NSF) projects provided for the generation of parallel modules for widely used bioinformatics applications on the Kraken supercomputer. We have profiled and optimized the code of some of the scientific community's most widely used desktop and small-cluster-based applications, including BLAST from the National Center for Biotechnology Information (NCBI), HMMER, and MUSCLE; scaled them to tens of thousands of cores on high-performance computing (HPC) architectures; made them robust and portable to next-generation architectures; and incorporated these parallel applications in science gateways with a web-based portal. Results This paper will discuss the various developmental stages, challenges, and solutions involved in taking bioinformatics applications from the desktop to petascale with a front-end portal for very-large-scale data analysis in the life sciences. Conclusions This research will help to bridge the gap

  3. Evaluation of the Inertial Response of Variable-Speed Wind Turbines Using Advanced Simulation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholbrock, Andrew K; Muljadi, Eduard; Gevorgian, Vahan

In this paper, we focus on the temporary frequency support effect provided by wind turbine generators (WTGs) through the inertial response. With the implemented inertial control methods, the WTG is capable of increasing its active power output by releasing part of the stored kinetic energy when a frequency excursion occurs. The active power can be boosted temporarily above the maximum power point, but rotor speed deceleration follows, and an active power output deficiency occurs during the restoration of rotor kinetic energy. In this paper, we evaluate and compare the inertial response induced by two distinct inertial control methods using advanced simulation. In the first stage, the proposed inertial control methods are analyzed in offline simulation. Using an advanced wind turbine simulation program, FAST with TurbSim, the response of the researched wind turbine is comprehensively evaluated under turbulent wind conditions, and the impact on the turbine mechanical components is assessed. In the second stage, the inertial control is deployed on a real 600-kW wind turbine, the three-bladed Controls Advanced Research Turbine, which further verifies the inertial control through a hardware-in-the-loop simulation. Various inertial control methods can be effectively evaluated based on the proposed two-stage simulation platform, which combines offline simulation and real-time hardware-in-the-loop simulation. The simulation results also provide insights into designing inertial control for WTGs.
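The energy bookkeeping behind this trade-off can be illustrated with a small sketch: the extra energy delivered during the boost, P·Δt, comes out of the rotor's kinetic energy ½·J·ω², so the rotor must slow down. All numbers below are illustrative placeholders, not values from the paper:

```python
import math

def rotor_speed_after_boost(J, omega0, extra_power, dt):
    """Rotor speed after an inertial-response boost.

    Solves 0.5*J*omega0**2 - extra_power*dt = 0.5*J*omega1**2 for omega1.
    Returns None if the requested energy exceeds what the rotor stores.
    """
    remaining = 0.5 * J * omega0**2 - extra_power * dt
    if remaining <= 0.0:
        return None
    return math.sqrt(2.0 * remaining / J)

# Illustrative rotor: J = 4.4e6 kg*m^2, initial speed 2.0 rad/s,
# boosting output by 60 kW for 10 s.
w1 = rotor_speed_after_boost(4.4e6, 2.0, 60e3, 10.0)
# w1 is slightly below 2.0 rad/s; the deficit must later be restored,
# which is the power-deficiency phase the abstract describes.
```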

  4. Graphics simulation and training aids for advanced teleoperation

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Schenker, Paul S.; Bejczy, Antal K.

    1993-01-01

    Graphics displays can be of significant aid in accomplishing a teleoperation task throughout all three phases of off-line task analysis and planning, operator training, and online operation. In the first phase, graphics displays provide substantial aid to investigate work cell layout, motion planning with collision detection and with possible redundancy resolution, and planning for camera views. In the second phase, graphics displays can serve as very useful tools for introductory training of operators before training them on actual hardware. In the third phase, graphics displays can be used for previewing planned motions and monitoring actual motions in any desired viewing angle, or, when communication time delay prevails, for providing predictive graphics overlay on the actual camera view of the remote site to show the non-time-delayed consequences of commanded motions in real time. This paper addresses potential space applications of graphics displays in all three operational phases of advanced teleoperation. Possible applications are illustrated with techniques developed and demonstrated in the Advanced Teleoperation Laboratory at JPL. The examples described include task analysis and planning of a simulated Solar Maximum Satellite Repair task, a novel force-reflecting teleoperation simulator for operator training, and preview and predictive displays for on-line operations.

  5. Numerical characteristics of quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

The simulation of quantum circuits is critically important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the use of modern high-performance parallel computation is relevant. As is well known, an arbitrary quantum computation in the circuit model can be composed of only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. The unique properties of quantum physics lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while, on the other hand, quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability and more specific dynamic characteristics) parts. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for the research and testing of development methods for data-intensive parallel software, and the considered analysis methodology can be successfully used for the improvement of algorithms in quantum information science.
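The single-qubit-gate kernel whose parallelism and memory locality such an analysis studies can be written as a tensor contraction over one axis of the state vector; a minimal NumPy illustration (not the authors' implementation) follows:

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to the `target` qubit of an n-qubit state vector.

    Reshaping to shape (2,)*n exposes the target axis; the contraction
    touches pairs of amplitudes whose stride depends on `target`, which
    is exactly the locality issue entanglement forces onto the simulator.
    """
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)  # restore original axis order
    return psi.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
state = np.zeros(4)
state[0] = 1.0                                # |00>
out = apply_single_qubit_gate(state, H, 0, 2)
# out is (|00> + |10>)/sqrt(2): equal amplitudes at indices 0 and 2
```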

  6. Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems

    NASA Technical Reports Server (NTRS)

    Steinthorsson, Erlendur; Modiano, David

    1995-01-01

    Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.

  7. Simulation-trained junior residents perform better than general surgeons on advanced laparoscopic cases.

    PubMed

    Boza, Camilo; León, Felipe; Buckel, Erwin; Riquelme, Arnoldo; Crovari, Fernando; Martínez, Jorge; Aggarwal, Rajesh; Grantcharov, Teodor; Jarufe, Nicolás; Varas, Julián

    2017-01-01

Multiple simulation training programs have demonstrated that effective transfer of skills can be attained and applied in a more complex scenario, but evidence regarding transfer to the operating room is limited. To assess junior residents trained with simulation performing an advanced laparoscopic procedure in the OR and compare results to those of general surgeons without simulation training and expert laparoscopic surgeons. Experimental study: After a validated 16-session advanced laparoscopy simulation training program, junior trainees were compared to general surgeons (GS) with no simulation training and expert bariatric surgeons (BS) in performing a stapled jejuno-jejunostomy (JJO) in the OR. Global rating scale (GRS) and specific rating scale scores, operative time and the distance traveled by both hands, measured with a tracking device, were assessed. In addition, all perioperative and immediate postoperative morbidities were registered. Ten junior trainees, 12 GS and 5 BS experts were assessed performing a JJO in the OR. All trainees completed the entire JJO in the OR without any takeovers by the BS. Six (50 %) BS takeovers took place in the GS group. Trainees had significantly better results in all measured outcomes when compared to GS, with a considerably higher GRS median [19.5 (18.8-23.5) vs. 12 (9-13.8) p < 0.001] and lower operative time. One morbidity was registered; a patient in the trainees group was readmitted at postoperative day 10 for mechanical ileus that resolved with medical treatment. This study demonstrated transfer of advanced laparoscopic skills acquired through a simulated training program in novice surgical residents to the OR.

  8. Development of a VOR/DME model for an advanced concepts simulator

    NASA Technical Reports Server (NTRS)

    Steinmetz, G. G.; Bowles, R. L.

    1984-01-01

The report presents the definition of a VOR/DME airborne and ground-systems simulation model. This description was drafted in response to a need in the creation of an advanced-concepts simulation in which flight station designs for the 1980 era can be postulated and examined. The simulation model described herein provides a reasonable representation of VOR/DME stations in the continental United States, including area coverage by type and noise errors. The detail in which the model has been cast provides the interested researcher with a moderate-fidelity simulator tool for conducting research and evaluation of navigation algorithms. Assumptions made within the development are listed and place certain responsibilities (data bases, communication with other simulation modules, uniform round earth, etc.) upon the researcher.
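A small example of the geometry such a model must capture: DME reports slant range from aircraft to station, so a simulator (or a navigation algorithm under test) converts it to ground distance using aircraft altitude. This flat-earth sketch is purely illustrative and is not taken from the report:

```python
import math

def dme_ground_distance(slant_range_m, altitude_m):
    """Convert a DME slant range to ground distance (flat-earth approximation).

    Near the station the slant range is dominated by altitude; if the
    reported range is below the altitude, the aircraft is effectively
    overhead and the ground distance is taken as zero.
    """
    if slant_range_m <= altitude_m:
        return 0.0
    return math.sqrt(slant_range_m**2 - altitude_m**2)

# 5000 m slant range at 3000 m altitude -> 4000 m over the ground (3-4-5 triangle)
d = dme_ground_distance(5000.0, 3000.0)
```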

  9. CHARMM-GUI PDB manipulator for advanced modeling and simulations of proteins containing nonstandard residues.

    PubMed

    Jo, Sunhwan; Cheng, Xi; Islam, Shahidul M; Huang, Lei; Rui, Huan; Zhu, Allen; Lee, Hui Sun; Qi, Yifei; Han, Wei; Vanommeslaeghe, Kenno; MacKerell, Alexander D; Roux, Benoît; Im, Wonpil

    2014-01-01

CHARMM-GUI, http://www.charmm-gui.org, is a web-based graphical user interface to prepare molecular simulation systems and input files to facilitate the usage of common and advanced simulation techniques. Since it was originally developed in 2006, CHARMM-GUI has been widely adopted for various purposes and now contains a number of different modules designed to set up a broad range of simulations, including free energy calculation and large-scale coarse-grained representation. Here, we describe functionalities that have recently been integrated into CHARMM-GUI PDB Manipulator, such as ligand force field generation, incorporation of methanethiosulfonate spin labels and chemical modifiers, and substitution of amino acids with unnatural amino acids. These new features are expected to be useful in advanced biomolecular modeling and simulation of proteins. © 2014 Elsevier Inc. All rights reserved.

  10. Analysis of PV Advanced Inverter Functions and Setpoints under Time Series Simulation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seuss, John; Reno, Matthew J.; Broderick, Robert Joseph

Utilities are increasingly concerned about the potential negative impacts distributed PV may have on the operational integrity of their distribution feeders. Some have proposed novel methods for controlling a PV system's grid-tie inverter to mitigate potential PV-induced problems. This report investigates the effectiveness of several of these PV advanced inverter controls in improving distribution feeder operational metrics. The controls are simulated on a large PV system interconnected at several locations within two realistic distribution feeder models. Due to the time-domain nature of the advanced inverter controls, quasi-static time series simulations are performed under one week of representative variable irradiance and load data for each feeder. A parametric study is performed on each control type to determine how well certain measurable network metrics improve as a function of the control parameters. This methodology is used to determine appropriate advanced inverter settings for each location on the feeder, and overall for any interconnection location on the feeder.

  11. Advanced simulation study on bunch gap transient effect

    NASA Astrophysics Data System (ADS)

    Kobayashi, Tetsuya; Akai, Kazunori

    2016-06-01

    Bunch phase shift along the train due to a bunch gap transient is a concern in high-current colliders. In KEKB operation, the measured phase shift along the train agreed well with a simulation and a simple analytical form in most part of the train. However, a rapid phase change was observed at the leading part of the train, which was not predicted by the simulation or by the analytical form. In order to understand the cause of this observation, we have developed an advanced simulation, which treats the transient loading in each of the cavities of the three-cavity system of the accelerator resonantly coupled with energy storage (ARES) instead of the equivalent single cavities used in the previous simulation, operating in the accelerating mode. In this paper, we show that the new simulation reproduces the observation, and clarify that the rapid phase change at the leading part of the train is caused by a transient loading in the three-cavity system of ARES. KEKB is being upgraded to SuperKEKB, which is aiming at 40 times higher luminosity than KEKB. The gap transient in SuperKEKB is investigated using the new simulation, and the result shows that the rapid phase change at the leading part of the train is much larger due to higher beam currents. We will also present measures to mitigate possible luminosity reduction or beam performance deterioration due to the rapid phase change caused by the gap transient.

  12. Performance of hybrid programming models for multiscale cardiac simulations: preparing for petascale computation.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-10-01

    Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.

  13. MAESTRO: Methods and Advanced Equipment for Simulation and Treatment in Radio-Oncology

    NASA Astrophysics Data System (ADS)

    Barthe, Jean; Hugon, Régis; Nicolai, Jean Philippe

    2007-12-01

    The integrated project MAESTRO (Methods and Advanced Equipment for Simulation and Treatment in Radio-Oncology) under contract with the European Commission in life sciences FP6 (LSHC-CT-2004-503564), concerns innovative research to develop and validate in clinical conditions, advanced methods and equipment needed in cancer treatment for new modalities in high-conformal external radiotherapy using electrons, photons and protons beams of high energy.

  14. Advanced Simulation in Undergraduate Pilot Training: Systems Integration. Final Report (February 1972-March 1975).

    ERIC Educational Resources Information Center

    Larson, D. F.; Terry, C.

    The Advanced Simulator for Undergraduate Pilot Training (ASUPT) was designed to investigate the role of simulation in the future Undergraduate Pilot Training (UPT) program. The problem addressed in this report was one of integrating two unlike components into one synchronized system. These two components were the Basic T-37 Simulators and their…

  15. The Advanced Gamma-ray Imaging System (AGIS): Simulation Studies

    NASA Astrophysics Data System (ADS)

    Fegan, Stephen; Buckley, J. H.; Bugaev, S.; Funk, S.; Konopelko, A.; Maier, G.; Vassiliev, V. V.; Simulation Studies Working Group; AGIS Collaboration

    2008-03-01

    The Advanced Gamma-ray Imaging System (AGIS) is a concept for the next generation instrument in ground-based very high energy gamma-ray astronomy. It has the goal of achieving significant improvement in sensitivity over current experiments. We present the results of simulation studies of various possible designs for AGIS. The primary characteristics of the array performance, collecting area, angular resolution, background rejection, and sensitivity are discussed.

  16. Simulation of a synergistic six-post motion system on the flight simulator for advanced aircraft at NASA-Ames

    NASA Technical Reports Server (NTRS)

    Bose, S. C.; Parris, B. L.

    1977-01-01

    Motion system drive philosophy and corresponding real-time software have been developed for the purpose of simulating the characteristics of a typical synergistic Six-Post Motion System (SPMS) on the Flight Simulator for Advanced Aircraft (FSAA) at NASA-Ames which is a non-synergistic motion system. This paper gives a brief description of these two types of motion systems and the general methods of producing motion cues of the FSAA. An actuator extension transformation which allows the simulation of a typical SPMS by appropriate drive washout and variable position limiting is described.

  17. Reducing I/O variability using dynamic I/O path characterization in petascale storage systems

    DOE PAGES

    Son, Seung Woo; Sehrish, Saba; Liao, Wei-keng; ...

    2016-11-01

In petascale systems with a million CPU cores, scalable and consistent I/O performance is becoming increasingly difficult to sustain, mainly because of I/O variability. This variability is caused by concurrently running processes/jobs competing for I/O, or by a RAID rebuild when a disk drive fails. We present a mechanism that stripes across a selected subset of I/O nodes with the lightest workload at runtime to achieve the highest I/O bandwidth available in the system. In this paper, we propose a probing mechanism to enable application-level dynamic file striping to mitigate I/O variability. We also implement the proposed mechanism in the high-level I/O library that enables memory-to-file data layout transformation and allows transparent file partitioning using subfiling. Subfiling is a technique that partitions data into a set of smaller files and manages access to them, allowing the data to be treated as a single, normal file by users. We demonstrate that our bandwidth probing mechanism can successfully identify temporarily slower I/O nodes without noticeable runtime overhead. Experimental results on NERSC's systems also show that our approach isolates I/O variability effectively on shared systems and improves overall collective I/O performance with less variation.
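Subfiling as described above can be sketched in a few lines: data is partitioned into smaller pieces on write and reassembled into one logical file on read. This is an in-memory toy under those assumptions, not the paper's I/O-library implementation:

```python
import io

def write_subfiles(data: bytes, n_subfiles: int):
    """Partition a byte buffer into n roughly equal subfiles.

    In a real subfiling layer each piece would go to a different file
    on a different I/O node; here each is just an in-memory buffer.
    """
    chunk = -(-len(data) // n_subfiles)  # ceiling division
    return [io.BytesIO(data[i * chunk:(i + 1) * chunk])
            for i in range(n_subfiles)]

def read_as_single_file(subfiles):
    """Present the subfiles back to the user as one logical file."""
    return b"".join(f.getvalue() for f in subfiles)

payload = bytes(range(256)) * 4
parts = write_subfiles(payload, 3)
assert read_as_single_file(parts) == payload  # transparent to the reader
```

The dynamic-striping idea in the paper then amounts to choosing, at runtime, which I/O nodes receive these pieces based on probed bandwidth.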

  18. Advanced simulation model for IPM motor drive with considering phase voltage and stator inductance

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Myung; Park, Hyun-Jong; Lee, Ju

    2016-10-01

This paper proposes an advanced simulation model of the driving system for Interior Permanent Magnet (IPM) BrushLess Direct Current (BLDC) motors driven by the 120-degree conduction method (two-phase conduction method, TPCM), which is widely used for sensorless control of BLDC motors. BLDC motors can be classified as SPM (Surface-mounted Permanent Magnet) or IPM motors. The simulation model of a driving system with SPM motors is simple, because the stator inductance is constant regardless of the rotor position, and such models have been proposed in many studies. On the other hand, simulation models for IPM driving systems built with graphic-based simulation tools such as Matlab/Simulink have not been proposed. Simulating the driving system of an IPM with TPCM is complex because the stator inductances of an IPM vary with the rotor position, as the permanent magnets are embedded in the rotor. To develop sensorless schemes or improve control performance, developing the control algorithm through simulation study is essential, and a simulation model that accurately reflects the characteristics of the IPM is required. Therefore, this paper presents an advanced simulation model of the IPM driving system that takes into account the position-dependent inductances unique to the IPM. The validity of the proposed simulation model is confirmed by comparison with experimental and simulation results using an IPM under the TPCM control scheme.
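The position-dependent inductance that makes IPM simulation harder than the SPM case is commonly modeled with a two-term saliency expression, L(θ) = L0 + L2·cos 2θ. A minimal sketch with illustrative values (not the paper's parameters or necessarily its inductance model):

```python
import math

def ipm_phase_inductance(theta, L0=0.8e-3, L2=0.2e-3):
    """Position-dependent stator self-inductance of one IPM phase.

    Uses the standard two-term saliency model L(theta) = L0 + L2*cos(2*theta);
    L0 and L2 (henries) are illustrative, not values from the paper.
    For an SPM motor, L2 would be ~0 and L(theta) constant.
    """
    return L0 + L2 * math.cos(2.0 * theta)

# The inductance swings between L0 + L2 and L0 - L2 twice per electrical turn:
Lmax = ipm_phase_inductance(0.0)          # 1.0e-3 H
Lmin = ipm_phase_inductance(math.pi / 2)  # 0.6e-3 H
```

A simulation model of the drive would evaluate this (and the corresponding mutual terms) at every step from the rotor position, which is exactly the coupling the paper's model must capture.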

  19. Advanced radiometric and interferometric millimeter-wave scene simulations

    NASA Technical Reports Server (NTRS)

    Hauss, B. I.; Moffa, P. J.; Steele, W. G.; Agravante, H.; Davidheiser, R.; Samec, T.; Young, S. K.

    1993-01-01

    Smart munitions and weapons utilize various imaging sensors (including passive IR, active and passive millimeter-wave, and visible wavebands) to detect/identify targets at short standoff ranges and in varied terrain backgrounds. In order to design and evaluate these sensors under a variety of conditions, a high-fidelity scene simulation capability is necessary. Such a capability for passive millimeter-wave scene simulation exists at TRW. TRW's Advanced Radiometric Millimeter-Wave Scene Simulation (ARMSS) code is a rigorous, benchmarked, end-to-end passive millimeter-wave scene simulation code for interpreting millimeter-wave data, establishing scene signatures and evaluating sensor performance. In passive millimeter-wave imaging, resolution is limited due to wavelength and aperture size. Where high resolution is required, the utility of passive millimeter-wave imaging is confined to short ranges. Recent developments in interferometry have made possible high resolution applications on military platforms. Interferometry or synthetic aperture radiometry allows the creation of a high resolution image with a sparsely filled aperture. Borrowing from research work in radio astronomy, we have developed and tested at TRW scene reconstruction algorithms that allow the recovery of the scene from a relatively small number of spatial frequency components. In this paper, the TRW modeling capability is described and numerical results are presented.

  20. Final report for Texas A&M University Group Contribution to DE-FG02-09ER25949/DE-SC0002505: Topology for Statistical Modeling of Petascale Data (and ASCR-funded collaboration between Sandia National Labs, Texas A&M University and University of Utah)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rojas, Joseph Maurice

    We summarize the contributions of the Texas A&M University Group to the project (DE-FG02-09ER25949/DE-SC0002505: Topology for Statistical Modeling of Petascale Data - an ASCR-funded collaboration between Sandia National Labs, Texas A&M U, and U Utah) during 6/9/2011 -- 2/27/2013.

  1. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE PAGES

    Huang, Qiuhua; Vittal, Vijay

    2018-05-09

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission-system-level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend hybrid simulation to distribution systems and to integrated transmission and distribution systems. It is equally important to improve simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increase. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach with two main features: 1) a comprehensive phasor-domain modeling framework that supports positive-sequence, three-sequence, three-phase, and mixed three-sequence/three-phase representations, and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode, which significantly improves simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. The results show that, with the developed mode-switching feature, total computational time is significantly reduced compared with running the hybrid simulation for the whole simulation period, while good simulation accuracy is maintained.
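
    The efficiency argument behind mode switching can be sketched with a toy two-rate time-stepping loop: a small EMT-like step inside a disturbance window and a large phasor-like step elsewhere. The step sizes, window, and function name are illustrative assumptions, not the paper's algorithm:

```python
def hybrid_two_rate_steps(t_end=10.0, dt_fast=1e-4, dt_slow=1e-2,
                          fast_window=(1.0, 2.0)):
    """Count integration steps for a two-rate ('hybrid') schedule.

    A small EMT-like step is used only inside the disturbance
    window; a large phasor-like step is used elsewhere. This is
    why switching back to the phasor mode after the transient
    settles saves computation.
    """
    t, steps = 0.0, 0
    while t < t_end:
        dt = dt_fast if fast_window[0] <= t < fast_window[1] else dt_slow
        t += dt
        steps += 1
    return steps
```

    With these placeholder numbers the two-rate schedule takes roughly an order of magnitude fewer steps than stepping the whole interval at the fast rate.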

  3. Development of Kinetic Mechanisms for Next-Generation Fuels and CFD Simulation of Advanced Combustion Engines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pitz, William J.; McNenly, Matt J.; Whitesides, Russell

    Predictive chemical kinetic models are needed to represent next-generation fuel components and their mixtures with conventional gasoline and diesel fuels. These kinetic models will allow the prediction of the effect of alternative fuel blends in CFD simulations of advanced spark-ignition and compression-ignition engines. Enabled by kinetic models, CFD simulations can be used to optimize fuel formulations for advanced combustion engines so that maximum engine efficiency, fossil fuel displacement goals, and low pollutant emission goals can be achieved.

  4. The role of numerical simulation for the development of an advanced HIFU system

    NASA Astrophysics Data System (ADS)

    Okita, Kohei; Narumi, Ryuta; Azuma, Takashi; Takagi, Shu; Matumoto, Yoichiro

    2014-10-01

    High-intensity focused ultrasound (HIFU) has been used clinically and is under clinical trials to treat various diseases. An advanced HIFU system employs ultrasound techniques for guidance during HIFU treatment, instead of the magnetic resonance imaging used in current HIFU systems. HIFU beam imaging for monitoring the beam and localized motion imaging for validating the treatment of tissue are briefly introduced as real-time ultrasound monitoring techniques. Numerical simulations have a great impact on the development of real-time ultrasound monitoring, as well as on improving the safety and efficacy of treatment in advanced HIFU systems. A HIFU simulator was developed to reproduce ultrasound propagation through the body, taking the elasticity of tissue into account, and was validated by comparison with in vitro experiments in which ultrasound emitted from a phased-array transducer propagates through an acrylic plate acting as a bone phantom. The defocus and distortion of the ultrasound propagating through the acrylic plate in the simulation quantitatively agree with the experimental results. Therefore, the HIFU simulator accurately reproduces ultrasound propagation through a medium whose shape and physical properties are well known. In addition, it was experimentally confirmed that simulation-assisted focus control of the phased-array transducer enables efficient assignment of the focus to the target. Simulation-assisted focus control can contribute to transducer design and treatment planning.
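
    Focus control of a phased array amounts to assigning per-element firing delays. A minimal geometric sketch for an idealized linear array in a homogeneous medium (ignoring the aberrating bone layer the simulator is built to handle; element positions and sound speed are illustrative) might look like:

```python
import math

def focusing_delays(element_xs, focus, c=1540.0):
    """Firing delays (s) that focus a linear phased array at `focus`.

    element_xs: element x-positions (m) along the array.
    focus: (x, z) focal point (m); c: speed of sound (~1540 m/s in
    tissue). Elements farther from the focus fire earlier; delays
    are shifted so the minimum delay is zero.
    """
    dists = [math.hypot(focus[0] - x, focus[1]) for x in element_xs]
    t_max = max(dists) / c
    return [t_max - d / c for d in dists]
```

    In the paper's setting, the simulator supplies effective propagation delays through the acrylic plate instead of the straight-line distances used here.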

  5. High performance computing for advanced modeling and simulation of materials

    NASA Astrophysics Data System (ADS)

    Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang

    2017-02-01

    The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS2015) was held in Austin, Texas, USA, Nov. 18, 2015. HPCMS 2015 was organized by Computer Network Information Center (Chinese Academy of Sciences), University of Michigan, Universidad Complutense de Madrid, University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.

  6. Advances in simulation of wave interactions with extended MHD phenomena

    NASA Astrophysics Data System (ADS)

    Batchelor, D.; Abla, G.; D'Azevedo, E.; Bateman, G.; Bernholdt, D. E.; Berry, L.; Bonoli, P.; Bramley, R.; Breslau, J.; Chance, M.; Chen, J.; Choi, M.; Elwasif, W.; Foley, S.; Fu, G.; Harvey, R.; Jaeger, E.; Jardin, S.; Jenkins, T.; Keyes, D.; Klasky, S.; Kruger, S.; Ku, L.; Lynch, V.; McCune, D.; Ramos, J.; Schissel, D.; Schnack, D.; Wright, J.

    2009-07-01

    The Integrated Plasma Simulator (IPS) provides a framework within which some of the most advanced, massively parallel fusion modeling codes can be interoperated to provide a detailed picture of the multi-physics processes involved in fusion experiments. The presentation will cover four topics: 1) recent improvements to the IPS, 2) application of the IPS for very high resolution simulations of ITER scenarios, 3) studies of resistive and ideal MHD stability in tokamak discharges using IPS facilities, and 4) the application of RF power in the electron cyclotron range of frequencies to control slowly growing MHD modes in tokamaks, and initial evaluations of optimized locations for RF power deposition.

  7. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    NASA Astrophysics Data System (ADS)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for the geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency and open-source photogrammetry software to produce a time-tagged 2 m posting elevation model of the Arctic and an 8 m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation model covering the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management strategies the team needed to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  8. Advances in edge-diffraction modeling for virtual-acoustic simulations

    NASA Astrophysics Data System (ADS)

    Calamia, Paul Thomas

    In recent years there has been growing interest in modeling sound propagation in complex, three-dimensional (3D) virtual environments. With diverse applications for the military, the gaming industry, psychoacoustics researchers, architectural acousticians, and others, advances in computing power and 3D audio-rendering techniques have driven research and development aimed at closing the gap between the auralization and visualization of virtual spaces. To this end, this thesis focuses on improving the physical and perceptual realism of sound-field simulations in virtual environments through advances in edge-diffraction modeling. To model sound propagation in virtual environments, acoustical simulation tools commonly rely on geometrical-acoustics (GA) techniques that assume asymptotically high frequencies, large flat surfaces, and infinitely thin ray-like propagation paths. Such techniques can be augmented with diffraction modeling to compensate for the effect of surface size on the strength and directivity of a reflection, to allow for propagation around obstacles and into shadow zones, and to maintain soundfield continuity across reflection and shadow boundaries. Using a time-domain, line-integral formulation of the Biot-Tolstoy-Medwin (BTM) diffraction expression, this thesis explores various aspects of diffraction calculations for virtual-acoustic simulations. Specifically, we first analyze the periodic singularity of the BTM integrand and describe the relationship between the singularities and higher-order reflections within wedges with open angle less than 180°. Coupled with analytical approximations for the BTM expression, this analysis allows for accurate numerical computations and a continuous sound field in the vicinity of an arbitrary wedge geometry insonified by a point source. Second, we describe an edge-subdivision strategy that allows for fast diffraction calculations with low error relative to a numerically more accurate solution. Third, to address

  9. A Funding Simulation for Use in an Advanced Experimental Laboratory Class.

    ERIC Educational Resources Information Center

    Falkenberg, Virginia P.

    1981-01-01

    Describes a funding simulation for use with college students in an advanced experimental psychology laboratory. Students write an original research paper and submit it to the professor--the "funding agency"--as a grant proposal. Projects are funded with grade points with which the student director purchases help from unfunded classmates. (RM)

  10. Large blast and thermal simulator advanced concept driver design by computational fluid dynamics. Final report, 1987-1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Opalka, K.O.

    1989-08-01

    The construction of a large test facility has been proposed for simulating the blast and thermal environment resulting from nuclear explosions. This facility would be used to test the survivability and vulnerability of military equipment such as trucks, tanks, and helicopters in a simulated thermal and blast environment, and to perform research into nuclear blast phenomenology. The proposed advanced design concepts, heating of the driver gas and fast-acting throat valves for wave shaping, are described, and the results of CFD studies to advance these new technical concepts for simulating decaying blast waves are reported.

  11. The Advanced Gamma-ray Imaging System (AGIS)-Simulation Studies

    NASA Astrophysics Data System (ADS)

    Maier, G.; Buckley, J.; Bugaev, V.; Fegan, S.; Funk, S.; Konopelko, A.; Vassiliev, V. V.

    2008-12-01

    The Advanced Gamma-ray Imaging System (AGIS) is a US-led concept for a next-generation instrument in ground-based very-high-energy gamma-ray astronomy. The most important design requirement for AGIS is a sensitivity about 10 times greater than that of current observatories such as VERITAS, H.E.S.S., or MAGIC. We present results of simulation studies of various possible designs for AGIS. The primary characteristics of the array performance (collecting area, angular resolution, background rejection, and sensitivity) are discussed.

  12. ADVANCED UTILITY SIMULATION MODEL, DESCRIPTION OF THE NATIONAL LOOP (VERSION 3.0)

    EPA Science Inventory

    The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...

  13. Proposing "the burns suite" as a novel simulation tool for advancing the delivery of burns education.

    PubMed

    Sadideen, Hazim; Wilson, David; Moiemen, Naiem; Kneebone, Roger

    2014-01-01

    Educational theory highlights the importance of contextualized simulation for effective learning. We explored this concept in a burns scenario in a novel, low-cost, high-fidelity, portable, immersive simulation environment (referred to as distributed simulation). This contextualized simulation/distributed simulation combination was named "The Burns Suite" (TBS). A pediatric burn resuscitation scenario was selected after high trainee demand. It was designed on Advanced Trauma and Life Support and Emergency Management of Severe Burns principles and refined using expert opinion through cognitive task analysis. TBS contained "realism" props, briefed nurses, and a simulated patient. Novices and experts were recruited. Five-point Likert-type questionnaires were developed for face and content validity. Cronbach's α was calculated for scale reliability. Semistructured interviews captured responses for qualitative thematic analysis, allowing for data triangulation. Twelve participants completed the TBS scenario. Mean face and content validity ratings were high (4.6 and 4.5, respectively; range, 4-5). The internal consistency of questions was high. Qualitative data analysis revealed that participants felt 1) the experience was "real" and they were "able to behave as if in a real resuscitation environment," and 2) TBS "addressed what Advanced Trauma and Life Support and Emergency Management of Severe Burns didn't" (including the efficacy of incorporating nontechnical skills). TBS provides a novel, effective simulation tool to significantly advance the delivery of burns education. Recreating clinical challenge is crucial to optimize simulation training. This low-cost approach also has major implications for surgical education, particularly during increasing financial austerity. Alternative scenarios and/or procedures can be recreated within TBS, providing a diverse educational immersive simulation experience.
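
    The scale-reliability statistic used in the study, Cronbach's α, can be computed directly from Likert item scores. A minimal sketch, using invented example data rather than the study's ratings:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of questionnaire items.

    items: list of equal-length lists, one list of scores per item
    (columns = items, entries = respondents).
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

    Perfectly covarying items give α = 1, while items that share no variance with the total pull α toward 0, which is the sense in which the study's "high internal consistency" should be read.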

  14. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  15. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    NASA Astrophysics Data System (ADS)

    Turinsky, Paul J.; Kothe, Douglas B.

    2016-05-01

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. Accomplishing this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance, and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M

  16. Recent advances in superconducting-mixer simulations

    NASA Technical Reports Server (NTRS)

    Withington, S.; Kennedy, P. R.

    1992-01-01

    Over the last few years, considerable progress has been made in the development of techniques for fabricating high-quality superconducting circuits, and this success, together with major advances in the theoretical understanding of quantum detection and mixing at millimeter and submillimeter wavelengths, has made the development of CAD techniques for superconducting nonlinear circuits an important new enterprise. For example, arrays of quasioptical mixers are now being manufactured in which the antennas, matching networks, filters, and superconducting tunnel junctions are all fabricated by depositing niobium and a variety of oxides on a single quartz substrate. There are no adjustable tuning elements on these integrated circuits, and therefore one must be able to predict their electrical behavior precisely. This requirement, together with a general interest in the generic behavior of devices such as direct detectors and harmonic mixers, has led us to develop a range of CAD tools for simulating the large-signal, small-signal, and noise behavior of superconducting tunnel junction circuits.

  17. Installation of Computerized Procedure System and Advanced Alarm System in the Human Systems Simulation Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Blanc, Katya Lee; Spielman, Zachary Alexander; Rice, Brandon Charles

    2016-04-01

    This report describes the installation of two advanced control room technologies, an advanced alarm system and a computerized procedure system, into the Human Systems Simulation Laboratory (HSSL). Installation of these technologies enables future phases of this research by providing a platform to systematically evaluate the effect of these technologies on operator and plant performance.

  18. Acquisition of a Multi-Domain Advanced Real-Time Simulator to Support DoD-focused Interdisciplinary Research at CSUB

    DTIC Science & Technology

    2017-10-17

    Report: Acquisition of a Multi-Domain Advanced Real-Time Simulator to Support DoD-focused Interdisciplinary Research at CSUB (California State University - Bakersfield).

  19. Advanced Techniques for Simulating the Behavior of Sand

    NASA Astrophysics Data System (ADS)

    Clothier, M.; Bailey, M.

    2009-12-01

    Since the aim of this research is to simulate the look and behavior of sand, this work goes beyond simple particle collision. In particular, we can continue to use our parallel algorithms not only on single particles but on particle “clumps” that consist of multiple combined particles. Since sand is typically not spherical in nature, these particle “clumps” help to simulate the coarse nature of sand. In a simulation environment, multiple combined particles can be used to simulate the polygonal and granular nature of sand grains; thus, a diversity of sand particles can be generated. The interaction between these particles can then be parallelized using GPU hardware. As such, this research will investigate different graphics and physics techniques and determine the tradeoffs in performance and visual quality for sand simulation. An enhanced sand model built on high-performance computing and GPUs has great potential to impact research for both earth and space scientists. Interaction with JPL has provided an opportunity for us to refine our simulation techniques that can ultimately be used for their vehicle simulator. As an added benefit of this work, advancements in simulating sand can also benefit scientists here on Earth, especially in regard to understanding landslides and debris flows.
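
    The particle-“clump” idea can be sketched as a pairwise sphere-overlap test between two rigid groups of spheres; the data layout and function name are illustrative assumptions, and a real GPU implementation would map the pair loop onto parallel threads rather than nested Python loops:

```python
def clumps_collide(clump_a, clump_b):
    """Collision test between two particle 'clumps'.

    Each clump is a list of (x, y, z, r) spheres rigidly bound
    together to approximate a non-spherical sand grain. Two clumps
    collide if any sphere pair overlaps; this independent pairwise
    test is what maps naturally onto GPU threads.
    """
    for xa, ya, za, ra in clump_a:
        for xb, yb, zb, rb in clump_b:
            d2 = (xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2
            if d2 < (ra + rb) ** 2:
                return True
    return False
```

    Comparing squared distances avoids a square root per pair, a common micro-optimization in collision kernels.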

  20. PRATHAM: Parallel Thermal Hydraulics Simulations using Advanced Mesoscopic Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Abhijit S; Jain, Prashant K; Mudrich, Jaime A

    2012-01-01

    At the Oak Ridge National Laboratory, efforts are under way to develop a 3D, parallel LBM code called PRATHAM (PaRAllel Thermal Hydraulic simulations using Advanced Mesoscopic Methods) to demonstrate the accuracy and scalability of the lattice Boltzmann method (LBM) for turbulent flow simulations in nuclear applications. The code has been developed using FORTRAN-90 and parallelized using the Message Passing Interface (MPI) library. The Silo library is used to compact and write the data files, and the VisIt visualization software is used to post-process the simulation data in parallel. Both the single relaxation time (SRT) and multi relaxation time (MRT) LBM schemes have been implemented in PRATHAM. To capture turbulence without prohibitively increasing the grid resolution requirements, an LES approach [5] is adopted, allowing large-scale eddies to be numerically resolved while the smaller (subgrid) eddies are modeled. In this work, a Smagorinsky model has been used, which augments the fluid viscosity with an additional eddy viscosity depending on the magnitude of the rate-of-strain tensor. In LBM, this is achieved by locally varying the relaxation time of the fluid.
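
    How the Smagorinsky eddy viscosity enters the local relaxation time can be sketched as follows, assuming lattice units with c_s² = 1/3 and Δt = 1 (so ν = (τ − 1/2)/3); the constant values are illustrative, not PRATHAM's:

```python
def smagorinsky_relaxation_time(nu0, strain_rate_mag, c_smag=0.1, delta=1.0):
    """Local LBM relaxation time with a Smagorinsky eddy viscosity.

    nu0: molecular viscosity in lattice units.
    strain_rate_mag: |S|, magnitude of the rate-of-strain tensor.
    Assumes standard lattice units (c_s^2 = 1/3, dt = 1), where
    nu = (tau - 0.5) / 3, hence tau = 3 * nu + 0.5.
    """
    nu_t = (c_smag * delta) ** 2 * strain_rate_mag  # eddy viscosity
    return 3.0 * (nu0 + nu_t) + 0.5
```

    Because |S| varies in space and time, τ becomes a local quantity, which is exactly the "locally varying relaxation time" mentioned in the abstract.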

  1. Advanced computational simulations of water waves interacting with wave energy converters

    NASA Astrophysics Data System (ADS)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.
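
    The Froude-scaling step mentioned at the end follows standard similarity relations: for a length ratio λ (full scale over model scale, same fluid), periods scale as √λ, forces as λ³, and power as λ^3.5. A minimal sketch with an illustrative function name and sample values:

```python
import math

def froude_scale(length_ratio, model_period, model_force, model_power):
    """Scale model-test quantities to full scale under Froude similarity.

    length_ratio: full-scale length / model length (lambda).
    Periods scale with sqrt(lambda), forces with lambda**3,
    power with lambda**3.5 (same fluid density and gravity assumed).
    """
    lam = length_ratio
    return (model_period * math.sqrt(lam),
            model_force * lam ** 3,
            model_power * lam ** 3.5)
```

    The paper's additional simulations probe where viscous and other non-Froude effects make this simple extrapolation break down.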

  2. Benchmarking of Advanced Control Strategies for a Simulated Hydroelectric System

    NASA Astrophysics Data System (ADS)

    Finotti, S.; Simani, S.; Alvisi, S.; Venturini, M.

    2017-01-01

    This paper analyses and develops the design of advanced control strategies for a typical hydroelectric plant during unsteady conditions, implemented in the Matlab and Simulink environments. The hydraulic system consists of a high water head and a long penstock with upstream and downstream surge tanks, and is equipped with a Francis turbine. The nonlinear characteristics of the hydraulic turbine and the inelastic water hammer effects were considered to calculate and simulate the hydraulic transients. With reference to the control solutions addressed in this work, the proposed methodologies rely on data-driven and model-based approaches applied to the system under monitoring. Extensive simulations and comparisons serve to determine the best solution for the development of the most effective, robust and reliable control tool when applied to the considered hydraulic system.

  3. Advancements in Electromagnetic Wave Backscattering Simulations: Applications in Active Lidar Remote Sensing Involving Aerosols

    NASA Astrophysics Data System (ADS)

    Bi, L.

    2016-12-01

    Atmospheric remote sensing based on the Lidar technique fundamentally relies on knowledge of the backscattering of light by particulate matter in the atmosphere. This talk starts with a review of the current capabilities of electromagnetic wave scattering simulations to determine the backscattering optical properties of irregular particles, such as the backscatter and depolarization ratios. This is followed by a discussion of possible pitfalls in the relevant simulations. The talk concludes with reports on the latest advancements in computational techniques. In addition, we summarize how the backscattering optical properties of aerosols vary with particle geometry, particle size, and mixing rules. These advancements will be applied to the analysis of Lidar observation data to reveal the state and possible microphysical processes of various aerosols.

  4. Advanced vehicle technology simulation and research outreach to STEM programs : research report summary

    DOT National Transportation Integrated Search

    2017-05-30

    The University of Iowa (UI) and the leaders of the MyCarDoesWhat campaign partnered with the National Advanced Driving Simulator (NADS) miniSim and the UI Mobile Museum to build an interactive exhibit as part of the overall museum for visitors to exp...

  5. Rupture mechanism of liquid crystal thin films realized by large-scale molecular simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Trung D; Carrillo, Jan-Michael Y; Brown, W Michael

    2014-01-01

    The ability of liquid crystal (LC) molecules to respond to changes in their environment makes them an interesting candidate for thin-film applications, particularly in bio-sensing, bio-mimicking devices, and optics. Yet the understanding of the (in)stability of this family of thin films has been limited by the inherent challenges encountered by experiment and continuum models. Using unprecedented large-scale molecular dynamics (MD) simulations, we address the rupture origin of LC thin films wetting a solid substrate at length scales similar to those in experiment. Our simulations show the key signatures of spinodal instability in isotropic and nematic films on top of thermal nucleation, and importantly, for the first time, evidence of a common rupture mechanism independent of initial thickness and LC orientational ordering. We further demonstrate that the primary driving force for rupture is closely related to the tendency of the LC mesogens to recover their local environment in the bulk state. Our study not only provides new insights into the rupture mechanism of liquid crystal films, but also sets the stage for future investigations of thin-film systems using petascale molecular dynamics simulations.

  6. Direct Numerical Simulation of Turbulent Multi-Stage Autoignition Relevant to Engine Conditions

    NASA Astrophysics Data System (ADS)

    Chen, Jacqueline

    2017-11-01

    Due to the unrivaled energy density of liquid hydrocarbon fuels, combustion will continue to provide over 80% of the world's energy for at least the next fifty years. Hence, combustion needs to be understood and controlled to optimize combustion systems for efficiency, to prevent further climate change, to reduce emissions, and to ensure U.S. energy security. In this talk I will discuss recent progress in direct numerical simulations of turbulent combustion focused on providing fundamental insights into key `turbulence-chemistry' interactions that underpin the development of next-generation fuel-efficient, fuel-flexible engines for transportation and power generation. Petascale direct numerical simulations (DNS) of multi-stage mixed-mode turbulent combustion in canonical configurations have elucidated key physics that govern autoignition and flame stabilization in engines, and they provide benchmark data for combustion model development under the conditions of advanced engines, which operate near combustion limits to maximize efficiency and minimize emissions. Mixed-mode combustion refers to premixed or partially premixed flames propagating into stratified autoignitive mixtures. Multi-stage ignition refers to hydrocarbon fuels with negative-temperature-coefficient behavior that undergo sequential low- and high-temperature autoignition. Key issues that will be discussed include: 1) the role of mixing in shear-driven turbulence on the dynamics of multi-stage autoignition and cool-flame propagation in diesel environments, 2) the role of thermal and composition stratification on the evolving balance of combustion modes - flame propagation versus spontaneous ignition - which determines the overall combustion rate in autoignition processes, and 3) the role of cool flames in lifted-flame stabilization. Finally, prospects for DNS of turbulent combustion at the exascale will be discussed in the context of anticipated heterogeneous machine architectures. Sponsored by DOE.

  7. Using CONFIG for Simulation of Operation of Water Recovery Subsystems for Advanced Control Software Evaluation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Flores, Luis; Fleming, Land; Throop, Daiv

    2002-01-01

    A hybrid discrete/continuous simulation tool, CONFIG, has been developed to support evaluation of the operability of life support systems. CONFIG simulates operations scenarios in which flows and pressures change continuously while system reconfigurations occur as discrete events. In simulations, intelligent control software can interact dynamically with hardware system models. CONFIG simulations have been used to evaluate control software and intelligent agents for automating life support systems operations. A CONFIG model of an advanced biological water recovery system has been developed to interact with intelligent control software that is being used in a water system test at NASA Johnson Space Center.

  8. The new Langley Research Center advanced real-time simulation (ARTS) system

    NASA Technical Reports Server (NTRS)

    Crawford, D. J.; Cleveland, J. I., II

    1986-01-01

    Based on a survey of current local area network technology with special attention paid to high bandwidth and very low transport delay requirements, NASA's Langley Research Center designed a new simulation subsystem using the computer automated measurement and control (CAMAC) network. This required significant modifications to the standard CAMAC system and development of a network switch, a clocking system, new conversion equipment, new consoles, supporting software, etc. This system is referred to as the advanced real-time simulation (ARTS) system. It is presently being built at LaRC. This paper provides a functional and physical description of the hardware and a functional description of the software. The requirements which drove the design are presented as well as present performance figures and status.

  9. Advanced Simulation in Undergraduate Pilot Training: Automatic Instructional System. Final Report for the Period March 1971-January 1975.

    ERIC Educational Resources Information Center

    Faconti, Victor; Epps, Robert

    The Advanced Simulator for Undergraduate Pilot Training (ASUPT) was designed to investigate the role of simulation in the future Undergraduate Pilot Training (UPT) program. The Automated Instructional System designed for the ASUPT simulator was described in this report. The development of the Automated Instructional System for ASUPT was based upon…

  10. Science based integrated approach to advanced nuclear fuel development - integrated multi-scale multi-physics hierarchical modeling and simulation framework Part III: cladding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tome, Carlos N; Caro, J A; Lebensohn, R A

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model the nuclear fuel systems to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of the advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.

  11. Ray Grout | NREL

    Science.gov Websites

    cross flow from peta-scale, high-fidelity simulations in collaboration with the gas turbine industry. A stratified combustion in the stabilization of flames above a jet in cross flow. Earlier work involved using

  12. Advanced graphical user interface for multi-physics simulations using AMST

    NASA Astrophysics Data System (ADS)

    Hoffmann, Florian; Vogel, Frank

    2017-07-01

    Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three-dimensional numerical modelling technique combining the eXtended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code is the lack of a graphical user interface (GUI), meaning that all pre-processing has to be done directly in an HDF5 file. This contribution presents the first graphical pre-processor developed for AMST.

  13. ADVANCED UTILITY SIMULATION MODEL, MULTI-PERIOD MULTI-STATE MODULE DESIGN DOCUMENTATION (VERSION 1.0)

    EPA Science Inventory

    The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...

  14. ADVANCED UTILITY SIMULATION MODEL DOCUMENTATION OF SYSTEM DESIGN STATE LEVEL MODEL (VERSION 1.0)

    EPA Science Inventory

    The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...

  15. A graphical workstation based part-task flight simulator for preliminary rapid evaluation of advanced displays

    NASA Technical Reports Server (NTRS)

    Wanke, Craig; Kuchar, James; Hahn, Edward; Pritchett, A.; Hansman, R. John

    1994-01-01

    Advances in avionics and display technology are significantly changing the cockpit environment in current transport aircraft. The MIT Aeronautical Systems Lab (ASL) developed a part-task flight simulator specifically to study the effects of these new technologies on flight crew situational awareness and performance. The simulator is based on a commercially-available graphics workstation, and can be rapidly reconfigured to meet the varying demands of experimental studies. The simulator was successfully used to evaluate graphical microburst alerting displays, electronic instrument approach plates, terrain awareness and alerting displays, and ATC routing amendment delivery through digital datalinks.

  16. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turinsky, Paul J., E-mail: turinsky@ncsu.edu; Kothe, Douglas B., E-mail: kothe@ornl.gov

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. Accomplishing this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics “core simulator” based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL

  17. Systematic Review of Patient-Specific Surgical Simulation: Toward Advancing Medical Education.

    PubMed

    Ryu, Won Hyung A; Dharampal, Navjit; Mostafa, Ahmed E; Sharlin, Ehud; Kopp, Gail; Jacobs, William Bradley; Hurlbert, Robin John; Chan, Sonny; Sutherland, Garnette R

    Simulation-based education has been shown to be an effective tool to teach foundational technical skills in various surgical specialties. However, most of the current simulations are limited to generic scenarios and do not allow continuation of the learning curve beyond basic technical skills to prepare for more advanced expertise, such as patient-specific surgical planning. The objective of this study was to evaluate the current medical literature with respect to the utilization and educational value of patient-specific simulations for surgical training. We performed a systematic review of the literature using Pubmed, Embase, and Scopus focusing on themes of simulation, patient-specific, surgical procedure, and education. The study included randomized controlled trials, cohort studies, and case-control studies published between 2005 and 2016. Two independent reviewers (W.H.R. and N.D.) conducted the study appraisal, data abstraction, and quality assessment of the studies. The search identified 13 studies that met the inclusion criteria; 7 studies employed computer simulations and 6 studies used 3-dimensional (3D) synthetic models. A number of surgical specialties evaluated patient-specific simulation, including neurosurgery, vascular surgery, orthopedic surgery, and interventional radiology. However, most studies were small in size and primarily aimed at feasibility assessments and early validation. Early evidence has shown feasibility and utility of patient-specific simulation for surgical education. With further development of this technology, simulation-based education may be able to support training of higher-level competencies outside the clinical setting to aid learners in their development of surgical skills. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  18. Atmospheric Corrosion Behavior and Mechanism of a Ni-Advanced Weathering Steel in Simulated Tropical Marine Environment

    NASA Astrophysics Data System (ADS)

    Wu, Wei; Zeng, Zhongping; Cheng, Xuequn; Li, Xiaogang; Liu, Bo

    2017-12-01

    Corrosion behavior of Ni-advanced weathering steel, as well as of carbon steel and conventional weathering steel, in a simulated tropical marine atmosphere was studied by field exposure and indoor simulation tests. Meanwhile, the morphology and composition of corrosion products formed on the exposed steels were surveyed through scanning electron microscopy, energy-dispersive x-ray spectroscopy and x-ray diffraction. Results indicated that the Ni added to the weathering steel played an important role during the corrosion process: it took part in the formation of corrosion products, enriched in the inner rust layer, and promoted the transformation from loose γ-FeOOH to dense α-FeOOH. As a result, the main aggressive ion, Cl-, was effectively confined to the outer rust layer, which led to the lowest corrosion rate among the tested steels. Thus, the resistance of Ni-advanced weathering steel to atmospheric corrosion was significantly improved in a simulated tropical marine environment.

  19. Advanced Models and Algorithms for Self-Similar IP Network Traffic Simulation and Performance Analysis

    NASA Astrophysics Data System (ADS)

    Radev, Dimitar; Lokshina, Izabella

    2010-11-01

    The paper examines self-similar (or fractal) properties of real communication network traffic data over a wide range of time scales. These self-similar properties are very different from the properties of traditional models based on Poisson and Markov-modulated Poisson processes. Advanced fractal models of sequential generators and fixed-length sequence generators, together with efficient algorithms for simulating the self-similar behavior of IP network traffic data, are developed and applied. Numerical examples are provided, and simulation results are obtained and analyzed.
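As background on the mechanics of such generators: a classic way to produce asymptotically self-similar traffic is to aggregate many ON/OFF sources whose period lengths are heavy-tailed (Pareto), and the Hurst parameter can then be recovered with the aggregated-variance method. The sketch below is illustrative only; the function names and parameters are mine, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def pareto_on_off_traffic(n_sources=50, n_slots=20000, alpha=1.5, rate=1.0):
    """Aggregate ON/OFF sources with Pareto (heavy-tailed) period lengths.

    Superposing many such sources yields asymptotically self-similar
    traffic with Hurst parameter H = (3 - alpha) / 2 for 1 < alpha < 2.
    """
    load = np.zeros(n_slots)
    for _ in range(n_sources):
        t, on = 0, bool(rng.integers(2))
        while t < n_slots:
            length = int(rng.pareto(alpha)) + 1   # period length >= 1 slot
            if on:
                load[t:t + length] += rate
            on = not on
            t += length
    return load

def hurst_aggregated_variance(x, scales=(1, 2, 4, 8, 16, 32, 64)):
    """Estimate H from the slope of log Var(X^(m)) versus log m."""
    variances = [np.var(x[: len(x) // m * m].reshape(-1, m).mean(axis=1))
                 for m in scales]
    slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
    return 1.0 + slope / 2.0   # since Var(X^(m)) ~ m^(2H - 2)

traffic = pareto_on_off_traffic()
H = hurst_aggregated_variance(traffic)  # noticeably above 0.5 for alpha=1.5
```

For Poisson traffic the same estimator would return H close to 0.5, which is the contrast the paper draws against traditional models.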

  20. Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.; hide

    2008-01-01

    Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.

  1. Petascale self-consistent electromagnetic computations using scalable and accurate algorithms for complex structures

    NASA Astrophysics Data System (ADS)

    Cary, John R.; Abell, D.; Amundson, J.; Bruhwiler, D. L.; Busby, R.; Carlsson, J. A.; Dimitrov, D. A.; Kashdan, E.; Messmer, P.; Nieter, C.; Smithe, D. N.; Spentzouris, P.; Stoltz, P.; Trines, R. M.; Wang, H.; Werner, G. R.

    2006-09-01

    As the size and cost of particle accelerators escalate, high-performance computing plays an increasingly important role; optimization through accurate, detailed computer modeling increases performance and reduces costs. Consequently, computer simulations face enormous challenges. Early approximation methods, such as expansions in distance from the design orbit, were unable to supply detailed, accurate results, such as in the computation of wake fields in complex cavities. Since the advent of message-passing supercomputers with thousands of processors, earlier approximations are no longer necessary, and it is now possible to compute wake fields, the effects of dampers, and self-consistent dynamics in cavities accurately. In this environment, the focus has shifted towards the development and implementation of algorithms that scale to large numbers of processors. So-called charge-conserving algorithms evolve the electromagnetic fields without the need for any global solves (which are difficult to scale up to many processors). Using cut-cell (or embedded) boundaries, these algorithms can simulate the fields in complex accelerator cavities with curved walls. New implicit algorithms, which are stable for any time-step, conserve charge as well, allowing faster simulation of structures with details small compared to the characteristic wavelength. These algorithmic and computational advances have been implemented in the VORPAL framework, a flexible, object-oriented, massively parallel computational application that allows run-time assembly of algorithms and objects, thus composing an application on the fly.
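To illustrate why field updates without global solves scale so well: in a Yee-type leapfrog scheme each E and B update reads only adjacent grid values, so the grid can be split across processors with nothing more than halo exchanges. A minimal 1D sketch (illustrative only, in normalized units; this is not the VORPAL implementation):

```python
import numpy as np

def fdtd_1d(nx=200, steps=400, courant=0.5):
    """Leapfrog (Yee-style) electromagnetic field update in 1D.

    Each update touches only neighboring grid points: no global solve
    is required, which is the property that lets such schemes scale
    to many processors via domain decomposition.
    """
    ez = np.zeros(nx)          # E_z on integer grid points
    by = np.zeros(nx - 1)      # B_y on half-integer grid points
    ez[nx // 2] = 1.0          # localized initial pulse
    for _ in range(steps):
        by += courant * np.diff(ez)        # Faraday's law (1D)
        ez[1:-1] += courant * np.diff(by)  # Ampere's law (1D), fixed ends
    return ez, by

ez, by = fdtd_1d()  # fields stay bounded for courant <= 1
```

The explicit scheme above is stable only for a Courant number of at most one; the implicit algorithms the abstract mentions remove that time-step restriction at the cost of a (still local) linear solve.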

  2. A graphical workstation based part-task flight simulator for preliminary rapid evaluation of advanced displays

    NASA Technical Reports Server (NTRS)

    Wanke, Craig; Kuchar, James; Hahn, Edward; Pritchett, Amy; Hansman, R. J.

    1992-01-01

    Advances in avionics and display technology are significantly changing the cockpit environment in current transport aircraft. The MIT Aeronautical Systems Lab (ASL) has developed a part-task flight simulator specifically to study the effects of these new technologies on flight crew situational awareness and performance. The simulator is based on a commercially-available graphics workstation, and can be rapidly reconfigured to meet the varying demands of experimental studies. The simulator has been successfully used to evaluate graphical microburst alerting displays, electronic instrument approach plates, terrain awareness and alerting displays, and ATC routing amendment delivery through digital datalinks.

  3. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  4. Technical Basis for Physical Fidelity of NRC Control Room Training Simulators for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minsk, Brian S.; Branch, Kristi M.; Bates, Edward K.

    2009-10-09

    The objective of this study is to determine how simulator physical fidelity influences the effectiveness of training the regulatory personnel responsible for examination and oversight of operating personnel and inspection of technical systems at nuclear power reactors. It seeks to contribute to the U.S. Nuclear Regulatory Commission’s (NRC’s) understanding of the physical fidelity requirements of training simulators. The goal of the study is to provide an analytic framework, data, and analyses that inform NRC decisions about the physical fidelity requirements of the simulators it will need to train its staff for assignment at advanced reactors. These staff are expected to come from increasingly diverse educational and experiential backgrounds.

  5. Stroke code simulation benefits advanced practice providers similar to neurology residents.

    PubMed

    Khan, Muhib; Baird, Grayson L; Price, Theresa; Tubergen, Tricia; Kaskar, Omran; De Jesus, Michelle; Zachariah, Joseph; Oostema, Adam; Scurek, Raymond; Coleman, Robert R; Sherman, Wendy; Hingtgen, Cynthia; Abdelhak, Tamer; Smith, Brien; Silver, Brian

    2018-04-01

    Advanced practice providers (APPs) are important members of stroke teams. Stroke code simulations offer valuable experience in the evaluation and treatment of stroke patients without compromising patient care. We hypothesized that simulation training would increase APP confidence, comfort level, and preparedness in leading a stroke code similar to neurology residents. This is a prospective quasi-experimental, pretest/posttest study. Nine APPs and 9 neurology residents participated in 3 standardized simulated cases to determine need for IV thrombolysis, thrombectomy, and blood pressure management for intracerebral hemorrhage. Emergency medicine physicians and neurologists were preceptors. APPs and residents completed a survey before and after the simulation. Generalized mixed modeling assuming a binomial distribution was used to evaluate change. On a 5-point Likert scale (1 = strongly disagree and 5 = strongly agree), confidence in leading a stroke code increased from 2.4 to 4.2 (p < 0.05) among APPs. APPs reported improved comfort level in rapidly assessing a stroke patient for thrombolytics (3.1-4.2; p < 0.05), making the decision to give thrombolytics (2.8-4.2; p < 0.05), and assessing a patient for embolectomy (2.4-4.0; p < 0.05). There was no difference in the improvement observed in all the survey questions as compared to neurology residents. Simulation training is a beneficial part of medical education for APPs and should be considered in addition to traditional didactics and clinical training. Further research is needed to determine whether simulation education of APPs results in improved treatment times and outcomes of acute stroke patients.

  6. Hierarchical Petascale Simulation Framework For Stress Corrosion Cracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grama, Ananth

    2013-12-18

    A number of major accomplishments resulted from the project:
    • Data Structures, Algorithms, and Numerical Methods for Reactive Molecular Dynamics. We have developed a range of novel data structures, algorithms, and solvers (amortized ILU, Spike) for use with ReaxFF and charge equilibration.
    • Parallel Formulations of Reactive MD (the Purdue Reactive Molecular Dynamics Package: PuReMD, PuReMD-GPU, and PG-PuReMD) for message-passing, GPU, and GPU-cluster platforms. We have developed efficient serial, parallel (MPI), GPU (CUDA), and GPU-cluster (MPI/CUDA) implementations, which have been demonstrated to be significantly better than the state of the art in both performance and scalability.
    • Comprehensive Validation in the Context of Diverse Applications. We have demonstrated the use of our software in diverse systems, including silica-water and silicon-germanium nanorods, and, as part of other projects, extended it to applications ranging from explosives (RDX) to lipid bilayers (biomembranes under oxidative stress).
    • Open Source Software Packages for Reactive Molecular Dynamics. All versions of our software have been released into the public domain. Over 100 major research groups worldwide use our software.
    • Implementation into the Department of Energy LAMMPS Software Package. We have also integrated our software into the Department of Energy LAMMPS software package.
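For context on the charge equilibration (QEq) step mentioned above: QEq assigns partial charges by minimizing an electrostatic energy subject to a total-charge constraint, which reduces to a linear (KKT) system. The dense toy sketch below is illustrative only; production codes such as PuReMD use preconditioned iterative solvers on sparse systems, not a dense solve, and the parameter values here are hypothetical.

```python
import numpy as np

def qeq_charges(chi, hardness, coulomb, q_net=0.0):
    """Solve the charge-equilibration (QEq) KKT system for partial charges.

    chi      : per-atom electronegativities, shape (n,)
    hardness : per-atom hardnesses (diagonal of the interaction matrix)
    coulomb  : off-diagonal shielded Coulomb terms, shape (n, n)
    The extra row/column enforces the constraint sum(q) = q_net.
    """
    chi = np.asarray(chi, dtype=float)
    n = chi.size
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = np.asarray(coulomb, dtype=float) + np.diag(hardness)
    A[:n, n] = 1.0   # constraint column
    A[n, :n] = 1.0   # constraint row
    b = np.concatenate([-chi, [q_net]])
    sol = np.linalg.solve(A, b)
    return sol[:n]   # sol[n] is the Lagrange multiplier (chemical potential)

# toy diatomic: the more electronegative atom acquires negative charge
q = qeq_charges(chi=[5.0, 3.0], hardness=[10.0, 10.0],
                coulomb=[[0.0, 1.0], [1.0, 0.0]])
```

Because this system must be re-solved every MD step, the solver quality (e.g., the amortized ILU preconditioning noted in the accomplishments) dominates the cost of reactive MD.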

  7. Comprehensive simulation-enhanced training curriculum for an advanced minimally invasive procedure: a randomized controlled trial.

    PubMed

    Zevin, Boris; Dedy, Nicolas J; Bonrath, Esther M; Grantcharov, Teodor P

    2017-05-01

    There is no comprehensive simulation-enhanced training curriculum to address cognitive, psychomotor, and nontechnical skills for an advanced minimally invasive procedure. The objectives were: (1) to develop and provide evidence of validity for a comprehensive simulation-enhanced training (SET) curriculum for an advanced minimally invasive procedure; (2) to demonstrate transfer of acquired psychomotor skills from a simulation laboratory to a live porcine model; and (3) to compare training outcomes of the SET curriculum group and a chief resident group. University. This prospective single-blinded, randomized, controlled trial allocated 20 intermediate-level surgery residents to receive either conventional training (control) or SET curriculum training (intervention). The SET curriculum consisted of cognitive, psychomotor, and nontechnical training modules. Psychomotor skill in a live anesthetized porcine model in the OR was the primary outcome. Knowledge of advanced minimally invasive and bariatric surgery and nontechnical skills in a simulated OR crisis scenario were the secondary outcomes. Residents in the SET curriculum group went on to perform a laparoscopic jejunojejunostomy in the OR. Cognitive, psychomotor, and nontechnical skills of the SET curriculum group were also compared to a group of 12 chief surgery residents. The SET curriculum group demonstrated superior psychomotor skills in a live porcine model (56 [47-62] versus 44 [38-53], P<.05) and superior nontechnical skills (41 [38-45] versus 31 [24-40], P<.01) compared with the conventional training group. The SET curriculum group and the conventional training group demonstrated equivalent knowledge (14 [12-15] versus 13 [11-15], P = .47). The SET curriculum group demonstrated equivalent psychomotor skills in the live porcine model and in the OR in a human patient (56 [47-62] versus 63 [61-68]; P = .21). The SET curriculum group demonstrated inferior knowledge (13 [11-15] versus 16 [14-16]; P<.05), equivalent psychomotor skill (63 [61-68] versus 68 [62-74]; P

  8. Supercomputing Sheds Light on the Dark Universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Heitmann, Katrin

    2012-11-15

    At Argonne National Laboratory, scientists are using supercomputers to shed light on one of the great mysteries in science today, the Dark Universe. With Mira, a petascale supercomputer at the Argonne Leadership Computing Facility, a team led by physicists Salman Habib and Katrin Heitmann will run the largest, most complex simulation of the universe ever attempted. By contrasting the results from Mira with state-of-the-art telescope surveys, the scientists hope to gain new insights into the distribution of matter in the universe, advancing future investigations of dark energy and dark matter into a new realm. The team's research was named a finalist for the 2012 Gordon Bell Prize, an award recognizing outstanding achievement in high-performance computing.

  9. Numerical simulation of abutment pressure redistribution during face advance

    NASA Astrophysics Data System (ADS)

    Klishin, S. V.; Lavrikov, S. V.; Revuzhenko, A. F.

    2017-12-01

    The paper presents numerical simulation data on the abutment pressure redistribution in rock mass during face advance, including isolines of maximum shear stress and pressure diagrams. The stress state of rock in the vicinity of a breakage heading is calculated by the finite element method using a 2D nonlinear model of a structurally heterogeneous medium with regard to plasticity and internal self-balancing stress. The thus calculated stress field is used as input data for 3D discrete element modeling of the process. The study shows that the abutment pressure increases as the roof span extends and that the distance between the face breast and the peak point of this pressure depends on the elastoplastic properties and internal self-balancing stress of a rock medium.

  10. Beam Loss Simulation and Collimator System Configurations for the Advanced Photon Source Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, A.; Borland, M.

    The proposed multi-bend achromat lattice for the Advanced Photon Source upgrade (APS-U) has a design emittance of less than 70 pm. The Touschek loss rate is high: compared with the current APS ring, which has an average beam lifetime ~10 h, the simulated beam lifetime for APS-U is only ~2 h when operated in the high flux mode (I=200 mA in 48 bunches). An additional consequence of the short lifetime is that injection must be more frequent, which provides another potential source of particle loss. In order to provide information for the radiation shielding system evaluation and to avoid particle loss in sensitive locations around the ring (for example, insertion device straight sections), simulations of the detailed beam loss distribution have been performed. Several possible collimation configurations have been simulated and compared.

  11. Advancements in Afterbody Radiative Heating Simulations for Earth Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Panesi, Marco; Brandis, Aaron M.

    2016-01-01

    Four advancements to the simulation of backshell radiative heating for Earth entry are presented. The first of these is the development of a flow field model that treats electronic levels of the dominant backshell radiator, N, as individual species. This is shown to allow improvements in the modeling of electron-ion recombination and two-temperature modeling, which are shown to increase backshell radiative heating by 10 to 40%. By computing the electronic state populations of N within the flow field solver, instead of through the quasi-steady state approximation in the radiation code, the coupling of radiative transition rates to the species continuity equations for the levels of N, including the impact of non-local absorption, becomes feasible. Implementation of this additional level of coupling between the flow field and radiation codes represents the second advancement presented in this work, which is shown to increase the backshell radiation by another 10 to 50%. The impact of radiative transition rates due to non-local absorption indicates the importance of accurate radiation transport in the relatively complex flow geometry of the backshell. This motivates the third advancement, which is the development of a ray-tracing radiation transport approach to compute the radiative transition rates and divergence of the radiative flux at every point for coupling to the flow field, therefore allowing the accuracy of the commonly applied tangent-slab approximation to be assessed for radiative source terms. For the sphere considered at lunar-return conditions, the tangent-slab approximation is shown to provide a sufficient level of accuracy for the radiative source terms, even for backshell cases. This is in contrast to the agreement between the two approaches for computing the radiative flux to the surface, which differ by up to 40%. The final advancement presented is the development of a nonequilibrium model for NO radiation, which provides significant backshell

  12. A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL)

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Owen, Jeffrey E.

    1988-01-01

A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL) is presented which overcomes the traditional disadvantages of simulations executed on a digital computer. The incorporation of parallel processing allows the mapping of simulations into a digital computer to be done in the same inherently parallel manner as they are currently mapped onto an analog computer. The direct-execution format maximizes the efficiency of the executed code since the need for a high level language compiler is eliminated. Resolution is greatly increased over that which is available with an analog computer without the sacrifice in execution speed normally expected with digital computer simulations. Although this report covers all aspects of the new architecture, key emphasis is placed on the processing element configuration and the microprogramming of the ACSL constructs. The execution times for all ACSL constructs are computed using a model of a processing element based on the AMD 29000 CPU and the AMD 29027 FPU. The increase in execution speed provided by parallel processing is exemplified by comparing the derived execution times of two ACSL programs with the execution times for the same programs executed on a similar sequential architecture.

  13. Validation of GATE Monte Carlo simulations of the GE Advance/Discovery LS PET scanners.

    PubMed

    Schmidtlein, C Ross; Kirov, Assen S; Nehmeh, Sadek A; Erdi, Yusuf E; Humm, John L; Amols, Howard I; Bidaut, Luc M; Ganin, Alex; Stearns, Charles W; McDaniel, David L; Hamacher, Klaus A

    2006-01-01

The recently developed GATE (GEANT4 application for tomographic emission) Monte Carlo package, designed to simulate positron emission tomography (PET) and single photon emission computed tomography (SPECT) scanners, provides the ability to model and account for the effects of photon noncollinearity, off-axis detector penetration, detector size and response, positron range, photon scatter, and patient motion on the resolution and quality of PET images. The objective of this study is to validate a model within GATE of the General Electric (GE) Advance/Discovery Light Speed (LS) PET scanner. Our three-dimensional PET simulation model of the scanner consists of 12 096 detectors grouped into blocks, which are grouped into modules as per the vendor's specifications. The GATE results are compared to experimental data obtained in accordance with the National Electrical Manufacturers Association/Society of Nuclear Medicine (NEMA/SNM), NEMA NU 2-1994, and NEMA NU 2-2001 protocols. The respective phantoms are also accurately modeled, thus allowing us to simulate the sensitivity, scatter fraction, count rate performance, and spatial resolution. In-house software was developed to produce and analyze sinograms from the simulated data. With our model of the GE Advance/Discovery LS PET scanner, the ratio of the sensitivities with sources radially offset 0 and 10 cm from the scanner's main axis are reproduced to within 1% of measurements. Similarly, the simulated scatter fraction for the NEMA NU 2-2001 phantom agrees to within less than 3% of measured values (the measured scatter fractions are 44.8% and 40.9 +/- 1.4% and the simulated scatter fraction is 43.5 +/- 0.3%). The simulated count rate curves were made to match the experimental curves by using deadtimes as fit parameters. This resulted in deadtime values of 625 and 332 ns at the Block and Coincidence levels, respectively. The experimental peak true count rate of 139.0 kcps and the peak activity concentration of 21.5 k

  14. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    NASA Astrophysics Data System (ADS)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

Aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved with the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive system capable of immersing the trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation, and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.

  15. Advanced Space Shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1982-01-01

A non-recursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed. It provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes, entitled Shuttle Simulation Turbulence Tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 120,000 meters. A description of the turbulence generation procedure is provided. The results of validating the simulated turbulence are described. Conclusions and recommendations are presented. One-dimensional von Karman spectra are tabulated, while a discussion of the minimum frequency simulated is provided. The results of spectral and statistical analyses of the SSTT are presented.
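The non-recursive generation idea described in this record can be sketched as spectral synthesis: sample a von Karman-type gust spectrum, attach random phases, and inverse-FFT to obtain a real time series. This is an illustrative reconstruction, not the report's actual procedure; the spectrum form and all parameter values (sigma, L, V, time step) are assumptions chosen for the example.

```python
import numpy as np

def von_karman_psd(f, sigma=1.0, L=200.0, V=50.0):
    """One-sided longitudinal von Karman-type gust PSD (illustrative form).
    sigma: gust std dev [m/s], L: turbulence length scale [m], V: airspeed [m/s]."""
    return (2.0 * sigma**2 * L / (np.pi * V)) / (
        1.0 + (1.339 * L * 2.0 * np.pi * f / V) ** 2) ** (5.0 / 6.0)

def synthesize_gusts(n=4096, dt=0.05, seed=0):
    """Non-recursive synthesis: scale random phases by sqrt(PSD * df)
    and inverse-FFT to a real, zero-mean gust time series."""
    rng = np.random.default_rng(seed)
    f = np.fft.rfftfreq(n, dt)
    df = f[1] - f[0]
    amp = n * np.sqrt(von_karman_psd(f) * df / 2.0)
    amp[0] = 0.0                                   # force zero mean
    phase = np.exp(2j * np.pi * rng.random(len(f)))
    return np.fft.irfft(amp * phase, n)

gusts = synthesize_gusts()
```

Because the whole series comes from one inverse FFT, no recursive filter state is carried between samples, which is what "non-recursive" refers to here.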

  16. Advancements in hardware-in-the-loop simulations at the U.S. Army Aviation and Missile Command

    NASA Astrophysics Data System (ADS)

    Buford, James A.; Jolly, Alexander C.; Mobley, Scott B.; Sholes, William J.

    2000-07-01

    A greater awareness of and increased interest in the use of modeling and simulation (M&S) has been demonstrated at many levels within the Department of Defense (DoD) and all the Armed Services agencies in recent years. M&S application is regarded as a viable means of lowering the life cycle costs of missile defense and tactical missile weapon system acquisition beginning with studies of new concepts of war-fighting through user training and post-deployment support. The Aviation and Missile Research, Engineering, and Development Center (AMRDEC) of the U.S. Army Aviation and Missile Command (AMCOM) has an extensive history of applying all types of M&S to weapons system development and has been a particularly strong advocate of hardware-in-the-loop (HWIL) simulation and test for many years. Over the past 40 years AMRDEC has developed and maintained the Advanced Simulation Center (ASC) which provides world-class, high fidelity, specific and dedicated HWIL simulation and test capabilities for the Army's missile defense and tactical missile program offices in both the infrared and radio frequency sensor domains. The ASC facility uses M&S to conduct daily HWIL missile simulations and tests to support flight tests, missile/system development, independent verification and validation of weapon system embedded software and simulations, and missile/system performance against current and future threat environments. This paper describes the ASC role, recaps the past year, describes the HWIL components and advancements, and outlines the path-ahead for the ASC in terms of both missile and complete system HWIL simulations and test with a focus on the imaging infrared systems.

  17. [Objective surgery -- advanced robotic devices and simulators used for surgical skill assessment].

    PubMed

    Suhánszki, Norbert; Haidegger, Tamás

    2014-12-01

Robotic assistance became a leading trend in minimally invasive surgery, building on the global success of laparoscopic surgery. Manual laparoscopy requires advanced skills and capabilities acquired through a tedious learning procedure, while da Vinci type surgical systems offer intuitive control and advanced ergonomics. Nevertheless, in either case, the key issue is to be able to assess the surgeons' skills and capabilities objectively. Robotic devices offer a radically new way to collect data during surgical procedures, opening the space for new ways of skill parameterization. This may be revolutionary in MIS training, given the new and objective surgical curriculum and examination methods. The article reviews currently developed skill assessment techniques for robotic surgery and simulators, thoroughly inspecting their validation procedures and utility. In the coming years, these methods will become the mainstream of Western surgical education.

  18. Motion-base simulator results of advanced supersonic transport handling qualities with active controls

    NASA Technical Reports Server (NTRS)

    Feather, J. B.; Joshi, D. S.

    1981-01-01

Handling qualities of the unaugmented advanced supersonic transport (AST) are deficient in the low-speed, landing approach regime. Consequently, improvement in handling with active control augmentation systems has been achieved using implicit model-following techniques. Extensive fixed-base simulator evaluations were used to validate these systems prior to tests with full motion and visual capabilities on a six-axis motion-base simulator (MBS). These tests compared the handling qualities of the unaugmented AST with several augmented configurations to ascertain the effectiveness of these systems. Cooper-Harper ratings, tracking errors, and control activity data from the MBS tests have been analyzed statistically. The results show the fully augmented AST handling qualities have been improved to an acceptable level.

  19. A demonstration of motion base design alternatives for the National Advanced Driving Simulator

    NASA Technical Reports Server (NTRS)

    Mccauley, Michael E.; Sharkey, Thomas J.; Sinacori, John B.; Laforce, Soren; Miller, James C.; Cook, Anthony

    1992-01-01

A demonstration of the capability of NASA's Vertical Motion Simulator (VMS) to simulate two alternative motion base designs for the National Advanced Driving Simulator (NADS) is reported. The VMS is located at ARC. The motion base conditions used in this demonstration were as follows: (1) a large translational motion base; and (2) a motion base design with limited translational capability. The latter had translational capability representative of a typical synergistic motion platform. These alternatives were selected to test the prediction that large amplitude translational motion would result in a lower incidence or severity of simulator induced sickness (SIS) than would a limited translational motion base. A total of 10 drivers performed two tasks, slaloms and quick-stops, using each of the motion bases. Physiological, objective, and subjective measures were collected. No reliable differences in SIS between the motion base conditions were found in this demonstration. However, in light of the cost considerations and engineering challenges associated with implementing a large translational motion base, performance of a formal study is recommended.

  20. ADVANCED UTILITY SIMULATION MODEL, REPORT OF SENSITIVITY TESTING, CALIBRATION, AND MODEL OUTPUT COMPARISONS (VERSION 3.0)

    EPA Science Inventory

The report gives results of activities relating to the Advanced Utility Simulation Model (AUSM): sensitivity testing, comparison with a mature electric utility model, and calibration to historical emissions. The activities were aimed at demonstrating AUSM's validity over input va...

  1. TECA: Petascale pattern recognition for climate science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Prabhat; Byna, Surendra; Vishwanath, Venkatram

Climate change is one of the most pressing challenges facing humanity in the 21st century. Climate simulations provide us with a unique opportunity to examine effects of anthropogenic emissions. High-resolution climate simulations produce “Big Data”: contemporary climate archives are ≈ 5PB in size and we expect future archives to measure on the order of Exa-Bytes. In this work, we present the successful application of the TECA (Toolkit for Extreme Climate Analysis) framework for extracting extreme weather patterns such as Tropical Cyclones, Atmospheric Rivers and Extra-Tropical Cyclones from TB-sized simulation datasets. TECA has been run at full-scale on Cray XE6 and IBM BG/Q systems, and has reduced the runtime for pattern detection tasks from years to hours. TECA has been utilized to evaluate the performance of various computational models in reproducing the statistics of extreme weather events, and for characterizing the change in frequency of storm systems in the future.

  2. Recent advances in computational methodology for simulation of mechanical circulatory assist devices

    PubMed Central

    Marsden, Alison L.; Bazilevs, Yuri; Long, Christopher C.; Behr, Marek

    2014-01-01

Ventricular assist devices (VADs) provide mechanical circulatory support to offload the work of one or both ventricles during heart failure. They are used in the clinical setting as destination therapy, as bridge to transplant, or more recently as bridge to recovery to allow for myocardial remodeling. Recent developments in computational simulation allow for detailed assessment of VAD hemodynamics for device design and optimization for both children and adults. Here, we provide a focused review of the recent literature on finite element methods and optimization for VAD simulations. As VAD designs typically fall into two categories, pulsatile and continuous flow devices, we separately address computational challenges of both types of designs, and the interaction with the circulatory system, with three representative case studies. In particular, we focus on recent advancements in finite element methodology that have increased the fidelity of VAD simulations. We outline key challenges, which extend to the incorporation of biological response such as thrombosis and hemolysis, as well as shape optimization methods and challenges in computational methodology. PMID:24449607

  3. Efficient Multi-Dimensional Simulation of Quantum Confinement Effects in Advanced MOS Devices

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A.; Rafferty, Conor S.; Ancona, Mario G.; Yu, Zhi-Ping

    2000-01-01

We investigate the density-gradient (DG) transport model for efficient multi-dimensional simulation of quantum confinement effects in advanced MOS devices. The formulation of the DG model is described as a quantum correction to the classical drift-diffusion model. Quantum confinement effects are shown to be significant in sub-100nm MOSFETs. In thin-oxide MOS capacitors, quantum effects may reduce gate capacitance by 25% or more. As a result, the inclusion of quantum effects in simulations dramatically improves the match between C-V simulations and measurements for oxide thickness down to 2 nm. Significant quantum corrections also occur in the I-V characteristics of short-channel (30 to 100 nm) n-MOSFETs, with current drive reduced by up to 70%. This effect is shown to result from reduced inversion charge due to quantum confinement of electrons in the channel. Also, subthreshold slope is degraded by 15 to 20 mV/decade with the inclusion of quantum effects via the density-gradient model, and short channel effects (in particular, drain-induced barrier lowering) are noticeably increased.
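The density-gradient correction referenced in this record is usually written as an added quantum potential in the drift-diffusion equation of state. A common form, following Ancona-style formulations, is sketched below; the sign and coefficient conventions vary across the literature, so treat the factor of 12 as one common convention rather than the definitive one:

```latex
% Density-gradient quantum correction to the electron equation of state
% (sketch; coefficient conventions vary by author)
n = N_c \exp\!\left(\frac{\phi - \phi_n + \Lambda}{V_T}\right),
\qquad
\Lambda = \frac{2\, b_n \, \nabla^2 \sqrt{n}}{\sqrt{n}},
\qquad
b_n = \frac{\hbar^2}{12\, q\, m_n^{*}} .
```

The extra term $\Lambda$ smooths the carrier density near confining barriers, which is what produces the reduced gate capacitance and inversion charge discussed above.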

  4. Simulation for Supporting Scale-Up of a Fluidized Bed Reactor for Advanced Water Oxidation

    PubMed Central

    Abdul Raman, Abdul Aziz; Daud, Wan Mohd Ashri Wan

    2014-01-01

Simulation of a fluidized bed reactor (FBR) was accomplished for treating wastewater using the Fenton reaction, an advanced oxidation process (AOP). The simulation was performed to determine characteristics of FBR performance, the concentration profile of the contaminants, and various prominent hydrodynamic properties (e.g., Reynolds number, velocity, and pressure) in the reactor. Simulation was implemented for a 2.8 L working volume using hydrodynamic correlations, the continuity equation, and simplified kinetic information for phenols degradation as a model. The simulation shows that, by using Fe3+ and Fe2+ mixtures as catalyst, TOC degradation up to 45% was achieved for a contaminant range of 40–90 mg/L within 60 min. The concentration profiles and hydrodynamic characteristics were also generated. A subsequent scale-up study was also conducted using the similitude method. The analysis shows that up to 10 L working volume, the models developed are applicable. The study proves that, using appropriate modeling and simulation, data can be predicted for designing and operating FBR for wastewater treatment. PMID:25309949
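As a rough numeric cross-check of the ~45% figure, simple first-order TOC decay (dC/dt = -kC) with a lumped rate constant of about 0.01 per minute reproduces it over a 60 min run. The study's actual kinetics are more detailed; the rate constant and initial concentration here are purely illustrative assumptions.

```python
import numpy as np

def toc_profile(c0=65.0, k=0.01, t_end=60.0, dt=0.5):
    """First-order TOC decay, dC/dt = -k*C (closed-form solution).
    c0: initial TOC [mg/L], in the 40-90 mg/L range from the abstract;
    k: assumed lumped rate constant [1/min]."""
    t = np.arange(0.0, t_end + dt, dt)
    return t, c0 * np.exp(-k * t)

t, c = toc_profile()
removal = 100.0 * (1.0 - c[-1] / c[0])   # percent TOC removed after 60 min
```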

  5. Computational sciences in the upstream oil and gas industry

    PubMed Central

    Halsey, Thomas C.

    2016-01-01

    The predominant technical challenge of the upstream oil and gas industry has always been the fundamental uncertainty of the subsurface from which it produces hydrocarbon fluids. The subsurface can be detected remotely by, for example, seismic waves, or it can be penetrated and studied in the extremely limited vicinity of wells. Inevitably, a great deal of uncertainty remains. Computational sciences have been a key avenue to reduce and manage this uncertainty. In this review, we discuss at a relatively non-technical level the current state of three applications of computational sciences in the industry. The first of these is seismic imaging, which is currently being revolutionized by the emergence of full wavefield inversion, enabled by algorithmic advances and petascale computing. The second is reservoir simulation, also being advanced through the use of modern highly parallel computing architectures. Finally, we comment on the role of data analytics in the upstream industry. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597785

  6. Recent Advances in the Theory and Simulation of Model Colloidal Microphase Formers.

    PubMed

    Zhuang, Yuan; Charbonneau, Patrick

    2016-08-18

    This mini-review synthesizes our understanding of the equilibrium behavior of particle-based models with short-range attractive and long-range repulsive (SALR) interactions. These models, which can form stable periodic microphases, aim to reproduce the essence of colloidal suspensions with competing interparticle interactions. Ordered structures, however, have yet to be obtained in experiments. In order to better understand the hurdles to periodic microphase assembly, marked theoretical and simulation advances have been made over the past few years. Here, we present recent progress in the study of microphases in models with SALR interactions using liquid-state theory and density-functional theory as well as numerical simulations. Combining these various approaches provides a description of periodic microphases, and gives insights into the rich phenomenology of the surrounding disordered regime. Ongoing research directions in the thermodynamics of models with SALR interactions are also presented.
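One commonly studied SALR pair potential in this literature combines a Lennard-Jones attraction with a screened-Coulomb (Yukawa) repulsion. Whether this exact form matches every model in the review is an assumption, but it captures the competing-interaction structure described above:

```latex
% Generic SALR pair potential: short-range attraction plus long-range
% screened repulsion (illustrative; parameter choices vary by study)
U(r) = 4\varepsilon\!\left[\left(\frac{\sigma}{r}\right)^{12}
     - \left(\frac{\sigma}{r}\right)^{6}\right]
     + A\,\varepsilon\,\frac{\sigma}{r}\, e^{-r/\xi},
\qquad A > 0,\quad \xi > \sigma .
```

The frustration between the attractive well and the repulsive tail sets a preferred cluster size, which is what stabilizes the periodic microphases discussed in the record.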

  7. Can virtual reality simulation be used for advanced bariatric surgical training?

    PubMed

    Lewis, Trystan M; Aggarwal, Rajesh; Kwasnicki, Richard M; Rajaretnam, Niro; Moorthy, Krishna; Ahmed, Ahmed; Darzi, Ara

    2012-06-01

Laparoscopic bariatric surgery is a safe and effective way of treating morbid obesity. However, the operations are technically challenging and training opportunities for junior surgeons are limited. This study aims to assess whether virtual reality (VR) simulation is an effective adjunct for training and assessment of laparoscopic bariatric technical skills. Twenty bariatric surgeons of varying experience (five experienced, five intermediate, and ten novice) were recruited to perform a jejuno-jejunostomy both on cadaveric tissue and on the bariatric module of the Lapmentor VR simulator (Simbionix Corporation, Cleveland, OH). Surgical performance was assessed using validated global rating scales (GRS) and procedure-specific video rating scales (PSRS). Subjects were also questioned about the appropriateness of VR as a training tool for surgeons. Construct validity of the VR bariatric module was demonstrated with a significant difference in performance between novice and experienced surgeons on the VR jejuno-jejunostomy module GRS (median 11-15.5; P = .017) and PSRS (median 11-13; P = .003). Content validity was demonstrated with surgeons describing the VR bariatric module as useful and appropriate for training (mean Likert score 4.45/7); they would highly recommend VR simulation to others for bariatric training (mean Likert score 5/7). Face and concurrent validity were not established. This study shows that the bariatric module on a VR simulator demonstrates construct and content validity. VR simulation appears to be an effective method for training of advanced bariatric technical skills for surgeons at the start of their bariatric training. However, assessment of technical skills should still take place on cadaveric tissue. Copyright © 2012. Published by Mosby, Inc.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael A.

    Enabled by petascale supercomputing, the next generation of computer models for wind energy will simulate a vast range of scales and physics, spanning from turbine structural dynamics and blade-scale turbulence to mesoscale atmospheric flow. A single model covering all scales and physics is not feasible. Thus, these simulations will require the coupling of different models/codes, each for different physics, interacting at their domain boundaries.

  9. Development of the US3D Code for Advanced Compressible and Reacting Flow Simulations

    NASA Technical Reports Server (NTRS)

    Candler, Graham V.; Johnson, Heath B.; Nompelis, Ioannis; Subbareddy, Pramod K.; Drayna, Travis W.; Gidzak, Vladimyr; Barnhardt, Michael D.

    2015-01-01

Aerothermodynamics and hypersonic flows involve complex multi-disciplinary physics, including finite-rate gas-phase kinetics, finite-rate internal energy relaxation, gas-surface interactions with finite-rate oxidation and sublimation, transition to turbulence, large-scale unsteadiness, shock-boundary layer interactions, fluid-structure interactions, and thermal protection system ablation and thermal response. Many of the flows have a large range of length and time scales, requiring large computational grids, implicit time integration, and large solution run times. The University of Minnesota NASA US3D code was designed for the simulation of these complex, highly-coupled flows. It has many of the features of the well-established DPLR code, but uses unstructured grids and has many advanced numerical capabilities and physical models for multi-physics problems. The main capabilities of the code are described, the physical modeling approaches are discussed, the different types of numerical flux functions and time integration approaches are outlined, and the parallelization strategy is overviewed. Comparisons between US3D and the NASA DPLR code are presented, and several advanced simulations are presented to illustrate some of the novel features of the code.

  10. The role of specific visual subfields in collisions with oncoming cars during simulated driving in patients with advanced glaucoma

    PubMed Central

    Kunimatsu-Sanuki, Shiho; Iwase, Aiko; Araie, Makoto; Aoki, Yuki; Hara, Takeshi; Fukuchi, Takeo; Udagawa, Sachiko; Ohkubo, Shinji; Sugiyama, Kazuhisa; Matsumoto, Chota; Nakazawa, Toru; Yamaguchi, Takuhiro; Ono, Hiroshi

    2017-01-01

Background/aims To assess the role of specific visual subfields in collisions with oncoming cars during simulated driving in patients with advanced glaucoma. Methods Normal subjects and patients with glaucoma with mean deviation <–12 dB in both eyes (Humphrey Field Analyzer 24-2 SITA-S program) used a driving simulator (DS; Honda Motor, Tokyo). Two scenarios in which oncoming cars turned right crossing the driver's path were chosen. We compared the binocular integrated visual field (IVF) in the patients who were involved in collisions and those who were not. We performed a multivariate logistic regression analysis; the dependent parameter was collision involvement, and the independent parameters were age, visual acuity and mean sensitivity of the IVF subfields. Results The study included 43 normal subjects and 100 patients with advanced glaucoma. Five of the 100 patients with advanced glaucoma experienced simulator sickness during the main test and were thus excluded. In total, 95 patients with advanced glaucoma and 43 normal subjects completed the main test of the DS. Patients with advanced glaucoma had significantly more collisions than normal subjects in one or both DS scenarios (p<0.001). The patients with advanced glaucoma who were involved in collisions were older (p=0.050), had worse visual acuity in the better eye (p<0.001) and had lower mean IVF sensitivity in the inferior hemifield, both 0°–12° and 13°–24°, in comparison with those who were not involved in collisions (p=0.012 and p=0.034). A logistic regression analysis revealed that collision involvement was significantly associated with decreased inferior IVF mean sensitivity from 13° to 24° (p=0.041), in addition to older age and lower visual acuity (p=0.018 and p<0.001). Conclusions Our data suggest that the inferior hemifield was associated with the incidence of motor vehicle collisions with oncoming cars in patients with advanced glaucoma. PMID:28400370

  11. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1981-01-01

A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power density spectrum (PDS) of time history data. The program interfaces with the Advanced Continuous Simulation Language (ACSL) so that a frequency analysis may be performed on ACSL-generated simulation variables. An example of the calculation of the PDS of a Van der Pol oscillator is presented.
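The PDS computation described here can be sketched in modern terms as a raw periodogram: detrend, FFT, scale. The Van der Pol integration scheme and all parameter values below are illustrative assumptions, not the original ACSL program.

```python
import numpy as np

def van_der_pol(mu=1.0, n=8192, dt=0.01, x0=2.0, v0=0.0):
    """Integrate x'' - mu*(1 - x^2)*x' + x = 0 with semi-implicit Euler
    (adequate for illustration at this step size)."""
    x, v = x0, v0
    out = np.empty(n)
    for i in range(n):
        v += dt * (mu * (1.0 - x * x) * v - x)
        x += dt * v
        out[i] = x
    return out

def power_density_spectrum(x, dt):
    """One-sided power density spectrum via the FFT (raw periodogram)."""
    n = len(x)
    X = np.fft.rfft(x - x.mean())          # remove the mean before transforming
    return np.fft.rfftfreq(n, dt), (2.0 * dt / n) * np.abs(X) ** 2

x = van_der_pol()
f, psd = power_density_spectrum(x, 0.01)
f_peak = f[np.argmax(psd)]   # near the ~0.15 Hz limit-cycle frequency for mu=1
```

The spectral peak sits at the oscillator's limit-cycle frequency, with harmonics above it from the non-sinusoidal waveform.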

  12. Radiocesium interaction with clay minerals: Theory and simulation advances Post-Fukushima.

    PubMed

    Okumura, Masahiko; Kerisit, Sebastien; Bourg, Ian C; Lammers, Laura N; Ikeda, Takashi; Sassi, Michel; Rosso, Kevin M; Machida, Masahiko

    2018-04-14

Insights at the microscopic level of the process of radiocesium adsorption and interaction with clay mineral particles have improved substantially over the past several years, triggered by pressing social issues such as management of huge amounts of waste soil accumulated after the Fukushima Dai-ichi nuclear power plant accident. In particular, computer-based molecular modeling supported by advanced hardware and algorithms has proven to be a powerful approach. Its application can now generally encompass the full complexity of clay particle adsorption sites from basal surfaces to interlayers with inserted water molecules, to edges including fresh and weathered frayed ones. On the other hand, its methodological schemes are now varied from traditional force-field molecular dynamics on large-scale realizations composed of many thousands of atoms including water molecules to first-principles methods on smaller models in rather exacting fashion. In this article, we overview new understanding enabled by simulations across methodological variations, focusing on recent insights that connect with experimental observations, namely: 1) the energy scale for cesium adsorption on the basal surface, 2) progress in understanding the structure of clay edges, which is difficult to probe experimentally, 3) cesium adsorption properties at hydrated interlayer sites, 4) the importance of the size relationship between the ionic radius of cesium and the interlayer distance at frayed edge sites, 5) the migration of cesium into deep interlayer sites, and 6) the effects of nuclear decay of radiocesium. Key experimental observations that motivate these simulation advances are also summarized. Furthermore, some directions toward future solutions of waste soil management are discussed based on the obtained microscopic insights. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Radiocesium interaction with clay minerals: Theory and simulation advances Post-Fukushima

    DOE PAGES

    Okumura, Masahiko; Kerisit, Sebastien; Bourg, Ian C.; ...

    2018-03-14

Insights at the microscopic level of the process of radiocesium adsorption and interaction with clay mineral particles have improved substantially over the past several years, triggered by pressing social issues such as management of huge amounts of waste soil accumulated after the Fukushima Dai-ichi nuclear power plant accident. In particular, computer-based molecular modeling supported by advanced hardware and algorithms has proven to be a powerful approach. Its application can now generally encompass the full complexity of clay particle adsorption sites from basal surfaces to interlayers with inserted water molecules, to edges including fresh and weathered frayed ones. On the other hand, its methodological schemes are now varied from traditional force-field molecular dynamics on large-scale realizations composed of many thousands of atoms including water molecules to first-principles methods on smaller models in rather exacting fashion. In this article, we overview new understanding enabled by simulations across methodological variations, focusing on recent insights that connect with experimental observations, namely: 1) the energy scale for cesium adsorption on the basal surface, 2) progress in understanding the structure of clay edges, which is difficult to probe experimentally, 3) cesium adsorption properties at hydrated interlayer sites, 4) the importance of the size relationship between the ionic radius of cesium and the interlayer distance at frayed edge sites, 5) the migration of cesium into deep interlayer sites, and 6) the effects of nuclear decay of radiocesium. Key experimental observations that motivate these simulation advances are also summarized. Furthermore, some directions toward future solutions of waste soil management are discussed based on the obtained microscopic insights.

  14. Radiocesium interaction with clay minerals: Theory and simulation advances Post-Fukushima

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okumura, Masahiko; Kerisit, Sebastien; Bourg, Ian C.

    Microscopic understanding of radiocesium adsorption on, and interaction with, clay mineral particles has improved substantially over the past several years, spurred by pressing social issues such as the management of huge amounts of waste soil accumulated after the Fukushima Dai-ichi nuclear power plant accident. In particular, computer-based molecular modeling supported by advanced hardware and algorithms has proven to be a powerful approach. Its application can now encompass the full complexity of clay particle adsorption sites, from basal surfaces to interlayers with inserted water molecules, to edges including fresh and weathered frayed ones. Its methodological schemes range from traditional force-field molecular dynamics on large-scale models composed of many thousands of atoms, including water molecules, to more exacting first-principles methods on smaller models. In this article, we review the new understanding enabled by simulations across these methodological variations, focusing on recent insights that connect with experimental observations, namely: 1) the energy scale for cesium adsorption on the basal surface, 2) progress in understanding the structure of clay edges, which is difficult to probe experimentally, 3) cesium adsorption properties at hydrated interlayer sites, 4) the importance of the size relationship between the ionic radius of cesium and the interlayer distance at frayed edge sites, 5) the migration of cesium into deep interlayer sites, and 6) the effects of nuclear decay of radiocesium. Key experimental observations that motivate these simulation advances are also summarized. Furthermore, some directions toward future solutions for waste soil management are discussed based on the obtained microscopic insights.

  15. Advanced Simulation and Computing: A Summary Report to the Director's Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress in all components of the ASCI Program, and these developments are highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called ''Advanced Simulation and Computing.'' Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system, negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures that more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates, an evaluation plan, and identified documentation to be included in the ''Assessment File''.

  16. Accelerating Project and Process Improvement using Advanced Software Simulation Technology: From the Office to the Enterprise

    DTIC Science & Technology

    2010-04-29

    Accelerating Project and Process Improvement using Advanced Software Simulation Technology: From the Office to the Enterprise. Larry Smith, Software Technology Support Center, 517 SMXS/MXDEA, 6022 Fir Avenue, Hill AFB, UT 84056.

  17. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US), among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source images are poor, no amount of 3D image manipulation in software will produce a quality 3D image. In this exhibition, recent advances in CT imaging techniques and in 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  18. Advances in snow cover distributed modelling via ensemble simulations and assimilation of satellite data

    NASA Astrophysics Data System (ADS)

    Revuelto, J.; Dumont, M.; Tuzet, F.; Vionnet, V.; Lafaysse, M.; Lecourt, G.; Vernay, M.; Morin, S.; Cosme, E.; Six, D.; Rabatel, A.

    2017-12-01

    Snowpack models nowadays show good capability in simulating the evolution of snow in mountain areas. However, singular deviations of the meteorological forcing and shortcomings in the modelling of snow physical processes, when accumulated over a snow season, can produce large deviations from the real snowpack state. These deviations are usually evaluated against on-site observations from automatic weather stations. Nevertheless, the location of these stations can strongly influence the results of such evaluations, since local topography may have a marked influence on snowpack evolution. Although evaluations of snowpack models against automatic weather stations usually yield good results, large-scale evaluations of simulation results over heterogeneous alpine terrain subject to local topographic effects are lacking. This work firstly presents a complete evaluation of the detailed snowpack model Crocus over an extended mountain area, the upper Arve catchment (western European Alps). This catchment has a wide elevation range, with a large area above 2000 m a.s.l. and/or glaciated. The evaluation compares results obtained with distributed and semi-distributed simulations (the latter currently used in operational forecasting). Daily observations of the snow-covered area from the MODIS satellite sensor, seasonal glacier surface mass balance evolution measured at more than 65 locations, and the glaciers' annual equilibrium-line altitude from Landsat/SPOT/ASTER satellites have been used for model evaluation. Additionally, the latest advances in producing ensemble snowpack simulations for assimilating satellite reflectance data over extended areas will be presented. These advances comprise the generation of an ensemble of downscaled high-resolution meteorological forcing from meso-scale meteorological models and the application of a particle filter scheme for assimilating satellite observations. Although the results are preliminary, they show a good
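
A particle filter assimilation step of the kind mentioned above can be sketched in a few lines: ensemble members are weighted by the likelihood of the satellite observation given their simulated reflectance, then resampled. This is a generic importance-resampling sketch assuming a Gaussian observation error and an identity observation operator; it is not the authors' implementation, and all names and numbers are illustrative.

```python
import math
import random

def particle_filter_step(ensemble, observation, obs_error, simulate_reflectance):
    """One assimilation step: weight ensemble members by how well their
    simulated reflectance matches the satellite observation, then resample."""
    # Gaussian likelihood of the observation given each member's prediction
    weights = [
        math.exp(-0.5 * ((simulate_reflectance(m) - observation) / obs_error) ** 2)
        for m in ensemble
    ]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Systematic resampling: high-weight members are duplicated, low-weight dropped
    n = len(ensemble)
    positions = [(i + random.random()) / n for i in range(n)]
    cumulative, resampled, j = weights[0], [], 0
    for p in positions:
        while p > cumulative:
            j += 1
            cumulative += weights[j]
        resampled.append(ensemble[j])
    return resampled

# Toy example: scalar snowpack states with an identity observation operator
random.seed(0)
ensemble = [0.2, 0.5, 0.8, 1.1]
posterior = particle_filter_step(ensemble, observation=0.55, obs_error=0.1,
                                 simulate_reflectance=lambda s: s)
```

After the step, the posterior ensemble concentrates on the member closest to the observation.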

  19. Efficient Multi-Dimensional Simulation of Quantum Confinement Effects in Advanced MOS Devices

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A.; Ancona, Mario G.; Rafferty, Conor S.; Yu, Zhiping

    2000-01-01

    We investigate the density-gradient (DG) transport model for efficient multi-dimensional simulation of quantum confinement effects in advanced MOS devices. The formulation of the DG model is described as a quantum correction to the classical drift-diffusion model. Quantum confinement effects are shown to be significant in sub-100 nm MOSFETs. In thin-oxide MOS capacitors, quantum effects may reduce gate capacitance by 25% or more. As a result, the inclusion of quantum effects in simulations dramatically improves the match between C-V simulations and measurements for oxide thicknesses down to 2 nm. Significant quantum corrections also occur in the I-V characteristics of short-channel (30 to 100 nm) n-MOSFETs, with current drive reduced by up to 70%. This effect is shown to result from reduced inversion charge due to quantum confinement of electrons in the channel. Also, subthreshold slope is degraded by 15 to 20 mV/decade with the inclusion of quantum effects via the density-gradient model, and short-channel effects (in particular, drain-induced barrier lowering) are noticeably increased.
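
The density-gradient correction augments the classical drift-diffusion potential with a Bohm-like term proportional to the second derivative of the square root of the carrier density, which is large wherever the density varies rapidly (e.g., near the oxide interface). A minimal 1D finite-difference illustration follows; the physical prefactor is lumped into an arbitrary constant `bn` rather than any device-calibrated value.

```python
import math

def dg_quantum_correction(n, dx, bn=1.0):
    """Discrete density-gradient (Bohm-like) correction 2*bn * d2(sqrt(n))/dx2 / sqrt(n)
    on a 1D grid, interior points only. bn lumps the hbar^2-style prefactor
    (set to 1 here -- illustrative units, not calibrated to a real device)."""
    s = [math.sqrt(v) for v in n]
    corr = []
    for i in range(1, len(n) - 1):
        d2s = (s[i - 1] - 2 * s[i] + s[i + 1]) / dx ** 2
        corr.append(2 * bn * d2s / s[i])
    return corr

# Toy density profile: sharp depletion near an "interface", flat in the bulk
dx = 0.5
profile = [1e-4, 1e-2, 0.5, 1.0, 1.0, 1.0]
corr = dg_quantum_correction(profile, dx)
```

The correction is largest at the steeply varying point and vanishes where the density is flat, which is the qualitative reason the DG model leaves bulk transport unchanged while modifying the channel region.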

  20. Big data to smart data in Alzheimer's disease: Real-world examples of advanced modeling and simulation.

    PubMed

    Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo

    2016-09-01

    Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on computer-based modeling and simulation approach as a powerful tool in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Private-public partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. Similar to other disease indications, successful real-world examples of advanced simulation can generate actionable support of drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  1. A numerical investigation on the efficiency of range extending systems using Advanced Vehicle Simulator

    NASA Astrophysics Data System (ADS)

    Varnhagen, Scott; Same, Adam; Remillard, Jesse; Park, Jae Wan

    2011-03-01

    Series plug-in hybrid electric vehicles of varying engine configuration and battery capacity are modeled using the Advanced Vehicle Simulator (ADVISOR). The performance of these vehicles is analyzed on the basis of energy consumption and greenhouse gas emissions on the tank-to-wheel and well-to-wheel paths. Both city and highway driving conditions are considered during the simulation. When simulated on the well-to-wheel path, it is shown that the range extender with a Wankel rotary engine consumes less energy and emits fewer greenhouse gases than the other systems with reciprocating engines during many driving cycles. The rotary engine has a higher power-to-weight ratio and lower noise, vibration and harshness than conventional reciprocating engines, although it performs less efficiently. The benefits of a Wankel engine make it an attractive option for use as a range extender in a plug-in hybrid electric vehicle.

  2. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diachin, L F; Garaizar, F X; Henson, V E

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  3. Advanced Grid Simulator for Multi-Megawatt Power Converter Testing and Certification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koralewicz, Przemyslaw; Gevorgian, Vahan; Wallen, Robb

    2017-02-16

    Grid integration testing of inverter-coupled renewable energy technologies is an essential step in the qualification of renewable energy and energy storage systems to ensure the stability of the power system. New types of devices must be thoroughly tested and validated for compliance with relevant grid codes and interconnection requirements. For this purpose, highly specialized custom-made testing equipment is needed to emulate the various types of realistic grid conditions required by certification bodies or for research purposes. For testing multi-megawatt converters, a high-power grid simulator capable of creating controlled grid conditions and meeting both power quality and dynamic characteristics is needed. This paper describes a new grid simulator concept based on ABB's medium-voltage ACS6000 drive technology that utilizes advanced modulation and control techniques to create a unique testing platform for various multi-megawatt power converter systems. Its performance is demonstrated using the test results obtained during commissioning activities at the National Renewable Energy Laboratory in Colorado, USA.

  4. Impact of an Advanced Cardiac Life Support Simulation Laboratory Experience on Pharmacy Student Confidence and Knowledge.

    PubMed

    Maxwell, Whitney D; Mohorn, Phillip L; Haney, Jason S; Phillips, Cynthia M; Lu, Z Kevin; Clark, Kimberly; Corboy, Alex; Ragucci, Kelly R

    2016-10-25

    Objective. To assess the impact of an advanced cardiac life support (ACLS) simulation on pharmacy student confidence and knowledge. Design. Third-year pharmacy students participated in a simulation experience that consisted of team roles training, high-fidelity ACLS simulations, and debriefing. Students completed a pre/postsimulation confidence and knowledge assessment. Assessment. Overall, student knowledge assessment scores and student confidence scores improved significantly. Student confidence and knowledge changes from baseline were not significantly correlated. Conversely, a significant, weak positive correlation between presimulation studying and both presimulation confidence and presimulation knowledge was discovered. Conclusions. Overall, student confidence and knowledge assessment scores in ACLS significantly improved from baseline; however, student confidence and knowledge were not significantly correlated.
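
The correlation analysis described (confidence change versus knowledge change) amounts to computing a Pearson coefficient over paired pre/post gain scores. A self-contained sketch with made-up data, not the study's actual scores:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-student gains (post minus pre), purely for illustration
confidence_gain = [1.0, 2.0, 3.0, 4.0]
knowledge_gain = [2.0, 1.0, 4.0, 3.0]
r = pearson_r(confidence_gain, knowledge_gain)
```

A value of r near zero would match the study's finding that confidence and knowledge changes were not significantly correlated; the toy data above give a moderate r of 0.6.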

  5. TID Simulation of Advanced CMOS Devices for Space Applications

    NASA Astrophysics Data System (ADS)

    Sajid, Muhammad

    2016-07-01

    This paper focuses on Total Ionizing Dose (TID) effects caused by the accumulation of charges at the silicon dioxide, the substrate/silicon dioxide interface, and the Shallow Trench Isolation (STI) in scaled CMOS bulk devices, as well as at the Buried Oxide (BOX) layer in devices based on Silicon-On-Insulator (SOI) technology, to be operated in the space radiation environment. The radiation-induced leakage current and the corresponding density/concentration of electrons in the leakage-current path are presented for 180 nm, 130 nm and 65 nm NMOS and PMOS transistors based on CMOS bulk as well as SOI process technologies on board LEO and GEO satellites. On the basis of simulation results, a TID robustness analysis for advanced deep sub-micron technologies was carried out up to 500 krad. The correlation between the impact of technology scaling and the magnitude of leakage current at the corresponding total dose was established using the Visual TCAD Genius program.

  6. Advancing the LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group

    2013-01-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science-driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by the design-specific opto-mechanical system performance of the telescope facility, site-specific conditions (including weather and seeing), and additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing-history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work focused on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.
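
A merit-function scheduler of the kind described can be illustrated as a greedy ranking: each candidate field is scored by a weighted sum of per-criterion scores, and the highest-scoring field is observed next. The criteria and weights below are invented placeholders, not the LSST simulator's actual merit terms.

```python
def merit(candidate, weights):
    """Weighted sum of per-criterion scores in [0, 1]."""
    return sum(weights[k] * candidate[k] for k in weights)

def pick_next(candidates, weights):
    """Greedy choice: observe the field with the highest current merit."""
    return max(candidates, key=lambda c: merit(c, weights))

# Hypothetical criteria: how overdue the field's cadence is, current sky
# quality over the field, and how cheap the slew to it would be
weights = {"cadence_need": 0.5, "sky_quality": 0.3, "low_slew_cost": 0.2}
candidates = [
    {"name": "field_A", "cadence_need": 0.9, "sky_quality": 0.4, "low_slew_cost": 0.1},
    {"name": "field_B", "cadence_need": 0.3, "sky_quality": 0.9, "low_slew_cost": 0.9},
]
best = pick_next(candidates, weights)
```

A "look-ahead" strategy, as mentioned in the abstract, would replace the single greedy `max` with a search over short sequences of observations, rejecting sequences that violate observing constraints.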

  7. Advances in Turbulent Combustion Dynamics Simulations in Bluff-Body Stabilized Flames

    DTIC Science & Technology

    2015-11-30

    Master’s Thesis, covering 01 Nov 2015 - 30 Nov 2015: Advances in Turbulent Combustion Dynamics Simulations. Addresses the three main aspects of bluff-body stabilized flames: stationary combustion, lean blow-out, and thermo-acoustic instabilities. For the cases of stationary combustion and lean blow-out, an improved version of the Linear Eddy Model approach is used, while in the case of thermo-acoustic

  8. ADVANCED UTILITY SIMULATION MODEL, REPORT OF SENSITIVITY TESTING, CALIBRATION, AND MODEL OUTPUT COMPARISONS (VERSION 3.0) TAPE

    EPA Science Inventory

    The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...

  9. Advanced thermal energy management: A thermal test bed and heat pipe simulation

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.

    1986-01-01

    Work initiated on a common-module thermal test simulation was continued, and a second project on heat pipe simulation was begun. The test bed, constructed from surplus Skylab equipment, was modeled and solved for various thermal load and flow conditions. Low thermal load caused the radiator fluid, Coolanol 25, to thicken due to its low temperature; this was avoided by using a regenerator heat exchanger. Other possible solutions modeled include a radiator heater and shunting heat from the central thermal bus to the radiator. Also, module air temperature can become excessive under high avionics load. A second project concerning advanced heat pipe concepts was initiated. A program was written which calculates fluid physical properties, liquid and vapor pressure in the evaporator and condenser, fluid flow rates, and thermal flux. The program is directed at evaluating newer heat pipe wicks and geometries, especially water in an artery surrounded by six vapor channels. Effects of temperature, groove and slot dimensions, and wick properties are reported.

  10. The Numerical Technique for the Landslide Tsunami Simulations Based on Navier-Stokes Equations

    NASA Astrophysics Data System (ADS)

    Kozelkov, A. S.

    2017-12-01

    The paper presents an integral technique simulating all phases of a landslide-driven tsunami. The technique is based on the numerical solution of the system of Navier-Stokes equations for multiphase flows. The numerical algorithm uses a fully implicit approximation method, in which the equations of continuity and momentum conservation are coupled through implicit summands of pressure gradient and mass flow. The method we propose removes severe restrictions on the time step and allows simulation of tsunami propagation to arbitrarily large distances. The landslide origin is simulated as an individual phase: a Newtonian fluid with its own density and viscosity, separated from the water and air phases by an interface. The basic formulas of the equation discretization and expressions for the coefficients are presented, and the main steps of the computation procedure are described in the paper. To enable simulations of tsunami propagation across wide water areas, we propose a parallel implementation of the technique, which employs an algebraic multigrid method. The implementation of the multigrid method is based on global-level and cascade collection algorithms that impose no limitations on the scale of parallelism and make this technique applicable to petascale systems. We demonstrate the possibility of simulating all phases of a landslide-driven tsunami, including its generation, propagation and uprush. The technique has been verified against problems supported by experimental data. The paper describes the mechanism of incorporating bathymetric data to simulate tsunamis in real water areas of the world ocean. Results of a comparison with nonlinear dispersion theory, which demonstrated good agreement, are presented for the case of a historical tsunami of volcanic origin on Montserrat Island in the Caribbean Sea.
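
The multigrid idea underlying such solvers can be illustrated with a geometric V-cycle for the 1D Poisson problem -u'' = f: smooth the error on the fine grid, solve for a correction on a coarser grid, and interpolate it back. The paper uses an algebraic multigrid variant; this sketch is only the textbook geometric analogue, on a grid of 2^k + 1 points with zero Dirichlet boundaries.

```python
import math

def jacobi(u, f, h, sweeps=3, omega=2/3):
    """Weighted-Jacobi smoothing for -u'' = f with zero Dirichlet boundaries."""
    n = len(u)
    for _ in range(sweeps):
        new = u[:]
        for i in range(1, n - 1):
            new[i] = (1 - omega) * u[i] + omega * 0.5 * (u[i-1] + u[i+1] + h*h*f[i])
        u = new
    return u

def residual(u, f, h):
    r = [0.0] * len(u)
    for i in range(1, len(u) - 1):
        r[i] = f[i] - (2*u[i] - u[i-1] - u[i+1]) / (h*h)
    return r

def v_cycle(u, f, h):
    """Smooth, restrict the residual, recurse for a coarse-grid correction,
    prolongate the correction back, and smooth again."""
    if len(u) <= 3:
        return jacobi(u, f, h, sweeps=50)   # coarsest grid: smooth to convergence
    u = jacobi(u, f, h)
    r = residual(u, f, h)
    rc = [r[2*i] for i in range((len(u) + 1) // 2)]     # injection restriction
    ec = v_cycle([0.0] * len(rc), rc, 2*h)
    for i in range(len(ec) - 1):                        # linear prolongation
        u[2*i] += ec[i]
        u[2*i + 1] += 0.5 * (ec[i] + ec[i + 1])
    return jacobi(u, f, h)

# Solve -u'' = pi^2 sin(pi x) on [0, 1]; the exact solution is sin(pi x)
n, h = 9, 1.0 / 8
f = [math.pi**2 * math.sin(math.pi * i * h) for i in range(n)]
u = [0.0] * n
for _ in range(10):
    u = v_cycle(u, f, h)
```

The appeal for petascale use is that each V-cycle costs O(N) work while reducing the error by a roughly constant factor independent of grid size.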

  11. The role of specific visual subfields in collisions with oncoming cars during simulated driving in patients with advanced glaucoma.

    PubMed

    Kunimatsu-Sanuki, Shiho; Iwase, Aiko; Araie, Makoto; Aoki, Yuki; Hara, Takeshi; Fukuchi, Takeo; Udagawa, Sachiko; Ohkubo, Shinji; Sugiyama, Kazuhisa; Matsumoto, Chota; Nakazawa, Toru; Yamaguchi, Takuhiro; Ono, Hiroshi

    2017-07-01

    To assess the role of specific visual subfields in collisions with oncoming cars during simulated driving in patients with advanced glaucoma. Normal subjects and patients with glaucoma with mean deviation <-12 dB in both eyes (Humphrey Field Analyzer 24-2 SITA-S program) used a driving simulator (DS; Honda Motor, Tokyo). Two scenarios in which oncoming cars turned right, crossing the driver's path, were chosen. We compared the binocular integrated visual field (IVF) in the patients who were involved in collisions and those who were not. We performed a multivariate logistic regression analysis; the dependent parameter was collision involvement, and the independent parameters were age, visual acuity and mean sensitivity of the IVF subfields. The study included 43 normal subjects and 100 patients with advanced glaucoma; 5 of the 100 patients experienced simulator sickness during the main test and were excluded. In total, 95 patients with advanced glaucoma and 43 normal subjects completed the main DS test. Patients with advanced glaucoma had significantly more collisions than normal subjects in one or both DS scenarios (p<0.001). The patients with advanced glaucoma who were involved in collisions were older (p=0.050), had worse visual acuity in the better eye (p<0.001), and had lower mean IVF sensitivity in the inferior hemifield, both 0°-12° and 13°-24°, than those who were not involved in collisions (p=0.012 and p=0.034). A logistic regression analysis revealed that collision involvement was significantly associated with decreased inferior IVF mean sensitivity from 13° to 24° (p=0.041), in addition to older age and lower visual acuity (p=0.018 and p<0.001). Our data suggest that the inferior hemifield was associated with the incidence of motor vehicle collisions with oncoming cars in patients with advanced glaucoma. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a

  12. An Advanced Simulation Framework for Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.

    1994-01-01

    Discrete-event simulation (DEVS) users have long faced a three-way trade-off among execution time, model fidelity, and the number of objects simulated. Because of the limits of computer processing power, the analyst is often forced to settle for less than the desired performance in one or more of these areas.

  13. Investigation of Alien Wavelength Quality in Live Multi-Domain, Multi-Vendor Link Using Advanced Simulation Tool

    NASA Astrophysics Data System (ADS)

    Nordal Petersen, Martin; Nuijts, Roeland; Lange Bjørn, Lars

    2014-05-01

    This article presents an advanced optical model for simulation of alien wavelengths in multi-domain and multi-vendor dense wavelength-division multiplexing networks. The model aids optical network planners with a better understanding of the non-linear effects present in dense wavelength-division multiplexing systems and better utilization of alien wavelengths in future applications. The limiting physical effects for alien wavelengths are investigated in relation to power levels, channel spacing, and other factors. The simulation results are verified through experimental setup in live multi-domain dense wavelength-division multiplexing systems between two national research networks: SURFnet in Holland and NORDUnet in Denmark.

  14. Mission simulation as an approach to develop requirements for automation in Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.; Henninger, D. L. (Principal Investigator)

    1996-01-01

    This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.
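
The systems-dynamics style of resource simulation described above, in which operational tasks either consume crew time or fall behind, can be sketched as a toy stock-and-flow model. All quantities below are hypothetical illustrations, not ALSS requirements.

```python
def simulate_mission(days, crew_hours_per_day, task_hours, automation_fraction):
    """Toy stock-and-flow model: daily maintenance tasks consume crew time
    unless automated; unmet work accumulates as a backlog (the 'stock')."""
    backlog = 0.0
    history = []
    for _ in range(days):
        demand = task_hours * (1 - automation_fraction) + backlog
        done = min(demand, crew_hours_per_day)   # crew time is the limiting flow
        backlog = demand - done
        history.append(backlog)
    return history

# Compare a fully manual scenario against one with 40% of task hours automated
manual = simulate_mission(days=30, crew_hours_per_day=6,
                          task_hours=8, automation_fraction=0.0)
automated = simulate_mission(days=30, crew_hours_per_day=6,
                             task_hours=8, automation_fraction=0.4)
```

Even this caricature shows the kind of result such simulations yield: without automation the backlog grows without bound, revealing that the assumed crew time cannot sustain operations, whereas modest automation keeps the system in balance.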

  15. A report documenting the completion of the Los Alamos National Laboratory portion of the ASC level II milestone ""Visualization on the supercomputing platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahrens, James P; Patchett, John M; Lo, Li - Ta

    2011-01-24

    This report provides documentation for the completion of the Los Alamos portion of the ASC Level II 'Visualization on the Supercomputing Platform' milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratory and Los Alamos National Laboratory. The milestone text is shown in Figure 1 with the Los Alamos portions highlighted in boldfaced text. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk, we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone evaluates the visualization and rendering performance of current and next-generation supercomputers in contrast to GPU-based visualization clusters, and evaluates the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone explores, evaluates and advances the maturity level of these technologies and their applicability to problems of interest to the ASC program. In conclusion, we improved CPU-based rendering performance by a factor of 2-10 times in our tests, and we evaluated both GPU- and CPU-based rendering performance. We encourage production visualization experts to consider

  16. Defense Science Board Report on Advanced Computing

    DTIC Science & Technology

    2009-03-01

    Computers will require extensive research and development to have a chance of reaching the exascale level. Even if exascale-level machines can ... generations of petascale and then exascale-level computing capability. This includes both the hardware and the complex software that may be required for the architectures needed for exascale capability. The challenges are extremely daunting, especially at the exascale

  17. Generation of large scale urban environments to support advanced sensor and seeker simulation

    NASA Astrophysics Data System (ADS)

    Giuliani, Joseph; Hershey, Daniel; McKeown, David, Jr.; Willis, Carla; Van, Tan

    2009-05-01

    One of the key aspects for the design of a next-generation weapon system is the need to operate in cluttered and complex urban environments. Simulation systems rely on accurate representation of these environments and require automated software tools to construct the underlying 3D geometry and associated spectral and material properties, which are then formatted for various objective seeker simulation systems. Under an Air Force Small Business Innovative Research (SBIR) contract, we have developed an automated process to generate 3D urban environments with user-defined properties. These environments can be composed from a wide variety of source materials, including vector source data, pre-existing 3D models, and digital elevation models, and rapidly organized into a geo-specific visual simulation database. This intermediate representation can be easily inspected in the visible spectrum for content and organization and interactively queried for accuracy. Once the database contains the required contents, it can then be exported into specific synthetic scene generation runtime formats, preserving the relationship between geometry and material properties. To date, an exporter for the Irma simulation system, developed and maintained by AFRL/Eglin, has been created; a second exporter, to the Real-Time Composite Hardbody and Missile Plume (CHAMP) simulation system, is currently being developed. This process supports significantly more complex target environments than previous approaches to database generation. In this paper we describe the capabilities for content creation for simulation of advanced seeker processing algorithms and for sensor stimulation, including the overall database compilation process and sample databases produced and exported for the Irma runtime system. We also discuss the addition of object dynamics and viewer dynamics within the visual simulation into the Irma runtime environment.

  18. Reactivity Initiated Accident Simulation to Inform Transient Testing of Candidate Advanced Cladding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas R; Wysocki, Aaron J; Terrani, Kurt A

    2016-01-01

    Advanced cladding materials with potentially enhanced accident tolerance will yield different light water reactor performance and safety characteristics than the present zirconium-based cladding alloys. These differences are due to different cladding material properties and responses to the transient and, to some extent, to reactor physics, thermal, and hydraulic characteristics. Some of the differences in reactor physics characteristics will be driven by the fundamental properties (e.g., absorption in iron for an iron-based cladding) and others will be driven by design modifications necessitated by the candidate cladding materials (e.g., a larger fuel pellet to compensate for parasitic absorption). Potential changes in thermal-hydraulic limits after transition from the current zirconium-based cladding to the advanced materials will also affect the transient response of the integral fuel. This paper leverages three-dimensional reactor core simulation capabilities to inform appropriate experimental test conditions for candidate advanced cladding materials in a control rod ejection event. These test conditions are derived from three-dimensional nodal kinetics simulations of a reactivity initiated accident (RIA) in a representative state-of-the-art pressurized water reactor with both nuclear-grade iron-chromium-aluminum (FeCrAl) and silicon carbide based (SiC-SiC) cladding materials. The effort yields boundary conditions for experimental mechanical tests, specifically peak cladding strain during the power pulse following the rod ejection. The impact of candidate cladding materials on the reactor kinetics behavior of RIA progression versus reference zirconium cladding is predominantly due to differences in: (1) fuel mass/volume/specific power density, (2) spectral effects due to parasitic neutron absorption, (3) control rod worth due to hardened (or softened) spectrum, and (4) initial conditions due to power peaking and neutron transport cross sections in

  19. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  20. Design and development of a virtual reality simulator for advanced cardiac life support training.

    PubMed

    Vankipuram, Akshay; Khanal, Prabal; Ashby, Aaron; Vankipuram, Mithra; Gupta, Ashish; DrummGurnee, Denise; Josey, Karen; Smith, Marshall

    2014-07-01

    The use of virtual reality (VR) training tools for medical education could lead to improvements in the skills of clinicians while providing economic incentives for healthcare institutions. The use of VR tools can also mitigate some of the drawbacks currently associated with providing medical training in a traditional clinical environment, such as scheduling conflicts and the need for specialized equipment (e.g., high-fidelity manikins). This paper presents the details of the framework and the development methodology associated with a VR-based training simulator for advanced cardiac life support, a time-critical, team-based medical scenario. In addition, we also report the key findings of a usability study conducted to assess the efficacy of various features of this VR simulator through a postuse questionnaire administered to various care providers. The usability questionnaires were completed by two groups that used two different versions of the VR simulator: one group used the VR trainer with all its features, the other a minified version with certain immersive features disabled. We found an increase in usability scores from the minified group to the full VR group.

  1. Time-domain hybrid method for simulating large amplitude motions of ships advancing in waves

    NASA Astrophysics Data System (ADS)

    Liu, Shukui; Papanikolaou, Apostolos D.

    2011-03-01

    Typical results obtained by a newly developed, nonlinear time domain hybrid method for simulating large amplitude motions of ships advancing with constant forward speed in waves are presented. The method is hybrid in the way of combining a time-domain transient Green function method and a Rankine source method. The present approach employs a simple double integration algorithm with respect to time to simulate the free-surface boundary condition. During the simulation, the diffraction and radiation forces are computed by pressure integration over the mean wetted surface, whereas the incident wave and hydrostatic restoring forces/moments are calculated on the instantaneously wetted surface of the hull. Typical numerical results of application of the method to the seakeeping performance of a standard containership, namely the ITTC S175, are herein presented. Comparisons have been made between the results from the present method, the frequency domain 3D panel method (NEWDRIFT) of NTUA-SDL and available experimental data and good agreement has been observed for all studied cases between the results of the present method and comparable other data.
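    The "simple double integration algorithm with respect to time" can be illustrated generically: integrate acceleration once to get velocity and again to get displacement. This is a semi-implicit Euler sketch on assumed toy dynamics, not the authors' free-surface scheme.

```python
def double_integrate(accel, dt, n_steps, x0=0.0, v0=0.0):
    """Integrate an acceleration history twice in time (semi-implicit Euler):
    first accumulate velocity, then displacement."""
    x, v = x0, v0
    history = []
    for k in range(1, n_steps + 1):
        v += accel(k * dt) * dt   # first integration: acceleration -> velocity
        x += v * dt               # second integration: velocity -> displacement
        history.append(x)
    return history

# constant acceleration a = 2: after n steps, x = a * dt**2 * n * (n + 1) / 2
xs = double_integrate(lambda t: 2.0, 0.01, 100)
```

    The scheme is explicit and cheap per step, which is why such simple time marching remains attractive inside a hybrid free-surface solver.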

  2. Advanced simulation technology used to reduce accident rates through a better understanding of human behaviors and human perception

    NASA Astrophysics Data System (ADS)

    Manser, Michael P.; Hancock, Peter A.

    1996-06-01

    Human beings and technology have attained a mutually dependent and symbiotic relationship. It is easy to recognize how each depends on the other for survival. It is also easy to see how technology advances due to human activities. However, the role technology plays in advancing humankind is seldom examined. This presentation examines two research areas where the role of advanced visual simulation systems play an integral and essential role in understanding human perception and behavior. The ultimate goal of this research is the betterment of humankind through reduced accident and death rates in transportation environments. The first research area examined involved the estimation of time-to-contact. A high-fidelity wrap-around simulator (RAS) was used to examine people's ability to estimate time-to- contact. The ability of people to estimate the amount of time before an oncoming vehicle will collide with them is a necessary skill for avoiding collisions. A vehicle approached participants at one of three velocities, and while en route to the participant, the vehicle disappeared. The participants' task was to respond when they felt the accuracy of time-to-contact estimates and the practical applications of the result. The second area of research investigates the effects of various visual stimuli on underground transportation tunnel walls for the perception of vehicle speed. A RAS is paramount in creating visual patterns in peripheral vision. Flat-screen or front-screen simulators do not have this ability. Results are discussed in terms of speed perception and the application of these results to real world environments.

  3. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed which provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and validate the multidimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multidimensional numerical model which resulted in a net heat input of 240.3 W. The computational methodology resulted in a value of net heat input that was 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.
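    The 1.7 percent figure quoted above follows directly from the two net heat input values:

```python
measured, predicted = 244.4, 240.3          # W, validation test vs. model
shortfall_pct = 100.0 * (measured - predicted) / measured
print(round(shortfall_pct, 1))              # → 1.7 (percent below measured)
```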

  4. Simulation of Swap-Out Reliability For The Advance Photon Source Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borland, M.

    2017-06-01

    The proposed upgrade of the Advanced Photon Source (APS) to a multibend-achromat lattice relies on the use of swap-out injection to accommodate the small dynamic acceptance, allow use of unusual insertion devices, and minimize collective effects at high single-bunch charge. This, combined with the short beam lifetime, will make injector reliability even more important than it is for top-up operation. We used historical data for the APS injector complex to obtain probability distributions for injector up-time and down-time durations. Using these distributions, we simulated several years of swap-out operation for the upgraded lattice for several operating modes. The results indicate that obtaining very high availability of beam in the storage ring will require improvements to injector reliability.

  5. Simulation and Advanced Practice Nursing Education

    ERIC Educational Resources Information Center

    Blue, Dawn I.

    2016-01-01

    This quantitative study compared changes in level of confidence resulting from participation in simulation or traditional instructional methods for BSN (Bachelor of Science in Nursing) to DNP (Doctor of Nursing Practice) students in a nurse practitioner course when they entered the clinical practicum. Simulation has been used in many disciplines…

  6. A randomized, controlled trial of in situ pediatric advanced life support recertification ("pediatric advanced life support reconstructed") compared with standard pediatric advanced life support recertification for ICU frontline providers*.

    PubMed

    Kurosawa, Hiroshi; Ikeyama, Takanari; Achuff, Patricia; Perkel, Madeline; Watson, Christine; Monachino, Annemarie; Remy, Daphne; Deutsch, Ellen; Buchanan, Newton; Anderson, Jodee; Berg, Robert A; Nadkarni, Vinay M; Nishisaki, Akira

    2014-03-01

    Recent evidence shows poor retention of Pediatric Advanced Life Support provider skills. Frequent refresher training and in situ simulation are promising interventions. We developed a "Pediatric Advanced Life Support-reconstructed" recertification course by deconstructing the training into six 30-minute in situ simulation scenario sessions delivered over 6 months. We hypothesized that in situ Pediatric Advanced Life Support-reconstructed implementation is feasible and as effective as standard Pediatric Advanced Life Support recertification. A prospective randomized, single-blinded trial. Single-center, large, tertiary PICU in a university-affiliated children's hospital. Nurses and respiratory therapists in PICU. Simulation-based modular Pediatric Advanced Life Support recertification training. Simulation-based pre- and postassessment sessions were conducted to evaluate participants' performance. Video-recorded sessions were rated by trained raters blinded to allocation. The primary outcome was skill performance measured by a validated Clinical Performance Tool, and secondary outcome was behavioral performance measured by a Behavioral Assessment Tool. A mixed-effect model was used to account for baseline differences. Forty participants were prospectively randomized to Pediatric Advanced Life Support reconstructed versus standard Pediatric Advanced Life Support with no significant difference in demographics. Clinical Performance Tool score was similar at baseline in both groups and improved after Pediatric Advanced Life Support reconstructed (pre, 16.3 ± 4.1 vs post, 22.4 ± 3.9; p < 0.001), but not after standard Pediatric Advanced Life Support (pre, 14.3 ± 4.7 vs post, 14.9 ± 4.4; p =0.59). Improvement of Clinical Performance Tool was significantly higher in Pediatric Advanced Life Support reconstructed compared with standard Pediatric Advanced Life Support (p = 0.006). Behavioral Assessment Tool improved in both groups: Pediatric Advanced Life Support

  7. Assessment of driving-related performance in chronic whiplash using an advanced driving simulator.

    PubMed

    Takasaki, Hiroshi; Treleaven, Julia; Johnston, Venerina; Rakotonirainy, Andry; Haines, Andrew; Jull, Gwendolen

    2013-11-01

    Driving is often nominated as problematic by individuals with chronic whiplash associated disorders (WAD), yet driving-related performance has not been evaluated objectively. The purpose of this study was to test driving-related performance in persons with chronic WAD against healthy controls of similar age, gender and driving experience to determine if driving-related performance in the WAD group was sufficiently impaired to recommend fitness to drive assessment. Driving-related performance was assessed using an advanced driving simulator during three driving scenarios: freeway, residential, and a central business district (CBD). Total driving duration was approximately 15 min. Five driving tasks which could cause a collision (critical events) were included in the scenarios. In addition, the effect of divided attention (identify red dots projected onto side or rear view mirrors) was assessed three times in each scenario. Driving performance was measured using the simulator performance index (SPI), which is calculated from 12 measures. z-Scores for all SPI measures were calculated for each WAD subject based on the mean and standard deviation values of the control subjects. The z-scores were then averaged for the WAD group. A z-score of ≤-2 indicated a driving failing grade in the simulator. The number of collisions over the five critical events was compared between the WAD and control groups, as were reaction time and missed response ratio in identifying the red dots. Seventeen WAD and 26 control subjects commenced the driving assessment. Demographic data were comparable between the groups. All subjects completed the freeway scenario but four withdrew during the residential and eight during the CBD scenario because of motion sickness. All scenarios were completed by 14 WAD and 17 control subjects. The mean z-score for the SPI over the three scenarios was statistically lower in the WAD group (-0.3±0.3; P<0.05) but the score was not below the cut-off point for safe driving. There were no
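    The scoring rule described (z-scoring each WAD subject against the control group's mean and standard deviation, with -2 as the failing cutoff) can be sketched as follows; the numbers below are illustrative, not study data.

```python
from statistics import mean, stdev

def z_scores(subject_values, control_values):
    """z-score each subject's SPI measure against the control group."""
    mu, sd = mean(control_values), stdev(control_values)
    return [(x - mu) / sd for x in subject_values]

def failing_grade(z, cutoff=-2.0):
    """A z-score at or below the cutoff indicates a failing grade."""
    return z <= cutoff

controls = [10, 12, 14, 16, 18]          # hypothetical control SPI values
zs = z_scores([14.0, 7.6], controls)     # 7.6 falls > 2 SD below the mean
```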

  8. Advanced Simulation and Computing Business Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rummel, E.

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  9. Message passing interface and multithreading hybrid for parallel molecular docking of large databases on petascale high performance computing machines.

    PubMed

    Zhang, Xiaohua; Wong, Sergio E; Lightstone, Felice C

    2013-04-30

    A mixed parallel scheme that combines message passing interface (MPI) and multithreading was implemented in the AutoDock Vina molecular docking program. The resulting program, named VinaLC, was tested on the petascale high performance computing (HPC) machines at Lawrence Livermore National Laboratory. To exploit the typical cluster-type supercomputers, thousands of docking calculations were dispatched by the master process to run simultaneously on thousands of slave processes, where each docking calculation takes one slave process on one node, and within the node each docking calculation runs via multithreading on multiple CPU cores and shared memory. Input and output of the program and the data handling within the program were carefully designed to deal with large databases and ultimately achieve HPC on a large number of CPU cores. Parallel performance analysis of the VinaLC program shows that the code scales up to more than 15K CPUs with a very low overhead cost of 3.94%. One million flexible compound docking calculations took only 1.4 h to finish on about 15K CPUs. The docking accuracy of VinaLC has been validated against the DUD data set by the re-docking of X-ray ligands and an enrichment study; 64.4% of the top-scoring poses have RMSD values under 2.0 Å. The program has been demonstrated to have good enrichment performance on 70% of the targets in the DUD data set. An analysis of the enrichment factors calculated at various percentages of the screening database indicates VinaLC has very good early recovery of actives. Copyright © 2013 Wiley Periodicals, Inc.
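    The dispatch pattern (a master handing one docking calculation per node, each calculation multithreaded across that node's cores) can be mimicked in miniature. Thread pools stand in here for both the MPI slave processes and the per-core threads, and `score_pose` is a hypothetical stand-in for the scoring function; this illustrates the scheduling scheme, not VinaLC's code.

```python
from concurrent.futures import ThreadPoolExecutor

def score_pose(ligand_id, pose):
    """Hypothetical stand-in for scoring one docked pose."""
    return (ligand_id * 31 + pose * 7) % 97

def dock_ligand(ligand_id, n_cores=4, n_poses=8):
    """One docking calculation: multithreaded over the node's cores,
    all threads sharing memory."""
    with ThreadPoolExecutor(max_workers=n_cores) as cores:
        scores = cores.map(lambda p: score_pose(ligand_id, p), range(n_poses))
    return ligand_id, min(scores)   # keep the best (lowest) score

def master_dispatch(ligand_ids, n_nodes=3):
    """Master hands one ligand to each idle worker; in the real scheme
    these workers are MPI slave processes, one per node."""
    with ThreadPoolExecutor(max_workers=n_nodes) as nodes:
        return dict(nodes.map(dock_ligand, ligand_ids))
```

    Because each ligand is independent, the master never blocks on a slow docking run; idle workers simply pull the next ligand, which is what lets the scheme scale to thousands of nodes.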

  10. Advances in Global Full Waveform Inversion

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Bozdag, E.; Lei, W.; Ruan, Y.; Lefebvre, M. P.; Modrak, R. T.; Orsvuran, R.; Smith, J. A.; Komatitsch, D.; Peter, D. B.

    2017-12-01

    Information about Earth's interior comes from seismograms recorded at its surface. Seismic imaging based on spectral-element and adjoint methods has enabled assimilation of this information for the construction of 3D (an)elastic Earth models. These methods account for the physics of wave excitation and propagation by numerically solving the equations of motion, and require the execution of complex computational procedures that challenge the most advanced high-performance computing systems. Current research is petascale; future research will require exascale capabilities. The inverse problem consists of reconstructing the characteristics of the medium from (often noisy) observations. A nonlinear functional is minimized, which involves both the misfit to the measurements and a Tikhonov-type regularization term to tackle inherent ill-posedness. Achieving scalability for the inversion process on tens of thousands of multicore processors is a task that offers many research challenges. We initiated global "adjoint tomography" using 253 earthquakes and produced the first-generation model named GLAD-M15, with a transversely isotropic model parameterization. We are currently running iterations for a second-generation anisotropic model based on the same 253 events. In parallel, we continue iterations for a transversely isotropic model with a larger dataset of 1,040 events to determine higher-resolution plume and slab images. A significant part of our research has focused on eliminating I/O bottlenecks in the adjoint tomography workflow. This has led to the development of a new Adaptable Seismic Data Format based on HDF5, and post-processing tools based on the ADIOS library developed by Oak Ridge National Laboratory. We use the Ensemble Toolkit for workflow stabilization and management to automate the workflow with minimal human interaction.
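    The regularized functional has the generic form chi(m) = ||d - G(m)||^2 + lambda * ||m||^2. In the linearized case its minimizer is available in closed form via the normal equations, which a minimal numpy sketch can demonstrate (illustrative only; the actual adjoint-tomography problem is nonlinear and matrix-free):

```python
import numpy as np

def tikhonov_solve(G, d, lam):
    """Minimize ||G m - d||^2 + lam * ||m||^2 by solving the normal
    equations (G^T G + lam I) m = G^T d."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ d)

G = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # toy forward operator
m_true = np.array([2.0, 3.0])
d = G @ m_true                                       # noise-free "data"
m_fit = tikhonov_solve(G, d, 0.0)                    # recovers m_true
m_reg = tikhonov_solve(G, d, 10.0)                   # damped, smaller norm
```

    Raising lambda trades data fit for a smaller-norm model, which is exactly how the regularization term tames the ill-posedness mentioned above.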

  11. Precision Casting via Advanced Simulation and Manufacturing

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A two-year program was conducted to develop and commercially implement selected casting manufacturing technologies to enable significant reductions in the costs of castings, increase the complexity and dimensional accuracy of castings, and reduce the development times for delivery of high quality castings. The industry-led R&D project was cost-shared with NASA's Aerospace Industry Technology Program (AITP). The Rocketdyne Division of Boeing North American, Inc. served as the team lead with participation from Lockheed Martin, Ford Motor Company, Howmet Corporation, PCC Airfoils, General Electric, UES, Inc., University of Alabama, Auburn University, Robinson, Inc., Aracor, and NASA-LeRC. The technical effort was organized into four distinct tasks. The accomplishments are reported herein. Task 1.0 developed advanced simulation technology for core molding. Ford headed up this task. On this program, a specialized core machine was designed and built. Task 2.0 focused on intelligent process control for precision core molding. Howmet led this effort. The primary focus of these experimental efforts was to characterize the process parameters that have a strong impact on dimensional control issues of injection molded cores during their fabrication. Task 3.0 developed and applied rapid prototyping to produce near net shape castings. Rocketdyne was responsible for this task. CAD files were generated using reverse engineering, rapid prototype patterns were fabricated using SLS and SLA, and castings produced and evaluated. Task 4.0 was aimed at developing technology transfer. Rocketdyne coordinated this task. Casting related technology, explored and evaluated in the first three tasks of this program, was implemented into manufacturing processes.

  12. Intricacies of modern supercomputing illustrated with recent advances in simulations of strongly correlated electron systems

    NASA Astrophysics Data System (ADS)

    Schulthess, Thomas C.

    2013-03-01

    The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.

  13. NASA Advanced Supercomputing Facility Expansion

    NASA Technical Reports Server (NTRS)

    Thigpen, William W.

    2017-01-01

    The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.

  14. Parameter identification studies on the NASA/Ames Research Center Advanced Concepts Flight Simulator. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Mckavitt, Thomas P., Jr.

    1990-01-01

    The results of an aircraft parameter identification study conducted on the National Aeronautics and Space Administration/Ames Research Center Advanced Concepts Flight Simulator (ACFS) in conjunction with the Navy-NASA Joint Institute of Aeronautics are given. The ACFS is a commercial airline simulator with a design based on future technology. The simulator is used as a laboratory for human factors research and engineering as applied to the commercial airline industry. Parametric areas examined were engine pressure ratio (EPR), optimum long range cruise Mach number, flap reference speed, and critical take-off speeds. Results were compared with corresponding parameters of the Boeing 757 and 767 aircraft. This comparison identified two areas where improvements can be made: (1) low maximum lift coefficients (on the order of 20-25 percent less than those of a 757); and (2) low optimum cruise Mach numbers. Recommendations were made to bring these values in line with those anticipated with the application of future technologies.

  15. Correlation of Simulation Examination to Written Test Scores for Advanced Cardiac Life Support Testing: Prospective Cohort Study.

    PubMed

    Strom, Suzanne L; Anderson, Craig L; Yang, Luanna; Canales, Cecilia; Amin, Alpesh; Lotfipour, Shahram; McCoy, C Eric; Osborn, Megan Boysen; Langdorf, Mark I

    2015-11-01

    Traditional Advanced Cardiac Life Support (ACLS) courses are evaluated using written multiple-choice tests. High-fidelity simulation is a widely used adjunct to didactic content, and has been used in many specialties as a training resource as well as an evaluative tool. There are no data to our knowledge that compare simulation examination scores with written test scores for ACLS courses. To compare and correlate a novel high-fidelity simulation-based evaluation with traditional written testing for senior medical students in an ACLS course. We performed a prospective cohort study to determine the correlation between simulation-based evaluation and traditional written testing in a medical school simulation center. Students were tested on a standard acute coronary syndrome/ventricular fibrillation cardiac arrest scenario. Our primary outcome measure was correlation of exam results for 19 volunteer fourth-year medical students after a 32-hour ACLS-based Resuscitation Boot Camp course. Our secondary outcome was comparison of simulation-based vs. written outcome scores. The composite average score on the written evaluation was substantially higher (93.6%) than the simulation performance score (81.3%, absolute difference 12.3%, 95% CI [10.6-14.0%], p<0.00005). We found a statistically significant moderate correlation between simulation scenario test performance and traditional written testing (Pearson r=0.48, p=0.04), validating the new evaluation method. Simulation-based ACLS evaluation methods correlate with traditional written testing and demonstrate resuscitation knowledge and skills. Simulation may be a more discriminating and challenging testing method, as students scored higher on written evaluation methods compared to simulation.
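    The reported agreement (Pearson r = 0.48) is the standard product-moment correlation over paired per-student scores. A self-contained sketch with made-up score pairs, not the study's data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between paired scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# perfectly correlated and perfectly anti-correlated toy score pairs
r_pos = pearson_r([70, 80, 90], [75, 85, 95])
r_neg = pearson_r([70, 80, 90], [95, 85, 75])
```

    A moderate r like 0.48 means the two assessments rank students similarly but far from identically, consistent with the authors' claim that simulation discriminates skills the written test misses.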

  16. Characterization and Simulation of the Thermoacoustic Instability Behavior of an Advanced, Low Emissions Combustor Prototype

    NASA Technical Reports Server (NTRS)

    DeLaat, John C.; Paxson, Daniel E.

    2008-01-01

    Extensive research is being done toward the development of ultra-low-emissions combustors for aircraft gas turbine engines. However, these combustors have an increased susceptibility to thermoacoustic instabilities. This type of instability was recently observed in an advanced, low emissions combustor prototype installed in a NASA Glenn Research Center test stand. The instability produces pressure oscillations that grow with increasing fuel/air ratio, preventing full power operation. The instability behavior makes the combustor a potentially useful test bed for research into active control methods for combustion instability suppression. The instability behavior was characterized by operating the combustor at various pressures, temperatures, and fuel and air flows representative of operation within an aircraft gas turbine engine. Trends in instability behavior versus operating condition have been identified and documented, and possible explanations for the trends provided. A simulation developed at NASA Glenn captures the observed instability behavior. The physics-based simulation includes the relevant physical features of the combustor and test rig, employs a Sectored 1-D approach, includes simplified reaction equations, and provides time-accurate results. A computationally efficient method is used for area transitions, which decreases run times and allows the simulation to be used for parametric studies, including control method investigations. Simulation results show that the simulation exhibits a self-starting, self-sustained combustion instability and also replicates the experimentally observed instability trends versus operating condition. Future plans are to use the simulation to investigate active control strategies to suppress combustion instabilities and then to experimentally demonstrate active instability suppression with the low emissions combustor prototype, enabling full power, stable operation.

  17. On the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods

    PubMed Central

    Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.

    2011-01-01

We present a case-study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples, including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups of 35- to 500-fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data-rich domains through the availability of cheap and accessible many-core computation. We believe the speedups we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276
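The key property the record relies on is that population-based Monte Carlo draws are mutually independent, so the work splits cleanly across threads or GPU blocks. A minimal pure-Python sketch of that decomposition (the pi-estimation example and chunk sizes are illustrative, not from the paper):

```python
import math
import random

def mc_pi(n_samples, seed):
    """Plain Monte Carlo estimate of pi. Every sample is independent,
    which is what makes this class of method embarrassingly parallel:
    on a GPU, each thread would draw and test its own samples."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return 4.0 * hits / n_samples

# Split the work into independent chunks, as independent GPU blocks
# would, each with its own RNG stream, then average the partial results.
estimates = [mc_pi(25_000, seed) for seed in range(4)]
estimate = sum(estimates) / len(estimates)
```

On real hardware the chunks run concurrently; the per-chunk seeding mimics the independent random streams a GPU implementation must provide.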

  18. Interactive Visualization to Advance Earthquake Simulation

    NASA Astrophysics Data System (ADS)

    Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn

    2008-04-01

The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth’s surface and interior. Virtual mapping tools allow virtual “field studies” in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method’s strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations.

  19. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  20. An advanced constitutive model in the sheet metal forming simulation: the Teodosiu microstructural model and the Cazacu Barlat yield criterion

    NASA Astrophysics Data System (ADS)

    Alves, J. L.; Oliveira, M. C.; Menezes, L. F.

    2004-06-01

Two constitutive models used to describe the plastic behavior of sheet metals in the numerical simulation of sheet metal forming processes are studied: a recently proposed advanced constitutive model, based on the Teodosiu microstructural model and the Cazacu Barlat yield criterion, is compared with a more classical one, based on the Swift law and the Hill 1948 yield criterion. These constitutive models are implemented in DD3IMP, an in-house finite element code specifically developed to simulate sheet metal forming processes: a 3-D elastoplastic code with an updated Lagrangian formulation and a fully implicit time integration scheme, accounting for large elastoplastic strains and rotations. Solid finite elements and parametric surfaces are used to model the blank sheet and the tool surfaces, respectively. Some details of the numerical implementation of the constitutive models are given. Finally, the theory is illustrated with the numerical simulation of the deep drawing of a cylindrical cup. The results show that the proposed advanced constitutive model predicts the final shape (mean height and ear profile) of the formed part more accurately, as can be concluded from the comparison with the experimental results.

  1. On-time reliability impacts of advanced traveler information services (ATIS). Volume II, Extensions and applications of the simulated yoked study concept

    DOT National Transportation Integrated Search

    2002-03-01

    In a simulated yoke study, estimates of roadway travel times are archived from web-based Advanced Traveler Information Systems (ATIS) and used to recreate hypothetical, retrospective paired driving trials between travelers with and without ATIS. Prev...

  2. Simulations of Liners and Test Objects for a New Atlas Advanced Radiography Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. V. Morgan; S. Iversen; R. A. Hilko

    2002-06-01

The Advanced Radiographic Source (ARS) will improve the data significantly due to its smaller source width. Because of the enhanced ARS output, larger source-to-object distances are a reality. The harder ARS source will allow radiography of thick high-Z targets. The five different spectral simulations resulted in similar imaging-detector-weighted transmission. This work used a limited set of test objects and imaging detectors; other test objects and imaging detectors could possibly change the MVp-sensitivity result. The effect of material motion blur must be considered for the ARS due to the expected smaller X-ray source size. This study supports the original 1.5-MVp value.

  3. Analog-digital simulation of transient-induced logic errors and upset susceptibility of an advanced control system

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Choi, G.; Iyer, R. K.

    1990-01-01

    A simulation study is described which predicts the susceptibility of an advanced control system to electrical transients resulting in logic errors, latched errors, error propagation, and digital upset. The system is based on a custom-designed microprocessor and it incorporates fault-tolerant techniques. The system under test and the method to perform the transient injection experiment are described. Results for 2100 transient injections are analyzed and classified according to charge level, type of error, and location of injection.

  4. Advancing renal education: hybrid simulation, using simulated patients to enhance realism in haemodialysis education.

    PubMed

    Dunbar-Reid, Kylie; Sinclair, Peter M; Hudson, Denis

    2015-06-01

    Simulation is a well-established and proven teaching method, yet its use in renal education is not widely reported. Criticisms of simulation-based teaching include limited realism and a lack of authentic patient interaction. This paper discusses the benefits and challenges of high-fidelity simulation and suggests hybrid simulation as a complementary model to existing simulation programmes. Through the use of a simulated patient, hybrid simulation can improve the authenticity of renal simulation-based education while simultaneously teaching and assessing technologically enframed caring. © 2015 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  5. Molecular dynamics simulations: advances and applications

    PubMed Central

    Hospital, Adam; Goñi, Josep Ramon; Orozco, Modesto; Gelpí, Josep L

    2015-01-01

Molecular dynamics simulations have evolved into a mature technique that can be used effectively to understand macromolecular structure-to-function relationships. Present simulation times are close to biologically relevant ones. The information gathered about the dynamic properties of macromolecules is rich enough to shift the usual paradigm of structural bioinformatics from studying single structures to analyzing conformational ensembles. Here, we describe the foundations of molecular dynamics and the improvements made toward obtaining such ensembles. Specific application of the technique to three main issues (allosteric regulation, docking, and structure refinement) is discussed. PMID:26604800

  6. Parallel processing of real-time dynamic systems simulation on OSCAR (Optimally SCheduled Advanced multiprocessoR)

    NASA Technical Reports Server (NTRS)

    Kasahara, Hironori; Honda, Hiroki; Narita, Seinosuke

    1989-01-01

Parallel processing of real-time dynamic systems simulation on a multiprocessor system named OSCAR is presented. In the simulation of dynamic systems, the same calculations are generally repeated at every time step. However, Do-all and Do-across techniques cannot be applied to parallel processing of the simulation, since there exist data dependencies from the end of one iteration to the beginning of the next, and furthermore data input and data output are required every sampling period. Therefore, parallelism inside the calculation required for a single time step, i.e., a large basic block consisting of arithmetic assignment statements, must be exploited. In the proposed method, near-fine-grain tasks, each of which consists of one or more floating point operations, are generated to extract the parallelism from the calculation and are assigned to processors by optimal static scheduling at compile time, in order to avoid the large run-time overhead that the use of near-fine-grain tasks would otherwise incur. The practicality of the scheme is demonstrated on OSCAR (Optimally SCheduled Advanced multiprocessoR), which has been developed to exploit the advantages of static scheduling algorithms to the maximum extent.
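The compile-time assignment described in the abstract is a form of static list scheduling: order tasks by a priority such as critical-path (bottom-level) length, then place each on the processor that lets it start earliest. A small sketch of that idea (the task graph, durations, and priority metric here are illustrative, not OSCAR's actual scheduler):

```python
# Static list scheduling of a small task graph onto processors at
# "compile time". Tasks, durations, and dependencies are made up.
tasks = {"A": 2, "B": 3, "C": 2, "D": 1}          # task -> duration
deps = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

def bottom_level(t):
    # Length of the longest path from t to an exit task: a common
    # priority metric (critical-path-first) for list scheduling.
    succs = [s for s, ds in deps.items() if t in ds]
    return tasks[t] + (max(map(bottom_level, succs)) if succs else 0)

def list_schedule(n_procs):
    order = sorted(tasks, key=bottom_level, reverse=True)
    proc_free = [0.0] * n_procs   # time each processor becomes idle
    finish = {}                   # task -> finish time
    schedule = {}                 # task -> (processor, start time)
    for t in order:
        ready = max((finish[d] for d in deps[t]), default=0.0)
        p = min(range(n_procs), key=lambda i: max(proc_free[i], ready))
        start = max(proc_free[p], ready)
        finish[t] = start + tasks[t]
        proc_free[p] = finish[t]
        schedule[t] = (p, start)
    return schedule, max(finish.values())

schedule, makespan = list_schedule(2)
```

Because the whole assignment is fixed before execution, there is no run-time scheduling overhead, which is exactly why this style suits near-fine-grain tasks.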

  7. An advanced simulator for orthopedic surgical training.

    PubMed

    Cecil, J; Gupta, Avinash; Pirela-Cruz, Miguel

    2018-02-01

The purpose of creating the virtual reality (VR) simulator is to facilitate and supplement the training opportunities provided to orthopedic residents. The use of VR simulators has increased rapidly in the field of medical surgery for training purposes. This paper discusses the creation of the virtual surgical environment (VSE) for training residents in an orthopedic surgical process called less invasive stabilization system (LISS) surgery, which is used to address fractures of the femur. The overall methodology included first obtaining an understanding of the LISS plating process through interactions with expert orthopedic surgeons and developing the information centric models. The information centric models provided a structured basis to design and build the simulator. Subsequently, the haptic-based simulator was built. Finally, learning assessments were conducted in a medical school. The results from the learning assessments confirm the effectiveness of the VSE for teaching medical residents and students. The scope of the assessment was to ensure (1) the correctness and (2) the usefulness of the VSE. Out of 37 residents/students who participated in the test, 32 showed improvements in their understanding of the LISS plating surgical process. A majority of participants were satisfied with the use of teaching avatars and haptic technology. A paired t test conducted on the assessment data showed that the improvements were statistically significant. This paper demonstrates the usefulness of adopting an information centric modeling approach in the design and development of the simulator. The assessment results underscore the potential of using VR-based simulators in medical education, especially in orthopedic surgery.

  8. Advanced End-to-end Simulation for On-board Processing (AESOP)

    NASA Technical Reports Server (NTRS)

    Mazer, Alan S.

    1994-01-01

    Developers of data compression algorithms typically use their own software together with commercial packages to implement, evaluate and demonstrate their work. While convenient for an individual developer, this approach makes it difficult to build on or use another's work without intimate knowledge of each component. When several people or groups work on different parts of the same problem, the larger view can be lost. What's needed is a simple piece of software to stand in the gap and link together the efforts of different people, enabling them to build on each other's work, and providing a base for engineers and scientists to evaluate the parts as a cohesive whole and make design decisions. AESOP (Advanced End-to-end Simulation for On-board Processing) attempts to meet this need by providing a graphical interface to a developer-selected set of algorithms, interfacing with compiled code and standalone programs, as well as procedures written in the IDL and PV-Wave command languages. As a proof of concept, AESOP is outfitted with several data compression algorithms integrating previous work on different processors (AT&T DSP32C, TI TMS320C30, SPARC). The user can specify at run-time the processor on which individual parts of the compression should run. Compressed data is then fed through simulated transmission and uncompression to evaluate the effects of compression parameters, noise and error correction algorithms. The following sections describe AESOP in detail. Section 2 describes fundamental goals for usability. Section 3 describes the implementation. Sections 4 through 5 describe how to add new functionality to the system and present the existing data compression algorithms. Sections 6 and 7 discuss portability and future work.
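The glue role AESOP plays, chaining developer-supplied stages (compress, simulated channel, decompress) so the end-to-end effect can be evaluated, can be sketched with a trivial pipeline. The stage functions and the run-length scheme below are illustrative stand-ins, not AESOP's actual interfaces:

```python
# A minimal stage-chaining pipeline in the spirit of AESOP: each stage
# is a plain callable, so stages written by different people compose
# without knowledge of each other's internals. The run-length coder is
# a toy example, not one of AESOP's algorithms.
def rle_compress(data):
    out, i = [], 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        out.append((data[i], j - i))   # (symbol, run length)
        i = j
    return out

def rle_decompress(pairs):
    out = []
    for sym, count in pairs:
        out.extend([sym] * count)
    return out

def run_pipeline(data, stages):
    # Feed the output of each stage into the next, end to end.
    for stage in stages:
        data = stage(data)
    return data

result = run_pipeline([7, 7, 7, 2, 2, 9], [rle_compress, rle_decompress])
```

A noisy-channel or error-correction stage would slot between the two coders in the same way, which is how an end-to-end framework lets one evaluate compression parameters against transmission effects.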

  9. Advanced adaptive computational methods for Navier-Stokes simulations in rotorcraft aerodynamics

    NASA Technical Reports Server (NTRS)

    Stowers, S. T.; Bass, J. M.; Oden, J. T.

    1993-01-01

A phase 2 research and development effort was conducted in the area of transonic, compressible, inviscid flows, with the ultimate goal of numerically modeling the complex flows inherent in advanced helicopter blade designs. The algorithms and methodologies developed are classified as adaptive methods: they use error estimation techniques to approximate the local numerical error, and automatically refine or unrefine the mesh so as to deliver a given level of accuracy. The result is a scheme that attempts to produce the best possible results with the least number of grid points, degrees of freedom, and operations. These types of schemes automatically locate and resolve shocks, shear layers, and other flow details to an accuracy level specified by the user of the code. The phase 1 work involved a feasibility study of h-adaptive methods for steady viscous flows, with emphasis on accurate simulation of vortex initiation, migration, and interaction. The phase 2 effort focused on extending these algorithms and methodologies to a three-dimensional topology.

  10. Update: Advancement of Contact Dynamics Modeling for Human Spaceflight Simulation Applications

    NASA Technical Reports Server (NTRS)

    Brain, Thomas A.; Kovel, Erik B.; MacLean, John R.; Quiocho, Leslie J.

    2017-01-01

    Pong is a new software tool developed at the NASA Johnson Space Center that advances interference-based geometric contact dynamics based on 3D graphics models. The Pong software consists of three parts: a set of scripts to extract geometric data from 3D graphics models, a contact dynamics engine that provides collision detection and force calculations based on the extracted geometric data, and a set of scripts for visualizing the dynamics response with the 3D graphics models. The contact dynamics engine can be linked with an external multibody dynamics engine to provide an integrated multibody contact dynamics simulation. This paper provides a detailed overview of Pong including the overall approach and modeling capabilities, which encompasses force generation from contact primitives and friction to computational performance. Two specific Pong-based examples of International Space Station applications are discussed, and the related verification and validation using this new tool are also addressed.
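Interference-based contact of the kind Pong's engine performs starts from a geometric primitive test: detect overlap, then generate a force from the penetration. A sketch for the simplest primitive pair, two spheres with a linear penalty law (the spring law and stiffness are illustrative assumptions, not Pong's actual contact model):

```python
import math

def sphere_contact_force(c1, r1, c2, r2, stiffness):
    """Return the contact force on body 1 from sphere-sphere overlap.
    If the spheres interpenetrate, apply a penalty force proportional
    to the penetration depth, directed along the line of centers.
    Linear spring law and stiffness value are illustrative only."""
    d = [a - b for a, b in zip(c1, c2)]          # center-to-center vector
    dist = math.sqrt(sum(x * x for x in d))
    penetration = (r1 + r2) - dist
    if penetration <= 0.0 or dist == 0.0:
        return [0.0, 0.0, 0.0]                   # no contact (or degenerate)
    normal = [x / dist for x in d]               # pushes body 1 away from body 2
    return [stiffness * penetration * n for n in normal]

# Unit spheres whose centers are 1.5 apart overlap by 0.5.
f = sphere_contact_force([0.0, 0.0, 0.0], 1.0, [1.5, 0.0, 0.0], 1.0, 100.0)
```

In an integrated simulation this force (plus a friction term) would be handed to the external multibody dynamics engine at each step, which is the coupling the record describes.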

  11. Advanced Beamline Design for Fermilab's Advanced Superconducting Test Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prokop, Christopher

    2014-01-01

    The Advanced Superconducting Test Accelerator (ASTA) at Fermilab is a new electron accelerator currently in the commissioning stage. In addition to testing superconducting accelerating cavities for future accelerators, it is foreseen to support a variety of Advanced Accelerator R&D (AARD) experiments. Producing the required electron bunches with the expected flexibility is challenging. The goal of this dissertation is to explore via numerical simulations new accelerator beamlines that can enable the advanced manipulation of electron bunches. The work especially includes the design of a low-energy bunch compressor and a study of transverse-to-longitudinal phase space exchangers.

  12. Active-learning diabetes simulation in an advanced pharmacy practice experience to develop patient empathy.

    PubMed

    Whitley, Heather P

    2012-12-12

    To develop and integrate an active-learning diabetes simulation into an advanced pharmacy practice experience to improve pharmacy students' empathy toward patients with diabetes mellitus. Students simulated the experience of having diabetes mellitus by conducting activities commonly prescribed to those with this disease state for 7 days, after which they submitted a standardized diabetes log and narrative reflection. Interpretive phenomenology design with thematic analysis was used to determine the impact of this experience on the students. As shown in student reflections, 95% developed empathy, 97% found the experience beneficial, and 67% improved their ability to relate to and counsel patients. Most (95%) found difficulty adhering to the regimen. On average, students consumed 179 grams of carbohydrates per day and exercised 5 days or 215 minutes per week. Additionally, 69% decided to modify their personal habits to become healthier. Inclusion of the 7-day active-learning exercise greatly impacted student pharmacists' self-reported empathy toward and ability to relate to patients with diabetes mellitus. Completion of this experience may result in long-lasting personal behavior modifications.

  13. Blending technology in teaching advanced health assessment in a family nurse practitioner program: using personal digital assistants in a simulation laboratory.

    PubMed

    Elliott, Lydia; DeCristofaro, Claire; Carpenter, Alesia

    2012-09-01

This article describes the development and implementation of integrated use of personal handheld devices (personal digital assistants, PDAs) and high-fidelity simulation in an advanced health assessment course in a graduate family nurse practitioner (NP) program. A teaching tool was developed that can be utilized as a template for clinical case scenarios blending these separate technologies. Review of the evidence-based literature, including peer-reviewed articles and reviews. Blending the technologies of high-fidelity simulation and handheld devices (PDAs) provided a positive learning experience for graduate NP students in a teaching laboratory setting. Combining both technologies in clinical case scenarios offered a more real-world learning experience, with a focus on point-of-care service and integration of interview and physical assessment skills with existing standards of care and external clinical resources. Faculty modeling and advance training with PDA technology were crucial to success. Faculty developed a general template tool and systems-based clinical scenarios integrating PDA and high-fidelity simulation. Faculty observations, the general template tool, and one scenario example are included in this article. ©2012 The Author(s) Journal compilation ©2012 American Academy of Nurse Practitioners.

  14. Does an Advanced Pelvic Simulation Curriculum Improve Resident Performance on a Pediatric and Adolescent Gynecology Focused Objective Structured Clinical Examination? A Cohort Study.

    PubMed

    Dumont, Tania; Hakim, Julie; Black, Amanda; Fleming, Nathalie

    2016-06-01

    To determine the effect of an advanced pelvic simulation curriculum on resident performance on a pediatric and adolescent gynecology (PAG) focused objective structured clinical examination (OSCE). Obstetrics and gynecology residents in a single academic Canadian center participated in a PAG simulation curriculum. An OSCE on prepubertal vaginal bleeding was administered at the biannual OSCE examination 2 months before the simulation curriculum and again 3 months after the simulation curriculum. Academic half-day at the University of Ottawa Skills and Simulation Centre. Obstetrics and gynecology residents from the University of Ottawa. Participants completed 4 stations teaching PAG-appropriate history-taking, genital examination, Tanner staging, vaginal sampling and flushing, hymenectomy, vaginoscopy, laparoscopic adnexal detorsion, and approach to the child and/or adolescent. Advanced pelvic models were used for procedure-specific stations. The primary outcome measure was change in mean score on a prepubertal vaginal bleeding OSCE station. Secondary outcome measures were changes in individual component scores. Fourteen residents completed the simulation curriculum and the PAG OSCE at the 2 separate time points (before and after simulation curriculum). The mean OSCE score before the simulation curriculum was 54.6% (20.5 of 37) and mean score after the curriculum was 78.1% (28.9 of 37; P < .001). Significant score increases were found in history-taking, examination, differential diagnosis, identification of organism, surgical procedures, and identification of foreign body (P < .01 for all). This innovative PAG simulation curriculum significantly increased residents' knowledge in PAG history-taking, examination skills, operative procedures, and approach to the child and/or adolescent. Obstetrics and Gynecology Program Directors should consider incorporating PAG simulation training into their curriculum to ensure that residents meet their learning objectives and
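Pre/post comparisons like the OSCE score change above are typically tested with a paired t statistic on per-resident differences. A minimal sketch of that computation (the scores below are invented for illustration, not the study's data):

```python
import math

def paired_t(before, after):
    """Paired t statistic: mean of the per-subject differences divided
    by the standard error of those differences (df = n - 1)."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical pre- and post-curriculum scores for five residents.
before = [20, 18, 22, 19, 21]
after = [21, 20, 25, 23, 26]
t_stat = paired_t(before, after)
```

Pairing matters here because each resident is their own control: the test operates on within-resident changes rather than on two independent groups.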

  15. Advances in High-Fidelity Multi-Physics Simulation Techniques

    DTIC Science & Technology

    2008-01-01

A predictor-corrector method is used to advance the solution in time. (Figure 17: typical 40 x 50 grid.) (Report documentation page: unclassified, 60 pages, POC Datta Gaitonde.) ... advanced parallel computing platforms. The motivation to develop high-fidelity algorithms derives from considerations in various areas of current

  16. Design and Test of Advanced Thermal Simulators for an Alkali Metal-Cooled Reactor Simulator

    NASA Technical Reports Server (NTRS)

    Garber, Anne E.; Dickens, Ricky E.

    2011-01-01

The Early Flight Fission Test Facility (EFF-TF) at NASA Marshall Space Flight Center (MSFC) has as one of its primary missions the development and testing of fission reactor simulators for space applications. A key component in these simulated reactors is the thermal simulator, designed to closely mimic the form and function of a nuclear fuel pin using electric heating. Continuing effort has been made to design simple, robust, inexpensive thermal simulators that closely match the steady-state and transient performance of a nuclear fuel pin. A series of these simulators has been designed, developed, fabricated, and tested, individually and in a number of simulated reactor systems at the EFF-TF. The purpose of the thermal simulators developed under the Fission Surface Power (FSP) task is to ensure that non-nuclear testing can be performed at sufficiently high fidelity to allow a cost-effective qualification and acceptance strategy to be used. The prototype thermal simulator design is founded on the baseline Fission Surface Power reactor design. Recent efforts have been focused on the design, fabrication, and test of a prototype thermal simulator appropriate for use in the Technology Demonstration Unit (TDU). While designing the thermal simulators described in this paper, efforts were made to improve the axial power profile matching of the thermal simulators. Simultaneously, a search was conducted for graphite materials with higher resistivities than had been employed in the past. The combination of these two efforts resulted in the creation of thermal simulators with power capacities of 2300-3300 W per unit. Six of these elements were installed in a simulated core and tested in the alkali metal-cooled Fission Surface Power Primary Test Circuit (FSP-PTC) at a variety of liquid metal flow rates and temperatures. This paper documents the design of the thermal simulators, the test program, and the test results.

  17. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Peter Andrew

The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum-scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and in the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum-scale modeling and simulation, and a plan to incrementally incorporate effective V&V into subcontinuum-scale M&S destined for use in the NEAMS Waste IPSC workflow, to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum-scale phenomena.

  18. Use of Simulation Technology in Dental Education.

    ERIC Educational Resources Information Center

    Buchanan, Judith Ann

    2001-01-01

    Discusses the impact of current simulation laboratories on dental education and reviews advanced technology simulation that has recently become available or is in the developmental stage. Addresses the abilities of advanced technology simulation, its advantages and disadvantages, and its potential to affect dental education. (EV)

  19. To simulate or not to simulate: what are the questions?

    PubMed

    Dudai, Yadin; Evers, Kathinka

    2014-10-22

    Simulation is a powerful method in science and engineering. However, simulation is an umbrella term, and its meaning and goals differ among disciplines. Rapid advances in neuroscience and computing draw increasing attention to large-scale brain simulations. What is the meaning of simulation, and what should the method expect to achieve? We discuss the concept of simulation from an integrated scientific and philosophical vantage point and pinpoint selected issues that are specific to brain simulation.

  20. An Aerodynamic Performance Evaluation of the NASA/Ames Research Center Advanced Concepts Flight Simulator. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Donohue, Paul F.

    1987-01-01

The results of an aerodynamic performance evaluation of the National Aeronautics and Space Administration (NASA)/Ames Research Center Advanced Concepts Flight Simulator (ACFS), conducted in association with the Navy-NASA Joint Institute of Aeronautics, are presented. The ACFS is a full-mission flight simulator which provides an excellent platform for the critical evaluation of emerging flight systems and aircrew performance. The propulsion and flight dynamics models were evaluated using classical flight test techniques. The aerodynamic performance model of the ACFS was found to realistically represent that of current day, medium range transport aircraft. Recommendations are provided to enhance the capabilities of the ACFS to a level forecast for 1995 transport aircraft. The graphical and tabular results of this study will establish a performance section of the ACFS Operations Manual.

  1. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

Paper discusses computer simulation as a means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in the near future. Visual, aural, tactile, and kinesthetic effects are used to teach such physical sciences as the dynamics of fluids. Recommends that classrooms in universities, government, and industry be linked to advanced computing centers so that computer simulations are integrated into the education process.

  2. Recent advances in large-eddy simulation of spray and coal combustion

    NASA Astrophysics Data System (ADS)

    Zhou, L. X.

    2013-07-01

    Large-eddy simulation (LES) is undergoing rapid development and is recognized as a possible second generation of CFD methods for engineering use. Spray and coal combustion is widely used in power, transportation, chemical and metallurgical, iron and steel making, and aeronautical and astronautical engineering, so LES of spray and coal two-phase combustion is particularly important for engineering applications. LES of two-phase combustion is attracting increasing attention because it can give detailed instantaneous flow and flame structures and more accurate statistical results than Reynolds-averaged (RANS) modeling. One of the key problems in LES is developing sub-grid scale (SGS) models, including SGS stress models and combustion models, and different investigators have proposed or adopted various SGS models. This paper reviews advances in LES of spray and coal combustion, including studies done by the author and his colleagues: the SGS models adopted by different investigators are described, some of their main results are summarized, and some research needs are discussed.
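
    The classic SGS stress closure of the kind this review surveys is the Smagorinsky eddy-viscosity model. As a hedged illustration only (not any specific model from the reviewed studies; the function name and the typical constant Cs = 0.17 are this sketch's assumptions), a minimal 2-D version:

```python
import numpy as np

def smagorinsky_nu_t(u, v, dx, dy, Cs=0.17):
    """Smagorinsky SGS eddy viscosity nu_t = (Cs * Delta)^2 * |S| on a 2-D grid.

    |S| = sqrt(2 S_ij S_ij), with S_ij the resolved strain-rate tensor,
    and the filter width Delta is taken as sqrt(dx * dy).
    Illustrative sketch only; Cs = 0.17 is a commonly quoted value.
    """
    dudx = np.gradient(u, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    dvdx = np.gradient(v, dx, axis=1)
    dvdy = np.gradient(v, dy, axis=0)
    Sxx, Syy = dudx, dvdy
    Sxy = 0.5 * (dudy + dvdx)           # symmetric off-diagonal component
    S_mag = np.sqrt(2.0 * (Sxx**2 + Syy**2 + 2.0 * Sxy**2))
    delta = np.sqrt(dx * dy)            # filter width from grid spacing
    return (Cs * delta) ** 2 * S_mag
```

    For a pure shear flow u = y, v = 0 on a unit grid, |S| = 1 everywhere and nu_t reduces to (Cs * Delta)^2, which makes the model easy to sanity-check.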

  3. Early bedside care during preclinical medical education: can technology-enhanced patient simulation advance the Flexnerian ideal?

    PubMed

    Gordon, James A; Hayden, Emily M; Ahmed, Rami A; Pawlowski, John B; Khoury, Kimberly N; Oriol, Nancy E

    2010-02-01

    Flexner wanted medical students to study at the patient bedside-a remarkable innovation in his time-so that they could apply science to clinical care under the watchful eye of senior physicians. Ever since his report, medical schools have reserved the latter years of their curricula for such an "advanced" apprenticeship, providing clinical clerkship experiences only after an initial period of instruction in basic medical sciences. Although Flexner codified the segregation of preclinical and clinical instruction, he was committed to ensuring that both domains were integrated into a modern medical education. The aspiration to fully integrate preclinical and clinical instruction continues to drive medical education reform even to this day. In this article, the authors revisit the original justification for sequential preclinical-clinical instruction and argue that modern, technology-enhanced patient simulation platforms are uniquely powerful for fostering simultaneous integration of preclinical-clinical content in a way that Flexner would have applauded. To date, medical educators tend to focus on using technology-enhanced medical simulation in clinical and postgraduate medical education; few have devoted significant attention to using immersive clinical simulation among preclinical students. The authors present an argument for the use of dynamic robot-mannequins in teaching basic medical science, and describe their experience with simulator-based preclinical instruction at Harvard Medical School. They discuss common misconceptions and barriers to the approach, describe their curricular responses to the technique, and articulate a unifying theory of cognitive and emotional learning that broadens the view of what is possible, feasible, and desirable with simulator-based medical education.

  4. Development of Advanced Coatings for Laser Modifications Through Process and Materials Simulation

    NASA Astrophysics Data System (ADS)

    Martukanitz, R. P.; Babu, S. S.

    2004-06-01

    A simulation-based system is currently being constructed to aid in the development of advanced coating systems for laser cladding and surface alloying. The system employs loosely coupled material and process models that allow rapid determination of material compatibility over a wide range of processing conditions. The primary emphasis is on the development and identification of composite coatings for improved wear and corrosion resistance. The material model utilizes computational thermodynamics and kinetic analysis to establish phase stability and extent of diffusional reactions that may result from the thermal response of the material during virtual processing. The process model is used to develop accurate thermal histories associated with the laser surface modification process and provides critical input for the non-isothermal materials simulations. These techniques were utilized to design a laser surface modification experiment that utilized the addition of stainless steel alloy 431 and TiC produced using argon and argon and nitrogen shielding. The deposits representing alloy 431 and TiC powder produced in argon resulted in microstructures retaining some TiC particles and an increase in hardness when compared to deposits produced using only the 431 powder. Laser deposits representing alloy 431 and TiC powder produced with a mixture of argon and nitrogen shielding gas resulted in microstructures retaining some TiC particles, as well as fine precipitates of Ti(CN) formed during cooling and a further increase in hardness of the deposit.

  5. Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake

    NASA Astrophysics Data System (ADS)

    Ulrich, T.; Gabriel, A. A.; Madden, E. H.; Wollherr, S.; Uphoff, C.; Rettenberger, S.; Bader, M.

    2017-12-01

    SeisSol (www.seissol.org) is an open-source software package based on an arbitrary high-order derivative Discontinuous Galerkin method (ADER-DG). It solves spontaneous dynamic rupture propagation on pre-existing fault interfaces according to non-linear friction laws, coupled to seismic wave propagation with high-order accuracy in space and time (minimal dispersion errors). SeisSol exploits unstructured meshes to account for complex geometries, e.g. high-resolution topography and bathymetry, 3D subsurface structure, and fault networks. We present the largest (1500 km of faults) and longest (500 s) dynamic rupture simulation to date, modeling the 2004 Sumatra-Andaman earthquake. We demonstrate the need for end-to-end optimization and petascale performance of scientific software to realize realistic simulations on the extreme scales of subduction zone earthquakes: considering the full complexity of subduction zone geometries inevitably leads to huge differences in element sizes. The main code improvements include a cache-aware wave propagation scheme and optimization of the dynamic rupture kernels using code generation. In addition, a novel clustered local-time-stepping scheme for dynamic rupture has been established. Finally, asynchronous output has been implemented to overlap I/O and compute time. We resolve the frictional sliding process on the curved megathrust and a system of splay faults, as well as the seismic wave field and seafloor displacement, with frequency content up to 2.2 Hz. We validate the scenario against geodetic, seismological and tsunami observations. The resulting rupture dynamics shed new light on the activation and importance of splay faults.
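
    The clustered local-time-stepping idea can be pictured with a small sketch (an illustration of the general rate-2 clustering concept, not SeisSol's actual implementation; function and parameter names are this sketch's own): each element's CFL-admissible step scales with its size, and elements are binned into clusters that advance with the minimum step times a power of two, so the smallest elements no longer force the whole mesh onto the global minimum step.

```python
import math

def cluster_time_steps(element_sizes, wave_speed, cfl=0.5, max_clusters=8):
    """Group elements into rate-2 local-time-stepping clusters.

    Each element's admissible step is dt_i = cfl * h_i / c; cluster k
    advances with dt_min * 2**k.  Returns one cluster index per element.
    Illustrative sketch of the clustering concept only.
    """
    dts = [cfl * h / wave_speed for h in element_sizes]
    dt_min = min(dts)
    clusters = []
    for dt in dts:
        k = int(math.floor(math.log2(dt / dt_min)))  # largest rate-2 bucket <= dt
        clusters.append(min(k, max_clusters - 1))
    return clusters
```

    With element sizes spanning a factor of 8, this yields clusters 0 through 3, and the coarsest elements take one step for every eight taken by the finest.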

  6. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model of an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, with an overall correlation coefficient of about 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization, and presents the performance of the integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and running in the familiar Microsoft Excel environment, is likely to be very useful in the design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  7. Advance prototype silver ion water bactericide system

    NASA Technical Reports Server (NTRS)

    Jasionowski, W. J.; Allen, E. T.

    1974-01-01

    An advanced prototype unit was designed and fabricated to treat anticipated fuel-cell water. The unit is a single canister that contains a membrane-type prefilter and a silver bromide contacting bed. A seven-day baseline simulated-mission test was performed; performance was satisfactory, and the effluent water met all specifications for potability. After random vibration testing, another seven-day simulated-mission test was performed, and the results indicate that simulated launch vibrations have no effect on the design or performance of the advanced prototype. Bench tests and accelerated breadboard tests were conducted to define the characteristics of an upgraded model of the advanced prototype unit that would have 30 days of operating capability. A preliminary design of a silver ion generator for the Shuttle Orbiter was also prepared.

  8. Simulation of Thin-Film Damping and Thermal Mechanical Noise Spectra for Advanced Micromachined Microphone Structures.

    PubMed

    Hall, Neal A; Okandan, Murat; Littrell, Robert; Bicen, Baris; Degertekin, F Levent

    2008-06-01

    In many micromachined sensors, the thin (2-10 μm) air film between a compliant diaphragm and a backplate electrode plays a dominant role in shaping both the dynamic and the thermal-noise characteristics of the device. Silicon microphone structures used in grating-based optical-interference microphones have recently been introduced that employ backplates of minimal area to achieve low damping and low thermal noise levels. Finite-element modeling procedures based on a 2-D discretization of the governing Reynolds equation are ideally suited to studying thin-film dynamics in such structures, which utilize relatively complex backplate geometries. In this paper, the dynamic properties of both the diaphragm and the thin air film are studied using a modal projection procedure in commonly used finite-element software, and the results are used to simulate the dynamic frequency response of the coupled structure to internally generated electrostatic actuation pressure. The model is also extended to simulate the thermal mechanical noise spectra of these advanced sensing structures. In all cases, simulations are compared with measured data and show excellent agreement, demonstrating thermal force and thermal pressure noise levels of 0.8 pN/√Hz and 1.8 μPa/√Hz, respectively, for the 1.5 mm diameter structures under study, which have a fundamental diaphragm-resonance-limited bandwidth near 20 kHz.
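
    As a much cruder, hedged stand-in for the paper's 2-D Reynolds-equation FEM, lubrication theory gives a closed-form squeeze-film damping coefficient for a rigid circular plate over a vented gap, from which a thermal force noise density follows via the fluctuation-dissipation relation. The function names, the rigid-plate and vented-edge assumptions, and the example parameters below are all this sketch's own, not the paper's:

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def squeeze_film_damping_circular(mu, R, h):
    """Lubrication-theory damping coefficient c = 3*pi*mu*R**4 / (2*h**3)
    for a rigid circular plate of radius R over a gap h (incompressible,
    vented-edge film).  A crude stand-in for a full Reynolds-equation FEM."""
    return 3.0 * math.pi * mu * R**4 / (2.0 * h**3)

def thermal_force_noise(c, T=300.0):
    """One-sided thermal force noise density sqrt(4*kB*T*c) in N/sqrt(Hz),
    from the fluctuation-dissipation relation."""
    return math.sqrt(4.0 * KB * T * c)
```

    Plugging in illustrative numbers (air, mu ≈ 1.85e-5 Pa·s, a 0.75 mm radius plate, a few-micron gap) gives noise densities orders of magnitude above the sub-pN/√Hz levels reported in the abstract, which is exactly why minimal-area backplates are attractive.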

  9. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  10. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop MPP's to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract

  11. Hybrid and Electric Advanced Vehicle Systems Simulation

    NASA Technical Reports Server (NTRS)

    Beach, R. F.; Hammond, R. A.; Mcgehee, R. K.

    1985-01-01

    Predefined components connected to represent wide variety of propulsion systems. Hybrid and Electric Advanced Vehicle System (HEAVY) computer program is flexible tool for evaluating performance and cost of electric and hybrid vehicle propulsion systems. Allows designer to quickly, conveniently, and economically predict performance of proposed drive train.

  12. A real-time simulation evaluation of an advanced detection, isolation, and accommodation algorithm for sensor failures in turbine engines

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.; Delaat, J. C.

    1986-01-01

    An advanced sensor failure detection, isolation, and accommodation (ADIA) algorithm has been developed for use with an aircraft turbofan engine control system. In a previous paper the authors described the ADIA algorithm and its real-time implementation. Subsequent improvements made to the algorithm and implementation are discussed, and the results of an evaluation are presented. The evaluation used a real-time, hybrid-computer simulation of an F100 turbofan engine.

  13. Today's Business Simulation Industry

    ERIC Educational Resources Information Center

    Summers, Gary J.

    2004-01-01

    New technologies are transforming the business simulation industry. The technologies come from research in computational fields of science, and they endow simulations with new capabilities and qualities. These capabilities and qualities include computerized behavioral simulations, online feedback and coaching, advanced interfaces, learning on…

  14. Mathematical modeling and SAR simulation multifunction SAR technology efforts

    NASA Technical Reports Server (NTRS)

    Griffin, C. R.; Estes, J. M.

    1981-01-01

    The orbital SAR (synthetic aperture radar) simulation data were used in several simulation efforts directed toward advanced SAR development. Efforts toward simulating an operational radar, simulating antenna polarization effects, and simulating SAR images at several different wavelengths are discussed. Avenues for improvement of the orbital SAR simulation and its application to the development of advanced digital radar data processing schemes are indicated.

  15. Test vs. simulation

    NASA Technical Reports Server (NTRS)

    Wood, Charles C.

    1991-01-01

    The following topics are presented in tabular form: (1) simulation capability assessments (no propulsion system test); (2) advanced vehicle simulation capability assessment; (3) systems tests identified events; (4) main propulsion test article (MPTA) testing evaluation; (5) Saturn 5, 1B, and 1 testing evaluation. Special vehicle simulation issues that are propulsion related are briefly addressed.

  16. The Synergy Between Total Scattering and Advanced Simulation Techniques: Quantifying Geopolymer Gel Evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Claire; Bloomer, Breaunnah E.; Provis, John L.

    2012-05-16

    With the ever increasing demands for technologically advanced structural materials, together with emerging environmental consciousness due to climate change, geopolymer cement is fast becoming a viable alternative to traditional cements due to proven mechanical engineering characteristics and the reduction in CO2 emitted (approximately 80% less CO2 emitted compared to ordinary Portland cement). Nevertheless, much remains unknown regarding the kinetics of the molecular changes responsible for nanostructural evolution during the geopolymerization process. Here, in-situ total scattering measurements in the form of X-ray pair distribution function (PDF) analysis are used to quantify the extent of reaction of metakaolin/slag alkali-activated geopolymer binders, including the effects of various activators (alkali hydroxide/silicate) on the kinetics of the geopolymerization reaction. Restricting quantification of the kinetics to the initial ten hours of reaction does not enable elucidation of the true extent of the reaction, but using X-ray PDF data obtained after 128 days of reaction enables more accurate determination of the initial extent of reaction. The synergies between the in-situ X-ray PDF data and simulations conducted by multiscale density functional theory-based coarse-grained Monte Carlo analysis are outlined, particularly with regard to the potential for the X-ray data to provide a time scale for kinetic analysis of the extent of reaction obtained from the multiscale simulation methodology.

  17. Simulation of nucleation and growth of atomic layer deposition phosphorus for doping of advanced FinFETs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seidel, Thomas E., E-mail: zoomtotom@gmail.com; Goldberg, Alexander; Halls, Mat D.

    2016-01-15

    Simulations of the nucleation and growth of phosphorus films were carried out using density functional theory. The surface was represented by a Si₉H₁₂ truncated-cluster surface model with 2 × 1-reconstructed (100) Si-OH terminations for the initial reaction sites. Chemistries included the phosphorus halides (PF₃, PCl₃, and PBr₃) and disilane (Si₂H₆). Atomic layer deposition (ALD) reaction sequences were illustrated with three-dimensional molecular models using sequential PF₃ and Si₂H₆ reactions and featuring SiFH₃ as a byproduct. Exothermic reaction pathways were developed for both nucleation and growth on a Si-OH surface. Energetically favorable reactions for the deposition of four phosphorus atoms, including lateral P–P bonding, were simulated. This paper suggests energetically favorable thermodynamic reactions for the growth of elemental phosphorus on (100) silicon. Phosphorus layers made by ALD are an option for doping advanced fin field-effect transistors (FinFETs). Phosphorus may be thermally diffused into the silicon or recoil knocked in; simulations of the recoil profile of phosphorus into a FinFET surface are illustrated.

  18. Advanced Thermal Simulator Testing: Thermal Analysis and Test Results

    NASA Technical Reports Server (NTRS)

    Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David; Reid, Robert; Adams, Mike; Davis, Joe

    2008-01-01

    Work at the NASA Marshall Space Flight Center seeks to develop high fidelity, electrically heated thermal simulators that represent fuel elements in a nuclear reactor design to support non-nuclear testing applicable to the development of a space nuclear power or propulsion system. Comparison between the fuel pins and thermal simulators is made at the outer fuel clad surface, which corresponds to the outer sheath surface in the thermal simulator. The thermal simulators that are currently being tested correspond to a SNAP derivative reactor design that could be applied for Lunar surface power. These simulators are designed to meet the geometric and power requirements of a proposed surface power reactor design, accommodate testing of various axial power profiles, and incorporate embedded instrumentation. This paper reports the results of thermal simulator analysis and testing in a bare element configuration, which does not incorporate active heat removal, and testing in a water-cooled calorimeter designed to mimic the heat removal that would be experienced in a reactor core.

  19. Simulations of binary black hole mergers

    NASA Astrophysics Data System (ADS)

    Lovelace, Geoffrey

    2017-01-01

    Advanced LIGO's observations of merging binary black holes have inaugurated the era of gravitational wave astronomy. Accurate models of binary black holes and the gravitational waves they emit are helping Advanced LIGO to find as many gravitational waves as possible and to learn as much as possible about the waves' sources. These models require numerical-relativity simulations of binary black holes, because near the time when the black holes merge, all analytic approximations break down. Following breakthroughs in 2005, many research groups have built numerical-relativity codes capable of simulating binary black holes. In this talk, I will discuss current challenges in simulating binary black holes for gravitational-wave astronomy, and I will discuss the tremendous progress that has already enabled such simulations to become an essential tool for Advanced LIGO.

  20. Effect of simulation on knowledge of advanced cardiac life support, knowledge retention, and confidence of nursing students in Jordan.

    PubMed

    Tawalbeh, Loai I; Tubaishat, Ahmad

    2014-01-01

    This study examined the effect of simulation on nursing students' knowledge of advanced cardiac life support (ACLS), knowledge retention, and confidence in applying ACLS skills. An experimental, randomized controlled (pretest-posttest) design was used. The experimental group (n = 40) attended an ACLS simulation scenario, a 4-hour PowerPoint presentation, and demonstration on a static manikin, whereas the control group (n = 42) attended the PowerPoint presentation and a demonstration only. A paired t test indicated that posttest mean knowledge of ACLS and confidence was higher in both groups. The experimental group showed higher knowledge of ACLS and higher confidence in applying ACLS, compared with the control group. Traditional training involving PowerPoint presentation and demonstration on a static manikin is an effective teaching strategy; however, simulation is significantly more effective than traditional training in helping to improve nursing students' knowledge acquisition, knowledge retention, and confidence about ACLS. Copyright 2014, SLACK Incorporated.
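
    The study's pre/post comparisons rest on the paired t statistic. A minimal stdlib-only sketch (the helper name and the sample scores below are illustrative, not the study's data):

```python
from statistics import mean, stdev
from math import sqrt

def paired_t(pre, post):
    """Paired t statistic: t = d_bar / (s_d / sqrt(n)), where d_i are the
    per-subject differences post_i - pre_i.  Illustrative helper only."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))
```

    For hypothetical pretest scores [10, 12, 9, 11] and posttest scores [14, 15, 13, 15], the statistic comes out to 15, which would be compared against the t distribution with n - 1 degrees of freedom.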

  1. The Development of the Non-hydrostatic Unified Model of the Atmosphere (NUMA)

    DTIC Science & Technology

    2011-09-19

    capabilities: (1) highly scalable on current and future computer architectures (exascale computing: this means CPUs and GPUs); (2) flexibility to use a... From Terascale to Petascale/Exascale Computing: 10 of the Top 500 systems are already in the petascale range, and 3 of the top 10 are GPU-based machines.

  2. Brain-wave measures of workload in advanced cockpits: The transition of technology from laboratory to cockpit simulator, phase 2

    NASA Technical Reports Server (NTRS)

    Horst, Richard L.; Mahaffey, David L.; Munson, Robert C.

    1989-01-01

    The present Phase 2 small business innovation research study was designed to address issues related to scalp-recorded event-related potential (ERP) indices of mental workload and to transition this technology from the laboratory to cockpit simulator environments for use as a systems engineering tool. The project involved five main tasks: (1) Two laboratory studies confirmed the generality of the ERP indices of workload obtained in the Phase 1 study and revealed two additional ERP components related to workload. (2) A task analysis of flight scenarios and pilot tasks in the Advanced Concepts Flight Simulator (ACFS) defined cockpit events (i.e., displays, messages, alarms) that would be expected to elicit ERPs related to workload. (3) Software was developed to support ERP data analysis. An existing ARD-proprietary package of ERP data analysis routines was upgraded, new graphics routines were developed to enhance interactive data analysis, and routines were developed to compare alternative single-trial analysis techniques using simulated ERP data. (4) Working in conjunction with NASA Langley research scientists and simulator engineers, preparations were made for an ACFS validation study of ERP measures of workload. (5) A design specification was developed for a general purpose, computerized, workload assessment system that can function in simulators such as the ACFS.

  3. Advanced simulation and analysis of a geopotential research mission

    NASA Technical Reports Server (NTRS)

    Schutz, B. E.

    1988-01-01

    Computer simulations have been performed for an orbital gradiometer mission to assist in the study of high degree and order gravity field recovery. The simulations were conducted for a satellite in near-circular, frozen orbit at a 160-km altitude using a gravitational field complete to degree and order 360. The mission duration is taken to be 32 days. The simulation provides a set of measurements to assist in the evaluation of techniques developed for the determination of the gravity field. Also, the simulation provides an ephemeris to study available tracking systems to satisfy the orbit determination requirements of the mission.

  4. Technology advancement for the ASCENDS mission using the ASCENDS CarbonHawk Experiment Simulator (ACES)

    NASA Astrophysics Data System (ADS)

    Obland, M. D.; Antill, C.; Browell, E. V.; Campbell, J. F.; CHEN, S.; Cleckner, C.; Dijoseph, M. S.; Harrison, F. W.; Ismail, S.; Lin, B.; Meadows, B. L.; Mills, C.; Nehrir, A. R.; Notari, A.; Prasad, N. S.; Kooi, S. A.; Vitullo, N.; Dobler, J. T.; Bender, J.; Blume, N.; Braun, M.; Horney, S.; McGregor, D.; Neal, M.; Shure, M.; Zaccheo, T.; Moore, B.; Crowell, S.; Rayner, P. J.; Welch, W.

    2013-12-01

    The ASCENDS CarbonHawk Experiment Simulator (ACES) is a NASA Langley Research Center project funded by NASA's Earth Science Technology Office that seeks to advance technologies critical to measuring atmospheric column carbon dioxide (CO2) mixing ratios in support of the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission. The technologies being advanced are: (1) multiple transmitter and telescope-aperture operations, (2) high-efficiency CO2 laser transmitters, (3) a high bandwidth detector and transimpedance amplifier (TIA), and (4) advanced algorithms for cloud and aerosol discrimination. The instrument architecture is being developed for ACES to operate on a high-altitude aircraft, and it will be directly scalable to meet the ASCENDS mission requirements. The above technologies are critical for developing an airborne simulator and spaceborne instrument with lower platform consumption of size, mass, and power, and with improved performance. This design employs several laser transmitters and telescope-apertures to demonstrate column CO2 retrievals with alignment of multiple laser beams in the far-field. ACES will transmit five laser beams: three from commercial lasers operating near 1.57-microns, and two from the Exelis atmospheric oxygen (O2) fiber laser amplifier system operating near 1.26-microns. The Master Oscillator Power Amplifier at 1.57-microns measures CO2 column concentrations using an Integrated-Path Differential Absorption (IPDA) lidar approach. O2 column amounts needed for calculating the CO2 mixing ratio will be retrieved using the Exelis laser system with a similar IPDA approach. The three aperture telescope design was built to meet the constraints of the Global Hawk high-altitude unmanned aerial vehicle (UAV). This assembly integrates fiber-coupled transmit collimators for all of the laser transmitters and fiber-coupled optical signals from the three telescopes to the aft optics and detector package. The detector
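
    The core of the IPDA approach can be sketched in a few lines (a hedged illustration of the principle only; the function names are this sketch's own, and a real retrieval weights pressure- and temperature-dependent cross-sections along the path rather than assuming a constant differential cross-section):

```python
import math

def ipda_daod(p_on, p_off, e_on, e_off):
    """Differential absorption optical depth from energy-normalized on-line
    and off-line returns: DAOD = 0.5 * ln((p_off/e_off) / (p_on/e_on)).
    The factor 0.5 accounts for the two-way path."""
    return 0.5 * math.log((p_off / e_off) / (p_on / e_on))

def column_density(daod, delta_sigma):
    """Column number density (molecules/m^2) assuming a path-constant
    differential cross-section delta_sigma (m^2): N = DAOD / delta_sigma."""
    return daod / delta_sigma
```

    For example, an on-line return at half the off-line power (equal pulse energies) gives DAOD = 0.5 ln 2; dividing by the assumed differential cross-section converts that to a column amount.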

  5. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 

  6. AN ADVANCED LEAKAGE SCHEME FOR NEUTRINO TREATMENT IN ASTROPHYSICAL SIMULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perego, A.; Cabezón, R. M.; Käppeli, R., E-mail: albino.perego@physik.tu-darmstadt.de

    We present an Advanced Spectral Leakage (ASL) scheme to model neutrinos in the context of core-collapse supernovae (CCSNe) and compact binary mergers. Based on previous gray leakage schemes, the ASL scheme computes the neutrino cooling rates by interpolating local production and diffusion rates (relevant in optically thin and thick regimes, respectively) separately for discretized values of the neutrino energy. Neutrino trapped components are also modeled, based on equilibrium and timescale arguments. The better accuracy achieved by the spectral treatment allows a more reliable computation of neutrino heating rates in optically thin conditions. The scheme has been calibrated and tested against Boltzmann transport in the context of Newtonian spherically symmetric models of CCSNe. ASL shows a very good qualitative and a partial quantitative agreement for key quantities from collapse to a few hundred milliseconds after core bounce. We have proved the adaptability and flexibility of our ASL scheme, coupling it to an axisymmetric Eulerian and to a three-dimensional smoothed particle hydrodynamics code to simulate core collapse. Therefore, the neutrino treatment presented here is ideal for large parameter-space explorations, parametric studies, high-resolution tests, code developments, and long-term modeling of asymmetric configurations, where more detailed neutrino treatments are not available or are currently computationally too expensive.

  7. A mathematical representation of an advanced helicopter for piloted simulator investigations of control system and display variations

    NASA Technical Reports Server (NTRS)

    Aiken, E. W.

    1980-01-01

    A mathematical model of an advanced helicopter is described. The model is suitable for use in control/display research involving piloted simulation. The general design approach for the six degree of freedom equations of motion is to use the full set of nonlinear gravitational and inertial terms of the equations and to express the aerodynamic forces and moments as the reference values and first order terms of a Taylor series expansion about a reference trajectory defined as a function of longitudinal airspeed. Provisions for several different specific and generic flight control systems are included in the model. The logic required to drive various flight control and weapon delivery symbols on a pilot's electronic display is also provided. Finally, the model includes a simplified representation of low altitude wind and turbulence effects. This model was used in a piloted simulator investigation of the effects of control system and display variations for an attack helicopter mission.
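
    The force-modelling approach described above, a reference value plus first-order Taylor terms about a reference trim condition, can be sketched in a few lines of Python. Every number below is invented for illustration; the actual model expands each aerodynamic force and moment about a trajectory parameterized by longitudinal airspeed.

```python
# Sketch of a first-order Taylor expansion of an aerodynamic force about a
# reference trim condition (u0, w0). All numbers are illustrative only.

def aero_force(u, w, u0, f_ref, df_du, df_dw, w0=0.0):
    """Reference force plus first-order terms in the perturbations."""
    return f_ref + df_du * (u - u0) + df_dw * (w - w0)

u0, f_ref = 30.0, 1200.0     # trim airspeed (m/s) and trim force (N), assumed
df_du, df_dw = -15.0, 40.0   # stability derivatives (N per m/s), assumed

# Force at a small perturbation from trim: u = 32 m/s, w = 0.5 m/s.
f = aero_force(32.0, 0.5, u0, f_ref, df_du, df_dw)
```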

  8. Photocatalytic Removal of Microcystin-LR by Advanced WO3-Based Nanoparticles under Simulated Solar Light

    PubMed Central

    Zhao, Chao; Li, Dawei; Feng, Chuanping; Zhang, Zhenya; Sugiura, Norio; Yang, Yingnan

    2015-01-01

    A series of advanced WO3-based photocatalysts including CuO/WO3, Pd/WO3, and Pt/WO3 were synthesized for the photocatalytic removal of microcystin-LR (MC-LR) under simulated solar light. In the present study, Pt/WO3 exhibited the best performance for the photocatalytic degradation of MC-LR. The MC-LR degradation can be described by a pseudo-first-order kinetic model. Chloride ions (Cl−) at appropriate concentrations could enhance the MC-LR degradation. The presence of metal cations (Cu2+ and Fe3+) improved the photocatalytic degradation of MC-LR. This study suggests that Pt/WO3 photocatalytic oxidation under solar light is a promising option for the purification of water containing MC-LR. PMID:25884038
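
    The pseudo-first-order model mentioned above can be written C(t) = C0·exp(-k_app·t), so the apparent rate constant follows from ln(C0/Ct)/t. A minimal sketch, with an assumed (not reported) rate constant and initial concentration:

```python
import math

# Pseudo-first-order decay: C(t) = C0 * exp(-k_app * t).
# The initial concentration and rate constant are assumed, not reported.

def concentration(c0, k_app, t):
    """Remaining concentration after time t."""
    return c0 * math.exp(-k_app * t)

def estimate_k(c0, ct, t):
    """Apparent rate constant recovered from one measurement."""
    return math.log(c0 / ct) / t

c0, k = 10.0, 0.05             # ug/L and 1/min, assumed values
ct = concentration(c0, k, 30)  # concentration after 30 min
k_back = estimate_k(c0, ct, 30)
```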

  9. Communication training for advanced medical students improves information recall of medical laypersons in simulated informed consent talks--a randomized controlled trial.

    PubMed

    Werner, Anne; Holderried, Friederike; Schäffeler, Norbert; Weyrich, Peter; Riessen, Reimer; Zipfel, Stephan; Celebi, Nora

    2013-02-01

    Informed consent talks are mandatory before invasive interventions. However, the patients' information recall has been shown to be rather poor. We investigated whether medical laypersons recalled more information items from a simulated informed consent talk after advanced medical students participated in a communication training aimed at reducing a layperson's cognitive load. Using a randomized, controlled, prospective crossover design, 30 5th- and 6th-year medical students were randomized into two groups. One group received communication training, followed by a comparison intervention (early intervention group, EI); the other group first received the comparison intervention and then communication training (late intervention group, LI). Before and after the interventions, the 30 medical students performed simulated informed consent talks with 30 blinded medical laypersons using a standardized set of information. We then recorded the number of information items the medical laypersons recalled. After the communication training, both groups of medical laypersons recalled significantly more information items (EI: 41 ± 9% vs. 23 ± 9%, p < .0001; LI: 49 ± 10% vs. 35 ± 6%, p < .0001). After the comparison intervention the improvement was modest and significant only in the LI (EI: 42 ± 9% vs. 40 ± 9%, p = .41; LI: 35 ± 6% vs. 29 ± 9%, p = .016). Short communication training for advanced medical students improves information recall of medical laypersons in simulated informed consent talks.

  10. Analysis Report for Exascale Storage Requirements for Scientific Data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruwart, Thomas M.

    Over the next 10 years, the Department of Energy will be transitioning from Petascale to Exascale Computing, resulting in data storage, networking, and infrastructure requirements increasing by three orders of magnitude. The technologies and best practices used today are the result of a relatively slow evolution of ancestral technologies developed in the 1950s and 1960s. These include magnetic tape, magnetic disk, networking, databases, file systems, and operating systems. These technologies will continue to evolve over the next 10 to 15 years on a reasonably predictable path. Experience with the challenges involved in transitioning these fundamental technologies from Terascale to Petascale computing systems has raised questions about how they will scale another 3 or 4 orders of magnitude to meet the requirements imposed by Exascale computing systems. This report focuses on the most concerning scaling issues with data storage systems as they relate to High Performance Computing, and presents options for a path forward. Given the ability to store exponentially increasing amounts of data, far more advanced concepts and use of metadata will be critical to managing data in Exascale computing systems.

  11. Reality versus Simulation

    ERIC Educational Resources Information Center

    Srinivasan, Srilekha; Perez, Lance C.; Palmer, Robert D.; Brooks, David W.; Wilson, Kathleen; Fowler, David

    2006-01-01

    A systematic study of the implementation of simulation hardware (TIMS) replacing software (MATLAB) was undertaken for advanced undergraduate and early graduate courses in electrical engineering. One outcome of the qualitative component of the study was remarkable: most students interviewed (4/4 and 6/9) perceived the software simulations as…

  12. Efficacy of standardized training on a virtual reality simulator to advance knee and shoulder arthroscopic motor skills.

    PubMed

    Rahm, Stefan; Wieser, Karl; Bauer, David E; Waibel, Felix Wa; Meyer, Dominik C; Gerber, Christian; Fucentese, Sandro F

    2018-05-16

    Most studies have demonstrated that training on a virtual-reality-based arthroscopy simulator leads to an improvement of technical skills in orthopaedic surgery. However, how long and what kind of training is optimal for young residents is unknown. In this study we tested the efficacy of a standardized, competency-based training protocol on a validated virtual-reality-based knee and shoulder arthroscopy simulator. Twenty residents and five experts in arthroscopy were included. All participants performed a test including knee and shoulder arthroscopy tasks on a virtual reality knee and shoulder arthroscopy simulator. The residents had to complete a competency-based training program, after which the previously completed test was retaken. We evaluated the metric data of the simulator using a z-score and the Arthroscopic Surgery Skill Evaluation Tool (ASSET) to assess training effects in residents and performance levels in experts. The residents improved significantly from pre- to post-training in the overall z-score: -9.82 (range, -20.35 to -1.64) to -2.61 (range, -6.25 to 1.5); p < 0.001. The overall ASSET score improved from 55 (27 to 84) percent to 75 (48 to 92) percent; p < 0.001. The experts, however, achieved a significantly higher z-score in the shoulder tasks (p < 0.001) and a statistically insignificantly higher z-score in the knee tasks (p = 0.921). The experts' mean overall ASSET score (knee and shoulder) was significantly higher in the therapeutic tasks (p < 0.001) compared with the residents' post-training result. The use of a competency-based simulator training with this specific device for 3-5 h is an effective tool to advance basic arthroscopic skills of residents in training from 0 to 5 years, based on simulator measures and simulator-based ASSET testing. We therefore conclude that this sort of training method appears useful for learning camera handling, basic anatomy and triangulation with instruments.
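
    As a rough illustration of the z-score metric used above, each simulator metric can be standardized against a reference cohort and summed into an overall score. The metric names and cohort values below are invented; the simulator's actual scoring may weight or invert individual metrics.

```python
import statistics

# Standardize each simulator metric against a reference cohort and sum
# into an overall z-score. Metric names and cohort values are invented.

def z_scores(raw, reference):
    """One z-value per metric, relative to the reference cohort."""
    out = []
    for name, value in raw.items():
        mean = statistics.mean(reference[name])
        sd = statistics.stdev(reference[name])
        out.append((value - mean) / sd)
    return out

reference = {
    "time_s":   [300, 320, 280, 340, 260],   # task completion times (s)
    "cam_path": [2.0, 2.4, 1.8, 2.6, 2.2],   # camera path length (m)
}
trainee = {"time_s": 300, "cam_path": 2.2}   # happens to match cohort means
overall = sum(z_scores(trainee, reference))
```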

  13. 2011 Computation Directorate Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L

    2012-04-11

    From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence. Specifically, ASCI

  14. An integrated fuzzy-based advanced eutrophication simulation model to develop the best management scenarios for a river basin.

    PubMed

    Srinivas, Rallapalli; Singh, Ajit Pratap

    2018-03-01

    Assessment of the water quality status of a river with respect to its discharge has become a prerequisite to sustainable river basin management. The present paper develops an integrated model for simulating and evaluating strategies for water quality management in a river basin by controlling point source pollutant loadings and operations of multi-purpose projects. The Water Quality Analysis and Simulation Program (WASP version 8.0) has been used for modeling the transport of pollutant loadings and their impact on water quality in the river. The study presents a novel approach of integrating fuzzy set theory with an "advanced eutrophication" model to simulate the transmission and distribution of several interrelated water quality variables and their bio-physiochemical processes in an effective manner in the Ganges river basin, India. After calibration, simulated values are compared with observed values to validate the model's robustness. The fuzzy technique of order preference by similarity to ideal solution (F-TOPSIS) has been used to incorporate the uncertainty associated with the water quality simulation results. The model also simulates five different scenarios for pollution reduction, to determine the maximum pollutant loadings during monsoon and dry periods. The final results clearly indicate how modeled reduction in the rate of wastewater discharge has reduced the impacts of pollutants downstream. Scenarios suggesting a river discharge rate of 1500 m³/s during the lean period, in addition to 25 and 50% reduction in the load rate, are found to be the most effective option to restore the quality of the river Ganges. Thus, the model serves as an important hydrologic tool for policy makers by suggesting appropriate remediation action plans.
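
    The F-TOPSIS ranking step can be illustrated with its crisp (non-fuzzy) core: scenarios are scored by relative closeness to an ideal solution. The criteria and values below are invented, and the fuzzy extension is omitted.

```python
import math

# Crisp TOPSIS: rank alternatives by relative closeness to the ideal.
# Criteria (DO as benefit, BOD as cost) and all values are invented.

def topsis(matrix, weights, benefit):
    # Vector-normalize each criterion column, then apply the weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(len(weights))]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)]
         for row in matrix]
    cols = list(zip(*v))
    # Ideal / anti-ideal depend on whether the criterion is benefit or cost.
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    worst = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    return [math.dist(row, worst) /
            (math.dist(row, ideal) + math.dist(row, worst)) for row in v]

# Three hypothetical management scenarios.
matrix = [[6.5, 3.0],   # scenario A: DO (mg/L), BOD (mg/L)
          [7.2, 2.1],   # scenario B
          [5.8, 4.0]]   # scenario C
scores = topsis(matrix, weights=[0.5, 0.5], benefit=[True, False])
best = scores.index(max(scores))   # scenario B dominates both criteria
```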

  15. Atomistic Simulations of High-intensity XFEL Pulses on Diffractive Imaging of Nano-sized System Dynamics

    NASA Astrophysics Data System (ADS)

    Ho, Phay; Knight, Christopher; Bostedt, Christoph; Young, Linda; Tegze, Miklos; Faigel, Gyula

    2016-05-01

    We have developed a large-scale atomistic computational method based on a combined Monte Carlo and Molecular Dynamics (MC/MD) method to simulate XFEL-induced radiation damage dynamics of complex materials. The MD algorithm is used to propagate the trajectories of electrons, ions and atoms forward in time, and the quantum nature of interactions with an XFEL pulse is accounted for by a MC method that calculates probabilities of electronic transitions. Our code has good scalability with MPI/OpenMP parallelization, and it has been run on Mira, a petascale system at the Argonne Leadership Computing Facility, with particle numbers exceeding 50 million. Using this code, we have examined the impact of high-intensity 8-keV XFEL pulses on the x-ray diffraction patterns of argon clusters. The obtained patterns show strong pulse parameter dependence, providing evidence of significant lattice rearrangement and diffuse scattering. Real-space electronic reconstruction was performed using phase retrieval methods. We found that the structure of the argon cluster can be recovered with atomic resolution even in the presence of considerable radiation damage. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Chemical Sciences, Geosciences, and Biosciences Division.
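
    The hybrid MC/MD idea, deterministic propagation punctuated by Monte Carlo draws for quantum transitions, can be reduced to a toy sketch. The transition rate, step size and particle count below are invented; the real code tracks electrons, ions and atoms with full dynamics.

```python
import random

# Toy MC/MD loop: deterministic propagation each step (elided), plus a
# Monte Carlo draw deciding whether each atom ionizes, with probability
# rate * dt per step. Rate, step size and counts are invented.

def run(n_atoms, rate, dt, steps, seed=42):
    rng = random.Random(seed)
    ionized = [False] * n_atoms
    for _ in range(steps):
        # (MD update of positions and velocities would go here.)
        for i in range(n_atoms):
            if not ionized[i] and rng.random() < rate * dt:
                ionized[i] = True   # MC acceptance of the transition
    return sum(ionized)

# After t = steps * dt = 2.0 (so rate * t = 1), roughly
# 1 - exp(-1) ~ 63% of the atoms should be ionized.
count = run(n_atoms=10000, rate=0.5, dt=0.01, steps=200)
```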

  16. Special issue on the "Consortium for Advanced Simulation of Light Water Reactors Research and Development Progress"

    NASA Astrophysics Data System (ADS)

    Turinsky, Paul J.; Martin, William R.

    2017-04-01

    In this special issue of the Journal of Computational Physics, the research and development completed at the time of manuscript submission by the Consortium for Advanced Simulation of Light Water Reactors (CASL) is presented. CASL is the first of several Energy Innovation Hubs that have been created by the Department of Energy. The Hubs are modeled after the strong scientific management characteristics of the Manhattan Project and AT&T Bell Laboratories, and function as integrated research centers that combine basic and applied research with engineering to accelerate scientific discovery that addresses critical energy issues. Lifetime of a Hub is expected to be five or ten years depending upon performance, with CASL being granted a ten year lifetime.

  17. Advanced helicopter cockpit and control configurations for helicopter combat missions

    NASA Technical Reports Server (NTRS)

    Haworth, Loran A.; Atencio, Adolph, Jr.; Bivens, Courtland; Shively, Robert; Delgado, Daniel

    1987-01-01

    Two piloted simulations were conducted by the U.S. Army Aeroflightdynamics Directorate to evaluate workload and helicopter-handling-qualities requirements for single-pilot operation in a combat Nap-of-the-Earth environment. The single-pilot advanced cockpit engineering simulation (SPACES) investigations were performed on the NASA Ames Vertical Motion Simulator, using the Advanced Digital Optical Control System control laws and an advanced-concepts glass cockpit. The first simulation (SPACES I) compared single-pilot to dual-crewmember operation for the same flight tasks, to determine differences between dual and single ratings and to discover which control laws enabled adequate single-pilot helicopter operation. The SPACES II simulation concentrated on single-pilot operations and the use of control laws thought to be viable candidates for single-pilot operation. Workload measures detected significant differences between single-pilot task segments. Control system configurations were task dependent, demonstrating a need for an in-flight reconfigurable control system to match the optimal control system with the required task.

  18. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process

    PubMed Central

    Fraser, Kirk A.; St-Georges, Lyne; Kiss, Laszlo I.

    2014-01-01

    Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), being able to model the process to determine the highest speed of advance possible that will not cause unwanted welding defects is desirable. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 workpieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach, an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time. PMID:28788627

  19. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process.

    PubMed

    Fraser, Kirk A; St-Georges, Lyne; Kiss, Laszlo I

    2014-04-30

    Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), being able to model the process to determine the highest speed of advance possible that will not cause unwanted welding defects is desirable. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 workpieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach, an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time.
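
    The core numerical ingredient described above, an explicit finite-difference solution of the heat diffusion equation, can be sketched in one dimension. The grid, time step and initial condition are illustrative; only the thermal diffusivity is in the rough range of Al 6061.

```python
# One-dimensional explicit finite-difference step for the heat equation
# dT/dt = alpha * d2T/dx2. Grid, time step and initial condition are
# illustrative; alpha is roughly that of Al 6061 (m^2/s).

def step(T, alpha, dx, dt):
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable: reduce dt"
    new = T[:]                      # boundary values stay fixed
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i - 1] - 2.0 * T[i] + T[i + 1])
    return new

T = [0.0] * 11                      # 11-point rod, ends held at 0 degC
T[5] = 100.0                        # hot spot at the centre
alpha, dx, dt = 6.9e-5, 0.01, 0.05  # diffusivity, grid spacing, time step
for _ in range(100):
    T = step(T, alpha, dx, dt)      # heat spreads out from the centre
```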

  20. Conceptual design study for an advanced cab and visual system, volume 2

    NASA Technical Reports Server (NTRS)

    Rue, R. J.; Cyrus, M. L.; Garnett, T. A.; Nachbor, J. W.; Seery, J. A.; Starr, R. L.

    1980-01-01

    The performance, design, construction and testing requirements are defined for developing an advanced cab and visual system. The rotorcraft system integration simulator is composed of the advanced cab and visual system and the rotorcraft system motion generator, and is part of an existing simulation facility. User's applications for the simulator include rotorcraft design development, product improvement, threat assessment, and accident investigation.

  1. Recent advances in spacecraft thermal-control materials research.

    NASA Technical Reports Server (NTRS)

    Zerlaut, G. A.; Gilligan, J. E.; Gates, D. W.

    1972-01-01

    The state-of-the-art of spacecraft thermal-control materials technology has been significantly advanced during the past 4 years. Selective black coatings are discussed together with black paints, dielectric films on metal surfaces, and white radiator coatings. Criteria for the selection of thermal-control surfaces are considered, giving attention to prelaunch protection, the capability of being measured, reproducibility, simulator response, and aspects of a nonindigenous space environment. Progress in space simulation is related to vacuum technology, ultraviolet sources, solar wind simulation, and the production of protons. Advances have been made in the protection against space environmental effects, and in the development of thermal-control surfaces and pigments.

  2. The Next Frontier in Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarrao, John

    2016-11-16

    Exascale computing refers to computing systems capable of at least one exaflop, or 10^18 (a billion billion) calculations per second. That is 50 times faster than the most powerful supercomputers being used today and represents a thousand-fold increase over the first petascale computer, which came into operation in 2008. How we use these large-scale simulation resources is the key to solving some of today's most pressing problems, including clean energy production, nuclear reactor lifetime extension and nuclear stockpile aging.

  3. Advanced simulation of mixed-material erosion/evolution and application to low and high-Z containing plasma facing components

    NASA Astrophysics Data System (ADS)

    Brooks, J. N.; Hassanein, A.; Sizyuk, T.

    2013-07-01

    Plasma interactions with mixed-material surfaces are being analyzed using advanced modeling of time-dependent surface evolution/erosion. Simulations use the REDEP/WBC erosion/redeposition code package coupled to the HEIGHTS package ITMC-DYN mixed-material formation/response code, with plasma parameter input from codes and data. We report here on analysis for a DIII-D Mo/C containing tokamak divertor. A DIII-D/DiMES probe experiment simulation predicts that sputtered molybdenum from a 1 cm diameter central spot quickly saturates (˜4 s) in the 5 cm diameter surrounding carbon probe surface, with subsequent re-sputtering and transport to off-probe divertor regions, and with high (˜50%) redeposition on the Mo spot. Predicted Mo content in the carbon agrees well with post-exposure probe data. We discuss implications and mixed-material analysis issues for Be/W mixing at the ITER outer divertor, and Li, C, Mo mixing at an NSTX divertor.

  4. Displays and simulators

    NASA Astrophysics Data System (ADS)

    Mohon, N.

    A 'simulator' is defined as a machine which imitates the behavior of a real system in a very precise manner. The major components of a simulator and their interaction are outlined in brief form, taking into account the major components of an aircraft flight simulator. Particular attention is given to the visual display portion of the simulator, the basic components of the display, their interactions, and their characteristics. Real image displays are considered along with virtual image displays, and image generators. Attention is given to an advanced simulator for pilot training, a holographic pancake window, a scan laser image generator, the construction of an infrared target simulator, and the Apollo Command Module Simulator.

  5. Overview of Computer Simulation Modeling Approaches and Methods

    Treesearch

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  6. Understanding interdisciplinary health care teams: using simulation design processes from the Air Carrier Advanced Qualification Program to identify and train critical teamwork skills.

    PubMed

    Hamman, William R; Beaudin-Seiler, Beth M; Beaubien, Jeffrey M

    2010-09-01

    In the report "Five Years After 'To Err is Human'", it was noted that "the combination of complexity, professional fragmentation, and a tradition of individualism, enhanced by a well-entrenched hierarchical authority structure and diffuse accountability, forms a daunting barrier to creating the habits and beliefs of common purpose, teamwork, and individual accountability for successful interdependence that a safe culture requires". Training physicians, nurses, and other professionals to work in teams is a concept that has been promoted by many patient safety experts. However, the model of teamwork in healthcare is diffusely defined, no clear performance metrics have been established, and the use of simulation to train teams has been suboptimal. This paper reports on the first three years of work performed under the Michigan Economic Development Corporation (MEDC) Tri-Corridor life science grant to apply concepts and processes of simulation design developed in the air carrier industry to understand and train healthcare teams. This work has been monitored by the American Association for the Advancement of Science (AAAS) and is based on concepts designed in the Advanced Qualification Program (AQP) from the air carrier industry, which trains and assesses teamwork skills in the same manner as technical skills. This grant has formed the foundation for the Center of Excellence for Simulation Education and Research (CESR).

  7. Fluid structure interaction simulations of the upper airway in obstructive sleep apnea patients before and after maxillomandibular advancement surgery.

    PubMed

    Chang, Kwang K; Kim, Ki Beom; McQuilling, Mark W; Movahed, Reza

    2018-06-01

    The purpose of this study was to analyze pharyngeal airflow using both computational fluid dynamics (CFD) and fluid structure interactions (FSI) in obstructive sleep apnea patients before and after maxillomandibular advancement (MMA) surgery. The airflow characteristics before and after surgery were compared with both CFD and FSI. In addition, the presurgery and postsurgery deformations of the airway were evaluated using FSI. Digitized pharyngeal airway models of 2 obstructive sleep apnea patients were generated from cone-beam computed tomography scans before and after MMA surgery. CFD and FSI were used to evaluate the pharyngeal airflow at a maximum inspiration rate of 166 ml per second. Standard steady-state numeric formulations were used for airflow simulations. Airway volume increased, pressure drop decreased, maximum airflow velocity decreased, and airway resistance dropped for both patients after the MMA surgery. These findings occurred in both the CFD and FSI simulations. The FSI simulations showed an area of marked airway deformation in both patients before surgery, but this deformation was negligible after surgery for both patients. Both CFD and FSI simulations produced airflow results that indicated less effort was needed to breathe after MMA surgery. The FSI simulations demonstrated a substantial decrease in airway deformation after surgery. These beneficial changes positively correlated with the large improvements in polysomnography outcomes after MMA surgery. Copyright © 2018 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
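
    As a back-of-envelope illustration of the quantities compared above, airway resistance can be computed as pressure drop over volumetric flow at the study's inspiration rate of 166 ml per second. The pressure values below are hypothetical, not the study's results.

```python
# Airway resistance as pressure drop over volumetric flow, R = dP / Q,
# at the study's inspiration rate of 166 ml/s. Pressures are hypothetical.

def resistance(dp_pa, q_m3s):
    """Resistance in Pa*s/m^3."""
    return dp_pa / q_m3s

q = 166e-6                        # 166 ml/s expressed in m^3/s
r_pre = resistance(60.0, q)       # assumed pre-surgery pressure drop (Pa)
r_post = resistance(25.0, q)      # assumed post-surgery pressure drop (Pa)
reduction = 1.0 - r_post / r_pre  # fractional drop in resistance
```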

  8. Simulation of EAST vertical displacement events by tokamak simulation code

    NASA Astrophysics Data System (ADS)

    Qiu, Qinglai; Xiao, Bingjia; Guo, Yong; Liu, Lei; Xing, Zhe; Humphreys, D. A.

    2016-10-01

    Vertical instability is a potentially serious hazard for elongated plasma. In this paper, the tokamak simulation code (TSC) is used to simulate vertical displacement events (VDE) on the experimental advanced superconducting tokamak (EAST). Key parameters from simulations, including plasma current, plasma shape and position, flux contours and magnetic measurements match experimental data well. The growth rates simulated by TSC are in good agreement with TokSys results. In addition to modeling the free drift, an EAST fast vertical control model enables TSC to simulate the course of VDE recovery. The trajectories of the plasma current center and control currents on internal coils (IC) fit experimental data well.

  9. Facilitating researcher use of flight simulators

    NASA Technical Reports Server (NTRS)

    Russell, C. Ray

    1990-01-01

    Researchers conducting experiments with flight simulators encounter numerous obstacles in bringing their ideas to the simulator. Research into how these simulators could be used more efficiently is presented. The study involved: (1) analyzing the Advanced Concepts Simulator software architecture, (2) analyzing the interaction between the researchers and simulation programmers, and (3) proposing a documentation tool for the researchers.

  10. CAT/RF Simulation Lessons Learned

    DTIC Science & Technology

    2003-06-11

    IVSS-2003-MAS-7 CAT/RF Simulation Lessons Learned. Christopher Mocnik, Vetronics Technology Area, RDECOM TARDEC; Tim Lee, DCS Corporation. ...developed a reconfigurable Unmanned Ground Vehicle (UGV) simulation for the Crew integration and Automation Test bed (CAT) and Robotics Follower (RF)...Advanced Technology Demonstration (ATD) experiments. This simulation was developed as a component of the Embedded Simulation System (ESS) of the CAT

  11. Striving for Better Medical Education: the Simulation Approach.

    PubMed

    Sakakushev, Boris E; Marinov, Blagoi I; Stefanova, Penka P; Kostianev, Stefan St; Georgiou, Evangelos K

    2017-06-01

Medical simulation is a rapidly expanding area within medical education due to advances in technology, significant reduction in training hours and increased procedural complexity. Simulation training aims to enhance patient safety through improved technical competency and the elimination of human-factor errors in a risk-free environment. It is particularly applicable to practical, procedure-oriented specialties. Simulation can be useful for novice trainees, experienced clinicians (e.g. for revalidation) and team building. It has become a cornerstone in the delivery of medical education, representing a paradigm shift in how doctors are educated and trained. Simulation must take a proactive position in the development of a metric-based simulation curriculum and the adoption of proficiency benchmarking definitions, and should not depend on the simulation platforms used. Conversely, ingraining of poor practice may occur in the absence of adequate supervision, and equipment malfunction during the simulation can break the immersion and disrupt any learning that has occurred. Despite the presence of high technology, there is a substantial learning curve for both learners and facilitators. The technology of simulation continues to advance, offering devices capable of improved fidelity in virtual reality simulation, more sophisticated procedural practice and advanced patient simulators. Simulation-based training has also brought about paradigm shifts in the medical and surgical education arenas and ensured that the scope and impact of simulation will continue to broaden.

  12. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. The investigation is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  13. Advances in the simulation and automated measurement of well-sorted granular material: 1. Simulation

    USGS Publications Warehouse

    Daniel Buscombe,; Rubin, David M.

    2012-01-01

In this, the first of a pair of papers which address the simulation and automated measurement of well-sorted natural granular material, a method is presented for simulation of two-phase (solid, void) assemblages of discrete non-cohesive particles. The purpose is to have a flexible, yet computationally and theoretically simple, suite of tools with well-constrained and well-known statistical properties, in order to simulate realistic granular material as a discrete element model with realistic size and shape distributions, for a variety of purposes. The stochastic modeling framework is based on three-dimensional tessellations with variable degrees of order in particle-packing arrangement. Examples of sediments with a variety of particle size distributions and spatial variability in grain size are presented. The relationship between particle shape and porosity conforms to published data. The immediate application is testing new algorithms for automated measurements of particle properties (mean and standard deviation of particle sizes, and apparent porosity) from images of natural sediment, as detailed in the second of this pair of papers. The model could also prove useful for simulating specific depositional structures found in natural sediments, the result of physical alterations to packing and grain fabric, using discrete particle flow models. While the principal focus here is on naturally occurring sediment and sedimentary rock, the methods presented might also be useful for simulations of similar granular or cellular material encountered in engineering, industrial and life sciences.
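The idea of packings with a tunable degree of order can be illustrated with a much simpler stand-in than the paper's tessellation framework: jitter particle centres away from a cubic lattice by a disorder parameter and draw lognormal sizes. Everything here (function names, parameters, the porosity estimate that ignores overlaps) is an illustrative sketch, not the authors' method.

```python
import numpy as np

def jittered_packing(n_per_side=10, disorder=0.5, mean_diam=1.0,
                     sigma=0.2, seed=0):
    """Toy variable-order packing: particle centres start on a cubic
    lattice and are perturbed by `disorder` in [0, 1] (0 = fully
    ordered, 1 = nearly random positions). Diameters are lognormal,
    mimicking a well-sorted grain-size distribution."""
    rng = np.random.default_rng(seed)
    spacing = mean_diam
    grid = np.stack(np.meshgrid(*[np.arange(n_per_side)] * 3,
                                indexing="ij"), axis=-1).reshape(-1, 3)
    centres = (grid + 0.5) * spacing
    centres += rng.uniform(-0.5, 0.5, centres.shape) * disorder * spacing
    diams = rng.lognormal(mean=np.log(mean_diam), sigma=sigma,
                          size=len(centres))
    return centres, diams

def apparent_porosity(diams, box_volume):
    """Void fraction from summed sphere volumes (overlaps ignored)."""
    solid = np.sum(np.pi * diams ** 3 / 6.0)
    return 1.0 - solid / box_volume

centres, diams = jittered_packing(disorder=0.8)
phi = apparent_porosity(diams, box_volume=10.0 ** 3)
print(f"{len(diams)} particles, apparent porosity ~ {phi:.2f}")
```

Sweeping `disorder` from 0 to 1 moves the assemblage from crystalline-like to disordered packing, which is the knob the paper's tessellation approach controls far more rigorously.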

  14. Simulating and quantifying legacy topographic data uncertainty: an initial step to advancing topographic change analyses

    NASA Astrophysics Data System (ADS)

    Wasklewicz, Thad; Zhu, Zhen; Gares, Paul

    2017-12-01

Rapid technological advances, sustained funding, and a greater recognition of the value of topographic data have helped develop an increasing archive of topographic data sources. Advances in basic and applied research related to Earth surface changes require researchers to integrate recent high-resolution topography (HRT) data with the legacy datasets. Several technical challenges and data uncertainty issues persist to date when integrating legacy datasets with more recent HRT data. The disparate data sources required to extend the topographic record back in time are often stored in formats that are not readily compatible with more recent HRT data. Legacy data may also contain unknown or unreported error that makes accounting for data uncertainty difficult. There are also cases of known deficiencies in legacy datasets, which can significantly bias results. Finally, scientists are faced with the daunting challenge of definitively deriving the extent to which a landform or landscape has or will continue to change in response to natural and/or anthropogenic processes. Here, we examine the question: how do we evaluate and portray data uncertainty from the varied topographic legacy sources and combine this uncertainty with current spatial data collection techniques to detect meaningful topographic changes? We view topographic uncertainty as a stochastic process that takes into consideration spatial and temporal variations from a numerical simulation and physical modeling experiment. The numerical simulation incorporates numerous topographic data sources typically found across a range of legacy data to present high-resolution data, while the physical model focuses on more recent HRT data acquisition techniques. Elevation uncertainties observed from anchor points in the digital terrain models are modeled using "states" in a stochastic estimator. Stochastic estimators trace the temporal evolution of the uncertainties and are natively capable of incorporating sensor
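The notion of tracking an elevation "state" and its uncertainty through a sequence of datasets of varying quality can be sketched with a one-state Kalman-style update; the fusion rule is standard, but the function name, data values, and the mapping to this paper's estimator are assumptions for illustration.

```python
import math

def fuse_elevations(observations):
    """One-state Kalman-style fusion of repeated elevation observations
    at a single anchor point. Each observation is (elevation_m, sigma_m);
    the state's variance records how uncertainty shrinks as newer,
    higher-quality (e.g. HRT) data arrive."""
    z, var = observations[0][0], observations[0][1] ** 2
    for elev, sigma in observations[1:]:
        r = sigma ** 2
        k = var / (var + r)          # gain: weight on the new observation
        z = z + k * (elev - z)       # updated elevation estimate
        var = (1.0 - k) * var        # updated (reduced) uncertainty
    return z, math.sqrt(var)

# Hypothetical sequence: coarse legacy survey, then two lidar-era datasets
obs = [(102.0, 2.0), (101.2, 0.5), (101.35, 0.1)]
z, sigma = fuse_elevations(obs)
print(f"fused elevation {z:.2f} m +/- {sigma:.2f} m")
```

The fused estimate is pulled strongly toward the precise recent data while the legacy survey still contributes, and the residuals against such a fused state are one way to decide whether an apparent elevation change exceeds the data uncertainty.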

  15. Quantifying the Effect of Fast Charger Deployments on Electric Vehicle Utility and Travel Patterns via Advanced Simulation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, E.; Neubauer, J.; Burton, E.

The disparate characteristics between conventional (CVs) and battery electric vehicles (BEVs) in terms of driving range, refill/recharge time, and availability of refuel/recharge infrastructure inherently limit the relative utility of BEVs when benchmarked against traditional driver travel patterns. However, given a high penetration of high-power public charging combined with driver tolerance for rerouting travel to facilitate charging on long-distance trips, the difference in utility between CVs and BEVs could be marginalized. We quantify the relationships between BEV utility, the deployment of fast chargers, and driver tolerance for rerouting travel and extending travel durations by simulating BEVs operated over real-world travel patterns using the National Renewable Energy Laboratory's Battery Lifetime Analysis and Simulation Tool for Vehicles (BLAST-V). With support from the U.S. Department of Energy's Vehicle Technologies Office, BLAST-V has been developed to include algorithms for estimating the available range of BEVs prior to the start of trips, for rerouting baseline travel to utilize public charging infrastructure when necessary, and for making driver travel decisions for those trips in the presence of available public charging infrastructure, all while conducting advanced vehicle simulations that account for battery electrical, thermal, and degradation response. Results from BLAST-V simulations on vehicle utility, frequency of inserted stops, duration of charging events, and additional time and distance necessary for rerouting travel are presented to illustrate how BEV utility and travel patterns can be affected by various fast charge deployments.
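The core trip-level decision described above (drive directly, reroute to a fast charger within the driver's detour tolerance, or fail the trip) can be caricatured in a few lines. This is purely an illustrative sketch under made-up parameters; BLAST-V's actual algorithms also model battery electrical, thermal, and degradation response.

```python
def plan_trip(trip_km, range_km, chargers_km, max_detour_km=20.0):
    """Classify a BEV trip as feasible directly, feasible with one
    rerouted fast-charge stop, or infeasible. `chargers_km` lists the
    extra distance each candidate charger adds to the baseline route.
    (Illustrative logic only; names and thresholds are hypothetical.)"""
    if trip_km <= range_km:
        return "direct"
    detours = [d for d in chargers_km if d <= max_detour_km]
    if detours:
        return f"reroute (+{min(detours):.0f} km detour, 1 charge stop)"
    return "infeasible"

print(plan_trip(trip_km=180, range_km=200, chargers_km=[12, 35]))
print(plan_trip(trip_km=260, range_km=200, chargers_km=[12, 35]))
print(plan_trip(trip_km=260, range_km=200, chargers_km=[35]))
```

Sweeping charger density and `max_detour_km` over a population of real travel histories is, in miniature, how the paper quantifies BEV utility against fast-charger deployment.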

  16. Advances in X-Ray Simulator Technology

    DTIC Science & Technology

    1995-07-01

d'Etudes de Gramat; I. Vitkovitsky, Logicon RDA. INTRODUCTION: DNA's future x-ray simulators are based upon inductive energy storage, a technology which ... switch. SYRINX, a proposed design to be built by the Centre d'Etudes de Gramat (CEG) in France, would employ a modular approach, possibly with a ... called SYRINX, would be built at the Centre d'Etudes de Gramat (CEG). It would employ a modular, long-conduction-time current source to drive a PRS

  17. Multicore Education through Simulation

    ERIC Educational Resources Information Center

    Ozturk, O.

    2011-01-01

    A project-oriented course for advanced undergraduate and graduate students is described for simulating multiple processor cores. Simics, a free simulator for academia, was utilized to enable students to explore computer architecture, operating systems, and hardware/software cosimulation. Motivation for including this course in the curriculum is…

  18. Kinetic-MHD hybrid simulation of fishbone modes excited by fast ions on the experimental advanced superconducting tokamak (EAST)

    NASA Astrophysics Data System (ADS)

    Pei, Youbin; Xiang, Nong; Hu, Youjun; Todo, Y.; Li, Guoqiang; Shen, Wei; Xu, Liqing

    2017-03-01

Kinetic-magnetohydrodynamic hybrid simulations are carried out to investigate fishbone modes excited by fast ions on the Experimental Advanced Superconducting Tokamak. The simulations use a realistic equilibrium reconstructed from experimental data with the constraint of the q = 1 surface location (q is the safety factor). An anisotropic slowing-down distribution is used to model the distribution of the fast ions from neutral beam injection. The resonance condition is used to identify the interaction between the fishbone mode and the fast ions, which shows that the fishbone mode is simultaneously in resonance with the bounce motion of the trapped particles and the transit motion of the passing particles. Both the passing and trapped particles are important in destabilizing the fishbone mode. The simulations show that the mode frequency chirps down as the mode reaches the nonlinear stage, during which there is a substantial flattening of the perpendicular pressure of fast ions, compared with that of the parallel pressure. For passing particles, the resonance remains within the q = 1 surface, while, for trapped particles, the resonant location moves out radially during the nonlinear evolution. In addition, parameter scanning is performed to examine the dependence of the linear frequency and growth rate of fishbones on the pressure and injection velocity of fast ions.
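For reference, the wave-particle resonance condition invoked above is usually written in a form like the following (this is the standard textbook form with integer harmonic p; the paper's exact notation may differ):

```latex
% n: toroidal mode number; p: integer bounce/transit harmonic
\begin{align*}
  \text{passing ions:} \quad & \omega = n\,\omega_{\phi} + p\,\omega_{\theta}, \\
  \text{trapped ions:} \quad & \omega = n\,\langle\dot{\phi}\rangle + p\,\omega_{b},
\end{align*}
```

where $\omega_{\phi}$ and $\omega_{\theta}$ are the toroidal transit and poloidal frequencies of passing ions, $\omega_{b}$ the bounce frequency, and $\langle\dot{\phi}\rangle$ the bounce-averaged toroidal precession frequency of trapped ions. The reported simultaneous resonance corresponds to the mode frequency satisfying both relations at once for the respective populations.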

  19. Advancing botnet modeling techniques for military and security simulations

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that calls for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, one that is more powerful and potentially dangerous than any other type of malware. A botnet's power derives from several capabilities, including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, it can also be extremely large, comprising tens of thousands, if not millions, of compromised computers, or as small as a few thousand targeted systems. In all botnets, the members can surreptitiously communicate with each other and with their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
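The MSEIR half of the proposed hybrid is a standard compartment model; a minimal Euler-stepped sketch follows. The mapping of compartments to botnet host classes, the parameter names, and their values are all illustrative assumptions, and the jump-diffusion component is omitted entirely.

```python
def mseir_step(state, beta, delta, sigma, gamma, dt=0.01):
    """One Euler step of a normalized MSEIR compartment model, read here
    as: M hosts behind defenses, S susceptible, E compromised but
    dormant, I actively spreading bots, R cleaned/patched hosts."""
    m, s, e, i, r = state
    dm = -delta * m                      # defenses lapse: M -> S
    ds = delta * m - beta * s * i        # infection pressure: S -> E
    de = beta * s * i - sigma * e        # bot activation: E -> I
    di = sigma * e - gamma * i           # detection/cleanup: I -> R
    dr = gamma * i
    return tuple(x + dt * dx for x, dx in zip(state, (dm, ds, de, di, dr)))

state = (0.2, 0.79, 0.0, 0.01, 0.0)      # 1% of hosts initially infected
for _ in range(20000):                   # simulate 200 time units
    state = mseir_step(state, beta=0.6, delta=0.05, sigma=0.3, gamma=0.2)
print("final M,S,E,I,R:", [round(x, 3) for x in state])
```

With these (hypothetical) rates the infective fraction rises, peaks, and collapses as susceptibles are exhausted, which is the qualitative propagation curve a botnet simulation embedded in a larger military simulation would need to reproduce.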

  20. Advances in Discrete-Event Simulation for MSL Command Validation

    NASA Technical Reports Server (NTRS)

    Patrikalakis, Alexander; O'Reilly, Taifun

    2013-01-01

    In the last five years, the discrete event simulator, SEQuence GENerator (SEQGEN), developed at the Jet Propulsion Laboratory to plan deep-space missions, has greatly increased uplink operations capacity to deal with increasingly complicated missions. In this paper, we describe how the Mars Science Laboratory (MSL) project makes full use of an interpreted environment to simulate change in more than fifty thousand flight software parameters and conditional command sequences to predict the result of executing a conditional branch in a command sequence, and enable the ability to warn users whenever one or more simulated spacecraft states change in an unexpected manner. Using these new SEQGEN features, operators plan more activities in one sol than ever before.

  1. Technical tips and advancements in pediatric minimally invasive surgical training on porcine based simulations.

    PubMed

    Narayanan, Sarath Kumar; Cohen, Ralph Clinton; Shun, Albert

    2014-06-01

Minimal access techniques have transformed the way pediatric surgery is practiced. Due to various constraints, surgical residency programs have not been able to provide adequate skills training in the routine setting. The advent of new technology and methods in minimally invasive surgery (MIS) has similarly contributed to the need for systematic skills training in a safe, simulated environment. To enable the training of proper technique among pediatric surgery trainees, we have advanced a porcine non-survival model for endoscopic surgery. The technical advancements over the past 3 years and a subjective validation of the porcine model by 114 participating trainees, using a standard questionnaire and a 5-point Likert scale, are described here. Mean attitude scores and analysis of variance (ANOVA) were used for statistical analysis of the data. Almost all trainees agreed or strongly agreed that the animal-based model was appropriate (98.35%) and also acknowledged that such workshops provided adequate practical experience before attempting procedures on human subjects (96.6%). Mean attitude score for respondents was 19.08 (SD 3.4, range 4-20). Attitude scores showed no statistical association with years of experience or the level of seniority, indicating a positive attitude among all groups of respondents. Structured porcine-based MIS training should be an integral part of skill acquisition for pediatric surgery trainees, and the experience gained can be transferred into clinical practice. We advocate that laparoscopic training should begin in a controlled workshop setting before procedures are attempted on human patients.

  2. Quarks and the cosmos.

    PubMed

    Turner, Michael S

    2007-01-05

    Cosmology is in the midst of a period of revolutionary discovery, propelled by bold ideas from particle physics and by technological advances from gigapixel charge-coupled device cameras to peta-scale computing. The basic features of the universe have now been determined: It is 13.7 billion years old, spatially flat, and expanding at an accelerating rate; it is composed of atoms (4%), exotic dark matter (20%), and dark energy (76%); and there is evidence that galaxies and other structures were seeded by quantum fluctuations. Although we know much about the universe, we understand far less. Poised to dramatically advance our understanding of both the universe and the laws that govern it, cosmology is on the verge of a golden age.

  3. Recent advances in lunar base simulation

    NASA Astrophysics Data System (ADS)

    Johenning, B.; Koelle, H. H.

This article reports the results of the latest computer runs of a lunar base simulation model. The lunar base consists of 20 facilities for lunar mining, processing and fabrication. The infrastructure includes solar and nuclear power plants, a central workshop, habitat and farm. Lunar products can be used for construction of solar power systems (SPS) or other spacecraft at several space locations. The simulation model evaluates the mass, energy and manpower flows between the elements of the system as well as system cost and cost of products on an annual basis for a given operational period. The 1983 standard model run over a fifty-year life cycle (beginning about the year 2000) was accomplished for a mean annual production volume of 78 180 Mg of hardware products for export, resulting in average specific manufacturing cost of 8.4 $/kg and total annual cost of 1.25 billion dollars during the life cycle. The reference space transportation system uses LOX/LH2 propulsion, for which on average 210 500 Mg of LOX per year is produced on the moon. The sensitivity analysis indicates the importance of bootstrapping as well as the influence of market size, space transportation cost and specific resources demand on the mean lunar manufacturing cost. The option of using lunar resources turns out to be quite attractive from the economic viewpoint. Systems analysis with this lunar base model and further trade-offs will be a useful tool to confirm this.

  4. Simulation model of the F/A-18 high angle-of-attack research vehicle utilized for the design of advanced control laws

    NASA Technical Reports Server (NTRS)

    Strickland, Mark E.; Bundick, W. Thomas; Messina, Michael D.; Hoffler, Keith D.; Carzoo, Susan W.; Yeager, Jessie C.; Beissner, Fred L., Jr.

    1996-01-01

    The 'f18harv' six degree-of-freedom nonlinear batch simulation used to support research in advanced control laws and flight dynamics issues as part of NASA's High Alpha Technology Program is described in this report. This simulation models an F/A-18 airplane modified to incorporate a multi-axis thrust-vectoring system for augmented pitch and yaw control power and actuated forebody strakes for enhanced aerodynamic yaw control power. The modified configuration is known as the High Alpha Research Vehicle (HARV). The 'f18harv' simulation was an outgrowth of the 'f18bas' simulation which modeled the basic F/A-18 with a preliminary version of a thrust-vectoring system designed for the HARV. The preliminary version consisted of two thrust-vectoring vanes per engine nozzle compared with the three vanes per engine actually employed on the F/A-18 HARV. The modeled flight envelope is extensive in that the aerodynamic database covers an angle-of-attack range of -10 degrees to +90 degrees, sideslip range of -20 degrees to +20 degrees, a Mach Number range between 0.0 and 2.0, and an altitude range between 0 and 60,000 feet.

  5. Process simulation for advanced composites production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allendorf, M.D.; Ferko, S.M.; Griffiths, S.

    1997-04-01

The objective of this project is to improve the efficiency and lower the cost of chemical vapor deposition (CVD) processes used to manufacture advanced ceramics by providing the physical and chemical understanding necessary to optimize and control these processes. Project deliverables include: numerical process models; databases of thermodynamic and kinetic information related to the deposition process; and process sensors and software algorithms that can be used for process control. Target manufacturing techniques include CVD fiber coating technologies (used to deposit interfacial coatings on continuous fiber ceramic preforms), chemical vapor infiltration, thin-film deposition processes used in the glass industry, and coating techniques used to deposit wear-, abrasion-, and corrosion-resistant coatings for use in the pulp and paper, metals processing, and aluminum industries.

  6. Advances in helical stent design and fabrication thermal treatment and structural interaction studies of the simulated plaque-laden artery

    NASA Astrophysics Data System (ADS)

    Welch, Tre Raymond

Advancements in processing biomaterials have led to the development of bioresorbable PLLA drug-loaded stents with different geometric configurations. To further advance the technology, systematic studies have been carried out. This dissertation consists of five specific aims: (1) To characterize the effects of thermal annealing on the mechanical characteristics of a PLLA helical stent, (2) To characterize the mechanical characteristics of a PLLA double helix stent, (3) To characterize the physical and chemical properties of PLLA films impregnated with niacin and curcumin, (4) To characterize the mechanical interaction of expanded stent and vascular wall with both model simulation and experimental studies using PDMS phantom arteries, (5) To simulate the stent-plaque-artery interactions using computer models. Results and their significance for bioresorbable PLLA drug-loaded stent technology as well as clinical prospects will be presented. For Aim 1, thermal annealing is shown to improve mechanical characteristics of the helical stent, including pressure-diameter response curves, incremental stiffness, and collapse pressure. Differential scanning calorimetric analysis of the stent fiber reveals that thermal annealing contributes to increased percent crystallinity, and thus to the enhanced mechanical characteristics of the stent. For Aim 2, the new double helix design was shown to lead to improved mechanical characteristics of the stent, including pressure-diameter response curves, incremental stiffness, and collapse pressure. Further, it was found to lead to an increased percent crystallinity and reduced degradation rate. For Aim 3, the changes in mechanical properties and crystallinity in PLLA polymer loaded with curcumin, or niacin, or both from that of the control are clearly delineated. Results from Aim 4 shed light on the mechanical disturbance in the vicinity of the deployed stent and vascular wall as well as the abnormal shear stresses on the vascular endothelium. Their implications in

  7. The Zooniverse

    NASA Astrophysics Data System (ADS)

    Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.

    2009-12-01

    The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities

  8. Advanced Aerospace Materials by Design

    NASA Technical Reports Server (NTRS)

    Srivastava, Deepak; Djomehri, Jahed; Wei, Chen-Yu

    2004-01-01

The advances in the emerging field of nanophase thermal and structural composite materials; materials with embedded sensors and actuators for morphing structures; light-weight composite materials for energy and power storage; and large surface area materials for in-situ resource generation and waste recycling are expected to revolutionize the capabilities of virtually every system comprising future robotic and human Moon and Mars exploration missions. A high-performance multiscale simulation platform, including the computational capabilities and resources of Columbia, the new supercomputer, is being developed to discover, validate, and prototype the next generation of such advanced materials. This exhibit will describe the porting and scaling of multiscale physics-based core computer simulation codes for discovering and designing carbon nanotube-polymer composite materials for light-weight load-bearing structural and thermal protection applications.

  9. Preclinical endoscopic training using a part-task simulator: learning curve assessment and determination of threshold score for advancement to clinical endoscopy.

    PubMed

    Jirapinyo, Pichamol; Abidi, Wasif M; Aihara, Hiroyuki; Zaki, Theodore; Tsay, Cynthia; Imaeda, Avlin B; Thompson, Christopher C

    2017-10-01

Preclinical simulator training has the potential to decrease endoscopic procedure time and patient discomfort. This study aims to characterize the learning curve of endoscopic novices in a part-task simulator and propose a threshold score for advancement to initial clinical cases. Twenty novices with no prior endoscopic experience underwent repeated endoscopic simulator sessions using the part-task simulator. Simulator scores were collected; their inverse was averaged and fit to an exponential curve. The incremental improvement after each session was calculated. Plateau was defined as the session after which incremental improvement in simulator score model was less than 5%. Additionally, all participants filled out questionnaires regarding simulator experience after sessions 1, 5, 10, 15, and 20. A visual analog scale and NASA task load index were used to assess levels of comfort and demand. Twenty novices underwent 400 simulator sessions. Mean simulator scores at sessions 1, 5, 10, 15, and 20 were 78.5 ± 5.95, 176.5 ± 17.7, 275.55 ± 23.56, 347 ± 26.49, and 441.11 ± 38.14. The best-fit exponential model was [time/score] = 26.1 × [session #]^(−0.615); r² = 0.99. This corresponded to an incremental improvement in score of 35% after the first session, 22% after the second, 16% after the third and so on. Incremental improvement dropped below 5% after the 12th session, corresponding to the predicted score of 265. Simulator training was related to higher comfort maneuvering an endoscope and increased readiness for supervised clinical endoscopy, both plateauing between sessions 10 and 15. Mental demand, physical demand, and frustration levels decreased with increased simulator training. Preclinical training using an endoscopic part-task simulator appears to increase comfort level and decrease mental and physical demand associated with endoscopy. Based on a rigorous model, we recommend that novices complete a minimum of 12 training
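The plateau calculation above can be reproduced directly from the fitted power law: since time/score = a × s^(−b), the predicted score grows as s^b / a, and the relative gain from one session to the next is (s/(s−1))^b − 1. A minimal sketch follows; note the exact session at which the gain crosses 5% depends on how "incremental improvement" is defined, so the value this particular convention returns need not match the paper's reported 12th session.

```python
def plateau_session(a=26.1, b=0.615, threshold=0.05):
    """First session s (s >= 2) whose predicted score gain over session
    s-1 falls below `threshold`, given the paper's fitted model
    time/score = a * s**(-b), i.e. score(s) = s**b / a. The plateau
    convention used here is our assumption, not the paper's."""
    s = 2
    while (s / (s - 1)) ** b - 1 >= threshold:
        s += 1
    return s

print(plateau_session())
```

The constant `a` cancels out of the ratio, so under this convention the plateau session depends only on the exponent `b` and the chosen threshold.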

  10. A Scalable O(N) Algorithm for Large-Scale Parallel First-Principles Molecular Dynamics Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-01-01

Traditional algorithms for first-principles molecular dynamics (FPMD) simulations only gain a modest capability increase from current petascale computers, due to their O(N³) complexity and their heavy use of global communications. To address this issue, we are developing a truly scalable O(N) complexity FPMD algorithm, based on density functional theory (DFT), which avoids global communications. The computational model uses a general nonorthogonal orbital formulation for the DFT energy functional, which requires knowledge of selected elements of the inverse of the associated overlap matrix. We present a scalable algorithm for approximately computing selected entries of the inverse of the overlap matrix, based on an approximate inverse technique, by inverting local blocks corresponding to principal submatrices of the global overlap matrix. The new FPMD algorithm exploits sparsity and uses nearest neighbor communication to provide a computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic orbitals are confined, and a cutoff beyond which the entries of the overlap matrix can be omitted when computing selected entries of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to O(100K) atoms on O(100K) processors, with a wall-clock time of O(1) minute per molecular dynamics time step.
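The block-local approximate-inverse idea can be demonstrated on a toy banded "overlap" matrix: for each column, invert only a small principal submatrix around it and keep that column of the local inverse. This index-window version is a simplification for illustration (the production algorithm partitions orbitals by spatial locality and runs in parallel with nearest-neighbor communication); the accuracy control via a locality cutoff is the same in spirit.

```python
import numpy as np

def selected_inverse(S, half_width=2):
    """Approximate selected entries of inv(S) for a banded, diagonally
    dominant overlap-like matrix S: for each column j, invert only the
    principal submatrix within `half_width` of j and take its column.
    Entries outside the window are treated as negligible."""
    n = S.shape[0]
    approx = np.zeros_like(S)
    for j in range(n):
        lo, hi = max(0, j - half_width), min(n, j + half_width + 1)
        block_inv = np.linalg.inv(S[lo:hi, lo:hi])
        approx[lo:hi, j] = block_inv[:, j - lo]   # column j of local inverse
    return approx

# Overlap-like test matrix: identity plus weak nearest-neighbour coupling
n = 12
S = np.eye(n) + 0.1 * (np.eye(n, k=1) + np.eye(n, k=-1))
err = np.max(np.abs(selected_inverse(S, half_width=3) - np.linalg.inv(S)))
print(f"max entrywise error ~ {err:.1e}")
```

Because the true inverse of a diagonally dominant banded matrix decays rapidly away from the diagonal, the local blocks capture the needed entries well, and widening `half_width` trades cost for accuracy just as the localization-region size does in the full algorithm.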

  11. Surgical simulation: a urological perspective.

    PubMed

    Wignall, Geoffrey R; Denstedt, John D; Preminger, Glenn M; Cadeddu, Jeffrey A; Pearle, Margaret S; Sweet, Robert M; McDougall, Elspeth M

    2008-05-01

    Surgical education is changing rapidly as several factors including budget constraints and medicolegal concerns limit opportunities for urological trainees. New methods of skills training such as low fidelity bench trainers and virtual reality simulators offer new avenues for surgical education. In addition, surgical simulation has the potential to allow practicing surgeons to develop new skills and maintain those they already possess. We provide a review of the background, current status and future directions of surgical simulators as they pertain to urology. We performed a literature review and an overview of surgical simulation in urology. Surgical simulators are in various stages of development and validation. Several simulators have undergone extensive validation studies and are in use in surgical curricula. While virtual reality simulators offer the potential to more closely mimic reality and present entire operations, low fidelity simulators remain useful in skills training, particularly for novices and junior trainees. Surgical simulation remains in its infancy. However, the potential to shorten learning curves for difficult techniques and practice surgery without risk to patients continues to drive the development of increasingly more advanced and realistic models. Surgical simulation is an exciting area of surgical education. The future is bright as advancements in computing and graphical capabilities offer new innovations in simulator technology. Simulators must continue to undergo rigorous validation studies to ensure that time spent by trainees on bench trainers and virtual reality simulators will translate into improved surgical skills in the operating room.

  12. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 

  13. The Next Frontier in Computing

    ScienceCinema

    Sarrao, John

    2018-06-13

    Exascale computing refers to computing systems capable of at least one exaflop, that is, a billion billion (10^18) calculations per second. That is 50 times faster than the most powerful supercomputers being used today and represents a thousand-fold increase over the first petascale computer that came into operation in 2008. How we use these large-scale simulation resources is the key to solving some of today’s most pressing problems, including clean energy production, nuclear reactor lifetime extension and nuclear stockpile aging.

  14. Communication training for advanced medical students improves information recall of medical laypersons in simulated informed consent talks – a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Informed consent talks are mandatory before invasive interventions. However, the patients’ information recall has been shown to be rather poor. We investigated whether medical laypersons recalled more information items from a simulated informed consent talk after advanced medical students participated in a communication training aiming to reduce a layperson’s cognitive load. Methods Using a randomized, controlled, prospective crossover design, 30 fifth- and sixth-year medical students were randomized into two groups. One group received communication training, followed by a comparison intervention (early intervention group, EI); the other group first received the comparison intervention and then communication training (late intervention group, LI). Before and after the interventions, the 30 medical students performed simulated informed consent talks with 30 blinded medical laypersons using a standardized set of information. We then recorded the number of information items the medical laypersons recalled. Results After the communication training both groups of medical laypersons recalled significantly more information items (EI: 41 ± 9% vs. 23 ± 9%, p < .0001, LI 49 ± 10% vs. 35 ± 6%, p < .0001). After the comparison intervention the improvement was modest and significant only in the LI (EI: 42 ± 9% vs. 40 ± 9%, p = .41, LI 35 ± 6% vs. 29 ± 9%, p = .016). Conclusion Short communication training for advanced medical students improves information recall of medical laypersons in simulated informed consent talks. PMID:23374907

  15. Advanced Solid State Lighting for Human Evaluation Project

    NASA Technical Reports Server (NTRS)

    Zeitlin, Nancy; Holbert, Eirik

    2015-01-01

    Lighting intensity and color have a significant impact on human circadian rhythms. Advanced solid state lighting was developed for the Advanced Exploration System (AES) Deep Space Habitat(DSH) concept demonstrator. The latest generation of assemblies using the latest commercially available LED lights were designed for use in the Bigelow Aerospace Environmental Control and Life Support System (ECLSS) simulator and the University of Hawaii's Hawaii Space Exploration Analog and Simulation (Hi-SEAS) habitat. Agreements with both these organizations will allow the government to receive feedback on the lights and lighting algorithms from long term human interaction.

  16. Coupled field effects in BWR stability simulations using SIMULATE-3K

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borkowski, J.; Smith, K.; Hagrman, D.

    1996-12-31

    The SIMULATE-3K code is the transient analysis version of the Studsvik advanced nodal reactor analysis code, SIMULATE-3. Recent developments have focused on further broadening the range of transient applications by refinement of core thermal-hydraulic models and on comparison with boiling water reactor (BWR) stability measurements performed at Ringhals unit 1, during the startups of cycles 14 through 17.

  17. Effect of Advanced Trauma Life Support program on medical interns' performance in simulated trauma patient management.

    PubMed

    Ahmadi, Koorosh; Sedaghat, Mohammad; Safdarian, Mahdi; Hashemian, Amir-Masoud; Nezamdoust, Zahra; Vaseie, Mohammad; Rahimi-Movaghar, Vafa

    2013-01-01

    Since appropriate and timely trauma care methods have an important impact on patients' outcomes, we evaluated the effect of the Advanced Trauma Life Support (ATLS) program on medical interns' performance in simulated trauma patient management. A descriptive and analytical before-and-after study was conducted on 24 randomly selected undergraduate medical interns from Imam Reza Hospital in Mashhad, Iran. On the first day, we assessed interns' clinical knowledge and their practical skill performance in confronting simulated trauma patients. After 2 days of ATLS training, we repeated the assessment and evaluated their scores again on the fourth day. Findings from the pre- and post-ATLS periods were compared using SPSS version 15.0. P values less than 0.05 were considered statistically significant. Our findings showed that interns' ability in all three tasks improved after the training course. On the fourth day after training, there was a statistically significant increase in interns' clinical knowledge of ATLS procedures, the sequence of procedures, and skill performance in trauma situations (P < 0.001, P = 0.016, and P = 0.01, respectively). The ATLS course has an important role in increasing clinical knowledge and practical skill performance of trauma care in medical interns.

  18. Achieving Accreditation Council for Graduate Medical Education duty hours compliance within advanced surgical training: a simulation-based feasibility assessment.

    PubMed

    Obi, Andrea; Chung, Jennifer; Chen, Ryan; Lin, Wandi; Sun, Siyuan; Pozehl, William; Cohn, Amy M; Daskin, Mark S; Seagull, F Jacob; Reddy, Rishindra M

    2015-11-01

    Certain operative cases occur unpredictably and/or have long operative times, creating a conflict between Accreditation Council for Graduate Medical Education (ACGME) rules and adequate training experience. A ProModel-based simulation was developed from historical data. Probabilistic distributions of operative time were calculated and combined with an ACGME-compliant call schedule. For the advanced surgical cases modeled (cardiothoracic transplants), the 80-hour rule was violated 6.07% of the time and the minimum number of days off was violated 22.50% of the time. There was a 36% chance of failing to fulfill at least one minimum case requirement (heart or lung) despite adequate volume. The variable nature of emergency cases inevitably leads to work hour violations under ACGME regulations. Unpredictable cases mandate higher operative volume to ensure achievement of adequate caseloads. Publicly available simulation technology provides a valuable avenue to identify adequacy of case volumes for trainees in both the elective and emergency settings. Copyright © 2015 Elsevier Inc. All rights reserved.
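    The study's approach can be imitated with a minimal Monte Carlo sketch (plain Python/NumPy rather than ProModel; the weekly base load, case arrival rate, and lognormal operative-time parameters below are invented for illustration, not the study's fitted distributions):

```python
import numpy as np

def violation_rate(n_weeks=100_000, base_hours=60.0, case_rate=1.0,
                   log_mean=2.0, log_sd=0.5, seed=0):
    """Fraction of simulated weeks in which unpredictable long cases push a
    trainee past the ACGME 80-hour limit. All parameters are hypothetical."""
    rng = np.random.default_rng(seed)
    n_cases = rng.poisson(case_rate, n_weeks)       # emergent cases per week
    totals = np.full(n_weeks, base_hours)
    for week, k in enumerate(n_cases):
        # Operative times drawn from a lognormal (mean ~8.4 h with these params)
        totals[week] += rng.lognormal(log_mean, log_sd, k).sum()
    return float((totals > 80.0).mean())

print(f"80-hour violations in {violation_rate():.1%} of simulated weeks")
```

    Sweeping `case_rate` in such a model is one way to estimate how much elective volume can coexist with unpredictable emergency cases before duty-hour compliance breaks down.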

  19. Virtual Reality and Simulation in Neurosurgical Training.

    PubMed

    Bernardo, Antonio

    2017-10-01

    Recent biotechnological advances, including three-dimensional microscopy and endoscopy, virtual reality, surgical simulation, surgical robotics, and advanced neuroimaging, have continued to mold the surgeon-computer relationship. For developing neurosurgeons, such tools can reduce the learning curve, improve conceptual understanding of complex anatomy, and enhance visuospatial skills. We explore the current and future roles and application of virtual reality and simulation in neurosurgical training. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Goal-directed transthoracic echocardiography during advanced cardiac life support: A pilot study using simulation to assess ability

    PubMed Central

    Greenstein, Yonatan Y.; Martin, Thomas J.; Rolnitzky, Linda; Felner, Kevin; Kaufman, Brian

    2015-01-01

    Introduction Goal-directed echocardiography (GDE) is used to answer specific clinical questions which provide invaluable information to physicians managing a hemodynamically unstable patient. We studied perception and ability of housestaff previously trained in GDE to accurately diagnose common causes of cardiac arrest during simulated advanced cardiac life support (ACLS); we compared their results to those of expert echocardiographers. Methods Eleven pulmonary and critical care medicine fellows, seven emergency medicine residents, and five cardiologists board-certified in echocardiography were enrolled. Baseline ability to acquire four transthoracic echocardiography views was assessed and participants were exposed to six simulated cardiac arrests and were asked to perform a GDE during ACLS. Housestaff performance was compared to the performance of five expert echocardiographers. Results Average baseline and scenario views by housestaff were of good or excellent quality 89% and 83% of the time, respectively. Expert average baseline and scenario views were always of good or excellent quality. Housestaff and experts made the correct diagnosis in 68% and 77% of cases, respectively. On average, participants required 1.5 pulse checks to make the correct diagnosis. 94% of housestaff perceived this study as an accurate assessment of ability. Conclusions In an ACLS compliant manner, housestaff are capable of diagnosing management altering pathologies the majority of the time and they reach similar diagnostic conclusions in the same amount of time as expert echocardiographers in a simulated cardiac arrest scenario. PMID:25932707

  2. Spatial Disorientation Training in the Rotor Wing Flight Simulator.

    PubMed

    Powell-Dunford, Nicole; Bushby, Alaistair; Leland, Richard A

    This study is intended to identify efficacy, evolving applications, best practices, and challenges of spatial disorientation (SD) training in flight simulators for rotor wing pilots. Queries of a UK Ministry of Defense research database and PubMed were undertaken using the search terms 'spatial disorientation,' 'rotor wing,' and 'flight simulator.' Efficacy, evolving applications, best practices, and challenges of SD simulation for rotor wing pilots were also ascertained through discussion with subject matter experts and industrial partners. Expert opinions were solicited at the aeromedical physiologist, aeromedical psychologist, instructor pilot, aeromedical examiner, and corporate executive levels. Peer review literature search yielded 129 articles, with 5 relevant to the use of flight simulators for the spatial disorientation training of rotor wing pilots. Efficacy of such training was measured subjectively and objectively. A preponderance of anecdotal reports endorse the benefits of rotor wing simulator SD training, with a small trial substantiating performance improvement. Advancing technologies enable novel training applications. The mobile nature of flight students and concurrent anticollision technologies can make long-range assessment of SD training efficacy challenging. Costs of advanced technologies could limit the extent to which the most advanced simulators can be employed across the rotor wing community. Evidence suggests the excellent training value of rotor wing simulators for SD training. Objective data from further research, particularly with regard to evolving technologies, may justify further usage of advanced simulator platforms for SD training and research. Powell-Dunford N, Bushby A, Leland RA. Spatial disorientation training in the rotor wing flight simulator. Aerosp Med Hum Perform. 2016; 87(10):890-893.

  3. A Monte Carlo simulation of advanced HIV disease: application to prevention of CMV infection.

    PubMed

    Paltiel, A D; Scharfstein, J A; Seage, G R; Losina, E; Goldie, S J; Weinstein, M C; Craven, D E; Freedberg, K A

    1998-01-01

    Disagreement exists among decision makers regarding the allocation of limited HIV patient care resources and, specifically, the comparative value of preventing opportunistic infections in late-stage disease. A Monte Carlo simulation framework was used to evaluate a state-transition model of the natural history of HIV illness in patients with CD4 counts below 300/mm3 and to project the costs and consequences of alternative strategies for preventing AIDS-related complications. The authors describe the model and demonstrate how it may be employed to assess the cost-effectiveness of oral ganciclovir for prevention of cytomegalovirus (CMV) infection. Ganciclovir prophylaxis confers an estimated additional 0.7 quality-adjusted month of life at a net cost of $10,700, implying an incremental cost-effectiveness ratio of roughly $173,000 per quality-adjusted life year gained. Sensitivity analysis reveals that this baseline result is stable over a wide range of input data estimates, including quality of life and drug efficacy, but it is sensitive to CMV incidence and drug price assumptions. The Monte Carlo simulation framework offers decision makers a powerful and flexible tool for evaluating choices in the realm of chronic disease patient care. The authors have used it to assess HIV-related treatment options and continue to refine it to reflect advances in defining the pathogenesis and treatment of AIDS. Compared with alternative interventions, CMV prophylaxis does not appear to be a cost-effective use of scarce HIV clinical care funds. However, targeted prevention in patients identified to be at higher risk for CMV-related disease may warrant consideration.
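    The headline ratio is a one-line calculation once the Monte Carlo model has produced incremental costs and effects. Below is a back-of-envelope check using the abstract's rounded figures; the published ~$173,000/QALY comes from the full state-transition model with its internal precision, so this sketch only lands in the same range, not on the exact number.

```python
def icer_per_qaly(delta_cost_usd, delta_qalm):
    """Incremental cost-effectiveness ratio in dollars per quality-adjusted
    life-year (QALY), from an incremental cost and a gain expressed in
    quality-adjusted life-months (QALM)."""
    return delta_cost_usd / (delta_qalm / 12.0)

# Abstract's rounded inputs: +0.7 quality-adjusted months at a net cost of $10,700
ratio = icer_per_qaly(10_700, 0.7)
print(f"${ratio:,.0f} per QALY gained")
```

    The sensitivity analysis in the abstract amounts to re-running this ratio while varying the inputs (CMV incidence, drug price) that drive `delta_cost_usd` and `delta_qalm`.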

  4. A Simulation-Based Program to Train Medical Residents to Lead and Perform Advanced Cardiovascular Life Support

    PubMed Central

    Stefan, Mihaela S.; Belforti, Raquel K.; Langlois, Gerard; Rothberg, Michael B.

    2014-01-01

    Background Medical residents are often responsible for leading and performing cardiopulmonary resuscitation; however, their levels of expertise and comfort as leaders of advanced cardiovascular life support (ACLS) teams vary widely. While the current American Heart Association ACLS course provides education in recommended resuscitative protocols, training in leadership skills is insufficient. In this article, we describe the design and implementation in our institution of a formative curriculum aimed at improving residents’ readiness for being leaders of ACLS teams using human patient simulation. Human patient simulation refers to a variety of technologies using mannequins with realistic features, which allows learners to practice through scenarios without putting patients at risk. We discuss the limitations of the program and the challenges encountered in implementation. We also provide a description of the initiation and organization of the program. Case scenarios and assessment tools are provided. Description of the Institutional Training Program Our simulation-based training curriculum consists of 8 simulated patient scenarios during four 1-hour sessions. Postgraduate year–2 and 3 internal medicine residents participate in this program in teams of 4. Assessment tools are utilized only for formative evaluation. Debriefing is used as a teaching strategy for the individual resident leader of the ACLS team to facilitate learning and improve performance. To evaluate the impact of the curriculum, we administered a survey before and after the intervention. The survey consisted of 10 questions answered on a 5-point Likert scale, which addressed residents’ confidence in leading ACLS teams, management of the equipment, and management of cardiac rhythms. Respondents’ mean presimulation (ie, baseline) and postsimulation (outcome) scores were compared using a 2-sample t test. Residents’ overall confidence score improved from 2.8 to 3.9 (P < 0.001; mean improvement

  5. Advanced Chemical Modeling for Turbulent Combustion Simulations

    DTIC Science & Technology

    2012-05-03

    premixed combustion. The chemistry work proposes a method for defining jet fuel surrogates, describes how different sub-mechanisms can be incorporated... Chemical Modeling For Turbulent Combustion Simulations, Final Report submitted by: Heinz Pitsch (PI), Stanford University Mechanical Engineering, Flow Physics... predict the combustion characteristics of fuel oxidation and pollutant emissions from engines. The relevant fuel chemistry must be accurately modeled

  6. Development of a Turbofan Engine Simulation in a Graphical Simulation Environment

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Guo, Ten-Heui

    2003-01-01

    This paper presents the development of a generic component level model of a turbofan engine simulation with a digital controller, in an advanced graphical simulation environment. The goal of this effort is to develop and demonstrate a flexible simulation platform for future research in propulsion system control and diagnostic technology. A previously validated FORTRAN-based model of a modern, high-performance, military-type turbofan engine is being used to validate the platform development. The implementation process required the development of various innovative procedures, which are discussed in the paper. Open-loop and closed-loop comparisons are made between the two simulations. Future enhancements that are to be made to the modular engine simulation are summarized.

  7. MHD Simulation of Magnetic Nozzle Plasma with the NIMROD Code: Applications to the VASIMR Advanced Space Propulsion Concept

    NASA Astrophysics Data System (ADS)

    Tarditi, Alfonso G.; Shebalin, John V.

    2002-11-01

    A simulation study with the NIMROD code [1] is being carried out to investigate the efficiency of the thrust generation process and the properties of the plasma detachment in a magnetic nozzle. In the simulation, hot plasma is injected in the magnetic nozzle, modeled as a 2D, axi-symmetric domain. NIMROD has two-fluid, 3D capabilities but the present runs are being conducted within the MHD, 2D approximation. As the plasma travels through the magnetic field, part of its thermal energy is converted into longitudinal kinetic energy, along the axis of the nozzle. The plasma eventually detaches from the magnetic field at a certain distance from the nozzle throat where the kinetic energy becomes larger than the magnetic energy. Preliminary NIMROD 2D runs have been benchmarked with a particle trajectory code showing satisfactory results [2]. Further testing is reported here, with emphasis on the analysis of the diffusion rate across the field lines and of the overall nozzle efficiency. These simulation runs are specifically designed for obtaining comparisons with laboratory measurements of the VASIMR experiment, by looking at the evolution of the radial plasma density and temperature profiles in the nozzle. VASIMR (Variable Specific Impulse Magnetoplasma Rocket, [3]) is an advanced space propulsion concept currently under experimental development at the Advanced Space Propulsion Laboratory, NASA Johnson Space Center. A plasma (typically ionized Hydrogen or Helium) is generated by a RF (Helicon) discharge and heated by an Ion Cyclotron Resonance Heating antenna. The heated plasma is then guided into a magnetic nozzle to convert the thermal plasma energy into effective thrust. The VASIMR system has no electrodes and a solenoidal magnetic field produced by an asymmetric mirror configuration ensures magnetic insulation of the plasma from the material surfaces. By powering the plasma source and the heating antenna at different levels it is possible to vary smoothly of the

  8. A Petascale Non-Hydrostatic Atmospheric Dynamical Core in the HOMME Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tufo, Henry

    The High-Order Method Modeling Environment (HOMME) is a framework for building scalable, conservative atmospheric models for climate simulation and general atmospheric-modeling applications. Its spatial discretizations are based on Spectral-Element (SE) and Discontinuous Galerkin (DG) methods. These are local methods employing high-order accurate spectral basis-functions that have been shown to perform well on massively parallel supercomputers at any resolution and scale particularly well at high resolutions. HOMME provides the framework upon which the CAM-SE community atmosphere model dynamical-core is constructed. In its current incarnation, CAM-SE employs the hydrostatic primitive-equations (PE) of motion, which limits its resolution to simulations coarser than 0.1 per grid cell. The primary objective of this project is to remove this resolution limitation by providing HOMME with the capabilities needed to build nonhydrostatic models that solve the compressible Euler/Navier-Stokes equations.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Brian W.; Hemmert, K. Scott; Underwood, Keith Douglas

    Achieving the next three orders of magnitude performance increase to move from petascale to exascale computing will require significant advancements in several fundamental areas. Recent studies have outlined many of the challenges in hardware and software that will be needed. In this paper, we examine these challenges with respect to high-performance networking. We describe the repercussions of anticipated changes to computing and networking hardware and discuss the impact that alternative parallel programming models will have on the network software stack. We also present some ideas on possible approaches that address some of these challenges.

  10. The Nuclear Energy Advanced Modeling and Simulation Safeguards and Separations Reprocessing Plant Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alex; Billings, Jay Jay; de Almeida, Valmor F

    2011-08-01

    This report details the progress made in the development of the Reprocessing Plant Toolkit (RPTk) for the DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. RPTk is an ongoing development effort intended to provide users with an extensible, integrated, and scalable software framework for the modeling and simulation of spent nuclear fuel reprocessing plants by enabling the insertion and coupling of user-developed physicochemical modules of variable fidelity. The NEAMS Safeguards and Separations IPSC (SafeSeps) and the Enabling Computational Technologies (ECT) supporting program element have partnered to release an initial version of the RPTk with a focus on software usability and utility. RPTk implements a data flow architecture that is the source of the system's extensibility and scalability. Data flows through physicochemical modules sequentially, with each module importing data, evolving it, and exporting the updated data to the next downstream module. This is accomplished through various architectural abstractions designed to give RPTk true plug-and-play capabilities. A simple application of this architecture, as well as RPTk data flow and evolution, is demonstrated in Section 6 with an application consisting of two coupled physicochemical modules. The remaining sections describe this ongoing work in full, from system vision and design inception to full implementation. Section 3 describes the relevant software development processes used by the RPTk development team. These processes allow the team to manage system complexity and ensure stakeholder satisfaction. This section also details the work done on the RPTk "black box" and "white box" models, with a special focus on the separation of concerns between the RPTk user interface and application runtime. Sections 4 and 5 discuss that application runtime component in more detail, and describe the dependencies, behavior, and rigorous testing of its constituent components.
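    The data-flow architecture described above (each module imports the plant state, evolves it, and exports it to the next downstream module) reduces to a simple sequential pipeline. The sketch below is a hypothetical Python analogue of that design, not RPTk's actual API; the module names and numbers are invented:

```python
from typing import Callable, Dict, List

PlantState = Dict[str, float]
Module = Callable[[PlantState], PlantState]

def run_pipeline(modules: List[Module], state: PlantState) -> PlantState:
    """Pass the state through each physicochemical module in sequence."""
    for module in modules:
        state = module(dict(state))  # copy keeps modules free of side effects
    return state

# Two stand-in modules of arbitrary fidelity; real ones would wrap physics codes
def dissolver(state: PlantState) -> PlantState:
    state["dissolved_kg"] = state.get("fuel_kg", 0.0) * 0.95
    return state

def extractor(state: PlantState) -> PlantState:
    state["u_product_kg"] = state["dissolved_kg"] * 0.99
    return state

final = run_pipeline([dissolver, extractor], {"fuel_kg": 100.0})
print(final["u_product_kg"])  # ~94.05
```

    Swapping a module for a higher-fidelity one changes nothing else in the chain, which is the plug-and-play property the report emphasizes.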

  11. Mission Simulation Facility: Simulation Support for Autonomy Development

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Plice, Laura; Neukom, Christian; Flueckiger, Lorenzo; Wagner, Michael

    2003-01-01

    The Mission Simulation Facility (MSF) supports research in autonomy technology for planetary exploration vehicles. Using HLA (High Level Architecture) across distributed computers, the MSF connects users' autonomy algorithms with provided or third-party simulations of robotic vehicles and planetary surface environments, including onboard components and scientific instruments. Simulation fidelity is variable to meet changing needs as autonomy technology advances in Technical Readiness Level (TRL). A virtual robot operating in a virtual environment offers numerous advantages over actual hardware, including availability, simplicity, and risk mitigation. The MSF is in use by researchers at NASA Ames Research Center (ARC) and has demonstrated basic functionality. Continuing work will support the needs of a broader user base.

  12. A real-time simulator of a turbofan engine

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Delaat, John C.; Merrill, Walter C.

    1989-01-01

    A real-time digital simulator of a Pratt and Whitney F100 engine has been developed for real-time code verification and for actuator diagnosis during full-scale engine testing. This self-contained unit can operate in an open-loop stand-alone mode or as part of a closed-loop control system. It can also be used for control system design and development. Tests conducted in conjunction with the NASA Advanced Detection, Isolation, and Accommodation program show that the simulator is a valuable tool for real-time code verification and as a real-time actuator simulator for actuator fault diagnosis. Although currently a small perturbation model, advances in microprocessor hardware should allow the simulator to evolve into a real-time, full-envelope, full engine simulation.

  13. Large eddy simulations in 2030 and beyond

    PubMed Central

    Piomelli, U

    2014-01-01

    Since their introduction in the early 1970s, large eddy simulations (LES) have advanced considerably, and their application is transitioning from the academic environment to industry. Several landmark developments can be identified over the past 40 years, such as the wall-resolved simulations of wall-bounded flows, the development of advanced models for the unresolved scales that adapt to the local flow conditions and the hybridization of LES with the solution of the Reynolds-averaged Navier–Stokes equations. Thanks to these advancements, LES is now in widespread use in the academic community and is an option available in most commercial flow solvers. This paper will try to predict what algorithmic and modelling advancements are needed to make LES even more robust and inexpensive, and which areas show the most promise. PMID:25024415

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bland, Arthur S Buddy; Hack, James J; Baker, Ann E

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources

  15. An Advanced, Interactive, High-Performance Liquid Chromatography Simulator and Instructor Resources

    ERIC Educational Resources Information Center

    Boswell, Paul G.; Stoll, Dwight R.; Carr, Peter W.; Nagel, Megan L.; Vitha, Mark F.; Mabbott, Gary A.

    2013-01-01

    High-performance liquid chromatography (HPLC) simulation software has long been recognized as an effective educational tool, yet many of the existing HPLC simulators are either too expensive, outdated, or lack many important features necessary to make them widely useful for educational purposes. Here, a free, open-source HPLC simulator is…

  16. Mesoscopic Model — Advanced Simulation of Microforming Processes

    NASA Astrophysics Data System (ADS)

    Geißdörfer, Stefan; Engel, Ulf; Geiger, Manfred

    2007-04-01

    Continued miniaturization in many fields of forming technology implies the need for a better understanding of the effects occurring while scaling down from conventional macroscopic scale to microscale. At microscale, the material can no longer be regarded as a homogeneous continuum because only a few grains are present in the deformation zone. This leads to a change in material behaviour, resulting, among other effects, in a large scatter of forming results. Experiments have shown a correlation between the integral flow stress of the workpiece and the scatter of the process factors on the one hand, and the mean grain size and its standard deviation on the other. Conventional FE-simulation of scaled-down processes cannot consider the observed size-effects, such as the actual reduction of the flow stress, the increasing scatter of the process factors and a local material flow different from that obtained in the case of macroparts. For that reason, a new simulation model has been developed that takes all these size-effects into account. The present paper deals with the theoretical background of the new mesoscopic model and its characteristics, such as synthetic grain structure generation and the calculation of micro material properties based on conventional material properties. The simulation model is verified by carrying out various experiments with different mean grain sizes and grain structures but the same geometrical dimensions of the workpiece.

  17. Advanced Engineering Environments: Implications for Aerospace Manufacturing

    NASA Technical Reports Server (NTRS)

    Thomas, D.

    2001-01-01

    There are significant challenges facing today's aerospace industry. Global competition, more complex products, geographically-distributed design teams, demands for lower cost, higher reliability and safer vehicles, and the need to incorporate the latest technologies more quickly all face the developer of aerospace systems. New information technologies offer promising opportunities to develop advanced engineering environments (AEEs) to meet these challenges. Significant advances in the state-of-the-art of aerospace engineering practice are envisioned in the areas of engineering design and analytical tools, cost and risk tools, collaborative engineering, and high-fidelity simulations early in the development cycle. These advances will enable modeling and simulation of manufacturing methods, which will in turn allow manufacturing considerations to be included much earlier in the system development cycle. Significant cost savings, increased quality, and decreased manufacturing cycle time are expected to result. This paper will give an overview of NASA's Intelligent Synthesis Environment, the agency initiative to develop an AEE, with a focus on the anticipated benefits in aerospace manufacturing.

  18. Data-intensive computing on numerically-insensitive supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahrens, James P; Fasel, Patricia K; Habib, Salman

    2010-12-03

    With the advent of the era of petascale supercomputing, via the delivery of the Roadrunner supercomputing platform at Los Alamos National Laboratory, there is a pressing need to address the problem of visualizing massive petascale-sized results. In this presentation, I discuss progress on a number of approaches including in-situ analysis, multi-resolution out-of-core streaming and interactive rendering on the supercomputing platform. These approaches are placed in context by the emerging area of data-intensive supercomputing.

  19. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  20. Simplified and advanced modelling of traction control systems of heavy-haul locomotives

    NASA Astrophysics Data System (ADS)

    Spiryagin, Maksym; Wolfs, Peter; Szanto, Frank; Cole, Colin

    2015-05-01

    Improving tractive effort is a very complex task in locomotive design. It requires the development of not only mechanical systems but also power systems, traction machines and traction algorithms. At the initial design stage, traction algorithms can be verified by means of a simulation approach. A simple single-wheelset simulation approach is not sufficient because it does not fully take the locomotive's dynamics into consideration. Given that many traction control strategies exist, the best solution is to use more advanced approaches for such studies. This paper describes the modelling of a locomotive with a bogie traction control strategy based on a co-simulation approach in order to deliver more accurate results. The simplified and advanced modelling approaches of a locomotive electric power system are compared in this paper in order to answer a fundamental question: what level of modelling complexity is necessary for the investigation of the dynamic behaviours of a heavy-haul locomotive running under traction? The simulation results obtained provide some recommendations on simulation processes and on the further implementation of advanced and simplified modelling approaches.

  1. Integrated Instrument Simulator Suites for Earth Science

    NASA Technical Reports Server (NTRS)

    Tanelli, Simone; Tao, Wei-Kuo; Matsui, Toshihisa; Hostetler, Chris; Hair, Johnathan; Butler, Carolyn; Kuo, Kwo-Sen; Niamsuwan, Noppasin; Johnson, Michael P.; Jacob, Joseph C.

    2012-01-01

    The NASA Earth Observing System Simulators Suite (NEOS3) is a modular framework of forward simulation tools for remote sensing of Earth's atmosphere from space. It was initiated as the Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS) under the NASA Advanced Information Systems Technology (AIST) program of the Earth Science Technology Office (ESTO) to enable science users to perform simulations based on advanced atmospheric and simple land surface models, and to rapidly integrate in a broad framework any experimental or innovative tools that they may have developed in this context. The name was changed to NEOS3 when the project was expanded to include more advanced modeling tools for the surface contributions, accounting for scattering and emission properties of layered surfaces (e.g., soil moisture, vegetation, snow and ice, subsurface layers). NEOS3 relies on a web-based graphic user interface and a three-stage processing strategy to generate simulated measurements. The user has full control over a wide range of customizations, both in terms of a priori assumptions and in terms of specific solvers or models used to calculate the measured signals. This presentation will demonstrate the general architecture and configuration procedures, and will illustrate sample products and the fundamental interface requirements for modules that are candidates for integration.

  2. Simulations of toroidal Alfvén eigenmode excited by fast ions on the Experimental Advanced Superconducting Tokamak

    NASA Astrophysics Data System (ADS)

    Pei, Youbin; Xiang, Nong; Shen, Wei; Hu, Youjun; Todo, Y.; Zhou, Deng; Huang, Juan

    2018-05-01

    Kinetic-MagnetoHydroDynamic (MHD) hybrid simulations are carried out to study fast ion driven toroidal Alfvén eigenmodes (TAEs) on the Experimental Advanced Superconducting Tokamak (EAST). The first part of this article presents the linear benchmark between two kinetic-MHD codes, namely MEGA and M3D-K, based on a realistic EAST equilibrium. Parameter scans show that the frequency and the growth rate of the TAE given by the two codes agree with each other. The second part of this article discusses the resonance interaction between the TAE and fast ions simulated by the MEGA code. The results show that the TAE exchanges energy with the co-current passing particles with the parallel velocity |v∥| ≈ VA0/3 or |v∥| ≈ VA0/5, where VA0 is the Alfvén speed on the magnetic axis. The TAE destabilized by the counter-current passing ions is also analyzed and found to have a much smaller growth rate than the co-current ion driven TAE. One of the reasons for this is found to be that the overlapping region of the TAE spatial location and the counter-current ion orbits is narrow, and thus the wave-particle energy exchange is not efficient.
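
    The sideband velocities quoted in this abstract follow from standard TAE resonance theory; the brief sketch below is the textbook derivation, not material taken from the paper itself:

```latex
% Passing-ion resonance with a TAE (standard theory; illustrative sketch).
% The transit frequency of a passing ion is \omega_t = v_\parallel/(qR_0);
% at the TAE gap, nq - m = 1/2 and the mode frequency is
% \omega \simeq v_A/(2qR_0).  Resonance with the \ell-th orbital sideband
% requires
\omega = \left(nq - m + \ell\right)\,\omega_t , \qquad \ell \in \mathbb{Z},
% so that
|v_\parallel| = \frac{v_A}{\left|1 + 2\ell\right|} = v_A,\ \frac{v_A}{3},\ \frac{v_A}{5},\ \dots
```

    Taking ℓ = -2 and ℓ = 2 reproduces the VA0/3 and VA0/5 resonances reported above.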

  3. Learning to Control Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Subramanian, Devika

    2004-01-01

    Advanced life support systems have many interacting processes and limited resources. Controlling and optimizing advanced life support systems presents unique challenges. In particular, advanced life support systems are nonlinear coupled dynamical systems and it is difficult for humans to take all interactions into account to design an effective control strategy. In this project, we developed several reinforcement learning controllers that actively explore the space of possible control strategies, guided by rewards from a user specified long term objective function. We evaluated these controllers using a discrete event simulation of an advanced life support system. This simulation, called BioSim, designed by NASA scientists David Kortenkamp and Scott Bell, has multiple, interacting life support modules including crew, food production, air revitalization, water recovery, solid waste incineration and power. They are implemented in a consumer/producer relationship in which certain modules produce resources that are consumed by other modules. Stores hold resources between modules. Control of this simulation is via adjusting flows of resources between modules and into/out of stores. We developed adaptive algorithms that control the flow of resources in BioSim. Our learning algorithms discovered several ingenious strategies for maximizing mission length by controlling the air and water recycling systems as well as crop planting schedules. By exploiting non-linearities in the overall system dynamics, the learned controllers easily outperformed controllers written by human experts. In sum, we accomplished three goals. We (1) developed foundations for learning models of coupled dynamical systems by active exploration of the state space, (2) developed and tested algorithms that learn to efficiently control air and water recycling processes as well as crop scheduling in Biosim, and (3) developed an understanding of the role of machine learning in designing control systems for
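
    BioSim itself is not reproduced here; as a hedged illustration of the reinforcement-learning approach the abstract describes, the sketch below trains a tabular Q-learning controller on an invented one-store producer/consumer loop. All names, dynamics, and reward terms are hypothetical, not the BioSim model.

```python
import random

class TinyLifeSupport:
    """Toy producer/consumer loop loosely inspired by the abstract (invented).

    A single water store is drained by the crew each step; the controller
    chooses how much recycler flow to run. Recycling costs power, so the
    reward trades off keeping the store alive against energy use.
    """
    def __init__(self):
        self.store = 5  # discrete water units, clamped to 0..10

    def step(self, flow):
        self.store = max(0, min(10, self.store - 2 + flow))  # crew uses 2/step
        alive = self.store > 0
        reward = 1.0 - 0.1 * flow if alive else -10.0  # running dry is fatal
        return self.store, reward, not alive

def train(episodes=2000, alpha=0.2, gamma=0.95, eps=0.1, seed=0):
    """Tabular Q-learning over 11 store levels x 4 recycler flow settings."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(11) for a in range(4)}
    for _ in range(episodes):
        env, s = TinyLifeSupport(), 5
        for _ in range(50):
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.randrange(4)
            else:
                a = max(range(4), key=lambda act: q[(s, act)])
            s2, r, done = env.step(a)
            best_next = max(q[(s2, b)] for b in range(4))
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s2
            if done:
                break
    return q

q = train()
# With the store nearly empty, the learned greedy policy must run the
# recycler at flow >= 2 to offset crew consumption and avoid the -10 penalty.
policy_low = max(range(4), key=lambda a: q[(1, a)])
print(policy_low)
```

    The learned policy illustrates the point made in the abstract: the controller discovers resource-flow strategies purely from the long-term reward signal, without hand-written rules.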

  4. Terascale Cluster for Advanced Turbulent Combustion Simulations

    DTIC Science & Technology

    2008-07-25

    We have given the name CATS (for Combustion And Turbulence Simulator) to the terascale system that was obtained through this grant. CATS ... InfiniBand interconnect. CATS includes an interactive login node and a file server, each holding in excess of 1 terabyte of file storage. The 35 active ... compute nodes of CATS enable us to run up to 140-core parallel MPI batch jobs; one node is reserved to run the scheduler. CATS is operated and

  5. Advanced ISDN satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The research performed by GTE Government Systems and the University of Colorado in support of the NASA Satellite Communications Applications Research (SCAR) Program is summarized. Two levels of research were undertaken. The first dealt with providing interim services Integrated Services Digital Network (ISDN) satellite (ISIS) capabilities that accented basic rate ISDN with a ground control similar to that of the Advanced Communications Technology Satellite (ACTS). The ISIS Network Model development represents satellite systems like the ACTS orbiting switch. The ultimate aim is to move these ACTS ground control functions on-board the next generation of ISDN communications satellite to provide full-service ISDN satellite (FSIS) capabilities. The technical and operational parameters for the advanced ISDN communications satellite design are obtainable from the simulation of ISIS and FSIS engineering software models of the major subsystems of the ISDN communications satellite architecture. Discrete event simulation experiments would generate data for analysis against NASA SCAR performance measures and the data obtained from the ISDN satellite terminal adapter hardware (ISTA) experiments, also developed in the program. The Basic and Option 1 phases of the program are also described and include the following: literature search, traffic model, network model, scenario specifications, performance measures definitions, hardware experiment design, hardware experiment development, simulator design, and simulator development.

  6. Simulating the Physical World

    NASA Astrophysics Data System (ADS)

    Berendsen, Herman J. C.

    2004-06-01

    The simulation of physical systems requires a simplified, hierarchical approach which models each level from the atomistic to the macroscopic scale. From quantum mechanics to fluid dynamics, this book systematically treats the broad scope of computer modeling and simulations, describing the fundamental theory behind each level of approximation. Berendsen evaluates each stage in relation to its applications giving the reader insight into the possibilities and limitations of the models. Practical guidance for applications and sample programs in Python are provided. With a strong emphasis on molecular models in chemistry and biochemistry, this book will be suitable for advanced undergraduate and graduate courses on molecular modeling and simulation within physics, biophysics, physical chemistry and materials science. It will also be a useful reference to all those working in the field. Additional resources for this title, including solutions for instructors and programs, are available online at www.cambridge.org/9780521835275. This is the first book to cover the wide range of modeling and simulations, from the atomistic to the macroscopic scale, in a systematic fashion. Providing a wealth of background material, it does not assume advanced knowledge and is eminently suitable for course use. It contains practical examples and sample programs in Python.

  7. Simulation training tools for nonlethal weapons using gaming environments

    NASA Astrophysics Data System (ADS)

    Donne, Alexsana; Eagan, Justin; Tse, Gabriel; Vanderslice, Tom; Woods, Jerry

    2006-05-01

    Modern simulation techniques have a growing role for evaluating new technologies and for developing cost-effective training programs. A mission simulator facilitates the productive exchange of ideas by demonstration of concepts through compellingly realistic computer simulation. Revolutionary advances in 3D simulation technology have made it possible for desktop computers to process strikingly realistic and complex interactions with results depicted in real-time. Computer games now allow for multiple real human players and "artificially intelligent" (AI) simulated robots to play together. Advances in computer processing power have compensated for the inherent intensive calculations required for complex simulation scenarios. The main components of the leading game-engines have been released for user modifications, enabling game enthusiasts and amateur programmers to advance the state-of-the-art in AI and computer simulation technologies. It is now possible to simulate sophisticated and realistic conflict situations in order to evaluate the impact of non-lethal devices as well as conflict resolution procedures using such devices. Simulations can reduce training costs as end users: learn what a device does and doesn't do prior to use, understand responses to the device prior to deployment, determine if the device is appropriate for their situational responses, and train with new devices and techniques before purchasing hardware. This paper will present the status of SARA's mission simulation development activities, based on the Half-Life game engine, for the purpose of evaluating the latest non-lethal weapon devices, and for developing training tools for such devices.

  8. Scientific Discovery through Advanced Computing in Plasma Science

    NASA Astrophysics Data System (ADS)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's "Scientific Discovery through Advanced Computing" (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations

  9. An advanced stochastic weather generator for simulating 2-D high-resolution climate variables

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo

    2017-07-01

    A new stochastic weather generator, Advanced WEather GENerator for a two-dimensional grid (AWE-GEN-2d), is presented. The model combines physical and stochastic approaches to simulate key meteorological variables at high spatial and temporal resolution: 2 km × 2 km and 5 min for precipitation and cloud cover and 100 m × 100 m and 1 h for near-surface air temperature, solar radiation, vapor pressure, atmospheric pressure, and near-surface wind. The model requires spatially distributed data for the calibration process, which can nowadays be obtained from remote sensing devices (weather radar and satellites), reanalysis data sets and ground stations. AWE-GEN-2d is parsimonious in terms of computational demand and therefore is particularly suitable for studies where exploring internal climatic variability at multiple spatial and temporal scales is fundamental. Applications of the model include models of environmental systems, such as hydrological and geomorphological models, where high-resolution spatial and temporal meteorological forcing is crucial. The weather generator was calibrated and validated for the Engelberg region, an area with complex topography in the Swiss Alps. Model testing shows that the climate variables are generated by AWE-GEN-2d with a level of accuracy that is sufficient for many practical applications.
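
    AWE-GEN-2d itself is a rich, multi-variable 2-D model; as a minimal, hedged illustration of the stochastic core such generators share, the sketch below simulates daily precipitation occurrence with a two-state Markov chain and temperature with an AR(1) process at a single site. All parameter values and names are invented for illustration.

```python
import random

def simulate_weather(days=365, p_wd=0.3, p_ww=0.6,
                     t_mean=10.0, rho=0.8, sigma=3.0, seed=42):
    """Minimal single-site stochastic weather generator (illustrative only).

    Precipitation occurrence: two-state Markov chain with transition
    probabilities P(wet|dry) = p_wd and P(wet|wet) = p_ww.
    Daily temperature: AR(1) process around t_mean with lag-1 correlation
    rho and stationary standard deviation sigma.
    """
    rng = random.Random(seed)
    wet, temp = False, t_mean
    series = []
    for _ in range(days):
        # occurrence depends only on yesterday's wet/dry state
        wet = rng.random() < (p_ww if wet else p_wd)
        rain = rng.expovariate(1 / 5.0) if wet else 0.0  # mean 5 mm on wet days
        # innovation variance chosen so the stationary sd equals sigma
        temp = t_mean + rho * (temp - t_mean) \
               + rng.gauss(0.0, sigma * (1 - rho ** 2) ** 0.5)
        series.append((rain, temp))
    return series

series = simulate_weather()
wet_frac = sum(1 for rain, _ in series if rain > 0) / len(series)
print(round(wet_frac, 2))  # near the stationary wet fraction p_wd/(1-p_ww+p_wd)
```

    The stationary wet-day fraction of the Markov chain is p_wd / (1 - p_ww + p_wd) ≈ 0.43 with these invented parameters, so a year-long run should produce roughly that share of wet days.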

  10. Autonomic physiological data associated with simulator discomfort

    NASA Technical Reports Server (NTRS)

    Miller, James C.; Sharkey, Thomas J.; Graham, Glenna A.; Mccauley, Michael E.

    1993-01-01

    The development of a physiological monitoring capability for the Army's advanced helicopter simulator facility is reported. Additionally, preliminary physiological data is presented. Our objective was to demonstrate the sensitivity of physiological measures in this simulator to self-reported simulator sickness. The data suggested that heart period, hypergastria, and skin conductance level were more sensitive to simulator sickness than were vagal tone and normal electrogastric activity.

  11. Simulation and Gaming: Directions, Issues, Ponderables.

    ERIC Educational Resources Information Center

    Uretsky, Michael

    1995-01-01

    Discusses the current use of simulation and gaming in a variety of settings. Describes advances in technology that facilitate the use of simulation and gaming, including computer power, computer networks, software, object-oriented programming, video, multimedia, virtual reality, and artificial intelligence. Considers the future use of simulation…

  12. Spacecraft applications of advanced global positioning system technology

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This is the final report on the Texas Instruments Incorporated (TI) simulations study of Spacecraft Application of Advanced Global Positioning System (GPS) Technology. This work was conducted for the NASA Johnson Space Center (JSC) under contract NAS9-17781. GPS, in addition to its baselined capability as a highly accurate spacecraft navigation system, can provide traffic control, attitude control, structural control, and uniform time base. In Phase 1 of this program, another contractor investigated the potential of GPS in these four areas and compared GPS to other techniques. This contract was for the Phase 2 effort, to study the performance of GPS for these spacecraft applications through computer simulations. TI had previously developed simulation programs for GPS differential navigation and attitude measurement. These programs were adapted for these specific spacecraft applications. In addition, TI has extensive expertise in the design and production of advanced GPS receivers, including space-qualified GPS receivers. We have drawn on this background to augment the simulation results in the system level overview, which is Section 2 of this report.

  13. Technology Advancements for Active Remote Sensing of Carbon Dioxide from Space using the ASCENDS CarbonHawk Experiment Simulator

    NASA Astrophysics Data System (ADS)

    Obland, M. D.; Nehrir, A. R.; Liu, Z.; Chen, S.; Campbell, J. F.; Lin, B.; Kooi, S. A.; Fan, T. F.; Choi, Y.; Plant, J.; Yang, M. M.; Browell, E. V.; Harrison, F. W.; Meadows, B.; Dobler, J. T.; Zaccheo, T. S.

    2015-12-01

    This work describes advances in critical lidar technologies and techniques developed as part of the ASCENDS CarbonHawk Experiment Simulator (ACES) system for measuring atmospheric column carbon dioxide (CO2) mixing ratios in support of the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission. The ACES design demonstrates advancements in: (1) enhanced power-aperture product through the use and operation of multiple co-aligned laser transmitters and a multi-aperture telescope design; (2) high-efficiency, high-power Erbium-Doped Fiber Amplifiers (EDFAs); (3) high-bandwidth, low-noise HgCdTe detector and transimpedance amplifier (TIA) subsystem capable of long-duration operation; and (4) advanced algorithms for cloud and aerosol discrimination. The ACES instrument, an Intensity-Modulated Continuous-Wave (IM-CW) lidar, was designed for high-altitude aircraft operations and can be directly applied to space instrumentation to meet the ASCENDS mission requirements. Specifically, the lidar simultaneously transmits three IM-CW laser beams from the high power EDFAs operating near 1571 nm. The outgoing laser beams are aligned to the field of view of three fiber-coupled 17.8-cm diameter telescopes, and the backscattered light collected by the same three telescopes is sent to the detector/TIA subsystem, which has a bandwidth of 4.9 MHz and operates service-free with a tactical Dewar and cryocooler. The electronic bandwidth is only slightly higher than 1 MHz, effectively limiting the noise level. Two key laser modulation approaches are being tested to significantly mitigate the effects of thin clouds on the retrieved CO2 column amounts. This work provides an overview of these technologies, the modulation approaches, and results from recent test flights.

  14. Next Generation Vehicle Positioning and Simulation Solutions : Using GPS and Advanced Simulation Tools to Improve Highway Safety

    DOT National Transportation Integrated Search

    2013-06-03

    "Integrated Global Positioning System and Inertial Navigation Unit (GPS/INU) Simulator for Enhanced Traffic Safety," is a project awarded to Ohio State University to integrate different simulation models to accurately study the relationship between v...

  15. Effects of Technological Advances in Surgical Education on Quantitative Outcomes From Residency Programs.

    PubMed

    Dietl, Charles A; Russell, John C

    2016-01-01

    The purpose of this article is to review the literature on current technology for surgical education and to evaluate the effect of technological advances on the Accreditation Council for Graduate Medical Education (ACGME) Core Competencies, American Board of Surgery In-Training Examination (ABSITE) scores, and American Board of Surgery (ABS) certification. A literature search was obtained from MEDLINE via PubMed.gov, ScienceDirect.com, and Google Scholar on all peer-reviewed studies published since 2003 using the following search queries: technology for surgical education, simulation-based surgical training, simulation-based nontechnical skills (NTS) training, ACGME Core Competencies, ABSITE scores, and ABS pass rate. Our initial search list included the following: 648 on technology for surgical education, 413 on simulation-based surgical training, 51 on simulation-based NTS training, 78 on ABSITE scores, and 33 on ABS pass rate. Further, 42 articles on technological advances for surgical education met inclusion criteria based on their effect on ACGME Core Competencies, ABSITE scores, and ABS certification. Systematic review showed that 33 of 42 and 26 of 42 publications on technological advances for surgical education showed objective improvements regarding patient care and medical knowledge, respectively, whereas only 2 of 42 publications showed improved ABSITE scores, but none showed improved ABS pass rates. Improvements in the other ACGME core competencies were documented in 14 studies, 9 of which were on simulation-based NTS training. Most of the studies on technological advances for surgical education have shown a positive effect on patient care and medical knowledge. However, the effect of simulation-based surgical training and simulation-based NTS training on ABSITE scores and ABS certification has not been assessed. Studies on technological advances in surgical education and simulation-based NTS training showing quantitative evidence that surgery residency

  16. An architecture and model for cognitive engineering simulation analysis - Application to advanced aviation automation

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Smith, Barry R.

    1993-01-01

    The process of designing crew stations for large-scale, complex automated systems is made difficult because of the flexibility of roles that the crew can assume, and by the rapid rate at which system designs become fixed. Modern cockpit automation frequently involves multiple layers of control and display technology in which human operators must exercise equipment in augmented, supervisory, and fully automated control modes. In this context, we maintain that effective human-centered design is dependent on adequate models of human/system performance in which representations of the equipment, the human operator(s), and the mission tasks are available to designers for manipulation and modification. The joint Army-NASA Aircrew/Aircraft Integration (A3I) Program, with its attendant Man-machine Integration Design and Analysis System (MIDAS), was initiated to meet this challenge. MIDAS provides designers with a test bed for analyzing human-system integration in an environment in which both cognitive human function and 'intelligent' machine function are described in similar terms. This distributed object-oriented simulation system, its architecture and assumptions, and our experiences from its application in advanced aviation crew stations are described.

  17. Center For Advanced Energy Studies Overview

    ScienceCinema

    Blackman, Harold; Curnutt, Byron; Harker, Caitlin; Hamilton, Melinda; Butt, Darryl; Imel, George; Tokuhiro, Akira; Harris, Jason; Hill, David

    2017-12-09

    A collaboration between Idaho National Laboratory, Boise State University, Idaho State University and the University of Idaho. Conducts research in nuclear energy, advanced materials, carbon management, bioenergy, energy policy, modeling and simulation, and energy efficiency. Educates next generation of energy workforce.

  18. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large scale, complex geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper will also discuss examples of the use of simulation tools at NASA to develop new damage characterization methods, and associated challenges of validating those methods.

  19. Comprehensive Peptide Ion Structure Studies Using Ion Mobility Techniques: Part 1. An Advanced Protocol for Molecular Dynamics Simulations and Collision Cross-Section Calculation.

    PubMed

    Ghassabi Kondalaji, Samaneh; Khakinejad, Mahdiar; Tafreshian, Amirmahdi; Valentine, Stephen J.

    2017-05-01

    Collision cross-section (CCS) measurements with a linear drift tube have been utilized to study the gas-phase conformers of a model peptide (acetyl-PAAAAKAAAAKAAAAKAAAAK). Extensive molecular dynamics (MD) simulations have been conducted to derive an advanced protocol for the generation of a comprehensive pool of in-silico structures; both higher energy and more thermodynamically stable structures are included to provide an unbiased sampling of conformational space. MD simulations at 300 K are applied to the in-silico structures to more accurately describe the gas-phase transport properties of the ion conformers including their dynamics. Different methods used previously for trajectory method (TM) CCS calculation employing the Mobcal software [1] are evaluated. A new method for accurate CCS calculation is proposed based on clustering and data mining techniques. CCS values are calculated for all in-silico structures, and those with matching CCS values are chosen as candidate structures. With this approach, more than 300 candidate structures with significant structural variation are produced; although no final gas-phase structure is proposed here, in a second installment of this work, gas-phase hydrogen deuterium exchange data will be utilized as a second criterion to select among these structures as well as to propose relative populations for these ion conformers. Here the need to increase conformer diversity and accurate CCS calculation is demonstrated and the advanced methods are discussed.
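The candidate-selection step described in this abstract, retaining only in-silico structures whose computed CCS matches the drift-tube measurement, can be sketched in a few lines. The tolerance, data values, and function name below are illustrative assumptions, not details from the paper:

```python
# Hypothetical sketch: keep candidate conformers whose computed
# collision cross-section (CCS) matches a drift-tube measurement
# within an assumed relative tolerance. Values are illustrative.

def select_candidates(structures, measured_ccs, rel_tol=0.01):
    """Keep structures whose computed CCS is within rel_tol of the measurement."""
    return [s for s in structures
            if abs(s["ccs"] - measured_ccs) / measured_ccs <= rel_tol]

pool = [
    {"id": 1, "ccs": 412.0},   # Angstrom^2, e.g. from a trajectory-method calculation
    {"id": 2, "ccs": 455.3},
    {"id": 3, "ccs": 409.8},
]
candidates = select_candidates(pool, measured_ccs=410.5)
print([s["id"] for s in candidates])  # structures 1 and 3 fall within 1%
```

In the actual protocol the computed CCS values come from clustered trajectory-method calculations over hundreds of MD snapshots, so the matching step operates on cluster-averaged values rather than single structures.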

  20. Comprehensive Peptide Ion Structure Studies Using Ion Mobility Techniques: Part 1. An Advanced Protocol for Molecular Dynamics Simulations and Collision Cross-Section Calculation

    NASA Astrophysics Data System (ADS)

    Ghassabi Kondalaji, Samaneh; Khakinejad, Mahdiar; Tafreshian, Amirmahdi; Valentine, Stephen J.

    2017-05-01

    Collision cross-section (CCS) measurements with a linear drift tube have been utilized to study the gas-phase conformers of a model peptide (acetyl-PAAAAKAAAAKAAAAKAAAAK). Extensive molecular dynamics (MD) simulations have been conducted to derive an advanced protocol for the generation of a comprehensive pool of in-silico structures; both higher energy and more thermodynamically stable structures are included to provide an unbiased sampling of conformational space. MD simulations at 300 K are applied to the in-silico structures to more accurately describe the gas-phase transport properties of the ion conformers including their dynamics. Different methods used previously for trajectory method (TM) CCS calculation employing the Mobcal software [1] are evaluated. A new method for accurate CCS calculation is proposed based on clustering and data mining techniques. CCS values are calculated for all in-silico structures, and those with matching CCS values are chosen as candidate structures. With this approach, more than 300 candidate structures with significant structural variation are produced; although no final gas-phase structure is proposed here, in a second installment of this work, gas-phase hydrogen deuterium exchange data will be utilized as a second criterion to select among these structures as well as to propose relative populations for these ion conformers. Here the need to increase conformer diversity and accurate CCS calculation is demonstrated and the advanced methods are discussed.

  1. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic

  2. Abaqus Simulations of Rock Response to Dynamic Loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steedman, David W.; Coblentz, David

    The LANL Geodynamics Team has been applying Abaqus modeling to achieve increasingly complex simulations. Advancements in Abaqus model-building and simulation tools allow this progress. We use Lab-developed constitutive models, the fully coupled Eulerian-Lagrangian (CEL) capability in Abaqus, and general contact to simulate the response of realistic sites to explosively driven shock.

  3. Evaluation of mobility impacts of advanced information systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peeta, S.; Poonuru, K.; Sinha, K.

    2000-06-01

    Advanced technologies under the aegis of advanced traveler information systems and advanced traffic management systems are being employed to address the debilitating traffic congestion problem. Broadly identified under the label intelligent transportation systems (ITS), they focus on enhancing the efficiency of the existing roadway utilization. Though ITS has transitioned from the conceptual framework stage to the operational test phase that analyzes real-world feasibility, studies that systematically quantify the multidimensional real-world impacts of these technologies in terms of mobility, safety, and air quality, are lacking. This paper proposes a simulation-based framework to address the mobility impacts of these technologies through the provision of information to travelers. The information provision technologies are labeled as advanced information systems (AIS), and include pretrip information, en route information, variable message signs, and combinations thereof. The primary focus of the paper is to evaluate alternative AIS technologies using the heavily traveled Borman Expressway corridor in northwestern Indiana as a case study. Simulation results provide insights into the mobility impacts of AIS technologies, and contrast the effectiveness of alternative information provision sources and strategies.

  4. Constructing Neuronal Network Models in Massively Parallel Environments.

    PubMed

    Ippen, Tammo; Eppler, Jochen M; Plesser, Hans E; Diesmann, Markus

    2017-01-01

    Recent advances in the development of data structures to represent spiking neuron network models enable us to exploit the complete memory of petascale computers for a single brain-scale network simulation. In this work, we investigate how well we can exploit the computing power of such supercomputers for the creation of neuronal networks. Using an established benchmark, we divide the runtime of simulation code into the phase of network construction and the phase during which the dynamical state is advanced in time. We find that on multi-core compute nodes network creation scales well with process-parallel code but exhibits a prohibitively large memory consumption. Thread-parallel network creation, in contrast, exhibits speedup only up to a small number of threads but has little overhead in terms of memory. We further observe that the algorithms creating instances of model neurons and their connections scale well for networks of ten thousand neurons, but do not show the same speedup for networks of millions of neurons. Our work uncovers that the lack of scaling of thread-parallel network creation is due to inadequate memory allocation strategies and demonstrates that thread-optimized memory allocators recover excellent scaling. An analysis of the loop order used for network construction reveals that more complex tests on the locality of operations significantly improve scaling and reduce runtime by allowing construction algorithms to step through large networks more efficiently than in existing code. The combination of these techniques increases performance by an order of magnitude and harnesses the increasingly parallel compute power of the compute nodes in high-performance clusters and supercomputers.
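The process-parallel construction phase this abstract analyzes can be caricatured without MPI: each rank instantiates only the neurons it owns, so neuron memory is distributed rather than replicated. The ownership rule (global ID modulo number of ranks) and class names below are illustrative assumptions, not the paper's code:

```python
# Toy sketch of process-parallel network construction: each rank
# creates only the neurons it "owns" (here: gid % n_ranks == rank),
# so per-neuron memory is partitioned across ranks. The ownership
# rule, topology, and class names are illustrative.

class Neuron:
    def __init__(self, gid):
        self.gid = gid
        self.targets = []   # global IDs of postsynaptic neurons

def build_local_network(n_neurons, rank, n_ranks):
    local = {gid: Neuron(gid) for gid in range(n_neurons)
             if gid % n_ranks == rank}
    # toy connectivity: each local neuron projects to its successor (ring)
    for gid, neuron in local.items():
        neuron.targets.append((gid + 1) % n_neurons)
    return local

# Emulate 4 ranks building a 10-neuron network in one process.
partitions = [build_local_network(10, r, 4) for r in range(4)]
total = sum(len(p) for p in partitions)
print(total)  # every neuron is created exactly once across all ranks
```

The paper's thread-parallel findings concern what happens inside one such rank: when multiple threads populate the local data structures concurrently, the default memory allocator serializes them, which is why thread-optimized allocators recover scaling.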

  5. Constructing Neuronal Network Models in Massively Parallel Environments

    PubMed Central

    Ippen, Tammo; Eppler, Jochen M.; Plesser, Hans E.; Diesmann, Markus

    2017-01-01

    Recent advances in the development of data structures to represent spiking neuron network models enable us to exploit the complete memory of petascale computers for a single brain-scale network simulation. In this work, we investigate how well we can exploit the computing power of such supercomputers for the creation of neuronal networks. Using an established benchmark, we divide the runtime of simulation code into the phase of network construction and the phase during which the dynamical state is advanced in time. We find that on multi-core compute nodes network creation scales well with process-parallel code but exhibits a prohibitively large memory consumption. Thread-parallel network creation, in contrast, exhibits speedup only up to a small number of threads but has little overhead in terms of memory. We further observe that the algorithms creating instances of model neurons and their connections scale well for networks of ten thousand neurons, but do not show the same speedup for networks of millions of neurons. Our work uncovers that the lack of scaling of thread-parallel network creation is due to inadequate memory allocation strategies and demonstrates that thread-optimized memory allocators recover excellent scaling. An analysis of the loop order used for network construction reveals that more complex tests on the locality of operations significantly improve scaling and reduce runtime by allowing construction algorithms to step through large networks more efficiently than in existing code. The combination of these techniques increases performance by an order of magnitude and harnesses the increasingly parallel compute power of the compute nodes in high-performance clusters and supercomputers. PMID:28559808

  6. Observing system simulations using synthetic radiances and atmospheric retrievals derived for the AMSU and HIRS in a mesoscale model. [Advanced Microwave Sounding Unit

    NASA Technical Reports Server (NTRS)

    Diak, George R.; Huang, Hung-Lung; Kim, Dongsoo

    1990-01-01

    The paper addresses the concept of synthetic satellite imagery as a visualization and diagnostic tool for understanding future satellite sensors and details preliminary results on the quality of soundings from current sensors. Preliminary results are presented on the quality of soundings from the combination of the High-Resolution Infrared Radiometer Sounder and the Advanced Microwave Sounding Unit. Results are also presented on the first Observing System Simulation Experiment using these data in a mesoscale numerical prediction model.

  7. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies the responsibilities of ASC management and software project teams in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  8. GneimoSim: A Modular Internal Coordinates Molecular Dynamics Simulation Package

    PubMed Central

    Larsen, Adrien B.; Wagner, Jeffrey R.; Kandel, Saugat; Salomon-Ferrer, Romelia; Vaidehi, Nagarajan; Jain, Abhinandan

    2014-01-01

    The Generalized Newton-Euler Inverse Mass Operator (GNEIMO) method is an advanced method for internal coordinates molecular dynamics (ICMD). GNEIMO includes several theoretical and algorithmic advancements that address longstanding challenges with ICMD simulations. In this paper we describe the GneimoSim ICMD software package that implements the GNEIMO method. We believe that GneimoSim is the first software package to include advanced features such as the equipartition principle derived for internal coordinates, and a method for including the Fixman potential to eliminate systematic statistical biases introduced by the use of hard constraints. Moreover, by design, GneimoSim is extensible and can be easily interfaced with third party force field packages for ICMD simulations. Currently, GneimoSim includes interfaces to the LAMMPS, OpenMM, and Rosetta force field calculation packages. The availability of a comprehensive Python interface to the underlying C++ classes and their methods provides a powerful and versatile mechanism for users to develop simulation scripts to configure the simulation and control the simulation flow. GneimoSim has been used extensively for studying the dynamics of protein structures, refinement of protein homology models, and for simulating large scale protein conformational changes with enhanced sampling methods. GneimoSim is not limited to proteins and can also be used for the simulation of polymeric materials. PMID:25263538

  9. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analyses to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  10. Computer-assisted preoperative simulation for positioning and fixation of plate in 2-stage procedure combining maxillary advancement by distraction technique and mandibular setback surgery.

    PubMed

    Suenaga, Hideyuki; Taniguchi, Asako; Yonenaga, Kazumichi; Hoshi, Kazuto; Takato, Tsuyoshi

    2016-01-01

    Computer-assisted preoperative simulation surgery is employed to plan and interact with the 3D images during the orthognathic procedure. It is useful for positioning and fixation of the maxilla by a plate. We report a case of maxillary retrusion by a bilateral cleft lip and palate, in which a 2-stage orthognathic procedure (maxillary advancement by distraction technique and mandibular setback surgery) was performed following computer-assisted preoperative simulation planning to achieve the positioning and fixation of the plate. A high accuracy was achieved in the present case. A 21-year-old male patient presented to our department with a complaint of maxillary retrusion following bilateral cleft lip and palate. Computer-assisted preoperative simulation with a 2-stage orthognathic procedure using the distraction technique and mandibular setback surgery was planned. The preoperative planning of the procedure resulted in good aesthetic outcomes. The error of the maxillary position was less than 1 mm. The implementation of the computer-assisted preoperative simulation for the positioning and fixation of the plate in the 2-stage orthognathic procedure using the distraction technique and mandibular setback surgery yielded good results. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  11. Terminal area air traffic control simulation

    NASA Technical Reports Server (NTRS)

    1977-01-01

    To study the impact of advanced aeronautical technologies on operations to and from terminal airports, a computer model of air traffic movements was developed. The advantages of fast-time simulation are discussed, and the arrival scheduling and flight simulation are described. A New York area study, user's guide, and programmer's guide are included.

  12. Thermodynamic forces in coarse-grained simulations

    NASA Astrophysics Data System (ADS)

    Noid, William

    Atomically detailed molecular dynamics simulations have profoundly advanced our understanding of the structure and interactions in soft condensed phases. Nevertheless, despite dramatic advances in the methodology and resources for simulating atomically detailed models, low-resolution coarse-grained (CG) models play a central and rapidly growing role in science. CG models not only empower researchers to investigate phenomena beyond the scope of atomically detailed simulations, but also to precisely tailor models for specific phenomena. However, in contrast to atomically detailed simulations, which evolve on a potential energy surface, CG simulations should evolve on a free energy surface. Therefore, the forces in CG models should reflect the thermodynamic information that has been eliminated from the CG configuration space. As a consequence of these thermodynamic forces, CG models often demonstrate limited transferability and, moreover, rarely provide an accurate description of both structural and thermodynamic properties. In this talk, I will present a framework that clarifies the origin and impact of these thermodynamic forces. Additionally, I will present computational methods for quantifying these forces and incorporating their effects into CG MD simulations. As time allows, I will demonstrate applications of this framework for liquids, polymers, and interfaces. We gratefully acknowledge the support of the National Science Foundation via CHE 1565631.
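The claim that CG simulations should evolve on a free energy surface can be stated precisely. For a mapping M(r) = R from atomistic coordinates r to CG coordinates R, the appropriate CG potential is the many-body potential of mean force, and the "thermodynamic forces" are its gradients (this is the standard result from the multiscale coarse-graining literature, stated here for context rather than taken from the abstract):

```latex
% Many-body potential of mean force for a CG mapping M(r) = R,
% with atomistic potential u(r):
W(\mathbf{R}) = -k_B T \,\ln \int d\mathbf{r}\;
    \delta\!\bigl(M(\mathbf{r}) - \mathbf{R}\bigr)\,
    e^{-u(\mathbf{r})/k_B T} \;+\; \mathrm{const}

% The force on CG site I is the corresponding mean force:
\mathbf{F}_I(\mathbf{R}) = -\frac{\partial W(\mathbf{R})}{\partial \mathbf{R}_I}
```

Because W contains the entropy of the eliminated degrees of freedom, it is state-point dependent, which is the origin of the limited transferability noted in the abstract.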

  13. Technology Advancements for Active Remote Sensing of Carbon Dioxide From Space using the ASCENDS CarbonHawk Experiment Simulator

    NASA Astrophysics Data System (ADS)

    Obland, M. D.; Liu, Z.; Campbell, J. F.; Lin, B.; Kooi, S. A.; Carrion, W.; Hicks, J.; Fan, T. F.; Nehrir, A. R.; Browell, E. V.; Meadows, B.; Davis, K. J.

    2016-12-01

    This work describes advances in critical lidar technologies and techniques developed as part of the ASCENDS CarbonHawk Experiment Simulator (ACES) system for measuring atmospheric column carbon dioxide (CO2) mixing ratios in support of the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission. The ACES design demonstrates advancements in: (1) enhanced power-aperture product through the use and operation of multiple co-aligned laser transmitters and a multi-aperture telescope design; (2) high-efficiency, high-power Erbium-Doped Fiber Amplifiers (EDFAs); (3) a high-bandwidth, low-noise HgCdTe detector and transimpedance amplifier (TIA) subsystem capable of long-duration operation; and (4) advanced algorithms for cloud and aerosol discrimination. The ACES instrument, an Intensity-Modulated Continuous-Wave (IM-CW) lidar, was designed for high-altitude aircraft operations and can be directly applied to space instrumentation to meet the ASCENDS mission requirements. Specifically, the lidar simultaneously transmits three IM-CW laser beams from the high power EDFAs operating near 1571 nm. The outgoing laser beams are aligned to the field of view of three fiber-coupled 17.8-cm diameter telescopes, and the backscattered light collected by the same three telescopes is sent to the detector/TIA subsystem, which has a bandwidth of 4.9 MHz and operates service-free with a tactical Dewar and cryocooler. The electronic bandwidth is only slightly higher than 1 MHz, effectively limiting the noise level. Two key laser modulation approaches are being tested to significantly mitigate the effects of thin clouds on the retrieved CO2 column amounts. This work provides an overview of these technologies, the modulation approaches, and results from recent test flights during the Atmospheric Carbon and Transport - America (ACT-America) Earth Venture Suborbital flight campaign.
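The core idea of IM-CW ranging is that a weak, delayed echo of a known intensity-modulation waveform can be recovered by cross-correlation against the transmitted waveform. The sketch below illustrates this with a pseudo-random modulation and made-up numbers; it is not the ACES waveform or processing chain:

```python
# Illustrative sketch of IM-CW lidar ranging: cross-correlate the
# received intensity with the transmitted modulation and locate the
# lag of maximum correlation. Waveform and parameters are invented
# for illustration, not taken from the ACES instrument.
import numpy as np

rng = np.random.default_rng(0)
n = 2048
tx = rng.standard_normal(n)            # pseudo-random intensity modulation
true_delay = 137                       # round-trip delay in samples
rx = 0.05 * np.roll(tx, true_delay)    # weak, delayed echo
rx += 0.01 * rng.standard_normal(n)    # additive receiver noise

# Circular cross-correlation via FFT; peak lag estimates the delay.
corr = np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(tx))).real
estimated_delay = int(np.argmax(corr))
print(estimated_delay)
```

Range then follows from the delay and the speed of light; in the real instrument the modulation design is also what discriminates the ground return from intermediate cloud returns.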

  14. Advanced Placement U.S. History: What Happens after the Examination?

    ERIC Educational Resources Information Center

    Henry, Michael

    1991-01-01

    Discusses a survey of 56 advanced placement (AP) U.S. history teachers. Explores the scope of AP history and types of posttest activities used after Advanced Placement examinations. Concludes that public school courses delved more deeply into post-1960 events than the private schools did. Describes movies, debates, simulations, and local…

  15. Climate SPHINX: evaluating the impact of resolution and stochastic physics parameterisations in the EC-Earth global climate model

    NASA Astrophysics Data System (ADS)

    Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Christensen, Hannah M.; Juricke, Stephan; Subramanian, Aneesh; Watson, Peter A. G.; Weisheimer, Antje; Palmer, Tim N.

    2017-03-01

    The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 km down to 16 km). The project includes more than 120 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), together with coupled transient runs (1850-2100). A total of 20.4 million core hours have been used, made available from a single year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including the details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking following resolution increase is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate - specifically the Madden-Julian Oscillation and the tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on the large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
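Stochastically perturbed parameterisation tendency (SPPT) schemes of the kind used in such stochastic-physics ensembles can be caricatured in a few lines: the deterministic physics tendency is multiplied by (1 + r), where r is a random pattern. The toy below uses i.i.d. Gaussian noise and an invented relaxation "physics"; real schemes use spatially and temporally correlated noise fields:

```python
# Caricature of SPPT (stochastically perturbed parameterisation
# tendencies): the physics tendency is scaled by (1 + r) each step,
# with r drawn from a clipped Gaussian. Noise model and "physics"
# are deliberate simplifications for illustration.
import random

def step(state, physics_tendency, dt, sigma=0.3, rng=None):
    rng = rng or random.Random()
    r = max(-1.0, min(1.0, rng.gauss(0.0, sigma)))  # clip so (1+r) >= 0
    return state + (1.0 + r) * physics_tendency(state) * dt

rng = random.Random(42)
T = 280.0                                 # toy temperature-like variable
def cooling(s):
    return -0.1 * (s - 270.0)             # relax toward 270 K

for _ in range(100):
    T = step(T, cooling, dt=0.5, rng=rng)
print(round(T, 1))                        # converges near 270 despite the noise
```

The point of the scheme is that the perturbations represent unresolved subgrid variability without changing the mean behaviour of the parameterisation, which is why the toy trajectory still relaxes to its deterministic fixed point.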

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibson, Garth

    like AWE, and has a large team at EMC supporting and enhancing it. PLFS is open sourced with a BSD license on sourceforge. Post PDSI funding comes from NNSA and industry sources. Moreover, PLFS has spun out half a dozen or more papers, partnered on research with multiple schools and vendors, and has projects to transparently 1) distribute metadata over independent metadata servers, 2) exploit drastically non-POSIX Hadoop storage for HPC POSIX applications, 3) compress checkpoints on the fly, 4) batch delayed writes for write speed, 5) compress read-back indexes and parallelize their redistribution, 6) double-buffer writes in NAND Flash storage to decouple host blocking during checkpoint from disk write time in the storage system, 7) pack small files into a smaller number of bigger containers. There are two large scale open source Linux software projects that PDSI significantly incubated, though neither was initiated in PDSI. These are 1) Ceph, a UCSC parallel object storage research project that has continued to be a vehicle for research, and has become a released part of Linux, and 2) Parallel NFS (pNFS), a portion of the IETF's NFSv4.1 that brings the core data parallelism found in Lustre, PanFS, PVFS, and Ceph to the industry standard NFS, with released code in Linux 3.0, and its vendor offerings, with products from NetApp, EMC, BlueArc and RedHat. Both are fundamentally supported and advanced by vendor companies now, but were critically transferred from research demonstration to viable product with funding from PDSI, in part. At this point Lustre remains the primary path to scalable IO in Exascale systems, but both Ceph and pNFS are viable alternatives with different fundamental advantages. Finally, research community building was a big success for PDSI. Through the HECFSIO workshops and HECURA project with NSF PDSI stimulated and helped to steer leveraged funding of over $25M. Through the Petascale (now Parallel) Data Storage Workshop series, www

  17. Advances in quantum and molecular mechanical (QM/MM) simulations for organic and enzymatic reactions.

    PubMed

    Acevedo, Orlando; Jorgensen, William L

    2010-01-19

    Application of combined quantum and molecular mechanical (QM/MM) methods focuses on predicting activation barriers and the structures of stationary points for organic and enzymatic reactions. Characterization of the factors that stabilize transition structures in solution and in enzyme active sites provides a basis for design and optimization of catalysts. Continued technological advances allowed for expansion from prototypical cases to mechanistic studies featuring detailed enzyme and condensed-phase environments with full integration of the QM calculations and configurational sampling. This required improved algorithms featuring fast QM methods, advances in computing changes in free energies including free-energy perturbation (FEP) calculations, and enhanced configurational sampling. In particular, the present Account highlights development of the PDDG/PM3 semi-empirical QM method, computation of multi-dimensional potentials of mean force (PMF), incorporation of on-the-fly QM in Monte Carlo (MC) simulations, and a polynomial quadrature method for efficient modeling of proton-transfer reactions. The utility of this QM/MM/MC/FEP methodology is illustrated for a variety of organic reactions including substitution, decarboxylation, elimination, and pericyclic reactions. A comparison to experimental kinetic results on medium effects has verified the accuracy of the QM/MM approach in the full range of solvents from hydrocarbons to water to ionic liquids. Corresponding results from ab initio and density functional theory (DFT) methods with continuum-based treatments of solvation reveal deficiencies, particularly for protic solvents. 
Also summarized in this Account are three specific QM/MM applications to biomolecular systems: (1) a recent study that clarified the mechanism for the reaction of 2-pyrone derivatives catalyzed by macrophomate synthase as a tandem Michael-aldol sequence rather than a Diels-Alder reaction, (2) elucidation of the mechanism of action of fatty
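The free-energy perturbation (FEP) calculations this Account highlights rest on the Zwanzig identity, which expresses the free-energy difference between two potentials E₀ and E₁ as an ensemble average sampled in the reference state (standard background, stated here for context):

```latex
% Zwanzig free-energy perturbation identity: the free-energy change
% from potential E_0 to E_1 is an average over configurations
% sampled with E_0 (e.g., by Monte Carlo in the QM/MM/MC/FEP scheme).
\Delta G_{0 \to 1} = -k_B T \,\ln \Bigl\langle
    \exp\!\bigl[-\bigl(E_1(\mathbf{r}) - E_0(\mathbf{r})\bigr)/k_B T\bigr]
\Bigr\rangle_{0}
```

In practice the transformation is broken into many small windows along a reaction coordinate, and stringing the windows together yields the multi-dimensional potentials of mean force mentioned in the abstract.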

  18. Evaluation of Erosion Resistance of Advanced Turbine Thermal Barrier Coatings

    NASA Technical Reports Server (NTRS)

    Zhu, Dongming; Kuczmarski, Maria A.; Miller, Robert A.; Cuy, Michael D.

    2007-01-01

    The erosion resistant turbine thermal barrier coating system is critical to aircraft engine performance and durability. By demonstrating advanced turbine material testing capabilities, we will be able to facilitate the critical turbine coating and subcomponent development and help establish advanced erosion-resistant turbine airfoil thermal barrier coatings design tools. The objective of this work is to determine erosion resistance of advanced thermal barrier coating systems under simulated engine erosion and/or thermal gradient environments, validating advanced turbine airfoil thermal barrier coating systems based on nano-tetragonal phase toughening design approaches.

  19. Distance-Learning for Advanced Military Education: Using Wargame Simulation Course as an Example

    ERIC Educational Resources Information Center

    Keh, Huan-Chao; Wang, Kuei-Min; Wai, Shu-Shen; Huang, Jiung-yao; Hui, Lin; Wu, Ji-Jen

    2008-01-01

    Distance learning in advanced military education can assist officers around the world to become more skilled and qualified for future challenges. Through well-chosen technology, the efficiency of distance-learning can be improved significantly. In this paper we present the architecture of Advanced Military Education-Distance Learning (AME-DL)…

  20. The Tuition Advance Fund: An Analysis Prepared for Boston University.

    ERIC Educational Resources Information Center

    Botsford, Keith

    Three models for analyzing the Tuition Advance Fund (TAF) are examined. The three models are: projections by the Institute for Demographic and Economic Studies (IDES), projections by Data Resources, Inc. (DRI), and the Tuition Advance Fund Simulation (TAFSIM) models from Boston University. Analysis of the TAF is based on enrollment, price, and…

  1. Virtual reality simulators and training in laparoscopic surgery.

    PubMed

    Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos

    2015-01-01

    Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, the relevant evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance of advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training does not differ from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to differ from that of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual-spatial perception and stress-coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research should focus on the effect of virtual reality simulation on performance of advanced surgical procedures, on standardization of training, on the possible synergistic effect of combining virtual reality simulation training with mental training, and on personalized training. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  2. Simulating 'the right stuff'

    NASA Astrophysics Data System (ADS)

    Fischetti, M. A.; Truxal, C.

    1985-03-01

    The present investigation is mainly concerned with simulators employed in the training of pilots in the Armed Services and other military personnel, taking into account certain problems and approaches for overcoming them. The use of simulators in training programs has a number of advantages compared to training on the actual equipment. Questions arise, however, regarding the value of such training. Thus, it has been said that, while simulators gave students practice in manual skills, they could not teach them to handle the stress of being in a real aircraft. It has also been argued that some tasks are not represented accurately enough for proper training. In response to this criticism, the capabilities of the simulators have been greatly improved. However, this development leads to problems related to the cost of simulator training. Attention is given to better visuals for flight simulators, the current generation of graphics imagery and expected improvements, possibilities for reducing flight simulator costs, and advances due to progress in microcomputers.

  3. FY13 Annual Report: PHEV Advanced Series Gen-set Development/Demonstration Activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chambon, Paul H.

    2013-12-01

    The objective of this project is to integrate ORNL advancements in vehicle technologies to properly design and size a gen-set for various vehicle applications, and then to simulate multiple advanced series hybrid (HEV/PHEV) vehicles with the gen-set models.

  4. Simulation: an evolving methodology for health administration education.

    PubMed

    Taylor, J K; Moore, J A; Holland, M G

    1985-01-01

    Simulation provides a valuable addition to a university's teaching methods. Computer-assisted gaming is especially effective in teaching advanced business strategy and corporate policy when the nature and complexity of the simulation permit. The potential for using simulation techniques in postgraduate professional education and in managerial self-assessment appears to be significant over the next several years.

  5. GneimoSim: a modular internal coordinates molecular dynamics simulation package.

    PubMed

    Larsen, Adrien B; Wagner, Jeffrey R; Kandel, Saugat; Salomon-Ferrer, Romelia; Vaidehi, Nagarajan; Jain, Abhinandan

    2014-12-05

    The generalized Newton-Euler inverse mass operator (GNEIMO) method is an advanced method for internal coordinates molecular dynamics (ICMD). GNEIMO includes several theoretical and algorithmic advancements that address longstanding challenges with ICMD simulations. In this article, we describe the GneimoSim ICMD software package that implements the GNEIMO method. We believe that GneimoSim is the first software package to include advanced features such as the equipartition principle derived for internal coordinates, and a method for including the Fixman potential to eliminate systematic statistical biases introduced by the use of hard constraints. Moreover, by design, GneimoSim is extensible and can be easily interfaced with third party force field packages for ICMD simulations. Currently, GneimoSim includes interfaces to LAMMPS, OpenMM, and Rosetta force field calculation packages. The availability of a comprehensive Python interface to the underlying C++ classes and their methods provides a powerful and versatile mechanism for users to develop simulation scripts to configure the simulation and control the simulation flow. GneimoSim has been used extensively for studying the dynamics of protein structures, refinement of protein homology models, and for simulating large scale protein conformational changes with enhanced sampling methods. GneimoSim is not limited to proteins and can also be used for the simulation of polymeric materials. © 2014 Wiley Periodicals, Inc.

  6. INACSL Standards of Best Practice for Simulation: Past, Present, and Future.

    PubMed

    Sittner, Barbara J; Aebersold, Michelle L; Paige, Jane B; Graham, Leslie L M; Schram, Andrea Parsons; Decker, Sharon I; Lioce, Lori

    2015-01-01

    To describe the historical evolution of the International Nursing Association for Clinical Simulation and Learning's (INACSL) Standards of Best Practice: Simulation. The establishment of simulation standards began as a concerted effort by the INACSL Board of Directors in 2010 to provide best practices to design, conduct, and evaluate simulation activities in order to advance the science of simulation as a teaching methodology. A comprehensive review of the evolution of INACSL Standards of Best Practice: Simulation was conducted using journal publications, the INACSL website, INACSL member survey, and reports from members of the INACSL Standards Committee. The initial seven standards, published in 2011, were reviewed and revised in 2013. Two new standards were published in 2015. The standards will continue to evolve as the science of simulation advances. As the use of simulation-based experiences increases, the INACSL Standards of Best Practice: Simulation are foundational to standardizing language, behaviors, and curricular design for facilitators and learners.

  7. [Research advances in dendrochronology].

    PubMed

    Fang, Ke-Yan; Chen, Qiu-Yan; Liu, Chang-Zhi; Cao, Chun-Fu; Chen, Ya-Jun; Zhou, Fei-Fei

    2014-07-01

    Tree-ring studies in China have achieved great advances since the 1990s, particularly the dendroclimatological studies, which have made some impact around the world. However, because of this uneven development, limited attention has currently been paid to the other branches of dendrochronology. We herein briefly compare the advances of dendrochronology in China with those worldwide and present suggestions for future dendrochronological studies. Large-scale tree-ring-based climate reconstructions in China are highly needed, employing mathematical methods and a high-quality tree-ring network of ring width, density, stable isotopes and wood anatomy. Tree-ring-based field climate reconstructions offer the potential to explore climate forcings during the reconstructed periods via climate diagnosis and process simulation.

  8. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  9. Improving Advanced Inverter Control Convergence in Distribution Power Flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagarajan, Adarsh; Palmintier, Bryan; Ding, Fei

    Simulation of modern distribution system powerflow increasingly requires capturing the impact of advanced PV inverter voltage regulation on powerflow. With Volt/var control, the inverter adjusts its reactive power flow as a function of the point of common coupling (PCC) voltage. Similarly, Volt/watt control curtails active power production as a function of PCC voltage. However, with larger systems and higher penetrations of PV, this active/reactive power flow itself can cause significant changes to the PCC voltage, potentially introducing oscillations that slow the convergence of system simulations. Improper treatment of these advanced inverter functions could potentially lead to incorrect results. This paper explores a simple approach to speed such convergence by blending in the previous iteration's reactive power estimate to dampen these oscillations. Results with a single large (5 MW) PV system and with multiple 500 kW advanced inverters show dramatic improvements using this approach.
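The blending idea described in this record can be caricatured with a toy fixed-point loop. The Volt/var curve, the linear voltage-sensitivity model, and the blending factor `alpha` below are illustrative assumptions, not the paper's actual grid models:

```python
def volt_var_q(v, v_ref=1.0, slope=-10.0, q_max=0.5):
    """Piecewise-linear Volt/var curve: absorb vars when the PCC voltage is high."""
    return max(-q_max, min(q_max, slope * (v - v_ref)))

def solve_pcc_voltage(q, v_base=1.04, sensitivity=0.15):
    """Toy grid: PCC voltage rises linearly with injected reactive power."""
    return v_base + sensitivity * q

def iterate(alpha, tol=1e-8, max_iter=200):
    """Fixed-point powerflow loop. alpha = 1.0 is the naive update; alpha < 1
    blends in the previous iteration's reactive power to damp oscillations."""
    q_prev = 0.0
    for n in range(1, max_iter + 1):
        v = solve_pcc_voltage(q_prev)
        q_new = volt_var_q(v)
        q_used = alpha * q_new + (1.0 - alpha) * q_prev   # the blending step
        if abs(q_used - q_prev) < tol:
            return n, q_used                               # converged
        q_prev = q_used
    return max_iter, q_prev                                # did not converge

n_naive, _ = iterate(alpha=1.0)   # limit-cycles between clamp values
n_damped, q_damped = iterate(alpha=0.5)
print(n_naive, n_damped, q_damped)
```

With these (assumed) numbers the inverter-grid coupling is strong enough that the naive update oscillates in a limit cycle and never converges, while the blended update is a contraction and settles in a few iterations, mirroring the convergence improvement the paper reports.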

  10. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  11. Physics-based simulation models for EBSD: advances and challenges

    NASA Astrophysics Data System (ADS)

    Winkelmann, A.; Nolze, G.; Vos, M.; Salvat-Pujol, F.; Werner, W. S. M.

    2016-02-01

    EBSD has evolved into an effective tool for microstructure investigations in the scanning electron microscope. The purpose of this contribution is to give an overview of various simulation approaches for EBSD Kikuchi patterns and to discuss some of the underlying physical mechanisms.

  12. A Review of Endoscopic Simulation: Current Evidence on Simulators and Curricula.

    PubMed

    King, Neil; Kunac, Anastasia; Merchant, Aziz M

    2016-01-01

    Upper and lower endoscopy is an important tool that is being utilized more frequently by general surgeons. Training in therapeutic endoscopic techniques has become a mandatory requirement for general surgery residency programs in the United States. The Fundamentals of Endoscopic Surgery has been developed to train and assess competency in these advanced techniques. Simulation has been shown to increase the skill and learning curve of trainees in other surgical disciplines. Several types of endoscopy simulators are commercially available; mechanical trainers, animal-based simulators, and virtual reality or computer-based simulators all have their benefits and limitations. However, they have all been shown to improve trainees' endoscopic skills. Endoscopic simulators will play a critical role as part of a comprehensive curriculum designed to train the next generation of surgeons. We reviewed recent literature related to the various types of endoscopic simulators and their use in an educational curriculum, and discuss the relevant findings. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  13. Status of NASA/Army rotorcraft research and development piloted flight simulation

    NASA Technical Reports Server (NTRS)

    Condon, Gregory W.; Gossett, Terrence D.

    1988-01-01

    The status of the major NASA/Army capabilities in piloted rotorcraft flight simulation is reviewed. The requirements for research and development piloted simulation are addressed, as are the capabilities and technologies that are currently available or are being developed by NASA and the Army at Ames. The application of revolutionary advances (in visual scene, electronic cockpits, motion, and modelling of interactive mission environments and/or vehicle systems) to the NASA/Army facilities is also addressed. Particular attention is devoted to the major advances made in integrating these individual capabilities into a fully integrated simulation environment that was or is being applied to new rotorcraft mission requirements. The specific simulators discussed are the Vertical Motion Simulator and the Crew Station Research and Development Facility.

  14. Advanced surface design for logistics analysis

    NASA Astrophysics Data System (ADS)

    Brown, Tim R.; Hansen, Scott D.

    The development of anthropometric arm/hand and tool models and their manipulation in a large system model for maintenance simulation are discussed. The use of Advanced Surface Design and s-fig technology in anthropometrics, together with three-dimensional graphics simulation tools, is found to achieve a good balance between model-manipulation speed and model accuracy. The present second-generation models are shown to be twice as fast to manipulate as the first-generation b-surf models, to be easier to manipulate into various configurations, and to more closely approximate human contours.

  15. Defense Simulation Internet: next generation information highway.

    PubMed

    Lilienthal, M G

    1995-06-01

    The Department of Defense has been engaged in the Defense Modeling and Simulation Initiative (DMSI) to provide advanced distributed simulation to warfighters in geographically distributed localities. Lessons learned from the Defense Simulation Internet (DSI) concerning architecture, standards, protocols, interoperability, information sharing, and distributed databases are equally applicable to telemedicine. Much of the vision and objectives of the DMSI are easily translated into the vision for worldwide telemedicine.

  16. A novel feedback algorithm for simulating controlled dynamics and confinement in the advanced reversed-field pinch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlin, J.-E.; Scheffel, J.

    2005-06-15

    In the advanced reversed-field pinch (RFP), the current density profile is externally controlled to diminish tearing instabilities. Thus the scaling of energy confinement time with plasma current and density is improved substantially as compared to the conventional RFP. This may be numerically simulated by introducing an ad hoc electric field, adjusted to generate a tearing-mode-stable parallel current density profile. In the present work a current profile control algorithm, based on feedback of the fluctuating electric field in Ohm's law, is introduced into the resistive magnetohydrodynamic code DEBSP [D. D. Schnack and D. C. Baxter, J. Comput. Phys. 55, 485 (1984); D. D. Schnack, D. C. Barnes, Z. Mikic, D. S. Marneal, E. J. Caramana, and R. A. Nebel, Comput. Phys. Commun. 43, 17 (1986)]. The resulting radial magnetic field is decreased considerably, causing an increase in energy confinement time and poloidal β. It is found that the parallel current density profile spontaneously becomes hollow, and that a formation, related to persisting resistive g modes, appears close to the reversal surface.

  17. Assessing Advanced Technology in CENATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tallent, Nathan R.; Barker, Kevin J.; Gioiosa, Roberto

    PNNL's Center for Advanced Technology Evaluation (CENATE) is a new U.S. Department of Energy center whose mission is to assess and facilitate access to emerging computing technology. CENATE is assessing a range of advanced technologies, from evolutionary to disruptive. Technologies of interest include the processor socket (homogeneous and accelerated systems), memories (dynamic, static, memory cubes), motherboards, networks (network interface cards and switches), and input/output and storage devices. CENATE is developing a multi-perspective evaluation process based on integrating advanced system instrumentation, performance measurements, and modeling and simulation. We show evaluations of two emerging network technologies: silicon photonics interconnects and the Data Vortex network. CENATE's evaluation also addresses the question of which machine is best for a given workload under certain constraints. We show a performance-power tradeoff analysis of a well-known machine learning application on two systems.

  18. Simulation of Hawaiian Electric Companies Feeder Operations with Advanced Inverters and Analysis of Annual Photovoltaic Energy Curtailment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giraldez Miner, Julieta I.; Nagarajan, Adarsh; Gotseff, Peter

    The Hawaiian Electric Companies achieved a consolidated Renewable Portfolio Standard (RPS) of approximately 26% at the end of 2016. This significant RPS performance was achieved using various renewable energy sources - biomass, geothermal, solar photovoltaic (PV) systems, hydro, wind, and biofuels - and customer-sited, grid-connected technologies (primarily private rooftop solar PV systems). The Hawaiian Electric Companies are preparing grid-modernization plans for the island grids. The plans outline specific near-term actions to accelerate the achievement of Hawai'i's 100% RPS by 2045. A key element of the Companies' grid-modernization strategy is to utilize new technologies - including storage and PV systems with grid-supportive inverters - that will help to more than triple the amount of private rooftop solar PV systems. The Hawaiian Electric Companies collaborated with the Smart Inverter Technical Working Group Hawai'i (SITWG) to partner with the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) to research the implementation of advanced inverter grid support functions (GSF). Together with technical guidance from the Companies' planning engineers and stakeholder input from the SITWG members, NREL proposed a scope of work that explored different modes of voltage-regulation GSF to better understand the trade-offs between the grid benefits and the curtailment impacts of activating selected advanced inverter grid support functions. The simulation results presented in this report examine the effectiveness in regulating voltage, as well as the impact to the utility and the customers, of various inverter-based grid support functions on two Hawaiian Electric distribution substations.

  19. Successes and Challenges of Incompressible Flow Simulation

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2003-01-01

    During the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of CFD discipline. Even though incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become indispensable in fluid engineering for incompressible and low speed flow. This paper is intended to review some of the successes made possible by advances in computational technologies during the same period, and discuss some of the current challenges.

  20. Advanced ceramic matrix composites for TPS

    NASA Technical Reports Server (NTRS)

    Rasky, Daniel J.

    1992-01-01

    Recent advances in ceramic matrix composite (CMC) technology provide considerable opportunity for application to future aircraft thermal protection systems (TPS), providing materials with higher temperature capability, lower weight, and higher strength and stiffness than traditional materials. The Thermal Protection Materials Branch at NASA Ames Research Center has been making significant progress in the development, characterization, and entry simulation (arc-jet) testing of new CMCs. This presentation gives a general overview of the Ames Thermal Protection Materials Branch research activities, followed by more detailed descriptions of recent advances in very-high-temperature Zr- and Hf-based ceramics; high-temperature, high-strength SiC matrix composites; and some activities in polymer precursors and ceramic coating processing. The presentation closes with a brief comparison of maximum heat flux capabilities of advanced TPS materials.

  1. Progress in virtual reality simulators for surgical training and certification.

    PubMed

    de Visser, Hans; Watson, Marcus O; Salvado, Olivier; Passenger, Joshua D

    2011-02-21

    There is increasing evidence that educating trainee surgeons by simulation is preferable to traditional operating-room training methods with actual patients. Apart from reducing costs and risks to patients, training by simulation can provide some unique benefits, such as greater control over the training procedure and more easily defined metrics for assessing proficiency. Virtual reality (VR) simulators are now playing an increasing role in surgical training. However, currently available VR simulators lack the fidelity to teach trainees past the novice-to-intermediate skills level. Recent technological developments in other industries using simulation, such as the games and entertainment and aviation industries, suggest that the next generation of VR simulators should be suitable for training, maintenance and certification of advanced surgical skills. To be effective as an advanced surgical training and assessment tool, VR simulation needs to provide adequate and relevant levels of physical realism, case complexity and performance assessment. Proper validation of VR simulators and an increased appreciation of their value by the medical profession are crucial for them to be accepted into surgical training curricula.

  2. Large eddy simulations and direct numerical simulations of high speed turbulent reacting flows

    NASA Technical Reports Server (NTRS)

    Givi, Peyman; Madnia, Cyrus K.; Steinberger, Craig J.

    1990-01-01

    This research is involved with the implementation of advanced computational schemes based on large eddy simulations (LES) and direct numerical simulations (DNS) to study the phenomenon of mixing and its coupling with chemical reactions in compressible turbulent flows. In the efforts related to LES, a research program was initiated to extend the present capabilities of this method for the treatment of chemically reacting flows. In the DNS efforts, the focus is on detailed investigations of the effects of compressibility, heat release, and non-equilibrium kinetics modeling in high speed reacting flows. Emphasis was on the simulation of simple flows, namely homogeneous compressible flows and temporally developing high speed mixing layers.

  3. New perspectives for advanced automobile diesel engines

    NASA Technical Reports Server (NTRS)

    Tozzi, L.; Sekar, R.; Kamo, R.; Wood, J. C.

    1983-01-01

    Computer simulation results are presented for advanced automobile diesel engine performance. Four critical factors for performance enhancement were identified: (1) part load preheating and exhaust gas energy recovery, (2) fast heat release combustion process, (3) reduction in friction, and (4) air handling system efficiency. Four different technology levels were considered in the analysis. Simulation results are compared in terms of brake specific fuel consumption and vehicle fuel economy in km/liter (miles per gallon). Major critical performance sensitivity areas are: (1) combustion process, (2) expander and compressor efficiency, and (3) part load preheating and compound system. When compared to the state of the art direct injection, cooled, automobile diesel engine, the advanced adiabatic compound engine concept showed the unique potential of doubling the fuel economy. Other important performance criteria such as acceleration, emissions, reliability, durability and multifuel capability are comparable to or better than current passenger car diesel engines.

  4. Advanced technologies in plastic surgery: how new innovations can improve our training and practice.

    PubMed

    Grunwald, Tiffany; Krummel, Thomas; Sherman, Randy

    2004-11-01

    Over the last two decades, virtual reality, haptics, simulators, robotics, and other "advanced technologies" have emerged as important innovations in medical learning and practice. Reports on simulator applications in medicine now appear regularly in the medical, computer science, engineering, and popular literature. The goal of this article is to review the emerging intersection between advanced technologies and surgery and how new technology is being utilized in several surgical fields, particularly plastic surgery. The authors also discuss how plastic and reconstructive surgeons can benefit by working to further the development of multimedia and simulated environment technologies in surgical practice and training.

  5. Petascale computation performance of lightweight multiscale cardiac models using hybrid programming models.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-01-01

    Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI which is in contrast to our results using complex physiological models. Thus, with regards to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase as well as the HPC system size in both node count and number of cores per node, it is still foreseeable that we will achieve faster than real time multiscale cardiac simulations on these systems using hybrid programming models.
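The hybrid layout the study benchmarks can be caricatured in a few lines. In this sketch, Python threads stand in for an OpenMP parallel region and a plain outer loop stands in for the MPI ranks; the cell model is a made-up stand-in, and none of this is the authors' cardiac code — it only illustrates the two-level domain decomposition:

```python
import threading

def partition(n_items, n_workers):
    """Contiguous block decomposition of n_items across n_workers."""
    base, extra = divmod(n_items, n_workers)
    bounds, start = [], 0
    for w in range(n_workers):
        size = base + (1 if w < extra else 0)
        bounds.append((start, start + size))
        start += size
    return bounds

def cell_update(v):
    # stand-in for one cell-model time step (a real membrane model is far richer)
    return 0.5 * v + 1.0

def hybrid_step(state, n_ranks, threads_per_rank):
    """One simulated time step: the outer loop plays the MPI-rank level of the
    decomposition; the threads play the shared-memory region inside each rank."""
    out = [0.0] * len(state)

    def worker(lo, hi):
        for i in range(lo, hi):
            out[i] = cell_update(state[i])

    for lo_r, hi_r in partition(len(state), n_ranks):          # "MPI" level
        threads = [threading.Thread(target=worker, args=(lo_r + lo, lo_r + hi))
                   for lo, hi in partition(hi_r - lo_r, threads_per_rank)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    return out

state = [float(i) for i in range(100)]
hybrid = hybrid_step(state, n_ranks=4, threads_per_rank=4)     # 4 x 4 hybrid
flat = hybrid_step(state, n_ranks=16, threads_per_rank=1)      # "MPI-only"
print(hybrid == flat)
```

In line with the paper's finding, a flat (MPI-only) layout can match or beat the hybrid one for lightweight models; the sketch shows only that the two decompositions compute the same result, not their relative performance.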

  6. Introducing Simulation via the Theory of Records

    ERIC Educational Resources Information Center

    Johnson, Arvid C.

    2011-01-01

    While spreadsheet simulation can be a useful method by which to help students to understand some of the more advanced concepts in an introductory statistics course, introducing the simulation methodology at the same time as these concepts can result in student cognitive overload. This article describes a spreadsheet model that has been…

  7. Advanced solid elements for sheet metal forming simulation

    NASA Astrophysics Data System (ADS)

    Mataix, Vicente; Rossi, Riccardo; Oñate, Eugenio; Flores, Fernando G.

    2016-08-01

    Solid-shells are an attractive kind of element for the simulation of forming processes, because any generic 3D constitutive law can be employed without additional hypotheses. The present work consists of improving a triangular prism solid-shell originally developed by Flores [2, 3]. The solid-shell can be used in the analysis of thin/thick shells undergoing large deformations. The element is formulated in a total Lagrangian framework and employs the neighbouring (adjacent) elements to build a local patch that enriches the displacement field. In the original formulation a modified right Cauchy-Green deformation tensor (C) is obtained; in the present work a modified deformation gradient (F) is obtained, which generalises the methodology and allows the pull-back and push-forward operations to be employed. The element is based on three modifications: (a) a classical assumed-strain approach for the transverse shear strains, (b) an assumed-strain approach for the in-plane components using information from neighbouring elements, and (c) an averaging of the volumetric strain over the element. The objective is to use this type of element for the simulation of shells while avoiding transverse shear locking, improving the membrane behaviour of the in-plane triangle, and handling quasi-incompressible materials or materials with isochoric plastic flow.

  8. Simulation of the hybrid and steady state advanced operating modes in ITER

    NASA Astrophysics Data System (ADS)

    Kessel, C. E.; Giruzzi, G.; Sips, A. C. C.; Budny, R. V.; Artaud, J. F.; Basiuk, V.; Imbeaux, F.; Joffrin, E.; Schneider, M.; Murakami, M.; Luce, T.; St. John, Holger; Oikawa, T.; Hayashi, N.; Takizuka, T.; Ozeki, T.; Na, Y.-S.; Park, J. M.; Garcia, J.; Tucillo, A. A.

    2007-09-01

Integrated simulations are performed to establish a physics basis, in conjunction with present tokamak experiments, for the operating modes in the International Thermonuclear Experimental Reactor (ITER). Simulations of the hybrid mode are done using both fixed- and free-boundary 1.5D transport evolution codes, including CRONOS, ONETWO, TSC/TRANSP, TOPICS and ASTRA. The hybrid operating mode is simulated using the GLF23 and CDBM05 energy transport models. The injected powers are limited to the negative-ion neutral beam, ion cyclotron and electron cyclotron heating systems. Several plasma parameters and source parameters are specified for the hybrid cases to allow a comparison of 1.5D core transport modelling assumptions, source physics modelling assumptions, and numerous peripheral physics models. Initial results indicate that very strict guidelines will need to be imposed on the application of GLF23, for example, to make useful comparisons. Some of the variations among the simulations are due to source models, which vary widely among the codes used. In addition, a number of peripheral physics models should be examined, including fusion power production, bootstrap current, the treatment of fast particles and the treatment of impurities. The hybrid simulations project to fusion gains of 5.6-8.3, βN values of 2.1-2.6 and fusion powers ranging from 350 to 500 MW, under the assumptions outlined in section 3. Simulations of the steady-state operating mode are done with the same 1.5D transport evolution codes cited above, except the ASTRA code. In these cases the energy transport model is more difficult to prescribe, so the energy confinement models range from theory-based to empirically based. The injected powers include the same sources as used for the hybrid, with the possible addition of lower hybrid. The simulations of the steady-state mode project to fusion gains of 3.5-7, βN values of 2.3-3.0 and fusion powers of 290 to 415 MW.
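
The quoted fusion gains follow directly from the definition Q = P_fusion / P_injected. A minimal sketch (the ~60 MW of injected power is an illustrative assumption, not a figure taken from the paper) shows that the projected 350-500 MW fusion powers are consistent with the stated hybrid gain range:

```python
def fusion_gain(p_fusion_mw, p_injected_mw):
    """Fusion gain Q = fusion power / externally injected heating power."""
    return p_fusion_mw / p_injected_mw

# Hybrid-mode fusion powers quoted in the abstract: 350-500 MW.
# The ~60 MW injected power (NB + IC + EC heating) is an illustrative
# assumption for this sketch, not a value from the paper.
for p_fus in (350, 500):
    print(f"P_fus = {p_fus} MW -> Q = {fusion_gain(p_fus, 60):.1f}")
```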

  9. A Multi-Operator Simulation for Investigation of Distributed Air Traffic Management Concepts

    NASA Technical Reports Server (NTRS)

    Peters, Mark E.; Ballin, Mark G.; Sakosky, John S.

    2002-01-01

    This paper discusses the current development of an air traffic operations simulation that supports feasibility research for advanced air traffic management concepts. The Air Traffic Operations Simulation (ATOS) supports the research of future concepts that provide a much greater role for the flight crew in traffic management decision-making. ATOS provides representations of the future communications, navigation, and surveillance (CNS) infrastructure, a future flight deck systems architecture, and advanced crew interfaces. ATOS also provides a platform for the development of advanced flight guidance and decision support systems that may be required for autonomous operations.

  10. Recent Advances in Voltammetry

    PubMed Central

    Batchelor-McAuley, Christopher; Kätelhön, Enno; Barnes, Edward O; Compton, Richard G; Laborda, Eduardo; Molina, Angela

    2015-01-01

Recent progress in the theory and practice of voltammetry is surveyed and evaluated. The transformation over the last decade of the level of modelling and simulation of experiments has realised major advances, such that electrochemical techniques can be fully developed and applied to real chemical problems of distinct complexity. This review focuses on the topic areas of: multistep electrochemical processes, voltammetry in ionic liquids, the development and interpretation of theories of electron transfer (Butler–Volmer and Marcus–Hush), advances in voltammetric pulse techniques, stochastic random walk models of diffusion, the influence of migration under conditions of low support, voltammetry at rough and porous electrodes, and nanoparticle electrochemistry. The review of the latter field encompasses both the study of nanoparticle-modified electrodes, including stripping voltammetry, and the new technique of ‘nano-impacts’. PMID:26246984
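
The Butler–Volmer description of electron transfer mentioned above reduces to a one-line current-overpotential relation. A minimal sketch, with illustrative (not experimentally sourced) exchange-current density and transfer coefficients:

```python
import math

F_CONST = 96485.33212  # Faraday constant, C/mol
R_CONST = 8.314462618  # gas constant, J/(mol K)

def butler_volmer(eta, i0=1e-3, alpha_a=0.5, alpha_c=0.5, T=298.15):
    """Butler-Volmer current density i (A/m^2) at overpotential eta (V):
        i = i0 * (exp(alpha_a*f*eta) - exp(-alpha_c*f*eta)),  f = F/(R*T).
    i0 and the transfer coefficients are illustrative values."""
    f = F_CONST / (R_CONST * T)
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))

print(butler_volmer(0.0))      # zero net current at equilibrium
print(butler_volmer(0.1) > 0)  # anodic overpotential -> positive current
```

With symmetric transfer coefficients (0.5/0.5) the curve is antisymmetric in the overpotential, the limit in which Butler–Volmer is usually first introduced.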

  11. Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers

    DOE PAGES

    Wang, Bei; Ethier, Stephane; Tang, William; ...

    2017-06-29

The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization, have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.
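
The layered parallelism described here (a 2D domain decomposition plus a particle decomposition) can be illustrated by how a flat MPI rank is mapped onto decomposition indices. The ordering below is a hypothetical layout for illustration only, not GTC-P's actual scheme:

```python
def rank_to_decomposition(rank, n_toroidal, n_radial, n_particle_copies):
    """Map a flat MPI rank to (toroidal, radial, particle-copy) indices for a
    2D domain decomposition with an additional particle decomposition.
    Layout (illustrative, not GTC-P's actual ordering): the particle copy
    varies fastest, then the radial domain, then the toroidal domain."""
    copy = rank % n_particle_copies
    radial = (rank // n_particle_copies) % n_radial
    toroidal = rank // (n_particle_copies * n_radial)
    return toroidal, radial, copy

# 8 toroidal x 4 radial domains, 2 particle copies each -> 64 ranks total.
print(rank_to_decomposition(0, 8, 4, 2))   # (0, 0, 0)
print(rank_to_decomposition(63, 8, 4, 2))  # (7, 3, 1)
```

Each rank then holds one spatial subdomain's share of the grid and one copy's share of that subdomain's particles; within the rank, shared-memory threading and vectorization supply the remaining intra-node levels of parallelism.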

  12. Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Bei; Ethier, Stephane; Tang, William

The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization, have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.

  13. Simulated flight acoustic investigation of treated ejector effectiveness on advanced mechanical suppresors for high velocity jet noise reduction

    NASA Technical Reports Server (NTRS)

    Brausch, J. F.; Motsinger, R. E.; Hoerst, D. J.

    1986-01-01

Ten scale-model nozzles were tested in an anechoic free-jet facility to evaluate the acoustic characteristics of a mechanically suppressed inverted-velocity-profile coannular nozzle with an acoustically treated ejector system. The nozzle system used was developed from aerodynamic flow lines evolved in a previous contract, defined to incorporate the restraints imposed by the aerodynamic performance requirements of an Advanced Supersonic Technology/Variable Cycle Engine system through all its mission phases. Acoustic data for 188 test points were obtained, 87 under static and 101 under simulated flight conditions. The tests investigated variables of hardwall ejector application to a coannular nozzle with a 20-chute outer annular suppressor, ejector axial positioning, treatment application to ejector and plug surfaces, and treatment design. Laser velocimeter, shadowgraph photography, aerodynamic static pressure, and temperature measurements were acquired on select models to yield diagnostic information regarding the flow field and aerodynamic performance characteristics of the nozzles.

  14. Advanced Initiatives in Medical Simulation, 3rd Annual Conference to Create Awareness of Medical Simulation

    DTIC Science & Technology

    2006-06-30

Mexico suggested bringing together government agencies, academics, and industry representatives with an interest in medical simulation to identify ways...test, take online continuing medical education (CME) courses on electronic fetal monitoring and shoulder dystocia, and complete a physician

  15. Simulation Use in Paramedic Education Research (SUPER): A Descriptive Study

    PubMed Central

    McKenna, Kim D.; Carhart, Elliot; Bercher, Daniel; Spain, Andrew; Todaro, John; Freel, Joann

    2015-01-01

Objectives. The purpose of this research was to characterize the use of simulation in initial paramedic education programs in order to assist stakeholders’ efforts to target educational initiatives and resources. This group sought to provide a snapshot of what simulation resources programs have or have access to and how they are used; faculty perceptions about simulation; whether program characteristics, resources, or faculty training influence simulation use; and if simulation resources are uniform for patients of all ages. Methods. This was a cross-sectional census survey of paramedic programs that were accredited or had a Letter of Review from the Committee on Accreditation of Educational Programs for the EMS Professions at the time of the study. The data were analyzed using descriptive statistics and chi-square analyses. Results. Of the 638 surveys sent, 389 valid responses (61%) were analyzed. Paramedic programs reported they have or have access to a wide range of simulation resources (task trainers [100%], simple manikins [100%], intermediate manikins [99%], advanced/fully programmable manikins [91%], live simulated patients [83%], computer-based [71%], and virtual reality [19%]); however, they do not consistently use them, particularly advanced manikins (71%), live simulated patients (66%), computer-based (games, scenarios) (31%), and virtual reality (4%). Simulation equipment (of any type) reportedly sits idle and unused in 31% of programs. Lack of training was cited as the most common reason. Personnel support specific to simulation was available in 44% of programs. Programs reported using simulation to replace skills more frequently than to replace field or clinical hours. Simulation goals included assessment, critical thinking, and problem-solving most frequently, and patient and crew safety least often. Programs using advanced manikins report manufacturers as their primary means of training (87%) and that 19% of faculty had no training specific to

  16. Simulation Use in Paramedic Education Research (SUPER): A Descriptive Study.

    PubMed

    McKenna, Kim D; Carhart, Elliot; Bercher, Daniel; Spain, Andrew; Todaro, John; Freel, Joann

    2015-01-01

The purpose of this research was to characterize the use of simulation in initial paramedic education programs in order to assist stakeholders' efforts to target educational initiatives and resources. This group sought to provide a snapshot of what simulation resources programs have or have access to and how they are used; faculty perceptions about simulation; whether program characteristics, resources, or faculty training influence simulation use; and if simulation resources are uniform for patients of all ages. This was a cross-sectional census survey of paramedic programs that were accredited or had a Letter of Review from the Committee on Accreditation of Educational Programs for the EMS Professions at the time of the study. The data were analyzed using descriptive statistics and chi-square analyses. Of the 638 surveys sent, 389 valid responses (61%) were analyzed. Paramedic programs reported they have or have access to a wide range of simulation resources (task trainers [100%], simple manikins [100%], intermediate manikins [99%], advanced/fully programmable manikins [91%], live simulated patients [83%], computer-based [71%], and virtual reality [19%]); however, they do not consistently use them, particularly advanced manikins (71%), live simulated patients (66%), computer-based (games, scenarios) (31%), and virtual reality (4%). Simulation equipment (of any type) reportedly sits idle and unused in 31% of programs. Lack of training was cited as the most common reason. Personnel support specific to simulation was available in 44% of programs. Programs reported using simulation to replace skills more frequently than to replace field or clinical hours. Simulation goals included assessment, critical thinking, and problem-solving most frequently, and patient and crew safety least often. Programs using advanced manikins report manufacturers as their primary means of training (87%) and that 19% of faculty had no training specific to those manikins. Many (78%) respondents felt

  17. Advanced Instrumentation for Transient Reactor Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corradini, Michael L.; Anderson, Mark; Imel, George

Transient testing involves placing fuel or material into the core of specialized materials test reactors that are capable of simulating a range of design basis accidents, including reactivity insertion accidents, that require the reactor to produce short bursts of intense high-power neutron flux and gamma radiation. Testing fuel behavior in a prototypic neutron environment under high-power, accident-simulation conditions is a key step in licensing nuclear fuels for use in existing and future nuclear power plants. Transient testing of nuclear fuels is needed to develop and prove the safety basis for advanced reactors and fuels. In addition, modern fuel development and design increasingly rely on modeling and simulation efforts that must be informed and validated using specially designed material performance separate effects studies. These studies will require experimental facilities that are able to support variable-scale, highly instrumented tests providing data that have appropriate spatial and temporal resolution. Finally, there are efforts now underway to develop advanced light water reactor (LWR) fuels with enhanced performance and accident tolerance. These advanced reactor designs will also require new fuel types. These new fuels need to be tested in a controlled environment in order to learn how they respond to accident conditions. For these applications, transient reactor testing is needed to help design fuels with improved performance. In order to maximize the value of transient testing, there is a need for in-situ real-time transient imaging technology (e.g., a neutron detection and imaging system like the hodoscope) to see fuel motion during rapid transient excursions with a higher degree of spatial and temporal resolution and accuracy. There also exists a need for new small, compact local sensors and instrumentation that are capable of collecting data during transients (e.g., local displacements, temperatures, thermal conductivity, neutron flux

  18. Piloted evaluation of an integrated propulsion and flight control simulator

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.

    1992-01-01

A piloted evaluation of the integrated flight and propulsion control simulator for advanced integrated propulsion and airframe control design is described. The evaluation will cover control effector gains and deadbands, control effectiveness and control authority, and heads-up display functionality. For this evaluation the flight simulator is configured for transition flight using an advanced Short Take-Off and Vertical Landing fighter aircraft model, a simplified high-bypass turbofan engine model, fighter cockpit displays, and pilot effectors. The piloted tasks used for rating displays and control effector gains are described. Pilot comments and simulation results confirm that the display symbology and control gains are very adequate for the transition flight task. Additionally, it is demonstrated that this small-scale, fixed-base flight simulator facility can adequately perform a real-time, piloted control evaluation.

  19. Simulation models and designs for advanced Fischer-Tropsch technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, G.N.; Kramer, S.J.; Tam, S.S.

    1995-12-31

Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry-bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.
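
At the core of the vapor/liquid equilibrium calculation mentioned above is an isothermal flash, classically solved via the Rachford-Rice equation for the vapor fraction. The sketch below uses hypothetical feed compositions and K-values, not the ASPEN model from the study:

```python
def rachford_rice(z, K, tol=1e-10):
    """Solve the Rachford-Rice equation
        sum_i z_i*(K_i - 1) / (1 + beta*(K_i - 1)) = 0
    for the vapor fraction beta of an isothermal flash, by bisection.
    z: feed mole fractions, K: equilibrium K-values (illustrative numbers)."""
    def f(beta):
        return sum(zi * (Ki - 1.0) / (1.0 + beta * (Ki - 1.0))
                   for zi, Ki in zip(z, K))
    lo, hi = 0.0, 1.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Hypothetical light/heavy split loosely typical of a reactor overhead stream.
z = [0.4, 0.3, 0.3]
K = [3.0, 1.2, 0.3]
beta = rachford_rice(z, K)
print(round(beta, 4))
```

Process simulators solve a flash like this at every stream junction; the study's improvement amounted to feeding better thermodynamics (better K-values) into this kind of calculation for the slurry-reactor effluent.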

  20. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  1. VTI Driving Simulator: Mathematical Model of a Four-wheeled Vehicle for Simulation in Real Time. VTI Rapport 267A.

    ERIC Educational Resources Information Center

    Nordmark, Staffan

    1984-01-01

This report contains a theoretical model for describing the motion of a passenger car. The simulation program based on this model is used in conjunction with an advanced driving simulator and run in real time. The mathematical model is complete in the sense that the dynamics of the engine, transmission and steering system are described in some…
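
The VTI model itself is far more detailed (engine, transmission and steering dynamics), but the basic real-time update loop of a vehicle model can be sketched with a kinematic single-track ("bicycle") approximation; all parameter values below are illustrative:

```python
import math

def bicycle_step(x, y, psi, v, delta, L=2.7, dt=0.01):
    """One explicit-Euler step of a kinematic single-track ("bicycle") model.
    x, y: position (m); psi: heading (rad); v: speed (m/s);
    delta: front steering angle (rad); L: wheelbase (m).
    Far simpler than the VTI dynamic model, but it shows the fixed-time-step
    update loop a real-time simulator runs."""
    x += v * math.cos(psi) * dt
    y += v * math.sin(psi) * dt
    psi += v * math.tan(delta) / L * dt
    return x, y, psi

# Drive 5 s at 20 m/s with a small constant steering angle.
state = (0.0, 0.0, 0.0)
for _ in range(500):
    state = bicycle_step(*state, v=20.0, delta=0.02)
print(tuple(round(s, 2) for s in state))
```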

  2. Formulation of consumables management models: Consumables analysis/crew simulator interface requirements

    NASA Technical Reports Server (NTRS)

    Zamora, M. A.

    1977-01-01

Consumables analysis/crew training simulator interface requirements were defined. Two aspects were investigated: consumables analysis support techniques for crew training simulators for advanced spacecraft programs, and the applicability of these techniques to the crew training simulator for the space shuttle program in particular.

  3. Results from Binary Black Hole Simulations in Astrophysics Applications

    NASA Technical Reports Server (NTRS)

    Baker, John G.

    2007-01-01

Present and planned gravitational wave observatories are opening a new astronomical window on the sky. A key source of gravitational waves is the merger of two black holes. The Laser Interferometer Space Antenna (LISA), in particular, is expected to observe these events with signal-to-noise ratios in the thousands. To fully reap the scientific benefits of these observations requires a detailed understanding, based on numerical simulations, of the predictions of General Relativity for the waveform signals. New techniques for simulating binary black hole mergers, introduced two years ago, have led to dramatic advances in applied numerical simulation work. Over the last two years, numerical relativity researchers have made tremendous strides in understanding the late stages of binary black hole mergers. Simulations have been applied to test much of the basic physics of binary black hole interactions, showing robust results for merger waveform predictions, and illuminating such phenomena as spin-precession. Calculations have shown that merging systems can be kicked at up to 2500 km/s by the thrust from asymmetric emission. Recently, long-lasting simulations of ten or more orbits allow tests of post-Newtonian (PN) approximation results for radiation from the last orbits of the binary's inspiral. Already, analytic waveform models based on PN techniques with incorporated information from numerical simulations may be adequate for observations with current ground-based observatories. As new advances in simulations continue to rapidly improve our theoretical understanding of the systems, it seems certain that high-precision predictions will be available in time for LISA and other advanced ground-based instruments.

  4. Burns education: The emerging role of simulation for training healthcare professionals.

    PubMed

    Sadideen, Hazim; Goutos, Ioannis; Kneebone, Roger

    2017-02-01

    Burns education appears to be under-represented in UK undergraduate curricula. However current postgraduate courses in burns education provide formal training in resuscitation and management. Simulation has proven to be a powerful modality to advance surgical training in both technical and non-technical skills. We present a literature review that summarises the format of current burns education, and provides detailed insight into historic, current and novel advances in burns simulation for both technical and non-technical skills, that can be used to augment surgical training. Addressing the economic and practical limitations of current immersive surgical simulation is important, and this review proposes future directions for integration of innovative simulation strategies into training curricula. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.

  5. Hazard alerting and situational awareness in advanced air transport cockpits

    NASA Technical Reports Server (NTRS)

    Hansman, R. John; Wanke, Craig; Kuchar, James; Mykityshyn, Mark; Hahn, Edward; Midkiff, Alan

    1993-01-01

    Advances in avionics and display technology have significantly changed the cockpit environment in current 'glass cockpit' aircraft. Recent developments in display technology, on-board processing, data storage, and datalinked communications are likely to further alter the environment in second and third generation 'glass cockpit' aircraft. The interaction of advanced cockpit technology with human cognitive performance has been a major area of activity within the MIT Aeronautical Systems Laboratory. This paper presents an overview of the MIT Advanced Cockpit Simulation Facility. Several recent research projects are briefly reviewed and the most important results are summarized.

  6. Performance experiments with alternative advanced teleoperator control modes for a simulated solar maximum satellite repair

    NASA Technical Reports Server (NTRS)

    Das, H.; Zak, H.; Kim, W. S.; Bejczy, A. K.; Schenker, P. S.

    1992-01-01

Experiments are described which were conducted at the JPL Advanced Teleoperator Lab to demonstrate and evaluate the effectiveness of various teleoperator control modes in the performance of a simulated Solar Max Satellite Repair (SMSR) task. The SMSR was selected as a test case because it is very rich in performance capability requirements and it was actually performed by two EVA astronauts in the Space Shuttle Bay in 1984. The main subtasks are: thermal blanket removal; installation of a hinge attachment for electrical panel opening; opening of the electrical panel; removal of electrical connectors; relining of cable bundles; replacement of the electrical panel; securing of parts and cables; re-mating of electrical connectors; closing of the electrical panel; and reinstating the thermal blanket. The current performance experiments are limited to thermal blanket cutting, electrical panel unbolting, and handling of electrical bundles and connectors. In one formal experiment seven different control modes were applied to the unbolting and reinsertion of electrical panel screws subtasks. The seven control modes are alternative combinations of manual position and rate control with force feedback and remote compliance referenced to force-torque sensor information. Force-torque sensor and end-effector position data and task completion times were recorded for analysis and quantification of operator performance.

  7. Simulation Training: Evaluating the Instructor’s Contribution to a Wizard of Oz Simulator in Obstetrics and Gynecology Ultrasound Training

    PubMed Central

    Tepper, Ronnie

    2017-01-01

Background Workplaces today demand graduates who are prepared with field-specific knowledge, advanced social skills, problem-solving skills, and integration capabilities. Meeting these goals with didactic learning (DL) is becoming increasingly difficult. Enhanced training methods that would better prepare tomorrow’s graduates must be more engaging and game-like, such as feedback-based e-learning or simulation-based training, while saving time. Empirical evidence regarding the effectiveness of advanced learning methods is lacking. Objective quantitative research comparing advanced training methods with DL is sparse. Objectives This quantitative study assessed the effectiveness of a computerized interactive simulator coupled with an instructor who monitored students’ progress and provided Web-based immediate feedback. Methods A low-cost, globally accessible, telemedicine simulator, developed at the Technion—Israel Institute of Technology, Haifa, Israel—was used. A previous study in the field of interventional cardiology, evaluating the efficacy of the simulator for enhanced learning via knowledge exams, presented promising results, with average scores of 94% after training versus 54% before training (n=20, P<.001). Two independent experiments involving obstetrics and gynecology (Ob-Gyn) physicians and senior ultrasound sonographers, with 32 subjects, were conducted using a new interactive concept of the WOZ (Wizard of Oz) simulator platform. The contribution of an instructor to learning outcomes was evaluated by comparing students’ knowledge before and after each interactive instructor-led session as well as after fully automated e-learning in the field of Ob-Gyn. Results from objective knowledge tests were analyzed using hypothesis testing and model fitting. Results A significant advantage (P=.01) was found in favor of the WOZ training approach. Content type and training audience were not significant. Conclusions This study evaluated the

  8. Numerical simulation of unsteady viscous flows

    NASA Technical Reports Server (NTRS)

    Hankey, Wilbur L.

    1987-01-01

    Most unsteady viscous flows may be grouped into two categories, i.e., forced and self-sustained oscillations. Examples of forced oscillations occur in turbomachinery and in internal combustion engines while self-sustained oscillations prevail in vortex shedding, inlet buzz, and wing flutter. Numerical simulation of these phenomena was achieved due to the advancement of vector processor computers. Recent progress in the simulation of unsteady viscous flows is addressed.
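
A textbook example of a self-sustained oscillation is the van der Pol oscillator, whose limit cycle emerges from an arbitrarily small disturbance without any external forcing. The sketch below is an illustration of that concept, not one of the flows discussed in the paper:

```python
def van_der_pol(mu=1.0, x0=0.1, v0=0.0, dt=0.001, steps=60000):
    """Integrate the van der Pol oscillator x'' = mu*(1 - x^2)*x' - x with a
    simple explicit Euler scheme; returns the trajectory of x."""
    x, v = x0, v0
    xs = []
    for _ in range(steps):
        a = mu * (1.0 - x * x) * v - x
        x, v = x + v * dt, v + a * dt
        xs.append(x)
    return xs

xs = van_der_pol()
# Self-sustained: the tiny initial disturbance (x0 = 0.1) grows onto a limit
# cycle with amplitude close to 2, with no forcing term in the equation.
print(round(max(abs(x) for x in xs[-10000:]), 2))
```

A forced oscillation, by contrast, would carry an explicit time-dependent term on the right-hand side; the distinction mirrors the paper's split between turbomachinery-type forcing and vortex-shedding-type self-excitation.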

  9. Design and Performance Frameworks for Constructing Problem-Solving Simulations

    ERIC Educational Resources Information Center

    Stevens, Rons; Palacio-Cayetano, Joycelin

    2003-01-01

    Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks…

  10. Advanced Signal Processing for Integrated LES-RANS Simulations: Anti-aliasing Filters

    NASA Technical Reports Server (NTRS)

    Schlueter, J. U.

    2003-01-01

    Currently, a wide variety of flow phenomena are addressed with numerical simulations. Many flow solvers are optimized to simulate a limited spectrum of flow effects effectively, such as single parts of a flow system, but are either inadequate or too expensive to be applied to a very complex problem. As an example, the flow through a gas turbine can be considered. In the compressor and the turbine section, the flow solver has to be able to handle the moving blades, model the wall turbulence, and predict the pressure and density distribution properly. This can be done by a flow solver based on the Reynolds-Averaged Navier-Stokes (RANS) approach. On the other hand, the flow in the combustion chamber is governed by large scale turbulence, chemical reactions, and the presence of fuel spray. Experience shows that these phenomena require an unsteady approach. Hence, for the combustor, the use of a Large Eddy Simulation (LES) flow solver is desirable. While many design problems of a single flow passage can be addressed by separate computations, only the simultaneous computation of all parts can guarantee the proper prediction of multi-component phenomena, such as compressor/combustor instability and combustor/turbine hot-streak migration. Therefore, a promising strategy to perform full aero-thermal simulations of gas-turbine engines is the use of a RANS flow solver for the compressor sections, an LES flow solver for the combustor, and again a RANS flow solver for the turbine section.
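
When unsteady LES data are handed to a RANS solver that advances with a coarser time step, high-frequency content must be filtered out first or it will alias. A first-order causal low-pass filter is one simple anti-aliasing option; the sketch below illustrates the idea on synthetic data and is not necessarily the filter used in the cited work:

```python
import math

def exp_moving_average(signal, dt, f_cut):
    """Causal first-order low-pass (exponential moving average) with cutoff
    frequency f_cut: a simple anti-aliasing filter to apply before
    subsampling LES interface data onto a coarser RANS time step."""
    alpha = dt / (dt + 1.0 / (2.0 * math.pi * f_cut))
    out, y = [], signal[0]
    for s in signal:
        y += alpha * (s - y)
        out.append(y)
    return out

dt = 1e-4
t = [i * dt for i in range(20000)]
# Synthetic interface signal: mean flow plus a 2 kHz turbulent-like fluctuation.
raw = [1.0 + 0.5 * math.sin(2 * math.pi * 2000.0 * ti) for ti in t]
filtered = exp_moving_average(raw, dt, f_cut=50.0)

# The 2 kHz fluctuation is strongly attenuated while the mean is preserved,
# so the filtered signal can safely be subsampled at the RANS time step.
print(round(filtered[-1], 2))
```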

  11. Piloted evaluation of an integrated propulsion and flight control simulator

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.

    1992-01-01

This paper describes a piloted evaluation of the integrated flight and propulsion control simulator at NASA Lewis Research Center. The purpose of this evaluation is to demonstrate the suitability and effectiveness of this fixed-base simulator for advanced integrated propulsion and airframe control design. The evaluation covers control effector gains and deadbands, control effectiveness and control authority, and head-up display functionality. For this evaluation the flight simulator is configured for transition flight using an advanced Short Take-Off and Vertical Landing fighter aircraft model, a simplified high-bypass turbofan engine model, a fighter cockpit, displays, and pilot effectors. The paper describes the piloted tasks used for rating displays and control effector gains. Pilot comments and simulation results confirm that the display symbology and control gains are adequate for the transition flight task. Additionally, it is demonstrated that this small-scale, fixed-base flight simulator facility can adequately support a real-time, piloted control evaluation.

  12. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  13. Simulation Applications at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Inouye, M.

    1984-01-01

    Aeronautical applications of simulation technology at Ames Research Center are described. The largest wind tunnel in the world is used to determine the flow field and aerodynamic characteristics of various aircraft, helicopter, and missile configurations. Large computers are used to obtain similar results through numerical solutions of the governing equations. Capabilities are illustrated by computer simulations of turbulence, aileron buzz, and an exhaust jet. Flight simulators are used to assess the handling qualities of advanced aircraft, particularly during takeoff and landing.

  14. Evaluation of the FAA Advanced Flow Control Procedures.

    DOT National Transportation Integrated Search

    1972-01-01

    The report is an evaluation of the present FAA Advanced Flow Control Procedures (AFCP), based on data gathered from its implementation on February 5, 1971 and on a fast-time digital simulation of traffic feeding into the NY airports on that day. The ...

  15. Testing the anisotropy of the universe using the simulated gravitational wave events from advanced LIGO and Virgo

    NASA Astrophysics Data System (ADS)

    Lin, Hai-Nan; Li, Jin; Li, Xin

    2018-05-01

The detection of gravitational waves (GWs) provides a powerful tool for constraining cosmological parameters. In this paper, we investigate the possibility of using GWs as standard sirens to test the anisotropy of the universe. We consider the GW signals produced by the coalescence of binary black hole systems and simulate hundreds of GW events from the Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo. It is found that the anisotropy of the universe can be tightly constrained if the redshift of the GW source is precisely known. The anisotropic amplitude can be constrained with an accuracy comparable to the Union2.1 compilation of type-Ia supernovae if ≳ 400 GW events are observed. As for the preferred direction, ≳ 800 GW events are needed to achieve the accuracy of Union2.1. With 800 GW events, the probability of a spurious anisotropic signal with an amplitude comparable to Union2.1 is negligible. These results show that GWs can provide a complementary tool to supernovae for testing the anisotropy of the universe.
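
The dipole fit at the heart of such a test can be sketched as follows; the event count, noise level, and fiducial distances below are made-up illustrative values, not the paper's simulation setup:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative setup: N events with known sky directions and known
# fiducial luminosity distances d0; a dipole of amplitude A_true along
# the z-axis modulates the observed distances, plus Gaussian noise.
N, A_true, sigma = 800, 0.03, 0.05
n_hat = rng.normal(size=(N, 3))
n_hat /= np.linalg.norm(n_hat, axis=1, keepdims=True)   # random sky positions
d0 = rng.uniform(100.0, 3000.0, size=N)                 # Mpc (illustrative)
cos_theta = n_hat[:, 2]                                 # angle to the dipole axis
d_obs = d0 * (1.0 + A_true * cos_theta) * (1.0 + sigma * rng.normal(size=N))

# Least-squares dipole amplitude from the distance residuals:
# d_obs/d0 - 1 ~ A*cos(theta), so A_hat = sum(r*cos) / sum(cos^2).
r = d_obs / d0 - 1.0
A_hat = np.sum(r * cos_theta) / np.sum(cos_theta ** 2)
print(round(float(A_hat), 3))                           # recovers ~0.03
```

With 800 events and 5% distance scatter, the estimator recovers the injected amplitude to within a few parts in a thousand, consistent with the paper's finding that hundreds of events suffice.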

  16. Control Design for an Advanced Geared Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Litt, Jonathan S.

    2017-01-01

    This paper describes the design process for the control system of an advanced geared turbofan engine. This process is applied to a simulation that is representative of a 30,000 pound-force thrust class concept engine with two main spools, ultra-high bypass ratio, and a variable area fan nozzle. Control system requirements constrain the non-linear engine model as it operates throughout its flight envelope of sea level to 40,000 feet and from 0 to 0.8 Mach. The purpose of this paper is to review the engine control design process for an advanced turbofan engine configuration. The control architecture selected for this project was developed from literature and reflects a configuration that utilizes a proportional integral controller with sets of limiters that enable the engine to operate safely throughout its flight envelope. Simulation results show the overall system meets performance requirements without exceeding operational limits.

  17. Analysis of simulated advanced spaceborne thermal emission and reflection (ASTER) radiometer data of the Iron Hill, Colorado, study area for mapping lithologies

    USGS Publications Warehouse

    Rowan, L.C.

    1998-01-01

The advanced spaceborne thermal emission and reflection (ASTER) radiometer was designed to record reflected energy in nine channels with 15 or 30 m resolution, including stereoscopic images, and emitted energy in five channels with 90 m resolution from the NASA Earth Observing System AM1 platform. A simulated ASTER data set was produced for the Iron Hill, Colorado, study area by resampling calibrated, registered airborne visible/infrared imaging spectrometer (AVIRIS) data and thermal infrared multispectral scanner (TIMS) data to the appropriate spatial and spectral parameters. A digital elevation model was obtained to simulate ASTER-derived topographic data. The main lithologic units in the area are granitic rocks and felsite into which a carbonatite stock and associated alkalic igneous rocks were intruded; these rocks are locally covered by Jurassic sandstone, Tertiary rhyolitic tuff, and colluvial deposits. Several methods were evaluated for mapping the main lithologic units, including unsupervised classification and spectral curve-matching techniques. In the five thermal-infrared (TIR) channels, comparison of the results of linear spectral unmixing and unsupervised classification with published geologic maps showed that the main lithologic units were mapped, but large areas with moderate to dense tree cover were not mapped in the TIR data. Compared to TIMS data, simulated ASTER data permitted slightly less discrimination in the mafic alkalic rock series, and carbonatite was mapped in neither the TIMS nor the simulated ASTER TIR data.
In the nine visible and near-infrared channels, unsupervised classification did not yield useful results, but both the spectral linear unmixing and the matched filter techniques produced useful results, including mapping calcitic and dolomitic carbonatite exposures, travertine in hot spring deposits, kaolinite in argillized sandstone and tuff, and muscovite in sericitized granite and felsite, as well as commonly occurring illite.
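
Linear spectral unmixing, one of the techniques evaluated above, models each pixel spectrum as a linear combination of endmember spectra and solves for the abundance fractions. A minimal least-squares sketch with made-up endmember spectra (the band values are hypothetical, not ASTER/TIMS measurements):

```python
import numpy as np

# Hypothetical endmember spectra (one column per material) sampled in
# five TIR channels; the values are invented for illustration.
E = np.array([
    [0.9, 0.2, 0.4],
    [0.8, 0.3, 0.5],
    [0.7, 0.5, 0.3],
    [0.6, 0.6, 0.2],
    [0.5, 0.8, 0.1],
])

# A mixed pixel: 60% of material 0 and 40% of material 2.
true_fractions = np.array([0.6, 0.0, 0.4])
pixel = E @ true_fractions

# Unconstrained least-squares unmixing (operational unmixing usually
# adds nonnegativity and sum-to-one constraints on the fractions).
fractions, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(np.round(fractions, 3))                  # recovers [0.6, 0.0, 0.4]
```

Applied per pixel across an image, the recovered fraction maps are what get compared against published geologic maps, as in the TIR analysis above.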

  18. A Summary of Proceedings for the Advanced Deployable Day/Night Simulation Symposium

    DTIC Science & Technology

    2009-07-01

The Advanced Deployable Day/Night Simulation (ADDNS) Technology Demonstration Project was initiated to design, develop, and deliver transportable visual simulations that jointly provide night-vision and high-resolution daylight capability. Contributors included Dr. Richard Wildes (York University), Mr. Vitaly Zholudev (Department of Computer Science, York University), and Mr. X. Zhu (Neptec Design Group).

  19. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  20. Advanced Fuels Campaign FY 2014 Accomplishments Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braase, Lori; May, W. Edgar

The mission of the Advanced Fuels Campaign (AFC) is to perform Research, Development, and Demonstration (RD&D) activities for advanced fuel forms (including cladding) to enhance the performance and safety of the nation’s current and future reactors; enhance proliferation resistance of nuclear fuel; effectively utilize nuclear energy resources; and address the longer-term waste management challenges. This includes development of a state-of-the-art Research and Development (R&D) infrastructure to support the use of a “goal-oriented science-based approach.” In support of the Fuel Cycle Research and Development (FCRD) program, AFC is responsible for developing advanced fuels technologies to support the various fuel cycle options defined in the Department of Energy (DOE) Nuclear Energy Research and Development Roadmap, Report to Congress, April 2010. AFC uses a “goal-oriented, science-based approach” aimed at a fundamental understanding of fuel and cladding fabrication methods and performance under irradiation, enabling the pursuit of multiple fuel forms for future fuel cycle options. This approach includes fundamental experiments, theory, and advanced modeling and simulation. The modeling and simulation activities for fuel performance are carried out under the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, which is closely coordinated with AFC. In this report, the word “fuel” is used generically to include fuels, targets, and their associated cladding materials. R&D of light water reactor (LWR) fuels with enhanced accident tolerance is also conducted by AFC. These fuel systems are designed to achieve significantly higher fuel and plant performance, to allow operation to significantly higher burnup, and to provide enhanced safety during design basis and beyond design basis accident conditions. The overarching goal is to develop advanced nuclear fuels and materials that are robust, have high performance capability, and are more

  1. Single pilot scanning behavior in simulated instrument flight

    NASA Technical Reports Server (NTRS)

    Pennington, J. E.

    1979-01-01

A simulation of tasks associated with single-pilot general aviation flight under instrument flight rules was conducted as a baseline for future research on advanced flight controls and avionics. The tasks, ranging from simple climbs and turns to an instrument landing system approach, were flown on a fixed-base simulator. During the simulation, the control inputs, state variables, and the pilot's visual scan pattern, including point of regard, were measured and recorded.

  2. A study on optimization of hybrid drive train using Advanced Vehicle Simulator (ADVISOR)

    NASA Astrophysics Data System (ADS)

    Same, Adam; Stipe, Alex; Grossman, David; Park, Jae Wan

This study investigates the advantages and disadvantages of three hybrid drive train configurations: series, parallel, and "through-the-ground" parallel. Power flow simulations are conducted with the MATLAB/Simulink-based software ADVISOR. These simulations are then applied to the UC Davis SAE Formula Hybrid vehicle. ADVISOR performs simulation calculations for vehicle position using a combined backward/forward method. These simulations are used to study how efficiency and agility are affected by the motor, fuel converter, and hybrid configuration. Three different vehicle models are developed to optimize the drive train of a vehicle for three stages of the SAE Formula Hybrid competition: autocross, endurance, and acceleration. Input cycles are created based on rough estimates of track geometry. The output from these ADVISOR simulations is a series of plots of velocity profile and energy storage State of Charge that provide a good estimate of how the Formula Hybrid vehicle will perform on the given course. The most noticeable discrepancy between the input cycle and the actual velocity profile of the vehicle occurs during deceleration. A weighted ranking system is developed to organize the simulation results and to determine the best drive train configuration for the Formula Hybrid vehicle. Results show that the through-the-ground parallel configuration with front-mounted motors achieves an optimal balance of efficiency, simplicity, and cost. ADVISOR is proven to be a useful tool for vehicle power train design for the SAE Formula Hybrid competition. The ADVISOR-based vehicle model is applicable to various studies of hybrid drive train performance and efficiency.
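
The weighted ranking step can be sketched in miniature; all scores and weights below are illustrative placeholders, not the study's simulation outputs:

```python
# Hypothetical weighted ranking of the three drive-train configurations
# across the three SAE Formula Hybrid events; scores and weights are
# invented for illustration only.
scores = {                                  # per-event scores, 0-10 scale
    "series":       {"autocross": 6, "endurance": 8, "acceleration": 5},
    "parallel":     {"autocross": 7, "endurance": 7, "acceleration": 7},
    "ttg_parallel": {"autocross": 8, "endurance": 8, "acceleration": 7},
}
weights = {"autocross": 0.3, "endurance": 0.5, "acceleration": 0.2}

def weighted_score(event_scores):
    """Weighted total across the competition events."""
    return sum(weights[event] * s for event, s in event_scores.items())

ranking = sorted(scores, key=lambda c: weighted_score(scores[c]), reverse=True)
print(ranking[0])                           # highest weighted total wins
```

Weighting the endurance event most heavily, as here, reflects the kind of trade-off such a ranking lets the designer make explicit before committing to a configuration.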

  3. Integration of laparoscopic virtual-reality simulation into gynaecology training.

    PubMed

    Burden, C; Oestergaard, J; Larsen, C R

    2011-11-01

    Surgery carries the risk of serious harm, as well as benefit, to patients. For healthcare organisations, theatre time is an expensive commodity and litigation costs for surgical specialities are very high. Advanced laparoscopic surgery, now widely used in gynaecology for improved outcomes and reduced length of stay, involves longer operation times and a higher rate of complications for surgeons in training. Virtual-reality (VR) simulation is a relatively new training method that has the potential to promote surgical skill development before advancing to surgery on patients themselves. VR simulators have now been on the market for more than 10 years and, yet, few countries in the world have fully integrated VR simulation training into their gynaecology surgical training programmes. In this review, we aim to summarise the VR simulators currently available together with evidence of their effectiveness in gynaecology, to understand their limitations and to discuss their incorporation into national training curricula. © 2011 The Authors BJOG An International Journal of Obstetrics and Gynaecology © 2011 RCOG.

  4. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. This is because producing design solutions to a system goal requires representing the behavior of the system in both steady and perturbed states. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  5. An earth imaging camera simulation using wide-scale construction of reflectance surfaces

    NASA Astrophysics Data System (ADS)

    Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk

    2013-10-01

    Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.

  6. Large eddy simulations and direct numerical simulations of high speed turbulent reacting flows

    NASA Technical Reports Server (NTRS)

    Givi, Peyman; Madnia, C. K.; Steinberger, C. J.; Tsai, A.

    1991-01-01

This research is involved with the implementation of advanced computational schemes based on large eddy simulations (LES) and direct numerical simulations (DNS) to study the phenomenon of mixing and its coupling with chemical reactions in compressible turbulent flows. In the efforts related to LES, a research program was initiated to extend the present capabilities of this method for the treatment of chemically reacting flows, whereas in the DNS efforts, focus was on detailed investigations of the effects of compressibility, heat release, and nonequilibrium kinetics modeling in high speed reacting flows. The efforts to date were primarily focused on simulations of simple flows, namely homogeneous compressible flows and temporally developing high-speed mixing layers. A summary of the accomplishments is provided.

  7. Accomplishments and challenges of surgical simulation.

    PubMed

    Satava, R M

    2001-03-01

    For nearly a decade, advanced computer technologies have created extraordinary educational tools using three-dimensional (3D) visualization and virtual reality. Pioneering efforts in surgical simulation with these tools have resulted in a first generation of simulators for surgical technical skills. Accomplishments include simulations with 3D models of anatomy for practice of surgical tasks, initial assessment of student performance in technical skills, and awareness by professional societies of potential in surgical education and certification. However, enormous challenges remain, which include improvement of technical fidelity, standardization of accurate metrics for performance evaluation, integration of simulators into a robust educational curriculum, stringent evaluation of simulators for effectiveness and value added to surgical training, determination of simulation application to certification of surgical technical skills, and a business model to implement and disseminate simulation successfully throughout the medical education community. This review looks at the historical progress of surgical simulators, their accomplishments, and the challenges that remain.

  8. The future of simulations for space applications

    NASA Astrophysics Data System (ADS)

    Matsumoto, H.

Space development has been increasing rapidly, and there will be huge investment by commercial markets in space development and applications such as space factories and Solar Power Stations (SPS). In this situation, we would like to send a warning message regarding the future of space simulations. It is widely recognized that space simulations have contributed to the quantitative understanding of various plasma phenomena occurring in the solar-terrestrial environment. In the current century, however, in addition to the conventional contribution to solar-terrestrial physics, we also have to pay attention to the application of space simulation to human activities in space. We believe that space simulations can be a powerful and helpful tool for understanding the spacecraft-environment interactions occurring in space development and applications. The global influence on the plasmasphere of heavy ions exhausted by electric propulsion can also be analyzed by a combination of MHD and particle simulations. The results obtained in such simulations can provide very significant and beneficial information so that we can minimize undesirable effects in space development and applications. A brief history of ISSS and its contribution to space plasma physics: numerical simulation has been widely recognized as a powerful tool in the advance of space plasma physics. The International School for Space Simulation (ISSS) series was set up in the early eighties to emphasize this recognition, on the common initiative of M. Ashour-Abdalla, R. Gendrin, T. Sato, and myself. The preceding five ISSS meetings (in Japan, the USA, France, and twice more in Japan) have greatly contributed to the promotion and advancement of computer simulations, as well as to the education of students starting simulation studies for their own research objectives.

  9. Rendering of dense, point cloud data in a high fidelity driving simulator.

    DOT National Transportation Integrated Search

    2014-09-01

    Driving Simulators are advanced tools that can address many research questions in transportation. Recently they have been used to advance the practice of transportation engineering, specifically signs, signals, pavement markings, and most powerfully ...

  10. Recent progress in simulating galaxy formation from the largest to the smallest scales

    NASA Astrophysics Data System (ADS)

    Faucher-Giguère, Claude-André

    2018-05-01

    Galaxy formation simulations are an essential part of the modern toolkit of astrophysicists and cosmologists alike. Astrophysicists use the simulations to study the emergence of galaxy populations from the Big Bang, as well as the formation of stars and supermassive black holes. For cosmologists, galaxy formation simulations are needed to understand how baryonic processes affect measurements of dark matter and dark energy. Owing to the extreme dynamic range of galaxy formation, advances are driven by novel approaches using simulations with different tradeoffs between volume and resolution. Large-volume but low-resolution simulations provide the best statistics, while higher-resolution simulations of smaller cosmic volumes can be evolved with self-consistent physics and reveal important emergent phenomena. I summarize recent progress in galaxy formation simulations, including major developments in the past five years, and highlight some key areas likely to drive further advances over the next decade.

  11. Advanced flight deck/crew station simulator functional requirements

    NASA Technical Reports Server (NTRS)

    Wall, R. L.; Tate, J. L.; Moss, M. J.

    1980-01-01

This report documents a study of flight deck/crew system research facility requirements for investigating issues involved in developing systems and procedures for interfacing transport aircraft with the air traffic control systems planned for 1985 to 2000. Crew system needs of NASA, the U.S. Air Force, and industry were investigated and reported. A matrix of these needs is included, as are recommended functional requirements and design criteria for simulation facilities in which to conduct this research. Methods of exploiting the commonality and similarity among facilities are identified, and plans for doing so in order to reduce implementation costs and allow efficient transfer of experiments from one facility to another are presented.

  12. Advanced Thermal Barrier and Environmental Barrier Coating Development at NASA GRC

    NASA Technical Reports Server (NTRS)

    Zhu, Dongming; Robinson, Craig

    2017-01-01

This presentation summarizes NASA's advanced thermal barrier and environmental barrier coating systems and the coating performance improvements that have recently been achieved and documented in laboratory simulated rig test conditions. One emphasis has been placed on the toughness and impact resistance enhancements of the low-conductivity, defect-cluster thermal barrier coating systems. The advances in the next-generation environmental barrier coatings for SiC-SiC ceramic matrix composites are also highlighted, particularly the design of a new series of oxide-silicate composition systems to be integrated with next-generation SiC-SiC turbine engine components for 2700 °F coating applications. Major technical barriers in developing the thermal and environmental barrier coating systems are also described. The performance and model validations in rig-simulated turbine combustion, heat flux, steam, and calcium-magnesium-aluminosilicate (CMAS) environments have supported the current progress in improved temperature capability, environmental stability, and long-term fatigue-environment system durability of the advanced thermal and environmental barrier coating systems.

  13. Molecular dynamics simulations of large macromolecular complexes.

    PubMed

    Perilla, Juan R; Goh, Boon Chong; Cassidy, C Keith; Liu, Bo; Bernardi, Rafael C; Rudack, Till; Yu, Hang; Wu, Zhe; Schulten, Klaus

    2015-04-01

    Connecting dynamics to structural data from diverse experimental sources, molecular dynamics simulations permit the exploration of biological phenomena in unparalleled detail. Advances in simulations are moving the atomic resolution descriptions of biological systems into the million-to-billion atom regime, in which numerous cell functions reside. In this opinion, we review the progress, driven by large-scale molecular dynamics simulations, in the study of viruses, ribosomes, bioenergetic systems, and other diverse applications. These examples highlight the utility of molecular dynamics simulations in the critical task of relating atomic detail to the function of supramolecular complexes, a task that cannot be achieved by smaller-scale simulations or existing experimental approaches alone. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Simulator for concurrent processing data flow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.; Stoughton, John W.; Mielke, Roland R.

    1992-01-01

A software simulator capable of simulating the execution of an algorithm graph on a given system under the Algorithm to Architecture Mapping Model (ATAMM) rules is presented. ATAMM is capable of modeling the execution of large-grained algorithms on distributed data flow architectures. Investigating the behavior and determining the performance of an ATAMM-based system requires the aid of software tools. The ATAMM Simulator presented here can determine the performance of a system without a hardware prototype having to be built. Case studies are performed on four algorithms to demonstrate the capabilities of the ATAMM Simulator. Simulated results are shown to be comparable to the experimental results of the Advanced Development Model System.
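
The core timing question such a dataflow simulator answers can be sketched in miniature: if each node has its own processor, a node fires once all of its inputs are ready, so the schedule length is the longest path through the algorithm graph. The node times and edges below are illustrative, not taken from the ATAMM case studies:

```python
from collections import defaultdict

# Illustrative algorithm graph: per-node compute times and the data
# dependencies between nodes (a DAG).
exec_time = {"A": 3, "B": 2, "C": 4, "D": 1}
edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]

preds = defaultdict(list)
for u, v in edges:
    preds[v].append(u)

finish = {}                                  # memoized finish times
def finish_time(node):
    """Earliest finish time: ready when every predecessor has finished."""
    if node not in finish:
        start = max((finish_time(p) for p in preds[node]), default=0)
        finish[node] = start + exec_time[node]
    return finish[node]

makespan = max(finish_time(n) for n in exec_time)
print(makespan)                              # critical path A -> C -> D = 8
```

A real ATAMM-style simulator additionally models a finite processor pool and periodic (pipelined) graph firings, but the critical-path computation above is the performance bound such tools report.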

  15. Ninth Conference on Space Simulation

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The papers presented in this conference provided an international dialogue and a meaningful exchange in the simulation of space environments as well as the evolution of these technological advances into other fields. The papers represent a significant contribution to the understanding of space simulation problems and the utilization of this knowledge. The topics of the papers include; spacecraft testing; facilities and test equipment; system and subsystem test; life sciences, medicine and space; physical environmental factors; chemical environmental factors; contamination; space physics; and thermal protection.

  16. Simulation-based transthoracic echocardiography: “An anesthesiologist's perspective”

    PubMed Central

    Magoon, Rohan; Sharma, Amita; Ladha, Suruchi; Kapoor, Poonam Malhotra; Hasija, Suruchi

    2016-01-01

With the growing role of echocardiography in perioperative management, anesthesiologists need to be well trained in transthoracic echocardiography (TTE). The lack of a formal, structured teaching program precludes this. The present article reviews the expanding domain of TTE, simulation-based TTE training and its advancements, current limitations, and the importance of simulation-based training for anesthesiologists. PMID:27397457

  17. The Role of Crop Systems Simulation in Agriculture and Environment

    USDA-ARS?s Scientific Manuscript database

    Over the past 30 to 40 years, simulation of crop systems has advanced from a neophyte science with inadequate computing power into a robust and increasingly accepted science supported by improved software, languages, development tools, and computer capabilities. Crop system simulators contain mathe...

  18. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  19. En route air traffic flow simulation.

    DOT National Transportation Integrated Search

    1971-01-01

    The report covers the conception, design, development, and initial implementation of an advanced simulation technique applied to a study of national air traffic flow and its control by En Route Air Route Traffic Control Centers (ARTCC). It is intende...

  20. Recent Advances in Agglomerated Multigrid

    NASA Technical Reports Server (NTRS)

    Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.; Hammond, Dana P.

    2013-01-01

We report recent advancements of the agglomerated multigrid methodology for complex flow simulations on fully unstructured grids. An agglomerated multigrid solver is applied to a wide range of test problems, from simple two-dimensional geometries to realistic three-dimensional configurations. The solver is evaluated against a single-grid solver and, in some cases, against a structured-grid multigrid solver. Grid and solver issues are identified and overcome, leading to significant improvements over single-grid solvers.
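
The underlying multigrid idea, smoothing on the fine grid plus a coarse-grid correction, can be sketched for a 1-D Poisson problem with geometric (rather than agglomerated) coarsening. This is an illustrative textbook two-grid cycle, not the authors' unstructured solver:

```python
import numpy as np

def smooth(u, f, h, sweeps, omega=2.0 / 3.0):
    """Weighted-Jacobi smoother for -u'' = f with zero Dirichlet ends."""
    u = u.copy()
    for _ in range(sweeps):
        u[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (u[:-2] + u[2:] + h**2 * f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return r

def two_grid(u, f, h):
    u = smooth(u, f, h, 3)                        # pre-smooth
    r = residual(u, f, h)
    m = (len(u) + 1) // 2
    rc = np.zeros(m)                              # full-weighting restriction
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    hc = 2 * h
    A = (np.diag(2.0 * np.ones(m - 2))            # exact solve on coarse grid
         - np.diag(np.ones(m - 3), 1)
         - np.diag(np.ones(m - 3), -1)) / hc**2
    ec = np.zeros(m)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])
    e = np.interp(np.arange(len(u)), np.arange(0, len(u), 2), ec)  # prolongate
    return smooth(u + e, f, h, 3)                 # post-smooth

n, h = 65, 1.0 / 64.0
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)                  # exact solution: sin(pi*x)
u = np.zeros(n)
for _ in range(10):
    u = two_grid(u, f, h)
err = float(np.max(np.abs(u - np.sin(np.pi * x))))
print(err < 1e-3)                                 # converged to discretization error
```

Agglomerated multigrid replaces the geometric coarsening with graph-based merging of unstructured control volumes, but the smooth/restrict/correct/prolongate cycle is the same.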