Sample records for terascale simulation fwp

  1. Terascale High-Fidelity Simulations of Turbulent Combustion with Detailed Chemistry: Spray Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.

    2009-04-26

    The Terascale High-Fidelity Simulations of Turbulent Combustion (TSTC) project is a multi-university collaborative effort to develop a high-fidelity turbulent reacting flow simulation capability utilizing terascale, massively parallel computer technology. The main paradigm of the approach is direct numerical simulation (DNS) featuring the highest temporal and spatial accuracy, allowing quantitative observations of the fine-scale physics found in turbulent reacting flows as well as providing a useful tool for development of sub-models needed in device-level simulations. Under this component of the TSTC program the simulation code named S3D, developed and shared with coworkers at Sandia National Laboratories, has been enhanced with new numerical algorithms and physical models to provide predictive capabilities for turbulent liquid fuel spray dynamics. Major accomplishments include improved fundamental understanding of mixing and auto-ignition in multi-phase turbulent reactant mixtures and turbulent fuel injection spray jets.

  2. Terascale Cluster for Advanced Turbulent Combustion Simulations

    DTIC Science & Technology

    2008-07-25

    We have given the name CATS (for Combustion And Turbulence Simulator) to the terascale system that was obtained through this grant. CATS ... InfiniBand interconnect. CATS includes an interactive login node and a file server, each holding in excess of 1 terabyte of file storage. The 35 active ... compute nodes of CATS enable us to run up to 140-core parallel MPI batch jobs; one node is reserved to run the scheduler. CATS is operated and ...

  3. Terascale spectral element algorithms and implementations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, P. F.; Tufo, H. M.

    1999-08-17

    We describe the development and implementation of an efficient spectral element code for multimillion gridpoint simulations of incompressible flows in general two- and three-dimensional domains. We review basic and recently developed algorithmic underpinnings that have resulted in good parallel and vector performance on a broad range of architectures, including the terascale computing systems now coming online at the DOE labs. Sustained performance of 219 GFLOPS has been recently achieved on 2048 nodes of the Intel ASCI-Red machine at Sandia.
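
    For scale, the quoted figures imply roughly 0.1 GFLOPS sustained per node; a minimal arithmetic check (both inputs are taken from the abstract above):

    ```python
    # Sustained per-node performance implied by the abstract's figures.
    total_gflops = 219.0   # sustained aggregate rate on ASCI Red (GFLOPS)
    nodes = 2048           # number of ASCI Red nodes used

    per_node_mflops = total_gflops / nodes * 1000.0
    print(f"~{per_node_mflops:.0f} MFLOPS sustained per node")  # ~107 MFLOPS
    ```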

  4. Terascale direct numerical simulations of turbulent combustion using S3D

    NASA Astrophysics Data System (ADS)

    Chen, J. H.; Choudhary, A.; de Supinski, B.; DeVries, M.; Hawkes, E. R.; Klasky, S.; Liao, W. K.; Ma, K. L.; Mellor-Crummey, J.; Podhorszki, N.; Sankaran, R.; Shende, S.; Yoo, C. S.

    2009-01-01

    Computational science is paramount to the understanding of underlying processes in internal combustion engines of the future that will utilize non-petroleum-based alternative fuels, including carbon-neutral biofuels, and burn in new combustion regimes that will attain high efficiency while minimizing emissions of particulates and nitrogen oxides. Next-generation engines will likely operate at higher pressures, with greater amounts of dilution, and utilize alternative fuels that exhibit a wide range of chemical and physical properties. Therefore, there is a significant role for high-fidelity simulations, direct numerical simulations (DNS), specifically designed to capture key turbulence-chemistry interactions in these relatively uncharted combustion regimes, and in particular, that can discriminate the effects of differences in fuel properties. In DNS, all of the relevant turbulence and flame scales are resolved numerically using high-order accurate numerical algorithms. As a consequence, terascale DNS are computationally intensive, require massive amounts of computing power and generate tens of terabytes of data. Recent results from terascale DNS of turbulent flames are presented here, illustrating the role of DNS in elucidating flame stabilization mechanisms in a lifted turbulent hydrogen/air jet flame in a hot air coflow, and the flame structure of a fuel-lean turbulent premixed jet flame. Computing at this scale requires close collaborations between computer and combustion scientists to provide optimized scalable algorithms and software for terascale simulations, efficient collective parallel I/O, tools for volume visualization of multiscale, multivariate data, and automation of the combustion workflow. The enabling computer science, applied to combustion science, is also required in many other terascale physics and engineering simulations. In particular, performance monitoring is used to identify the performance of key kernels in the DNS code S3D, and especially memory-intensive loops in the code. Through the careful application of loop transformations, data reuse in cache is exploited, thereby reducing memory bandwidth needs and hence improving S3D's nodal performance. To enhance collective parallel I/O in S3D, an MPI-I/O caching design is used to construct a two-stage write-behind method for improving the performance of write-only operations. The simulations generate tens of terabytes of data requiring analysis. Interactive exploration of the simulation data is enabled by multivariate time-varying volume visualization. The visualization highlights spatial and temporal correlations between multiple reactive scalar fields using an intuitive user interface based on parallel coordinates and time histograms. Finally, an automated combustion workflow is designed using Kepler to manage large-scale data movement, data morphing, and archiving, and to provide a graphical display of run-time diagnostics.
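
    The two-stage write-behind scheme mentioned in the abstract can be illustrated with a minimal, hypothetical sketch: application writes are buffered immediately (stage one) and flushed to storage in large batches by a background thread (stage two). This is a plain-Python illustration of the idea, not the MPI-I/O caching code used in S3D.

    ```python
    import queue
    import threading

    class WriteBehindFile:
        """Toy write-behind sketch: stage 1 buffers writes in memory,
        stage 2 (a background thread) flushes them in large batches."""
        def __init__(self, path, batch=64):
            self.q = queue.Queue()
            self.batch = batch
            self.f = open(path, "wb")
            self.t = threading.Thread(target=self._drain, daemon=True)
            self.t.start()

        def write(self, data: bytes):
            self.q.put(data)          # stage 1: enqueue and return immediately

        def _drain(self):
            while True:
                chunk = [self.q.get()]
                while len(chunk) < self.batch and not self.q.empty():
                    chunk.append(self.q.get())
                self.f.write(b"".join(chunk))   # stage 2: one large write
                for _ in chunk:
                    self.q.task_done()

        def close(self):
            self.q.join()             # wait until all buffered data is flushed
            self.f.close()

    # Usage (hypothetical path):
    wb = WriteBehindFile("/tmp/demo.bin")
    for _ in range(1000):
        wb.write(b"x" * 4096)
    wb.close()
    ```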

  5. THOR Field and Wave Processor - FWP

    NASA Astrophysics Data System (ADS)

    Soucek, Jan; Rothkaehl, Hanna; Balikhin, Michael; Zaslavsky, Arnaud; Nakamura, Rumi; Khotyaintsev, Yuri; Uhlir, Ludek; Lan, Radek; Yearby, Keith; Morawski, Marek; Winkler, Marek

    2016-04-01

    If selected, Turbulence Heating ObserveR (THOR) will become the first mission ever flown in space dedicated to plasma turbulence. The Fields and Waves Processor (FWP) is an integrated electronics unit for all electromagnetic field measurements performed by THOR. FWP will interface with all fields sensors: the electric field antennas of the EFI instrument, the MAG fluxgate magnetometer, and the search-coil magnetometer (SCM), and will perform data digitization and on-board processing. The FWP box will house multiple data acquisition sub-units and signal analyzers, all sharing a common power supply and data processing unit and thus a single data and power interface to the spacecraft. Integrating all the electromagnetic field measurements in a single unit will improve the consistency of the field measurements and the accuracy of time synchronization. The feasibility of making highly sensitive electric and magnetic field measurements in space has been demonstrated by Cluster (among other spacecraft), and THOR instrumentation, complemented by a thorough electromagnetic cleanliness program, will further improve on this heritage. Taking advantage of the capabilities of modern electronics and the large telemetry bandwidth of THOR, FWP will provide simultaneous synchronized waveform and spectral data products at high time resolution from the numerous THOR sensors. FWP will also implement a plasma resonance sounder and a digital plasma quasi-thermal noise analyzer designed to provide high-cadence measurements of plasma density and temperature complementary to data from particle instruments. FWP will be interfaced with the particle instrument data processing unit (PPU) via a dedicated digital link, which will enable on-board correlation between waves and particles, quantifying the transfer of energy between them. The FWP instrument shall be designed and built by an international consortium of scientific institutes from the Czech Republic, Poland, France, the UK, Sweden and Austria.

  6. THOR Fields and Wave Processor - FWP

    NASA Astrophysics Data System (ADS)

    Soucek, Jan; Rothkaehl, Hanna; Ahlen, Lennart; Balikhin, Michael; Carr, Christopher; Dekkali, Moustapha; Khotyaintsev, Yuri; Lan, Radek; Magnes, Werner; Morawski, Marek; Nakamura, Rumi; Uhlir, Ludek; Yearby, Keith; Winkler, Marek; Zaslavsky, Arnaud

    2017-04-01

    If selected, Turbulence Heating ObserveR (THOR) will become the first spacecraft mission dedicated to the study of plasma turbulence. The Fields and Waves Processor (FWP) is an integrated electronics unit for all electromagnetic field measurements performed by THOR. FWP will interface with all THOR fields sensors: the electric field antennas of the EFI instrument, the MAG fluxgate magnetometer, and the search-coil magnetometer (SCM), and will perform signal digitization and on-board data processing. The FWP box will house multiple data acquisition sub-units and signal analyzers, all sharing a common power supply and data processing unit and thus a single data and power interface to the spacecraft. Integrating all the electromagnetic field measurements in a single unit will improve the consistency of the field measurements and the accuracy of time synchronization. The scientific value of highly sensitive electric and magnetic field measurements in space has been demonstrated by Cluster (among other spacecraft), and THOR instrumentation will further improve on this heritage. The large dynamic range of the instruments will be complemented by a thorough electromagnetic cleanliness program, which will prevent perturbation of the field measurements by interference from payload and platform subsystems. Taking advantage of the capabilities of modern electronics and the large telemetry bandwidth of THOR, FWP will provide multi-component electromagnetic field waveforms and spectral data products at a high time resolution. Fully synchronized sampling of many signals will make it possible to resolve wave phase information and estimate wavelengths via interferometric correlations between EFI probes. FWP will also implement a plasma resonance sounder and a digital plasma quasi-thermal noise analyzer designed to provide high-cadence measurements of plasma density and temperature complementary to data from particle instruments. FWP will rapidly transmit information about the magnetic field vector and spacecraft potential to the particle instrument data processing unit (PPU) via a dedicated digital link. This information will help the particle instruments optimize energy and angular sweeps and calculate on-board moments. FWP will also coordinate the acquisition of high-resolution waveform snapshots with very high time resolution electron data from the TEA instrument. This combined wave/particle measurement will provide the ultimate dataset for investigating wave-particle interactions on electron scales. The FWP instrument shall be designed and built by an international consortium of scientific institutes from the Czech Republic, Poland, France, the UK, Sweden and Austria.
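
    The interferometric idea mentioned above (resolving wave phase between spatially separated probes) can be sketched with synthetic signals; the wave frequency, sample rate, probe separation, and delay below are assumed illustrative values, not FWP or EFI parameters.

    ```python
    import numpy as np

    fs, f_wave = 8192.0, 50.0      # assumed sample rate (Hz) and wave frequency (Hz)
    d = 10.0                       # assumed probe separation along the wave vector (m)
    lag_true = 2.0e-3              # assumed propagation delay between probes (s)
    t = np.arange(0, 1.0, 1 / fs)
    probe1 = np.sin(2 * np.pi * f_wave * t)
    probe2 = np.sin(2 * np.pi * f_wave * (t - lag_true))   # same wave, delayed

    # Phase difference at the wave frequency from the cross-spectrum
    X1, X2 = np.fft.rfft(probe1), np.fft.rfft(probe2)
    k = int(round(f_wave * len(t) / fs))         # FFT bin of the wave frequency
    dphi = np.angle(X1[k] * np.conj(X2[k]))      # phase of probe1 relative to probe2

    # For a plane wave along the baseline, phase = 2*pi*d/lambda
    wavelength = 2 * np.pi * d / dphi
    print(f"phase difference {dphi:.3f} rad -> wavelength ~ {wavelength:.0f} m")
    ```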

  7. A statistical retrieval of cloud parameters for the millimeter wave Ice Cloud Imager on board MetOp-SG

    NASA Astrophysics Data System (ADS)

    Prigent, Catherine; Wang, Die; Aires, Filipe; Jimenez, Carlos

    2017-04-01

    Meteorological observations from satellites in the microwave domain are currently limited to frequencies below 190 GHz. However, the next generation of the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Polar System, Second Generation (EPS-SG), will carry an instrument, the Ice Cloud Imager (ICI), with frequencies up to 664 GHz, to improve the characterization of the cloud frozen phase. In this paper, a statistical retrieval of cloud parameters for ICI is developed, trained on a synthetic database derived from the coupling of a mesoscale cloud model and radiative transfer calculations. The hydrometeor profiles simulated with the Weather Research and Forecasting model (WRF) for twelve diverse European mid-latitude situations are used to simulate the brightness temperatures with the Atmospheric Radiative Transfer Simulator (ARTS) to prepare the retrieval database. The WRF+ARTS simulations have been compared to Special Sensor Microwave Imager/Sounder (SSMIS) observations up to 190 GHz: this successful evaluation gives us confidence in the simulations at the ICI channels from 183 to 664 GHz. Statistical analyses have been performed on this simulated retrieval database, showing that it is not only physically realistic but also statistically satisfactory for retrieval purposes. A first Neural Network (NN) classifier is used to detect the cloud presence. A second NN is developed to retrieve the liquid and ice integrated cloud quantities over sea and land separately. The detection and retrieval of the hydrometeor quantities (i.e., ice, snow, graupel, rain, and liquid cloud) are performed with ICI only, and with ICI combined with observations from the MicroWave Imager (MWI, with frequencies from 19 to 190 GHz, also on board MetOp-SG). The ICI channels have been optimized for the detection and quantification of the cloud frozen phases: adding the MWI channels improves the retrieval performance for the vertically integrated hydrometeor contents, especially for the cloud liquid phases. The relative error for the retrieved integrated frozen water content (FWP, i.e., ice+snow+graupel) is below 40% for 0.1 kg/m2 < FWP < 0.5 kg/m2 and below 20% for FWP > 0.5 kg/m2.
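
    The two-stage scheme described above (an NN cloud classifier followed by an NN regressor for the integrated contents) can be sketched with scikit-learn on synthetic stand-in data; the channel count, synthetic database, and network shapes below are assumptions, not the paper's configuration.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier, MLPRegressor

    rng = np.random.default_rng(0)
    n, n_chan = 5000, 13                          # assumed: 13 ICI-like channels
    # Synthetic FWP (kg/m2) with many clear (zero) scenes:
    fwp = np.clip(rng.gamma(1.5, 0.2, n) - 0.2, 0.0, None)
    cloudy = fwp > 0.0
    # Synthetic brightness temperatures: ice scattering depresses TBs.
    tb = 240.0 - 40.0 * fwp[:, None] + rng.normal(0, 2.0, (n, n_chan))

    # Stage 1: NN classifier flags cloudy scenes.
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(tb, cloudy)

    # Stage 2: NN regressor retrieves FWP, trained on cloudy scenes only.
    reg = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500).fit(
        tb[cloudy], fwp[cloudy])

    pred = np.zeros(n)
    is_cloudy = clf.predict(tb)
    pred[is_cloudy] = reg.predict(tb[is_cloudy])   # retrieval only where cloud detected
    ```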

  8. Influence of shade on the growth and nitrogen assimilation of developing fruits on bell pepper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achhireddy, N.R.; Fletcher, J.S.; Beevers, L.

    Accumulation of dry mass, total N, protein N, and soluble amino acid N in the developing fruit and seeds of bell pepper (Capsicum annuum L.) was determined at selected intervals following anthesis. The importance of photosynthesis to the growth and nitrogen (N) assimilation in the developing fruit wall plus placenta (FWP) and seeds was evaluated by comparing the growth and accumulation of reduced N in nonphotosynthetic and photosynthetic fruits (covered vs. uncovered). The growth rate of the FWP and seeds was similar under both conditions. After 65 days of growth, the fruits kept in the dark weighed about 15% less than those receiving illumination; seed weight was the same for both treatments. Total N content of the FWP or seed continued to increase up to 55 days after anthesis. The FWP accumulated over 90% of the fruit's total N, and there were no significant differences between covered and uncovered fruits. Protein N accounted for about 50% of the total N present in both covered and uncovered fruits. 15 references, 2 figures, 2 tables.

  9. DAKOTA Design Analysis Kit for Optimization and Terascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.

    2010-02-24

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.
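
    The loose coupling described above (a driver invokes a black-box computational model, collects results, and iterates) can be sketched generically; `run_simulation` below is a hypothetical stand-in for a user's model, and this is not DAKOTA's actual API or input syntax.

    ```python
    import itertools

    def run_simulation(params):
        """Hypothetical black-box model: parameter dict in, scalar response out."""
        x, y = params["x"], params["y"]
        return (x - 1.0) ** 2 + (y + 2.0) ** 2   # toy objective

    # A plain grid parameter study in the spirit of the simplest driver mode:
    grid = {"x": [0.0, 0.5, 1.0, 1.5], "y": [-3.0, -2.0, -1.0]}
    results = []
    for x, y in itertools.product(grid["x"], grid["y"]):
        params = {"x": x, "y": y}
        results.append((params, run_simulation(params)))   # invoke and collect

    best_params, best_val = min(results, key=lambda r: r[1])
    print("best:", best_params, "objective:", best_val)
    ```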

  10. Rediscovering America: The FWP Legacy and Challenge

    ERIC Educational Resources Information Center

    Hirsch, Jerrold

    2012-01-01

    An overview of the history of the Federal Writers' Project, hitting on critical reference points for our vision of what the project might look like today: the 1930s FWP's cross-disciplinary integration of literature and history; the rejection of strict divisions between high and low culture; and the bottom-up approach to history that had begun…

  11. Study of Plasma Liner Driven Magnetized Target Fusion Via Advanced Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samulyak, Roman V.; Parks, Paul

    The feasibility of plasma liner driven Magnetized Target Fusion (MTF) via terascale numerical simulations will be assessed. In the MTF concept, a plasma liner, formed by the merging of a number (60 or more) of radial, highly supersonic plasma jets, implodes on the target in the form of two compact plasma toroids, and compresses it to fusion ignition conditions. By avoiding major difficulties associated with both traditional laser driven inertial confinement fusion and solid liner driven MTF, the plasma liner driven MTF potentially provides a low-cost and fast R&D path towards the demonstration of practical fusion energy. High-fidelity numerical simulations of the full nonlinear models associated with plasma liner MTF, using state-of-the-art numerical algorithms and terascale computing, are necessary in order to resolve uncertainties and provide guidance for future experiments. At Stony Brook University, we have developed unique computational capabilities that are ideally suited to the MTF problem. The FronTier code, developed in collaboration with BNL and LANL under DOE funding including SciDAC for the simulation of 3D multi-material hydro and MHD flows, has been benchmarked and used for fundamental and engineering problems in energy science applications. We have performed 3D simulations of converging supersonic plasma jets, their merger and the formation of the plasma liner, and a study of the corresponding oblique shock problem. We have studied the implosion of the plasma liner on the magnetized plasma target by resolving Rayleigh-Taylor instabilities in 2D and 3D and other relevant physics, and estimated the thermodynamic conditions of the target at the moment of maximum compression and the hydrodynamic efficiency of the method.

  12. Colliders as a simultaneous probe of supersymmetric dark matter and Terascale cosmology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barenboim, Gabriela; Lykken, Joseph D.

    2006-08-01

    Terascale supersymmetry has the potential to provide a natural explanation of the dominant dark matter component of the standard ΛCDM cosmology. However, once we impose the constraints on minimal supersymmetry parameters from current particle physics data, a satisfactory dark matter abundance is no longer prima facie natural. This Neutralino Tuning Problem could be a hint of nonstandard cosmology during and/or after the Terascale era. To quantify this possibility, we introduce an alternative cosmological benchmark based upon a simple model of quintessential inflation. This benchmark has no free parameters, so for a given supersymmetry model it allows an unambiguous prediction of the dark matter relic density. As an example, we scan over the parameter space of the CMSSM, comparing the neutralino relic density predictions with the bounds from WMAP. We find that the WMAP-allowed regions of the CMSSM are an order of magnitude larger if we use the alternative cosmological benchmark, as opposed to ΛCDM. Initial results from the CERN Large Hadron Collider will distinguish between the two allowed regions.

  13. Colliders as a simultaneous probe of supersymmetric dark matter and Terascale cosmology

    NASA Astrophysics Data System (ADS)

    Barenboim, Gabriela; Lykken, Joseph D.

    2006-12-01

    Terascale supersymmetry has the potential to provide a natural explanation of the dominant dark matter component of the standard ΛCDM cosmology. However, once we impose the constraints on minimal supersymmetry parameters from current particle physics data, a satisfactory dark matter abundance is no longer prima facie natural. This Neutralino Tuning Problem could be a hint of nonstandard cosmology during and/or after the Terascale era. To quantify this possibility, we introduce an alternative cosmological benchmark based upon a simple model of quintessential inflation. This benchmark has no free parameters, so for a given supersymmetry model it allows an unambiguous prediction of the dark matter relic density. As an example, we scan over the parameter space of the CMSSM, comparing the neutralino relic density predictions with the bounds from WMAP. We find that the WMAP-allowed regions of the CMSSM are an order of magnitude larger if we use the alternative cosmological benchmark, as opposed to ΛCDM. Initial results from the CERN Large Hadron Collider will distinguish between the two allowed regions.
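
    A schematic of the freeze-out argument behind this (standard relic-abundance reasoning stated as background, not a formula from the paper): the neutralino comoving abundance freezes out when the annihilation rate falls below the expansion rate, so a cosmology with a larger Hubble rate at the Terascale era, as in a kination-dominated phase of quintessential inflation, decouples the neutralino earlier and enhances the relic density.

    ```latex
    % Schematic freeze-out condition and enhancement factor (textbook scaling,
    % given here as an assumption for orientation, not taken from the paper):
    \begin{equation}
      \Gamma_{\rm ann}(T_f) \simeq H(T_f), \qquad
      \Omega_\chi h^2 \;\propto\; \frac{x_f}{\langle\sigma v\rangle}
        \,\frac{H(T_f)}{H_{\rm rad}(T_f)} ,
    \end{equation}
    % so any enhancement of H over its radiation-dominated value at the
    % freeze-out temperature T_f raises the predicted relic density.
    ```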

  14. Design Analysis Kit for Optimization and Terascale Applications 6.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-19

    Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.

  15. Numerical heating in Particle-In-Cell simulations with Monte Carlo binary collisions

    NASA Astrophysics Data System (ADS)

    Alves, E. Paulo; Mori, Warren; Fiuza, Frederico

    2017-10-01

    The binary Monte Carlo collision (BMCC) algorithm is a robust and popular method to include Coulomb collision effects in Particle-in-Cell (PIC) simulations of plasmas. While a number of works have focused on extending the validity of the model to different physical regimes of temperature and density, little attention has been given to the fundamental coupling between the PIC and BMCC algorithms. Here, we show that the coupling between PIC and BMCC algorithms can give rise to nonphysical numerical heating of the system, which can be far greater than that observed when these algorithms operate independently. This deleterious numerical heating effect can significantly impact the evolution of the simulated system, particularly for long simulation times. In this work, we describe the source of this numerical heating and derive scaling laws for the numerical heating rates based on the numerical parameters of PIC-BMCC simulations. We compare our theoretical scalings with PIC-BMCC numerical experiments, and discuss strategies to minimize this parasitic effect. This work is supported by DOE FES under FWP 100237 and 100182.
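
    A minimal diagnostic in the spirit of the scaling-law comparison above: fit the secular slope of total kinetic energy versus time from simulation snapshots to quantify a numerical heating rate. The data below are synthetic stand-ins; a real test would use PIC-BMCC output.

    ```python
    import numpy as np

    # Synthetic stand-in for per-snapshot total kinetic energy of a run:
    t = np.linspace(0.0, 1000.0, 201)              # time in plasma periods (assumed)
    rng = np.random.default_rng(1)
    energy = 1.0 + 2.0e-4 * t + rng.normal(0, 5e-3, t.size)  # slow drift + noise

    # Numerical heating rate = least-squares slope of energy vs. time
    slope, intercept = np.polyfit(t, energy, 1)
    print(f"heating rate ~ {slope:.2e} (normalized energy per plasma period)")
    ```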

  16. 7 CFR 1410.11 - Farmable Wetlands Program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... in FWP if the land is buffer acreage that provides protection for and is contiguous to land otherwise... (b)(4) of this section; or (4) A suitable buffer as determined by the Deputy Administrator for lands...

  17. 7 CFR 1410.11 - Farmable Wetlands Program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... in FWP if the land is buffer acreage that provides protection for and is contiguous to land otherwise... (b)(4) of this section; or (4) A suitable buffer as determined by the Deputy Administrator for lands...

  18. 7 CFR 1410.11 - Farmable Wetlands Program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... in FWP if the land is buffer acreage that provides protection for and is contiguous to land otherwise... (b)(4) of this section; or (4) A suitable buffer as determined by the Deputy Administrator for lands...

  19. Limits on silicon nanoelectronics for terascale integration.

    PubMed

    Meindl, J D; Chen, Q; Davis, J A

    2001-09-14

    Throughout the past four decades, silicon semiconductor technology has advanced at exponential rates in both performance and productivity. Concerns have been raised, however, that the limits of silicon technology may soon be reached. Analysis of fundamental, material, device, circuit, and system limits reveals that silicon technology has an enormous remaining potential to achieve terascale integration (TSI) of more than 1 trillion transistors per chip. Such massive-scale integration is feasible assuming the development and economical mass production of double-gate metal-oxide-semiconductor field effect transistors with gate oxide thickness of about 1 nanometer, silicon channel thickness of about 3 nanometers, and channel length of about 10 nanometers. The development of interconnecting wires for these transistors presents a major challenge to the achievement of nanoelectronics for TSI.
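
    For scale, a quick arithmetic check of what a trillion transistors per chip implies (the die size below is an assumed round number, not a figure from the paper):

    ```python
    # TSI density check: 10^12 transistors on an assumed 10 mm x 10 mm die.
    transistors = 1.0e12
    die_area_nm2 = (10e6) ** 2          # 10 mm = 1e7 nm, so 1e14 nm^2
    area_per_transistor = die_area_nm2 / transistors
    print(f"~{area_per_transistor:.0f} nm^2 per transistor "
          f"(~{area_per_transistor ** 0.5:.0f} nm pitch)")  # ~100 nm^2, ~10 nm pitch
    ```

    The resulting ~10 nm pitch is consistent with the ~10 nm channel lengths the abstract identifies as the device scale required for TSI.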

  20. Livestock and feed water productivity in the mixed crop-livestock system.

    PubMed

    Bekele, M; Mengistu, A; Tamir, B

    2017-10-01

    Recently, based on limited information from intensified grain-based farming systems in developed countries, livestock production has been challenged as a huge consumer of freshwater. The smallholder mixed crop-livestock (MCL) system, which is predominant in developing countries like Ethiopia, is maintained with considerable contributions of crop residues (CR) to livestock feeding. Inclusion of CR is expected to reduce the water requirement for feed production, resulting in improvement in livestock water productivity (LWP). This study was conducted to determine feed water productivity (FWP) and LWP in the MCL system. A multistage sampling procedure was followed to select farmers of different wealth status. Wealth status, dictated by ownership of key farm resources such as cropland size and livestock, influenced the magnitude of livestock outputs, FWP and LWP. Significant differences in feed collected, freshwater evapotranspired, livestock outputs and water productivity (WP) were observed between wealth groups, with wealthier farmers relatively more advantaged. Water productivity of CR and grazing land (GL), analyzed separately, showed contrasting differences: better-off farmers gained more on CR, while the reverse held on GL. These counterbalancing variations may explain the non-significant difference in total FWP between wealth groups. Despite the observed differences, low WP on GL indicates the need for interventions at all levels. The variation in WP of CR is attributed to the availability of production factors, which restrained the capacity of poor farmers most. A linear relationship between the proportion of CR in livestock feed and FWP was evident, but the relationship with LWP was not likely linear. As CR are inherently low in digestibility and nutritive value, which affects the conversion of feed into valuable livestock products and services, increasing the share of CR beyond an optimum level is not a viable option for bringing improvements in livestock productivity as expressed in terms of LWP. Ensuring land security, installing proper grazing management, improving forage seed supply and applying soil and water conservation are expected to enhance WP on GL. Given the relationship of production factors with crop biomass and associated WP, interventions targeted at improving the provision of inputs, credit, extension and training support, with due emphasis on the poor, would increase CR yield and reduce part of the water use for feed production. Optimizing the feed value of CR with treatment and supplementation, following water-efficient forage production methods and maintaining healthy, productive animals are expected to amplify the benefits from livestock and eventually improve LWP.
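
    The productivity measures above are ratios of output to the freshwater evapotranspired to produce the feed; a minimal worked example (the definitions follow the usual FWP/LWP usage, and all numbers are made up for illustration):

    ```python
    # Illustrative FWP / LWP calculation (all figures are hypothetical).
    feed_dm_kg = 4000.0           # feed dry matter collected per farm per year (kg)
    water_et_m3 = 3200.0          # freshwater evapotranspired to grow that feed (m^3)
    livestock_output_usd = 900.0  # value of milk, meat, draft power, manure (USD)

    fwp = feed_dm_kg / water_et_m3            # kg feed DM per m^3 of water
    lwp = livestock_output_usd / water_et_m3  # USD of livestock output per m^3
    print(f"FWP = {fwp:.2f} kg/m^3, LWP = {lwp:.2f} USD/m^3")
    ```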

  1. VisPort: Web-Based Access to Community-Specific Visualization Functionality [Shedding New Light on Exploding Stars: Visualization for TeraScale Simulation of Neutrino-Driven Supernovae (Final Technical Report)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, M Pauline

    2007-06-30

    The VisPort visualization portal is an experiment in providing Web-based access to visualization functionality from any place and at any time. VisPort adopts a service-oriented architecture to encapsulate visualization functionality and to support remote access. Users employ browser-based client applications to choose data and services, set parameters, and launch visualization jobs. Visualization products, typically images or movies, are viewed in the user's standard Web browser. VisPort emphasizes visualization solutions customized for specific application communities. Finally, VisPort relies heavily on XML, and introduces the notion of visualization informatics: the formalization and specialization of information related to the process and products of visualization.

  2. The Impact of the Nuclear Equation of State in Core Collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Baird, M. L.; Lentz, E. J.; Hix, W. R.; Mezzacappa, A.; Messer, O. E. B.; Liebendoerfer, M.; TeraScale Supernova Initiative Collaboration

    2005-12-01

    One of the key ingredients in the core collapse supernova mechanism is the physics of matter at or near nuclear density. Included in simulations as part of the Equation of State (EOS), the nuclear repulsion experienced at high densities is responsible for the bounce shock, which initially causes the outer envelope of the supernova to expand, as well as for determining the structure of the newly formed proto-neutron star. Recent years have seen renewed interest in this fundamental piece of supernova physics, resulting in several promising candidate EOS parameterizations. We will present the impact of these variations in the nuclear EOS using spherically symmetric, Newtonian and General Relativistic neutrino transport simulations of stellar core collapse and bounce. This work is supported in part by SciDAC grants to the TeraScale Supernova Initiative from the DOE Office of Science High Energy, Nuclear, and Advanced Scientific Computing Research Programs. Oak Ridge National Laboratory is managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  3. DAKOTA JAGUAR 3.0 user's manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Bauman, Lara E; Chan, Ethan

    2013-05-01

    JAGUAR (JAva GUi for Applied Research) is a Java software tool providing an advanced text editor and graphical user interface (GUI) to manipulate DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) input specifications. This document focuses on the features necessary to use JAGUAR.

  4. JAGUAR developer's manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, Ethan

    2011-06-01

    JAGUAR (JAva GUi for Applied Research) is a Java software tool providing an advanced text editor and graphical user interface (GUI) to manipulate DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) input specifications. This document focuses on the technical background necessary for a developer to understand JAGUAR.

  5. Turbulent Dynamo Amplification of Magnetic Fields in Laser-Produced Plasmas: Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Tzeferacos, P.; Rigby, A.; Bott, A.; Bell, A.; Bingham, R.; Casner, A.; Cattaneo, F.; Churazov, E.; Forest, C.; Katz, J.; Koenig, M.; Li, C.-K.; Meinecke, J.; Petrasso, R.; Park, H.-S.; Remington, B.; Ross, J.; Ryutov, D.; Ryu, D.; Reville, B.; Miniati, F.; Schekochihin, A.; Froula, D.; Lamb, D.; Gregori, G.

    2017-10-01

    The universe is permeated by magnetic fields, with strengths ranging from a femtogauss in the voids between the filaments of galaxy clusters to several teragauss in black holes and neutron stars. The standard model for cosmological magnetic fields is the nonlinear amplification of seed fields via turbulent dynamo. We have conceived experiments to demonstrate and study the turbulent dynamo mechanism in the laboratory. Here, we describe the design of these experiments through large-scale 3D FLASH simulations on the Mira supercomputer at ANL, and the laser-driven experiments we conducted with the OMEGA laser at LLE. Our results indicate that turbulence is capable of rapidly amplifying seed fields to near equipartition with the turbulent fluid motions. This work was supported in part from the ERC (FP7/2007-2013, No. 256973 and 247039), and the U.S. DOE, Contract No. B591485 to LLNL, FWP 57789 to ANL, Grant No. DE-NA0002724 and DE-SC0016566 to the University of Chicago, and DE-AC02-06CH11357 to ANL.

  6. Lattice Stability and Interatomic Potential of Non-equilibrium Warm Dense Gold

    NASA Astrophysics Data System (ADS)

    Chen, Z.; Mo, M.; Soulard, L.; Recoules, V.; Hering, P.; Tsui, Y. Y.; Ng, A.; Glenzer, S. H.

    2017-10-01

    Interatomic potential is central to the calculation and understanding of the properties of matter. A manifestation of interatomic potential is lattice stability in the solid-liquid transition. Recently, we have used frequency domain interferometry (FDI) to study the disassembly of ultrafast laser heated warm dense gold nanofoils. The FDI measurement is implemented by a spatially chirped single-shot technique. The disassembly of the sample is characterized by the change in phase shift of the reflected probe resulting from hydrodynamic expansion. The experimental data are compared with the results of two-temperature molecular dynamics simulations based on a highly optimized embedded-atom-method (EAM) interatomic potential. Good agreement is found for absorbed energy densities of 0.9 to 4.3 MJ/kg. This provides the first demonstration of the applicability of an EAM interatomic potential in the non-equilibrium warm dense matter regime. The MD simulations also reveal the critical role of pressure waves in the solid-liquid transition in ultrafast laser heated nanofoils. This work is supported by DOE Office of Science, Fusion Energy Science under FWP 100182, and the SLAC LDRD program.

  7. FWP executive summaries: basic energy sciences materials sciences and engineering program (SNL/NM).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samara, George A.; Simmons, Jerry A.

    2006-07-01

    This report presents an Executive Summary of the various elements of the Materials Sciences and Engineering Program which is funded by the Division of Materials Sciences and Engineering, Office of Basic Energy Sciences, U.S. Department of Energy at Sandia National Laboratories, New Mexico. A general programmatic overview is also presented.

  8. Writing Democracy: Notes on a Federal Writers' Project for the 21st Century

    ERIC Educational Resources Information Center

    Carter, Shannon; Mutnick, Deborah

    2012-01-01

    A general overview of the Writing Democracy project, including its origin story and key objectives. Draws parallels between the historical context that gave rise to the New Deal's Federal Writers' Project and today, examining the potential for a reprise of FWP in community literacy and public rhetoric and introducing articles collected in this…

  9. Underwater Threat Source Localization: Processing Sensor Network TDOAs with a Terascale Optical Core Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barhen, Jacob; Imam, Neena

    2007-01-01

    Revolutionary computing technologies are defined in terms of technological breakthroughs, which leapfrog over near-term projected advances in conventional hardware and software to produce paradigm shifts in computational science. For underwater threat source localization using information provided by a dynamical sensor network, one of the most promising computational advances builds upon the emergence of digital optical-core devices. In this article, we present initial results of sensor network calculations that focus on the concept of signal wavefront time-difference-of-arrival (TDOA). The corresponding algorithms are implemented on the EnLight processing platform recently introduced by Lenslet Laboratories. This tera-scale digital optical core processor is optimized for array operations, which it performs in a fixed-point-arithmetic architecture. Our results (i) illustrate the ability to reach the required accuracy in the TDOA computation, and (ii) demonstrate that a considerable speed-up can be achieved when using the EnLight 64a prototype processor as compared to a dual Intel Xeon™ processor.
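
    The core TDOA computation described above reduces to locating the peak of a cross-correlation between two sensors' signals; a numpy sketch with a synthetic acoustic pulse (the sample rate, sound speed, and delay are assumed illustrative values):

    ```python
    import numpy as np

    fs, c = 50_000.0, 1500.0   # assumed sample rate (Hz), sound speed in water (m/s)
    t = np.arange(0, 0.1, 1 / fs)
    pulse = np.exp(-((t - 0.02) / 1e-3) ** 2)     # synthetic source pulse
    delay_s = 0.004                               # assumed true inter-sensor delay (s)
    rng = np.random.default_rng(2)
    s1 = pulse + 0.05 * rng.normal(size=t.size)
    s2 = np.roll(pulse, int(delay_s * fs)) + 0.05 * rng.normal(size=t.size)

    # TDOA = lag that maximizes the full cross-correlation of the two signals
    xc = np.correlate(s2, s1, mode="full")
    lag = np.argmax(xc) - (t.size - 1)
    tdoa = lag / fs
    print(f"TDOA ~ {tdoa * 1e3:.2f} ms -> range difference ~ {tdoa * c:.1f} m")
    ```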

  10. Simplified Parallel Domain Traversal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson III, David J

    2011-01-01

    Many data-intensive scientific analysis techniques require global domain traversal, which over the years has been a bottleneck for efficient parallelization across distributed-memory architectures. Inspired by MapReduce and other simplified parallel programming approaches, we have designed DStep, a flexible system that greatly simplifies efficient parallelization of domain traversal techniques at scale. In order to deliver both simplicity to users as well as scalability on HPC platforms, we introduce a novel two-tiered communication architecture for managing and exploiting asynchronous communication loads. We also integrate our design with advanced parallel I/O techniques that operate directly on native simulation output. We demonstrate DStep by performing teleconnection analysis across ensemble runs of terascale atmospheric CO2 and climate data, and we show scalability results on up to 65,536 IBM BlueGene/P cores.
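
    The MapReduce-like programming model sketched above can be illustrated in miniature: a user supplies a per-block map function plus a reducer, and the framework handles traversal and message routing. This toy is single-process Python written purely to show the shape of the abstraction; it is not the DStep API.

    ```python
    from collections import defaultdict

    def dstep_run(blocks, dstep, reduce_fn):
        """Toy domain-traversal driver: `dstep` maps one domain block to
        (key, value) messages; values sharing a key are combined by `reduce_fn`."""
        buckets = defaultdict(list)
        for block in blocks:                       # traversal (serial here)
            for key, value in dstep(block):
                buckets[key].append(value)         # message routing
        return {k: reduce_fn(vs) for k, vs in buckets.items()}

    # Example: mean CO2 per latitude band across domain blocks (made-up data).
    blocks = [{"lat_band": 0, "co2": [390.1, 391.0]},
              {"lat_band": 1, "co2": [388.2]},
              {"lat_band": 0, "co2": [392.3]}]
    result = dstep_run(blocks,
                       dstep=lambda b: [(b["lat_band"], v) for v in b["co2"]],
                       reduce_fn=lambda vs: sum(vs) / len(vs))
    print(result)   # {0: 391.13..., 1: 388.2}
    ```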

  11. THOR Fluxgate Magnetometer (MAG)

    NASA Astrophysics Data System (ADS)

    Nakamura, Rumi; Eastwood, Jonathan; Magnes, Werner; Valavanoglou, Aris; Carr, Christopher M.; O'Brien, Helen L.; Narita, Yasuhito; Delva, Magda; Chen, Christopher H. K.; Plaschke, Ferdinand; Soucek, Jan

    2016-04-01

    Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The goal of the Fluxgate Magnetometer (MAG) is to measure the DC to low frequency ambient magnetic field. The design of the magnetometer consists of two tri-axial sensors and the related magnetometer electronics; the electronics are hosted on printed circuit boards in the common electronics box of the fields and wave processor (FWP). A fully redundant two-sensor system mounted on a common boom and the new miniaturized low noise design based on MMS and Solar Orbiter instruments enable accurate measurement throughout the region of interest for THOR science. The use of the common electronics hosted by FWP guarantees the required timing accuracy relative to the other fields measurements. These improvements are important for obtaining precise measurements of the magnetic field, which are essential to estimating basic plasma parameters and correctly identifying the spatial and temporal scales of the turbulence. Furthermore, THOR MAG provides high quality data with sufficient overlap with the Search Coil Magnetometer (SCM) in frequency space to obtain full coverage of the waveforms over all the frequencies necessary to recover the full solar wind turbulence spectrum from the MHD to the kinetic range with sufficient accuracy.

  12. Terascale Computing in Accelerator Science and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, Kwok

    2002-08-21

    We have entered the age of ''terascale'' scientific computing. Processors and system architecture both continue to evolve; hundred-teraFLOP computers are expected in the next few years, and petaFLOP computers toward the end of this decade are conceivable. This ever-increasing power to solve previously intractable numerical problems benefits almost every field of science and engineering and is revolutionizing some of them, notably including accelerator physics and technology. At existing accelerators, it will help us optimize performance, expand operational parameter envelopes, and increase reliability. Design decisions for next-generation machines will be informed by unprecedented comprehensive and accurate modeling, as well as computer-aided engineering; all this will increase the likelihood that even their most advanced subsystems can be commissioned on time, within budget, and up to specifications. Advanced computing is also vital to developing new means of acceleration and exploring the behavior of beams under extreme conditions. With continued progress it will someday become reasonable to speak of a complete numerical model of all phenomena important to a particular accelerator.

  13. The Development of the Non-hydrostatic Unified Model of the Atmosphere (NUMA)

    DTIC Science & Technology

    2011-09-19

    capabilities: 1. Highly scalable on current and future computer architectures (exascale computing: this means CPUs and GPUs); 2. Flexibility to use a... From Terascale to Petascale/Exascale Computing: 10 of the Top 500 are already in the petascale range; 3 of the top 10 are GPU-based machines.

  14. Additions and improvements to the high energy density physics capabilities in the FLASH code

    NASA Astrophysics Data System (ADS)

    Lamb, D.; Bogale, A.; Feister, S.; Flocke, N.; Graziani, C.; Khiar, B.; Laune, J.; Tzeferacos, P.; Walker, C.; Weide, K.

    2017-10-01

    FLASH is an open-source, finite-volume Eulerian, spatially-adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities exist in FLASH, which make it a powerful open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. We describe several non-ideal MHD capabilities that are being added to FLASH, including the Hall and Nernst effects, implicit resistivity, and a circuit model, which will allow modeling of Z-pinch experiments. We showcase the ability of FLASH to simulate Thomson scattering polarimetry, which measures Faraday rotation due to the presence of magnetic fields, as well as proton radiography, proton self-emission, and Thomson scattering diagnostics. Finally, we describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at U. Chicago by DOE NNSA ASC through the Argonne Institute for Computing in Science under FWP 57789; DOE NNSA under NLUF Grant DE-NA0002724; DOE SC OFES Grant DE-SC0016566; and NSF Grant PHY-1619573.
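
    For reference, the polarimetry diagnostic mentioned above is sensitive to Faraday rotation, whose standard dependence on plasma conditions is (textbook proportionality, not a formula from this abstract):

    ```latex
    % Faraday rotation of a linearly polarized probe through a magnetized
    % plasma (cold-plasma textbook scaling):
    \begin{equation}
      \Delta\phi_{\rm Faraday} \;\propto\; \lambda^2 \int n_e\, B_\parallel \, dl ,
    \end{equation}
    % i.e., proportional to the probe wavelength squared and to the path
    % integral of electron density times the field component along the
    % line of sight.
    ```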

  15. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  16. Mining Tera-Scale Graphs: Theory, Engineering and Discoveries

    DTIC Science & Technology

    2012-05-01

    ...of them are domain-selling or porn sites which are replicated from templates. The slope of the size distribution does not change after year 2003. We... [table fragment: concept/noun-phrase examples, e.g., Concept 3: "Health System" with pairs such as health provider 'np1' 'care' 'np2', child providers 'np' 'insurance' 'np2', home system 'np1'...]

  17. Solid-density plasma expansion in intense ultra-short laser irradiation measured on nanometer scale and in real time

    NASA Astrophysics Data System (ADS)

    Kluge, T.; Metzkes, J.; Pelka, A.; Laso Garcia, A.; Prencipe, I.; Bussmann, M.; Zeil, K.; Schoenherr, T.; Hartley, N.; Gutt, C.; Galtier, E.; Nam, I.; Lee, Hj; McBride, Ee; Glenzer, S.; Huebner, U.; Roedel, C.; Nakatsutsumi, M.; Roedel, M.; Rehwald, M.; Garten, M.; Zacharias, M.; Schramm, U.; Cowan, T. E.

    2017-10-01

    Small-angle X-ray scattering (SAXS) has been discussed as allowing unprecedented direct measurements, limited only by the probe X-ray wavelength and duration. Here we present the first direct in-situ measurement of an intense short-pulse laser-solid interaction that allows nanometer and high temporal resolution at the same time. A 120 fs laser pulse with an energy of 1 J was focused on a silicon membrane. The density was probed by SAXS with an X-ray beam of 49 fs duration. Despite prepulses, we can exclude premature bulk expansion. The plasma expansion is triggered only shortly before the main pulse, when an expansion of 10 nm within less than 200 fs was measured. Analysis of the scattering patterns allows the first direct verification of numerical simulations. Supported by DOE FWP 100182, SF00515; EC FP7 LASERLAB-EUROPE/CHARPAC (contract 284464); German Federal Ministry of Education and Research (BMBF) under Contract Number 03Z1O511; MG and MZ supported by the European Union's Horizon 2020 No 654220.

  18. Visualization of the ultrafast structural phase transitions in warm dense matter

    NASA Astrophysics Data System (ADS)

    Mo, Mianzhen

    2017-10-01

    It is still a great challenge to obtain real-time atomistic-scale information on the structural phase transitions that lead to the warm dense matter state. Recent advances in ultrafast electron diffraction (UED) techniques have opened up exciting prospects to unravel the mechanisms of solid-liquid phase transitions under these extreme non-equilibrium conditions. Here we report on precise measurements of the dependence of melt time on laser excitation energy density that resolve for the first time the transition from heterogeneous to homogeneous melting. This transition appears in both polycrystalline and single-crystal gold nanofilms with distinct measurable differences. These results test predictions from molecular-dynamics simulations with different interatomic potential models. These data further deliver accurate structure factor data to large wavenumbers that allow us to constrain electron-ion equilibration constants. Our results demonstrate an electron-phonon coupling strength much weaker than DFT calculations predict and, contrary to previous results, provide evidence for bond softening. This work is supported by DOE Office of Science, Fusion Energy Science under FWP 100182, and the DOE BES Accelerator and Detector R&D program.

  19. Terascale Visualization: Multi-resolution Aspirin for Big-Data Headaches

    NASA Astrophysics Data System (ADS)

    Duchaineau, Mark

    2001-06-01

    Recent experience on the Accelerated Strategic Computing Initiative (ASCI) computers shows that computational physicists are successfully producing a prodigious collection of numbers on several thousand processors. But with this wealth of numbers comes an unprecedented difficulty in processing and moving them to provide useful insight and analysis. In this talk, a few simulations are highlighted where recent advancements in multiple-resolution mathematical representations and algorithms have provided some hope of seeing most of the physics of interest while keeping within the practical limits of the post-simulation storage and interactive data-exploration resources. A whole host of visualization research activities was spawned by the 1999 Gordon Bell Prize-winning computation of a shock-tube experiment showing Richtmyer-Meshkov turbulent instabilities. This includes efforts for the entire data pipeline from running simulation to interactive display: wavelet compression of field data, multi-resolution volume rendering and slice planes, out-of-core extraction and simplification of mixing-interface surfaces, shrink-wrapping to semi-regularize the surfaces, semi-structured surface wavelet compression, and view-dependent display-mesh optimization. More recently on the 12 TeraOps ASCI platform, initial results from a 5120-processor, billion-atom molecular dynamics simulation showed that 30-to-1 reductions in storage size can be achieved with no human-observable errors for the analysis required in simulations of supersonic crack propagation. This made it possible to store the 25 trillion bytes' worth of simulation numbers in the available storage, which was under 1 trillion bytes. While multi-resolution methods and related systems are still in their infancy, for the largest-scale simulations there is often no other choice should the science require detailed exploration of the results.
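
    The wavelet-compression idea behind those 30-to-1 reductions can be shown in miniature with a single-level Haar transform and hard thresholding; this is a toy numpy sketch on a synthetic field, not the ASCI pipeline's encoder.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    signal = np.cumsum(rng.normal(size=4096))        # smooth-ish synthetic 1D field

    # Single-level Haar transform: pairwise averages (coarse) and differences (detail)
    avg = (signal[0::2] + signal[1::2]) / np.sqrt(2)
    det = (signal[0::2] - signal[1::2]) / np.sqrt(2)

    # Hard-threshold the detail coefficients and count what survives
    thresh = 0.5
    det_c = np.where(np.abs(det) > thresh, det, 0.0)
    kept = np.count_nonzero(det_c) + avg.size
    print(f"compression ratio ~ {signal.size / kept:.1f}:1")

    # Inverse transform to check the reconstruction error stays bounded
    recon = np.empty_like(signal)
    recon[0::2] = (avg + det_c) / np.sqrt(2)
    recon[1::2] = (avg - det_c) / np.sqrt(2)
    print(f"max abs error: {np.max(np.abs(recon - signal)):.3f}")
    ```

    Real multi-resolution codecs recurse this decomposition over many levels and in three dimensions, which is where ratios far beyond a single level's 2:1 bound become possible.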

  20. Association Study Reveals Novel Genes Related to Yield and Quality of Fruit in Cape Gooseberry (Physalis peruviana L.).

    PubMed

    García-Arias, Francy L; Osorio-Guarín, Jaime A; Núñez Zarantes, Victor M

    2018-01-01

    Association mapping has been proposed as an efficient approach to assist plant breeding programs in investigating the genetic basis of agronomic traits. In this study, we evaluated 18 traits related to yield (FWP, NF, FWI, and FWII), fruit size-shape (FP, FA, MW, WMH, MH, HMW, DI, FSI, FSII, OVO, OBO), and fruit quality (FIR, CF, and SST), in a diverse collection of 100 accessions of Physalis peruviana including wild, landrace, and anther culture derived lines. We identified seven accessions with suitable traits: fruit weight per plant (FWP) > 7,000 g/plant and cracked fruits (CF) < 4%, to be used as parents in the cape gooseberry breeding program. In addition, the accessions were characterized using Genotyping By Sequencing (GBS). We discovered 27,982 and 36,142 informative SNP markers based on alignment against the two cape gooseberry reference transcriptomes. Furthermore, 30,344 SNPs were identified based on alignment to the tomato reference genome. Genetic structure analysis showed that the population could be divided into two or three sub-groups, corresponding to landraces-anther culture and wild accessions for K = 2 and wild, landraces, and anther culture plants for K = 3. Association analysis was carried out using a Mixed Linear Model (MLM) and 34 SNP markers were significantly associated. These results reveal the basis of the genetic control of important agronomic traits and may facilitate marker-based breeding in P. peruviana.

  1. Association Study Reveals Novel Genes Related to Yield and Quality of Fruit in Cape Gooseberry (Physalis peruviana L.)

    PubMed Central

    García-Arias, Francy L.; Osorio-Guarín, Jaime A.; Núñez Zarantes, Victor M.

    2018-01-01

    Association mapping has been proposed as an efficient approach to assist plant breeding programs in investigating the genetic basis of agronomic traits. In this study, we evaluated 18 traits related to yield (FWP, NF, FWI, and FWII), fruit size-shape (FP, FA, MW, WMH, MH, HMW, DI, FSI, FSII, OVO, OBO), and fruit quality (FIR, CF, and SST), in a diverse collection of 100 accessions of Physalis peruviana including wild, landrace, and anther culture derived lines. We identified seven accessions with suitable traits: fruit weight per plant (FWP) > 7,000 g/plant and cracked fruits (CF) < 4%, to be used as parents in the cape gooseberry breeding program. In addition, the accessions were characterized using Genotyping By Sequencing (GBS). We discovered 27,982 and 36,142 informative SNP markers based on alignment against the two cape gooseberry reference transcriptomes. Furthermore, 30,344 SNPs were identified based on alignment to the tomato reference genome. Genetic structure analysis showed that the population could be divided into two or three sub-groups, corresponding to landraces-anther culture and wild accessions for K = 2 and wild, landraces, and anther culture plants for K = 3. Association analysis was carried out using a Mixed Linear Model (MLM) and 34 SNP markers were significantly associated. These results reveal the basis of the genetic control of important agronomic traits and may facilitate marker-based breeding in P. peruviana. PMID:29616069
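
    The association scan described above tests each SNP while correcting for population structure; a miniature fixed-effects approximation on synthetic data illustrates the shape of the computation (a full MLM would also include a kinship random effect; all sizes and effect values below are made up):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n, n_snp = 100, 500                        # accessions, markers (synthetic scale)
    geno = rng.integers(0, 3, size=(n, n_snp)).astype(float)  # 0/1/2 allele counts
    fwp = 5000 + 800 * geno[:, 10] + rng.normal(0, 500, n)    # SNP 10 truly associated

    # Structure covariates: top principal components of the genotype matrix
    g_c = geno - geno.mean(axis=0)
    pcs = np.linalg.svd(g_c, full_matrices=False)[0][:, :3]

    tstats = np.empty(n_snp)
    for j in range(n_snp):
        X = np.column_stack([np.ones(n), pcs, geno[:, j]])   # intercept, PCs, SNP
        beta = np.linalg.lstsq(X, fwp, rcond=None)[0]
        resid = fwp - X @ beta
        sigma2 = resid @ resid / (n - X.shape[1])
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[-1, -1])
        tstats[j] = beta[-1] / se                            # t-statistic of the SNP

    print("strongest association at SNP", np.argmax(np.abs(tstats)))  # expect 10
    ```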

  2. Polymeric Materials Models in the Warrior Injury Assessment Manikin (WIAMan) Anthropomorphic Test Device (ATD) Tech Demonstrator

    DTIC Science & Technology

    2017-01-01

    ...are the shear relaxation moduli and relaxation times, which make up the classical Prony series. A Prony-series expansion is a relaxation function...approximation for modeling time-dependent damping. The scalar parameters 1 and 2 control the nonlinearity of the Prony series. Under the...Velodyne that best fit the experimental stress-strain data. To do so, the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA...

  3. SHEDDING NEW LIGHT ON EXPLODING STARS: TERASCALE SIMULATIONS OF NEUTRINO-DRIVEN SUPERNOVAE AND THEIR NUCLEOSYNTHESIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haxton, Wick

    2012-03-07

    This project was focused on simulations of core-collapse supernovae on parallel platforms. The intent was to address a number of linked issues: the treatment of hydrodynamics and neutrino diffusion in two and three dimensions; the treatment of the underlying nuclear microphysics that governs neutrino transport and neutrino energy deposition; the understanding of the associated nucleosynthesis, including the r-process and neutrino process; the investigation of the consequences of new neutrino phenomena, such as oscillations; and the characterization of the neutrino signal that might be recorded in terrestrial detectors. This was a collaborative effort with Oak Ridge National Laboratory, State University of New York at Stony Brook, University of Illinois at Urbana-Champaign, University of California at San Diego, University of Tennessee at Knoxville, Florida Atlantic University, North Carolina State University, and Clemson. The collaborations tie together experts in hydrodynamics, nuclear physics, computer science, and neutrino physics. The University of Washington contributions to this effort include the further development of techniques to solve the Bloch-Horowitz equation for effective interactions and operators; collaborative efforts on developing a parallel Lanczos code; investigating the nuclear and neutrino physics governing the r-process and neutrino physics; and exploring the effects of new neutrino physics on the explosion mechanism, nucleosynthesis, and terrestrial supernova neutrino detection.

  4. A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data

    PubMed Central

    Venkat, A.; Christensen, C.; Gyulassy, A.; Summa, B.; Federer, F.; Angelucci, A.; Pascucci, V.

    2017-01-01

    The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large scale panoramic images. The platform is organized around a hierarchical cache oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the ViSUS framework used in an interactive setting with the microscopy data. PMID:28638896
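
    The bit-interleaved ordering behind hierarchical cache-oblivious layouts such as IDX can be sketched in a few lines of Python; the following is a generic 3D Morton (Z-order) key, offered as an illustration and not the ViSUS implementation itself:

        # Spread the low 10 bits of x so they occupy every third bit.
        def part1by2(x: int) -> int:
            x &= 0x3FF
            x = (x | (x << 16)) & 0xFF0000FF
            x = (x | (x << 8))  & 0x0300F00F
            x = (x | (x << 4))  & 0x030C30C3
            x = (x | (x << 2))  & 0x09249249
            return x

        # Interleave three 10-bit voxel coordinates into one 30-bit key,
        # so samples that are close in 3D stay close on disk.
        def morton3(i: int, j: int, k: int) -> int:
            return part1by2(i) | (part1by2(j) << 1) | (part1by2(k) << 2)

    Sorting samples by such a key is what lets a viewer stream coarse-to-fine subsets of a volume without scattered disk seeks.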

  5. A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data.

    PubMed

    Venkat, A; Christensen, C; Gyulassy, A; Summa, B; Federer, F; Angelucci, A; Pascucci, V

    2016-08-01

    The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large scale panoramic images. The platform is organized around a hierarchical cache oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the ViSUS framework used in an interactive setting with the microscopy data.

  6. Cooperative high-performance storage in the accelerated strategic computing initiative

    NASA Technical Reports Server (NTRS)

    Gary, Mark; Howard, Barry; Louis, Steve; Minuzzo, Kim; Seager, Mark

    1996-01-01

    The use and acceptance of new high-performance, parallel computing platforms will be impeded by the absence of an infrastructure capable of supporting orders-of-magnitude improvement in hierarchical storage and high-speed I/O (Input/Output). The distribution of these high-performance platforms and supporting infrastructures across a wide-area network further compounds this problem. We describe an architectural design and phased implementation plan for a distributed, Cooperative Storage Environment (CSE) to achieve the necessary performance, user transparency, site autonomy, communication, and security features needed to support the Accelerated Strategic Computing Initiative (ASCI). ASCI is a Department of Energy (DOE) program attempting to apply terascale platforms and Problem-Solving Environments (PSEs) toward real-world computational modeling and simulation problems. The ASCI mission must be carried out through a unified, multilaboratory effort, and will require highly secure, efficient access to vast amounts of data. The CSE provides a logically simple, geographically distributed storage infrastructure of semi-autonomous cooperating sites to meet the strategic ASCI PSE goal of high-performance data storage and access at the user desktop.

  7. THOR Fluxgate Magnetometer (MAG)

    NASA Astrophysics Data System (ADS)

    Nakamura, Rumi; Eastwood, Jonathan; Magnes, Werner; Carr, Christopher M.; O'Brien, Helen L.; Narita, Yasuhito; Chen, Christopher H. K.; Berghofer, Gerhard; Valavanoglou, Aris; Delva, Magda; Plaschke, Ferdinand; Cupido, Emanuele; Soucek, Jan

    2017-04-01

    Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The fluxgate Magnetometer (MAG) measures the background to low-frequency magnetic field. MAG's high-sensitivity measurements make it possible to characterize the nature of turbulent fluctuations as well as the large-scale context. MAG will provide the reference system for determining the anisotropy of field fluctuations and the pitch angle and gyro-phase of particles. The design of the magnetometer consists of two tri-axial sensors and the related magnetometer electronics; the electronics are hosted on printed circuit boards in the common electronics box of the fields and wave processor (FWP). A fully redundant two-sensor system mounted on a common boom and the new miniaturized low-noise design based on MMS and Solar Orbiter instruments enable accurate measurement throughout the region of interest for THOR science. The use of the common electronics hosted by FWP guarantees the required timing accuracy relative to the other fields measurements. These improvements are important to obtain precise measurements of the magnetic field, which is essential to estimate basic plasma parameters and correctly identify the spatial and temporal scales of the turbulence. Furthermore, THOR MAG provides high-quality data with sufficient overlap with the Search Coil Magnetometer (SCM) in frequency space to obtain full coverage of the waveforms over all the frequencies necessary to recover the full solar wind turbulence spectrum from the MHD to the kinetic range with sufficient accuracy. We discuss the role of MAG in THOR key science questions and present the developments made during Phase A, such as the finalised instrument design, MAG-relevant requirements, and new calibration schemes.

  8. Assessment of the Accountability and Control of Arms, Ammunition, and Explosives (AA&E) Provided to the Security Forces of Afghanistan

    DTIC Science & Technology

    2009-09-11

    Corps to verify weapons and ammunition accountability. It conducted a serial number inventory of 67 weapons, including ten M16 rifles, 40 AK47...11. Mossberg M590A1 pump 12 gauge Former Warsaw Pact Weapons (FWP) 1. AK47 rifle and variants (AK47, AK74, AMD65, VZ58, AKMS) 2. RPK light machine...recorded as EL 4910 AK47: 30 counted o Serial No. IA0499 recorded as IA0489 VZ58: 90 counted o 2 weapons were in the wrong boxes

  9. Uncertainties in the Antarctic Ice Sheet Contribution to Sea Level Rise: Exploration of Model Response to Errors in Climate Forcing, Boundary Conditions, and Internal Parameters

    NASA Astrophysics Data System (ADS)

    Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.

    2017-12-01

    The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite-element software package used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusions about which parameters and/or forcings have the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites Glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  10. Shedding New Light on Exploding Stars: Terascale Simulations of Neutrino-Driven Supernovae and Their Nucleosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis C. Smolarski, S.J.

    This project was a continuation of work begun under a subcontract issued under TSI-DOE Grant 1528746, awarded to the University of Illinois Urbana-Champaign. Dr. Anthony Mezzacappa is the Principal Investigator on the Illinois award. A separate award was issued to Santa Clara University to continue the collaboration during the period May 2003-2004. Smolarski continued to work on preconditioner technology and its interface with various iterative methods. He worked primarily with F. Doug Swesty (SUNY-Stony Brook) in continuing software development started in the 2002-03 academic year. Special attention was paid to the development and testing of different sparse approximate inverse preconditioners and their use in the solution of linear systems arising from radiation transport equations. The target was a high performance platform on which efficient implementation is a critical component of the overall effort. Smolarski also focused on the integration of the adaptive iterative algorithm, Chebycode, developed by Tom Manteuffel and Steve Ashby and adapted by Ryan Szypowski for parallel platforms, into the radiation transport code being developed at SUNY-Stony Brook.
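
    As an illustration of the kind of preconditioned iterative solve described here, the Python sketch below applies a diagonal (Jacobi) preconditioner, the crudest sparse approximate inverse, to conjugate gradients on a model system; the project's SPAI preconditioners instead build a sparse approximation M minimizing ||I - MA||_F. Matrix and names are stand-ins:

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import cg, LinearOperator

        n = 1000
        # 1D Laplacian as a stand-in for a radiation-transport matrix
        A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csr')
        b = np.ones(n)

        # Jacobi preconditioner: apply the inverse of diag(A)
        d = A.diagonal()
        M = LinearOperator((n, n), matvec=lambda r: r / d)

        x, info = cg(A, b, M=M)
        print('converged' if info == 0 else f'cg stopped with info={info}')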

  11. Ultrafast-electron-diffraction studies of predamaged tungsten excited by femtosecond optical pulses

    NASA Astrophysics Data System (ADS)

    Mo, M.; Chen, Z.; Li, R.; Wang, Y.; Shen, X.; Dunning, M.; Weathersby, S.; Makasyuk, I.; Coffee, R.; Zhen, Q.; Kim, J.; Reid, A.; Jobe, K.; Hast, C.; Tsui, Y.; Wang, X.; Glenzer, S.

    2016-10-01

    Tungsten is considered the main candidate material for use in the divertor of magnetic confinement fusion reactors. However, radiation damage is expected to occur because of its direct exposure to the high flux of hot plasma and energetic neutrons in the fusion environment. Hence, understanding the behavior of W under these adverse conditions is central to the design of magnetic fusion reactors. To this end, we have recently developed an MeV ultrafast electron diffraction probe to resolve the structural evolution of optically excited tungsten. To simulate the radiation damage effect, the tungsten samples were bombarded with 500 keV Cu ions. The pre-damaged and pristine W samples were excited by 130 fs, 400 nm laser pulses, and the subsequently heated system was probed with 3.2 MeV electrons. The pump-probe measurements show that ion bombardment of the W leads to a larger decay in Bragg peak intensities compared to pristine W, which may be due to a phonon softening effect. The measurements also show that pre-damaged W transitions into a complete liquid phase under conditions where pristine W remains solid. Our new capability is able to test theories of the structural dynamics of W under conditions relevant to the fusion reactor environment. The research was funded by DOE Fusion Energy Science under FWP #100182.

  12. Characterization of >100 T magnetic fields associated with relativistic Weibel instability in laser-produced plasmas

    NASA Astrophysics Data System (ADS)

    Mishra, Rohini; Ruyer, Charles; Goede, Sebastian; Roedel, Christian; Gauthier, Maxence; Zeil, Karl; Schramm, Ulrich; Glenzer, Siegfried; Fiuza, Frederico

    2016-10-01

    Weibel-type instabilities can occur in weakly magnetized and anisotropic plasmas of relevance to a wide range of astrophysical and laboratory scenarios. They lead to the conversion of a significant fraction of the kinetic energy of the plasma into magnetic energy. We will present a detailed numerical study, using 2D and 3D PIC simulations, of the Weibel instability in relativistic laser-solid interactions. In this case, the instability develops due to the counter-streaming of laser-heated electrons and the background return current. We show that the growth rate of the instability is maximized near the critical-density region on the rear side of the expanded plasma, producing up to 400 MG magnetic fields in hydrogen plasmas. We have found that this strong field can be directly probed by energetic protons accelerated on the rear side of the plasma by Target Normal Sheath Acceleration (TNSA). This allows the experimental characterization of the instability from the analysis of the spatial modulation of the detected protons. Our numerical results are compared with recent laser experiments with hydrogen jets and show good agreement with the proton modulations observed experimentally. This work was supported by the DOE Office of Science, Fusion Energy Science (FWP 100182).

  13. Separating Added Value from Hype: Some Experiences and Prognostications

    NASA Astrophysics Data System (ADS)

    Reed, Dan

    2004-03-01

    These are exciting times for the interplay of science and computing technology. As new data archives, instruments and computing facilities are connected nationally and internationally, a new model of distributed scientific collaboration is emerging. However, any new technology brings both opportunities and challenges -- Grids are no exception. In this talk, we will discuss some of the experiences deploying Grid software in production environments, illustrated with examples from the NSF PACI Alliance, the NSF Extensible Terascale Facility (ETF) and other Grid projects. From these experiences, we derive some guidelines for deployment and some suggestions for community engagement, software development and infrastructure.

  14. Lessons Learned from Managing a Petabyte

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, J

    2005-01-20

    The amount of data collected and stored by the average business doubles each year. Many commercial databases are already approaching hundreds of terabytes, and at this rate, will soon be managing petabytes. More data enables new functionality and capability, but the larger scale reveals new problems and issues hidden in "smaller" terascale environments. This paper presents some of these new problems along with implemented solutions in the framework of a petabyte dataset for a large High Energy Physics experiment. Through experience with two persistence technologies, a commercial database and a file-based approach, we expose format-independent concepts and issues prevalent at this new scale of computing.

  15. Channel-shoal morphodynamics in response to distinct hydrodynamic drivers at the outer Weser estuary

    NASA Astrophysics Data System (ADS)

    Herrling, Gerald; Benninghoff, Markus; Zorndt, Anna; Winter, Christian

    2017-04-01

    The interaction of tidal, wave and wind forces primarily governs the morphodynamics of intertidal channel-shoal systems. Typical morphological changes comprise tidal channel meandering and/or migration with related shoal erosion or accretion. These intertidal flat systems are likely to respond to accelerated sea level rise and to potential changes in storm frequency and direction. The aim of the ongoing research project is an evaluation of outer estuarine channel-shoal dynamics by combining the analysis of morphological monitoring data with high-resolution morphodynamic modelling. A focus is set on their evolution in reaction to different hydrodynamic forcings such as tides, wind-driven currents, waves under fair-weather and high-energy conditions, and variable upstream discharges. As an example, the Outer Weser region was chosen, and a tidal channel system serves as a reference site: the availability of almost annual bathymetric observations of an approximately 10 km long tidal channel (Fedderwarder Priel, FWP), together with morphological development that is largely independent of maintenance dredging of the main Weser navigation channel, makes this tributary an ideal study area. The numerical modelling system Delft3D (Deltares) is applied to run real-time annual scenario simulations aiming to evaluate and differentiate the morphological responses to distinct hydrodynamic drivers. A comprehensive morphological analysis of available observations at the FWP showed that the channel migration trends and directions are persistent at particular channel bends and meanders over the considered period of 14 years. Migration trends and directions are well reproduced by one-year model simulations. Morphodynamic modelling is applied to interpolate between observations and relate sediment dynamics to different forcing scenarios in the outer Weser estuary as a whole and at the scale of local tributary channels and flats.

  16. Issues and opportunities: beam simulations for heavy ion fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, A

    1999-07-15

    UCRL-JC-134975 PREPRINT ... code offering 3-D, axisymmetric, and "transverse slice" (steady flow) geometries, with a hierarchy of models for the "lattice" of focusing, bending, and accelerating elements. Interactive and script-driven code steering is afforded through an interpreter interface. The code runs with good parallel scaling on the T3E. Detailed simulations of machine segments and of complete small experiments, as well as simplified full-system runs, have been carried out, partially benchmarking the code. A magnetoinductive model, with module impedance and multi-beam effects, is under study. ... experiments, including an injector scalable to multi-beam arrays, a high-current beam transport and acceleration experiment, and a scaled final-focusing experiment. These "phase I" projects are laying the groundwork for the next major step in HIF development, the Integrated Research Experiment (IRE). Simulations aimed directly at the IRE must enable us to: design a facility with maximum power on target at minimal cost; set requirements for hardware tolerances, beam steering, etc.; and evaluate proposed chamber propagation modes. Finally, simulations must enable us to study all issues which arise in the context of a fusion driver, and must facilitate the assessment of driver options. In all of this, maximum advantage must be taken of emerging terascale computer architectures, requiring an aggressive code development effort. An organizing principle should be pursuit of the goal of integrated and detailed source-to-target simulation. ... methods for analysis of the beam dynamics in the various machine concepts, using moment-based methods for purposes of design, waveform synthesis, steering algorithm synthesis, etc. Three classes of discrete-particle models should be coupled: (1) electrostatic/magnetoinductive PIC simulations should track the beams from the source through the final-focusing optics, passing details of the time-dependent distribution function to (2) electromagnetic or magnetoinductive PIC or hybrid PIC/fluid simulations in the fusion chamber (which would finally pass their particle trajectory information to the radiation-hydrodynamics codes used for target design); in parallel, (3) detailed PIC, delta-f, core/test-particle, and perhaps continuum Vlasov codes should be used to study individual sections of the driver and chamber very carefully; consistency may be assured by linking data from the PIC sequence, and knowledge gained may feed back into that sequence.
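
    The electrostatic PIC machinery this program builds on can be illustrated with a minimal 1D periodic cycle (deposit, field solve, gather, push). The Python below is a teaching sketch of a normalized two-stream setup, not one of the HIF codes described above:

        import numpy as np

        ng, npart = 64, 20000              # grid cells, macro-particles
        L = 2 * np.pi                      # periodic box length
        dx, dt, nsteps = L / ng, 0.1, 200
        qm = -1.0                          # electron charge-to-mass ratio
        weight = L / npart                 # weight so mean density = 1

        rng = np.random.default_rng(0)
        x = rng.uniform(0, L, npart)
        v = np.where(np.arange(npart) % 2 == 0, 1.0, -1.0)  # two cold beams
        v += 0.01 * rng.standard_normal(npart)              # seed noise

        k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
        k[0] = 1.0                         # placeholder; zero mode removed below

        for step in range(nsteps):
            # 1) deposit charge with cloud-in-cell (linear) weighting
            g = x / dx
            i0 = np.floor(g).astype(int) % ng
            f = g - np.floor(g)
            ne = (np.bincount(i0, weights=1 - f, minlength=ng)
                  + np.bincount((i0 + 1) % ng, weights=f, minlength=ng)) / dx
            rho = -weight * ne + 1.0       # electrons + uniform ion background

            # 2) solve Poisson in k-space: ik * E_k = rho_k  (eps0 = 1)
            E_k = np.fft.fft(rho) / (1j * k)
            E_k[0] = 0.0
            E = np.fft.ifft(E_k).real

            # 3) gather field at particles and leapfrog push
            Ep = E[i0] * (1 - f) + E[(i0 + 1) % ng] * f
            v += qm * Ep * dt
            x = (x + v * dt) % L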

  17. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.
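
    The core of the sampling-based UQ loop that Dakota automates can be written out directly. The Python below draws a small Latin hypercube sample over bounded inputs and summarizes the spread of a stand-in model; everything here is hypothetical (Dakota itself drives a real simulation through its interface layer):

        import numpy as np

        def latin_hypercube(nsamples, lower, upper, rng):
            """One stratified draw per interval slice, per dimension."""
            ndim = len(lower)
            strata = np.tile(np.arange(nsamples), (ndim, 1))
            u = (rng.permuted(strata, axis=1).T
                 + rng.random((nsamples, ndim))) / nsamples
            return lower + u * (upper - lower)

        def model(theta):
            """Stand-in forward solver; replace with the real simulation."""
            x1, x2 = theta
            return np.sin(x1) + 0.5 * x2**2

        rng = np.random.default_rng(2014)
        samples = latin_hypercube(200, np.array([0.0, -1.0]),
                                  np.array([np.pi, 1.0]), rng)
        responses = np.array([model(t) for t in samples])
        print(f"mean={responses.mean():.4f}  std={responses.std(ddof=1):.4f}")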

  18. Identifying parasitic and saprotrophic interactions of freshwater chytrids with a microalga

    NASA Astrophysics Data System (ADS)

    Ward, C.; Longcore, J. E.; Carney, L. T.; Mayali, X.; Pett-Ridge, J.; Thelen, M. P.; Stuart, R.

    2016-12-01

    Despite having long been regarded as ecologically insignificant, aquatic fungi may be key regulators of carbon cycling in phytoplankton-dominated freshwater ecosystems. For several decades, it has been known that, through infection, chytrids and other parasitic fungi can cause major declines in natural algal populations and the release of large quantities of organic matter into the water column. Additionally, as in other environments, fungi may be critically important in the decomposition of refractory organic matter, although to our knowledge this has never been investigated in pelagic freshwater ecosystems. We have a limited understanding of how fungi can interact with phytoplankton or phytoplankton-derived organic matter, and logistical difficulties complicate their study in the environment. Here, we have developed a model green alga-chytrid system to characterize the interactions under varying host physiologies and to investigate how these interactions influence the physiological and metabolic outcomes of both members. Chytrid infection was clearly linked to algal growth stage in the fungal isolate belonging to Rhizophydiales, with infectivity only in the late cyst stage, while the isolate belonging to Paraphysoderma could infect in both early and late cyst stages. To test whether freshwater chytrids can metabolize algal-derived organic matter, fungal isolates were grown axenically in algal spent media from different growth stages. The Rhizophydiales isolate grew on algal exudate from the early cyst stage, while the Paraphysoderma isolate grew on exudates from both growth stages. Ongoing work has focused on using biochemical and multi-omic approaches to study the mechanistic underpinnings of algal-fungal interactions and to better understand the factors contributing to growth stage- and strain-specific differences. Together, these findings suggest that fungi may play a dual role in regulating carbon cycling in freshwater ecosystems via parasitic and saprotrophic strategies. This research was supported by the U.S. DOE Office of Science through the Office of Biological and Environmental Research under FWP SCW1039 and the Office of Energy Efficiency and Renewable Energy under FWP 29886. Work was performed under the auspices of the U.S. Department of Energy under Contract DE-AC52-07NA27344.

  19. Studying astrophysical particle acceleration with laser-driven plasmas

    NASA Astrophysics Data System (ADS)

    Fiuza, Frederico

    2016-10-01

    The acceleration of non-thermal particles in plasmas is critical for our understanding of explosive astrophysical phenomena, from solar flares to gamma-ray bursts. Particle acceleration is thought to be mediated by collisionless shocks and magnetic reconnection. The microphysics underlying these processes and their ability to efficiently convert flow and magnetic energy into non-thermal particles, however, is not yet fully understood. By performing, for the first time, ab initio 3D particle-in-cell simulations of the interaction of both magnetized and unmagnetized laser-driven plasmas, it is now possible to identify the optimal parameters for the laboratory study of particle acceleration under conditions relevant to astrophysical scenarios. It is predicted for the Omega and NIF laser conditions that significant non-thermal acceleration can occur during magnetic reconnection of laser-driven magnetized plasmas. Electrons are accelerated by the electric field near the X-points and trapped in contracting magnetic islands. This leads to a power-law tail extending to nearly a hundred times the thermal energy of the plasma and containing a large fraction of the magnetic energy. The study of unmagnetized interpenetrating plasmas also reveals the possibility of forming collisionless shocks mediated by the Weibel instability on NIF. Under such conditions, both electrons and ions can be energized by scattering out of the Weibel-mediated turbulence. This also leads to power-law spectra that can be detected experimentally. The resulting experimental requirements to probe the microphysics of plasma particle acceleration will be discussed, paving the way for the first laboratory experiments on these important processes. As a result of these simulations and theoretical analysis, new experiments are being planned on the Omega, NIF, and LCLS laser facilities to test these theoretical predictions. This work was supported by the SLAC LDRD program and the DOE Office of Science, Fusion Energy Science (FWP 100182).

  20. Warm Dense Matter Demonstrating Non-Drude Conductivity from Observations of Nonlinear Plasmon Damping

    NASA Astrophysics Data System (ADS)

    Witte, Bastian B. L.

    2017-10-01

    The thermal and electrical conductivity, equation of state, and spectral opacity of warm dense matter (WDM) are essential properties for modeling, e.g., fusion experiments or magnetic field generation in planets. In the last decade it has been shown that x-ray Thomson scattering (XRTS) is an effective tool to determine plasma parameters such as temperature and density in the WDM regime. Recently, the electrical conductivity was extracted from XRTS experiments for the first time. The spectrally resolved scattering data of aluminum, isochorically heated by the Linac Coherent Light Source (LCLS), show a strong dependence on electron correlations. Therefore, the damping of plasmons, the collective electron oscillations, has to be treated beyond perturbation theory. We present results for the dynamic transport properties in warm dense aluminum using density-functional-theory molecular dynamics (DFT-MD) simulations. The choice of the exchange-correlation (XC) functional, describing the interactions in the electronic subsystem, has a significant impact on the ionization energy of bound electrons and the dynamic dielectric function. Our newly developed method for the calculation of XRTS signals including plasmon and bound-free transitions is based on transition matrix elements together with ionic contributions, using DFT-MD simulations alone. The results show excellent agreement with the LCLS data if hybrid functionals are applied. The experimentally observed nonlinear plasmon damping is caused by the non-Drude conductivity in warm dense aluminum. Here, we show further validation by comparing with x-ray absorption data. These findings enable new insights into the impact of XC functionals on calculated properties of WDM and allow detailed predictions for future experiments at the unprecedented densities on the NIF. This work was performed in collaboration with P. Sperling, S. H. Glenzer, and R. Redmer and was supported by the DFG via the Collaborative Research Center SFB 652 and the DOE Office of Science, Fusion Energy Science under Grant No. FWP 100182.

  1. A pervasive parallel framework for visualization: final report for FWP 10-014707

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.

    2014-01-01

    We are on the threshold of a transformative change in the basic architecture of high-performance computing. The use of accelerator processors, characterized by large core counts, shared but asymmetrical memory, and heavy thread loading, is quickly becoming the norm in high performance computing. These accelerators represent significant challenges in updating our existing base of software. An intrinsic problem with this transition is a fundamental programming shift from message-passing processes to much finer-grained thread scheduling with memory sharing. Another problem is the lack of stability in accelerator implementation; processor and compiler technology is currently changing rapidly. This report documents the results of our three-year ASCR project to address these challenges. Our project includes the development of the Dax toolkit, which contains the beginnings of new algorithms for a new generation of computers and the underlying infrastructure to rapidly prototype and build further algorithms as necessary.

  2. From genome to drug lead: identification of a small-molecule inhibitor of the SARS virus.

    PubMed

    Dooley, Andrea J; Shindo, Nice; Taggart, Barbara; Park, Jewn-Giew; Pang, Yuan-Ping

    2006-02-15

    Virtual screening, a fast, computational approach to identify drug leads [Perola, E.; Xu, K.; Kollmeyer, T. M.; Kaufmann, S. H.; Prendergast, F. G. J. Med. Chem. 2000, 43, 401; Miller, M. A. Nat. Rev. Drug Disc. 2002, 1, 220], is limited by a known challenge in crystallographically determining flexible regions of proteins. This approach has not been able to identify active inhibitors of the severe acute respiratory syndrome-associated coronavirus (SARS-CoV) using solely the crystal structures of a SARS-CoV cysteine proteinase with a flexible loop in the active site [Yang, H. T.; Yang, M. J.; Ding, Y.; Liu, Y. W.; Lou, Z. Y. Proc. Natl. Acad. Sci. U.S.A. 2003, 100, 13190; Jenwitheesuk, E.; Samudrala, R. Bioorg. Med. Chem. Lett. 2003, 13, 3989; Rajnarayanan, R. V.; Dakshanamurthy, S.; Pattabiraman, N. Biochem. Biophys. Res. Commun. 2004, 321, 370; Du, Q.; Wang, S.; Wei, D.; Sirois, S.; Chou, K. Anal. Biochem. 2005, 337, 262; Du, Q.; Wang, S.; Zhu, Y.; Wei, D.; Guo, H. Peptides 2004, 25, 1857; Lee, V.; Wittayanarakul, K.; Remsungenen, T.; Parasuk, V.; Sompornpisut, P. Science (Asia) 2003, 29, 181; Toney, J.; Navas-Martin, S.; Weiss, S.; Koeller, A. J. Med. Chem. 2004, 47, 1079; Zhang, X. W.; Yap, Y. L. Bioorg. Med. Chem. 2004, 12, 2517]. This article demonstrates a genome-to-drug-lead approach that uses terascale computing to model flexible regions of proteins, thus permitting the utilization of genetic information to identify drug leads expeditiously. A small-molecule inhibitor of SARS-CoV, exhibiting an effective concentration (EC50) of 23 microM in cell-based assays, was identified through virtual screening against a computer-predicted model of the cysteine proteinase. Screening against two crystal structures of the same proteinase failed to identify the 23-microM inhibitor. This study suggests that terascale computing can complement crystallography, broaden the scope of virtual screening, and accelerate the development of therapeutics to treat emerging infectious diseases such as SARS and Bird Flu.

  3. Seasonal-Scale Optimization of Conventional Hydropower Operations in the Upper Colorado System

    NASA Astrophysics Data System (ADS)

    Bier, A.; Villa, D.; Sun, A.; Lowry, T. S.; Barco, J.

    2011-12-01

    Sandia National Laboratories is developing the Hydropower Seasonal Concurrent Optimization for Power and the Environment (Hydro-SCOPE) tool to examine basin-wide conventional hydropower operations at seasonal time scales. This tool is part of an integrated, multi-laboratory project designed to explore different aspects of optimizing conventional hydropower operations. The Hydro-SCOPE tool couples a one-dimensional reservoir model with a river routing model to simulate hydrology and water quality. An optimization engine wraps around this model framework to solve for long-term operational strategies that best meet the specific objectives of the hydrologic system while honoring operational and environmental constraints. The optimization routines are provided by Sandia's open source DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) software. Hydro-SCOPE allows for multi-objective optimization, which can be used to gain insight into the trade-offs that must be made between objectives. The Hydro-SCOPE tool is being applied to the Upper Colorado Basin hydrologic system. This system contains six reservoirs, each with its own set of objectives (such as maximizing revenue, optimizing environmental indicators, meeting water use needs, or other objectives) and constraints. This leads to a large optimization problem with strong connectedness between objectives. The systems-level approach used by the Hydro-SCOPE tool allows simultaneous analysis of these objectives, as well as understanding of potential trade-offs related to different objectives and operating strategies. The seasonal-scale tool will be tightly integrated with the other components of this project, which examine day-ahead and real-time planning, environmental performance, hydrologic forecasting, and plant efficiency.
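
    The trade-off analysis described here amounts to extracting the non-dominated (Pareto) set of candidate operating strategies. The Python below is a toy sketch with hypothetical objectives (maximize revenue, minimize an environmental-impact index), not Hydro-SCOPE's actual optimization engine:

        import numpy as np

        def pareto_front(revenue, impact):
            """Indices of strategies not dominated by any other: no other
            point has >= revenue AND <= impact, with one inequality strict."""
            keep = []
            for i in range(len(revenue)):
                dominated = np.any(
                    (revenue >= revenue[i]) & (impact <= impact[i])
                    & ((revenue > revenue[i]) | (impact < impact[i])))
                if not dominated:
                    keep.append(i)
            return keep

        rng = np.random.default_rng(7)
        rev = rng.random(50)    # stand-in revenue score per strategy
        imp = rng.random(50)    # stand-in environmental-impact index
        print("non-dominated strategies:", pareto_front(rev, imp))

    Presenting the whole non-dominated set, rather than a single optimum, is what lets operators see how much of one objective must be traded for another.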

  4. Computational challenges in atomic, molecular and optical physics.

    PubMed

    Taylor, Kenneth T

    2002-06-15

    Six challenges are discussed. These are the laser-driven helium atom; the laser-driven hydrogen molecule and hydrogen molecular ion; electron scattering (with ionization) from one-electron atoms; the vibrational and rotational structure of molecules such as H3+ and water at their dissociation limits; laser-heated clusters; and quantum degeneracy and Bose-Einstein condensation. The first four concern fundamental few-body systems where use of high-performance computing (HPC) is currently making possible accurate modelling from first principles. This leads to reliable predictions and support for laboratory experiment as well as true understanding of the dynamics. Important aspects of these challenges addressable only via a terascale facility are set out. Such a facility makes the last two challenges in the above list meaningfully accessible for the first time, and the scientific interest together with the prospective role for HPC in these is emphasized.

  5. Interactive Terascale Particle Visualization

    NASA Technical Reports Server (NTRS)

    Ellsworth, David; Green, Bryan; Moran, Patrick

    2004-01-01

    This paper describes the methods used to produce an interactive visualization of a 2 TB computational fluid dynamics (CFD) data set using particle tracing (streaklines). We use the method introduced by Bruckschen et al. [2001] that pre-computes a large number of particles, stores them on disk using a space-filling curve ordering that minimizes seeks, and then retrieves and displays the particles according to the user's command. We describe how the particle computation can be performed using a PC cluster, how the algorithm can be adapted to work with a multi-block curvilinear mesh, and how the out-of-core visualization can be scaled to 296 billion particles while still achieving interactive performance on PC hardware. Compared to the earlier work, our data set size and total number of particles are an order of magnitude larger. We also describe a new compression technique that allows the lossless compression of the particles by 41% and speeds the particle retrieval by about 30%.
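
    A generic note on why seek-minimizing ordering also helps compression: positions sorted along the storage curve have small successive differences, which delta coding plus a variable-length integer code exploits. The Python below is a textbook scheme illustrating that idea, not the paper's 41% technique:

        def zigzag(n: int) -> int:
            """Map signed deltas to unsigned so small |n| stays small."""
            return -2 * n - 1 if n < 0 else 2 * n

        def varint(n: int) -> bytes:
            """Little-endian base-128 varint for a non-negative int."""
            out = bytearray()
            while True:
                byte = n & 0x7F
                n >>= 7
                out.append(byte | (0x80 if n else 0))
                if not n:
                    return bytes(out)

        def compress_sorted(coords):
            """coords: quantized integer positions in storage order."""
            out, prev = bytearray(), 0
            for c in coords:
                out += varint(zigzag(c - prev))   # encode the delta
                prev = c
            return bytes(out)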

  6. Solving The Longstanding Problem Of Low-Energy Nuclear Reactions At the Highest Microscopic Level - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quaglioni, S.

    2016-09-22

    A 2011 DOE-NP Early Career Award (ECA) under Field Work Proposal (FWP) SCW1158 supported the project “Solving the Long-Standing Problem of Low-Energy Nuclear Reactions at the Highest Microscopic Level” in the five-year period from June 15, 2011 to June 14, 2016. This project, led by PI S. Quaglioni, aimed at developing a comprehensive and computationally efficient framework to arrive at a unified description of structural properties and reactions of light nuclei in terms of constituent protons and neutrons interacting through nucleon-nucleon (NN) and three-nucleon (3N) forces. Specifically, the project had three main goals: 1) arriving at accurate predictions for fusion reactions that power stars and Earth-based fusion facilities; 2) realizing a comprehensive description of clustering and continuum effects in exotic nuclei, including light Borromean systems; and 3) achieving a fundamental understanding of the role of the 3N force in nuclear reactions and nuclei at the drip line.

  7. A report documenting the completion of the Los Alamos National Laboratory portion of the ASC level II milestone ""Visualization on the supercomputing platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahrens, James P; Patchett, John M; Lo, Li - Ta

    2011-01-24

    This report provides documentation for the completion of the Los Alamos portion of the ASC Level II 'Visualization on the Supercomputing Platform' milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratory and Los Alamos National Laboratory. The milestone text is shown in Figure 1 with the Los Alamos portions highlighted in boldfaced text. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next-generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. In conclusion, we improved CPU-based rendering performance by a factor of 2-10 times in our tests. In addition, we evaluated CPU- and GPU-based rendering performance. We encourage production visualization experts to consider using CPU-based rendering solutions when appropriate. For example, on remote supercomputers CPU-based rendering can offer a means of viewing data without having to offload the data or geometry onto a GPU-based visualization system. In terms of the comparative performance of the CPU and GPU, we believe that further optimizations of both CPU- and GPU-based rendering are possible. The simulation community is currently confronting this reality as they work to port their simulations to different hardware architectures. What is interesting about CPU rendering of massive datasets is that for the past two decades GPU performance has significantly outperformed CPU-based systems. Based on our advancements, evaluations and explorations we believe that CPU-based rendering has returned as one viable option for the visualization of massive datasets.

  8. Developing Models for Predictive Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drake, John B; Jones, Philip W

    2007-01-01

    The Community Climate System Model results from a multi-agency collaboration designed to construct cutting-edge climate science simulation models for a broad research community. Predictive climate simulations are currently being prepared for the petascale computers of the near future. Modeling capabilities are continuously being improved in order to provide better answers to critical questions about Earth's climate. Climate change and its implications are front page news in today's world. Could global warming be responsible for the July 2006 heat waves in Europe and the United States? Should more resources be devoted to preparing for an increase in the frequency of strong tropical storms and hurricanes like Katrina? Will coastal cities be flooded due to a rise in sea level? The National Climatic Data Center (NCDC), which archives all weather data for the nation, reports that global surface temperatures have increased over the last century, and that the rate of increase is three times greater since 1976. Will temperatures continue to climb at this rate, will they decline again, or will the rate of increase become even steeper? To address such a flurry of questions, scientists must adopt a systematic approach and develop a predictive framework. With responsibility for advising on energy and technology strategies, the DOE is dedicated to advancing climate research in order to elucidate the causes of climate change, including the role of carbon loading from fossil fuel use. Thus, climate science--which by nature involves advanced computing technology and methods--has been the focus of a number of DOE's SciDAC research projects. Dr. John Drake (ORNL) and Dr. Philip Jones (LANL) served as principal investigators on the SciDAC project, 'Collaborative Design and Development of the Community Climate System Model for Terascale Computers.' The Community Climate System Model (CCSM) is a fully-coupled global system that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states. The collaborative SciDAC team--including over a dozen researchers at institutions around the country--developed, validated, documented, and optimized the performance of CCSM using the latest software engineering approaches, computational technology, and scientific knowledge. Many of the factors that must be accounted for in a comprehensive model of the climate system are illustrated in figure 1.

  9. Enabling Linked Science in Global Climate Uncertainty Quantification (UQ) Research

    NASA Astrophysics Data System (ADS)

    Elsethagen, T.; Stephan, E.; Lin, G.; Williams, D.; Banks, E.

    2012-12-01

    This paper shares a real-world global climate UQ science use case and illustrates how a linked science application called Provenance Environment (ProvEn), currently being developed, enables scientific teams to publish, share, link, and discover new links over their UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing them to better understand the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, ancillary literature citations, and the social network of scientists associated with the study. This research claims that this linked-science approach not only lets scientists understand a particular dataset within a knowledge context, but also enables cross-referencing of knowledge among the numerous UQ studies stored in ESGF. ProvEn collects native forms of data provenance resources as the UQ study is carried out. The native data provenance resources can be collected from a variety of sources such as scripts, a workflow engine log, simulation log files, scientific team members, etc. Schema alignment is used to translate the native forms of provenance into a set of W3C PROV-O semantic statements used as a common interchange format, which will also contain URI references back to resources in the UQ study dataset for querying and cross-referencing. ProvEn leverages Fedora Commons' digital object model in a Resource Oriented Architecture (ROA) (i.e., a RESTful framework) to logically organize and partition native and translated provenance resources by UQ study. The ROA also provides scientists the means to search both native and translated forms of provenance.
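
    The kind of translated PROV-O statement ProvEn would hold can be written down concretely. The Python sketch below (rdflib assumed available; all URIs hypothetical) links a published UQ dataset to the simulation run that generated it and to a team member:

        from rdflib import Graph, Namespace
        from rdflib.namespace import RDF

        PROV = Namespace("http://www.w3.org/ns/prov#")
        EX = Namespace("http://example.org/uq-study/")   # hypothetical

        g = Graph()
        run = EX["ensemble-run-42"]
        dataset = EX["esgf-dataset-tas-uq"]
        scientist = EX["team/jdoe"]

        g.add((run, RDF.type, PROV.Activity))
        g.add((dataset, RDF.type, PROV.Entity))
        g.add((dataset, PROV.wasGeneratedBy, run))       # dataset lineage
        g.add((run, PROV.wasAssociatedWith, scientist))  # social-network link
        print(g.serialize(format="turtle"))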

  10. Terascale Optimal PDE Simulations (TOPS) Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Professor Olof B. Widlund

    2007-07-09

    Our work has focused on the development and analysis of domain decomposition algorithms for a variety of problems arising in continuum mechanics modeling. In particular, we have extended and analyzed FETI-DP and BDDC algorithms; these iterative solvers were first introduced and studied by Charbel Farhat and his collaborators, see [11, 45, 12], and by Clark Dohrmann of SANDIA, Albuquerque, see [43, 2, 1], respectively. These two closely related families of methods are of particular interest since they are used more extensively than other iterative substructuring methods to solve very large and difficult problems. Thus, the FETI algorithms are part of the SALINAS system developed by the SANDIA National Laboratories for very large scale computations, and as already noted, BDDC was first developed by a SANDIA scientist, Dr. Clark Dohrmann. The FETI algorithms are also making inroads in commercial engineering software systems. We also note that the analysis of these algorithms poses very real mathematical challenges. The success in developing this theory has, in several instances, led to significant improvements in the performance of these algorithms. A very desirable feature of these iterative substructuring and other domain decomposition algorithms is that they respect the memory hierarchy of modern parallel and distributed computing systems, which is essential for approaching peak floating point performance. The development of improved methods, together with more powerful computer systems, is making it possible to carry out simulations in three dimensions, with quite high resolution, relatively easily. This work is supported by high quality software systems, such as Argonne's PETSc library, which facilitates code development as well as access to a variety of parallel and distributed computer systems. The success in finding scalable and robust domain decomposition algorithms for very large numbers of processors and very large finite element problems is, e.g., illustrated in [24, 25, 26]. This work is based on [29, 31]. Our work over these five and a half years has, in our opinion, helped advance the knowledge of domain decomposition methods significantly. We see these methods as providing valuable alternatives to other iterative methods, in particular, those based on multi-grid. In our opinion, our accomplishments also match the goals of the TOPS project quite closely.
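
    For context, the preconditioners in this family are assembled from subdomain solves; in the simplest one-level additive Schwarz form (standard notation, with R_i the restriction to subdomain i), the preconditioner reads

        M^{-1} = \sum_{i=1}^{N} R_i^{T} A_i^{-1} R_i, \qquad A_i = R_i A R_i^{T}

    FETI-DP and BDDC enrich this basic template with carefully chosen primal (coarse) components, which is what keeps iteration counts insensitive to the number of subdomains.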

  11. The Zooniverse

    NASA Astrophysics Data System (ADS)

    Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.

    2009-12-01

    The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities and resources to review, annotate, and explore aspects of the data that are too overwhelming for the science team. Second, citizen scientists' inputs (in the form of rich training data and class labels) can be used to improve the classifiers that the project team uses to classify and prioritize new events detected in the petascale data stream. This talk will review these topics and provide an update on the Zooniverse project.

  12. Search for jet extinction in the inclusive jet-pt spectrum from proton-proton collisions at sqrt(s) = 8 TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khachatryan, Vardan; et al.

    The first search at the LHC for the extinction of QCD jet production is presented, using data collected with the CMS detector corresponding to an integrated luminosity of 10.7 inverse femtobarns of proton-proton collisions at a center-of-mass energy of 8 TeV. The extinction model studied in this analysis is motivated by the search for signatures of strong gravity at the TeV scale (terascale gravity) and assumes the existence of string couplings in the strong-coupling limit. In this limit, the string model predicts the suppression of all high-transverse-momentum standard model processes, including jet production, beyond a certain energy scale. To test this prediction, the measured transverse-momentum spectrum is compared to the theoretical prediction of the standard model. No significant deficit of events is found at high transverse momentum. A 95% confidence level lower limit of 3.3 TeV is set on the extinction mass scale.

  13. Search for jet extinction in the inclusive jet-pT spectrum from proton-proton collisions at √s = 8 TeV

    NASA Astrophysics Data System (ADS)

    Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.; et al.
A.; Martelli, A.; Marzocchi, B.; Menasce, D.; Moroni, L.; Paganoni, M.; Pedrini, D.; Ragazzi, S.; Redaelli, N.; Tabarelli de Fatis, T.; Buontempo, S.; Cavallo, N.; Di Guida, S.; Fabozzi, F.; Iorio, A. O. M.; Lista, L.; Meola, S.; Merola, M.; Paolucci, P.; Azzi, P.; Bacchetta, N.; Bisello, D.; Branca, A.; Carlin, R.; Dall'Osso, M.; Dorigo, T.; Galanti, M.; Gasparini, F.; Giubilato, P.; Gozzelino, A.; Kanishchev, K.; Lacaprara, S.; Margoni, M.; Meneguzzo, A. T.; Montecassiano, F.; Passaseo, M.; Pazzini, J.; Pozzobon, N.; Ronchese, P.; Simonetto, F.; Torassa, E.; Tosi, M.; Vanini, S.; Zotto, P.; Zucchetta, A.; Zumerle, G.; Gabusi, M.; Ratti, S. P.; Riccardi, C.; Salvini, P.; Vitulo, P.; Biasini, M.; Bilei, G. M.; Ciangottini, D.; Fanò, L.; Lariccia, P.; Mantovani, G.; Menichelli, M.; Romeo, F.; Saha, A.; Santocchia, A.; Spiezia, A.; Androsov, K.; Azzurri, P.; Bagliesi, G.; Bernardini, J.; Boccali, T.; Broccolo, G.; Castaldi, R.; Ciocci, M. A.; Dell'Orso, R.; Donato, S.; Fiori, F.; Foà, L.; Giassi, A.; Grippo, M. T.; Ligabue, F.; Lomtadze, T.; Martini, L.; Messineo, A.; Moon, C. S.; Palla, F.; Rizzi, A.; Savoy-Navarro, A.; Serban, A. T.; Spagnolo, P.; Squillacioti, P.; Tenchini, R.; Tonelli, G.; Venturi, A.; Verdini, P. G.; Vernieri, C.; Barone, L.; Cavallari, F.; Del Re, D.; Diemoz, M.; Grassi, M.; Jorda, C.; Longo, E.; Margaroli, F.; Meridiani, P.; Micheli, F.; Nourbakhsh, S.; Organtini, G.; Paramatti, R.; Rahatlou, S.; Rovelli, C.; Santanastasio, F.; Soffi, L.; Traczyk, P.; Amapane, N.; Arcidiacono, R.; Argiro, S.; Arneodo, M.; Bellan, R.; Biino, C.; Cartiglia, N.; Casasso, S.; Costa, M.; Degano, A.; Demaria, N.; Finco, L.; Mariotti, C.; Maselli, S.; Migliore, E.; Monaco, V.; Musich, M.; Obertino, M. M.; Ortona, G.; Pacher, L.; Pastrone, N.; Pelliccioni, M.; Pinna Angioni, G. L.; Potenza, A.; Romero, A.; Ruspa, M.; Sacchi, R.; Solano, A.; Staiano, A.; Tamponi, U.; Belforte, S.; Candelise, V.; Casarsa, M.; Cossutti, F.; Della Ricca, G.; Gobbo, B.; La Licata, C.; Marone, M.; Montanino, D.; Schizzi, A.; Umer, T.; Zanetti, A.; Chang, S.; Kropivnitskaya, A.; Nam, S. K.; Kim, D. H.; Kim, G. N.; Kim, M. S.; Kong, D. J.; Lee, S.; Oh, Y. D.; Park, H.; Sakharov, A.; Son, D. C.; Kim, J. Y.; Song, S.; Choi, S.; Gyun, D.; Hong, B.; Jo, M.; Kim, H.; Kim, Y.; Lee, B.; Lee, K. S.; Park, S. K.; Roh, Y.; Choi, M.; Kim, J. H.; Park, I. C.; Park, S.; Ryu, G.; Ryu, M. S.; Choi, Y.; Choi, Y. K.; Goh, J.; Kwon, E.; Lee, J.; Seo, H.; Yu, I.; Juodagalvis, A.; Komaragiri, J. R.; Castilla-Valdez, H.; De La Cruz-Burelo, E.; Heredia-de La Cruz, I.; Lopez-Fernandez, R.; Sanchez-Hernandez, A.; Carrillo Moreno, S.; Vazquez Valencia, F.; Pedraza, I.; Salazar Ibarguen, H. A.; Casimiro Linares, E.; Morelos Pineda, A.; Krofcheck, D.; Butler, P. H.; Reucroft, S.; Ahmad, A.; Ahmad, M.; Hassan, Q.; Hoorani, H. R.; Khalid, S.; Khan, W. A.; Khurshid, T.; Shah, M. A.; Shoaib, M.; Bialkowska, H.; Bluj, M.; Boimska, B.; Frueboes, T.; Górski, M.; Kazana, M.; Nawrocki, K.; Romanowska-Rybinska, K.; Szleper, M.; Zalewski, P.; Brona, G.; Bunkowski, K.; Cwiok, M.; Dominik, W.; Doroba, K.; Kalinowski, A.; Konecki, M.; Krolikowski, J.; Misiura, M.; Olszewski, M.; Wolszczak, W.; Bargassa, P.; Beirão Da Cruz E Silva, C.; Faccioli, P.; Ferreira Parracho, P. 
G.; Gallinaro, M.; Nguyen, F.; Rodrigues Antunes, J.; Seixas, J.; Varela, J.; Vischia, P.; Afanasiev, S.; Bunin, P.; Gavrilenko, M.; Golutvin, I.; Karjavin, V.; Konoplyanikov, V.; Lanev, A.; Malakhov, A.; Matveev, V.; Moisenz, P.; Palichik, V.; Perelygin, V.; Savina, M.; Shmatov, S.; Shulha, S.; Skatchkov, N.; Smirnov, V.; Zarubin, A.; Golovtsov, V.; Ivanov, Y.; Kim, V.; Levchenko, P.; Murzin, V.; Oreshkin, V.; Smirnov, I.; Sulimov, V.; Uvarov, L.; Vavilov, S.; Vorobyev, A.; Vorobyev, An.; Andreev, Yu.; Dermenev, A.; Gninenko, S.; Golubev, N.; Kirsanov, M.; Krasnikov, N.; Pashenkov, A.; Tlisov, D.; Toropin, A.; Epshteyn, V.; Gavrilov, V.; Lychkovskaya, N.; Popov, V.; Safronov, G.; Semenov, S.; Spiridonov, A.; Stolin, V.; Vlasov, E.; Zhokin, A.; Andreev, V.; Azarkin, M.; Dremin, I.; Kirakosyan, M.; Leonidov, A.; Mesyats, G.; Rusakov, S. V.; Vinogradov, A.; Belyaev, A.; Boos, E.; Dubinin, M.; Dudko, L.; Ershov, A.; Gribushin, A.; Klyukhin, V.; Kodolova, O.; Lokhtin, I.; Obraztsov, S.; Petrushanko, S.; Savrin, V.; Snigirev, A.; Azhgirey, I.; Bayshev, I.; Bitioukov, S.; Kachanov, V.; Kalinin, A.; Konstantinov, D.; Krychkine, V.; Petrov, V.; Ryutin, R.; Sobol, A.; Tourtchanovitch, L.; Troshin, S.; Tyurin, N.; Uzunian, A.; Volkov, A.; Adzic, P.; Dordevic, M.; Ekmedzic, M.; Milosevic, J.; Alcaraz Maestre, J.; Battilana, C.; Calvo, E.; Cerrada, M.; Chamizo Llatas, M.; Colino, N.; De La Cruz, B.; Delgado Peris, A.; Domínguez Vázquez, D.; Escalante Del Valle, A.; Fernandez Bedoya, C.; Fernández Ramos, J. P.; Flix, J.; Fouz, M. C.; Garcia-Abia, P.; Gonzalez Lopez, O.; Goy Lopez, S.; Hernandez, J. M.; Josa, M. I.; Merino, G.; Navarro De Martino, E.; Pérez-Calero Yzquierdo, A.; Puerta Pelayo, J.; Quintario Olmeda, A.; Redondo, I.; Romero, L.; Soares, M. S.; Albajar, C.; de Trocóniz, J. F.; Missiroli, M.; Brun, H.; Cuevas, J.; Fernandez Menendez, J.; Folgueras, S.; Gonzalez Caballero, I.; Lloret Iglesias, L.; Brochero Cifuentes, J. A.; Cabrillo, I. J.; Calderon, A.; Duarte Campderros, J.; Fernandez, M.; Gomez, G.; Graziano, A.; Lopez Virto, A.; Marco, J.; Marco, R.; Martinez Rivero, C.; Matorras, F.; Munoz Sanchez, F. J.; Piedra Gomez, J.; Rodrigo, T.; Rodríguez-Marrero, A. Y.; Ruiz-Jimeno, A.; Scodellaro, L.; Vila, I.; Vilar Cortabitarte, R.; Abbaneo, D.; Auffray, E.; Auzinger, G.; Bachtis, M.; Baillon, P.; Ball, A. H.; Barney, D.; Benaglia, A.; Bendavid, J.; Benhabib, L.; Benitez, J. F.; Bernet, C.; Bianchi, G.; Bloch, P.; Bocci, A.; Bonato, A.; Bondu, O.; Botta, C.; Breuker, H.; Camporesi, T.; Cerminara, G.; Christiansen, T.; Colafranceschi, S.; D'Alfonso, M.; d'Enterria, D.; Dabrowski, A.; David, A.; De Guio, F.; De Roeck, A.; De Visscher, S.; Dobson, M.; Dupont-Sagorin, N.; Elliott-Peisert, A.; Eugster, J.; Franzoni, G.; Funk, W.; Giffels, M.; Gigi, D.; Gill, K.; Giordano, D.; Girone, M.; Glege, F.; Guida, R.; Gundacker, S.; Guthoff, M.; Hammer, J.; Hansen, M.; Harris, P.; Hegeman, J.; Innocente, V.; Janot, P.; Kousouris, K.; Krajczar, K.; Lecoq, P.; Lourenço, C.; Magini, N.; Malgeri, L.; Mannelli, M.; Masetti, L.; Meijers, F.; Mersi, S.; Meschi, E.; Moortgat, F.; Morovic, S.; Mulders, M.; Musella, P.; Orsini, L.; Pape, L.; Perez, E.; Perrozzi, L.; Petrilli, A.; Petrucciani, G.; Pfeiffer, A.; Pierini, M.; Pimiä, M.; Piparo, D.; Plagge, M.; Racz, A.; Rolandi, G.; Rovere, M.; Sakulin, H.; Schäfer, C.; Schwick, C.; Sekmen, S.; Sharma, A.; Siegrist, P.; Silva, P.; Simon, M.; Sphicas, P.; Spiga, D.; Steggemann, J.; Stieger, B.; Stoye, M.; Treille, D.; Tsirou, A.; Veres, G. I.; Vlimant, J. 
R.; Wardle, N.; Wöhri, H. K.; Zeuner, W. D.; Bertl, W.; Deiters, K.; Erdmann, W.; Horisberger, R.; Ingram, Q.; Kaestli, H. C.; König, S.; Kotlinski, D.; Langenegger, U.; Renker, D.; Rohe, T.; Bachmair, F.; Bäni, L.; Bianchini, L.; Bortignon, P.; Buchmann, M. A.; Casal, B.; Chanon, N.; Deisher, A.; Dissertori, G.; Dittmar, M.; Donegà, M.; Dünser, M.; Eller, P.; Grab, C.; Hits, D.; Lustermann, W.; Mangano, B.; Marini, A. C.; Martinez Ruiz del Arbol, P.; Meister, D.; Mohr, N.; Nägeli, C.; Nef, P.; Nessi-Tedaldi, F.; Pandolfi, F.; Pauss, F.; Peruzzi, M.; Quittnat, M.; Rebane, L.; Ronga, F. J.; Rossini, M.; Starodumov, A.; Takahashi, M.; Theofilatos, K.; Wallny, R.; Weber, H. A.; Amsler, C.; Canelli, M. F.; Chiochia, V.; De Cosa, A.; Hinzmann, A.; Hreus, T.; Ivova Rikova, M.; Kilminster, B.; Millan Mejias, B.; Ngadiuba, J.; Robmann, P.; Snoek, H.; Taroni, S.; Verzetti, M.; Yang, Y.; Cardaci, M.; Chen, K. H.; Ferro, C.; Kuo, C. M.; Lin, W.; Lu, Y. J.; Volpe, R.; Yu, S. S.; Chang, P.; Chang, Y. H.; Chang, Y. W.; Chao, Y.; Chen, K. F.; Chen, P. H.; Dietz, C.; Grundler, U.; Hou, W.-S.; Kao, K. Y.; Lei, Y. J.; Liu, Y. F.; Lu, R.-S.; Majumder, D.; Petrakou, E.; Shi, X.; Tzeng, Y. M.; Wilken, R.; Asavapibhop, B.; Srimanobhas, N.; Suwonjandee, N.; Adiguzel, A.; Bakirci, M. N.; Cerci, S.; Dozen, C.; Dumanoglu, I.; Eskut, E.; Girgis, S.; Gokbulut, G.; Gurpinar, E.; Hos, I.; Kangal, E. E.; Kayis Topaksu, A.; Onengut, G.; Ozdemir, K.; Ozturk, S.; Polatoz, A.; Sogut, K.; Sunar Cerci, D.; Tali, B.; Topakli, H.; Vergili, M.; Akin, I. V.; Bilin, B.; Bilmis, S.; Gamsizkan, H.; Karapinar, G.; Ocalan, K.; Surat, U. E.; Yalvac, M.; Zeyrek, M.; Gülmez, E.; Isildak, B.; Kaya, M.; Kaya, O.; Bahtiyar, H.; Barlas, E.; Cankocak, K.; Vardarlı, F. I.; Yücel, M.; Levchuk, L.; Sorokin, P.; Brooke, J. J.; Clement, E.; Cussans, D.; Flacher, H.; Frazier, R.; Goldstein, J.; Grimes, M.; Heath, G. P.; Heath, H. F.; Jacob, J.; Kreczko, L.; Lucas, C.; Meng, Z.; Newbold, D. M.; Paramesvaran, S.; Poll, A.; Senkin, S.; Smith, V. J.; Williams, T.; Bell, K. W.; Belyaev, A.; Brew, C.; Brown, R. M.; Cockerill, D. J. A.; Coughlan, J. A.; Harder, K.; Harper, S.; Olaiya, E.; Petyt, D.; Shepherd-Themistocleous, C. H.; Thea, A.; Tomalin, I. R.; Womersley, W. J.; Worm, S. D.; Baber, M.; Bainbridge, R.; Buchmuller, O.; Burton, D.; Colling, D.; Cripps, N.; Cutajar, M.; Dauncey, P.; Davies, G.; Della Negra, M.; Dunne, P.; Ferguson, W.; Fulcher, J.; Futyan, D.; Gilbert, A.; Hall, G.; Iles, G.; Jarvis, M.; Karapostoli, G.; Kenzie, M.; Lane, R.; Lucas, R.; Lyons, L.; Magnan, A.-M.; Malik, S.; Marrouche, J.; Mathias, B.; Nash, J.; Nikitenko, A.; Pela, J.; Pesaresi, M.; Petridis, K.; Raymond, D. M.; Rogerson, S.; Rose, A.; Seez, C.; Sharp, P.; Tapper, A.; Vazquez Acosta, M.; Virdee, T.; Cole, J. E.; Hobson, P. R.; Khan, A.; Kyberd, P.; Leggat, D.; Leslie, D.; Martin, W.; Reid, I. D.; Symonds, P.; Teodorescu, L.; Turner, M.; Dittmann, J.; Hatakeyama, K.; Kasmi, A.; Liu, H.; Scarborough, T.; Charaf, O.; Cooper, S. I.; Henderson, C.; Rumerio, P.; Avetisyan, A.; Bose, T.; Fantasia, C.; Heister, A.; Lawson, P.; Richardson, C.; Rohlf, J.; Sperka, D.; St. John, J.; Sulak, L.; Alimena, J.; Bhattacharya, S.; Christopher, G.; Cutts, D.; Demiragli, Z.; Ferapontov, A.; Garabedian, A.; Heintz, U.; Jabeen, S.; Kukartsev, G.; Laird, E.; Landsberg, G.; Luk, M.; Narain, M.; Segala, M.; Sinthuprasith, T.; Speer, T.; Swanson, J.; Breedon, R.; Breto, G.; Calderon De La Barca Sanchez, M.; Chauhan, S.; Chertok, M.; Conway, J.; Conway, R.; Cox, P. 
T.; Erbacher, R.; Gardner, M.; Ko, W.; Lander, R.; Miceli, T.; Mulhearn, M.; Pellett, D.; Pilot, J.; Ricci-Tam, F.; Searle, M.; Shalhout, S.; Smith, J.; Squires, M.; Stolp, D.; Tripathi, M.; Wilbur, S.; Yohay, R.; Cousins, R.; Everaerts, P.; Farrell, C.; Hauser, J.; Ignatenko, M.; Rakness, G.; Takasugi, E.; Valuev, V.; Weber, M.; Babb, J.; Clare, R.; Ellison, J.; Gary, J. W.; Hanson, G.; Heilman, J.; Jandir, P.; Kennedy, E.; Lacroix, F.; Liu, H.; Long, O. R.; Luthra, A.; Malberti, M.; Nguyen, H.; Shrinivas, A.; Sturdy, J.; Sumowidagdo, S.; Wimpenny, S.; Andrews, W.; Branson, J. G.; Cerati, G. B.; Cittolin, S.; D'Agnolo, R. T.; Evans, D.; Holzner, A.; Kelley, R.; Lebourgeois, M.; Letts, J.; Macneill, I.; Olivito, D.; Padhi, S.; Palmer, C.; Pieri, M.; Sani, M.; Sharma, V.; Simon, S.; Sudano, E.; Tadel, M.; Tu, Y.; Vartak, A.; Würthwein, F.; Yagil, A.; Yoo, J.; Barge, D.; Bradmiller-Feld, J.; Campagnari, C.; Danielson, T.; Dishaw, A.; Flowers, K.; Franco Sevilla, M.; Geffert, P.; George, C.; Golf, F.; Incandela, J.; Justus, C.; Mccoll, N.; Richman, J.; Stuart, D.; To, W.; West, C.; Apresyan, A.; Bornheim, A.; Bunn, J.; Chen, Y.; Di Marco, E.; Duarte, J.; Mott, A.; Newman, H. B.; Pena, C.; Rogan, C.; Spiropulu, M.; Timciuc, V.; Wilkinson, R.; Xie, S.; Zhu, R. Y.; Azzolini, V.; Calamba, A.; Carroll, R.; Ferguson, T.; Iiyama, Y.; Paulini, M.; Russ, J.; Vogel, H.; Vorobiev, I.; Cumalat, J. P.; Drell, B. R.; Ford, W. T.; Gaz, A.; Luiggi Lopez, E.; Nauenberg, U.; Smith, J. G.; Stenson, K.; Ulmer, K. A.; Wagner, S. R.; Alexander, J.; Chatterjee, A.; Chu, J.; Dittmer, S.; Eggert, N.; Hopkins, W.; Kreis, B.; Mirman, N.; Nicolas Kaufman, G.; Patterson, J. R.; Ryd, A.; Salvati, E.; Skinnari, L.; Sun, W.; Teo, W. D.; Thom, J.; Thompson, J.; Tucker, J.; Weng, Y.; Winstrom, L.; Wittich, P.; Winn, D.; Abdullin, S.; Albrow, M.; Anderson, J.; Apollinari, G.; Bauerdick, L. A. T.; Beretvas, A.; Berryhill, J.; Bhat, P. C.; Burkett, K.; Butler, J. N.; Cheung, H. W. K.; Chlebana, F.; Cihangir, S.; Elvira, V. D.; Fisk, I.; Freeman, J.; Gottschalk, E.; Gray, L.; Green, D.; Grünendahl, S.; Gutsche, O.; Hanlon, J.; Hare, D.; Harris, R. M.; Hirschauer, J.; Hooberman, B.; Jindariani, S.; Johnson, M.; Joshi, U.; Kaadze, K.; Klima, B.; Kwan, S.; Linacre, J.; Lincoln, D.; Lipton, R.; Liu, T.; Lykken, J.; Maeshima, K.; Marraffino, J. M.; Martinez Outschoorn, V. I.; Maruyama, S.; Mason, D.; McBride, P.; Mishra, K.; Mrenna, S.; Musienko, Y.; Nahn, S.; Newman-Holmes, C.; O'Dell, V.; Prokofyev, O.; Sexton-Kennedy, E.; Sharma, S.; Soha, A.; Spalding, W. J.; Spiegel, L.; Taylor, L.; Tkaczyk, S.; Tran, N. V.; Uplegger, L.; Vaandering, E. W.; Vidal, R.; Whitbeck, A.; Whitmore, J.; Yang, F.; Acosta, D.; Avery, P.; Bourilkov, D.; Carver, M.; Cheng, T.; Curry, D.; Das, S.; De Gruttola, M.; Di Giovanni, G. P.; Field, R. D.; Fisher, M.; Furic, I. K.; Hugon, J.; Konigsberg, J.; Korytov, A.; Kypreos, T.; Low, J. F.; Matchev, K.; Milenovic, P.; Mitselmakher, G.; Muniz, L.; Rinkevicius, A.; Shchutska, L.; Skhirtladze, N.; Snowball, M.; Yelton, J.; Zakaria, M.; Gaultney, V.; Hewamanage, S.; Linn, S.; Markowitz, P.; Martinez, G.; Rodriguez, J. L.; Adams, T.; Askew, A.; Bochenek, J.; Diamond, B.; Haas, J.; Hagopian, S.; Hagopian, V.; Johnson, K. F.; Prosper, H.; Veeraraghavan, V.; Weinberg, M.; Baarmand, M. M.; Hohlmann, M.; Kalakhety, H.; Yumiceva, F.; Adams, M. R.; Apanasevich, L.; Bazterra, V. E.; Berry, D.; Betts, R. R.; Bucinskaite, I.; Cavanaugh, R.; Evdokimov, O.; Gauthier, L.; Gerber, C. E.; Hofman, D. 
J.; Khalatyan, S.; Kurt, P.; Moon, D. H.; O'Brien, C.; Silkworth, C.; Turner, P.; Varelas, N.; Albayrak, E. A.; Bilki, B.; Clarida, W.; Dilsiz, K.; Duru, F.; Haytmyradov, M.; Merlo, J.-P.; Mermerkaya, H.; Mestvirishvili, A.; Moeller, A.; Nachtman, J.; Ogul, H.; Onel, Y.; Ozok, F.; Penzo, A.; Rahmat, R.; Sen, S.; Tan, P.; Tiras, E.; Wetzel, J.; Yetkin, T.; Yi, K.; Barnett, B. A.; Blumenfeld, B.; Bolognesi, S.; Fehling, D.; Gritsan, A. V.; Maksimovic, P.; Martin, C.; Swartz, M.; Baringer, P.; Bean, A.; Benelli, G.; Bruner, C.; Gray, J.; Kenny, R. P.; Murray, M.; Noonan, D.; Sanders, S.; Sekaric, J.; Stringer, R.; Wang, Q.; Wood, J. S.; Barfuss, A. F.; Chakaberia, I.; Ivanov, A.; Khalil, S.; Makouski, M.; Maravin, Y.; Saini, L. K.; Shrestha, S.; Svintradze, I.; Gronberg, J.; Lange, D.; Rebassoo, F.; Wright, D.; Baden, A.; Calvert, B.; Eno, S. C.; Gomez, J. A.; Hadley, N. J.; Kellogg, R. G.; Kolberg, T.; Lu, Y.; Marionneau, M.; Mignerey, A. C.; Pedro, K.; Skuja, A.; Tonjes, M. B.; Tonwar, S. C.; Apyan, A.; Barbieri, R.; Bauer, G.; Busza, W.; Cali, I. A.; Chan, M.; Di Matteo, L.; Dutta, V.; Gomez Ceballos, G.; Goncharov, M.; Gulhan, D.; Klute, M.; Lai, Y. S.; Lee, Y.-J.; Levin, A.; Luckey, P. D.; Ma, T.; Paus, C.; Ralph, D.; Roland, C.; Roland, G.; Stephans, G. S. F.; Stöckli, F.; Sumorok, K.; Velicanu, D.; Veverka, J.; Wyslouch, B.; Yang, M.; Zanetti, M.; Zhukova, V.; Dahmes, B.; De Benedetti, A.; Gude, A.; Kao, S. C.; Klapoetke, K.; Kubota, Y.; Mans, J.; Pastika, N.; Rusack, R.; Singovsky, A.; Tambe, N.; Turkewitz, J.; Acosta, J. G.; Oliveros, S.; Avdeeva, E.; Bloom, K.; Bose, S.; Claes, D. R.; Dominguez, A.; Gonzalez Suarez, R.; Keller, J.; Knowlton, D.; Kravchenko, I.; Lazo-Flores, J.; Malik, S.; Meier, F.; Snow, G. R.; Dolen, J.; Godshalk, A.; Iashvili, I.; Kharchilava, A.; Kumar, A.; Rappoccio, S.; Alverson, G.; Barberis, E.; Baumgartel, D.; Chasco, M.; Haley, J.; Massironi, A.; Morse, D. M.; Nash, D.; Orimoto, T.; Trocino, D.; Wood, D.; Zhang, J.; Hahn, K. A.; Kubik, A.; Mucia, N.; Odell, N.; Pollack, B.; Pozdnyakov, A.; Schmitt, M.; Stoynev, S.; Sung, K.; Velasco, M.; Won, S.; Brinkerhoff, A.; Chan, K. M.; Drozdetskiy, A.; Hildreth, M.; Jessop, C.; Karmgard, D. J.; Kellams, N.; Lannon, K.; Luo, W.; Lynch, S.; Marinelli, N.; Pearson, T.; Planer, M.; Ruchti, R.; Valls, N.; Wayne, M.; Wolf, M.; Woodard, A.; Antonelli, L.; Brinson, J.; Bylsma, B.; Durkin, L. S.; Flowers, S.; Hill, C.; Hughes, R.; Kotov, K.; Ling, T. Y.; Puigh, D.; Rodenburg, M.; Smith, G.; Vuosalo, C.; Winer, B. L.; Wolfe, H.; Wulsin, H. W.; Berry, E.; Driga, O.; Elmer, P.; Hebda, P.; Hunt, A.; Koay, S. A.; Lujan, P.; Marlow, D.; Medvedeva, T.; Mooney, M.; Olsen, J.; Piroué, P.; Quan, X.; Saka, H.; Stickland, D.; Tully, C.; Werner, J. S.; Zenz, S. C.; Zuranski, A.; Brownson, E.; Mendez, H.; Ramirez Vargas, J. E.; Alagoz, E.; Barnes, V. E.; Benedetti, D.; Bolla, G.; Bortoletto, D.; De Mattia, M.; Everett, A.; Hu, Z.; Jha, M. K.; Jones, M.; Jung, K.; Kress, M.; Leonardo, N.; Lopes Pegna, D.; Maroussov, V.; Merkel, P.; Miller, D. H.; Neumeister, N.; Radburn-Smith, B. C.; Shipsey, I.; Silvers, D.; Svyatkovskiy, A.; Wang, F.; Xie, W.; Xu, L.; Yoo, H. D.; Zablocki, J.; Zheng, Y.; Parashar, N.; Stupak, J.; Adair, A.; Akgun, B.; Ecklund, K. M.; Geurts, F. J. M.; Li, W.; Michlin, B.; Padley, B. P.; Redjimi, R.; Roberts, J.; Zabel, J.; Betchart, B.; Bodek, A.; Covarelli, R.; de Barbaro, P.; Demina, R.; Eshaq, Y.; Ferbel, T.; Garcia-Bellido, A.; Goldenzweig, P.; Han, J.; Harel, A.; Khukhunaishvili, A.; Miner, D. 
C.; Petrillo, G.; Vishnevskiy, D.; Ciesielski, R.; Demortier, L.; Goulianos, K.; Lungu, G.; Mesropian, C.; Arora, S.; Barker, A.; Chou, J. P.; Contreras-Campana, C.; Contreras-Campana, E.; Duggan, D.; Ferencek, D.; Gershtein, Y.; Gray, R.; Halkiadakis, E.; Hidas, D.; Lath, A.; Panwalkar, S.; Park, M.; Patel, R.; Rekovic, V.; Salur, S.; Schnetzer, S.; Seitz, C.; Somalwar, S.; Stone, R.; Thomas, S.; Thomassen, P.; Walker, M.; Rose, K.; Spanier, S.; York, A.; Bouhali, O.; Eusebi, R.; Flanagan, W.; Gilmore, J.; Kamon, T.; Khotilovich, V.; Krutelyov, V.; Montalvo, R.; Osipenkov, I.; Pakhotin, Y.; Perloff, A.; Roe, J.; Rose, A.; Safonov, A.; Sakuma, T.; Suarez, I.; Tatarinov, A.; Akchurin, N.; Cowden, C.; Damgov, J.; Dragoiu, C.; Dudero, P. R.; Faulkner, J.; Kovitanggoon, K.; Kunori, S.; Lee, S. W.; Libeiro, T.; Volobouev, I.; Appelt, E.; Delannoy, A. G.; Greene, S.; Gurrola, A.; Johns, W.; Maguire, C.; Mao, Y.; Melo, A.; Sharma, M.; Sheldon, P.; Snook, B.; Tuo, S.; Velkovska, J.; Arenton, M. W.; Boutle, S.; Cox, B.; Francis, B.; Goodell, J.; Hirosky, R.; Ledovskoy, A.; Li, H.; Lin, C.; Neu, C.; Wood, J.; Gollapinni, S.; Harr, R.; Karchin, P. E.; Kottachchi Kankanamge Don, C.; Lamichhane, P.; Belknap, D. A.; Carlsmith, D.; Cepeda, M.; Dasu, S.; Duric, S.; Friis, E.; Hall-Wilton, R.; Herndon, M.; Hervé, A.; Klabbers, P.; Klukas, J.; Lanaro, A.; Lazaridis, C.; Levine, A.; Loveless, R.; Mohapatra, A.; Ojalvo, I.; Perry, T.; Pierro, G. A.; Polese, G.; Ross, I.; Sarangi, T.; Savin, A.; Smith, W. H.; Woods, N.; CMS Collaboration

    2014-08-01

    The first search at the LHC for the extinction of QCD jet production is presented, using data collected with the CMS detector corresponding to an integrated luminosity of 10.7 fb-1 of proton-proton collisions at a center-of-mass energy of 8 TeV. The extinction model studied in this analysis is motivated by the search for signatures of strong gravity at the TeV scale (terascale gravity) and assumes the existence of string couplings in the strong-coupling limit. In this limit, the string model predicts the suppression of all high-transverse-momentum standard model processes, including jet production, beyond a certain energy scale. To test this prediction, the measured transverse-momentum spectrum is compared to the theoretical prediction of the standard model. No significant deficit of events is found at high transverse momentum. A 95% confidence level lower limit of 3.3 TeV is set on the extinction mass scale.
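
    The limit-setting logic above lends itself to a toy illustration: damp the standard model expectation in a high-pT counting bin by an assumed extinction form factor, and exclude any scale that predicts a deficit incompatible with the observed count. Everything in the sketch (yields, effective pT, the exponential damping) is a hedged stand-in, not the CMS analysis or its numbers.

    ```python
    # Toy counting-experiment limit on an "extinction" scale; all inputs are
    # made-up illustrations, not values from the paper.
    import numpy as np
    from scipy.stats import poisson

    n_sm = 107.0      # SM expectation above pT ~ 1 TeV (assumed)
    n_obs = 105       # observed count (assumed)
    pt_eff = 1200.0   # effective pT of the bin in GeV (assumed)

    def excluded(m_ext, cl=0.95):
        """Exclude m_ext if P(N >= n_obs) under the suppressed expectation
        drops below 1 - cl, i.e. the predicted deficit is not observed."""
        mu = n_sm * np.exp(-(pt_eff / m_ext) ** 2)  # assumed suppression form
        return poisson.sf(n_obs - 1, mu) < 1.0 - cl

    # The lower limit is the largest scanned scale that is still excluded.
    scan = np.linspace(1000.0, 6000.0, 1001)
    print(f"toy 95% CL lower limit: {max(m for m in scan if excluded(m)):.0f} GeV")
    ```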

  14. Metamodeling and mapping of nitrate flux in the unsaturated zone and groundwater, Wisconsin, USA

    NASA Astrophysics Data System (ADS)

    Nolan, Bernard T.; Green, Christopher T.; Juckem, Paul F.; Liao, Lixia; Reddy, James E.

    2018-04-01

    Nitrate contamination of groundwater in agricultural areas poses a major challenge to the sustainability of water resources. Aquifer vulnerability models are useful tools that can help resource managers identify areas of concern, but quantifying nitrogen (N) inputs in such models is challenging, especially at large spatial scales. We sought to improve regional nitrate (NO3-) input functions by characterizing unsaturated zone NO3- transport to groundwater through use of surrogate, machine-learning metamodels of a process-based N flux model. The metamodels used boosted regression trees (BRTs) to relate mappable landscape variables to parameters and outputs of a previous "vertical flux method" (VFM) applied at sampled wells in the Fox, Wolf, and Peshtigo (FWP) river basins in northeastern Wisconsin. In this context, the metamodels upscaled the VFM results throughout the region, and the VFM parameters and outputs are the metamodel response variables. The study area encompassed the domain of a detailed numerical model that provided additional predictor variables, including groundwater recharge, to the metamodels. We used a statistical learning framework to test a range of model complexities to identify suitable hyperparameters of the six BRT metamodels corresponding to each response variable of interest: NO3- source concentration factor (which determines the local NO3- input concentration); unsaturated zone travel time; NO3- concentration at the water table in 1980, 2000, and 2020 (three separate metamodels); and NO3- "extinction depth", the eventual steady state depth of the NO3- front. The final metamodels were trained to 129 wells within the active numerical flow model area, and considered 58 mappable predictor variables compiled in a geographic information system (GIS). These metamodels had training and cross-validation testing R2 values of 0.52–0.86 and 0.22–0.38, respectively, and predictions were compiled as maps of the above response variables. Testing performance was reasonable, considering that we limited the metamodel predictor variables to mappable factors as opposed to using all available VFM input variables. Relationships between metamodel predictor variables and mapped outputs were generally consistent with expectations, e.g., with greater source concentrations and NO3- at the groundwater table in areas of intensive crop use and well-drained soils. Shorter unsaturated zone travel times in poorly drained areas likely indicated preferential flow through clay soils, and a tendency for fine-grained deposits to collocate with areas of shallower water table. Numerical estimates of groundwater recharge were important in the metamodels and may have been a proxy for N input and redox conditions in the northern FWP, which had shallow predicted NO3- extinction depth. The metamodel results provide proof-of-concept for regional characterization of unsaturated zone NO3- transport processes in a statistical framework based on readily mappable GIS input variables.
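
    As a rough sketch of the workflow the abstract describes (one boosted regression tree per response variable, complexity tuning, training vs cross-validated R2), the following uses scikit-learn's gradient boosting as the BRT implementation; the predictor matrix and response here are random placeholders rather than the study's GIS data.

    ```python
    # Minimal BRT metamodel sketch with hyperparameter search; X stands in for
    # the 58 mappable GIS predictors at 129 wells, y for one VFM response.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import GridSearchCV, cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(129, 58))   # placeholder predictor table
    y = rng.normal(size=129)         # placeholder response variable

    # Test a range of model complexities, as in the study's framework
    param_grid = {"n_estimators": [200, 500], "max_depth": [2, 3, 5],
                  "learning_rate": [0.01, 0.05]}
    search = GridSearchCV(GradientBoostingRegressor(random_state=0),
                          param_grid, cv=5, scoring="r2")
    search.fit(X, y)

    best = search.best_estimator_
    print("training R2:", round(best.score(X, y), 2))
    print("cross-validated R2:",
          round(cross_val_score(best, X, y, cv=5, scoring="r2").mean(), 2))
    ```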

  15. LLNL Scientists Use NERSC to Advance Global Aerosol Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergmann, D J; Chuang, C; Rotman, D

    2004-10-13

    While "greenhouse gases" have been the focus of climate change research for a number of years, DOE's "Aerosol Initiative" is now examining how aerosols (small particles of approximately micron size) affect the climate on both a global and regional scale. Scientists in the Atmospheric Science Division at Lawrence Livermore National Laboratory (LLNL) are using NERSC's IBM supercomputer and LLNL's IMPACT (atmospheric chemistry) model to perform simulations showing the historic effects of sulfur aerosols at a finer spatial resolution than ever before. Simulations were carried out for five decades, from the 1950s through the 1990s. The results clearly show the effects of the changing global pattern of sulfur emissions. Whereas in 1950 the United States emitted 41 percent of the world's sulfur aerosols, this figure had dropped to 15 percent by 1990, due to conservation and anti-pollution policies. By contrast, the fraction of total sulfur emissions of European origin has dropped by only a factor of 2, and the Asian emission fraction jumped sixfold during the same time, from 7 percent in 1950 to 44 percent in 1990. Under a special allocation of computing time provided by the Office of Science INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program, Dan Bergmann, working with a team of LLNL scientists including Cathy Chuang, Philip Cameron-Smith, and Bala Govindasamy, was able to carry out a large number of calculations during the past month, making the aerosol project one of the largest users of NERSC resources. The applications ran on 128 and 256 processors. The objective was to assess the effects of anthropogenic (man-made) sulfate aerosols. The IMPACT model calculates the rate at which SO2 (a gas emitted by industrial activity) is oxidized and forms particles known as sulfate aerosols. These particles have a short lifespan in the atmosphere, often washing out in about a week. This means that their effects on climate tend to be more regional, occurring near the area where the SO2 is emitted. To accurately study these regional effects, Bergmann needed to run the simulations at a finer horizontal resolution, as the coarser resolution (typically 300 km by 300 km) of other climate models is insufficient for studying changes on a regional scale. Livermore's use of CAM3, the Community Atmospheric Model, a high-resolution climate model developed at NCAR (with collaboration from DOE), allows a 100 km by 100 km grid to be applied. NERSC's terascale computing capability provided the needed computational horsepower to run the application at this finer level.

  16. Ultrafast visualization of the structural evolution of dense hydrogen towards warm dense matter

    NASA Astrophysics Data System (ADS)

    Fletcher, Luke

    2016-10-01

    Hot dense hydrogen far from equilibrium is ubiquitous in nature, occurring during some of the most violent and least understood events in our universe, such as star formation, supernova explosions, and the creation of cosmic rays. It is also a state of matter important for applications in inertial confinement fusion research and in laser particle acceleration. Rapid progress has occurred in recent years in characterizing the high-pressure structural properties of dense hydrogen under static or dynamic compression. Here, we show that spectrally and angularly resolved x-ray scattering measures the thermodynamic properties of dense hydrogen and resolves the ultrafast evolution and relaxation towards thermodynamic equilibrium. These studies apply ultra-bright x-ray pulses from the Linac Coherent Light Source (LCLS). The interaction of rapidly heated cryogenic hydrogen with a high-peak-power optical laser is visualized with intense LCLS x-ray pulses in a high-repetition-rate pump-probe setting. We demonstrate that electron-ion coupling is affected by the small number of particles in the Debye screening cloud, resulting in much slower ion temperature equilibration than predicted by standard theory. This work was supported by the DOE Office of Science, Fusion Energy Science under FWP 100182.
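
    A back-of-envelope estimate shows why "a small number of particles in the Debye screening cloud" is the operative phrase; the density and temperature below are assumed order-of-magnitude values for heated cryogenic hydrogen, not measured ones.

    ```python
    # Debye length and Debye-sphere population for assumed warm-dense-hydrogen
    # conditions; N_D << 1 signals strong coupling, where standard
    # weak-coupling relaxation theory breaks down.
    import math

    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
    kB = 1.380649e-23         # Boltzmann constant, J/K
    e = 1.602176634e-19       # elementary charge, C

    n_e = 1e28                # electron density, m^-3 (assumed)
    T_e = 2.0 * 11604.5       # 2 eV electron temperature in kelvin (assumed)

    lam_D = math.sqrt(eps0 * kB * T_e / (n_e * e**2))
    N_D = (4.0 / 3.0) * math.pi * n_e * lam_D**3

    print(f"Debye length: {lam_D:.2e} m")
    print(f"particles in the Debye sphere: {N_D:.2f}")   # well below 1 here
    ```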

  17. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  18. Analysis Report for Exascale Storage Requirements for Scientific Data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruwart, Thomas M.

    Over the next 10 years, the Department of Energy will be transitioning from Petascale to Exascale Computing, with data storage, networking, and infrastructure requirements increasing by three orders of magnitude. The technologies and best practices used today are the result of a relatively slow evolution of ancestral technologies developed in the 1950s and 1960s. These include magnetic tape, magnetic disk, networking, databases, file systems, and operating systems. These technologies will continue to evolve over the next 10 to 15 years on a reasonably predictable path. Experience with the challenges involved in transitioning these fundamental technologies from Terascale to Petascale computing systems has raised questions about how they will scale another 3 or 4 orders of magnitude to meet the requirements imposed by Exascale computing systems. This report is focused on the most concerning scaling issues with data storage systems as they relate to High Performance Computing, and presents options for a path forward. Given the ability to store exponentially increasing amounts of data, far more advanced concepts and use of metadata will be critical to managing data in Exascale computing systems.
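
    A quick sizing exercise conveys what the three-orders-of-magnitude transition means for storage: the sustained bandwidth needed merely to checkpoint machine memory scales with it. The memory sizes and checkpoint window below are assumptions for illustration, not figures from this report.

    ```python
    # Checkpoint-bandwidth sizing sketch under assumed machine parameters.
    PB, EB = 1e15, 1e18   # bytes

    def checkpoint_bandwidth(memory_bytes, window_seconds):
        """Aggregate write bandwidth to dump all memory within one window."""
        return memory_bytes / window_seconds

    window = 30 * 60      # assumed 30-minute checkpoint window
    for label, mem in [("petascale, ~1 PB of memory", 1 * PB),
                       ("exascale, ~1 EB of hot data", 1 * EB)]:
        gbps = checkpoint_bandwidth(mem, window) / 1e9
        print(f"{label}: {gbps:,.0f} GB/s sustained")
    ```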

  19. Women in Physics in Germany, 2008

    NASA Astrophysics Data System (ADS)

    Kluge, Hanna

    2009-04-01

    The status of women in physics in Germany has not changed dramatically in the three years since the last IUPAP Women in Physics Conference was held in 2005. The salary of a woman remains approximately 25% lower than that of a man in a comparable professional position. The number of female professors is growing slowly. The proportion of young women among those beginning to study physics is around 20%. There is, however, a noticeable increase in the organization and societal acceptance of female physicists, and an increasing number of men taking part in this process. There is also increased acceptance and support of dual-career couples. The Helmholtz Alliance for "Physics at the Terascale" founded a dual-career option program. In 2008, the annual Conference of German Female Physicists (DPT) held in Muenster became an official conference of the DPG (German Physical Society). Various scientific groups working for equal opportunity have formed a "network of networks." At DESY (the German Electron Synchrotron), a group of women led by an equal opportunity officer is involved in the entire process of hiring new staff members for all positions, including directors.

  20. Metamodeling and mapping of nitrate flux in the unsaturated zone and groundwater, Wisconsin, USA

    USGS Publications Warehouse

    Nolan, Bernard T.; Green, Christopher T.; Juckem, Paul F.; Liao, Lixia; Reddy, James E.

    2018-01-01

    Nitrate contamination of groundwater in agricultural areas poses a major challenge to the sustainability of water resources. Aquifer vulnerability models are useful tools that can help resource managers identify areas of concern, but quantifying nitrogen (N) inputs in such models is challenging, especially at large spatial scales. We sought to improve regional nitrate (NO3−) input functions by characterizing unsaturated zone NO3− transport to groundwater through use of surrogate, machine-learning metamodels of a process-based N flux model. The metamodels used boosted regression trees (BRTs) to relate mappable landscape variables to parameters and outputs of a previous “vertical flux method” (VFM) applied at sampled wells in the Fox, Wolf, and Peshtigo (FWP) river basins in northeastern Wisconsin. In this context, the metamodels upscaled the VFM results throughout the region, and the VFM parameters and outputs are the metamodel response variables. The study area encompassed the domain of a detailed numerical model that provided additional predictor variables, including groundwater recharge, to the metamodels. We used a statistical learning framework to test a range of model complexities to identify suitable hyperparameters of the six BRT metamodels corresponding to each response variable of interest: NO3− source concentration factor (which determines the local NO3− input concentration); unsaturated zone travel time; NO3− concentration at the water table in 1980, 2000, and 2020 (three separate metamodels); and NO3− “extinction depth”, the eventual steady state depth of the NO3− front. The final metamodels were trained to 129 wells within the active numerical flow model area, and considered 58 mappable predictor variables compiled in a geographic information system (GIS). These metamodels had training and cross-validation testing R2 values of 0.52–0.86 and 0.22–0.38, respectively, and predictions were compiled as maps of the above response variables. Testing performance was reasonable, considering that we limited the metamodel predictor variables to mappable factors as opposed to using all available VFM input variables. Relationships between metamodel predictor variables and mapped outputs were generally consistent with expectations, e.g., with greater source concentrations and NO3− at the groundwater table in areas of intensive crop use and well-drained soils. Shorter unsaturated zone travel times in poorly drained areas likely indicated preferential flow through clay soils, and a tendency for fine-grained deposits to collocate with areas of shallower water table. Numerical estimates of groundwater recharge were important in the metamodels and may have been a proxy for N input and redox conditions in the northern FWP, which had shallow predicted NO3− extinction depth. The metamodel results provide proof-of-concept for regional characterization of unsaturated zone NO3− transport processes in a statistical framework based on readily mappable GIS input variables.

  1. FWP executive summaries, Basic Energy Sciences Materials Sciences Programs (SNL/NM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samara, G.A.

    1997-05-01

    The BES Materials Sciences Program has the central theme of Scientifically Tailored Materials. The major objective of this program is to combine Sandia's expertise and capabilities in the areas of solid state sciences, advanced atomic-level diagnostics and materials synthesis and processing science to produce new classes of tailored materials as well as to enhance the properties of existing materials for US energy applications and for critical defense needs. Current core research in this program includes the physics and chemistry of ceramics synthesis and processing, the use of energetic particles for the synthesis and study of materials, tailored surfaces and interfaces for materials applications, chemical vapor deposition sciences, artificially-structured semiconductor materials science, advanced growth techniques for improved semiconductor structures, transport in unconventional solids, atomic-level science of interfacial adhesion, high-temperature superconductors, and the synthesis and processing of nano-size clusters for energy applications. In addition, the program includes the following three smaller efforts initiated in the past two years: (1) Wetting and Flow of Liquid Metals and Amorphous Ceramics at Solid Interfaces, (2) Field-Structured Anisotropic Composites, and (3) Composition-Modulated Semiconductor Structures for Photovoltaic and Optical Technologies. The latter is a joint effort with the National Renewable Energy Laboratory. Separate summaries are given of individual research areas.

  2. Two-temperature equilibration in warm dense hydrogen measured with x-ray scattering from the LCLS

    NASA Astrophysics Data System (ADS)

    Fletcher, Luke; High Energy Density Sciences Collaboration

    2017-10-01

    Understanding the properties of warm dense hydrogen plasmas is critical for modeling stellar and planetary interiors, as well as for inertial confinement fusion (ICF) experiments. Of central importance are the electron-ion collision and equilibration times that determine the microscopic properties in a high energy density state. Spectrally and angularly resolved x-ray scattering measurements from fs-laser heated hydrogen have resolved the picosecond evolution and energy relaxation from a two-temperature plasma towards thermodynamic equilibrium in the warm dense matter regime. The interaction of rapidly heated cryogenic hydrogen irradiated by a 400 nm, 5x10^17 W/cm^2, 70 fs laser is visualized with ultra-bright 5.5 keV x-ray pulses from the Linac Coherent Light Source (LCLS) in a 1 Hz repetition-rate pump-probe setting. We demonstrate that the energy relaxation is faster than predicted by many classical binary collision theories that rely on ad hoc cutoff parameters in the Landau-Spitzer determination of the Coulomb logarithm. This work was supported by the DOE Office of Science, Fusion Energy Science under contract No. SF00515 and supported under FWP 100182 and DOE Office of Basic Energy Sciences, Materials Sciences and Engineering Division, contract DE-AC02-76SF00515.
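
    The difficulty with the Landau-Spitzer Coulomb logarithm in this regime can be seen with a two-line estimate: under assumed warm-dense conditions the Debye length falls below the classical closest-approach distance, so ln(Lambda) goes negative and binary-collision rates require the ad hoc cutoffs the abstract mentions. The conditions below are illustrative assumptions, not the experiment's.

    ```python
    # Coulomb logarithm ln(Debye length / 90-degree impact parameter) for
    # assumed (not measured) warm-dense-hydrogen conditions.
    import math

    eps0, kB, e = 8.8541878128e-12, 1.380649e-23, 1.602176634e-19
    n_e = 1e28               # electron density, m^-3 (assumed)
    T = 2.0 * 11604.5        # 2 eV in kelvin (assumed)

    lam_D = math.sqrt(eps0 * kB * T / (n_e * e**2))   # Debye length
    b_90 = e**2 / (4 * math.pi * eps0 * kB * T)       # closest-approach scale
    print(f"ln(Lambda) = {math.log(lam_D / b_90):+.2f}")  # negative => breakdown
    ```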

  3. Modeling Reservoir-River Networks in Support of Optimizing Seasonal-Scale Reservoir Operations

    NASA Astrophysics Data System (ADS)

    Villa, D. L.; Lowry, T. S.; Bier, A.; Barco, J.; Sun, A.

    2011-12-01

    HydroSCOPE (Hydropower Seasonal Concurrent Optimization of Power and the Environment) is a seasonal time-scale tool for scenario analysis and optimization of reservoir-river networks. Developed in MATLAB, HydroSCOPE is an object-oriented model that simulates basin-scale dynamics with an objective of optimizing reservoir operations to maximize revenue from power generation, reliability in the water supply, environmental performance, and flood control. HydroSCOPE is part of a larger toolset that is being developed through a Department of Energy multi-laboratory project. This project's goal is to provide conventional hydropower decision makers with better information to execute their day-ahead and seasonal operations and planning activities by integrating water balance and operational dynamics across a wide range of spatial and temporal scales. This presentation details the modeling approach and functionality of HydroSCOPE. HydroSCOPE consists of a river-reservoir network model and an optimization routine. The river-reservoir network model simulates the heat and water balance of river-reservoir networks for time-scales up to one year. The optimization routine software, DAKOTA (Design Analysis Kit for Optimization and Terascale Applications - dakota.sandia.gov), is seamlessly linked to the network model and is used to optimize daily volumetric releases from the reservoirs to best meet a set of user-defined constraints, such as maximizing revenue while minimizing environmental violations. The network model uses 1-D approximations for both the reservoirs and river reaches and is able to account for surface and sediment heat exchange as well as ice dynamics for both models. The reservoir model also accounts for inflow, density, and withdrawal zone mixing, and diffusive heat exchange. Routing for the river reaches is accomplished using a modified Muskingum-Cunge approach that automatically calculates the internal timestep and sub-reach lengths to match the conditions of each timestep and minimize computational overhead. Power generation for each reservoir is estimated using a 2-dimensional regression that accounts for both the available head and turbine efficiency. The object-oriented architecture makes run configuration easy to update. The dynamic model inputs include inflow and meteorological forecasts, while static inputs include bathymetry data, reservoir and power generation characteristics, and topological descriptors. Ensemble forecasts of hydrological and meteorological conditions are supplied in real-time by Pacific Northwest National Laboratory and are used as a proxy for uncertainty, which is carried through the simulation and optimization process to produce output that describes the probability that different operational scenarios will be optimal. The full toolset, which includes HydroSCOPE, is currently being tested on the Feather River system in Northern California and the Upper Colorado Storage Project.
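
    For readers unfamiliar with Muskingum-type routing, the sketch below shows the classic single-reach update that the modified Muskingum-Cunge scheme generalizes (HydroSCOPE additionally adapts the internal timestep and sub-reach lengths); the K, X, dt, and inflow hydrograph here are arbitrary illustrative values.

    ```python
    # Classic Muskingum routing of an inflow hydrograph through one reach.
    def muskingum_route(inflow, K=12.0, X=0.2, dt=6.0):
        """Return the routed outflow series; K and dt are in hours."""
        denom = 2 * K * (1 - X) + dt
        c0 = (dt - 2 * K * X) / denom
        c1 = (dt + 2 * K * X) / denom
        c2 = (2 * K * (1 - X) - dt) / denom
        assert abs(c0 + c1 + c2 - 1.0) < 1e-9   # weights conserve mass
        out = [inflow[0]]
        for i_prev, i_next in zip(inflow, inflow[1:]):
            out.append(c0 * i_next + c1 * i_prev + c2 * out[-1])
        return out

    print([round(q, 1) for q in muskingum_route([10, 30, 80, 60, 40, 25, 15, 10])])
    ```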

  4. Acoustic Source Localization via Time Difference of Arrival Estimation for Distributed Sensor Networks Using Tera-Scale Optical Core Devices

    DOE PAGES

    Imam, Neena; Barhen, Jacob

    2009-01-01

    For real-time acoustic source localization applications, one of the primary challenges is the considerable growth in computational complexity associated with the emergence of ever larger, active or passive, distributed sensor networks. These sensors rely heavily on battery-operated system components to achieve highly functional automation in signal and information processing. In order to keep communication requirements minimal, it is desirable to perform as much processing on the receiver platforms as possible. However, the complexity of the calculations needed to achieve accurate source localization increases dramatically with the size of sensor arrays, resulting in substantial growth of computational requirements that cannot be readily met with standard hardware. One option to meet this challenge builds upon the emergence of digital optical-core devices. The objective of this work was to explore the implementation of key building block algorithms used in underwater source localization on the optical-core digital processing platform recently introduced by Lenslet Inc. This demonstration of considerably faster signal processing capability should be of substantial significance to the design and innovation of future generations of distributed sensor networks.
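
    The core building block in TDOA-based localization is estimating the inter-sensor delay, classically by peak-picking a cross-correlation; kernels of this kind are the natural candidates for such a platform. A minimal sketch with synthetic signals (all values made up):

    ```python
    # Estimate the delay between two noisy copies of a broadband signal by
    # locating the peak of their full cross-correlation.
    import numpy as np

    fs = 10_000                       # sample rate, Hz (assumed)
    rng = np.random.default_rng(1)
    true_delay = 37                   # samples
    s = rng.normal(size=5000)         # broadband source signal
    x1 = s + 0.1 * rng.normal(size=s.size)
    x2 = np.roll(s, true_delay) + 0.1 * rng.normal(size=s.size)

    xcorr = np.correlate(x2, x1, mode="full")
    lag = int(np.argmax(xcorr)) - (s.size - 1)   # zero lag sits at index N-1
    print(f"estimated delay: {lag} samples = {lag / fs * 1e3:.1f} ms")
    ```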

  5. CephFS: a new generation storage platform for Australian high energy physics

    NASA Astrophysics Data System (ADS)

    Borges, G.; Crosby, S.; Boland, L.

    2017-10-01

    This paper presents an implementation of a Ceph file system (CephFS) use case at the ARC Center of Excellence for Particle Physics at the Terascale (CoEPP). CoEPP’s CephFS provides a POSIX-like file system on top of a Ceph RADOS object store, deployed on commodity hardware and without single points of failure. By delivering a unique file system namespace at the different CoEPP centres spread across Australia, local HEP researchers can store, process and share data independently of their geographical locations. CephFS is also used as the back-end file system for a WLCG ATLAS user area at the Australian Tier-2. Dedicated SRM and XROOTD services, deployed on top of CoEPP’s CephFS, integrate it into ATLAS distributed data operations. This setup, while allowing Australian HEP researchers to trigger data movement via ATLAS grid tools, also enables local POSIX-like read access, giving scientists greater control over their data flows. In this article we present details of CoEPP’s Ceph/CephFS implementation and report performance I/O metrics collected during the testing/tuning phase of the system.

  6. Correlations between soil respiration and soil properties in sugarcane areas under green and slash-and-burn management systems

    NASA Astrophysics Data System (ADS)

    Rodrigo Panosso, Alan; Milori, Débora M. B. P.; Marques Júnior, José; Martin-Neto, Ladislau; La Scala, Newton, Jr.

    2010-05-01

    Soil management causes changes in soil physical, chemical, and biological properties that consequently affect its CO2 emission. In this work we studied soil respiration (FCO2) in areas of sugarcane production in southern Brazil under two different management systems: green (G), consisting of mechanized harvesting that leaves a large amount of crop residues on the soil surface, and slash-and-burn (SB), in which the residues are burned before manual harvest, leaving no residues on the soil surface. The study was conducted after the harvest period in two side-by-side grids installed in adjacent areas, with 20 measurement points each. The objective of this work was to determine whether soil physical and chemical properties within each plot could explain the spatial variability of FCO2, which is presumably influenced by each management system. Most of the soil physical properties studied showed no significant differences between management systems; on the other hand, most of the chemical properties differed significantly when the SB and G areas were compared. Total FCO2 was 31% higher in the SB plot (729 g CO2 m-2) than in the G plot (557 g CO2 m-2) throughout the 70-day period after harvest. This seems to be related to the sensitivity of FCO2 to precipitation events, as respiration in this plot increased significantly with increases in soil moisture. Although temporal variability was positively related to soil moisture, within each management system there was a negative correlation (p<0.01) between the spatial changes of FCO2 and soil moisture (MS), R = -0.56 and -0.59 for G and SB respectively. There was no spatial correlation between FCO2 and soil organic matter in either management system; however, the humification index (Hum) of organic matter was negatively linearly correlated with FCO2 in SB (R = -0.53, p<0.05) and positively linearly correlated in the G area (R = 0.42, p<0.10). Multiple regression analysis applied to each management system indicates that 63% of the FCO2 spatial variability in the G plot could be explained by the model: FCO2(G) = 4.11978 - 0.07672MS + 0.0045Hum + 1.5352K - 0.04474FWP, where K and FWP are the potassium content and free water porosity in the G area, respectively. On the other hand, 75% of the FCO2 spatial variability in the SB plot was accounted for by the model: FCO2(SB) = 10.66774 - 0.08624MS - 0.02904Hum - 2.42548K. Therefore, soil moisture, the humification index of organic matter, and the potassium level were the main properties able to explain the spatial variability of FCO2 in both sugarcane management systems. This result indicates that changes in sugarcane management systems could result in changes in soil chemical properties, especially the humification index of organic matter. It seems that in the conversion from the slash-and-burn to the green harvest system, free water porosity becomes an important aspect in explaining part of the FCO2 spatial variability in the green managed system.
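
    The two fitted regression models quoted in the abstract restate directly as functions, which makes the sign and size of each term easy to inspect; the example inputs below are arbitrary placeholder numbers, not measured values.

    ```python
    # The abstract's fitted multiple-regression models as plain functions.
    def fco2_green(ms, hum, k, fwp):
        """Green-harvest model (explains ~63% of spatial variability)."""
        return 4.11978 - 0.07672 * ms + 0.0045 * hum + 1.5352 * k - 0.04474 * fwp

    def fco2_slash_burn(ms, hum, k):
        """Slash-and-burn model (explains ~75% of spatial variability)."""
        return 10.66774 - 0.08624 * ms - 0.02904 * hum - 2.42548 * k

    print(round(fco2_green(ms=20.0, hum=100.0, k=1.5, fwp=30.0), 2))
    print(round(fco2_slash_burn(ms=20.0, hum=100.0, k=1.5), 2))
    ```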

  7. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that ''computational science has become critical to scientific leadership, economic competitiveness, and nationalmore » security''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''hands and feet'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort. 
The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  8. Fluorescence and absorption spectroscopy for warm dense matter studies and ICF plasma diagnostics

    NASA Astrophysics Data System (ADS)

    Hansen, Stephanie

    2017-10-01

    The burning core of an inertial confinement fusion (ICF) plasma at stagnation is surrounded by a shell of warm, dense matter whose properties are difficult both to model (due to a complex interplay of thermal, degeneracy, and strong coupling effects) and to diagnose (due to low emissivity and high opacity). We demonstrate a promising technique to study the warm dense shells of ICF plasmas based on the fluorescence emission of dopants or impurities in the shell material. This emission, which is driven by x-rays produced in the hot core, exhibits signature changes in response to compression and heating. High-resolution measurements of absorption and fluorescence features can refine our understanding of the electronic structure of material under high compression, improve our models of density-driven phenomena such as ionization potential depression and plasma polarization shifts, and help diagnose shell density, temperature, mass distribution, and residual motion in ICF plasmas at stagnation. Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. This work was supported by the U.S. Department of Energy, Office of Science Early Career Research Program, Office of Fusion Energy Sciences under FWP-14-017426.

  9. Northwest Montana Wildlife Habitat Enhancement: Hungry Horse Elk Mitigation Project: Monitoring and Evaluation Plan.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casey, Daniel; Malta, Patrick

    Portions of two important elk (Cervus elaphus) winter ranges totaling 8749 acres were lost due to the construction of the Hungry Horse Dam hydroelectric facility. This habitat loss decreased the carrying capacity for both elk and mule deer (Odocoileus hemionus). In 1985, using funds from the Bonneville Power Administration (BPA) as authorized by the Northwest Power Act, the Montana Department of Fish, Wildlife and Parks (FWP) completed a wildlife mitigation plan for Hungry Horse Reservoir. This plan identified habitat enhancement of currently occupied winter range as the most cost-efficient, easily implemented mitigation alternative available to address these large-scale losses of winter range. The Columbia Basin Fish and Wildlife Program, as amended in 1987, authorized BPA to fund winter range enhancement to meet an adjusted goal of 133 additional elk. A 28-month advance design phase of the BPA-funded project was initiated in September 1987. Primary goals of this phase of the project included detailed literature review, identification of enhancement areas, baseline (elk population and habitat) data collection, and preparation of 3-year and 10-year implementation plans. This document will serve as a site-specific habitat and population monitoring plan which outlines our recommendations for evaluating the results of enhancement efforts against mitigation goals. 25 refs., 13 figs., 7 tabs.

  10. Realistic calculations for c-coefficients of the isobaric mass multiplet equation in 1 p 0 f shell nuclei

    NASA Astrophysics Data System (ADS)

    Ormand, Erich; Brown, Alex; Hjorth-Jensen, Morten

    2017-09-01

    We present calculations of the c-coefficients of the isobaric mass multiplet equation for nuclei from A = 42 to A = 54, based on two-body effective interactions derived from three realistic nucleon-nucleon interactions: CD-Bonn, N3LO, and Argonne V18. The two-body effective interactions were derived using G-matrix or Vlowk methods augmented by perturbation theory extended to third order. We demonstrate a clear dependence of the c-coefficients on the short-range charge-symmetry breaking (CSB) part of the strong interaction, which is required to reproduce their overall behavior as a function of excitation (angular momentum). We find, however, that the CSB component in all three realistic nucleon-nucleon interactions is too large when compared to experiment and that, furthermore, there is significant disagreement among the three interactions. This implies either: 1) a deficiency in our understanding of isospin-symmetry breaking in the nucleon-nucleon interaction, 2) significant isospin-symmetry breaking in the initial three-nucleon interaction, or 3) large contributions to isospin-symmetry breaking in three-nucleon interactions induced by the renormalization procedure. This work was performed for the U.S. DOE by LLNL under Contract DE-AC52-07NA27344. WEO: DOE/NP FWP SCW0498. BAB: NSF Grant No. PHY-1404442. MHJ: NSF Grant No. PHY-1404159 and the Research Council of Norway contract ISP-Fysikk/216699.
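
    For reference, the isobaric mass multiplet equation (IMME) whose c-coefficients are computed here is the standard quadratic form in the isospin projection T_z; the statement below is the textbook form, not quoted from the record itself:

        % Standard form of the isobaric mass multiplet equation (IMME):
        % masses of the 2T+1 members of an isobaric multiplet, quadratic in T_z.
        \begin{equation}
          M(\alpha, T, T_z) = a(\alpha, T) + b(\alpha, T)\, T_z + c(\alpha, T)\, T_z^{2}
        \end{equation}
        % \alpha collects the remaining quantum numbers (spin, parity, excitation);
        % the c-coefficient multiplying T_z^2 is the quantity computed in this work.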

  11. The MoEDAL Experiment at the LHC - a New Light on the Terascale Frontier

    NASA Astrophysics Data System (ADS)

    Pinfold, J. L.

    2015-07-01

    MoEDAL is a pioneering experiment designed to search for highly ionizing avatars of new physics such as magnetic monopoles or massive (pseudo-)stable charged particles. Its groundbreaking physics program defines a number of scenarios that yield potentially revolutionary insights into such foundational questions as: are there extra dimensions or new symmetries; what is the mechanism for the generation of mass; does magnetic charge exist; what is the nature of dark matter; and how did the Big Bang develop. MoEDAL's purpose is to meet such far-reaching challenges at the frontier of the field. The innovative MoEDAL detector employs unconventional methodologies tuned to the prospect of discovery physics. The largely passive MoEDAL detector, deployed at Point 8 on the LHC ring, has a dual nature. First, it acts like a giant camera, composed of nuclear track detectors - analyzed offline by ultra-fast scanning microscopes - sensitive only to new physics. Second, it is uniquely able to trap the particle messengers of physics beyond the Standard Model for further study. MoEDAL's radiation environment is monitored by a state-of-the-art real-time TimePix pixel detector array. A new MoEDAL sub-detector to extend MoEDAL's reach to millicharged, minimally ionizing particles (MMIPs) is under study.

  12. Center for Extended Magnetohydrodynamic Modeling Cooperative Agreement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carl R. Sovinec

    The Center for Extended Magnetohydrodynamic Modeling (CEMM) is developing computer simulation models for predicting the behavior of magnetically confined plasmas. Over the first phase of support from the Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) initiative, the focus has been on macroscopic dynamics that alter the confinement properties of magnetic field configurations. The ultimate objective is to provide computational capabilities to predict plasma behavior--not unlike computational weather prediction--to optimize performance and to increase the reliability of magnetic confinement for fusion energy. Numerical modeling aids theoretical research by solving complicated mathematical models of plasma behavior including strong nonlinear effects and the influences of geometrical shaping of actual experiments. The numerical modeling itself remains an area of active research, due to challenges associated with simulating multiple temporal and spatial scales. The research summarized in this report spans computational and physical topics associated with state-of-the-art simulation of magnetized plasmas. The tasks performed for this grant are categorized according to whether they are primarily computational, algorithmic, or application-oriented in nature. All involve the development and use of the Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion (NIMROD) code, which is described at http://nimrodteam.org. With respect to computation, we have tested and refined methods for solving the large algebraic systems of equations that result from our numerical approximations of the physical model. Collaboration with the Terascale Optimal PDE Solvers (TOPS) SciDAC center led us to the SuperLU_DIST software library [http://crd.lbl.gov/~xiaoye/SuperLU/] for solving large sparse matrices using direct methods on parallel computers. Switching to this solver library boosted NIMROD's performance by a factor of five in typical large nonlinear simulations, which has been publicized as a success story of SciDAC-fostered collaboration. Furthermore, the SuperLU software does not assume any mathematical symmetry, and its generality provides an important capability for extending the physical model beyond magnetohydrodynamics (MHD). With respect to algorithmic and model development, our most significant accomplishment is the development of a new method for solving plasma models that treat electrons as an independent plasma component. These 'two-fluid' models encompass MHD and add temporal and spatial scales that are beyond the response of the ion species. Implementation and testing of a previously published algorithm did not prove successful for NIMROD, and the new algorithm has since been devised, analyzed, and implemented. Two-fluid modeling, an important objective of the original NIMROD project, is now routine in 2D applications. Algorithmic components for 3D modeling are in place and tested, though further computational work is still needed for efficiency. Other algorithmic work extends the ion-fluid stress tensor to include models for parallel and gyroviscous stresses. In addition, our hot-particle simulation capability received important refinements that permitted completion of a benchmark with the M3D code. A highlight of our applications work is the edge-localized mode (ELM) modeling, which was part of the first-ever computational Performance Target for the DOE Office of Fusion Energy Science; see http://www.science.doe.gov/ofes/performancetargets.shtml.
Our efforts allowed MHD simulations to progress late into the nonlinear stage, where energy is conducted to the wall location. They also produced a two-fluid ELM simulation starting from experimental information and demonstrating critical drift effects that are characteristic of two-fluid physics. Another important application is the internal kink mode in a tokamak. Here, the primary purpose of the study has been to benchmark the two main code development lines of CEMM, NIMROD and M3D, on a relevant nonlinear problem. Results from the two codes show repeating nonlinear relaxation events driven by the kink mode over quantitatively comparable timescales. The work has launched a more comprehensive nonlinear benchmarking exercise, where realistic transport effects have an important role.
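
    The factor-of-five SuperLU speedup reported above comes from replacing iterative solves with a sparse direct factorization whose factors can be reused across many right-hand sides. As a rough single-node sketch of the same pattern in Python (using SciPy's bindings to the serial SuperLU rather than SuperLU_DIST itself, and a stand-in matrix rather than anything from NIMROD):

        # Minimal single-node sketch of a sparse direct solve, analogous in spirit
        # to NIMROD's use of SuperLU_DIST. scipy.sparse.linalg wraps the serial
        # SuperLU; the matrix below is an illustrative stand-in.
        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n = 1000
        # Nonsymmetric tridiagonal test matrix: direct methods need no symmetry,
        # the property the abstract highlights for extensions beyond MHD.
        A = sp.diags([-1.0, 2.5, -1.5], [-1, 0, 1], shape=(n, n), format="csc")
        b = np.ones(n)

        lu = spla.splu(A)                  # factor once...
        x = lu.solve(b)                    # ...then reuse the factors cheaply
        print(np.linalg.norm(A @ x - b))   # residual check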

  13. HEC Applications on Columbia Project

    NASA Technical Reports Server (NTRS)

    Taft, Jim

    2004-01-01

    NASA's Columbia system consists of a cluster of twenty 512-processor SGI Altix systems. Each of these systems is 3 TFLOP/s in peak performance - approximately the same as the entire compute capability at NAS just one year ago. Each 512p system is a single-system-image machine with one Linux OS, one high-performance file system, and one globally shared memory. The NAS Terascale Applications Group (TAG) is chartered to assist in scaling NASA's mission-critical codes to at least 512p in order to significantly improve emergency response during flight operations, as well as provide significant improvements in the codes and in the rate of scientific discovery across the scientific disciplines within NASA's missions. Recent accomplishments are 4x improvements to codes in the ocean modeling community, 10x performance improvements in a number of computational fluid dynamics codes used in aero-vehicle design, and 5x improvements in a number of space science codes dealing in extreme physics. The TAG group will continue its scaling work to 2048p and beyond (10,240 CPUs) as the Columbia system becomes fully operational and the upgrades to the SGI NUMAlink memory fabric are in place. The NUMAlink upgrades dramatically improve system scalability for a single application. These upgrades will allow a number of codes to execute faster at higher fidelity than ever before on any other system, thus increasing the rate of scientific discovery even further.

  14. Mitigation for the Construction and Operation of Libby Dam, 2003-2004 Annual Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunnigan, James; DeShazer, Jay; Garrow, Larry

    2004-06-01

    "Mitigation for the Construction and Operation of Libby Dam" is part of the Northwest Power and Conservation Council's (NPCC) resident fish and wildlife program. The program was mandated by the Northwest Planning Act of 1980, and is responsible for mitigating damages to fish and wildlife caused by hydroelectric development in the Columbia River Basin. The objective of Phase I of the project (1983 through 1987) was to maintain or enhance the Libby Reservoir fishery by quantifying seasonal water levels and developing ecologically sound operational guidelines. The objective of Phase II of the project (1988 through 1996) was to determine the biological effects of reservoir operations combined with biotic changes associated with an aging reservoir. The objectives of Phase III of the project (1996 through present) are to implement habitat enhancement measures to mitigate for dam effects, to provide data for implementation of operational strategies that benefit resident fish, to monitor reservoir and river conditions, and to monitor mitigation projects for effectiveness. This project completes urgent and high-priority mitigation actions as directed by the Kootenai Subbasin Plan. Montana FWP uses a combination of diverse techniques to collect a variety of physical and biological data within the Kootenai River Basin. These data serve several purposes including: the development and refinement of models used in management of water resources and operation of Libby Dam; investigations into the limiting factors of native fish populations; gathering basic life history information; tracking trends in endangered and threatened species; and the assessment of restoration or management activities intended to restore native fishes and their habitats.

  15. Resident Fish Stock above Chief Joseph and Grand Coulee Dams; 2003-2004 Annual Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connor, Jason M.; McLellan, Jason G.; Butler, Chris

    2005-11-01

    In 1980, the United States Congress enacted the Northwest Power Planning and Conservation Act (PL 96-501, 1980), which established the Northwest Power and Conservation Council (NPCC), formerly the Northwest Power Planning Council. The NPCC was directed by Congress to develop a regional Power Plan and also the Columbia River Basin Fish and Wildlife Program (FWP) to restore or replace losses of fish caused by construction and operation of hydroelectric dams in the Columbia River Basin. In developing the FWP, Congress specifically directed NPCC to solicit recommendations for measures to be included in the Program from the region's fish and wildlife agencies and Indian tribes. All measures adopted by the Council were also required to be consistent with the management objectives of the agencies and tribes [Section 4.(h)(6)(A)], the legal rights of Indian tribes in the region [Section 4.(h)(6)(D)] and be based upon and supported by the best available scientific knowledge [Section 4.(h)(6)(B)]. The Resident Fish Stock Status above Chief Joseph and Grand Coulee Dams Project, also known as the Joint Stock Assessment Project (JSAP), specifically addresses NPCC measure 10.8B.26 of the 1994 program. The Joint Stock Assessment Project is a management tool using ecosystem principles to manage artificial and native fish assemblages in altered environments existing in the Columbia River System above Chief Joseph and Grand Coulee Dams (Blocked Area). A three-phase approach of this project will enhance the fisheries resources of the Blocked Area by identifying data gaps, filling data gaps with research, and implementing management recommendations based on research results. The Blocked Area fisheries information is housed in a central location, allowing managers to view the entire system while making decisions, rather than basing management decisions on isolated portions of the system. The JSAP is designed and guided jointly by fisheries managers in the Blocked Area. The initial year of the project (1997) identified the need for a central data storage and analysis facility, coordination with the StreamNet project, compilation of Blocked Area fisheries information, and a report on the ecological condition of the Spokane River System. These needs were addressed in 1998 by acquiring a central location with a data storage and analysis system, coordinating a pilot project with StreamNet, compiling fisheries distribution data throughout the Blocked Area, identifying data gaps based on compiled information, and researching the ecological condition of the Spokane River. In order to ensure that any additional information collected throughout the life of this project will be easily stored and manipulated by the central storage facility, it was necessary to develop standardized methodologies between the JSAP fisheries managers. Common collection and analytical methodologies were developed in 1999. The project began addressing identified data gaps throughout the Blocked Area in 1999. Data collection of established projects and a variety of newly developed sampling projects are ongoing. Projects developed and undertaken by JSAP fisheries managers include investigations of the Pend Oreille River and its tributaries, the Little Spokane River and its tributaries, and water bodies within and near the Spokane Indian Reservation.
Migration patterns of adfluvial and reservoir fish in Box Canyon Reservoir and its tributaries, a baseline assessment of Boundary Reservoir and its tributaries, ecological assessment of mountain lakes in Pend Oreille County, and assessments of streams and lakes on the Spokane Indian Reservation were completed by 2001. Assessments of the Little Spokane River and its tributaries, Spokane River below Spokane Falls, tributaries to the Pend Oreille River, small lakes in Pend Oreille County, WA, and water bodies within and near the Spokane Indian Reservation were conducted in 2002 and 2003. This work was done in accordance with the scope of work approved by Bonneville Power Administration (BPA).

  16. Yakima/Klickitat Fisheries Project Monitoring and Evaluation, Final Report For the Performance Period May 1, 2008 through April 30, 2009.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sampson, Melvin R.

    2009-07-30

    The Yakima-Klickitat Fisheries Project (YKFP) is a joint project of the Yakama Nation (lead entity) and the Washington State Department of Fish and Wildlife (WDFW) and is sponsored in large part by the Bonneville Power Administration (BPA) with oversight and guidance from the Northwest Power and Conservation Council (NPCC). It is among the largest and most complex fisheries management projects in the Columbia Basin in terms of data collection and management, physical facilities, habitat enhancement and management, and experimental design and research on fisheries resources. Using principles of adaptive management, the YKFP is attempting to evaluate all stocks historically present in the Yakima subbasin and apply a combination of habitat restoration and hatchery supplementation or reintroduction, to restore the Yakima Subbasin ecosystem with sustainable and harvestable populations of salmon, steelhead and other at-risk species. The original impetus for the YKFP resulted from the landmark fishing disputes of the 1970s, the ensuing legal decisions in United States versus Washington and United States versus Oregon, and the region's realization that lost natural production needed to be mitigated in upriver areas where these losses primarily occurred. The YKFP was first identified in the NPCC's 1982 Fish and Wildlife Program (FWP) and supported in the U.S. v Oregon 1988 Columbia River Fish Management Plan (CRFMP). A draft Master Plan was presented to the NPCC in 1987 and the Preliminary Design Report was presented in 1990. In both circumstances, the NPCC instructed the Yakama Nation, WDFW and BPA to carry out planning functions that addressed uncertainties in regard to the adequacy of hatchery supplementation for meeting production objectives and limiting adverse ecological and genetic impacts. At the same time, the NPCC underscored the importance of using adaptive management principles to manage the direction of the Project. The 1994 FWP reiterated the importance of proceeding with the YKFP because of the added production and learning potential the project would provide. The YKFP is unique in having been designed to rigorously test the efficacy of hatchery supplementation. Given the current dire situation of many salmon and steelhead stocks, and the heavy reliance on artificial propagation as a recovery tool, YKFP monitoring results will have great region-wide significance. Supplementation is envisioned as a means to enhance and sustain the abundance of wild and naturally-spawning populations at levels exceeding the cumulative mortality burden imposed on those populations by habitat degradation and by natural cycles in environmental conditions. A supplementation hatchery is properly operated as an adjunct to the natural production system in a watershed. By fully integrating the hatchery with a naturally-producing population, high survival rates for the component of the population in the hatchery can raise the average abundance of the total population (hatchery component + naturally-producing component) to a level that compensates for the high mortalities imposed by human development activities and fully seeds the natural environment.
The objectives of the YKFP are to: use Ecosystem Diagnosis and Treatment (EDT) and other modeling tools to facilitate planning for project activities, enhance existing stocks, re-introduce extirpated stocks, protect and restore habitat in the Yakima Subbasin, and operate using a scientifically rigorous process that will foster application of the knowledge gained about hatchery supplementation and habitat restoration throughout the Columbia River Basin. The YKFP is still in the early stages of evaluation, and as such the data and findings presented in this report should be considered preliminary until results are published in the peer-reviewed literature. The following is a brief summary of current YKFP activities by species.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connor, Jason M.; McLellan, Jason G.; Butler, Chris

    In 1980, the United States Congress enacted the Northwest Power Planning and Conservation Act (PL 96-501, 1980), which established the Northwest Power and Conservation Council (NPCC), formerly the Northwest Power Planning Council. The NPCC was directed by Congress to develop a regional Power Plan and also the Columbia River Basin Fish and Wildlife Program (FWP) to restore or replace losses of fish caused by construction and operation of hydroelectric dams in the Columbia River Basin. In developing the FWP, Congress specifically directed NPCC to solicit recommendations for measures to be included in the Program from the region's fish and wildlife agencies and Indian tribes. All measures adopted by the Council were also required to be consistent with the management objectives of the agencies and tribes [Section 4.(h)(6)(A)], the legal rights of Indian tribes in the region [Section 4.(h)(6)(D)] and be based upon and supported by the best available scientific knowledge [Section 4.(h)(6)(B)]. The Resident Fish Stock Status above Chief Joseph and Grand Coulee Dams Project, also known as the Joint Stock Assessment Project (JSAP), specifically addresses NPCC measure 10.8B.26 of the 1994 program. The Joint Stock Assessment Project is a management tool using ecosystem principles to manage artificial fish assemblages and native fish in altered environments existing in the Columbia River System above Chief Joseph and Grand Coulee Dams (Blocked Area). A three-phase approach of this project will enhance the fisheries resources of the Blocked Area by identifying data gaps, filling data gaps with research, and implementing management recommendations based on research results. The Blocked Area fisheries information is housed in a central location, allowing managers to view the entire system while making decisions, rather than basing management decisions on isolated portions of the system. The JSAP is designed and guided jointly by fisheries managers in the Blocked Area. The initial year of the project (1997) identified the need for a central data storage and analysis facility, coordination with the StreamNet project, compilation of Blocked Area fisheries information, and a report on the ecological condition of the Spokane River System. These needs were addressed in 1998 by acquiring a central location with a data storage and analysis system, coordinating a pilot project with StreamNet, compiling fisheries distribution data throughout the Blocked Area, identifying data gaps based on compiled information, and researching the ecological condition of the Spokane River. In order to ensure that any additional information collected throughout the life of this project will be easily stored and manipulated by the central storage facility, it was necessary to develop standardized methodologies between the JSAP fisheries managers. Common collection and analytical methodologies were developed in 1999. In 1999, 2000, and 2001 the project began addressing some of the identified data gaps throughout the Blocked Area. Data collection of established projects and a variety of newly developed sampling projects are ongoing. Projects developed and undertaken by JSAP fisheries managers include investigations of the Pend Oreille River and its tributaries, the Little Spokane River and its tributaries, and water bodies within and near the Spokane Indian Reservation.
Migration patterns of adfluvial and reservoir fish in Box Canyon Reservoir and its tributaries, a baseline assessment of Boundary Reservoir and its tributaries, ecological assessment of mountain lakes in Pend Oreille County, and assessments of streams and lakes on the Spokane Indian Reservation were completed by 2001. Assessments of the Little Spokane River and its tributaries, tributaries to the Pend Oreille River, small lakes in Pend Oreille County, WA, and water bodies within and near the Spokane Indian Reservation were conducted in 2002. This work was done in accordance with the scope of work approved by Bonneville Power Administration (BPA).

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connor, Jason M.; McLellan, Jason G.; O'Connor, Dick

    In 1980, the United States Congress enacted the Northwest Power Planning and Conservation Act (PL 96-501, 1980), which established the Northwest Power Planning Council (NPPC). The NPPC was directed by Congress to develop a regional Power Plan and also the Columbia River Basin Fish and Wildlife Program (FWP) to restore or replace losses of fish caused by construction and operation of hydroelectric dams in the Columbia River Basin. In developing the FWP, Congress specifically directed NPPC to solicit recommendations for measures to be included in the Program from the region's fish and wildlife agencies and Indian tribes. All measures adopted by the Council were also required to be consistent with the management objectives of the agencies and tribes [Section 4.(h)(6)(A)], the legal rights of Indian tribes in the region [Section 4.(h)(6)(D)] and be based upon and supported by the best available scientific knowledge [Section 4.(h)(6)(B)]. The Resident Fish Stock Status above Chief Joseph and Grand Coulee Dams Project, also known as the Joint Stock Assessment Project (JSAP), specifically addresses NPPC measure 10.8B.26 of the 1994 program. The Joint Stock Assessment Project is a management tool using ecosystem principles to manage artificial fish assemblages and native fish in altered environments existing in the Columbia River System above Chief Joseph and Grand Coulee Dams (Blocked Area). A three-phase approach of this project will enhance the fisheries resources of the Blocked Area by identifying data gaps, filling data gaps with research, and implementing management recommendations based on research results. The Blocked Area fisheries information is housed in a central location, allowing managers to view the entire system while making decisions, rather than basing management decisions on isolated portions of the system. The JSAP is designed and guided jointly by fisheries managers in the Blocked Area and the Columbia Basin Blocked Area Management Plan (1998). The initial year of the project (1997) identified the need for a central data storage and analysis facility, coordination with the StreamNet project, compilation of Blocked Area fisheries information, and a report on the ecological condition of the Spokane River System. These needs were addressed in 1998 by acquiring a central location with a data storage and analysis system, coordinating a pilot project with StreamNet, compiling fisheries distribution data throughout the Blocked Area, identifying data gaps based on compiled information, and researching the ecological condition of the Spokane River. In order to ensure that any additional information collected throughout the life of this project will be easily stored and manipulated by the central storage facility, it was necessary to develop standardized methodologies between the JSAP fisheries managers. Common collection and analytical methodologies were developed in 1999. In 1999, 2000, and 2001 the project began addressing some of the identified data gaps throughout the Blocked Area. Data collection of established projects and a variety of newly developed sampling projects are ongoing. Projects developed and undertaken by JSAP fisheries managers include investigations of the Pend Oreille River and its tributaries, the Little Spokane River and its tributaries, and water bodies within and near the Spokane Indian Reservation.
Migration patterns of adfluvial and reservoir fish in Box Canyon Reservoir and its tributaries, a baseline assessment of Boundary Reservoir and its tributaries, ecological assessment of mountain lakes in Pend Oreille County, and assessments of seven streams and four lakes on the Spokane Indian Reservation were completed by 2000. Assessments of the Little Spokane River and its tributaries, tributaries to the Pend Oreille River, small lakes in southern Pend Oreille County, and water bodies within and near the Spokane Indian Reservation were conducted in 2001. This work was done in accordance with the scope of work approved by Bonneville Power Administration (BPA).

  20. The Virtual Earthquake and Seismology Research Community e-science environment in Europe (VERCE) FP7-INFRA-2011-2 project

    NASA Astrophysics Data System (ADS)

    Vilotte, J.-P.; Atkinson, M.; Michelini, A.; Igel, H.; van Eck, T.

    2012-04-01

    Increasingly dense seismic and geodetic networks are continuously transmitting a growing wealth of data from around the world. The multiple uses of these data led the seismological community to pioneer globally distributed open-access data infrastructures, standard services and formats, e.g., the Federation of Digital Seismograph Networks (FDSN) and the European Integrated Data Archives (EIDA). Our ability to acquire observational data outpaces our ability to manage, analyze and model them. Research in seismology is today facing a fundamental paradigm shift. Enabling advanced data-intensive analysis and modeling applications challenges conventional storage, computation and communication models and requires a new holistic approach. It is instrumental to exploit the cornucopia of data, and to guarantee optimal operation and design of the high-cost monitoring facilities. The strategy of VERCE is driven by the needs of seismological data-intensive applications in data analysis and modeling. It aims to provide a comprehensive architecture and framework adapted to the scale and the diversity of those applications, integrating the data infrastructures with Grid, Cloud and HPC infrastructures. It will allow prototyping solutions for new use cases as they emerge within the European Plate Observing System (EPOS), the ESFRI initiative of the solid Earth community. Computational seismology, and its information management, increasingly revolves around massive amounts of data that stem from: (1) the flood of data from the observational systems; (2) the flood of data from large-scale simulations and inversions; (3) the ability to economically store petabytes of data online; and (4) the evolving Internet and data-aware computing capabilities. As data-intensive applications rapidly increase in scale and complexity, they require additional service-oriented architectures offering virtualization-based flexibility for complex and re-usable workflows. Scientific information management poses computer science challenges: acquisition, organization, query and visualization tasks scale almost linearly with the data volumes. The commonly used FTP-GREP metaphor allows scanning gigabyte-sized datasets today but will not work for terabyte-sized continuous waveform datasets. New data analysis and modeling methods, exploiting the signal coherence within dense network arrays, are nonlinear. Pair-algorithms on N points scale as N². Waveform inversion and stochastic simulations raise further computing and data-handling challenges. These applications are unfeasible for terascale datasets without new parallel algorithms that use near-linear processing, storage and bandwidth, and that can exploit new computing paradigms enabled by the intersection of several technologies (HPC, parallel scalable database crawlers, data-aware HPC). These issues will be discussed based on a number of core pilot data-intensive applications and use cases retained in VERCE. These core applications are related to: (1) data processing and data analysis methods based on correlation techniques; (2) CPU-intensive applications such as large-scale simulation of synthetic waveforms in complex earth systems, and full waveform inversion and tomography. We shall analyze their workflow and data flow, and their requirements for a new service-oriented architecture and a data-aware platform with services and tools.
Finally, we will outline the importance of a new collaborative environment between seismology and computer science, together with the need for the emergence and the recognition of 'research technologists' mastering the evolving data-aware technologies and the data-intensive research goals in seismology.
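
    To make the N² claim concrete: correlation analyses of the kind named in item (1) above touch every station pair, so the work grows as N(N-1)/2. A toy Python sketch with synthetic traces and a hypothetical station count makes the scaling explicit:

        # Toy illustration of why pair-algorithms scale as N^2: cross-correlating
        # every pair of N station traces requires N*(N-1)/2 correlations.
        # Traces here are synthetic noise; in VERCE the inputs would be
        # continuous waveform archives.
        import numpy as np

        n_stations, n_samples = 8, 4096          # hypothetical sizes
        traces = np.random.randn(n_stations, n_samples)

        pairs = 0
        for i in range(n_stations):
            for j in range(i + 1, n_stations):
                # Full cross-correlation of one station pair (naive O(n^2) per
                # pair; production codes use FFT-based correlation instead).
                xcorr = np.correlate(traces[i], traces[j], mode="full")
                pairs += 1

        print(pairs, "pairs ==", n_stations * (n_stations - 1) // 2)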

  1. Helium Ion Microscope: A New Tool for Sub-nanometer Imaging of Soft Materials

    NASA Astrophysics Data System (ADS)

    Shutthanandan, V.; Arey, B.; Smallwood, C. R.; Evans, J. E.

    2017-12-01

    High-resolution inspection of surface details is needed in much biological and environmental research to understand soil organic matter (SOM)-mineral interactions and to identify microbial communities and their interactions. SOM shares many imaging characteristics with biological samples, and obtaining true surface details from these materials is challenging since they consist of low atomic number materials. FE-SEM imaging has been the main imaging technique used for these materials in the past. The resulting SEM images often show loss of resolution and increased noise due to beam damage and charging issues. The newly developed helium ion microscope (HIM), on the other hand, can overcome these difficulties and resolve very fine details. HIM is very similar to scanning electron microscopy (SEM), but instead of using electrons as a probe beam, HIM uses helium ions with energies ranging from 5 to 40 keV. HIM offers a series of advantages compared to SEM, such as nanometer and sub-nanometer image resolution (about 0.35 nm), detailed surface topography, high surface sensitivity, low-Z material imaging (especially for polymers and biological samples), high image contrast, and large depth of field. In addition, HIM has the ability to image insulating materials without any conductive coatings, so that surface details are not modified. In this presentation, several scientific applications across biology and geochemistry will be presented to highlight the effectiveness of this powerful microscope. Acknowledgements: Research was performed using the Environmental Molecular Sciences Laboratory (EMSL), a national scientific user facility sponsored by the Department of Energy's Office of Biological and Environmental Research and located at PNNL. Work was supported by DOE-BER Mesoscale to Molecules Bioimaging Project FWP# 66382.

  2. Effects of Light Stress on Extracellular Cycling in a Cyanobacterial Biofilm Community

    NASA Astrophysics Data System (ADS)

    Stuart, R.; Mayali, X.; Pett-Ridge, J.; Weber, P. K.; Thelen, M.; Bebout, B.; Lipton, M. S.

    2015-12-01

    Cyanobacterial carbon excretion is crucial to carbon cycling in many microbial communities, but the nature and bioavailability of the excreted carbon depend on physiological function, which is often unknown. Cyanobacteria are the dominant primary producers in hypersaline mats, and there is a large reservoir of carbon in the extracellular matrix, but its nature and flux are understudied. In a previous study, we examined the macromolecular composition of the matrix of microbial mats from Elkhorn Slough in Monterey Bay, California, and of a unicyanobacterial culture, ESFC-1, isolated from those mats, and found evidence for cyanobacterial degradation and re-uptake of extracellular organic matter. In this work, we further explore the mechanisms of this degradation and re-uptake by examining the effects of light using a combination of high-resolution imaging mass spectrometry (NanoSIMS) and metaproteomics of extracellular proteins. Based on these findings, we propose that mat cyanobacteria store and recycle organic material from the mat extracellular matrix. Cyanobacteria can account for 70-90% of the biomass in the upper phototrophic layer of the mats, so their re-uptake of organic carbon and nitrogen has the potential to re-define organic matter availability in these systems. This work has implications for cyanobacterial adaptation to dynamic environments like microbial mats, where uptake of carbon and nitrogen in variable forms may be necessary to persist. This research was supported by the U.S. Department of Energy Office of Science, Office of Biological and Environmental Research Genomic Science program under FWP SCW1039. Work at LLNL was performed under the auspices of the U.S. Department of Energy under Contract DE-AC52-07NA27344.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springmeyer, R R; Brugger, E; Cook, R

    The Data group provides data analysis and visualization support to its customers. This consists primarily of the development and support of VisIt, a data analysis and visualization tool. Support ranges from answering questions about the tool, to providing classes on how to use the tool, to performing data analysis and visualization for customers. The Information Management and Graphics Group supports and develops tools that enhance our ability to access, display, and understand large, complex data sets. Activities include applying visualization software for large-scale data exploration; running video production labs on two networks; supporting graphics libraries and tools for end users; maintaining PowerWalls and assorted other displays; and developing software for searching and managing scientific data. Researchers in the Center for Applied Scientific Computing (CASC) work on various projects including the development of visualization techniques for large-scale data exploration that are funded by the ASC program, among others. The researchers also have LDRD projects and collaborations with other lab researchers, academia, and industry. The IMG group is located in the Terascale Simulation Facility, home to Dawn, Atlas, BGL, and others, which includes both classified and unclassified visualization theaters, a visualization computer floor and deployment workshop, and video production labs. We continued to provide the traditional graphics group consulting and video production support. We maintained five PowerWalls and many other displays. We deployed a 576-node Opteron/IB cluster with 72 TB of memory providing a visualization production server on our classified network. We continue to support a 128-node Opteron/IB cluster providing a visualization production server for our unclassified systems and an older 256-node Opteron/IB cluster for the classified systems, as well as several smaller clusters to drive the PowerWalls. The visualization production systems include NFS servers to provide dedicated storage for data analysis and visualization. The ASC projects have delivered new versions of visualization and scientific data management tools to end users and continue to refine them. VisIt had four releases during the past year, ending with VisIt 2.0. We released version 2.4 of Hopper, a Java application for managing and transferring files. This release included a graphical disk usage view which works on all types of connections and an aggregated copy feature for transferring massive datasets quickly and efficiently to HPSS. We continue to use and develop Blockbuster and Telepath. Both the VisIt and IMG teams were engaged in a variety of movie production efforts during the past year in addition to their development tasks.
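
    Much of the VisIt support work described above is scriptable through VisIt's Python command-line interface. A minimal session sketch follows; the database name "data.silo" and variable "pressure" are hypothetical placeholders, and exact attribute options vary by VisIt version:

        # Minimal VisIt Python CLI session sketch (run inside "visit -cli").
        # "data.silo" and "pressure" are hypothetical placeholders for a real
        # database file and scalar variable; the functions are VisIt's standard
        # CLI calls.
        OpenDatabase("data.silo")            # open a dataset
        AddPlot("Pseudocolor", "pressure")   # color the mesh by a scalar field
        DrawPlots()                          # render the active plots
        SaveWindow()                         # write the current window to an image file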

  4. Gigascale Silicon Photonic Transmitters Integrating HBT-based Carrier-injection Electroabsorption Modulator Structures

    NASA Astrophysics Data System (ADS)

    Fu, Enjin

    Demand for bandwidth is rapidly increasing, driven by data-intensive applications such as high-definition (HD) video streaming, cloud storage, and terascale computing. Next-generation high-performance computing systems require power-efficient chip-to-chip and intra-chip interconnect yielding densities on the order of 1 Tbps/cm². The performance requirements of such systems are the driving force behind the development of silicon integrated optical interconnect, providing a cost-effective solution for fully integrated optical interconnect systems on a single substrate. Compared to conventional electrical interconnect, optical interconnects have several advantages, including frequency-independent insertion loss, resulting in ultra-wide bandwidth and reduced link latency. For high-speed optical transmitter modules, the optical modulator is a key component of the optical I/O channel. This thesis presents a silicon integrated optical transmitter module design based on a novel silicon HBT-based carrier-injection electroabsorption modulator (EAM), which has the merits of wide optical bandwidth, high speed, low power, low drive voltage, small footprint, and high modulation efficiency. The structure, mechanism, and fabrication of the modulator are discussed, followed by electrical modeling of the post-processed modulator device. The design and realization of a 10 Gbps monolithic optical transmitter module integrating the driver circuit architecture and the HBT-based EAM device in a 130 nm BiCMOS process is discussed. For high power efficiency, a 6 Gbps ultra-low-power driver IC implemented in a 130 nm BiCMOS process is presented. The driver IC incorporates an integrated 2^7-1 pseudo-random bit sequence (PRBS) generator for reliable high-speed testing, and a driver circuit featuring digitally tuned pre-emphasis signal strength. With outstanding drive capability, the driver module can be applied to a wide range of carrier-injection modulators and light-emitting diodes (LEDs) with drive voltage requirements below 1.5 V. Measurement results show that an optical link based on a 70 MHz red LED works well at 300 Mbps when using the pre-emphasis driver module. A traveling-wave electrode (TWE) modulator structure is presented, including a novel design methodology to address process limitations imposed by a commercial silicon fabrication technology. Results from 3D full-wave EM simulation demonstrate the application of the design methodology to achieve specifications including phase velocity matching, insertion loss, and impedance matching. Results show the HBT-based TWE-EAM system has a bandwidth higher than 60 GHz.
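
    The 2^7-1 PRBS mentioned above is conventionally produced by a 7-bit linear-feedback shift register with feedback polynomial x^7 + x^6 + 1. A short software model in Python, purely illustrative of the hardware block integrated in the driver IC:

        # Software model of a PRBS7 generator (polynomial x^7 + x^6 + 1), the
        # standard 2**7 - 1 = 127-bit pseudo-random bit sequence used for link
        # testing. Illustrative only; in the thesis this is a hardware block.
        def prbs7(seed=0x7F):
            state = seed & 0x7F                               # 7-bit state, must be nonzero
            while True:
                new_bit = ((state >> 6) ^ (state >> 5)) & 1   # taps at bits 7 and 6
                state = ((state << 1) | new_bit) & 0x7F
                yield new_bit

        gen = prbs7()
        seq = [next(gen) for _ in range(127)]
        # The sequence repeats with period 127 for any nonzero seed:
        assert seq == [next(gen) for _ in range(127)]
        print("".join(map(str, seq[:32])))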

  5. Characterizing phosphorus dynamics in tile-drained agricultural fieldsof eastern Wisconsin

    USGS Publications Warehouse

    Madison, Allison; Ruark, Matthew; Stuntebeck, Todd D.; Komiskey, Matthew J.; Good, Laura W.; Drummy, Nancy; Cooley, Eric

    2014-01-01

    Artificial subsurface drainage provides an avenue for the rapid transfer of phosphorus (P) from agricultural fields to surface waters. This is of particular interest in eastern Wisconsin, where there is a concentrated population of dairy farms and high clay content soils prone to macropore development. Through collaboration with private landowners, surface and tile drainage was measured and analyzed for dissolved reactive P (DRP) and total P (TP) losses at four field sites in eastern Wisconsin between 2005 and 2009. These sites, which received frequent manure applications, represent a range of crop management practices which include: two chisel plowed corn fields (CP1, CP2), a no-till corn–soybean field (NT), and a grazed pasture (GP). Subsurface drainage was the dominant pathway of water loss at each site accounting for 66–96% of total water discharge. Average annual flow-weighted (FW) TP concentrations were 0.88, 0.57, 0.21, and 1.32 mg L⁻¹ for sites CP1, CP2, NT, and GP, respectively. Low TP concentrations at the NT site were due to tile drain interception of groundwater flow where large volumes of tile drainage water diluted the FW-TP concentrations. Subsurface pathways contributed between 17% and 41% of the TP loss across sites. On a drainage event basis, total drainage explained between 36% and 72% of the event DRP loads across CP1, CP2, and GP; there was no relationship between event drainflow and event DRP load at the NT site. Manure applications did not consistently increase P concentrations in drainflow, but annual FW-P concentrations were greater in years receiving manure applications compared to years without manure application. Based on these field measures, P losses from tile drainage must be integrated into field-level P budgets and P loss calculations on heavily manured soils, while also acknowledging the unique drainage patterns observed in eastern Wisconsin.

  6. Characterizing phosphorus dynamics in tile-drained agricultural fields of eastern Wisconsin

    NASA Astrophysics Data System (ADS)

    Madison, Allison M.; Ruark, Matthew D.; Stuntebeck, Todd D.; Komiskey, Matthew J.; Good, Lara W.; Drummy, Nancy; Cooley, Eric T.

    2014-11-01

    Artificial subsurface drainage provides an avenue for the rapid transfer of phosphorus (P) from agricultural fields to surface waters. This is of particular interest in eastern Wisconsin, where there is a concentrated population of dairy farms and high clay content soils prone to macropore development. Through collaboration with private landowners, surface and tile drainage was measured and analyzed for dissolved reactive P (DRP) and total P (TP) losses at four field sites in eastern Wisconsin between 2005 and 2009. These sites, which received frequent manure applications, represent a range of crop management practices which include: two chisel plowed corn fields (CP1, CP2), a no-till corn-soybean field (NT), and a grazed pasture (GP). Subsurface drainage was the dominant pathway of water loss at each site accounting for 66-96% of total water discharge. Average annual flow-weighted (FW) TP concentrations were 0.88, 0.57, 0.21, and 1.32 mg L⁻¹ for sites CP1, CP2, NT, and GP, respectively. Low TP concentrations at the NT site were due to tile drain interception of groundwater flow where large volumes of tile drainage water diluted the FW-TP concentrations. Subsurface pathways contributed between 17% and 41% of the TP loss across sites. On a drainage event basis, total drainage explained between 36% and 72% of the event DRP loads across CP1, CP2, and GP; there was no relationship between event drainflow and event DRP load at the NT site. Manure applications did not consistently increase P concentrations in drainflow, but annual FW-P concentrations were greater in years receiving manure applications compared to years without manure application. Based on these field measures, P losses from tile drainage must be integrated into field-level P budgets and P loss calculations on heavily manured soils, while also acknowledging the unique drainage patterns observed in eastern Wisconsin.
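
    For readers outside the field, a flow-weighted (FW) concentration as used in this abstract is the total load divided by the total discharge, so high-flow samples dominate the average. A worked Python sketch with made-up numbers (not data from the study):

        # Flow-weighted (FW) concentration = total load / total flow: each sample
        # concentration is weighted by the discharge it arrived with.
        # The numbers below are invented for illustration only.
        import numpy as np

        flow = np.array([120.0, 40.0, 300.0, 80.0])   # event discharge, m^3
        tp   = np.array([0.45, 1.20, 0.30, 0.90])     # TP concentration, mg/L

        fw_tp = np.sum(flow * tp) / np.sum(flow)      # flow-weighted mean, mg/L
        plain_mean = tp.mean()                        # unweighted mean, for contrast

        print(f"FW TP   = {fw_tp:.2f} mg/L")          # high-flow samples dominate
        print(f"mean TP = {plain_mean:.2f} mg/L")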

  7. Preface: SciDAC 2005

    NASA Astrophysics Data System (ADS)

    Mezzacappa, Anthony

    2005-01-01

    On 26-30 June 2005 at the Grand Hyatt on Union Square in San Francisco several hundred computational scientists from around the world came together for what can certainly be described as a celebration of computational science. Scientists from the SciDAC Program and scientists from other agencies and nations were joined by applied mathematicians and computer scientists to highlight the many successes in the past year where computation has led to scientific discovery in a variety of fields: lattice quantum chromodynamics, accelerator modeling, chemistry, biology, materials science, Earth and climate science, astrophysics, and combustion and fusion energy science. Also highlighted were the advances in numerical methods and computer science, and the multidisciplinary collaboration cutting across science, mathematics, and computer science that enabled these discoveries. The SciDAC Program was conceived and funded by the US Department of Energy Office of Science. It is the Office of Science's premier computational science program founded on what is arguably the perfect formula: the priority and focus is science and scientific discovery, with the understanding that the full arsenal of `enabling technologies' in applied mathematics and computer science must be brought to bear if we are to have any hope of attacking and ultimately solving today's computational Grand Challenge problems. The SciDAC Program has been in existence for four years, and many of the computational scientists funded by this program will tell you that the program has given them the hope of addressing their scientific problems in full realism for the very first time. Many of these scientists will also tell you that SciDAC has also fundamentally changed the way they do computational science. We begin this volume with one of DOE's great traditions, and core missions: energy research. As we will see, computation has been seminal to the critical advances that have been made in this arena. Of course, to understand our world, whether it is to understand its very nature or to understand it so as to control it for practical application, will require explorations on all of its scales. Computational science has been no less an important tool in this arena than it has been in the arena of energy research. From explorations of quantum chromodynamics, the fundamental theory that describes how quarks make up the protons and neutrons of which we are composed, to explorations of the complex biomolecules that are the building blocks of life, to explorations of some of the most violent phenomena in our universe and of the Universe itself, computation has provided not only significant insight, but often the only means by which we have been able to explore these complex, multicomponent systems and by which we have been able to achieve scientific discovery and understanding. While our ultimate target remains scientific discovery, it certainly can be said that at a fundamental level the world is mathematical. Equations ultimately govern the evolution of the systems of interest to us, be they physical, chemical, or biological systems. The development and choice of discretizations of these underlying equations is often a critical deciding factor in whether or not one is able to model such systems stably, faithfully, and practically, and in turn, the algorithms to solve the resultant discrete equations are the complementary, critical ingredient in the recipe to model the natural world. 
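
    As a concrete, textbook instance of the discretization step just described (not drawn from any particular SciDAC code), the 1D heat equation becomes an algebraic update once its derivatives are replaced by finite differences:

        % Explicit finite-difference discretization of the 1D heat equation
        % u_t = \kappa u_{xx} on a uniform grid (spacing \Delta x, step \Delta t):
        \begin{equation}
          u_i^{\,n+1} = u_i^{\,n} + \frac{\kappa\,\Delta t}{\Delta x^{2}}
          \left( u_{i+1}^{\,n} - 2\,u_i^{\,n} + u_{i-1}^{\,n} \right),
          \qquad \frac{\kappa\,\Delta t}{\Delta x^{2}} \le \frac{1}{2}.
        \end{equation}
        % The stability restriction on \Delta t illustrates why the choice of
        % discretization and solution algorithm is a deciding factor, as noted above.
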
    The use of parallel computing platforms, especially at the TeraScale, and the trend toward even larger numbers of processors, continue to present significant challenges in the development and implementation of these algorithms. Computational scientists often speak of their `workflows'. A workflow, as the name suggests, is the sum total of all complex and interlocking tasks, from simulation set up, execution, and I/O, to visualization and scientific discovery, through which the advancement in our understanding of the natural world is realized. For the computational scientist, enabling such workflows presents myriad significant challenges, and it is computer scientists who are called upon at such times to address these challenges. Simulations are currently generating data at the staggering rate of tens of TeraBytes per simulation, over the course of days. In the next few years, these data generation rates are expected to climb exponentially to hundreds of TeraBytes per simulation, performed over the course of months. The output, management, movement, analysis, and visualization of these data will be our key to unlocking the scientific discoveries buried within the data. And there is no hope of generating such data to begin with, or of scientific discovery, without stable computing platforms and a sufficiently high and sustained performance of scientific applications codes on them. Thus, scientific discovery in the realm of computational science at the TeraScale and beyond will occur at the intersection of science, applied mathematics, and computer science. The SciDAC Program was constructed to mirror this reality, and the pages that follow are a testament to the efficacy of such an approach. We would like to acknowledge the individuals on whose talents and efforts the success of SciDAC 2005 was based. Special thanks go to Betsy Riley for her work on the SciDAC 2005 Web site and meeting agenda, for lining up our corporate sponsors, for coordinating all media communications, and for her efforts in processing the proceedings contributions, to Sherry Hempfling for coordinating the overall SciDAC 2005 meeting planning, for handling a significant share of its associated communications, and for coordinating with the ORNL Conference Center and Grand Hyatt, to Angela Harris for producing many of the documents and records on which our meeting planning was based and for her efforts in coordinating with ORNL Graphics Services, to Angie Beach of the ORNL Conference Center for her efforts in procurement and setting up and executing the contracts with the hotel, and to John Bui and John Smith for their superb wireless networking and A/V set up and support. We are grateful for the relentless efforts of all of these individuals, their remarkable talents, and for the joy of working with them during this past year. They were the cornerstones of SciDAC 2005. Thanks also go to Kymba A'Hearn and Patty Boyd for on-site registration, Brittany Hagen for administrative support, Bruce Johnston for netcast support, Tim Jones for help with the proceedings and Web site, Sherry Lamb for housing and registration, Cindy Lathum for Web site design, Carolyn Peters for on-site registration, and Dami Rich for graphic design. And we would like to express our appreciation to the Oak Ridge National Laboratory, especially Jeff Nichols, the Argonne National Laboratory, the Lawrence Berkeley National Laboratory, and to our corporate sponsors, Cray, IBM, Intel, and SGI, for their support. 
We would like to extend special thanks also to our plenary speakers, technical speakers, poster presenters, and panelists for all of their efforts on behalf of SciDAC 2005 and for their remarkable achievements and contributions. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas and Margaret Smith of Institute of Physics Publishing, who worked tirelessly in order to provide us with this finished volume within two months, which is nothing short of miraculous. Finally, we wish to express our heartfelt thanks to Michael Strayer, SciDAC Director, whose vision it was to focus SciDAC 2005 on scientific discovery, around which all of the excitement we experienced revolved, and to our DOE SciDAC program managers, especially Fred Johnson, for their support, input, and help throughout.

  8. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, Jack; Moore, Shirley; Miller, Barton; Hollingsworth, Jeffrey

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.
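    The layered design described above separates a platform-dependent substrate (counter access, runtime instrumentation) from a platform-independent front end that tool developers code against. The following Python sketch illustrates only that architectural split; all class and method names are hypothetical, and it is not the project's API (the real substrate is provided by PAPI and Dyninst).

        # Illustrative sketch of the front-end/back-end split described above.
        # Names are hypothetical; a real backend would program hardware counters.
        import time
        from abc import ABC, abstractmethod

        class CounterBackend(ABC):
            """Platform-dependent substrate: one implementation per architecture/OS pair."""
            @abstractmethod
            def start(self, events): ...
            @abstractmethod
            def read(self): ...

        class WallClockBackend(CounterBackend):
            """Trivial stand-in backend; reports wall-clock time for every event."""
            def start(self, events):
                self.events, self.t0 = events, time.perf_counter()
            def read(self):
                dt = time.perf_counter() - self.t0
                return {e: dt for e in self.events}  # elapsed time per requested event

        class PerfSession:
            """Platform-independent front end: tools request performance data here."""
            def __init__(self, backend):
                self.backend = backend
            def measure(self, fn, events=("TOT_CYC",)):
                self.backend.start(events)
                fn()
                return self.backend.read()

        print(PerfSession(WallClockBackend()).measure(lambda: sum(range(10**6))))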

  9. Preliminary Climate Uncertainty Quantification Study on Model-Observation Test Beds at Earth Systems Grid Federation Repository

    NASA Astrophysics Data System (ADS)

    Lin, G.; Stephan, E.; Elsethagen, T.; Meng, D.; Riihimaki, L. D.; McFarlane, S. A.

    2012-12-01

    Uncertainty quantification (UQ) is the science of quantitative characterization and reduction of uncertainties in applications. It determines how likely certain outcomes are if some aspects of the system are not exactly known. UQ studies of datasets such as atmospheric observations have greatly increased in size and complexity because they now comprise additional complex iterative steps, involve numerous simulation runs, and can include additional analytical products such as charts, reports, and visualizations to explain levels of uncertainty. These new requirements greatly expand the need for metadata support beyond the NetCDF convention and vocabulary, and as a result an additional formal data provenance ontology is required to provide a historical explanation of the origin of the dataset, including references between the explanations and components within the dataset. This work shares a climate observation data UQ science use case and illustrates how to reduce climate observation data uncertainty and how to use a linked science application called Provenance Environment (ProvEn) to enable and facilitate scientific teams to publish, share, link, and discover knowledge about UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. Uncertainty exists in observation data sets due to sensor data processing (such as time averaging), sensor failure in extreme weather conditions, sensor manufacturing error, and so on. To reduce the uncertainty in the observation data sets, a method based on Principal Component Analysis (PCA) was proposed to recover the missing values in observation data. Several large principal components (PCs) of the data with missing values are computed from the available values using an iterative method. The computed PCs can approximate the true PCs with high accuracy provided a condition on the missing values is met, and the iterative method greatly improves the computational efficiency of computing the PCs. Moreover, noise removal is performed at the same time as the missing values are computed, by using only the several largest PCs. The uncertainty quantification is done through statistical analysis of the distribution of the different PCs. To record the above UQ process, and to provide an explanation of the uncertainty before and after the UQ process on the observation data sets, an additional data provenance ontology, such as ProvEn, is necessary. In this study, we demonstrate how to reduce observation data uncertainty on climate model-observation test beds and how to use ProvEn to record the UQ process on ESGF. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing them to better understand and convey the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, and ancillary literature citations, along with the social network of scientists associated with the study. Climate scientists will not only benefit from understanding a particular dataset within a knowledge context, but will also benefit from the cross-referencing of knowledge among the numerous UQ studies stored in ESGF.
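    The iterative recovery of missing values via a few leading principal components can be made concrete with a short sketch. The following is a generic EM-style SVD imputation under assumed parameters (rank k, iteration count, tolerance), not the authors' implementation.

        # Sketch of iterative PCA imputation: fill missing values, project onto
        # the k leading principal components, refill, repeat until converged.
        # Truncating to k components also performs the noise removal noted above.
        import numpy as np

        def pca_impute(X, k=1, n_iter=100, tol=1e-8):
            """X: 2-D array with np.nan marking missing observations."""
            missing = np.isnan(X)
            filled = np.where(missing, np.nanmean(X, axis=0), X)  # column-mean start
            for _ in range(n_iter):
                mu = filled.mean(axis=0)
                U, s, Vt = np.linalg.svd(filled - mu, full_matrices=False)
                approx = (U[:, :k] * s[:k]) @ Vt[:k] + mu  # rank-k reconstruction
                new = np.where(missing, approx, X)         # observed values are kept
                if np.max(np.abs(new - filled)) < tol:
                    break
                filled = new
            return filled

        # Example: recover a gap in a smooth synthetic observation record.
        t = np.linspace(0, 10, 200)
        X = np.outer(np.sin(t), np.ones(5)) + 0.01 * np.random.randn(200, 5)
        X[40:50, 2] = np.nan
        print(np.round(pca_impute(X)[40:50, 2], 3))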

  10. Fish Passage Assessment: Big Canyon Creek Watershed, Technical Report 2004.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christian, Richard

    2004-02-01

    This report presents the results of the fish passage assessment as outlined as part of the Protect and Restore the Big Canyon Creek Watershed project as detailed in the CY2003 Statement of Work (SOW). As part of the Northwest Power Planning Council's Columbia Basin Fish and Wildlife Program (FWP), this project is one of Bonneville Power Administration's (BPA) many efforts at off-site mitigation for damage to salmon and steelhead runs, their migration, and wildlife habitat caused by the construction and operation of federal hydroelectric dams on the Columbia River and its tributaries. The proposed restoration activities within the Big Canyon Creek watershed follow the watershed restoration approach mandated by the Fisheries and Watershed Program. The Nez Perce Tribal Fisheries/Watershed Program vision focuses on protecting, restoring, and enhancing watersheds and treaty resources within the ceded territory of the Nez Perce Tribe under the Treaty of 1855 with the United States Federal Government. The program uses a holistic approach, which encompasses entire watersheds, ridge top to ridge top, emphasizing all cultural aspects. We strive to maximize historic ecosystem health and productivity in order to restore anadromous and resident fish populations. The Nez Perce Tribal Fisheries/Watershed Program (NPTFWP) sponsors the Protect and Restore the Big Canyon Creek Watershed project. The NPTFWP has the authority to allocate funds under the provisions set forth in their contract with BPA. In the state of Idaho vast numbers of relatively small obstructions, such as road culverts, block thousands of miles of habitat suitable for a variety of fish species. To date, most agencies and land managers have not had sufficient, quantifiable data to adequately address these barrier sites. The ultimate objective of this comprehensive inventory and assessment was to identify all barrier crossings within the watershed. The barriers were then prioritized according to the amount of habitat blocked at each site and the fish life history stages impacted. This assessment protocol will hopefully prove useful to other agencies and become a model for use in other watersheds.

  11. Lowering the Barriers to Using Data: Enabling Desktop-based HPD Science through Virtual Environments and Web Data Services

    NASA Astrophysics Data System (ADS)

    Druken, K. A.; Trenham, C. E.; Steer, A.; Evans, B. J. K.; Richards, C. J.; Smillie, J.; Allen, C.; Pringle, S.; Wang, J.; Wyborn, L. A.

    2016-12-01

    The Australian National Computational Infrastructure (NCI) provides access to petascale data in climate, weather, Earth observations, and genomics, and terascale data in astronomy, geophysics, ecology and land use, as well as social sciences. The data is centralized in a closely integrated High Performance Computing (HPC), High Performance Data (HPD) and cloud facility. Despite this, there remain significant barriers for many users to find and access the data: simply hosting a large volume of data is not helpful if researchers are unable to find, access, and use the data for their particular need. Use cases demonstrate that we need to support a diverse range of users who are increasingly crossing traditional research discipline boundaries. To support their varying experience, access needs and research workflows, NCI has implemented an integrated data platform providing a range of services that enable users to interact with our data holdings. These services include:
    - A GeoNetwork catalog built on standardized Data Management Plans to search collection metadata and find relevant datasets;
    - Web data services to download or remotely access data via OPeNDAP, WMS, WCS and other protocols;
    - Virtual Desktop Infrastructure (VDI) built on a highly integrated on-site cloud with access to both the HPC peak machine and research data collections. The VDI is a fully featured environment allowing visualization, code development and analysis to take place in an interactive desktop environment; and
    - A Learning Management System (LMS) containing User Guides, Use Case examples and Jupyter Notebooks structured into courses, so that users can self-teach how to use these facilities with examples from our system across a range of disciplines.
    We will briefly present these components, and discuss how we engage with data custodians and consumers to develop standardized data structures and services that support the range of needs. We will also highlight some key developments that have improved the user experience of these services, particularly in enabling transdisciplinary science. This work combines with other developments at NCI to increase the confidence of scientists from any field to undertake research and analysis on these important data collections, regardless of their preferred work environment or level of skill.
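    As a concrete illustration of the web data services listed above, a remote subset over OPeNDAP needs only a few lines of client code, since only the requested slice crosses the network. The sketch below uses the xarray package; the URL and variable name are hypothetical placeholders, not a real NCI endpoint.

        # Sketch: remote subsetting of a gridded dataset via OPeNDAP.
        # The URL and variable name are placeholders, not a real NCI endpoint.
        import xarray as xr

        url = "https://opendap.example.org/thredds/dodsC/climate/tasmax.nc"
        ds = xr.open_dataset(url)                       # lazy: metadata only
        subset = ds["tasmax"].sel(lat=slice(-44, -10),  # Australian bounding box
                                  lon=slice(112, 154))  # (assumes ascending coords)
        print(subset.mean(dim="time").values)           # data fetched on demand here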

  12. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the needed understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provided the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible set of multi-user tools. This tool set is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.
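    One building block the abstract alludes to, online aggregation of performance data from many processes, is typically implemented over a tree-based overlay network (MRNet, in Open|SpeedShop's case). The sketch below only mimics the reduction idea in a single Python process; it is not MRNet code.

        # Conceptual sketch of tree-based online aggregation: leaves emit
        # per-process histograms, interior nodes merge their children, and the
        # front end receives one reduced result instead of N raw data streams.
        from collections import Counter

        def leaf_histogram(rank):
            # Stand-in for per-process performance data (e.g., binned latencies).
            return Counter({bin_id: (rank + bin_id) % 5 for bin_id in range(4)})

        def aggregate(nodes, fanout=16):
            """Reduce a list of histograms up a tree with the given fanout."""
            while len(nodes) > 1:
                nodes = [sum(nodes[i:i + fanout], Counter())
                         for i in range(0, len(nodes), fanout)]
            return nodes[0]

        leaves = [leaf_histogram(r) for r in range(1024)]  # 1024 "processes"
        print(aggregate(leaves))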

  13. Conformational Dynamics and Proton Relay Positioning in Nickel Catalysts for Hydrogen Production and Oxidation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franz, James A.; O'Hagan, Molly J.; Ho, Ming-Hsun

    2013-12-09

    The [Ni(PR2NR′2)2]2+ catalysts (where PR2NR′2 is 1,5-R′-3,7-R-1,5-diaza-3,7-diphosphacyclooctane) are some of the fastest reported for hydrogen production and oxidation; however, chair/boat isomerization and the presence of a fifth solvent ligand have the potential to slow catalysis by incorrectly positioning the pendant amines or blocking the addition of hydrogen. Here, we report the structural dynamics of a series of [Ni(PR2NR′2)2]n+ complexes, characterized by NMR spectroscopy and theoretical modeling. A fast exchange process, which depends on the ligand, was observed for the [Ni(CH3CN)(PR2NR′2)2]2+ complexes. This exchange process was identified to occur through a three-step mechanism comprising dissociation of the acetonitrile, boat/chair isomerization of each of the four rings formed by the phosphine ligands (including nitrogen inversion), and reassociation of acetonitrile on the opposite side of the complex. The rate of the chair/boat inversion can be influenced by varying the substituent on the nitrogen atom, but the rate of the overall exchange process is at least an order of magnitude faster than the catalytic rate in acetonitrile, demonstrating that the structural dynamics of the [Ni(PR2NR′2)2]2+ complexes do not hinder catalysis. This material is based upon work supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the US Department of Energy, Office of Science, Office of Basic Energy Sciences under FWP56073. Research by J.A.F., M.O., M.-H.H., M.L.H., D.L.D., A.M.A., S.R. and R.M.B. was carried out in the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science. W.J.S. and S.L. were funded by the DOE Office of Science Early Career Research Program through the Office of Basic Energy Sciences. T.L. was supported by the US Department of Energy, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computational resources were provided at the W. R. Wiley Environmental Molecular Sciences Laboratory (EMSL), a national scientific user facility sponsored by the Department of Energy's Office of Biological and Environmental Research located at Pacific Northwest National Laboratory; the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory; and the Jaguar supercomputer at Oak Ridge National Laboratory (INCITE 2008-2011 award supported by the Office of Science of the U.S. DOE under Contract No. DE-AC05-00OR22725).

  14. Proceedings of the 2005 International Linear Collider Workshop (LCWS05)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hewett, JoAnne (SLAC)

    2006-12-18

    Exploration of physics at the TeV scale holds the promise of addressing some of our most basic questions about the nature of matter, space, time, and energy. Discoveries of the Electroweak Symmetry Breaking mechanism, Supersymmetry, Extra Dimensions of space, Dark Matter particles, and new forces of nature are all possible. We have been waiting and planning for this exploration for over 20 years. In 2007 the Large Hadron Collider at CERN will begin its operation and will break into this new energy frontier. A new era of understanding will emerge as the LHC data maps out the Terascale. With the LHC discoveries, new compelling questions will arise. Responding to these questions will call for a new tool with greater sensitivity--the International Linear Collider. Historically, the most striking progress in the exploration of new energy frontiers has been made from combining results from hadron and electron-positron colliders. The precision measurements possible at the ILC will reveal the underlying theory which gave rise to the particles discovered at the LHC and will open the window to even higher energies. The world High Energy Physics community has reached an accord that an e+e- linear collider operating at 0.5-1.0 TeV would provide both unique and essential scientific opportunities; the community has endorsed with highest priority the construction of such a machine. A major milestone toward this goal was reached in August 2004 when the International Committee on Future Accelerators approved a recommendation for the technology of the future International Linear Collider. A global research and design effort is now underway to construct a global design report for the ILC. This endeavor is directed by Barry Barish of the California Institute of Technology. The offer, made by Jonathan Dorfan on behalf of ICFA, and acceptance of this directorship took place during the opening plenary session of this workshop. The 2005 International Linear Collider Workshop was held at Stanford University from 18 March through 22 March, 2005. This workshop was hosted by the Stanford Linear Accelerator Center and sponsored by the World Wide Study for future e+e- linear colliders. It was the eighth in a series of International Workshops (the first was held in Saariselka, Finland in 1991) devoted to the physics and detectors associated with high energy e+e- linear colliders. 397 physicists from 24 countries participated in the workshop. These proceedings represent the presentations and discussions which took place during the workshop. The contributions are comprised of physics studies, detector specifications, and accelerator design for the ILC. These proceedings are organized in two volumes and include contributions from both the plenary and parallel sessions.

  15. Preface: SciDAC 2006

    NASA Astrophysics Data System (ADS)

    Tang, William M.

    2006-01-01

    The second annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held from June 25-29, 2006 at the new Hyatt Regency Hotel in Denver, Colorado. This conference showcased outstanding SciDAC-sponsored computational science results achieved during the past year across many scientific domains, with an emphasis on science at scale. Exciting computational science that has been accomplished outside of the SciDAC program both nationally and internationally was also featured to help foster communication between SciDAC computational scientists and those funded by other agencies. This was illustrated by many compelling examples of how domain scientists collaborated productively with applied mathematicians and computer scientists to effectively take advantage of terascale computers (capable of performing trillions of calculations per second) not only to accelerate progress in scientific discovery in a variety of fields but also to show great promise for being able to utilize the exciting petascale capabilities in the near future. The SciDAC program was originally conceived as an interdisciplinary computational science program based on the guiding principle that strong collaborative alliances between domain scientists, applied mathematicians, and computer scientists are vital to accelerated progress and associated discovery on the world's most challenging scientific problems. Associated verification and validation are essential in this successful program, which was funded by the US Department of Energy Office of Science (DOE OS) five years ago. As is made clear in many of the papers in these proceedings, SciDAC has fundamentally changed the way that computational science is now carried out in response to the exciting challenge of making the best use of the rapid progress in the emergence of more and more powerful computational platforms. In this regard, Dr. Raymond Orbach, Energy Undersecretary for Science at the DOE and Director of the OS, has stated: `SciDAC has strengthened the role of high-end computing in furthering science. It is defining whole new fields for discovery.' (SciDAC Review, Spring 2006, p8). Application domains within the SciDAC 2006 conference agenda encompassed a broad range of science including: (i) the DOE core mission of energy research involving combustion studies relevant to fuel efficiency and pollution issues faced today and magnetic fusion investigations impacting prospects for future energy sources; (ii) fundamental explorations into the building blocks of matter, ranging from quantum chromodynamics - the basic theory that describes how quarks make up the protons and neutrons of all matter - to the design of modern high-energy accelerators; (iii) the formidable challenges of predicting and controlling the behavior of molecules in quantum chemistry and the complex biomolecules determining the evolution of biological systems; (iv) studies of exploding stars for insights into the nature of the universe; and (v) integrated climate modeling to enable realistic analysis of Earth's changing climate. Associated research has made it quite clear that advanced computation is often the only means by which timely progress is feasible when dealing with these complex, multi-component physical, chemical, and biological systems operating over huge ranges of temporal and spatial scales. 
Working with the domain scientists, applied mathematicians and computer scientists have continued to develop the discretizations of the underlying equations and the complementary algorithms to enable improvements in solutions on modern parallel computing platforms as they evolve from the terascale toward the petascale regime. Moreover, the associated tremendous growth of data generated from the terabyte to the petabyte range demands not only the advanced data analysis and visualization methods to harvest the scientific information but also the development of efficient workflow strategies which can deal with the data input/output, management, movement, and storage challenges. If scientific discovery is expected to keep pace with the continuing progression from tera- to petascale platforms, the vital alliance between domain scientists, applied mathematicians, and computer scientists will be even more crucial. During the SciDAC 2006 Conference, some of the future challenges and opportunities in interdisciplinary computational science were emphasized in the Advanced Architectures Panel and by Dr. Victor Reis, Senior Advisor to the Secretary of Energy, who gave a featured presentation on `Simulation, Computation, and the Global Nuclear Energy Partnership.' Overall, the conference provided an excellent opportunity to highlight the rising importance of computational science in the scientific enterprise and to motivate future investment in this area. As Michael Strayer, SciDAC Program Director, has noted: `While SciDAC may have started out as a specific program, Scientific Discovery through Advanced Computing has become a powerful concept for addressing some of the biggest challenges facing our nation and our world.' Looking forward to next year, the SciDAC 2007 Conference will be held from June 24-28 at the Westin Copley Plaza in Boston, Massachusetts. Chairman: David Keyes, Columbia University. The Organizing Committee for the SciDAC 2006 Conference would like to acknowledge the individuals whose talents and efforts were essential to the success of the meeting. Special thanks go to Betsy Riley for her leadership in building the infrastructure support for the conference, for identifying and then obtaining contributions from our corporate sponsors, for coordinating all media communications, and for her efforts in organizing and preparing the conference proceedings for publication; to Tim Jones for handling the hotel scouting, subcontracts, and exhibits and stage production; to Angela Harris for handling supplies, shipping, and tracking, poster sessions set-up, and for her efforts in coordinating and scheduling the promotional activities that took place during the conference; to John Bui and John Smith for their superb wireless networking and A/V set-up and support; to Cindy Latham for Web site design, graphic design, and quality control of proceedings submissions; and to Pamelia Nixon-Hartje of Ambassador for budget and quality control of catering. We are grateful for the highly professional dedicated efforts of all of these individuals, who were the cornerstones of the SciDAC 2006 Conference. Thanks also go to Angela Beach of the ORNL Conference Center for her efforts in executing the contracts with the hotel, Carolyn James of Colorado State for on-site registration supervision, Lora Wolfe and Brittany Hagen for administrative support at ORNL, and Dami Rich and Andrew Sproles for graphic design and production. 
We are also most grateful to the Oak Ridge National Laboratory, especially Jeff Nichols, and to our corporate sponsors, Data Direct Networks, Cray, IBM, SGI, and Institute of Physics Publishing for their support. We especially express our gratitude to the featured speakers, invited oral speakers, invited poster presenters, session chairs, and advanced architecture panelists and chair for their excellent contributions on behalf of SciDAC 2006. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas, Margaret Smith, and the production team of Institute of Physics Publishing, who worked tirelessly to publish the final conference proceedings in a timely manner. Finally, heartfelt thanks are extended to Michael Strayer, Associate Director for OASCR and SciDAC Director, and to the DOE program managers associated with SciDAC for their continuing enthusiasm and strong support for the annual SciDAC Conferences as a special venue to showcase the exciting scientific discovery achievements enabled by the interdisciplinary collaborations championed by the SciDAC program.

  16. The 26th International Nuclear Physics Conference

    NASA Astrophysics Data System (ADS)

    It was a pleasure to welcome all delegates and accompanying persons to Adelaide for the 26th International Conference in Nuclear Physics, INPC2016. As the major meeting in our field, it was a wonderful opportunity to catch up with colleagues from around the world, learn about the very latest developments and share ideas. We were grateful for the support of the Commission on Nuclear Physics, C12, of the International Union of Pure and Applied Physics (IUPAP), which chose Adelaide to host this meeting. We were also honoured that the President of IUPAP, Prof. Bruce McKellar was present at the meeting to welcome delegates and participate in the proceedings. We acknowledge the financial support for the conference which was made available by a number of organisations. We were especially grateful to the major sponsors, the Adelaide Convention Bureau, the University of Adelaide, the Australian National University and ANSTO, as well as IUPAP, the ARC Centre of Excellence for Particle Physics at the Terascale (CoEPP) and several of the world's major nuclear physics laboratories, BNL, GSI, JLab and TRIUMF. As a result of these contributions we were able to offer support to attend the conference to more than 50 international students. Not only did we have a superb scientific program but, consistent with IUPAP guidelines, more than 40% of the invited plenary talks were presented by women. In order to reach out to the local community, Cynthia Keppel (from JLab) presented a public lecture on Hadron Beam Therapy on Tuesday evening, September 13th. As presenting a talk is now often a condition for financial support to attend an international conference, there were 11 simultaneous parallel sessions with more than 350 presentations. We are especially grateful to the International Advisory Committee, the Program Committee and the Conveners whose advice and hard work made it possible for all this to come together. I would also like to acknowledge the work of the Local Organising Committee and the conference management organisation, Arinex. I am especially grateful to Sharon Johnson and Silvana Santucci at the Centre for the Subatomic Structure of Matter (CSSM) who carried much of the responsibility for the complex task of bringing the conference together. Given that INPC is held only once every three years and rotates between Europe, North America and the rest of the world, it was a rare honour to have the opportunity to stage it in the Southern Hemisphere. This was the first time that it had been held in Australia and we were pleased that delegates had the opportunity to experience some of the delights of our country, from its remarkable scenery and wildlife to the great cities and food and wine. Heartfelt thanks to everyone who took part for a successful conference. Anthony Thomas Chair INPC2016

  17. HSI-Find: A Visualization and Search Service for Terascale Spectral Image Catalogs

    NASA Astrophysics Data System (ADS)

    Thompson, D. R.; Smith, A. T.; Castano, R.; Palmer, E. E.; Xing, Z.

    2013-12-01

    Imaging spectrometers are remote sensing instruments commonly deployed on aircraft and spacecraft. They provide surface reflectance in hundreds of wavelength channels, creating data cubes known as hyperspectral images. They provide rich compositional information, making them powerful tools for planetary and terrestrial science. These data products can be challenging to interpret because they contain datapoints numbering in the thousands (Dawn VIR) or millions (AVIRIS-C). Cross-image studies or exploratory searches involving more than one scene are rare; data volumes are often tens of GB per image, and typical consumer-grade computers cannot store more than a handful of images in RAM. Visualizing the information in a single scene is challenging, since the human eye can only distinguish three color channels out of the hundreds available. To date, analysis has been performed mostly on single images using purpose-built software tools that require extensive training and commercial licenses. The HSIFind software suite provides a scalable distributed solution to the problem of visualizing and searching large catalogs of spectral image data. It consists of a RESTful web service that communicates with a JavaScript-based browser client. The software provides basic visualization through an intuitive visual interface, allowing users with minimal training to explore the images or view selected spectra. Users can accumulate a library of spectra from one or more images and use these to search for similar materials. The result appears as an intensity map showing the extent of a spectral feature in a scene. Continuum removal can isolate diagnostic absorption features. The server-side mapping uses an efficient matched filter algorithm that can process a megapixel image cube in just a few seconds. This enables real-time interaction, leading to a new way of interacting with the data: the user can launch a search with a single mouse click and see the resulting map in seconds. This allows the user to quickly explore each image, ascertain the main units of surface material, localize outliers, and develop an understanding of the various materials' spectral characteristics. The HSIFind software suite is currently in beta testing at the Planetary Science Institute, and a process is underway to release it under an open source license to the broader community. We believe it will benefit instrument operations during remote planetary exploration, where tactical mission decisions demand rapid analysis of each new dataset. The approach also holds potential for public spectral catalogs, where its shallow learning curve and portability can make these datasets accessible to a much wider range of researchers. Acknowledgements: The HSIFind project acknowledges the NASA Advanced MultiMission Operating System (AMMOS) and the Multimission Ground Support Services (MGSS). E. Palmer is with the Planetary Science Institute, Tucson, AZ. Other authors are with the Jet Propulsion Laboratory, Pasadena, CA. This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration. Copyright 2013, California Institute of Technology.
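    The matched filter mentioned above is a standard detection statistic: with scene mean mu, covariance S, and target spectrum t, each pixel x is scored by a covariance-whitened projection onto the target direction, score(x) = (x - mu)^T S^{-1} (t - mu) / ((t - mu)^T S^{-1} (t - mu)). The numpy sketch below illustrates the computation; it is not the HSIFind implementation.

        # Minimal matched-filter sketch for a hyperspectral cube.
        import numpy as np

        def matched_filter(cube, target):
            """cube: (rows, cols, bands) reflectance; target: (bands,) spectrum.
            Returns a (rows, cols) intensity map of target abundance."""
            r, c, b = cube.shape
            X = cube.reshape(-1, b)
            mu = X.mean(axis=0)
            S = np.cov(X, rowvar=False) + 1e-6 * np.eye(b)  # regularized covariance
            d = target - mu
            w = np.linalg.solve(S, d)        # S^-1 (t - mu)
            scores = (X - mu) @ w / (d @ w)  # normalized so score(target) == 1
            return scores.reshape(r, c)

        # Tiny synthetic scene with the target mixed into one corner.
        rng = np.random.default_rng(0)
        cube = rng.normal(0.3, 0.05, (64, 64, 50))
        target = np.linspace(0.2, 0.8, 50)
        cube[:8, :8] += 0.5 * target
        print(matched_filter(cube, target)[:8, :8].mean())  # high vs. ~0 background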

  18. Why the Petascale era will drive improvements in the management of the full lifecycle of earth science data.

    NASA Astrophysics Data System (ADS)

    Wyborn, L.

    2012-04-01

    The advent of the petascale era, in both storage and compute facilities, will offer new opportunities for earth scientists to transform the way they do their science and to undertake cross-disciplinary science at a global scale. No longer will data have to be averaged and subsampled: they can be analysed at their fullest resolution at national or even global scales. Much larger data volumes can be analysed in single passes and at higher resolution: large-scale cross-domain science is now feasible. However, in general, the earth sciences have been slow to capitalise on the potential of these new petascale compute facilities: many struggle to even use terascale facilities. Our chances of using these new facilities will require a vast improvement in the management of the full life cycle of data: in reality, it will need to be transformed. Many of our current issues with earth science data are historic and stem from the limitations of early data storage systems. Because storage was so expensive, metadata was usually stored separately from the data and attached as a readme file. Likewise, attributes that defined uncertainty, reliability and traceability were recorded in lab notebooks and rarely stored with the data. Data were routinely transferred as files. Given the new opportunities, the traditional discover, display, download and process locally paradigm is too limited. For data access and assimilation to be improved, data will need to be self-describing. For heterogeneous data to be rapidly integrated, attributes such as reliability, uncertainty and traceability will need to be systematically recorded with each observation. The petascale era also requires that individual data files be transformed and aggregated into calibrated data arrays or data cubes. Standards become critical and are the enablers of integration. These changes are common to almost every science discipline. What makes the earth sciences unique is that many domains record time series data, particularly in the environmental geosciences (weathering, soil changes, climate change). The data life cycle will be measured in decades and centuries, not years. Preservation over such time spans is quite a challenge for the earth sciences, as data will have to be managed over many evolutions of software and hardware. The focus has to be on managing the data and not the media. Currently storage is not an issue, but it is predicted that data volumes will soon exceed the effective storage media that can be physically manufactured. This means that organisations will have to think about the disposal and destruction of data. For the earth sciences, this will be a particularly sensitive issue. Petascale computing offers many new opportunities to the earth sciences, and by 2020 exascale computers will be a reality. To fully realise these opportunities, the earth sciences need to actively and systematically rethink the ramifications these new systems will have for current practices in data storage, discovery, access and assimilation.
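    The self-describing data requirement can be made concrete with netCDF, where units, per-observation uncertainty and lineage travel inside the file rather than in a separate readme. Below is a minimal sketch using the netCDF4 package; the attribute names are illustrative, not a formal metadata convention.

        # Sketch: a self-describing observation file in which units, uncertainty
        # and traceability are stored alongside the values themselves.
        # Attribute names are illustrative, not a formal metadata convention.
        import numpy as np
        from netCDF4 import Dataset

        with Dataset("obs_selfdescribing.nc", "w") as nc:
            nc.title = "Soil moisture time series, station X (illustrative)"
            nc.history = "2012-04-01: derived from raw logger file v7"  # lineage
            nc.createDimension("time", 24)

            t = nc.createVariable("time", "f8", ("time",))
            t.units = "hours since 2012-01-01 00:00:00"
            t[:] = np.arange(24)

            sm = nc.createVariable("soil_moisture", "f4", ("time",))
            sm.units = "m3 m-3"
            sm[:] = 0.25 + 0.02 * np.random.randn(24)

            # Per-observation uncertainty and reliability, kept with the data.
            unc = nc.createVariable("soil_moisture_uncertainty", "f4", ("time",))
            unc.units = "m3 m-3"
            unc.comment = "1-sigma estimate; sensor calibration record #1234"
            unc[:] = np.full(24, 0.01)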

  19. Opening Comments: SciDAC 2008

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2008-07-01

    Welcome to Seattle and the 2008 SciDAC Conference. This conference, the fourth in the series, is a continuation of the PI meetings we first began under SciDAC-1. I would like to start by thanking the organizing committee, and Rick Stevens in particular, for organizing this year's meeting. This morning I would like to look briefly at SciDAC, to give you a short history of the program and to look ahead to where we plan to go over the next few years. I think the best description of SciDAC, at least the simulation part, comes from a quote from Dr Ray Orbach, DOE's Under Secretary for Science and Director of the Office of Science. In an interview that appeared in the SciDAC Review magazine, Dr Orbach said, `SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing'. Of course, that extends not just to physical scientists, but also to biological scientists. This is a theme of computational science, this partnership among disciplines, which goes all the way back to the early 1980s and Ken Wilson. It's a unique thread within the Department of Energy. SciDAC-1, launched around the turn of the millennium, created a new generation of scientific simulation codes. It advocated building out mathematical and computing system software in support of science and a new collaboratory software environment for data. The original concept for SciDAC-1 had topical centers for the execution of the various science codes, but several corrections and adjustments were needed. The ASCR scientific computing infrastructure was also upgraded, providing the hardware facilities for the program. The computing facility that we had at that time was the big 3 teraflop/s center at NERSC, and that had to be shared with the programmatic side supporting research across DOE. At the time, ESnet provided just slightly over half a gigabit per second of bandwidth; and the science being addressed was accelerator science, climate, chemistry, fusion, astrophysics, materials science, and QCD. We built out the national collaboratories from the ASCR office, and in addition we built Integrated Software Infrastructure Centers (ISICs). Of these, three were in applied mathematics, four in computer science (including a performance evaluation research center), and four were collaboratories or Grid projects having to do with data management. For science, there were remarkable breakthroughs in simulation, such as full 3D laboratory-scale flame simulation. There were also significant improvements in application codes - by factors of almost 3 to more than 100 - as people began to realize they had to integrate mathematics tools and computer science tools into their codes to take advantage of the parallelism of the day. The SciDAC data-mining tool, Sapphire, received a 2006 R&D 100 award. And the community as a whole worked well together and began building a publication record that was substantial. In 2006, we recompeted the program with similar goals - SciDAC-1 was very successful, and we wanted to continue that success and extend what was happening under SciDAC to the broader science community. We opened up the partnership to all of the Offices of Science and the NSF and the NNSA. 
The goal was to create comprehensive scientific computing software and the infrastructure for the software to enable scientific discovery in the physical, biological, and environmental sciences and take the simulations to an extreme scale, in this case petascale. We would also build out a new generation of data management tools. What we observed during SciDAC-1 was that the data and the data communities - both experimental data from large experimental facilities and observational data, along with simulation data - were expanding at a rate significantly faster than Moore's law. In the past few weeks, the FastBit indexing technology software tool for data analyses and data mining developed under SciDAC's Scientific Data Management project was recognized with an R&D 100 Award, selected by an independent judging panel and editors of R&D Magazine as one of the 100 most technologically significant products introduced into the marketplace over the past year. For SciDAC-2 we had nearly 250 proposals requesting a total of slightly over $1 billion in funding. Of course, we had nowhere near $1 billion. The facilities and the science we ended up with were not significantly different from what we had in SciDAC-1. But we had put in place substantially increased facilities for science. When SciDAC-1 was originally executed with the facilities at NERSC, there was significant impact on the resources at NERSC, because not only did we have an expanding portfolio of programmatic science, but we had the SciDAC projects that also needed to run at NERSC. Suddenly, NERSC was incredibly oversubscribed. With SciDAC-2, we had in place leadership-class computing facilities at Argonne with slightly more than half a petaflop and at Oak Ridge with slightly more than a quarter petaflop, with an upgrade planned at the end of this year for a petaflop. And we increased the production computing capacity at NERSC to 104 teraflop/s just so that we would not impact the programmatic research and so that we would have a startup facility for SciDAC. At the end of the summer, NERSC will be at 360 teraflop/s. Both the Oak Ridge system and the principal resource at NERSC are Cray systems; Argonne has a different architecture, an IBM Blue Gene/P. At the same time, ESnet has been built out, and we are on a path where we will have dual rings around the country, from 10 to 40 gigabits per second - a factor of 20 to 80 over what was available during SciDAC-1. The science areas include accelerator science and simulation, astrophysics, climate modeling and simulation, computational biology, fusion science, high-energy physics, petabyte high-energy/nuclear physics, materials science and chemistry, nuclear physics, QCD, radiation transport, turbulence, and groundwater reactive transport modeling and simulation. They were supported by new enabling technology centers and university-based institutes to develop an educational thread for the SciDAC program. There were four mathematics projects and four computer science projects; and under data management, we see a significant difference in that we are bringing up new visualization projects to support and sustain data-intensive science. When we look at the budgets, we see growth in the budget from just under $60 million for SciDAC-1 to just over $80 million for SciDAC-2. Part of the growth is due to bringing in NSF and NNSA as new partners, and some of the growth is due to some program offices increasing their investment in SciDAC, while other program offices are constant or have decreased their investment. 
This is not a reflection of their priorities per se but, rather, a reflection of the budget process and the difficult times in Washington during the past two years. New activities are under way in SciDAC - the annual PI meeting has turned into what I would describe as the premier interdisciplinary computational science meeting, one of the best in the world. Doing interdisciplinary meetings is difficult because people tend to develop a focus for their particular subject area. But this is the fourth in the series; and since the first meeting in San Francisco, these conferences have been remarkably successful. For SciDAC-2 we also created an outreach magazine, SciDAC Review, which highlights scientific discovery as well as high-performance computing. It's been very successful in telling the non-practitioners what SciDAC and computational science are all about. The other new instrument in SciDAC-2 is an outreach center. As we go from computing at the terascale to computing at the petascale, we face the problem of narrowing our research community. The number of people who are `literate' enough to compute at the terascale is more than the number of those who can compute at the petascale. To address this problem, we established the SciDAC Outreach Center to bring people into the fold and educate them as to how we do SciDAC, how the teams are composed, and what it really means to compute at scale. The resources I have mentioned don't come for free. As part of the HECRTF law of 2005, Congress mandated that the Secretary would ensure that leadership-class facilities would be open to everyone across all agencies. So we took Congress at its word, and INCITE is our instrument for making allocations at the leadership-class facilities at Argonne and Oak Ridge, as well as smaller allocations at NERSC. Therefore, the selected proposals are very large projects that are computationally intensive, that compute at scale, and that have a high science impact. An important feature is that INCITE is completely open to anyone - there is no requirement of DOE Office of Science funding, and proposals are rigorously reviewed for both the science and the computational readiness. In 2008, more than 100 proposals were received, requesting about 600 million processor-hours. We allocated just over a quarter of a billion processor-hours. Astrophysics, materials science, lattice gauge theory, and high energy and nuclear physics were the major areas. These were the teams that were computationally ready for the big machines and that had significant science they could identify. In 2009, there will be a significant increase in the amount of time to be allocated, over half a billion processor-hours. The deadline is August 11 for new proposals and September 12 for renewals. We anticipate a significant increase in the number of requests this year. We expect you - as successful SciDAC centers, institutes, or partnerships - to compete for and win INCITE program allocation awards. If you have a successful SciDAC proposal, we believe it will make you successful in the INCITE review. We have the expectation that you will be among those most prepared and most ready to use the machines and to compute at scale. Over the past 18 months, we have assembled a team to look across our computational science portfolio and to judge what are the 10 most significant science accomplishments. The ASCR office, as it goes forward with OMB, the new administration, and Congress, will be judged by the science we have accomplished. 
All of our proposals - such as for increasing SciDAC, increasing applied mathematics, and so on - are tied to what we have accomplished in science. And so these 10 big accomplishments are key to establishing credibility for new budget requests. Tony Mezzacappa, who chaired the committee, will also give a presentation on the ranking of these top 10, how they got there, and what the science is all about. Here is the list - numbers 2, 5, 6, 7, 9, and 10 are all SciDAC projects.
    1. Modeling the Molecular Basis of Parkinson's Disease (Tsigelny)
    2. Discovery of the Standing Accretion Shock Instability and Pulsar Birth Mechanism in Core-Collapse Supernova Evolution and Explosion (Blondin)
    3. Prediction and Design of Macromolecular Structures and Functions (Baker)
    4. Understanding How a Lifted Flame Is Stabilized in a Hot Coflow (Yoo)
    5. New Insights from LCF-enabled Advanced Kinetic Simulations of Global Turbulence in Fusion Systems (Tang)
    6. High Transition Temperature Superconductivity: A High-Temperature Superconductive State and a Pairing Mechanism in the 2-D Hubbard Model (Scalapino)
    7. PETSc: Providing the Solvers for DOE High-Performance Simulations (Smith)
    8. Via Lactea II, a Billion-Particle Simulation of the Dark Matter Halo of the Milky Way (Madau)
    9. Probing the Properties of Water through Advanced Computing (Galli)
    10. First Provably Scalable Maxwell Solver Enables Scalable Electromagnetic Simulations (Kolev)
So, what's the future going to look like for us? The office is putting together an initiative with the community, which we call the E3 Initiative. We're looking at a 10-year horizon for what's going to happen. Through the series of town hall meetings, which many of you participated in, we have produced a document on `Transforming Energy, the Environment and Science through simulations at the eXtreme Scale'; it can be found at http://www.science.doe.gov/ascr/ProgramDocuments/TownHall.pdf. We sometimes call it the Exascale initiative. Exascale computing is the gold-ring level of computing that seems just out of reach; but if we work hard and stretch, we just might be able to reach it. We envision that there will be a SciDAC-X, working at the extreme scale, with SciDAC teams that will perform and carry out science in the areas that will have a great societal impact, such as alternative fuels and transportation, combustion, climate, fusion science, high-energy physics, advanced fuel cycles, carbon management, and groundwater. We envision institutes for applied mathematics and computer science that probably will segue into algorithms because, at the extreme scale, we see the applied math, the algorithm per se, and its implementation in computer science as inseparable. We envision an INCITE-X with multi-petaflop platforms, perhaps even exaflop computing resources. ESnet will be best in class - our 10-year plan calls for having 400 terabits per second capacity available in dual rings around the country, an enormously fast data communications network for moving large amounts of data. In looking at where we've been and where we are going, we can see that the gigaflops and teraflops era was a regime where we were following Moore's law through advances in clock speed. In the current regime, we're introducing massive parallelism, which I think is exemplified by Intel's announcement of their teraflop chip, where they envision more than a thousand cores on a chip. 
But in order to reach exascale, extrapolations based on current architectures point to machines that would require 100 megawatts of power. It's clearly going to require novel architectures, things we have perhaps not yet envisioned. It is of course an era of challenge. There will be an unpredictable evolution of hardware if we are to reach the exascale; and there will clearly be multilevel heterogeneous parallelism, including multilevel memory hierarchies. We have no idea right now as to the programming models needed to execute at such an extreme scale. We have been incredibly successful at the petascale - we know that already. Managing data and just getting communications to scale is an enormous challenge. And it's not just the extreme scaling. It's the rapid increase in complexity that represents the challenge. Let me end with a metaphor. In previous meetings we have talked about the road to petascale. Indeed, we have seen in hindsight that it was a road well traveled. But perhaps the road to exascale is not a road at all. Perhaps the metaphor will be akin to scaling the south face of K2. That's clearly not something all of us will be able to do, and probably computing at the exascale is not something all of us will do. But if we achieve that goal, perhaps the words of Emily Dickinson will best summarize where we will be. Perhaps in her words, looking backward and down, you will say:
    I climb the `Hill of Science'
    I view the landscape o'er;
    Such transcendental prospect
    I ne'er beheld before!

  20. The impact of SciDAC on US climate change research and the IPCC AR4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wehner, Michael

    2005-07-08

    SciDAC has invested heavily in climate change research. We offer a candid opinion as to the impact of the DOE laboratories' SciDAC projects on the upcoming Fourth Assessment Report of the Intergovernmental Panel on Climate Change. As a result of the direct importance of climate change to society, climate change research is highly coordinated at the international level. The Intergovernmental Panel on Climate Change (IPCC) is charged with providing regular reports on the state of climate change research to government policymakers. These reports are the product of thousands of scientists' efforts. A series of reviews involving both scientists and policymakers makes them among the most reviewed documents produced in any scientific field. The high profile of these reports acts as a driver for many researchers in the climate sciences. The Fourth Assessment Report (AR4) is scheduled to be released in 2007. SciDAC-sponsored research has enabled the United States climate modeling community to make significant contributions to this report. Two large multi-laboratory SciDAC projects are directly relevant to the activities of the IPCC. The first, entitled ''Collaborative Design and Development of the Community Climate System Model for Terascale Computers'', has made important software contributions to the recently released third version of the Community Climate System Model (CCSM3.0) developed at the National Center for Atmospheric Research. This is a multi-institutional project involving Los Alamos National Laboratory, Oak Ridge National Laboratory, Lawrence Berkeley National Laboratory, Pacific Northwest National Laboratory, Argonne National Laboratory, Lawrence Livermore National Laboratory and the National Center for Atmospheric Research. The original principal investigators were Robert Malone and John B. Drake. The current principal investigators are Phil Jones and John B. Drake. The second project, entitled ''Earth System Grid II: Turning Climate Datasets into Community Resources'', aims to facilitate the distribution of the copious amounts of data produced by coupled climate model integrations to the general scientific community. This is also a multi-institutional project involving Argonne National Laboratory, Oak Ridge National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory and the National Center for Atmospheric Research. The principal investigators are Ian Foster, Don Middleton and Dean Williams. Perhaps most significant among the activities of the ''Collaborative Design'' project was the development of an efficient multi-processor coupling package. CCSM3.0 is an extraordinarily complicated physics code. The fully coupled model consists of separate submodels of the atmosphere, ocean, sea ice and land. In addition, comprehensive biogeochemistry and atmospheric chemistry submodels are under intensive current development. Each of these submodels is a large and sophisticated program in its own right. Furthermore, in the coupled model, each of the submodels, including the coupler, is a separate multiprocessor executable program. The coupler package must efficiently coordinate the communication as well as interpolate or aggregate information between these programs. This regridding function is necessary because each major subsystem (air, water or surface) is allowed to have its own independent grid.
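
    The heart of that coupling package is regridding. A minimal sketch of a first-order conservative remap, assuming cell-averaged fields on two independent 1-D grids (the real coupler handles 2-D spherical grids and parallel communication; the grids, field, and function name here are invented for illustration):

        import numpy as np

        def conservative_regrid(src_edges, dst_edges, src_field):
            """First-order conservative remap of cell-averaged data in 1-D.

            Each destination cell receives the overlap-weighted average of
            the source cells it intersects, so the integral of the field is
            preserved across the regridding - the property a flux coupler
            needs.
            """
            dst = np.zeros(len(dst_edges) - 1)
            for j in range(len(dst_edges) - 1):
                lo, hi = dst_edges[j], dst_edges[j + 1]
                total = 0.0
                for i in range(len(src_edges) - 1):
                    # Width of overlap between source cell i and destination cell j.
                    overlap = max(0.0, min(hi, src_edges[i + 1]) - max(lo, src_edges[i]))
                    total += overlap * src_field[i]
                dst[j] = total / (hi - lo)
            return dst

        # Toy example: an atmosphere flux on 4 coarse cells, remapped to a
        # 10-cell ocean grid covering the same interval.
        atm_edges = np.linspace(0.0, 1.0, 5)
        ocn_edges = np.linspace(0.0, 1.0, 11)
        atm_flux = np.array([1.0, 2.0, 4.0, 3.0])
        print(conservative_regrid(atm_edges, ocn_edges, atm_flux))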

  1. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems such as the national and regional electricity grid, carbon sequestration, virtual engineering, and the nuclear fuel cycle. The successes of the first five years of SciDAC have demonstrated the power of using advanced computing to enable scientific discovery. One measure of this success could be found in the President’s State of the Union address, in which President Bush identified ‘supercomputing’ as a major focus area of the American Competitiveness Initiative. Funds were provided in the FY 2007 President’s Budget request to increase the size of the NERSC-5 procurement to between 100 and 150 teraflops, to upgrade the LCF Cray XT3 at Oak Ridge to 250 teraflops, and to acquire a 100 teraflop IBM BlueGene/P to establish the Leadership Computing Facility at Argonne. We believe that we are on a path to establish a petascale computing resource for open science by 2009. We must develop software tools, packages, and libraries as well as the scientific application software that will scale to hundreds of thousands of processors. Computer scientists from universities and the DOE’s national laboratories will be asked to collaborate on the development of critical system software components such as compilers, lightweight operating systems and file systems. Standing up these large machines will not be business as usual for ASCR.
We intend to develop a series of interconnected projects that identify cost, schedule, risks, and scope for the upgrades at the LCF at Oak Ridge, the establishment of the LCF at Argonne, and the development of the software to support these high-end computers. The critical first step in defining the scope of the project is to identify a set of early application codes for each leadership class computing facility. These codes will have access to the resources during the commissioning phase of the facility projects and will be part of the acceptance tests for the machines. Applications will be selected, in part, by breakthrough science, scalability, and ability to exercise key hardware and software components. Possible early applications might include climate models; studies of the magnetic properties of nanoparticles as they relate to ultra-high-density storage media; the rational design of chemical catalysts; the modeling of combustion processes that will lead to cleaner burning coal; and fusion and astrophysics research. I have presented just a few of the challenges that we look forward to on the road to petascale computing. Our road to petascale science might be paraphrased by the quote from e e cummings, ‘somewhere I have never traveled, gladly beyond any experience . . .’

  2. Connecting LHC signals with deep physics at the TeV scale and baryogenesis

    NASA Astrophysics Data System (ADS)

    Shu, Jing

    We address in this dissertation two primary questions aimed at deciphering collider signals at the Large Hadron Collider (LHC): to build a deep and concrete understanding of TeV-scale physics, and to interpret the origin of the baryon asymmetry in our universe. We are at the stage of exploring new physics at the terascale, the scale responsible for electroweak symmetry breaking (EWSB) in the Standard Model (SM) of particle physics. The LHC, which begins its operation this year, will carry us into this new energy frontier and search for possible signals of new physics. Theorists have come up with many possible models beyond the SM to explain the origin of EWSB; however, how we will determine the underlying physics from LHC data is still an open question. In the first part of this dissertation, we consider several examples that connect expected LHC signals to the underlying physics in a completely model-independent way. We first explore the Randall-Sundrum (RS) scenario and use the collider signals of the first Kaluza-Klein (KK) excitations of gluons to discriminate among several commonly considered theories which attempt to render RS consistent with precision electroweak data. We then investigate top compositeness. We derive a bound on the energy scale of right-handed top compositeness from top pair production at the Tevatron, and we find that the cross section to produce four tops would be enhanced by three orders of magnitude. We next consider the possibility that the gauge symmetry of the underlying theory is violated in the incomplete theory that we can reconstruct from LHC observables. We derive a model-independent bound on the scale of new physics from unitarity of the S-matrix if we observe a new massive vector boson with nonzero axial couplings to fermions at the LHC. Finally, we derive a generalized Landau-Yang theorem and apply it to Z' decay into two Z bosons. We show that there is a phase shift in the azimuthal angle distribution of the normalized differential cross section and that the anomalous Z'-Z-Z coupling can be discriminated from the regular one at the 3σ level when both Z bosons decay leptonically at the LHC. The origin of the baryon asymmetry of the Universe (BAU) remains an important, unsolved problem for particle physics and cosmology, and is one of the motivations to search for possible new physics beyond the SM. In the second part of this dissertation, we attempt to account for the baryon number generation in our universe through some novel mechanisms. We first systematically investigate models of baryogenesis from a spontaneously Lorentz-violating background (SLVB). We find that sphaleron transitions will generate a nonzero B+L asymmetry in the presence of an SLVB, and we identify two scenarios of interest. We then consider the possibility of generating a baryon asymmetry through an earlier-time phase transition and address the question of whether we can still test the baryogenesis mechanism at the LHC/ILC if the electroweak phase transition is not strongly first order. We find a general framework and realize this idea in the top flavor model. We show that a realistic baryon density can be achieved in the natural parameter space of the top flavor model.

  3. The Shuttle Mission Simulator computer generated imagery

    NASA Technical Reports Server (NTRS)

    Henderson, T. H.

    1984-01-01

    Equipment available in the primary training facility for the Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degree-of-freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983. The Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and night launch and landing simulation.

  4. Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skalski, John

    2003-11-01

    The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure that tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to ensure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.
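
    A minimal sketch of one detection-corrected survival estimator of the kind such tagging studies rely on, assuming a simple paired-array design with two downstream PIT-tag detection sites; the function name and the counts are hypothetical, and the project's actual methods (for example, Cormack-Jolly-Seber models) are far more general:

        def reach_survival(released, det_a, det_b, det_both):
            """Estimate survival to array A and A's detection probability.

            Of `released` tagged smolts, `det_a` were detected at array A,
            `det_b` at a downstream array B, and `det_both` at both. A fish
            seen at B must have passed A alive, so the fraction of B's fish
            also seen at A estimates A's detection probability; dividing the
            A count by that probability corrects for fish that A missed.
            """
            p_a = det_both / det_b               # detection probability at A
            survival = det_a / (released * p_a)  # survival from release to A
            return survival, p_a

        # Hypothetical release group of 1000 fish.
        s, p = reach_survival(released=1000, det_a=450, det_b=400, det_both=300)
        print(f"detection p_A = {p:.2f}, survival to A = {s:.2f}")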

  5. Design of simulation-based medical education and advantages and disadvantages of in situ simulation versus off-site simulation.

    PubMed

    Sørensen, Jette Led; Østergaard, Doris; LeBlanc, Vicki; Ottesen, Bent; Konge, Lars; Dieckmann, Peter; Van der Vleuten, Cees

    2017-01-21

    Simulation-based medical education (SBME) has traditionally been conducted as off-site simulation in simulation centres. Some hospital departments also provide off-site simulation using in-house training room(s) set up for simulation away from the clinical setting, and these activities are called in-house training. In-house training facilities can be part of hospital departments and to some extent resemble simulation centres but often have less technical equipment. In situ simulation, introduced over the past decade, mainly comprises team-based activities and occurs in patient care units with healthcare professionals in their own working environment. Thus, this intentional blend of simulation and real working environments means that in situ simulation brings simulation to the real working environment and provides training where people work. In situ simulation can be either announced or unannounced, the latter also known as a drill. This article presents and discusses the design of SBME and the advantages and disadvantages of the different simulation settings, such as training in simulation centres, in-house simulations in hospital departments, and announced or unannounced in situ simulations. Non-randomised studies argue that in situ simulation is more effective for educational purposes than other types of simulation settings. Conversely, the few comparison studies that exist, either randomised or retrospective, show that choice of setting does not seem to influence individual or team learning. However, hospital department-based simulations, such as in-house simulation and in situ simulation, lead to a gain in organisational learning. To our knowledge no studies have compared announced and unannounced in situ simulation. The literature suggests some improved organisational learning from unannounced in situ simulation; however, unannounced in situ simulation was also found to be challenging to plan and conduct, and more stressful among participants. The importance of setting, context and fidelity are discussed. Based on the current limited research we suggest that choice of setting for simulations does not seem to influence individual and team learning. Department-based local simulation, such as simulation in-house and especially in situ simulation, leads to gains in organisational learning. The overall objectives of simulation-based education and factors such as feasibility can help determine the choice of simulation setting.

  6. Implications of Simulation Conceptual Model Development for Simulation Management and Uncertainty Assessment

    NASA Technical Reports Server (NTRS)

    Pace, Dale K.

    2000-01-01

    A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.

  7. [Malfunction simulation by spaceflight training simulator].

    PubMed

    Chang, Tian-chun; Zhang, Lian-hua; Xue, Liang; Lian, Shun-guo

    2005-04-01

    To implement malfunction simulation in a spaceflight training simulator, the principles of malfunction simulation were defined according to spacecraft malfunction prediction and its countermeasures. Malfunction patterns were classified, and malfunction types were confirmed. A malfunction simulation model was established, and the malfunction simulation was realized by mathematical simulation. According to the requirements of astronaut training, malfunction simulation models were established and realized for the spacecraft subsystems, such as environmental control and life support, GNC, propulsion, power supply, thermal control, data management, measurement control and communication, and structure. The malfunction simulation function implemented in the spaceflight training simulator satisfied the requirements of astronaut training.

  8. Computer-based simulation training to improve learning outcomes in mannequin-based simulation exercises.

    PubMed

    Curtin, Lindsay B; Finn, Laura A; Czosnowski, Quinn A; Whitman, Craig B; Cawley, Michael J

    2011-08-10

    To assess the impact of computer-based simulation on the achievement of student learning outcomes during mannequin-based simulation. Participants were randomly assigned to rapid response teams of 5-6 students, and the teams were then randomly assigned to complete either the computer-based or the mannequin-based simulation cases first. In both simulations, students used their critical thinking skills and selected interventions independent of facilitator input. A predetermined rubric was used to record and assess students' performance in the mannequin-based simulations; feedback and student performance scores were generated by the software in the computer-based simulations. More of the teams in the group that completed the computer-based simulation before the mannequin-based simulation achieved the primary outcome for the exercise, which was survival of the simulated patient (41.2% vs. 5.6%). The majority of students (>90%) recommended the continuation of simulation exercises in the course. Students in both groups felt the computer-based simulation should be completed prior to the mannequin-based simulation. The use of computer-based simulation prior to mannequin-based simulation improved the achievement of learning goals and outcomes. In addition to improving participants' skills, completing the computer-based simulation first may improve participants' confidence during the more real-life setting achieved in the mannequin-based simulation.

  9. Payload crew training complex simulation engineer's handbook

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1984-01-01

    The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.

  10. Simulators IV; Proceedings of the SCS Conference, Orlando, FL, Apr. 6-9, 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fairchild, B.T.

    1987-01-01

    The conference presents papers on the applicability of AI techniques to simulation models, the simulation of a reentry vehicle on Simstar, Simstar missile simulation, measurement issues associated with simulator sickness, and tracing the etiology of simulator sickness. Consideration is given to a simulator of a steam generator tube bundle response to a blowdown transient, the census of simulators for fossil fueled boiler and gas turbine plant operation training, and a new approach for flight simulator visual systems. Other topics include past and present simulated aircraft maintenance trainers, an AI-simulation based approach for aircraft maintenance training, simulator qualification using EPRI methodology, and the role of instinct in organizational dysfunction.

  11. Simulation's Ensemble is Better Than Ensemble Simulation

    NASA Astrophysics Data System (ADS)

    Yan, X.

    2017-12-01

    Dynamical systems are simulated from an initial state, but initial-state data carry great uncertainty, which propagates into the simulation. Simulation from multiple possible initial states has therefore been used widely in atmospheric science and has indeed been shown to lower this uncertainty; the approach is called the simulation's ensemble because multiple simulation results are fused. In the ecological field, individual-based model simulation (forest gap models, for example) can be regarded as a simulation's ensemble when compared with community-based simulation (most ecosystem models). In this talk, we address the advantages of individual-based simulation and of ensembles of such simulations.
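
    A minimal sketch of the contrast drawn in the talk, assuming a toy chaotic map in place of a real dynamical model: many runs from perturbed initial states are fused into one ensemble estimate rather than trusting a single run from one uncertain initial state. All names and numbers are illustrative:

        import numpy as np

        def model_run(x0, steps=50, r=3.9):
            # The chaotic logistic map stands in for a dynamical model whose
            # output is sensitive to the initial state.
            x = x0
            for _ in range(steps):
                x = r * x * (1.0 - x)
            return x

        rng = np.random.default_rng(0)
        measured_x0 = 0.500                      # uncertain initial state

        single = model_run(measured_x0)          # one simulation
        # The simulation's ensemble: many runs from perturbed initial states,
        # fused (here by a simple mean) into one estimate with a spread.
        members = [model_run(measured_x0 + rng.normal(0.0, 0.01))
                   for _ in range(200)]
        print(f"single run: {single:.3f}")
        print(f"ensemble mean: {np.mean(members):.3f} +/- {np.std(members):.3f}")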

  12. Advancing renal education: hybrid simulation, using simulated patients to enhance realism in haemodialysis education.

    PubMed

    Dunbar-Reid, Kylie; Sinclair, Peter M; Hudson, Denis

    2015-06-01

    Simulation is a well-established and proven teaching method, yet its use in renal education is not widely reported. Criticisms of simulation-based teaching include limited realism and a lack of authentic patient interaction. This paper discusses the benefits and challenges of high-fidelity simulation and suggests hybrid simulation as a complementary model to existing simulation programmes. Through the use of a simulated patient, hybrid simulation can improve the authenticity of renal simulation-based education while simultaneously teaching and assessing technologically enframed caring. © 2015 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  13. Real time digital propulsion system simulation for manned flight simulators

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.; Hart, C. E.

    1978-01-01

    A real time digital simulation of a STOL propulsion system was developed which generates significant dynamics and internal variables needed to evaluate system performance and aircraft interactions using manned flight simulators. The simulation ran at a real-to-execution time ratio of 8.8. The model was used in a piloted NASA flight simulator program to evaluate the simulation technique and the propulsion system digital control. The simulation is described and results shown. Limited results of the flight simulation program are also presented.

  14. Operationalizing Healthcare Simulation Psychological Safety: A Descriptive Analysis of an Intervention.

    PubMed

    Henricksen, Jared W; Altenburg, Catherine; Reeder, Ron W

    2017-10-01

    Despite efforts to prepare a psychologically safe environment, simulation participants are occasionally psychologically distressed. Instructing simulation educators about participant psychological risks and having a participant psychological distress action plan available to simulation educators may assist them as they seek to keep all participants psychologically safe. A Simulation Participant Psychological Safety Algorithm was designed to aid simulation educators as they debrief simulation participants perceived to have psychological distress and categorize these events as mild (level 1), moderate (level 2), or severe (level 3). A prebrief dedicated to creating a psychologically safe learning environment was held constant. The algorithm was used for 18 months in an active pediatric simulation program. Data collected included level of participant psychological distress as perceived and categorized by the simulation team using the algorithm, type of simulation that participants went through, who debriefed, and timing of when psychological distress was perceived to occur during the simulation session. The Kruskal-Wallis test was used to evaluate the relationship between events and simulation type, events and simulation educator team who debriefed, and timing of event during the simulation session. A total of 3900 participants went through 399 simulation sessions between August 1, 2014, and January 26, 2016. Thirty-four (<1%) simulation participants from 27 sessions (7%) were perceived to have an event. One participant was perceived to have a severe (level 3) psychological distress event. Events occurred more commonly in high-intensity simulations, with novice learners and with specific educator teams. Simulation type and simulation educator team were associated with occurrence of events (P < 0.001). There was no association between event timing and event level. Severe psychological distress as categorized by simulation personnel using the Simulation Participant Psychological Safety Algorithm is rare, with mild and moderate events being more common. The algorithm was used to teach simulation educators how to assist a participant who may be psychologically distressed and document perceived event severity.
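
    A minimal sketch of the reported analysis, assuming scipy is available; the per-session distress-event counts below are invented for illustration and are not the study's data:

        from scipy.stats import kruskal

        # Hypothetical counts of distress events per session, grouped by
        # simulation type (e.g., low-, mid-, and high-intensity scenarios).
        low_intensity = [0, 0, 0, 1, 0, 0]
        mid_intensity = [0, 1, 0, 1, 0, 2]
        high_intensity = [1, 2, 1, 3, 2, 1]

        # Kruskal-Wallis tests whether the groups share a distribution,
        # without assuming the counts are normally distributed.
        stat, p = kruskal(low_intensity, mid_intensity, high_intensity)
        print(f"H = {stat:.2f}, p = {p:.4f}")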

  15. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE PAGES

    Huang, Qiuhua; Vittal, Vijay

    2018-05-09

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission system level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increases. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables simulation switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. In conclusion, the results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.
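
    A minimal sketch of the mode-switching idea, not the authors' implementation: run the detailed EMT model alongside the phasor model, and retire the expensive EMT portion once its transient has settled. The settling test, thresholds, and all names are assumptions for illustration:

        import math

        def run_with_mode_switching(steps, emt_step, phasor_step, deviation,
                                    tol=1e-3, hold=50):
            """Hybrid (EMT + phasor) simulation that falls back to phasor-only mode.

            deviation(t) returns the EMT/phasor mismatch at step t; after
            `hold` consecutive steps below `tol`, the transient is judged
            dead and the detailed EMT model is dropped.
            """
            mode, quiet = "hybrid", 0
            for t in range(steps):
                if mode == "hybrid":
                    emt_step(t)                  # detailed, expensive model
                    phasor_step(t)               # network-level model
                    quiet = quiet + 1 if deviation(t) < tol else 0
                    if quiet >= hold:
                        mode = "phasor"          # switch back: transient over
                else:
                    phasor_step(t)               # cheap bulk simulation only
            return mode

        # Toy stand-ins: no-op solvers and an exponentially decaying transient.
        final = run_with_mode_switching(
            steps=500,
            emt_step=lambda t: None,
            phasor_step=lambda t: None,
            deviation=lambda t: math.exp(-t / 40.0))
        print("final mode:", final)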

  16. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Qiuhua; Vittal, Vijay

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission system level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increases. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables simulation switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. In conclusion, the results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.

  17. Accurate Behavioral Simulator of All-Digital Time-Domain Smart Temperature Sensors by Using SIMULINK

    PubMed Central

    Chen, Chun-Chi; Chen, Chao-Lieh; Lin, You-Ting

    2016-01-01

    This study proposes a new behavioral simulator that uses SIMULINK for all-digital CMOS time-domain smart temperature sensors (TDSTSs) for performing rapid and accurate simulations. Inverter-based TDSTSs offer the benefits of low cost and simple structure for temperature-to-digital conversion and have been developed. Typically, electronic design automation tools, such as HSPICE, are used to simulate TDSTSs for performance evaluations. However, such tools require extremely long simulation time and complex procedures to analyze the results and generate figures. In this paper, we organize simple but accurate equations into a temperature-dependent model (TDM) by which the TDSTSs evaluate temperature behavior. Furthermore, temperature-sensing models of a single CMOS NOT gate were devised using HSPICE simulations. Using the TDM and these temperature-sensing models, a novel simulator in SIMULINK environment was developed to substantially accelerate the simulation and simplify the evaluation procedures. Experiments demonstrated that the simulation results of the proposed simulator have favorable agreement with those obtained from HSPICE simulations, showing that the proposed simulator functions successfully. This is the first behavioral simulator addressing the rapid simulation of TDSTSs. PMID:27509507
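
    A minimal behavioral sketch of the sensing principle such a simulator models: a pulse traverses an inverter delay line whose gate delay varies with temperature, and a reference counter digitizes the resulting pulse width. The linear delay coefficients and sizes below are invented stand-ins, not the paper's TDM:

        def inverter_delay(temp_c, d0=60e-12, tc=0.2e-12):
            # Toy model: gate delay grows roughly linearly with temperature.
            return d0 + tc * (temp_c - 25.0)

        def tdsts_code(temp_c, stages=5000, counter_clock_hz=1e9):
            """Convert a temperature to a digital output code.

            A pulse travels a delay line of `stages` inverters; the pulse
            width at the output is measured by counting reference clock
            cycles, yielding a temperature-dependent code.
            """
            pulse_width = stages * inverter_delay(temp_c)
            return int(pulse_width * counter_clock_hz)

        for t in (0.0, 25.0, 75.0, 125.0):
            print(f"{t:6.1f} C -> code {tdsts_code(t)}")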

  18. Tri-FAST Hardware-in-the-Loop Simulation. Volume I. Tri-FAST Hardware-in-the-Loop Simulation at the Advanced Simulation Center

    DTIC Science & Technology

    1979-03-28

    Technical Report T-79-43, Tri-FAST Hardware-in-the-Loop Simulation, Volume 1: Tri-FAST Hardware-in-the-Loop Simulation at the Advanced Simulation Center. Keywords: Tri-FAST; hardware-in-the-loop simulation; ACSL; Advanced Simulation Center; RF target models. The purpose of this report is to document the Tri-FAST missile simulation development and the seeker hardware-in-the-loop simulation ...

  19. Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.

    2016-01-01

    Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.

  20. Genetic data simulators and their applications: an overview

    PubMed Central

    Peng, Bo; Chen, Huann-Sheng; Mechanic, Leah E.; Racine, Ben; Clarke, John; Gillanders, Elizabeth; Feuer, Eric J.

    2016-01-01

    Computer simulations have played an indispensable role in the development and application of statistical models and methods for genetic studies across multiple disciplines. The need to simulate complex evolutionary scenarios and pseudo-datasets for various studies has fueled the development of dozens of computer programs with varying reliability, performance, and application areas. To help researchers compare and choose the most appropriate simulators for their studies, we have created the Genetic Simulation Resources (GSR) website, which allows authors of simulation software to register their applications and describe them with more than 160 defined attributes. This article summarizes the properties of 93 simulators currently registered at GSR and provides an overview of the development and applications of genetic simulators. Unlike other review articles that address technical issues or compare simulators for particular application areas, we focus on software development, maintenance, and features of simulators, often from a historical perspective. Publications that cite these simulators are used to summarize both the applications of genetic simulations and the utilization of simulators. PMID:25504286

  1. A breakthrough for experiencing and understanding simulated physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1988-01-01

    The use of computer simulation in physics research is discussed, focusing on improvements to graphic workstations. Simulation capabilities and applications of enhanced visualization tools are outlined. The elements of an ideal computer simulation are presented and the potential for improving various simulation elements is examined. The interface between the human and the computer and simulation models are considered. Recommendations are made for changes in computer simulation practices and applications of simulation technology in education.

  2. Response of Flight Nurses in a Simulated Helicopter Environment.

    PubMed

    Kaniecki, David M; Hickman, Ronald L; Alfes, Celeste M; Reimer, Andrew P

    The purpose of this study was to determine if a helicopter flight simulator could provide a useful educational platform by creating experiences similar to those encountered by actual flight nurses. Flight nurse (FN) and non-FN participants completed a simulated emergency scenario in a flight simulator. Physiologic and psychological stress during the simulation was measured using heart rate and perceived stress scores. A questionnaire was then administered to assess the realism of the flight simulator. Subjects reported that the overall experience in the flight simulator was comparable with a real helicopter. Sounds, communications, vibrations, and movements in the simulator most approximated those of a real-life helicopter environment. Perceived stress levels of all participants increased significantly from 27 (on a 0-100 scale) before simulation to 51 at the peak of the simulation and declined thereafter to 28 (P < .001). Perceived stress levels of FNs increased significantly from 25 before simulation to 54 at the peak of the simulation and declined thereafter to 30 (P < .001). Perceived stress levels of non-FNs increased significantly from 31 before simulation to 49 at the peak of the simulation and declined thereafter to 25 (P < .001). There were no significant differences in perceived stress levels between FNs and non-FNs before (P = .58), during (P = .63), or after (P = .55) simulation. FNs' heart rates increased significantly from 77 before simulation to 100 at the peak of the simulation and declined thereafter to 72 (P < .001). The results of this study suggest that simulation of a critical care scenario in a high-fidelity helicopter flight simulator can provide a realistic helicopter transport experience and create physiologic and psychological stress for participants. Copyright © 2017 Air Medical Journal Associates. Published by Elsevier Inc. All rights reserved.

  3. The effects of simulated fog and motion on simulator sickness in a driving simulator and the duration of after-effects.

    PubMed

    Dziuda, Lukasz; Biernacki, Marcin P; Baran, Paulina M; Truszczyński, Olaf E

    2014-05-01

    In the study, we checked: 1) how the simulator test conditions affect the severity of simulator sickness symptoms; 2) how the severity of simulator sickness symptoms changes over time; and 3) whether the conditions of the simulator test affect the severity of these symptoms in different ways, depending on the time that has elapsed since the performance of the task in the simulator. We studied 12 men aged 24-33 years (M = 28.8, SD = 3.26) using a truck simulator. The SSQ questionnaire was used to assess the severity of the symptoms of simulator sickness. Each of the subjects performed three 30-minute tasks running along the same route in a driving simulator. Each of these tasks was carried out in a different simulator configuration: A) fixed base platform with poor visibility; B) fixed base platform with good visibility; and C) motion base platform with good visibility. The measurement of the severity of the simulator sickness symptoms took place in five consecutive intervals. The results of the analysis showed that the simulator test conditions affect the severity of the simulator sickness symptoms in different ways, depending on the time that has elapsed since performing the task on the simulator. The simulator sickness symptoms persisted at the highest level for the test conditions involving the motion base platform. Also, when performing the tasks on the motion base platform, the severity of the simulator sickness symptoms varied depending on the time that had elapsed since performing the task. Specifically, the addition of motion to the simulation increased the oculomotor and disorientation symptoms reported as well as the duration of the after-effects. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
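
    A minimal sketch of how SSQ severity scores are commonly computed, assuming the standard Kennedy et al. subscale weights; the raw sums below are illustrative, not the study's data, and the real questionnaire first sums 16 symptom ratings (0-3) into overlapping subscales:

        # Published SSQ subscale weights (Kennedy et al., 1993).
        N_WEIGHT, O_WEIGHT, D_WEIGHT, TOTAL_WEIGHT = 9.54, 7.58, 13.92, 3.74

        def ssq_scores(nausea_raw, oculomotor_raw, disorientation_raw):
            """Weight the raw subscale sums into SSQ severity scores."""
            return {
                "nausea": nausea_raw * N_WEIGHT,
                "oculomotor": oculomotor_raw * O_WEIGHT,
                "disorientation": disorientation_raw * D_WEIGHT,
                "total": (nausea_raw + oculomotor_raw + disorientation_raw)
                         * TOTAL_WEIGHT,
            }

        print(ssq_scores(nausea_raw=3, oculomotor_raw=4, disorientation_raw=2))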

  4. Current status of endoscopic simulation in gastroenterology fellowship training programs.

    PubMed

    Jirapinyo, Pichamol; Thompson, Christopher C

    2015-07-01

    Recent guidelines have encouraged gastroenterology and surgical training programs to integrate simulation into their core endoscopic curricula. However, the role that simulation currently has within training programs is unknown. This study aims to assess the current status of simulation among gastroenterology fellowship programs. This questionnaire study consisted of 38 fields divided into two sections. The first section queried program directors' experience on simulation and assessed the current status of simulation at their institution. The second portion surveyed their opinion on the potential role of simulation on the training curriculum. The study was conducted at the 2013 American Gastroenterological Association Training Directors' Workshop in Phoenix, Arizona. The participants were program directors from Accreditation Council for Graduate Medical Education accredited gastroenterology training programs, who attended the workshop. The questionnaire was returned by 69 of 97 program directors (response rate of 71%). 42% of programs had an endoscopic simulator. Computerized simulators (61.5%) were the most common, followed by mechanical (30.8%) and animal tissue (7.7%) simulators, respectively. Eleven programs (15%) required fellows to use simulation prior to clinical cases. Only one program has a minimum number of hours fellows have to participate in simulation training. Current simulators are deemed as easy to use (76%) and good educational tools (65%). Problems are cost (72%) and accessibility (69%). The majority of program directors believe that there is a need for endoscopic simulator training, with only 8% disagreeing. Additionally, a majority believe there is a role for simulation prior to initiation of clinical cases with 15% disagreeing. Gastroenterology fellowship program directors widely recognize the importance of simulation. Nevertheless, simulation is used by only 42% of programs and only 15% of programs require that trainees use simulation prior to clinical cases. No programs currently use simulation as part of the evaluation process.

  5. Building the evidence on simulation validity: comparison of anesthesiologists' communication patterns in real and simulated cases.

    PubMed

    Weller, Jennifer; Henderson, Robert; Webster, Craig S; Shulruf, Boaz; Torrie, Jane; Davies, Elaine; Henderson, Kaylene; Frampton, Chris; Merry, Alan F

    2014-01-01

    Effective teamwork is important for patient safety, and verbal communication underpins many dimensions of teamwork. The validity of the simulated environment would be supported if it elicited similar verbal communications to the real setting. The authors hypothesized that anesthesiologists would exhibit similar verbal communication patterns in routine operating room (OR) cases and routine simulated cases. The authors further hypothesized that anesthesiologists would exhibit different communication patterns in routine cases (real or simulated) and simulated cases involving a crisis. Key communications relevant to teamwork were coded from video recordings of anesthesiologists in the OR, routine simulation and crisis simulation and percentages were compared. The authors recorded comparable videos of 20 anesthesiologists in the two simulations, and 17 of these anesthesiologists in the OR, generating 400 coded events in the OR, 683 in the routine simulation, and 1,419 in the crisis simulation. The authors found no significant differences in communication patterns in the OR and the routine simulations. The authors did find significant differences in communication patterns between the crisis simulation and both the OR and the routine simulations. Participants rated team communication as realistic and considered their communications occurred with a similar frequency in the simulations as in comparable cases in the OR. The similarity of teamwork-related communications elicited from anesthesiologists in simulated cases and the real setting lends support for the ecological validity of the simulation environment and its value in teamwork training. Different communication patterns and frequencies under the challenge of a crisis support the use of simulation to assess crisis management skills.

  6. Simulation Activity in Otolaryngology Residencies.

    PubMed

    Deutsch, Ellen S; Wiet, Gregory J; Seidman, Michael; Hussey, Heather M; Malekzadeh, Sonya; Fried, Marvin P

    2015-08-01

    Simulation has become a valuable tool in medical education, and several specialties accept or require simulation as a resource for resident training or assessment as well as for board certification or maintenance of certification. This study investigates current simulation resources and activities in US otolaryngology residency programs and examines interest in advancing simulation training and assessment within the specialty. Web-based survey. US otolaryngology residency training programs. An electronic web-based survey was disseminated to all US otolaryngology program directors to determine their respective institutional and departmental simulation resources, existing simulation activities, and interest in further simulation initiatives. Descriptive results are reported. Responses were received from 43 of 104 (43%) residency programs. Simulation capabilities and resources are available in most respondents' institutions (78.6% report onsite resources; 73.8% report availability of models, manikins, and devices). Most respondents (61%) report limited simulation activity within otolaryngology. Areas of simulation are broad, addressing technical and nontechnical skills related to clinical training (94%). Simulation is infrequently used for research, credentialing, or systems improvement. The majority of respondents (83.8%) expressed interest in participating in multicenter trials of simulation initiatives. Most respondents from otolaryngology residency programs have incorporated some simulation into their curriculum. Interest among program directors to participate in future multicenter trials appears high. Future research efforts in this area should aim to determine optimal simulators and simulation activities for training and assessment as well as how to best incorporate simulation into otolaryngology residency training programs. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.

  7. To simulate or not to simulate: what are the questions?

    PubMed

    Dudai, Yadin; Evers, Kathinka

    2014-10-22

    Simulation is a powerful method in science and engineering. However, simulation is an umbrella term, and its meaning and goals differ among disciplines. Rapid advances in neuroscience and computing draw increasing attention to large-scale brain simulations. What is the meaning of simulation, and what should the method expect to achieve? We discuss the concept of simulation from an integrated scientific and philosophical vantage point and pinpoint selected issues that are specific to brain simulation.

  8. Synchronization Of Parallel Discrete Event Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S.

    1992-01-01

    Breathing Time Buckets is an adaptive, parallel, discrete-event-simulation synchronization algorithm developed in the Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. The algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while the simulation is in progress, combining the best of the optimistic and conservative synchronization strategies while avoiding their major disadvantages. It is well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.
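
    A minimal, sequential sketch of the event-horizon idea at the core of Breathing Time Buckets, assuming every generated event is scheduled strictly later than the event that spawns it; the handler and the timestamps are invented:

        import heapq

        def breathing_time_buckets(initial_events, handler, t_end):
            """Process events in adaptive cycles bounded by the event horizon.

            Each cycle drains pending events in time order until the next
            pending event would pass the event horizon - the earliest
            timestamp generated so far in the cycle. The cycle's events are
            then committed and its new events merged into the queue. In a
            parallel run, processing past the horizon is what would risk
            rollback.
            """
            pending = list(initial_events)       # (timestamp, name) pairs
            heapq.heapify(pending)
            committed = []
            while pending and pending[0][0] < t_end:
                horizon = float("inf")
                generated = []
                while pending and pending[0][0] <= horizon:
                    t, name = heapq.heappop(pending)
                    for ev in handler(t, name):  # may schedule future events
                        generated.append(ev)
                        horizon = min(horizon, ev[0])
                    committed.append((t, name))
                for ev in generated:
                    heapq.heappush(pending, ev)
            return committed

        # Toy handler: each event schedules one follow-up 7 time units later.
        def handler(t, name):
            return [(t + 7.0, name)] if t + 7.0 < 40.0 else []

        done = breathing_time_buckets([(0.0, "a"), (3.0, "b")], handler, t_end=40.0)
        print(len(done), "events committed, last:", done[-1])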

  9. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulations methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  10. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  11. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Astrophysics Data System (ADS)

    Chamis, C. C.; Singhal, S. N.

    1993-02-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  12. Residents' perceptions of simulation as a clinical learning approach.

    PubMed

    Walsh, Catharine M; Garg, Ankit; Ng, Stella L; Goyal, Fenny; Grover, Samir C

    2017-02-01

    Simulation is increasingly being integrated into medical education; however, there is little research into trainees' perceptions of this learning modality. We elicited trainees' perceptions of simulation-based learning, to inform how simulation is developed and applied to support training. We conducted an instrumental qualitative case study entailing 36 semi-structured one-hour interviews with 12 residents enrolled in an introductory simulation-based course. Trainees were interviewed at three time points: pre-course, post-course, and 4-6 weeks later. Interview transcripts were analyzed using a qualitative descriptive analytic approach. Residents' perceptions of simulation included: 1) simulation serves pragmatic purposes; 2) simulation provides a safe space; 3) simulation presents perils and pitfalls; and 4) optimal design for simulation: integration and tension. Key findings included residents' markedly narrow perception of simulation's capacity to support non-technical skills development or its use beyond introductory learning. Trainees' learning expectations of simulation were restricted. Educators should critically attend to the way they present simulation to learners as, based on theories of problem-framing, trainees' a priori perceptions may delimit the focus of their learning experiences. If they view simulation as merely a replica of real cases for the purpose of practicing basic skills, they may fail to benefit from the full scope of learning opportunities afforded by simulation.

  13. A Simulation of Alternatives for Wholesale Inventory Replenishment

    DTIC Science & Technology

    2016-03-01

    Alternatives for wholesale inventory replenishment are compared; the last method considered is a mixed-integer, linear optimization model. Comparative Inventory Simulation, a discrete-event simulation model, is designed to find the fill rates achieved for each National Item. Keywords: simulation; event graphs; reorder point; fill-rate; backorder; discrete event simulation; wholesale inventory optimization model.

  14. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. The research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework for a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism for handling faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping, and experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) greatly reduce deployment time and increase flexibility in simulation environment construction, and (3) achieve fault-tolerant simulation.

  15. Future directions in flight simulation: A user perspective

    NASA Technical Reports Server (NTRS)

    Jackson, Bruce

    1993-01-01

    Langley Research Center was an early leader in simulation technology, with a special emphasis on space vehicle simulations such as the rendezvous and docking simulator for the Gemini program and the lunar landing simulator used before Apollo. More recently, Langley operated the first synergistic six-degree-of-freedom motion platform (the Visual Motion Simulator, or VMS) and developed the first dual-dome air combat simulator, the Differential Maneuvering Simulator (DMS). Each Langley simulator was developed more or less independently of the others, with different programming support. At the present time, the various simulation cockpits, while supported by the same host computer system, run dissimilar software. The majority of recent investments in Langley's simulation facilities have been hardware procurements: host processors, visual systems, and most recently, an improved motion system. Investments in software improvements, however, have not been of the same order.

  16. DIMENSIONS OF SIMULATION.

    ERIC Educational Resources Information Center

    CRAWFORD, MEREDITH P.

    OPEN AND CLOSED LOOP SIMULATION IS DISCUSSED FROM THE VIEWPOINT OF RESEARCH AND DEVELOPMENT IN TRAINING TECHNIQUES. AREAS DISCUSSED INCLUDE--(1) OPEN-LOOP ENVIRONMENTAL SIMULATION, (2) SIMULATION NOT INVOLVING PEOPLE, (3) ANALYSIS OF OCCUPATIONS, (4) SIMULATION FOR TRAINING, (5) REAL-SIZE SYSTEM SIMULATION, (6) TECHNIQUES OF MINIATURIZATION, AND…

  17. Man-in-the-control-loop simulation of manipulators

    NASA Technical Reports Server (NTRS)

    Chang, J. L.; Lin, Tsung-Chieh; Yae, K. Harold

    1989-01-01

    A method to achieve man-in-the-control-loop simulation is presented. Emerging real-time dynamics simulation suggests the potential for creating an interactive design workstation with a human operator in the control loop. The recursive formulation for multibody dynamics simulation is studied to determine the requirements for man-in-the-control-loop simulation. High-speed computer graphics techniques provide realistic visual cues for the simulator. Backhoe and robot arm simulations are implemented to demonstrate the capability of man-in-the-control-loop simulation.

  18. Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines.

    PubMed

    Dubrowski, Adam; Alani, Sabrina; Bankovic, Tina; Crowe, Andrea; Pollard, Megan

    2015-11-02

    Simulation is an important training tool used in a variety of influential fields. However, development of simulation scenarios - the key component of simulation - occurs in isolation; sharing of scenarios is almost non-existent. This can make simulation use costly in terms of resources and time, with possible redundancy of effort. To alleviate these issues, the goal is to strive for an open community of practice (CoP) surrounding simulation. To facilitate this goal, this report describes a set of guidelines for writing technical reports about simulation use for educating health professionals. Using an accepted set of guidelines will allow for homogeneity when building simulation scenarios and facilitate open sharing among simulation users. In addition to optimizing simulation efforts in institutions that currently use simulation as an educational tool, the development of such a repository may have direct implications for developing countries, where simulation is only starting to be used systematically. Our project facilitates equivalent and global access to information, knowledge, and highest-caliber education - in this context, simulation - collectively, the building blocks of optimal healthcare.

  19. Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines

    PubMed Central

    Alani, Sabrina; Bankovic, Tina; Crowe, Andrea; Pollard, Megan

    2015-01-01

    Simulation is an important training tool used in a variety of influential fields. However, development of simulation scenarios - the key component of simulation - occurs in isolation; sharing of scenarios is almost non-existent. This can make simulation use costly in terms of resources and time, with possible redundancy of effort. To alleviate these issues, the goal is to strive for an open community of practice (CoP) surrounding simulation. To facilitate this goal, this report describes a set of guidelines for writing technical reports about simulation use for educating health professionals. Using an accepted set of guidelines will allow for homogeneity when building simulation scenarios and facilitate open sharing among simulation users. In addition to optimizing simulation efforts in institutions that currently use simulation as an educational tool, the development of such a repository may have direct implications for developing countries, where simulation is only starting to be used systematically. Our project facilitates equivalent and global access to information, knowledge, and highest-caliber education - in this context, simulation - collectively, the building blocks of optimal healthcare. PMID:26677421

  20. Conducting Simulation Studies in the R Programming Environment.

    PubMed

    Hallgren, Kevin A

    2013-10-12

    Simulation studies allow researchers to answer specific questions about data analysis, statistical power, and best-practices for obtaining accurate results in empirical research. Despite the benefits that simulation research can provide, many researchers are unfamiliar with available tools for conducting their own simulation studies. The use of simulation studies need not be restricted to researchers with advanced skills in statistics and computer programming, and such methods can be implemented by researchers with a variety of abilities and interests. The present paper provides an introduction to methods used for running simulation studies using the R statistical programming environment and is written for individuals with minimal experience running simulation studies or using R. The paper describes the rationale and benefits of using simulations and introduces R functions relevant for many simulation studies. Three examples illustrate different applications for simulation studies, including (a) the use of simulations to answer a novel question about statistical analysis, (b) the use of simulations to estimate statistical power, and (c) the use of simulations to obtain confidence intervals of parameter estimates through bootstrapping. Results and fully annotated syntax from these examples are provided.
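
    As a minimal illustration of the second kind of study described above (estimating statistical power by simulation), the following sketch does in Python what the paper demonstrates in R; the sample size, effect size, and replication count are arbitrary choices for the example:

    ```python
    # Hypothetical sketch (a Python analogue of the paper's R examples):
    # estimating the power of a two-sample t-test by Monte Carlo simulation.
    import numpy as np
    from scipy import stats

    def estimate_power(n=30, effect=0.5, alpha=0.05, n_sims=10_000, seed=1):
        rng = np.random.default_rng(seed)
        rejections = 0
        for _ in range(n_sims):
            a = rng.normal(0.0, 1.0, n)        # control group draws
            b = rng.normal(effect, 1.0, n)     # treatment group, shifted mean
            _, p = stats.ttest_ind(a, b)
            rejections += p < alpha            # count significant results
        return rejections / n_sims             # empirical power

    print(estimate_power())  # roughly 0.48 for n=30 per group, d=0.5
    ```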

  1. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes and have now been ported to microcomputers. Graphics and animation capabilities have been added to many of these languages to help users build models and evaluate simulation results. With all these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then validate the simulation model is always greater than originally anticipated. One approach to minimize the time requirement is to use pre-defined macros that describe common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on its overall structure, its elements, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.
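
    To make the code-generation idea concrete, here is a hypothetical miniature of such an assistant (not the tool described above): a symbolic description of a two-station line is expanded into a runnable model for the SimPy discrete-event library. The station names, distributions, and SimPy target are all choices made for this sketch.

    ```python
    # Hypothetical "simulation assistant" in miniature: translate a symbolic
    # model description into simulation code (here targeting SimPy).
    SPEC = [  # modeler's symbolic description of a two-station line
        {"name": "lathe", "servers": 2, "mean_service": 4.0},
        {"name": "mill", "servers": 1, "mean_service": 6.5},
    ]

    TEMPLATE = """\
    import random, simpy

    env = simpy.Environment()
    {resources}

    def job(env, name):
    {steps}

    def source(env, mean_iat=5.0):
        i = 0
        while True:
            yield env.timeout(random.expovariate(1.0 / mean_iat))
            i += 1
            env.process(job(env, f"job{{i}}"))

    env.process(source(env))
    env.run(until=1000)
    """

    def generate(spec):
        resources = "\n".join(
            f'{s["name"]} = simpy.Resource(env, capacity={s["servers"]})'
            for s in spec)
        steps = "\n".join(
            f'    with {s["name"]}.request() as req:\n'
            f'        yield req\n'
            f'        yield env.timeout(random.expovariate(1.0 / {s["mean_service"]}))'
            for s in spec)
        return TEMPLATE.format(resources=resources, steps=steps)

    print(generate(SPEC))  # emits a runnable SimPy model of the described line
    ```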

  2. Fluid Structural Analysis of Human Cerebral Aneurysm Using Their Own Wall Mechanical Properties

    PubMed Central

    Valencia, Alvaro; Burdiles, Patricio; Ignat, Miguel; Mura, Jorge; Rivera, Rodrigo; Sordo, Juan

    2013-01-01

    Computational Structural Dynamics (CSD) simulations, a Computational Fluid Dynamics (CFD) simulation, and Fluid Structure Interaction (FSI) simulations were carried out in an anatomically realistic model of a saccular cerebral aneurysm, with the objective of quantifying the effects of the type of simulation on the principal fluid and solid mechanics results. Eight CSD simulations, one CFD simulation, and four FSI simulations were performed. The results allowed study of the influence of the type of material elements in the solid, the aneurysm's wall thickness, and the type of simulation on the modeling of a human cerebral aneurysm. The simulations used the aneurysm's own wall mechanical properties. The most complex simulation was the fully coupled FSI simulation with hyperelastic Mooney-Rivlin material, normal internal pressure, and normal variable thickness. The one-way coupled FSI simulation with hyperelastic Mooney-Rivlin material, normal internal pressure, and normal variable thickness presented the most similar results to the fully coupled FSI simulation, while requiring one-fourth of the calculation time. PMID:24151523
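
    For reference, the two-parameter Mooney-Rivlin strain-energy density named above takes the standard textbook form below (with a volumetric penalty term; the constants fitted to the aneurysm wall tissue are reported in the paper, not here):

    ```latex
    W = C_{10}\,(\bar{I}_1 - 3) + C_{01}\,(\bar{I}_2 - 3) + \frac{1}{D_1}\,(J - 1)^2
    ```

    where \(\bar{I}_1\) and \(\bar{I}_2\) are the first and second invariants of the isochoric right Cauchy-Green deformation tensor and \(J\) is the volume ratio.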

  3. NASA Lunar Regolith Simulant Program

    NASA Technical Reports Server (NTRS)

    Edmunson, J.; Betts, W.; Rickman, D.; McLemore, C.; Fikes, J.; Stoeser, D.; Wilson, S.; Schrader, C.

    2010-01-01

    Lunar regolith simulant production is absolutely critical to returning man to the Moon. Regolith simulant is used to test hardware exposed to the lunar surface environment, simulate health risks to astronauts, practice in situ resource utilization (ISRU) techniques, and evaluate dust mitigation strategies. Lunar regolith simulant design, production process, and management are a cooperative venture between members of the NASA Marshall Space Flight Center (MSFC) and the U.S. Geological Survey (USGS). The MSFC simulant team is a satellite of the Dust group based at Glenn Research Center. The goals of the cooperative group are to (1) reproduce characteristics of lunar regolith using simulants, (2) produce simulants as cheaply as possible, (3) produce simulants in the amounts needed, and (4) produce simulants to meet users' schedules.

  4. Channel simulation to facilitate mobile-satellite communications research

    NASA Technical Reports Server (NTRS)

    Davarian, Faramaz

    1987-01-01

    The mobile-satellite-service channel simulator, a facility for end-to-end hardware simulation of mobile satellite communications links, is discussed. Propagation effects, Doppler, interference, band limiting, satellite nonlinearity, and thermal noise have been incorporated into the simulator. The propagation environment in which the simulator needs to operate and the architecture of the simulator are described. The simulator is composed of a mobile/fixed transmitter, interference transmitters, a propagation path simulator, a spacecraft, and a fixed/mobile receiver. Data from application experiments conducted with the channel simulator are presented; the noise conversion technique used to evaluate interference effects, the error floor phenomenon of digital multipath fading links, and the fade margin associated with a noncoherent receiver are examined. Diagrams of the simulator are provided.
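
    As a hedged illustration of the kind of link-level statistic such a channel simulator is used to measure (not code from the JPL facility), the following Monte Carlo sketch estimates the bit error rate of BPSK over a flat Rayleigh-fading channel with ideal coherent detection:

    ```python
    # Illustrative sketch: Monte Carlo bit-error-rate of BPSK over a flat
    # Rayleigh-fading channel; all parameters are arbitrary example values.
    import numpy as np

    rng = np.random.default_rng(0)
    n_bits, snr_db = 1_000_000, 10.0
    snr = 10 ** (snr_db / 10)

    bits = rng.integers(0, 2, n_bits)
    x = 2.0 * bits - 1.0                                   # BPSK symbols +-1
    h = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2)
    noise = ((rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits))
             * np.sqrt(1 / (2 * snr)))
    y = h * x + noise                                      # faded, noisy signal
    decided_one = np.real(np.conj(h) * y) > 0              # coherent detection
    ber = np.mean(decided_one != (bits == 1))
    print(f"BER at {snr_db} dB over Rayleigh fading: {ber:.4f}")  # ~0.023
    ```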

  5. Real-time simulation of the TF30-P-3 turbofan engine using a hybrid computer

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Bruton, W. M.

    1974-01-01

    A real-time, hybrid-computer simulation of the TF30-P-3 turbofan engine was developed. The simulation was primarily analog in nature but used the digital portion of the hybrid computer to perform bivariate function generation associated with the performance of the engine's rotating components. FORTRAN listings and analog patching diagrams are provided. The hybrid simulation was controlled by a digital computer programmed to simulate the engine's standard hydromechanical control. Both steady-state and dynamic data obtained from the digitally controlled engine simulation are presented. Hybrid simulation data are compared with data obtained from a digital simulation provided by the engine manufacturer. The comparisons indicate that the real-time hybrid simulation adequately matches the baseline digital simulation.
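
    The bivariate function generation mentioned above is, in modern terms, two-dimensional table lookup with interpolation on component performance maps. A small sketch under assumed data (the breakpoints and values below are invented for illustration, not the TF30-P-3 maps):

    ```python
    # Sketch of bivariate function generation: bilinear lookup on a component
    # performance map. Map data here is illustrative only.
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    speed = np.array([0.6, 0.8, 1.0])            # corrected speed (fraction)
    pratio = np.array([1.2, 1.6, 2.0])           # pressure ratio breakpoints
    wcorr = np.array([[55., 60., 62.],           # corrected flow at each
                      [70., 78., 82.],           # (speed, pressure ratio) node
                      [85., 95., 101.]])

    flow_map = RegularGridInterpolator((speed, pratio), wcorr)
    print(flow_map([[0.9, 1.8]]))                # bilinear lookup -> [89.]
    ```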

  6. The optical design and simulation of the collimated solar simulator

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Ma, Tao

    2018-01-01

    The solar simulator is a lighting device that simulates solar radiation. It has been widely used in the testing of solar cells, satellite space-environment simulation and ground experiments, and the testing and calibration of solar sensors. The solar simulator mainly consists of a short-arc xenon lamp, ellipsoidal reflectors, a group of optical integrators, a field stop, an aspheric folding mirror, and a collimating reflector. In this paper, the basic dimensions of the solar simulator's optical system are obtained by calculation. The system is then optically modeled with the LightTools software, and a simulation analysis of the solar simulator using the Monte Carlo ray-tracing technique is conducted. Finally, the simulation results are presented quantitatively in diagrammatic form. The rationality of the design is verified on a theoretical basis.

  7. Method and system for fault accommodation of machines

    NASA Technical Reports Server (NTRS)

    Goebel, Kai Frank (Inventor); Subbu, Rajesh Venkat (Inventor); Rausch, Randal Thomas (Inventor); Frederick, Dean Kimball (Inventor)

    2011-01-01

    A method for multi-objective fault accommodation using predictive modeling is disclosed. The method includes using a simulated machine that simulates a faulted actual machine, and using a simulated controller that simulates an actual controller. A multi-objective optimization process is performed, based on specified control settings for the simulated controller and specified operational scenarios for the simulated machine controlled by the simulated controller, to generate a Pareto frontier-based solution space relating performance of the simulated machine to settings of the simulated controller, including adjustment to the operational scenarios to represent a fault condition of the simulated machine. Control settings of the actual controller are adjusted, represented by the simulated controller, for controlling the actual machine, represented by the simulated machine, in response to a fault condition of the actual machine, based on the Pareto frontier-based solution space, to maximize desirable operational conditions and minimize undesirable operational conditions while operating the actual machine in a region of the solution space defined by the Pareto frontier.
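
    As a sketch of the multi-objective step the method relies on (not the patented implementation), the following filters candidate controller settings down to the non-dominated Pareto frontier; the two objective values per candidate are illustrative and both are treated as to-be-maximized:

    ```python
    # Minimal Pareto-frontier extraction: keep candidates not dominated by
    # any other candidate (maximize both objectives). Data is made up.
    def pareto_frontier(points):
        frontier = []
        for p in points:
            dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p
                            for q in points)
            if not dominated:
                frontier.append(p)
        return frontier

    candidates = [(0.9, 0.2), (0.7, 0.7), (0.4, 0.9), (0.6, 0.5), (0.3, 0.3)]
    print(pareto_frontier(candidates))  # [(0.9, 0.2), (0.7, 0.7), (0.4, 0.9)]
    ```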

  8. The Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Mike G.; Bowman, James D.

    2007-01-01

    The paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers that focuses on the investigation and development of technologies, processes, and integrated simulations related to the collaborative distributed simulation of complex space systems in support of NASA's Exploration Initiative. This paper describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. In the network work area, DSES is developing a Distributed Simulation Network that will provide agency-wide support for distributed simulation between all NASA centers. In the software work area, DSES is developing a collection of software models, tools, and procedures that eases the burden of developing distributed simulations and provides a consistent interoperability infrastructure for agency-wide participation in integrated simulation. Finally, for simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents current status and plans for each of these work areas, with specific examples of simulations that support NASA's exploration initiatives.

  9. Warriors Edge Simulation and Gaming System: The Squad Simulation

    DTIC Science & Technology

    2005-08-01

    Warriors Edge Simulation and Gaming System: The Squad Simulation, by Mark Thomas and Gary Moss (Computational and Information Sciences Directorate, ARL), Report ARL-TR-3564, August 2005.

  10. Mathematical modeling and SAR simulation multifunction SAR technology efforts

    NASA Technical Reports Server (NTRS)

    Griffin, C. R.; Estes, J. M.

    1981-01-01

    The orbital SAR (synthetic aperture radar) simulation data were used in several simulation efforts directed toward advanced SAR development. Efforts toward simulating an operational radar, simulating antenna polarization effects, and simulating SAR images at several different wavelengths are discussed. Avenues for improvement of the orbital SAR simulation and its application to the development of advanced digital radar data processing schemes are indicated.

  11. LibKiSAO: a Java library for Querying KiSAO.

    PubMed

    Zhukova, Anna; Adams, Richard; Laibe, Camille; Le Novère, Nicolas

    2012-09-24

    The Kinetic Simulation Algorithm Ontology (KiSAO) supplies information about existing algorithms available for the simulation of Systems Biology models, their characteristics, parameters and inter-relationships. KiSAO enables the unambiguous identification of algorithms from simulation descriptions. Information about analogous methods having similar characteristics, and about algorithm parameters, incorporated into KiSAO is desirable for simulation tools. To retrieve this information programmatically, an application programming interface (API) for KiSAO is needed. We developed libKiSAO, a Java library to enable querying of KiSAO. It implements methods to retrieve information about simulation algorithms stored in KiSAO, their characteristics and parameters, and methods to query the algorithm hierarchy and search for similar algorithms providing comparable results for the same simulation set-up. Using libKiSAO, simulation tools can make logical inferences based on this knowledge and choose the most appropriate algorithm to perform a simulation. LibKiSAO also enables simulation tools to handle a wider range of simulation descriptions by determining which of the available methods are similar and can be used instead of the one indicated in the simulation description if that one is not implemented. LibKiSAO enables Java applications to easily access information about simulation algorithms, their characteristics and parameters stored in the OWL-encoded Kinetic Simulation Algorithm Ontology. LibKiSAO can be used by simulation description editors and simulation tools to improve the reproducibility of computational simulation tasks and facilitate model re-use.
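
    To illustrate the kind of query libKiSAO answers, here is a hypothetical Python sketch (not the library's actual Java API) that walks a hand-written ontology fragment to find substitutable algorithms; the term IDs and characteristics are only approximately faithful to KiSAO:

    ```python
    # Hypothetical sketch of hierarchy-and-characteristics querying, the kind
    # of inference libKiSAO supports. Ontology fragment is hand-written.
    KISAO = {
        "KISAO_0000433": {"label": "CVODE-like method", "parent": None,
                          "characteristics": {"deterministic"}},
        "KISAO_0000019": {"label": "CVODE", "parent": "KISAO_0000433",
                          "characteristics": {"deterministic", "stiff"}},
        "KISAO_0000030": {"label": "Euler forward method",
                          "parent": "KISAO_0000433",
                          "characteristics": {"deterministic"}},
    }

    def similar_algorithms(term_id, onto=KISAO):
        """Siblings offering all characteristics of the requested algorithm."""
        wanted = onto[term_id]["characteristics"]
        parent = onto[term_id]["parent"]
        return [tid for tid, t in onto.items()
                if tid != term_id and t["parent"] == parent
                and wanted <= t["characteristics"]]

    print(similar_algorithms("KISAO_0000030"))  # ['KISAO_0000019']
    ```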

  12. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    NASA Astrophysics Data System (ADS)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation uses computer simulation technology to model a ladar system in order to predict its performance. This paper reviews developments in laser imaging radar simulation in domestic and overseas studies, covering computer simulation of ladar systems for different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research in imaging ladar system simulation has been limited in scale, with un-unified designs and applications that mostly achieve simple functional simulation based on ranging equations. A laser imaging radar simulation with an open and modularized structure is therefore proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings, and system controller. A unified Matlab toolbox and standard control modules have been built, with regulated function inputs and outputs and communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle is presented based on the toolbox. The simulation result shows that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection, the open structure enables the toolbox to be modified for specialized requirements, and the modularization makes simulations flexible.

  13. Expansion of flight simulator capability for study and solution of aircraft directional control problems on runways

    NASA Technical Reports Server (NTRS)

    Kibbee, G. W.

    1978-01-01

    The development and evaluation of a DC-9-10 runway directional control simulator are described, along with the evaluation results. An existing wide-body flight simulator was modified to this aircraft configuration. The simulator was structured to use either of two antiskid simulations: (1) an analog mechanization that used aircraft hardware, or (2) a digital software simulation. After the simulation was developed, it was evaluated by 14 pilots who made 818 simulated flights. These evaluations involved landings, rejected takeoffs, and various ground maneuvers. Qualitatively, most pilots evaluated the simulator as realistic, with good potential especially for pilot training for adverse runway conditions.

  14. Physical Models and Virtual Reality Simulators in Otolaryngology.

    PubMed

    Javia, Luv; Sardesai, Maya G

    2017-10-01

    The increasing role of simulation in the medical education of future otolaryngologists has followed suit with other surgical disciplines. Simulators make it possible for the resident to explore and learn in a safe and less stressful environment. The various subspecialties in otolaryngology use physical simulators and virtual-reality simulators. Although physical simulators allow the operator to make direct contact with its components, virtual-reality simulators allow the operator to interact with an environment that is computer generated. This article gives an overview of the various types of physical simulators and virtual-reality simulators used in otolaryngology that have been reported in the literature. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Drape simulation and subjective assessment of virtual drape

    NASA Astrophysics Data System (ADS)

    Buyukaslan, E.; Kalaoglu, F.; Jevsnik, S.

    2017-10-01

    In this study, a commercial 3D virtual garment simulation software package (Optitex) is used to simulate the drape behaviour of five different fabrics. The mechanical properties of the selected fabrics are measured by the Fabric Assurance by Simple Testing (FAST) method. The measured bending, shear, and extension properties of the fabrics are entered into the simulation software to achieve more realistic simulations. Simulation images of the fabrics were shown to 27 people, who were asked to match real drape images of the fabrics with the simulated drape images. The simulations of two fabrics were correctly matched by the majority of the test group; however, the simulations of the other three fabrics were mismatched by most of the people.

  16. Simulation verification techniques study. Subsystem simulation validation techniques

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1974-01-01

    Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters are presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions and recommendations are also given.

  17. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach those skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
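
    Since the paper argues that its Excel model contains all the fundamental functionality of discrete-event simulation, it is worth spelling that machinery out. A minimal sketch (in Python rather than Excel, with deterministic times for brevity): a clock, a future-event list, and state that changes only at event times, here for a single-server queue.

    ```python
    # Minimal discrete-event simulation core: clock + future-event list.
    # Fixed interarrival (5.0) and service (4.0) times keep the sketch short.
    import heapq

    events = [(0.0, "arrival")]          # future-event list: (time, kind)
    queue, busy, served = 0, False, 0
    t_arr, t_svc = 5.0, 4.0

    while events and events[0][0] < 100.0:
        t, kind = heapq.heappop(events)  # advance clock to next event
        if kind == "arrival":
            heapq.heappush(events, (t + t_arr, "arrival"))  # next arrival
            if busy:
                queue += 1
            else:
                busy = True
                heapq.heappush(events, (t + t_svc, "departure"))
        else:                            # departure
            served += 1
            if queue:
                queue -= 1
                heapq.heappush(events, (t + t_svc, "departure"))
            else:
                busy = False

    print(f"jobs served by t=100: {served}")  # 20 with these times
    ```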

  18. Solving search problems by strongly simulating quantum circuits

    PubMed Central

    Johnson, T. H.; Biamonte, J. D.; Clark, S. R.; Jaksch, D.

    2013-01-01

    Simulating quantum circuits using classical computers lets us analyse the inner workings of quantum algorithms. The most complete type of simulation, strong simulation, is believed to be generally inefficient. Nevertheless, several efficient strong simulation techniques are known for restricted families of quantum circuits and we develop an additional technique in this article. Further, we show that strong simulation algorithms perform another fundamental task: solving search problems. Efficient strong simulation techniques allow solutions to a class of search problems to be counted and found efficiently. This enhances the utility of strong simulation methods, known or yet to be discovered, and extends the class of search problems known to be efficiently simulable. Relating strong simulation to search problems also bounds the computational power of efficiently strongly simulable circuits; if they could solve all problems in P this would imply that all problems in NP and #P could be solved in polynomial time. PMID:23390585
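
    A minimal sketch of what "strong" simulation means in practice: classically computing the full state vector, and hence any desired amplitude, of a circuit. Here a two-qubit Bell-state preparation in NumPy; the state vector's exponential growth with qubit count is why strong simulation is believed to be generally inefficient.

    ```python
    # Strong simulation of a tiny circuit: track the full state vector.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
    CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],     # control = qubit 0
                     [0, 0, 0, 1], [0, 0, 1, 0]])
    I2 = np.eye(2)

    state = np.zeros(4); state[0] = 1.0              # start in |00>
    state = np.kron(H, I2) @ state                   # H on qubit 0
    state = CNOT @ state                             # entangle
    print(np.round(state, 3))                        # [0.707 0. 0. 0.707]
    ```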

  19. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Schutte, Paul C.; Trujillo, Anna; Pritchett, Amy R.

    2000-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plug-in' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
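
    A hedged sketch of the plug-in pattern the report describes (not the RFS source): a core that registers components behind a common interface and routes shared state between them each frame. All names below are invented for the illustration.

    ```python
    # Hypothetical plug-in core: register components, then step each one per
    # frame, sharing state through a common "bus" dictionary.
    from typing import Protocol

    class Component(Protocol):
        def step(self, t: float, bus: dict) -> None: ...

    class Core:
        def __init__(self):
            self.components, self.bus = [], {}
        def register(self, c: Component):
            self.components.append(c)      # plug-in registration
        def run(self, dt=0.01, t_end=0.05):
            t = 0.0
            while t < t_end:
                for c in self.components:  # core routes all communication
                    c.step(t, self.bus)
                t += dt

    class Dynamics:
        def step(self, t, bus): bus["altitude"] = 1000 + 50 * t

    class Display:
        def step(self, t, bus): print(f"t={t:.2f}s alt={bus['altitude']:.1f}")

    core = Core(); core.register(Dynamics()); core.register(Display())
    core.run()
    ```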

  20. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy R.

    2002-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).

  1. Requirements and Techniques for Developing and Measuring Simulant Materials

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; Owens, Charles; Howard, Rick

    2006-01-01

    The 1989 workshop report Workshop on Production and Uses of Simulated Lunar Materials and the NASA Technical Publication Lunar Regolith Simulant Materials: Recommendations for Standardization, Production, and Usage identified and reinforced the need for a set of standards and requirements for the production and usage of lunar simulant materials. As NASA prepares to return to the Moon, a set of requirements has been developed for simulant materials, and methods to produce and measure those simulants have been defined. Addressed in the requirements document are: 1) a method for evaluating the quality of any simulant of a regolith, 2) the minimum characteristics for simulants of lunar regolith, and 3) a method to produce the lunar regolith simulants needed for NASA's exploration mission. A method to evaluate new and current simulants has also been rigorously defined through the mathematics of Figures of Merit (FoM), a concept new to simulant development. A single FoM is conceptually an algorithm defining a single characteristic of a simulant, providing a clear comparison of that characteristic between the simulant and a reference material. Included as an intrinsic part of the algorithm is a minimum acceptable performance for the characteristic of interest. The FoM algorithms for Standard Lunar Regolith Simulants are also explicitly keyed to a recommended method for making lunar simulants.
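
    As a hedged sketch of the Figure-of-Merit idea (not NASA's published FoM algorithms), the following scores one characteristic, a cumulative particle-size distribution, against a reference material and applies a minimum acceptable performance threshold; all numbers are illustrative:

    ```python
    # Sketch of a Figure of Merit: one characteristic, one score, one
    # built-in pass/fail threshold. Distributions below are invented.
    import numpy as np

    def fom_size(simulant_cdf, reference_cdf, threshold=0.8):
        """1 minus the mean absolute gap between cumulative size
        distributions sampled at common sieve sizes; passes above threshold."""
        gap = np.mean(np.abs(np.asarray(simulant_cdf)
                             - np.asarray(reference_cdf)))
        score = 1.0 - gap
        return score, score >= threshold

    reference = [0.05, 0.20, 0.55, 0.85, 1.00]   # illustrative CDF values
    simulant = [0.08, 0.24, 0.50, 0.80, 1.00]
    print(fom_size(simulant, reference))          # score ~0.966 -> passes
    ```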

  2. The internal validity of arthroscopic simulators and their effectiveness in arthroscopic education.

    PubMed

    Slade Shantz, Jesse Alan; Leiter, Jeff R S; Gottschalk, Tania; MacDonald, Peter Benjamin

    2014-01-01

    The purpose of this systematic review was to identify standard procedures for the validation of arthroscopic simulators and to determine whether simulators improve the surgical skills of users. Arthroscopic simulator validation studies and randomized trials assessing the effectiveness of arthroscopic simulators in education were identified from online databases, as well as grey literature and reference lists. Only validation studies and randomized trials were included for review. Study heterogeneity was calculated and, where appropriate, study results were combined employing a random effects model. Four hundred and thirteen studies were reviewed. Thirteen studies met the inclusion criteria assessing the construct validity of simulators. A pooled analysis of internal validation studies determined that simulators could discriminate between novices and experts, but not between novice and intermediate trainees, on time to completion of a simulated task. Only one study assessed the utility of a knee simulator in training arthroscopic skills directly, and it demonstrated that the skill level of simulator-trained residents was greater than that of non-simulator-trained residents. Too much heterogeneity exists in the literature to determine the internal and transfer validity of the arthroscopic simulators currently available. Evidence suggests that simulators can discriminate between novice and expert users, but discrimination between novice and intermediate trainees should be paramount in surgical education. International standards for the assessment of arthroscopic simulator validity should be developed to increase the use and effectiveness of simulators in orthopedic surgery.

  3. Evaluating best educational practices, student satisfaction, and self-confidence in simulation: A descriptive study.

    PubMed

    Zapko, Karen A; Ferranto, Mary Lou Gemma; Blasiman, Rachael; Shelestak, Debra

    2018-01-01

    The National League for Nursing (NLN) has endorsed simulation as a necessary teaching approach to prepare students for the demanding role of professional nursing. Questions arise about the suitability of simulation experiences to educate students. Empirical support for the effect of simulation on patient outcomes is sparse. Most studies on simulation report only anecdotal results rather than data obtained using evaluative tools. The aim of this study was to examine student perception of best educational practices in simulation and to evaluate their satisfaction and self-confidence in simulation. This study was a descriptive study designed to explore students' perceptions of the simulation experience over a two-year period. Using the Jeffries framework, a Simulation Day was designed consisting of serial patient simulations using high and medium fidelity simulators and live patient actors. The setting for the study was a regional campus of a large Midwestern Research 2 university. The convenience sample consisted of 199 participants and included sophomore, junior, and senior nursing students enrolled in the baccalaureate nursing program. The Simulation Days consisted of serial patient simulations using high and medium fidelity simulators and live patient actors. Participants rotated through four scenarios that corresponded to their level in the nursing program. Data was collected in two consecutive years. Participants completed both the Educational Practices Questionnaire (Student Version) and the Student Satisfaction and Self-Confidence in Learning Scale. Results provide strong support for using serial simulation as a learning tool. Students were satisfied with the experience, felt confident in their performance, and felt the simulations were based on sound educational practices and were important for learning. Serial simulations and having students experience simulations more than once in consecutive years is a valuable method of clinical instruction. When conducted well, simulations can lead to increased student satisfaction and self-confidence. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Prospects for Simulation and Gaming in Mathematics and Science Education

    ERIC Educational Resources Information Center

    Bloomer, Jacquetta

    1974-01-01

    The growth and potential of simulation and gaming techniques are examined in pure science, applied science and mathematics. The contribution of simulations, simulation games and non-simulation games are separately assessed with selective illustrations; in particular, indications for using simulated, as opposed to "live," experiments in science…

  5. A Simbol-X Event Simulator

    NASA Astrophysics Data System (ADS)

    Puccetti, S.; Fiore, F.; Giommi, P.

    2009-05-01

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in simulating data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to ray-tracing simulators. These properties make it well suited to supporting users in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here an outline of the simulator and a few examples of simulated data.

  6. Higher-level simulations of turbulent flows

    NASA Technical Reports Server (NTRS)

    Ferziger, J. H.

    1981-01-01

    The fundamentals of large eddy simulation are considered and the approaches to it are compared. Subgrid scale models and the development of models for the Reynolds-averaged equations are discussed as well as the use of full simulation in testing these models. Numerical methods used in simulating large eddies, the simulation of homogeneous flows, and results from full and large scale eddy simulations of such flows are examined. Free shear flows are considered with emphasis on the mixing layer and wake simulation. Wall-bounded flow (channel flow) and recent work on the boundary layer are also discussed. Applications of large eddy simulation and full simulation in meteorological and environmental contexts are included along with a look at the direction in which work is proceeding and what can be expected from higher-level simulation in the future.
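
    For concreteness, the classic subgrid-scale closure discussed in this line of work is the Smagorinsky model (a standard form, not a formula specific to this report), in which the effect of the unresolved eddies enters as an eddy viscosity built from the resolved strain rate:

    ```latex
    \nu_t = (C_s \Delta)^2\,\lvert\bar{S}\rvert, \qquad
    \lvert\bar{S}\rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}, \qquad
    \bar{S}_{ij} = \tfrac{1}{2}\left(\frac{\partial \bar{u}_i}{\partial x_j}
                   + \frac{\partial \bar{u}_j}{\partial x_i}\right)
    ```

    Here \(\Delta\) is the filter width and \(C_s\) an empirical constant of order 0.1.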

  7. Low Fidelity Simulation of a Zero-G Robot

    NASA Technical Reports Server (NTRS)

    Sweet, Adam

    2001-01-01

    The item to be cleared is a low-fidelity software simulation model of a hypothetical free-flying robot designed for use in zero-gravity environments. This simulation model works with the HCC simulation system that was developed by Xerox PARC and NASA Ames Research Center. HCC has previously been cleared for distribution. When used with the HCC software, the model computes the location and orientation of the simulated robot over time. Failures (such as a broken motor) can be injected into the simulation to produce simulated behavior corresponding to the failure. Release of this simulation will allow researchers to test their software diagnosis systems by attempting to diagnose the simulated failure from the simulated behavior. This model does not contain any encryption software, nor can it perform any control tasks that might be export controlled.

  8. Development Issues for Lunar Regolith Simulants

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; Carpenter, Paul; Sibille, Laurent; Owens, Charles; French, Raymond; McLemore, Carole

    2006-01-01

    Significant challenges and logistical issues exist for the development of standardized lunar regolith simulant (SLRS) materials for use in the development and testing of flight hardware for upcoming NASA lunar missions. A production program at Marshall Space Flight Center (MSFC) for the deployment of the lunar mare basalt simulant JSC-1A is underway. Root simulants have been proposed for the development of a low-Ti mare basalt simulant and a high-Ca highland anorthosite simulant, as part of a framework of simulant development outlined in the 2005 Lunar Regolith Simulant Materials Workshop held at MSFC. Many of the recommendations for production and standardization of simulants have already been documented by the MSFC team, but a number of unanswered questions related to geology need to be addressed prior to the creation of the simulants.

  9. Residents’ perceptions of simulation as a clinical learning approach

    PubMed Central

    Walsh, Catharine M.; Garg, Ankit; Ng, Stella L.; Goyal, Fenny; Grover, Samir C.

    2017-01-01

    Background Simulation is increasingly being integrated into medical education; however, there is little research into trainees’ perceptions of this learning modality. We elicited trainees’ perceptions of simulation-based learning, to inform how simulation is developed and applied to support training. Methods We conducted an instrumental qualitative case study entailing 36 semi-structured one-hour interviews with 12 residents enrolled in an introductory simulation-based course. Trainees were interviewed at three time points: pre-course, post-course, and 4–6 weeks later. Interview transcripts were analyzed using a qualitative descriptive analytic approach. Results Residents’ perceptions of simulation included: 1) simulation serves pragmatic purposes; 2) simulation provides a safe space; 3) simulation presents perils and pitfalls; and 4) optimal design for simulation: integration and tension. Key findings included residents’ markedly narrow perception of simulation’s capacity to support non-technical skills development or its use beyond introductory learning. Conclusion Trainees’ learning expectations of simulation were restricted. Educators should critically attend to the way they present simulation to learners as, based on theories of problem-framing, trainees’ a priori perceptions may delimit the focus of their learning experiences. If they view simulation as merely a replica of real cases for the purpose of practicing basic skills, they may fail to benefit from the full scope of learning opportunities afforded by simulation. PMID:28344719

  10. 2007 Lunar Regolith Simulant Workshop Overview

    NASA Technical Reports Server (NTRS)

    McLemore, Carole A.; Fikes, John C.; Howell, Joe T.

    2007-01-01

    The National Aeronautics and Space Administration (NASA) vision has as a cornerstone, the establishment of an Outpost on the Moon. This Lunar Outpost will eventually provide the necessary planning, technology development, and training for a manned mission to Mars in the future. As part of the overall activity, NASA is conducting Earth-based research and advancing technologies to a Technology Readiness Level (TRL) 6 maturity under the Exploration Technology Development Program that will be incorporated into the Constellation Project as well as other projects. All aspects of the Lunar environment, including the Lunar regolith and its properties, are important in understanding the long-term impacts to hardware, scientific instruments, and humans prior to returning to the Moon and living on the Moon. With the goal of reducing risk to humans and hardware and increasing mission success on the Lunar surface, it is vital that terrestrial investigations including both development and verification testing have access to Lunar-like environments. The Marshall Space Flight Center (MSFC) is supporting this endeavor by developing, characterizing, and producing Lunar simulants in addition to analyzing existing simulants for appropriate applications. A Lunar Regolith Simulant Workshop was conducted by MSFC in Huntsville, Alabama, in October 2007. The purpose of the Workshop was to bring together simulant developers, simulant users, and program and project managers from ETDP and Constellation with the goals of understanding users' simulant needs and their applications. A status of current simulant developments such as the JSC-1A (Mare Type Simulant) and the NASA/U.S. Geological Survey Lunar Highlands-Type Pilot Simulant (NU-LHT-1M) was provided. The method for evaluating simulants, performed via Figures of Merit (FoMs) algorithms, was presented and a demonstration was provided. The four FoM properties currently being assessed are: size, shape, density, and composition. Some of the Workshop findings include: simulant developers must understand simulant users' needs and applications; higher fidelity simulants are needed and needed in larger quantities now; simulants must be characterized to allow "apples-to-apples" comparison of test results; simulant users should confer with simulant experts to assist them in the selection of simulants; safety precautions should be taken in the handling and use of simulants; shipping, storing, and preparation of simulants have important implications; and most importantly, close communications among the simulant community must be maintained and will be continued via telecoms, meetings, and an annual Lunar Regolith Simulant Workshop.

  12. Enabling parallel simulation of large-scale HPC network systems

    DOE PAGES

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; ...

    2016-04-07

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems, in particular networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  13. Enabling parallel simulation of large-scale HPC network systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems, in particular networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  14. A typology of educationally focused medical simulation tools.

    PubMed

    Alinier, Guillaume

    2007-10-01

    The concept of simulation as an educational tool in healthcare is not a new idea but its use has really blossomed over the last few years. This enthusiasm is partly driven by an attempt to increase patient safety and also because the technology is becoming more affordable and advanced. Simulation is becoming more commonly used for initial training purposes as well as for continuing professional development, but people often have very different perceptions of the definition of the term simulation, especially in an educational context. This highlights the need for a clear classification of the technology available but also about the method and teaching approach employed. The aims of this paper are to discuss the current range of simulation approaches and propose a clear typology of simulation teaching aids. Commonly used simulation techniques have been identified and discussed in order to create a classification that reports simulation techniques, their usual mode of delivery, the skills they can address, the facilities required, their typical use, and their pros and cons. This paper presents a clear classification scheme of educational simulation tools and techniques with six different technological levels. They are respectively: written simulations, three-dimensional models, screen-based simulators, standardized patients, intermediate fidelity patient simulators, and interactive patient simulators. This typology allows the accurate description of the simulation technology and the teaching methods applied. Thus valid comparison of educational tools can be made as to their potential effectiveness and verisimilitude at different training stages. The proposed typology of simulation methodologies available for educational purposes provides a helpful guide for educators and participants which should help them to realise the potential learning outcomes at different technological simulation levels in relation to the training approach employed. It should also be a useful resource for simulation users who are trying to improve their educational practice.

  15. Auditory perceptual simulation: Simulating speech rates or accents?

    PubMed

    Zhou, Peiyun; Christianson, Kiel

    2016-07-01

    When readers engage in Auditory Perceptual Simulation (APS) during silent reading, they mentally simulate characteristics of voices attributed to a particular speaker or a character depicted in the text. Previous research found that auditory perceptual simulation of a faster native English speaker during silent reading led to shorter reading times than auditory perceptual simulation of a slower non-native English speaker. Yet it was uncertain whether this difference was triggered by the different speech rates of the speakers or by the difficulty of simulating an unfamiliar accent. The current study investigates this question by comparing faster Indian-English speech and slower American-English speech in the auditory perceptual simulation paradigm. Analyses of reading times of individual words and of full sentences reveal that auditory perceptual simulation again modulated reading rate, and auditory perceptual simulation of the faster Indian-English speech led to faster reading rates than auditory perceptual simulation of the slower American-English speech. The comparison between this experiment and the data from Zhou and Christianson (2016) demonstrates further that the "speakers'" speech rates, rather than the difficulty of simulating a non-native accent, are the primary mechanism underlying auditory perceptual simulation effects. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework

    PubMed Central

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    Biological pattern formation encompasses a variety of striking phenomena. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanisms of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically, based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized and image features are extracted as the system feedback. The unknown model parameters are then obtained by comparing the image features of the simulated image with those of the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to pattern formation simulations for vascular mesenchymal cells and lung development. Within the framework, the spot, stripe, and labyrinthine patterns of vascular mesenchymal cells, as well as the normal branching pattern and a branching pattern lacking side branching for lung development, are obtained in a finite number of iterations. The simulation results indicate that the simulation targets are easy to achieve, especially when the simulated patterns are sensitive to the model parameters. Moreover, this simulation framework can be extended to other types of biological pattern formation. PMID:28225811
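
    To make the feedback loop concrete, the following minimal Python sketch (our illustration, not the authors' code; the model choice, feature, and parameter values are all hypothetical) simulates a Gray-Scott reaction-diffusion system, extracts a single image feature (pattern coverage), and nudges one unknown parameter until the simulated feature matches the target value, in the spirit of the feedback principle described above.

        import numpy as np

        def laplacian(a):
            # five-point stencil with periodic boundaries
            return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
                    np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

        def simulate(F, k, steps=3000, n=64, Du=0.16, Dv=0.08):
            # Gray-Scott model as a stand-in reaction-diffusion PDE system
            u = np.ones((n, n)); v = np.zeros((n, n))
            u[28:36, 28:36] = 0.50; v[28:36, 28:36] = 0.25
            for _ in range(steps):
                uvv = u * v * v
                u += Du * laplacian(u) - uvv + F * (1 - u)
                v += Dv * laplacian(v) + uvv - (F + k) * v
            return v

        def feature(v):
            # image feature fed back to the controller: pattern coverage
            return (v > 0.2).mean()

        # feedback loop: compare the feature of the simulated image with
        # the feature measured on the target pattern, adjust the unknown
        # model parameter, and repeat until they agree
        target_coverage, k, gain = 0.15, 0.060, 0.05
        for _ in range(20):
            err = target_coverage - feature(simulate(F=0.035, k=k))
            if abs(err) < 0.005:
                break
            k -= gain * err            # proportional feedback on k
        print("fitted k ~", round(k, 4))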

  17. Collaborative simulation method with spatiotemporal synchronization process control

    NASA Astrophysics Data System (ADS)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronic system, such as a high-speed train, it is relatively difficult to simulate the entire system's dynamic behavior effectively because it involves multiple disciplinary subsystems. Currently, the most practical approach to multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directional coupled simulations of the subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for the coupled simulation of a complex mechatronic system across multiple subsystems on different platforms. The method consists of (1) a coupler-based coupling mechanism that defines the interfacing and interaction among subsystems, and (2) a simulation process control algorithm that realizes the coupled simulation in a spatiotemporally synchronized manner. The results of a case study show that the proposed method (1) can be used to simulate subsystem interactions under different simulation conditions in an engineering system, and (2) effectively supports multi-directional coupled simulation among multi-disciplinary subsystems. The method has been successfully applied in the design and development of Chinese high-speed trains, demonstrating that it can be applied to a wide range of engineering systems with improved efficiency and effectiveness.
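
    A minimal Python sketch of the temporal half of such process control (hypothetical names and dynamics; not the authors' implementation): two subsystem simulators integrate with their own internal step sizes, but a coupler only lets them exchange interface variables at shared synchronization points, so neither subsystem runs ahead of the coupling schedule.

        # toy subsystem: integrates its own state with its own internal step
        class Subsystem:
            def __init__(self, name, dt, coupling_gain):
                self.name, self.dt, self.g = name, dt, coupling_gain
                self.t, self.x = 0.0, 1.0

            def advance_to(self, t_sync, u_interface):
                # take internal steps, holding the partner's interface
                # value fixed until the next synchronization point
                while self.t < t_sync - 1e-12:
                    h = min(self.dt, t_sync - self.t)
                    self.x += h * (-self.x + self.g * u_interface)
                    self.t += h
                return self.x          # interface value returned to coupler

        # the coupler enforces spatial pairing (which interface variable
        # goes where) and temporal synchronization (shared macro steps)
        a = Subsystem("vehicle", dt=0.001, coupling_gain=0.3)
        b = Subsystem("track", dt=0.005, coupling_gain=0.5)
        xa, xb = a.x, b.x
        for step in range(1, 101):
            t_sync = step * 0.01                  # macro synchronization point
            xa_new = a.advance_to(t_sync, xb)     # each side only sees values
            xb_new = b.advance_to(t_sync, xa)     # from the last sync point
            xa, xb = xa_new, xb_new
        print(round(xa, 4), round(xb, 4))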

  18. Improving the result of forecasting using reservoir and surface network simulation

    NASA Astrophysics Data System (ADS)

    Hendri, R. S.; Winarta, J.

    2018-01-01

    This study aimed to obtain more representative production forecasts using integrated simulation of the pipeline gathering system of the X field. Five main scenarios were considered, covering production forecasts for the existing condition, workovers, and infill drilling, from which the best development scenario was determined. The method couples a reservoir simulator with a pipeline simulator, an approach referred to as integrated reservoir and surface network simulation. Well data from the reservoir simulator were integrated with the pipeline network simulator to construct a new schedule, which served as input to the whole simulation procedure. Well designs were produced with a well modeling simulator and then exported into the pipeline simulator. A standalone reservoir prediction depends on an assumed minimum tubing head pressure (THP) for each well, and the pressure drop in the gathering network is not necessarily calculated; the same scenarios were also run as single-reservoir simulations for comparison. The integrated simulation produces results that approach the actual condition of the reservoir, as confirmed by the THP profiles, which differ between the two methods. The difference between the integrated simulation and the single-model simulation is 6-9%. The aim of solving the back-pressure problem in the pipeline gathering system of the X field was achieved.
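
    The essence of the integration can be sketched as finding, at each schedule step, the operating point where the reservoir's deliverability curve meets the gathering network's backpressure curve, rather than assuming a fixed minimum THP. The following toy Python example (hypothetical curve shapes and coefficients, not the field models used in the study) illustrates that coupling:

        from scipy.optimize import brentq

        # hypothetical inflow performance: the rate a well delivers falls
        # as tubing head pressure (THP) rises (toy linear IPR)
        def well_rate(thp, p_res=250.0, pi=8.0):
            return max(pi * (p_res - thp), 0.0)    # stb/d, pressures in bar

        # hypothetical gathering network: the THP required to push a rate
        # back to the separator grows with throughput (friction losses)
        def network_thp(rate, p_sep=30.0, c=4e-4):
            return p_sep + c * rate ** 2

        # coupled operating point: the THP at which both curves agree; a
        # standalone reservoir run would instead assume a fixed minimum
        # THP and ignore the network back-pressure entirely
        thp = brentq(lambda p: p - network_thp(well_rate(p)), 1.0, 249.0)
        print(f"coupled THP = {thp:.1f} bar, rate = {well_rate(thp):.0f} stb/d")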

  19. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework.

    PubMed

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    Biological pattern formation encompasses a variety of striking phenomena. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanisms of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically, based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized and image features are extracted as the system feedback. The unknown model parameters are then obtained by comparing the image features of the simulated image with those of the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to pattern formation simulations for vascular mesenchymal cells and lung development. Within the framework, the spot, stripe, and labyrinthine patterns of vascular mesenchymal cells, as well as the normal branching pattern and a branching pattern lacking side branching for lung development, are obtained in a finite number of iterations. The simulation results indicate that the simulation targets are easy to achieve, especially when the simulated patterns are sensitive to the model parameters. Moreover, this simulation framework can be extended to other types of biological pattern formation.

  20. Workshop on data acquisition and trigger system simulations for high energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1992-12-31

    This report discusses the following topics: DAQSIM: A Data Acquisition System Simulation Tool; Front End and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & the Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSIM II & SIMOBJECT -- an Overview; DAGAR -- a Synthesis System; Proposed Silicon Compiler for Physics Applications; Timed-LOTOS in a PROLOG Environment: an Algebraic Language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; and A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.

  1. JASMINE simulator

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Tsujimoto, Takuji; Suganuma, Masahiro; Niwa, Yoshito; Sako, Nobutada; Hatsutori, Yoichi; Tanaka, Takashi

    2006-06-01

    We describe the simulation tools of the JASMINE project (the JASMINE simulator). The JASMINE project is at the stage where its basic design will be determined within a few years. It is therefore very important to simulate the data stream generated by the astrometric fields of JASMINE in order to support investigations into error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. Component simulations are of course needed, but total simulations that include all components, from observation target to satellite system, are also very important. We find that new software technologies, such as object-oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). In this article, we explain the framework of the JASMINE simulator.

  2. Development and operation of a real-time simulation at the NASA Ames Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Sweeney, Christopher; Sheppard, Shirin; Chetelat, Monique

    1993-01-01

    The Vertical Motion Simulator (VMS) facility at the NASA Ames Research Center combines the largest vertical motion capability in the world with a flexible real-time operating system allowing research to be conducted quickly and effectively. Due to the diverse nature of the aircraft simulated and the large number of simulations conducted annually, the challenge for the simulation engineer is to develop an accurate real-time simulation in a timely, efficient manner. The SimLab facility and the software tools necessary for an operating simulation will be discussed. Subsequent sections will describe the development process through operation of the simulation; this includes acceptance of the model, validation, integration and production phases.

  3. SimZones: An Organizational Innovation for Simulation Programs and Centers.

    PubMed

    Roussin, Christopher J; Weinstock, Peter

    2017-08-01

    The complexity and volume of simulation-based learning programs have increased dramatically over the last decade, presenting several major challenges for those who lead and manage simulation programs and centers. The authors present five major issues affecting the organization of simulation programs: (1) supporting both single- and double-loop learning experiences; (2) managing the training of simulation teaching faculty; (3) optimizing the participant mix, including individuals, professional groups, teams, and other role-players, to ensure learning; (4) balancing in situ, node-based, and center-based simulation delivery; and (5) organizing simulation research and measuring value. They then introduce the SimZones innovation, a system of organization for simulation-based learning, and explain how it can alleviate the problems associated with these five issues.Simulations are divided into four zones (Zones 0-3). Zone 0 simulations include autofeedback exercises typically practiced by solitary learners, often using virtual simulation technology. Zone 1 simulations include hands-on instruction of foundational clinical skills. Zone 2 simulations include acute situational instruction, such as clinical mock codes. Zone 3 simulations involve authentic, native teams of participants and facilitate team and system development.The authors also discuss the translation of debriefing methods from Zone 3 simulations to real patient care settings (Zone 4), and they illustrate how the SimZones approach can enable the development of longitudinal learning systems in both teaching and nonteaching hospitals. The SimZones approach was initially developed in the context of the Boston Children's Hospital Simulator Program, which the authors use to illustrate this innovation in action.

  4. The Effects of Time Advance Mechanism on Simple Agent Behaviors in Combat Simulations

    DTIC Science & Technology

    2011-12-01

    modeling packages that illustrate the differences between discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat... DES models, often referred to as “next-event” (Law and Kelton 2000), or discrete time simulation (DTS), commonly referred to as “time-step.” DTS... discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat models use DTS as their simulation time advance mechanism

  5. Application of Coalition Battle Management Language (C-BML) and C-BML Services to Live, Virtual, and Constructive (LVC) Simulation Environments

    DTIC Science & Technology

    2011-12-01

    Task Based Approach to Planning.” Paper 08F-SIW-033. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability... Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organi... MSDL).” Paper 10S-SIW-003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization

  6. Perceptions, training experiences, and preferences of surgical residents toward laparoscopic simulation training: a resident survey.

    PubMed

    Shetty, Shohan; Zevin, Boris; Grantcharov, Teodor P; Roberts, Kurt E; Duffy, Andrew J

    2014-01-01

    Simulation training for surgical residents can shorten learning curves, improve technical skills, and expedite competency. Several studies have shown that skills learned in the simulated environment are transferable to the operating room. Residency programs are trying to incorporate simulation into the resident training curriculum to supplement the hands-on experience gained in the operating room. Despite the availability and proven utility of surgical simulators and simulation laboratories, they are still widely underutilized by surgical trainees. Studies have shown that voluntary use leads to minimal participation in a training curriculum. Although there are several simulation tools, there is no clear evidence of the superiority of one tool over another in skill acquisition. The purpose of this study was to explore resident perceptions, training experiences, and preferences regarding laparoscopic simulation training. Our goal was to profile resident participation in surgical skills simulation, recognize potential barriers to voluntary simulator use, and identify simulation tools and tasks preferred by residents. Furthermore, this study may help to inform whether mandatory/protected training time as part of the residents' curriculum is essential to enhance participation in the simulation laboratory. A cross-sectional study of general surgery residents (postgraduate years 1-5) at Yale University School of Medicine and the University of Toronto was conducted via an online questionnaire. Overall, 67 residents completed the survey. The institutional review board approved the methods of the study. Overall, 95.5% of the participants believed that simulation training improved their laparoscopic skills. Most respondents (92.5%) perceived that skills learned during simulation training were transferable to the operating room. Overall, 56.7% of participants agreed that proficiency in a simulation curriculum should be mandatory before operating room experience. The simulation laboratory was most commonly used during work hours; lack of free time during work hours was most commonly cited as a reason for underutilization. Factors influencing use of the simulation laboratory, in order of importance, were the need for skill development, an interest in minimally invasive surgery, mandatory/protected time in a simulation environment as part of the residency program curriculum, a recommendation by an attending surgeon, and proximity of the simulation center. The most preferred simulation tool was the live animal model, followed by cadaveric tissue. Virtual reality simulators were among the least preferred (25%) simulation tools. Most residents (91.0%) felt that mandatory/protected time in a simulation environment should be introduced into resident training protocols. Mandatory and protected time in a simulation environment as part of the resident training curriculum may improve participation in simulation training. A comprehensive curriculum, which includes the use of live animals, cadaveric tissue, and virtual reality simulators, may enhance the laparoscopic training experience and interest level of surgical trainees. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  7. The State of Simulations: Soft-Skill Simulations Emerge as a Powerful New Form of E-Learning.

    ERIC Educational Resources Information Center

    Aldrich, Clark

    2001-01-01

    Presents responses of leaders from six simulation companies about challenges and opportunities of soft-skills simulations in e-learning. Discussion includes: evaluation metrics; role of subject matter experts in developing simulations; video versus computer graphics; technology needed to run simulations; technology breakthroughs; pricing;…

  8. The effectiveness of and satisfaction with high-fidelity simulation to teach cardiac surgical resuscitation skills to nurses.

    PubMed

    McRae, Marion E; Chan, Alice; Hulett, Renee; Lee, Ai Jin; Coleman, Bernice

    2017-06-01

    There are few reports of the effectiveness of, or satisfaction with, simulation for learning cardiac surgical resuscitation skills. To test the effect of simulation on nurses' self-confidence to perform cardiac surgical resuscitation and on their satisfaction with the simulation experience. A convenience sample of sixty nurses rated their self-confidence to perform cardiac surgical resuscitation skills before and after two simulations. Simulation performance was assessed. Subjects completed the Satisfaction with Simulation Experience scale and demographics. Self-confidence scores to perform all cardiac surgical skills were significantly increased after the simulation, as measured by paired t-tests (d=-0.50 to 1.78). Self-confidence and cardiac surgical work experience were not correlated with time to performance. Total satisfaction scores were high (mean 80.2, SD 1.06), indicating satisfaction with the simulation. There was no correlation of the satisfaction scores with cardiac surgical work experience (τ=-0.05, ns). Self-confidence scores to perform cardiac surgical resuscitation procedures were higher after the simulation. Nurses were highly satisfied with the simulation experience. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. A "Skylight" Simulator for HWIL Simulation of Hyperspectral Remote Sensing.

    PubMed

    Zhao, Huijie; Cui, Bolun; Jia, Guorui; Li, Xudong; Zhang, Chao; Zhang, Xinyang

    2017-12-06

    Even though digital simulation technology has been widely used in the last two decades, hardware-in-the-loop (HWIL) simulation is still an indispensable method for spectral uncertainty research of ground targets. However, previous facilities mainly focus on the simulation of panchromatic imaging. Therefore, neither the spectral nor the spatial performance is enough for hyperspectral simulation. To improve the accuracy of illumination simulation, a new dome-like skylight simulator is designed and developed to fit the spatial distribution and spectral characteristics of a real skylight for the wavelength from 350 nm to 2500 nm. The simulator's performance was tested using a spectroradiometer with different accessories. The spatial uniformity is greater than 0.91. The spectral mismatch decreases to 1/243 of the spectral mismatch of the Imagery Simulation Facility (ISF). The spatial distribution of radiance can be adjusted, and the accuracy of the adjustment is greater than 0.895. The ability of the skylight simulator is also demonstrated by comparing radiometric quantities measured in the skylight simulator with those in a real skylight in Beijing.

  10. Development of space simulation / net-laboratory system

    NASA Astrophysics Data System (ADS)

    Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.

    A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This three-year research project is a collaboration with an astrophysical simulation group as well as other space simulation groups that use MHD and hybrid models. In this project, we are developing a prototype of a unique simulation system that enables us to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the internet. We are also developing an on-line database system for space simulation from which we will be able to search and extract various information, such as simulation methods and programs, manuals, and typical simulation results in graphic or ascii format. This unique system will help simulation beginners to start simulation studies without much difficulty or effort, and contribute to the promotion of simulation studies in the STP field. In this presentation, we report the overview and current status of the project.

  11. The Tuscan Mobile Simulation Program: a description of a program for the delivery of in situ simulation training.

    PubMed

    Ullman, Edward; Kennedy, Maura; Di Delupis, Francesco Dojmi; Pisanelli, Paolo; Burbui, Andrea Giuliattini; Cussen, Meaghan; Galli, Laura; Pini, Riccardo; Gensini, Gian Franco

    2016-09-01

    Simulation has become a critical aspect of medical education. It allows health care providers the opportunity to focus on safety and high-risk situations in a protected environment. Recently, in situ simulation, which is performed in the actual clinical setting, has been used to recreate a more realistic work environment. This form of simulation allows for better team evaluation as the workers are in their traditional roles, and can reveal latent safety errors that often are not seen in typical simulation scenarios. We discuss the creation and implementation of a mobile in situ simulation program in emergency departments of three hospitals in Tuscany, Italy, including equipment, staffing, and start-up costs for this program. We also describe latent safety threats identified in the pilot in situ simulations. This novel approach has the potential to both reduce the costs of simulation compared to traditional simulation centers, and to expand medical simulation experiences to providers and healthcare organizations that do not have access to a large simulation center.

  12. Establishing a convention for acting in healthcare simulation: merging art and science.

    PubMed

    Sanko, Jill S; Shekhter, Ilya; Kyle, Richard R; Di Benedetto, Stephen; Birnbach, David J

    2013-08-01

    Among the most powerful tools available to simulation instructors is a confederate. Although technical and logical realism is dictated by the simulation platform and setting, the quality of role playing by confederates strongly determines psychological or emotional fidelity of simulation. The highest level of realism, however, is achieved when the confederates are properly trained. Theater and acting methodology can provide simulation educators a framework from which to establish an acting convention specific to the discipline of healthcare simulation. This report attempts to examine simulation through the lens of theater arts and represents an opinion on acting in healthcare simulation for both simulation educators and confederates. It aims to refine the practice of simulation by embracing the lessons of the theater community. Although the application of these approaches in healthcare education has been described in the literature, a systematic way of organizing, publicizing, or documenting the acting within healthcare simulation has never been completed. Therefore, we attempt, for the first time, to take on this challenge and create a resource, which infuses theater arts into the practice of healthcare simulation.

  13. Simulation in bronchoscopy: current and future perspectives.

    PubMed

    Nilsson, Philip Mørkeberg; Naur, Therese Maria Henriette; Clementsen, Paul Frost; Konge, Lars

    2017-01-01

    To provide an overview of current literature that informs how to approach simulation practice of bronchoscopy and discuss how findings from other simulation research can help inform the use of simulation in bronchoscopy training. We conducted a literature search on simulation training of bronchoscopy and divided relevant studies in three categories: 1) structuring simulation training in bronchoscopy, 2) assessment of competence in bronchoscopy training, and 3) development of cheap alternatives for bronchoscopy simulation. Bronchoscopy simulation is effective, and the training should be structured as distributed practice with mastery learning criteria (ie, training until a certain level of competence is achieved). Dyad practice (training in pairs) is possible and may increase utility of available simulators. Trainee performance should be assessed with assessment tools with established validity. Three-dimensional printing is a promising new technology opening possibilities for developing cheap simulators with innovative features.

  14. Large Scale Simulation Platform for NODES Validation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sotorrio, P.; Qin, Y.; Min, L.

    2017-04-27

    This report summarizes the Large Scale (LS) simulation platform created for the Eaton NODES project. The simulation environment consists of both a wholesale market simulator and a distribution simulator; it includes the CAISO wholesale market model and a PG&E footprint of 25-75 feeders to validate scalability under a scenario of 33% RPS in California with an additional 17% of DERs coming from distribution and customers. The simulator can generate hourly unit commitment, 5-minute economic dispatch, and 4-second AGC regulation signals. The simulator is also capable of simulating more than 10k individual controllable devices. Simulated DERs include water heaters, EVs, residential and light commercial HVAC/buildings, and residential-level battery storage. Feeder-level voltage regulators and capacitor banks are also simulated for feeder-level real and reactive power management and Volt/Var control.

  15. Training students to detect delirium: An interprofessional pilot study.

    PubMed

    Chambers, Breah; Meyer, Mary; Peterson, Moya

    2018-06-01

    The purpose of this paper is to report nursing students' knowledge acquisition and attitudes after completing an interprofessional simulation with medical students. The IOM has challenged healthcare educators to teach teamwork and communication skills in interprofessional settings. Interprofessional simulation provides a higher-fidelity experience than simulation in silos. Simulation may be particularly useful in helping healthcare workers gain the skills necessary to care for psychiatric clients. Specifically, healthcare providers have difficulty differentiating between dementia and delirium. Recognizing this deficit, an interprofessional simulation was created using medical students in their neurology rotation and senior nursing students. Twenty-four volunteer nursing students completed a pre-survey to assess delirium knowledge and then completed an education module about delirium. Twelve of these students participated in a simulation with medical students. Pre- and post-simulation KidSIM Attitude questionnaires were completed by all students participating in the simulation. After the simulations were complete, all twenty-four students were asked to complete the post-survey on delirium knowledge. While delirium knowledge scores improved in both groups, the simulation group scored higher, although the difference did not reach significance. The simulation group demonstrated a statistically significant improvement in attitudes toward simulation, interprofessional education, and teamwork post simulation compared to their pre-simulation scores. Nursing students who participated in an interprofessional simulation developed a heightened appreciation for learning communication, teamwork, situational awareness, and interprofessional roles and responsibilities. These results support the use of interprofessional simulation in healthcare education. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Fast Simulation of Electromagnetic Showers in the ATLAS Calorimeter: Frozen Showers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barberio, E.; /Melbourne U.; Boudreau, J.

    2011-11-29

    One of the most time-consuming processes in simulating pp interactions in the ATLAS detector at the LHC is the simulation of electromagnetic showers in the calorimeter. In order to speed up the event simulation, several parametrisation methods are available in ATLAS. In this paper we present a short description of the frozen shower technique, together with some recent benchmarks and a comparison with full simulation. The expected high rate of proton-proton collisions in the ATLAS detector at the LHC requires large samples of simulated (Monte Carlo) events to study various physics processes. A detailed simulation of particle interactions ('full simulation') in the ATLAS detector is based on GEANT4 and is very accurate. However, due to the complexity of the detector, the high particle multiplicity, and GEANT4 itself, the average CPU time spent simulating a typical QCD event in a pp collision is 20 or more minutes on modern computers. During detector simulation most of the time is spent in the calorimeters (up to 70%), the bulk of which is required for electromagnetic particles in the electromagnetic (EM) part of the calorimeters. This is the motivation for fast simulation approaches that reduce the simulation time without affecting the accuracy. Several of the fast simulation methods available within the ATLAS simulation framework (the standard Athena-based simulation program) are discussed here, with a focus on the novel frozen shower library (FS) technique. The results obtained with FS are presented as well.

  17. Piloted aircraft simulation concepts and overview

    NASA Technical Reports Server (NTRS)

    Sinacori, J. B.

    1978-01-01

    An overview of piloted aircraft simulation is presented that reflects the viewpoint of an aeronautical technologist. The intent is to acquaint potential users with some of the basic concepts and issues that characterize piloted simulation. Applications to the development of aircraft are highlighted, but some aspects of training simulators are also covered. A historical review is given together with a description of some current simulators. Simulator usage, advantages, and limitations are discussed, and human perception qualities important to simulation are described. An assessment of current simulation is presented that addresses validity, fidelity, and deficiencies. Future prospects are discussed and technology projections are made.

  18. Closed loop models for analyzing the effects of simulator characteristics. [digital simulation of human operators

    NASA Technical Reports Server (NTRS)

    Baron, S.; Muralidharan, R.; Kleinman, D. L.

    1978-01-01

    The optimal control model of the human operator is used to develop closed loop models for analyzing the effects of (digital) simulator characteristics on predicted performance and/or workload. Two approaches are considered: the first utilizes a continuous approximation to the discrete simulation in conjunction with the standard optimal control model; the second involves a more exact discrete description of the simulator in a closed loop multirate simulation in which the optimal control model simulates the pilot. Both models predict that simulator characteristics can have significant effects on performance and workload.

  19. Reevaluating simulation in nursing education: beyond the human patient simulator.

    PubMed

    Schiavenato, Martin

    2009-07-01

    The human patient simulator or high-fidelity mannequin has become synonymous with the word simulation in nursing education. Founded on a historical context and on an evaluation of the current application of simulation in nursing education, this article challenges that assumption as limited and restrictive. A definition of simulation and a broader conceptualization of its application in nursing education are presented. The need for an ideological basis for simulation in nursing education is highlighted. The call is made for theory to answer the question of why simulation is used in nursing to anchor its proper and effective application in nursing education.

  20. Simulation and evaluation of the SH-2F helicopter in a shipboard environment using the interchangeable cab system

    NASA Technical Reports Server (NTRS)

    Paulk, C. H., Jr.; Astill, D. L.; Donley, S. T.

    1983-01-01

    The operation of the SH-2F helicopter from the decks of small ships in adverse weather was simulated using a large amplitude vertical motion simulator, a wide angle computer generated imagery visual system, and an interchangeable cab (ICAB). The simulation facility, the mathematical programs, and the validation method used to ensure simulation fidelity are described. The results show the simulator to be a useful tool in simulating the ship-landing problem. Characteristics of the ICAB system and ways in which the simulation can be improved are presented.

  1. Mental simulation of routes during navigation involves adaptive temporal compression

    PubMed Central

    Arnold, Aiden E.G.F.; Iaria, Giuseppe; Ekstrom, Arne D.

    2016-01-01

    Mental simulation is a hallmark feature of human cognition, allowing features from memories to be used flexibly during prospection. While past studies demonstrate the preservation of real-world features such as size and distance during mental simulation, their temporal dynamics remain unknown. Here, we compare mental simulations to navigation of routes in a large-scale spatial environment to test the hypothesis that such simulations are temporally compressed in an adaptive manner. Our results show that simulations occurred at 2.39x the speed it took to navigate a route, with compression increasing (to 3.57x) for slower movement speeds. Participants' self-reports of the vividness and spatial coherence of simulations also correlated strongly with simulation duration, providing an important link between subjective experiences of simulated events and how spatial representations are combined during prospection. These findings suggest that the simulation of spatial events involves adaptive temporal mechanisms, mediated partly by the fidelity of the memories used to generate the simulation. PMID:27568586

  2. Design of a bounded wave EMP (Electromagnetic Pulse) simulator

    NASA Astrophysics Data System (ADS)

    Sevat, P. A. A.

    1989-06-01

    Electromagnetic Pulse (EMP) simulators are used to simulate the EMP generated by a nuclear weapon and to harden equipment against the effects of EMP. At present, DREO has a 1 m EMP simulator for testing computer-terminal-sized equipment. To develop the R and D capability for testing larger objects, such as a helicopter, a much bigger threat-level facility is required. This report concerns the design of a bounded wave EMP simulator suitable for testing large equipment. Different types of simulators are described and their pros and cons are discussed. A bounded wave parallel plate type simulator is chosen for its efficiency and minimal environmental impact. Detailed designs are given for 6 m and 10 m parallel plate type wire grid simulators. Electromagnetic fields inside and outside the simulators are computed. Preliminary specifications for the pulse generator required for the simulator are also given. Finally, the electromagnetic fields radiated from the simulator are computed and discussed.

  3. Development of the Transport Class Model (TCM) Aircraft Simulation From a Sub-Scale Generic Transport Model (GTM) Simulation

    NASA Technical Reports Server (NTRS)

    Hueschen, Richard M.

    2011-01-01

    A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, subscale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport-operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate-limits and generally fixed position limits. The simulation contains a generic 40,000 lb sea level thrust engine model. The engine model is a first order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides a means for interfacing a flight control system to use the simulation sensor variables and to command the surface actuators and throttle position of the engine model.

  4. Durham extremely large telescope adaptive optics simulation platform.

    PubMed

    Basden, Alastair; Butterley, Timothy; Myers, Richard; Wilson, Richard

    2007-03-01

    Adaptive optics systems are essential on all large telescopes for which image quality is important. These are complex systems with many design parameters requiring optimization before good performance can be achieved. Simulation of adaptive optics systems is therefore necessary to characterize the expected performance. We describe an adaptive optics simulation platform, developed at Durham University, which can be used to simulate adaptive optics systems on the largest proposed future extremely large telescopes as well as on current systems. This platform is modular, object oriented, and has the benefit of hardware acceleration that can be used to improve simulation performance, which is essential for ensuring that the run time of a given simulation is acceptable. The simulation platform described here can be highly parallelized using parallelization techniques suited to adaptive optics simulation, while still offering the user complete control while the simulation is running. Results from the simulation of a ground-layer adaptive optics system are provided as an example to demonstrate the flexibility of this simulation platform.

  5. Exact-Differential Large-Scale Traffic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanai, Masatoshi; Suzumura, Toyotaro; Theodoropoulos, Georgios

    2015-01-01

    Analyzing large-scale traffic by simulation requires repeated execution with many patterns of scenarios or parameters. Such repeated execution introduces considerable redundancy, because the change from one scenario to the next is usually minor, for example, blocking a single road or changing the speed limit of several roads. In this paper, we propose a new redundancy reduction technique, called exact-differential simulation, which simulates only the changed parts of scenarios in later executions while producing exactly the same results as a whole simulation. The paper consists of two main efforts: (i) the key idea and algorithm of exact-differential simulation, and (ii) a method for building large-scale traffic simulation on top of exact-differential simulation. In experiments with a Tokyo traffic simulation, exact-differential simulation achieves a 7.26-times improvement in elapsed time on average, and a 2.26-times improvement even in the worst case, compared with whole simulation.
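
    The reuse idea can be illustrated with a toy Python sketch (our simplification: a single timeline and a change that activates at a known step, whereas the actual method tracks affected regions and events): states computed before the change provably match the baseline run and are reused, and only the remaining steps are re-simulated.

        def step(state, speed):
            # toy ring-road dynamics: cars shift to the next road segment,
            # scaled by that segment's speed factor
            n = len(state)
            return [state[i] * 0.5 + state[i - 1] * 0.5 * speed[i - 1]
                    for i in range(n)]

        def simulate(scenario_at, steps, init):
            # scenario_at(t) gives the speed factors in force at step t
            state, trace = init, [init]
            for t in range(steps):
                state = step(state, scenario_at(t))
                trace.append(state)
            return trace

        base = [1.0] * 8
        trace0 = simulate(lambda t: base, 200, [10.0] * 8)   # cached baseline

        blocked = base[:]; blocked[3] = 0.0   # later scenario: block road 3
        change_step = 150                     # ... from step 150 onward only

        # exact-differential rerun: every state before the change is
        # identical to the baseline, so reuse the cached prefix and
        # simulate only the 50 remaining steps from the cached state
        prefix = trace0[:change_step + 1]
        suffix = simulate(lambda t: blocked, 200 - change_step,
                          trace0[change_step])
        trace1 = prefix + suffix[1:]          # same result, 1/4 of the work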

  6. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interaction between the transmission and distribution networks, and there is an immediate need for tools that provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations integrating transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, there are currently only separate tools for simulating the transient dynamics of the transmission and distribution systems. In this paper, we introduce an open-source co-simulation framework, the Framework for Network Co-Simulation (FNCS), together with a decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
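
    The decoupled exchange pattern can be sketched in a few lines of Python (this is not the FNCS API, whose calls we do not reproduce here; all numbers and sensitivities below are invented): at each synchronization step the transmission solver sees each feeder as a lumped load, the distribution solver sees the substation as a voltage source, and the middleware brokers the boundary values between them.

        # one boundary exchange per synchronization step of a decoupled
        # transmission-distribution co-simulation (all numbers invented)
        def transmission_solve(p_load_mw):
            # hypothetical sensitivity: substation voltage sags as the
            # lumped feeder load grows
            return 1.02 - 0.0008 * p_load_mw       # per-unit voltage

        def distribution_solve(v_sub):
            # hypothetical voltage-dependent feeder demand
            return 50.0 * v_sub ** 1.3             # MW

        v, p = 1.0, 50.0
        for t_sync in range(10):                   # synchronized macro steps
            v = transmission_solve(p)              # transmission publishes V
            p = distribution_solve(v)              # distribution publishes P
            # middleware such as FNCS would broker these exchanges and hold
            # each simulator at the barrier until both values are published
        print(f"boundary after 10 steps: V={v:.4f} pu, P={p:.2f} MW")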

  7. A survey of simulators for palpation training.

    PubMed

    Zhang, Yan; Phillips, Roger; Ward, James; Pisharody, Sandhya

    2009-01-01

    Palpation is a widely used diagnostic method in medical practice. The sensitivity of palpation is highly dependent upon the skill of clinicians, which is often difficult to master, so there is a need for simulators in palpation training. This paper summarizes important work and the latest achievements in simulation for palpation training. Three types of simulators are surveyed: physical models, virtual reality (VR) based simulations, and hybrid (computerized and physical) simulators. Comparisons among the different kinds of simulators are presented.

  8. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
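
    As a concrete illustration of the diffusion-simulation category, here is a minimal agent-based sketch in Python (the rules and parameters are hypothetical, not drawn from the paper): each agent follows a simple local adoption rule, and aggregate diffusion emerges from repeated local interactions.

        import random

        random.seed(1)

        class Agent:
            # each agent holds minimal state and a simple local rule
            def __init__(self):
                self.adopted = False

            def step(self, neighbors, p_innovate=0.01, p_imitate=0.10):
                if self.adopted:
                    return
                if random.random() < p_innovate:      # spontaneous adoption
                    self.adopted = True
                elif (any(n.adopted for n in neighbors)
                      and random.random() < p_imitate):
                    self.adopted = True               # social imitation

        n = 400
        agents = [Agent() for _ in range(n)]
        # ring topology: every agent interacts with its two neighbours
        neighbors = [(agents[i - 1], agents[(i + 1) % n]) for i in range(n)]

        for t in range(60):
            order = list(range(n))
            random.shuffle(order)                     # random activation order
            for i in order:
                agents[i].step(neighbors[i])
        print(sum(a.adopted for a in agents), "of", n, "agents adopted")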

  9. A data-driven dynamics simulation framework for railway vehicles

    NASA Astrophysics Data System (ADS)

    Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2018-03-01

    The finite element (FE) method is essential for simulating vehicle dynamics in fine detail, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in large deformations undermine its calculation efficiency. An alternative method, multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when highly nonlinear dynamic processes are involved. To retain the advantages of both methods, this paper proposes a data-driven framework for the dynamics simulation of railway vehicles. The framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations, so that specific mesh structures can be represented by one or more surrogate elements that replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded in an MB model. The framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of the framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, together with a comparison against a popular data-driven model (the Kriging model). The simulation results show that using a Legendre polynomial regression model to build the surrogate elements can greatly cut down the simulation time without sacrificing accuracy.
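
    A stripped-down Python sketch of the surrogate-element idea (our illustration; the force law, scaling, and constants are invented, and the paper's actual pipeline is far richer): an expensive FE-like force model is sampled offline, a Legendre polynomial surrogate is fitted to the samples, and the surrogate then replaces the expensive call inside a cheap multi-body-style time integration.

        import numpy as np
        from numpy.polynomial import legendre as L

        # expensive "FE-like" force law, standing in for offline FE runs
        def fe_force(x):
            return -5.0e4 * x - 8.0e6 * x ** 3     # hardening spring, N

        # offline: sample training data and fit a Legendre surrogate
        x_tr = np.linspace(-0.05, 0.05, 200)       # displacement range, m
        s = x_tr / 0.05                            # scale to [-1, 1]
        coef = L.legfit(s, fe_force(x_tr), deg=5)

        def surrogate_force(x):
            # cheap surrogate element replacing the FE force evaluation
            return L.legval(np.clip(x / 0.05, -1.0, 1.0), coef)

        # online: embed the surrogate in a multi-body-style integration
        m, c, dt = 500.0, 2.0e3, 1e-4              # mass, damping, step
        x, v = 0.02, 0.0
        for _ in range(20000):                     # 2 s of simulated time
            a = (surrogate_force(x) - c * v) / m
            v += a * dt
            x += v * dt
        print(f"displacement after 2 s: {x:+.5f} m")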

  10. Surgical simulation: a urological perspective.

    PubMed

    Wignall, Geoffrey R; Denstedt, John D; Preminger, Glenn M; Cadeddu, Jeffrey A; Pearle, Margaret S; Sweet, Robert M; McDougall, Elspeth M

    2008-05-01

    Surgical education is changing rapidly as several factors including budget constraints and medicolegal concerns limit opportunities for urological trainees. New methods of skills training such as low fidelity bench trainers and virtual reality simulators offer new avenues for surgical education. In addition, surgical simulation has the potential to allow practicing surgeons to develop new skills and maintain those they already possess. We provide a review of the background, current status and future directions of surgical simulators as they pertain to urology. We performed a literature review and an overview of surgical simulation in urology. Surgical simulators are in various stages of development and validation. Several simulators have undergone extensive validation studies and are in use in surgical curricula. While virtual reality simulators offer the potential to more closely mimic reality and present entire operations, low fidelity simulators remain useful in skills training, particularly for novices and junior trainees. Surgical simulation remains in its infancy. However, the potential to shorten learning curves for difficult techniques and practice surgery without risk to patients continues to drive the development of increasingly more advanced and realistic models. Surgical simulation is an exciting area of surgical education. The future is bright as advancements in computing and graphical capabilities offer new innovations in simulator technology. Simulators must continue to undergo rigorous validation studies to ensure that time spent by trainees on bench trainers and virtual reality simulators will translate into improved surgical skills in the operating room.

  11. PyNN: A Common Interface for Neuronal Network Simulators.

    PubMed

    Davison, Andrew P; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN.
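
    A minimal PyNN script in the style the abstract describes looks roughly like the sketch below (based on the PyNN 0.9-era API; details may differ between versions, so treat it as indicative rather than definitive). The same script should run on another supported backend by changing only the import line.

        # minimal simulator-agnostic PyNN script (PyNN 0.9-style API)
        import pyNN.nest as sim            # or: import pyNN.neuron as sim

        sim.setup(timestep=0.1)            # ms

        # 100 conductance-based integrate-and-fire neurons with a bias current
        pop = sim.Population(100, sim.IF_cond_exp(i_offset=1.0))

        # sparse recurrent excitatory connectivity
        sim.Projection(pop, pop,
                       sim.FixedProbabilityConnector(p_connect=0.05),
                       sim.StaticSynapse(weight=0.002, delay=1.0))

        pop.record('spikes')
        sim.run(500.0)                     # ms

        spiketrains = pop.get_data().segments[0].spiketrains
        print(sum(len(st) for st in spiketrains), "spikes recorded")
        sim.end()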

  12. Tests for malingering in ophthalmology

    PubMed Central

    Incesu, Ali Ihsan

    2013-01-01

    Simulation can be defined as malingering, sometimes termed functional visual loss (FVL). It manifests either as simulating an ophthalmic disease (positive simulation) or as denial of an ophthalmic disease (negative simulation). Conscious behavior and compensation or indemnity claims are prominent features of simulation. Since some authors suggest that it is a manifestation of underlying psychopathology, even conversion is included in this context. In today's world, every ophthalmologist may encounter simulation of an ophthalmic disease or disorder. When simulation is suspected, the physician's responsibility is to establish it by considering the disease or disorder first and simulation as a diagnosis of exclusion. In simulation examinations, the physician should be firm and astute in selecting the appropriate test(s) to convince not only the subject but also the judge in indemnity or compensation trials. Almost all ophthalmic sensory and motor functions, including visual acuity, visual field, color vision, and night vision, can be the subject of simulation. The examiner must be skillful in selecting the most appropriate test. Beyond those in the literature, we have included all kinds of simulation in ophthalmology, together with simulation examination techniques such as the use of optical coherence tomography, frequency doubling perimetry (FDP), and modified polarization tests. In this review, we made a thorough literature search and added our own experience to give readers up-to-date information on malingering, or simulation, in ophthalmology. PMID:24195054

  13. Computer Simulation Is an Undervalued Tool for Genetic Analysis: A Historical View and Presentation of SHIMSHON – A Web-Based Genetic Simulation Package

    PubMed Central

    Greenberg, David A.

    2011-01-01

    Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. PMID:22189467

  14. A review of virtual reality based training simulators for orthopaedic surgery.

    PubMed

    Vaughan, Neil; Dubey, Venketesh N; Wainwright, Thomas W; Middleton, Robert G

    2016-02-01

    This review presents current virtual reality based training simulators for hip, knee, and other orthopaedic surgery, including elective and trauma surgical procedures. There have not been any previous reviews focussing on hip and knee orthopaedic simulators. A comparison of existing simulator features is provided to identify what is missing and what is required to improve upon current simulators. In total, 11 hip replacement pre-operative planning tools were analysed, plus 9 hip trauma fracture training simulators. Additionally, 9 knee arthroscopy simulators and 8 other orthopaedic simulators were included for comparison. The findings are that, for orthopaedic surgery simulators in general, there is increasing use of patient-specific virtual models, which reduce the learning curve. Modelling is also being used for patient-specific implant design and manufacture. Simulators are increasingly being validated for assessment as well as training. There are very few training simulators available for hip replacement, yet more advanced virtual reality is being used for other procedures such as hip trauma and drilling. Training simulators for hip replacement, and orthopaedic surgery in general, lag behind other surgical procedures for which virtual reality has become more common. Further developments are required to bring hip replacement training simulation up to date with other procedures. This suggests there is a gap in the market for a new high fidelity hip replacement and resurfacing training simulator. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  15. PyNN: A Common Interface for Neuronal Network Simulators

    PubMed Central

    Davison, Andrew P.; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN. PMID:19194529

  16. 14 CFR 142.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... accordance with subpart C of this part. Line-Operational Simulation means simulation conducted using... operations. Line operational simulations are conducted for training and evaluation purposes and include random, abnormal, and emergency occurrences. Line operational simulation specifically includes...

  17. Facilitating researcher use of flight simulators

    NASA Technical Reports Server (NTRS)

    Russell, C. Ray

    1990-01-01

    Researchers conducting experiments with flight simulators encounter numerous obstacles in bringing their ideas to the simulator. Research into how these simulators could be used more efficiently is presented. The study involved: (1) analyzing the Advanced Concepts Simulator software architecture, (2) analyzing the interaction between the researchers and simulation programmers, and (3) proposing a documentation tool for the researchers.

  18. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event-list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
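
    The conservative rule at the heart of the Chandy-Misra algorithm can be rendered as a single-threaded Python toy (our simplification; real implementations run the logical processes in parallel and exchange null messages to advertise lookahead): a process with several input channels may only consume the event with the smallest front timestamp, and only when every channel front is known.

        from collections import deque

        # single-threaded rendering of the conservative rule: a merge node
        # has one FIFO input channel per predecessor and may only consume
        # the event with the smallest front timestamp, and only when every
        # channel front is known. Null messages (timestamp, None) advertise
        # lookahead so an idle sender cannot deadlock the receiver.
        chan = {"A": deque([(1.0, "job"), (4.0, "job"), (9.0, None)]),
                "B": deque([(2.0, None), (6.0, "job"), (9.0, None)])}

        clock, processed = 0.0, []
        while all(chan.values()):          # an empty channel blocks progress
            src = min(chan, key=lambda s: chan[s][0][0])
            t, kind = chan[src].popleft()
            assert t >= clock              # causality is never violated
            clock = t
            if kind == "job":
                processed.append((clock, src))
        print(clock, processed)            # clock safely reaches 9.0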

  19. MOSES: A Matlab-based open-source stochastic epidemic simulator.

    PubMed

    Varol, Huseyin Atakan

    2016-08-01

This paper presents an open-source stochastic epidemic simulator. The Discrete Time Markov Chain based simulator is implemented in Matlab. The simulator, capable of simulating the SEQIJR (susceptible, exposed, quarantined, infected, isolated and recovered) model, can be reduced to simpler models by setting some of the parameters (transition probabilities) to zero. Similarly, it can be extended to more complicated models by editing the source code. It is designed for testing different control algorithms to contain epidemics. The simulator is also designed to be compatible with a network-based epidemic simulator and can be used in the network-based scheme to simulate a single node. Simulations show that it reproduces the behaviors of different epidemic models successfully and in a computationally efficient manner.
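
    MOSES itself is written in Matlab; the discrete-time stochastic update it is built on can be sketched as follows, reduced to an SIR special case by zeroing the extra SEQIJR transitions (parameter names and values are illustrative, not taken from MOSES):

      import numpy as np

      rng = np.random.default_rng(0)

      def dtmc_sir_step(S, I, R, beta=0.3, gamma=0.1):
          """One Markov-chain step: each susceptible becomes infected with
          probability 1 - exp(-beta*I/N); each infective recovers w.p. gamma."""
          N = S + I + R
          new_inf = rng.binomial(S, 1.0 - np.exp(-beta * I / N))
          new_rec = rng.binomial(I, gamma)
          return S - new_inf, I + new_inf - new_rec, R + new_rec

      state = (990, 10, 0)
      for _ in range(100):
          state = dtmc_sir_step(*state)
      print("final (S, I, R):", state)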

  20. A Software Framework for Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.

    2008-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has a long history in developing simulations of experimental fixed-wing aircraft from gliders to suborbital vehicles on platforms ranging from desktop simulators to pilot-in-the-loop/aircraft-in-the-loop simulators. Regardless of the aircraft or simulator hardware, much of the software framework is common to all NASA Dryden simulators. Some of this software has withstood the test of time, but in recent years the push toward high-fidelity user-friendly simulations has resulted in some significant changes. This report presents an overview of the current NASA Dryden simulation software framework and capabilities with an emphasis on the new features that have permitted NASA to develop more capable simulations while maintaining the same staffing levels.

  1. LPJ-GUESS Simulated North America Vegetation for 21-0 ka Using the TraCE-21ka Climate Simulation

    NASA Astrophysics Data System (ADS)

    Shafer, S. L.; Bartlein, P. J.

    2016-12-01

    Transient climate simulations that span multiple millennia (e.g., TraCE-21ka) have become more common as computing power has increased, allowing climate models to complete long simulations in relatively short periods of time (i.e., months). These climate simulations provide information on the potential rate, variability, and spatial expression of past climate changes. They also can be used as input data for other environmental models to simulate transient changes for different components of paleoenvironmental systems, such as vegetation. Long, transient paleovegetation simulations can provide information on a range of ecological processes, describe the spatial and temporal patterns of changes in species distributions, and identify the potential locations of past species refugia. Paleovegetation simulations also can be used to fill in spatial and temporal gaps in observed paleovegetation data (e.g., pollen records from lake sediments) and to test hypotheses of past vegetation change. We used the TraCE-21ka transient climate simulation for 21-0 ka from CCSM3, a coupled atmosphere-ocean general circulation model. The TraCE-21ka simulated temperature, precipitation, and cloud data were regridded onto a 10-minute grid of North America. These regridded climate data, along with soil data and atmospheric carbon dioxide concentrations, were used as input to LPJ-GUESS, a general ecosystem model, to simulate North America vegetation from 21-0 ka. LPJ-GUESS simulates many of the processes controlling the distribution of vegetation (e.g., competition), although some important processes (e.g., dispersal) are not simulated. We evaluate the LPJ-GUESS-simulated vegetation (in the form of plant functional types and biomes) for key time periods and compare the simulated vegetation with observed paleovegetation data, such as data archived in the Neotoma Paleoecology Database. In general, vegetation simulated by LPJ-GUESS reproduces the major North America vegetation patterns (e.g., forest, grassland) with regional areas of disagreement between simulated and observed vegetation. We describe the regions and time periods with the greatest data-model agreement and disagreement, and discuss some of the strengths and weaknesses of both the simulated climate and simulated vegetation data.
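
    The regridding step described above can be sketched as follows (assuming CF-style netCDF input; the file names, the "tas" variable name, and the "lat"/"lon" coordinate names are hypothetical):

      import numpy as np
      import xarray as xr

      # Hypothetical file and variable names; "tas" stands in for a
      # simulated temperature field with "lat"/"lon" coordinates.
      ds = xr.open_dataset("trace21ka_tas.nc")

      # Target 10-arc-minute (1/6 degree) grid over North America.
      lats = np.arange(15.0, 75.0, 1.0 / 6.0)
      lons = np.arange(-170.0, -50.0, 1.0 / 6.0)

      # Bilinear interpolation of every time slice onto the fine grid.
      regridded = ds["tas"].interp(lat=lats, lon=lons, method="linear")
      regridded.to_netcdf("trace21ka_tas_10min.nc")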

  2. Local and national laparoscopic skill competitions: residents' opinions and impact on adoption of simulation-based training.

    PubMed

    McCreery, Greig L; El-Beheiry, Mostafa; Schlachta, Christopher M

    2017-11-01

Dedicated practice using laparoscopic simulators improves operative performance. Yet, voluntary utilization is minimal. We hypothesized that skill competition between peers, at the local and national level, positively influences residents' use of laparoscopic simulators. A web-based survey evaluated the relationship between Canadian General Surgery residents' use of laparoscopic simulation and participation in competition. Secondary outcomes assessed attitudes regarding simulation training, factors limiting use, and associations between competition level and usage. One hundred ninety (23%) of 826 potential participants responded. Eighty-three percent rated their laparoscopic abilities as novice or intermediate. More than 70% agreed that use of simulation practice improves intra-operative performance, and should be a mandatory component of training. However, 58% employed simulator practice less than once per month, and 18% never used a simulator. Sixty-five percent engaged in simulator training for 5 h or less over the preceding 6 months. Seventy-three percent had participated in laparoscopic skill competition. Of those, 51% agreed that competition was a motivation for simulation practice. No association was found between competition experience and simulator use. However, 83% of those who had competed nationally reported >5 h of simulator use in the previous 6 months compared to those with no competition experience (26%), local competition (40%), and local national-qualifying competition (23%) (p < 0.001). This study does not support the hypothesis that competition alone universally increases voluntary use of simulation-based training, as only the minority of individuals competing at the national level demonstrated significantly higher simulation use. However, simulation training was perceived as a valuable exercise. Lack of time and access to simulators, as opposed to lack of interest, were the factors most commonly reported as limiting use.

  3. The Effectiveness of Remote Facilitation in Simulation-Based Pediatric Resuscitation Training for Medical Students.

    PubMed

    Ohta, Kunio; Kurosawa, Hiroshi; Shiima, Yuko; Ikeyama, Takanari; Scott, James; Hayes, Scott; Gould, Michael; Buchanan, Newton; Nadkarni, Vinay; Nishisaki, Akira

    2017-08-01

To assess the effectiveness of pediatric simulation by remote facilitation. We hypothesized that simulation by remote facilitation is more effective than simulation by an on-site facilitator. We defined remote facilitation as a model in which a facilitator remotely (1) introduces simulation-based learning and the simulation environment, (2) runs scenarios, and (3) performs debriefing together with an on-site facilitator. A remote simulation program for medical students during pediatric rotation was implemented. Groups were allocated to either remote or on-site facilitation depending on the availability of telemedicine technology. Both groups had identical 1-hour simulation sessions with 2 scenarios and debriefing. Their team performance was assessed with a behavioral assessment tool by a trained rater. Perception by students was evaluated with a Likert scale (1-7). Fifteen groups with 89 students participated in a simulation by remote facilitation, and 8 groups with 47 students participated in a simulation by on-site facilitation. Participant demographics and previous simulation experience were similar. Both groups improved their performance from first to second scenario: groups by remote simulation (first [8.5 ± 4.2] vs second [13.2 ± 6.2], P = 0.003), and groups by on-site simulation (first [6.9 ± 4.1] vs second [12.4 ± 6.4], P = 0.056). The performance improvement was not significantly different between the 2 groups (P = 0.94). Faculty evaluation by students was equally high in both groups (7 vs 7; P = 0.65). A pediatric acute care simulation by remote facilitation significantly improved students' performance. In this pilot study, remote facilitation seems as effective as a traditional, locally facilitated simulation. Remote simulation can be a strong alternative method, especially where experienced facilitators are limited.

  4. Driving simulator sickness: Impact on driving performance, influence of blood alcohol concentration, and effect of repeated simulator exposures.

    PubMed

    Helland, Arne; Lydersen, Stian; Lervåg, Lone-Eirin; Jenssen, Gunnar D; Mørland, Jørg; Slørdal, Lars

    2016-09-01

Simulator sickness is a major obstacle to the use of driving simulators for research, training and driver assessment purposes. The purpose of the present study was to investigate the possible influence of simulator sickness on driving performance measures such as standard deviation of lateral position (SDLP), and the effect of alcohol or repeated simulator exposure on the degree of simulator sickness. Twenty healthy male volunteers underwent three simulated driving trials of 1 h duration with a curvy rural road scenario, and rated their degree of simulator sickness after each trial. Subjects drove sober and with blood alcohol concentrations (BAC) of approximately 0.5 g/L and 0.9 g/L in a randomized order. Simulator sickness score (SSS) did not influence the primary outcome measure SDLP. Higher SSS significantly predicted lower average speed and frequency of steering wheel reversals. These effects seemed to be mitigated by alcohol. Higher BAC significantly predicted lower SSS, suggesting that alcohol inebriation alleviates simulator sickness. The negative relation between the number of previous exposures to the simulator and SSS was not statistically significant, but is consistent with habituation to the sickness-inducing effects, as shown in other studies. Overall, the results suggest no influence of simulator sickness on SDLP or several other driving performance measures. However, simulator sickness seems to cause test subjects to drive more carefully, with lower average speed and fewer steering wheel reversals, hampering the interpretation of these outcomes as measures of driving impairment and safety. BAC and repeated simulator exposures may act as confounding variables by influencing the degree of simulator sickness in experimental studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. A Functional Comparison of Lunar Regoliths and Their Simulants

    NASA Technical Reports Server (NTRS)

    Rickman, D.; Edmunson, J.; McLemore, C.

    2012-01-01

    Lunar regolith simulants are essential to the development of technology for human exploration of the Moon. Any equipment that will interact with the surface environment must be tested with simulant to mitigate risk. To reduce the greatest amount of risk, the simulant must replicate the lunar surface as well as possible. To quantify the similarities and differences between simulants, the Figures of Merit were developed. The Figures of Merit software compares the simulants and regolith by particle size, particle shape, density, and bulk chemistry and mineralogy; these four properties dictate the majority of the remaining characteristics of a geologic material. There are limitations to both the current Figures of Merit approach and simulants in general. The effect of particle textures is lacking in the Figures of Merit software, and research into this topic has only recently begun with applications to simulants. In addition, not all of the properties for lunar regolith are defined sufficiently for simulant reproduction or comparison; for example, the size distribution of particles greater than 1 centimeter and the makeup of particles less than 10 micrometers is not well known. For simulants, contamination by terrestrial weathering products or undesired trace phases in feedstock material is a major issue. Vapor deposited rims have not yet been created for simulants. Fortunately, previous limitations such as the lack of agglutinates in simulants have been addressed and commercial companies are now making agglutinate material for simulants. Despite some limitations, the Figures of Merit sufficiently quantify the comparison between simulants and regolith for useful application in lunar surface technology. Over time, the compilation and analysis of simulant user data will add an advantageous predictive capability to the Figures of Merit, accurately relating Figures of Merit characteristics to simulant user parameters.

  6. A New Approach to Modeling Jupiter's Magnetosphere

    NASA Astrophysics Data System (ADS)

    Fukazawa, K.; Katoh, Y.; Walker, R. J.; Kimura, T.; Tsuchiya, F.; Murakami, G.; Kita, H.; Tao, C.; Murata, K. T.

    2017-12-01

The scales in planetary magnetospheres range from 10s of planetary radii to kilometers. For a number of years we have studied the magnetospheres of Jupiter and Saturn by using 3-dimensional magnetohydrodynamic (MHD) simulations. However, we have not been able to reach even the limits of the MHD approximation because of the large amount of computer resources required. Recently, thanks to progress in supercomputer systems, we have obtained the capability to simulate Jupiter's magnetosphere with 1000 times the number of grid points used in our previous simulations. This has allowed us to combine the high resolution global simulation with a micro-scale simulation of the Jovian magnetosphere. In particular we can combine a hybrid (kinetic ions and fluid electrons) simulation with the MHD simulation. In addition, the new capability enables us to run multi-parameter survey simulations of the Jupiter-solar wind system. In this study we performed a high-resolution simulation of the Jovian magnetosphere to connect with the hybrid simulation, and lower-resolution simulations under various solar wind conditions to compare with Hisaki and Juno observations. In the high-resolution simulation we used a regular Cartesian grid with 0.15 RJ grid spacing and placed the inner boundary at 7 RJ. With these simulation settings, we provide the magnetic field out to around 20 RJ from Jupiter as a background field for the hybrid simulation. For the first time we have been able to resolve Kelvin Helmholtz waves on the magnetopause. We have investigated solar wind dynamic pressures between 0.01 and 0.09 nPa for a number of IMF values. The raw simulation data are open for registered users to download. We have compared the results of these simulations with Hisaki auroral observations.

  7. Virtual versus face-to-face clinical simulation in relation to student knowledge, anxiety, and self-confidence in maternal-newborn nursing: A randomized controlled trial.

    PubMed

    Cobbett, Shelley; Snelgrove-Clarke, Erna

    2016-10-01

Clinical simulations can provide students with realistic clinical learning environments to increase their knowledge and self-confidence and decrease their anxiety prior to entering clinical practice settings. To compare the effectiveness of two maternal newborn clinical simulation scenarios: virtual clinical simulation and face-to-face high-fidelity manikin simulation. Randomized pretest-posttest design. A public research university in Canada. Fifty-six third-year Bachelor of Science in Nursing students. Participants were randomized to either face-to-face or virtual clinical simulation and then to dyads for completion of two clinical simulations. Measures included: (1) Nursing Anxiety and Self-Confidence with Clinical Decision Making Scale (NASC-CDM) (White, 2011), (2) knowledge pretest and post-test related to preeclampsia and group B strep, and (3) Simulation Completion Questionnaire. Before and after each simulation, students completed a knowledge test and the NASC-CDM; the Simulation Completion Questionnaire was completed at the end of the study. There were no statistically significant differences in student knowledge and self-confidence between face-to-face and virtual clinical simulations. Anxiety scores were higher for students in the virtual clinical simulation than for those in the face-to-face simulation. Students' self-reported preference was face-to-face simulation, citing the similarities to practicing in a 'real' situation and the immediate debrief. Students not liking the virtual clinical simulation most often cited technological issues as their rationale. Given the equivalence in knowledge and self-confidence between face-to-face and virtual maternal-newborn clinical simulation identified in this trial, it is important to take into consideration the costs and benefits/risks of simulation implementation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    PubMed

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined.
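
    A minimal time-course description of the kind SED-ML Level 1 Version 1 encodes, generated here with the Python standard library (the element names follow the published specification; the model source file and the KiSAO algorithm identifier are illustrative):

      import xml.etree.ElementTree as ET

      SED = "http://sed-ml.org/"
      ET.register_namespace("", SED)

      def E(parent, tag, **attrs):
          # helper: create a SED-ML element in the default namespace
          return ET.SubElement(parent, "{%s}%s" % (SED, tag),
                               {k: str(v) for k, v in attrs.items()})

      root = ET.Element("{%s}sedML" % SED, {"level": "1", "version": "1"})

      models = E(root, "listOfModels")
      E(models, "model", id="m1",
        language="urn:sedml:language:sbml", source="model.xml")

      sims = E(root, "listOfSimulations")
      tc = E(sims, "uniformTimeCourse", id="sim1", initialTime=0,
             outputStartTime=0, outputEndTime=100, numberOfPoints=1000)
      E(tc, "algorithm", kisaoID="KISAO:0000019")   # KiSAO id for CVODE

      tasks = E(root, "listOfTasks")
      E(tasks, "task", id="t1", modelReference="m1",
        simulationReference="sim1")

      ET.ElementTree(root).write("experiment.sedml",
                                 xml_declaration=True, encoding="utf-8")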

  9. Striving for Better Medical Education: the Simulation Approach.

    PubMed

    Sakakushev, Boris E; Marinov, Blagoi I; Stefanova, Penka P; Kostianev, Stefan St; Georgiou, Evangelos K

    2017-06-01

Medical simulation is a rapidly expanding area within medical education due to advances in technology, significant reductions in training hours and increased procedural complexity. Simulation training aims to enhance patient safety through improved technical competency and the elimination of human-factor errors in a risk-free environment. It is particularly applicable to practical, procedure-orientated specialties. Simulation can be useful for novice trainees, experienced clinicians (e.g. for revalidation) and team building. It has become a cornerstone in the delivery of medical education, being a paradigm shift in how doctors are educated and trained. The simulation community must take a proactive position in the development of metric-based simulation curricula and the adoption of proficiency benchmarking definitions, and should not depend on the simulation platforms used. Conversely, ingraining of poor practice may occur in the absence of adequate supervision, and equipment malfunction during the simulation can break the immersion and disrupt any learning that has occurred. Despite the presence of high technology, there is a substantial learning curve for both learners and facilitators. The technology of simulation continues to advance, offering devices capable of improved fidelity in virtual reality simulation, more sophisticated procedural practice and advanced patient simulators. Simulation-based training has also brought about paradigm shifts in the medical and surgical education arenas and ensured that the scope and impact of simulation will continue to broaden.

  10. Validation of the Monte Carlo simulator GATE for indium-111 imaging.

    PubMed

    Assié, K; Gardin, I; Véra, P; Buvat, I

    2005-07-07

    Monte Carlo simulations are useful for optimizing and assessing single photon emission computed tomography (SPECT) protocols, especially when aiming at measuring quantitative parameters from SPECT images. Before Monte Carlo simulated data can be trusted, the simulation model must be validated. The purpose of this work was to validate the use of GATE, a new Monte Carlo simulation platform based on GEANT4, for modelling indium-111 SPECT data, the quantification of which is of foremost importance for dosimetric studies. To that end, acquisitions of (111)In line sources in air and in water and of a cylindrical phantom were performed, together with the corresponding simulations. The simulation model included Monte Carlo modelling of the camera collimator and of a back-compartment accounting for photomultiplier tubes and associated electronics. Energy spectra, spatial resolution, sensitivity values, images and count profiles obtained for experimental and simulated data were compared. An excellent agreement was found between experimental and simulated energy spectra. For source-to-collimator distances varying from 0 to 20 cm, simulated and experimental spatial resolution differed by less than 2% in air, while the simulated sensitivity values were within 4% of the experimental values. The simulation of the cylindrical phantom closely reproduced the experimental data. These results suggest that GATE enables accurate simulation of (111)In SPECT acquisitions.

  11. Traffic and Driving Simulator Based on Architecture of Interactive Motion.

    PubMed

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid meso-microscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination.

  12. Traffic and Driving Simulator Based on Architecture of Interactive Motion

    PubMed Central

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid meso-microscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination. PMID:26491711

  13. The Persistent Issue of Simulator Sickness in Naval Aviation Training.

    PubMed

    Geyer, Daniel J; Biggs, Adam T

    2018-04-01

    Virtual simulations offer nearly unlimited training potential for naval aviation due to the wide array of scenarios that can be simulated in a safe, reliable, and cost-effective environment. This versatility has created substantial interest in using existing and emerging virtual technology to enhance training scenarios. However, the virtual simulations themselves may hinder training initiatives by inducing simulator sickness among the trainees, which is a series of symptoms similar to motion sickness that can arise from simulator use. Simulator sickness has been a problem for military aviation since the first simulators were introduced. The problem has also persisted despite the increasing fidelity and sense of immersion offered by new generations of simulators. As such, it is essential to understand the various problems so that trainers can ensure the best possible use of the simulators. This review will examine simulator sickness as it pertains to naval aviation training. Topics include: the prevailing theories on why symptoms develop, methods of measurement, contributing factors, effects on training, effects when used shipboard, aftereffects, countermeasures, and recommendations for future research involving virtual simulations in an aviation training environment.Geyer DJ, Biggs AT. The persistent issue of simulator sickness in naval aviation training. Aerosp Med Hum Perform. 2018; 89(4):396-405.

  14. 14 CFR 142.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

... instruction in accordance with subpart C of this part. Line-Operational Simulation means simulation conducted..., and ground operations. Line operational simulations are conducted for training and evaluation purposes and include random, abnormal, and emergency occurrences. Line operational simulation...

  15. 14 CFR 142.3 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

... instruction in accordance with subpart C of this part. Line-Operational Simulation means simulation conducted..., and ground operations. Line operational simulations are conducted for training and evaluation purposes and include random, abnormal, and emergency occurrences. Line operational simulation...

  16. 14 CFR 142.3 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

... instruction in accordance with subpart C of this part. Line-Operational Simulation means simulation conducted..., and ground operations. Line operational simulations are conducted for training and evaluation purposes and include random, abnormal, and emergency occurrences. Line operational simulation...

  17. Medical simulation: Overview, and application to wound modelling and management

    PubMed Central

    Pai, Dinker R.; Singh, Simerjit

    2012-01-01

Simulation in medical education is progressing in leaps and bounds. The need for simulation in medical education and training is increasing because of a) overall increase in the number of medical students vis-à-vis the availability of patients; b) increasing awareness among patients of their rights and a consequent increase in litigation and c) tremendous improvement in simulation technology which makes simulation more and more realistic. Simulation in wound care can be divided into use of simulation in wound modelling (to test the effect of projectiles on the body) and simulation for training in wound management. Though this science is still in its infancy, more and more researchers are now devising both low-technology and high-technology (virtual reality) simulators in this field. It is believed that simulator training will eventually translate into better wound care in real patients, though this will be the subject of further research. PMID:23162218

  18. A 2.5D Computational Method to Simulate Cylindrical Fluidized Beds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Tingwen; Benyahia, Sofiane; Dietiker, Jeff

    2015-02-17

In this paper, the limitations of axisymmetric and Cartesian two-dimensional (2D) simulations of cylindrical gas-solid fluidized beds are discussed. A new method has been proposed to carry out pseudo-two-dimensional (2.5D) simulations of a cylindrical fluidized bed by appropriately combining computational domains of Cartesian 2D and axisymmetric simulations. The proposed method was implemented in the open-source code MFIX and applied to the simulation of a lab-scale bubbling fluidized bed, together with the necessary sensitivity studies. After a careful grid study to ensure the numerical results are grid independent, detailed comparisons of the flow hydrodynamics were presented against axisymmetric and Cartesian 2D simulations. Furthermore, the 2.5D simulation results have been compared to the three-dimensional (3D) simulation for evaluation. This new approach yields better agreement with the 3D simulation results than either the axisymmetric or the Cartesian 2D simulations.

  19. Medium Fidelity Simulation of Oxygen Tank Venting

    NASA Technical Reports Server (NTRS)

    Sweet, Adam; Kurien, James; Lau, Sonie (Technical Monitor)

    2001-01-01

The item to be cleared is a medium-fidelity software simulation model of a vented cryogenic tank. Such tanks are commonly used to transport cryogenic liquids such as liquid oxygen via truck, and have appeared on liquid-fueled rockets for decades. This simulation model works with the HCC simulation system that was developed by Xerox PARC and NASA Ames Research Center. HCC has been previously cleared for distribution. When used with the HCC software, the model generates simulated readings for the tank pressure and temperature as the simulated cryogenic liquid boils off and is vented. Failures (such as a broken vent valve) can be injected into the simulation to produce readings corresponding to the failure. Release of this simulation will allow researchers to test their software diagnosis systems by attempting to diagnose the simulated failure from the simulated readings. This model does not contain any encryption software nor can it perform any control tasks that might be export controlled.

  20. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers that investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  1. INACSL Standards of Best Practice for Simulation: Past, Present, and Future.

    PubMed

    Sittner, Barbara J; Aebersold, Michelle L; Paige, Jane B; Graham, Leslie L M; Schram, Andrea Parsons; Decker, Sharon I; Lioce, Lori

    2015-01-01

    To describe the historical evolution of the International Nursing Association for Clinical Simulation and Learning's (INACSL) Standards of Best Practice: Simulation. The establishment of simulation standards began as a concerted effort by the INACSL Board of Directors in 2010 to provide best practices to design, conduct, and evaluate simulation activities in order to advance the science of simulation as a teaching methodology. A comprehensive review of the evolution of INACSL Standards of Best Practice: Simulation was conducted using journal publications, the INACSL website, INACSL member survey, and reports from members of the INACSL Standards Committee. The initial seven standards, published in 2011, were reviewed and revised in 2013. Two new standards were published in 2015. The standards will continue to evolve as the science of simulation advances. As the use of simulation-based experiences increases, the INACSL Standards of Best Practice: Simulation are foundational to standardizing language, behaviors, and curricular design for facilitators and learners.

  2. SMI Compatible Simulation Scheduler Design for Reuse of Model Complying with Smp Standard

    NASA Astrophysics Data System (ADS)

    Koo, Cheol-Hea; Lee, Hoon-Hee; Cheon, Yee-Jin

    2010-12-01

Software reusability is one of the key factors affecting the cost and schedule of a software development project. It is especially important in satellite simulator development, since many commercial simulator models related to satellites and dynamics exist. If these models can be reused on another simulator platform, a great deal of confidence and cost/schedule reduction can be achieved. Simulation model portability (SMP) is maintained by the European Space Agency, and many models compatible with SMP/simulation model interface (SMI) are available. Korea Aerospace Research Institute (KARI) is developing a hardware abstraction layer (HAL)-supported satellite simulator to verify satellite on-board software. For these reasons, KARI wants to port SMI-compatible models to the HAL-supported satellite simulator. To do so, a simulation scheduler has been preliminarily designed according to the SMI standard.

  3. Medical simulation: Overview, and application to wound modelling and management.

    PubMed

    Pai, Dinker R; Singh, Simerjit

    2012-05-01

    Simulation in medical education is progressing in leaps and bounds. The need for simulation in medical education and training is increasing because of a) overall increase in the number of medical students vis-à-vis the availability of patients; b) increasing awareness among patients of their rights and consequent increase in litigations and c) tremendous improvement in simulation technology which makes simulation more and more realistic. Simulation in wound care can be divided into use of simulation in wound modelling (to test the effect of projectiles on the body) and simulation for training in wound management. Though this science is still in its infancy, more and more researchers are now devising both low-technology and high-technology (virtual reality) simulators in this field. It is believed that simulator training will eventually translate into better wound care in real patients, though this will be the subject of further research.

  4. Transfer of training and simulator qualification or myth and folklore in helicopter simulation

    NASA Technical Reports Server (NTRS)

    Dohme, Jack

    1992-01-01

    Transfer of training studies at Fort Rucker using the backward-transfer paradigm have shown that existing flight simulators are not entirely adequate for meeting training requirements. Using an ab initio training research simulator, a simulation of the UH-1, training effectiveness ratios were developed. The data demonstrate it to be a cost-effective primary trainer. A simulator qualification method was suggested in which a combination of these transfer-of-training paradigms is used to determine overall simulator fidelity and training effectiveness.

  5. Modeling of Army Research Laboratory EMP simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miletta, J.R.; Chase, R.J.; Luu, B.B.

    1993-12-01

    Models are required that permit the estimation of emitted field signatures from EMP simulators to design the simulator antenna structure, to establish the usable test volumes, and to estimate human exposure risk. This paper presents the capabilities and limitations of a variety of EMP simulator models useful to the Army's EMP survivability programs. Comparisons among frequency and time-domain models are provided for two powerful US Army Research Laboratory EMP simulators: AESOP (Army EMP Simulator Operations) and VEMPS II (Vertical EMP Simulator II).

  6. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.

  7. Use of a Virtual Learning Platform for Distance-Based Simulation in an Acute Care Nurse Practitioner Curriculum.

    PubMed

    Carman, Margaret; Xu, Shu; Rushton, Sharron; Smallheer, Benjamin A; Williams, Denise; Amarasekara, Sathya; Oermann, Marilyn H

    Acute care nurse practitioner (ACNP) programs that use high-fidelity simulation as a teaching tool need to consider innovative strategies to provide distance-based students with learning experiences that are comparable to those in a simulation laboratory. The purpose of this article is to describe the use of virtual simulations in a distance-based ACNP program and student performance in the simulations. Virtual simulations using iSimulate were integrated into the ACNP course to promote the translation of content into a clinical context and enable students to develop their knowledge and decision-making skills. With these simulations, students worked as a team, even though they were at different sites from each other and from the faculty, to manage care of an acutely ill patient. The students were assigned to simulation groups of 4 students each. One week before the simulation, they reviewed past medical records. The virtual simulation sessions were recorded and then evaluated. The evaluation tools assessed 8 areas of performance and included key behaviors in each of these areas to be performed by students in the simulation. More than 80% of the student groups performed the key behaviors. Virtual simulations provide a learning platform that allows live interaction between students and faculty, at a distance, and application of content to clinical situations. With simulation, learners have an opportunity to practice assessment and decision-making in emergency and high-risk situations. Simulations not only are valuable for student learning but also provide a nonthreatening environment for staff to practice, receive feedback on their skills, and improve their confidence.

  8. Using flight simulators aboard ships: human side effects of an optimal scenario with smooth seas.

    PubMed

    Muth, Eric R; Lawson, Ben

    2003-05-01

    The U.S. Navy is considering placing flight simulators aboard ships. It is known that certain types of flight simulators can elicit motion adaptation syndrome (MAS), and also that certain types of ship motion can cause MAS. The goal of this study was to determine if using a flight simulator during ship motion would cause MAS, even when the simulator stimulus and the ship motion were both very mild. All participants in this study completed three conditions. Condition 1 (Sim) entailed "flying" a personal computer-based flight simulator situated on land. Condition 2 (Ship) involved riding aboard a U.S. Navy Yard Patrol boat. Condition 3 (ShipSim) entailed "flying" a personal computer-based flight simulator while riding aboard a Yard Patrol boat. Before and after each condition, participants' balance and dynamic visual acuity were assessed. After each condition, participants filled out the Nausea Profile and the Simulator Sickness Questionnaire. Following exposure to a flight simulator aboard a ship, participants reported negligible symptoms of nausea and simulator sickness. However, participants exhibited a decrease in dynamic visual acuity after exposure to the flight simulator aboard ship (T[25] = 3.61, p < 0.05). Balance results were confounded by significant learning and, therefore, not interpretable. This study suggests that flight simulators can be used aboard ship. As a minimal safety precaution, these simulators should be used according to current safety practices for land-based simulators. Optimally, these simulators should be designed to minimize MAS, located near the ship's center of rotation and used when ship motion is not provocative.

  9. The use of psychiatry-focused simulation in undergraduate nursing education: A systematic search and review.

    PubMed

    Vandyk, Amanda D; Lalonde, Michelle; Merali, Sabrina; Wright, Erica; Bajnok, Irmajean; Davies, Barbara

    2018-04-01

    Evidence on the use of simulation to teach psychiatry and mental health (including addiction) content is emerging, yet no summary of the implementation processes or associated outcomes exists. The aim of this study was to systematically search and review empirical literature on the use of psychiatry-focused simulation in undergraduate nursing education. Objectives were to (i) assess the methodological quality of existing evidence on the use of simulation to teach mental health content to undergraduate nursing students, (ii) describe the operationalization of the simulations, and (iii) summarize the associated quantitative and qualitative outcomes. We conducted online database (MEDLINE, Embase, ERIC, CINAHL, PsycINFO from January 2004 to October 2015) and grey literature searches. Thirty-two simulation studies were identified describing and evaluating six types of simulations (standardized patients, audio simulations, high-fidelity simulators, virtual world, multimodal, and tabletop). Overall, 2724 participants were included in the studies. Studies reflected a limited number of intervention designs, and outcomes were evaluated with qualitative and quantitative methods incorporating a variety of tools. Results indicated that simulation was effective in reducing student anxiety and improving their knowledge, empathy, communication, and confidence. The summarized qualitative findings all supported the benefit of simulation; however, more research is needed to assess the comparative effectiveness of the types of simulations. Recommendations from the findings include the development of guidelines for educators to deliver each simulation component (briefing, active simulation, debriefing). Finally, consensus around appropriate training of facilitators is needed, as is consistent and agreed upon simulation terminology. © 2017 Australian College of Mental Health Nurses Inc.

  10. Simulation in International Relations Education.

    ERIC Educational Resources Information Center

    Starkey, Brigid A.; Blake, Elizabeth L.

    2001-01-01

    Discusses the educational implications of simulations in international relations. Highlights include the development of international relations simulations; the role of technology; the International Communication and Negotiation Simulations (ICONS) project at the University of Maryland; evolving information technology; and simulating real-world…

  11. The role of the research simulator in the systems development of rotorcraft

    NASA Technical Reports Server (NTRS)

    Statler, I. C.; Deel, A.

    1981-01-01

    The potential application of the research simulator to future rotorcraft systems design, development, product improvement evaluations, and safety analysis is examined. Current simulation capabilities for fixed-wing aircraft are reviewed and the requirements of a rotorcraft simulator are defined. The visual system components, vertical motion simulator, cab, and computation system for a research simulator under development are described.

  12. Hardware Fault Simulator for Microprocessors

    NASA Technical Reports Server (NTRS)

    Hess, L. M.; Timoc, C. C.

    1983-01-01

A breadboarded circuit is faster and more thorough than a software simulator. An elementary fault simulator for an AND gate uses three gates and a shift register to simulate stuck-at-one or stuck-at-zero conditions at the inputs and output. Experimental results showed that a hardware fault simulator for a microprocessor gave results faster than a software simulator by two orders of magnitude, with one test being applied every 4 microseconds.
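
    The stuck-at fault model described above can be sketched in software for a single AND gate (an illustrative sketch only; the article's contribution is the hardware implementation and its speed):

      def and_gate(a, b, fault=None):
          """2-input AND with an optional stuck-at fault:
          fault = ('a'|'b'|'out', 0|1)."""
          if fault and fault[0] == "a":
              a = fault[1]
          if fault and fault[0] == "b":
              b = fault[1]
          out = a & b
          return fault[1] if fault and fault[0] == "out" else out

      FAULTS = [(n, v) for n in ("a", "b", "out") for v in (0, 1)]

      # A test vector detects a fault when the faulty output differs from
      # the fault-free output.
      for vec in [(0, 0), (0, 1), (1, 0), (1, 1)]:
          detected = [f for f in FAULTS
                      if and_gate(*vec, fault=f) != and_gate(*vec)]
          print(vec, "detects", detected)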

  13. The new ATLAS Fast Calorimeter Simulation

    NASA Astrophysics Data System (ADS)

    Schaarschmidt, J.; ATLAS Collaboration

    2017-10-01

    Current and future need for large scale simulated samples motivate the development of reliable fast simulation techniques. The new Fast Calorimeter Simulation is an improved parameterized response of single particles in the ATLAS calorimeter that aims to accurately emulate the key features of the detailed calorimeter response as simulated with Geant4, yet approximately ten times faster. Principal component analysis and machine learning techniques are used to improve the performance and decrease the memory need compared to the current version of the ATLAS Fast Calorimeter Simulation. A prototype of this new Fast Calorimeter Simulation is in development and its integration into the ATLAS simulation infrastructure is ongoing.

  14. Accelerating a Particle-in-Cell Simulation Using a Hybrid Counting Sort

    NASA Astrophysics Data System (ADS)

    Bowers, K. J.

    2001-11-01

    In this article, performance limitations of the particle advance in a particle-in-cell (PIC) simulation are discussed. It is shown that the memory subsystem and cache-thrashing severely limit the speed of such simulations. Methods to implement a PIC simulation under such conditions are explored. An algorithm based on a counting sort is developed which effectively eliminates PIC simulation cache thrashing. Sustained performance gains of 40 to 70 percent are measured on commodity workstations for a minimal 2d2v electrostatic PIC simulation. More complete simulations are expected to have even better results as larger simulations are usually even more memory subsystem limited.
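
    The core idea, reordering the particle arrays by cell index so that memory accesses during the particle advance become sequential, can be sketched as follows (a simplified out-of-place counting sort; the article's in-place hybrid version is more elaborate):

      import numpy as np

      def sort_particles_by_cell(x, v, cell, n_cells):
          """Stable O(N + n_cells) counting sort of particle data by cell,
          so particles sharing a grid cell become contiguous in memory."""
          counts = np.bincount(cell, minlength=n_cells)
          cursor = np.concatenate(([0], np.cumsum(counts[:-1])))
          order = np.empty(cell.size, dtype=np.intp)
          for i, c in enumerate(cell):       # single scatter pass
              order[cursor[c]] = i
              cursor[c] += 1
          return x[order], v[order], cell[order]

      rng = np.random.default_rng(1)
      cell = rng.integers(0, 4, size=8)
      x, v = rng.random(8), rng.standard_normal(8)
      x, v, cell = sort_particles_by_cell(x, v, cell, n_cells=4)
      print(cell)                            # non-decreasing after the sort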

  15. Displays and simulators

    NASA Astrophysics Data System (ADS)

    Mohon, N.

    A 'simulator' is defined as a machine which imitates the behavior of a real system in a very precise manner. The major components of a simulator and their interaction are outlined in brief form, taking into account the major components of an aircraft flight simulator. Particular attention is given to the visual display portion of the simulator, the basic components of the display, their interactions, and their characteristics. Real image displays are considered along with virtual image displays, and image generators. Attention is given to an advanced simulator for pilot training, a holographic pancake window, a scan laser image generator, the construction of an infrared target simulator, and the Apollo Command Module Simulator.

  16. Research of laser echo signal simulator

    NASA Astrophysics Data System (ADS)

    Xu, Rui; Shi, Rui; Wang, Xin; Li, Zhou

    2015-11-01

The laser echo signal simulator is one of the most significant components of hardware-in-the-loop (HWIL) simulation systems for LADAR. A system model and a time-series model of the laser echo signal simulator are established. Factors that can induce fixed and random errors in the simulated return signals are analyzed, and these system insertion errors are then quantified. Using this theoretical model, the simulation system is investigated experimentally. After correcting for the fixed error, the results indicate that the range error of the simulated laser return signal is less than 0.25 m, and that the system can simulate distances from 50 m to 20 km.
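
    One ingredient such a time-series model needs, the return pulse delayed by the round-trip time of the simulated range, is sketched below (values illustrative):

      C = 299_792_458.0                      # speed of light, m/s

      def echo_delay_s(range_m):
          """Round-trip delay applied to the emitted pulse."""
          return 2.0 * range_m / C

      for R in (50.0, 1_000.0, 20_000.0):    # the stated 50 m - 20 km span
          print(f"{R:8.0f} m -> {echo_delay_s(R) * 1e6:9.3f} us")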

  17. Mars Smart Lander Simulations for Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Striepe, S. A.; Way, D. W.; Balaram, J.

    2002-01-01

Two primary simulations have been developed and are being updated for Mars Smart Lander Entry, Descent, and Landing (EDL): a high-fidelity engineering end-to-end EDL simulation based on NASA Langley's Program to Optimize Simulated Trajectories (POST), and an end-to-end real-time, hardware-in-the-loop simulation testbed based on NASA JPL's (Jet Propulsion Laboratory) Dynamics Simulator for Entry, Descent and Surface landing (DSENDS). This paper presents the current status of these Mars Smart Lander EDL end-to-end simulations. Various models and capabilities, as well as validation and verification of these simulations, are discussed.

  18. Use of Carbon Arc Lamps as Solar Simulation in Environmental Testing

    NASA Technical Reports Server (NTRS)

    Goggia, R. J.; Maclay, J. E.

    1962-01-01

This report covers work done by the authors on the solar simulator for the six-foot diameter space simulator presently in use at JPL. The space simulator was made by modifying an existing vacuum chamber and uses carbon arc lamps for solar simulation. All Ranger vehicles flown to date have been tested in this facility. The report also contains a series of appendixes covering various aspects of space-simulation design and use. Some of these appendixes contain detailed analyses of space-simulator design criteria. Others cover the techniques used in studying carbon-arc lamps and in applying them as solar simulation.

  19. An Example-Based Brain MRI Simulation Framework.

    PubMed

    He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L

    2015-02-21

The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms such as image segmentation, due to the lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition, these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from its hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of the MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on a statistical model of the training data. Results show that the example-based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
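
    A toy sketch of the patch-based regression idea (assuming scikit-learn on synthetic 2D data; the paper's regression model, features, and 3D data differ in detail):

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.feature_extraction.image import extract_patches_2d

      rng = np.random.default_rng(3)
      P = 5                                          # patch size

      # Toy 2D "atlas": a label image and its MR intensities.
      atlas_labels = rng.integers(0, 4, (64, 64)).astype(float)
      atlas_mri = atlas_labels * 50 + rng.normal(0, 5, (64, 64))

      # Learn label-patch -> centre-voxel intensity (implicit MR physics).
      X = extract_patches_2d(atlas_labels, (P, P)).reshape(-1, P * P)
      y = extract_patches_2d(atlas_mri, (P, P))[:, P // 2, P // 2]
      model = RandomForestRegressor(n_estimators=20, random_state=0).fit(X, y)

      # Simulate an image for a new anatomy from its label map alone.
      new_labels = rng.integers(0, 4, (64, 64)).astype(float)
      Xn = extract_patches_2d(new_labels, (P, P)).reshape(-1, P * P)
      simulated = model.predict(Xn).reshape(64 - P + 1, 64 - P + 1)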

  20. Using Reconstructed POD Modes as Turbulent Inflow for LES Wind Turbine Simulations

    NASA Astrophysics Data System (ADS)

    Nielson, Jordan; Bhaganagar, Kiran; Juttijudata, Vejapong; Sirisup, Sirod

    2016-11-01

Currently, in order to get realistic atmospheric effects of turbulence, wind turbine LES simulations require computationally expensive precursor simulations. At times, the precursor simulation is more computationally expensive than the wind turbine simulation. The precursor simulations are important because they capture turbulence in the atmosphere, and turbulence impacts the power production estimate. On the other hand, POD analysis has been shown to be capable of capturing turbulent structures. The current study was performed to determine the plausibility of using lower-dimension models from POD analysis of LES simulations as turbulent inflow to wind turbine LES simulations. The study will aid the wind energy community by lowering the computational cost of full-scale wind turbine LES simulations, while maintaining a high level of turbulent information and being able to quickly apply the turbulent inflow to multi-turbine wind farms. This will be done by comparing a pure LES precursor wind turbine simulation with simulations that use reduced POD mode inflow conditions. The study shows the feasibility of using lower-dimension models as turbulent inflow for LES wind turbine simulations. Overall, the power production estimate and the velocity field of the wind turbine wake are well captured, with small errors.
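
    A minimal sketch of extracting POD modes from snapshot data and rebuilding a reduced-order inflow (assuming the standard snapshot-SVD formulation; the array sizes and retained mode count are illustrative, and the study's LES precursor data are far larger):

      import numpy as np

      rng = np.random.default_rng(2)
      snapshots = rng.standard_normal((4096, 200))   # (grid points, times)

      mean = snapshots.mean(axis=1, keepdims=True)
      fluct = snapshots - mean

      # Thin SVD: columns of U are POD modes, ordered by energy content.
      U, s, Vt = np.linalg.svd(fluct, full_matrices=False)

      r = 10                                          # retained modes
      inflow = mean + U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

      energy = (s[:r] ** 2).sum() / (s ** 2).sum()
      print(f"{r} modes retain {energy:.1%} of the fluctuating energy")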

  1. A systematic review of validated sinus surgery simulators.

    PubMed

    Stew, B; Kao, S S-T; Dharmawardana, N; Ooi, E H

    2018-06-01

Simulation provides a safe and effective opportunity to develop surgical skills. A variety of endoscopic sinus surgery (ESS) simulators has been described in the literature. Validation of these simulators allows for effective utilisation in training. To conduct a systematic review of the published literature to analyse the evidence for validated ESS simulation. Pubmed, Embase, Cochrane and Cinahl were searched from inception of the databases to 11 January 2017. Twelve thousand five hundred and sixteen articles were retrieved, of which 10 112 were screened following the removal of duplicates. Thirty-eight full-text articles were reviewed after meeting search criteria. Evidence of face, content, construct, discriminant and predictive validity was extracted. Twenty articles were included in the analysis describing 12 ESS simulators. Eleven of these simulators had undergone validation: 3 virtual reality, 7 physical bench models and 1 cadaveric simulator. Seven of the simulators were shown to have face validity, 7 had construct validity and 1 had predictive validity. None of the simulators demonstrated discriminant validity. This systematic review demonstrates that a number of ESS simulators have been comprehensively validated. Many of the validation processes, however, lack standardisation in outcome reporting, thus limiting a meta-analysis comparison between simulators. © 2017 John Wiley & Sons Ltd.

  2. NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Paxson, Daniel E.

    2014-01-01

    The work presented in this paper promotes research leading to a closed-loop control system to actively suppress thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based simulation is similar to a precursor one-dimensional combustor simulation that was written as FORTRAN 77 source code. The previous simulation required modifying the FORTRAN 77 source code, recompiling, and relinking whenever a new combustor simulation executable file was created. The MATLAB-based simulation does not require making changes to the source code, recompiling, or linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment or with a compiled copy of the executable file running in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details regarding how to set up and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation and includes simulation results for a default simulation included with the source code.

  3. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided-wave-based structural health monitoring (SHM) systems. The current state of the art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large-scale, complex-geometry composite components and vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described, along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper also discusses examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  4. What is going on in augmented reality simulation in laparoscopic surgery?

    PubMed

    Botden, Sanne M B I; Jakimowicz, Jack J

    2009-08-01

    To prevent unnecessary errors and adverse results of laparoscopic surgery, proper training is of paramount importance. A safe way to train surgeons for laparoscopic skills is simulation. For this purpose traditional box trainers are often used, however they lack objective assessment of performance. Virtual reality laparoscopic simulators assess performance, but lack realistic haptic feedback. Augmented reality (AR) combines a virtual reality (VR) setting with real physical materials, instruments, and feedback. This article presents the current developments in augmented reality laparoscopic simulation. Pubmed searches were performed to identify articles regarding surgical simulation and augmented reality. Identified companies manufacturing an AR laparoscopic simulator received the same questionnaire referring to the features of the simulator. Seven simulators that fitted the definition of augmented reality were identified during the literature search. Five of the approached manufacturers returned a completed questionnaire, of which one simulator appeared to be VR and was therefore not applicable for this review. Several augmented reality simulators have been developed over the past few years and they are improving rapidly. We recommend the development of AR laparoscopic simulators for component tasks of procedural training. AR simulators should be implemented in current laparoscopic training curricula, in particular for laparoscopic suturing training.

  5. Aviation Simulators for the Desktop: Panel and Demonstrations

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Rosekind, Mark R. (Technical Monitor)

    1997-01-01

    Panel members are: Christine M. Mitchell (Georgia Tech), Michael T. Palmer (NASA Langley), Greg Pisanich (NASA Ames), and Amy R. Pritchett (MIT). The panel members are affiliated with aviation human factors groups from NASA Ames, NASA Langley, MIT's Department of Aerospace and Aeronautical Engineering, and Georgia Tech's Center for Human-Machine Systems Research. Panelists will describe the simulator(s) used in their respective institutions, including a description of the FMS aircraft models, software, hardware, and displays. Panelists will summarize previous, ongoing, and planned empirical studies conducted with the simulators. Greg Pisanich will describe two NASA Ames simulation systems: the Stone Soup Simulator (SSS) and the Airspace Operations Human Factors Simulation Laboratory. The Stone Soup Simulator is a desktop-based research flight simulator that includes mode control, flight management, and datalink functionality. It has been developed as a non-proprietary simulator that can be easily distributed to academic and industry researchers who are collaborating on NASA research projects. It will be used and extended by research groups represented by at least two panelists (Mitchell and Palmer). The Airspace Operations Simulator supports the study of air traffic control in conjunction with the flight deck. This simulator will be used to provide an environment in which many AATT and free flight concepts can be demonstrated and evaluated. Mike Palmer will describe two NASA Langley efforts: the Langley Simulator and MD-11 extensions to the NASA Ames S3 simulator. The first simulator is publicly available and combines a B-737 model with a high-fidelity flight management system. The second simulator enhances the S3 simulator with MD-11 electronic flight displays, together with modifications to the flight and FMS models to emulate MD-11 dynamics and operations. Chris Mitchell will describe GT-EFIRT (Georgia Tech Electronic Flight Instrument Research Tool) and B-757 enhancements to the NASA Ames S3. GT-EFIRT is a medium-fidelity simulator used to conduct preliminary studies of the CATS (crew activity tracking system). Like the Langley efforts with S3, the Georgia Tech enhancements will allow it to emulate the dynamics and operations of a widely used glass cockpit. Amy Pritchett will describe the MIT simulator(s) that have been used in a range of research investigating cockpit displays, warning devices, and flight deck-ATC interaction.

  6. Large eddy simulation in a rotary blood pump: Viscous shear stress computation and comparison with unsteady Reynolds-averaged Navier-Stokes simulation.

    PubMed

    Torner, Benjamin; Konnigk, Lucas; Hallier, Sebastian; Kumar, Jitendra; Witte, Matthias; Wurm, Frank-Hendrik

    2018-06-01

    Numerical flow analysis (computational fluid dynamics) in combination with the prediction of blood damage is an important procedure to investigate the hemocompatibility of a blood pump, since blood trauma due to shear stresses remains a problem in these devices. Today, the numerical damage prediction is conducted using unsteady Reynolds-averaged Navier-Stokes simulations. Investigations with large eddy simulations are rarely being performed for blood pumps. Hence, the aim of the study is to examine the viscous shear stresses of a large eddy simulation in a blood pump and compare the results with an unsteady Reynolds-averaged Navier-Stokes simulation. The simulations were carried out at two operation points of a blood pump. The flow was simulated on a 100M element mesh for the large eddy simulation and a 20M element mesh for the unsteady Reynolds-averaged Navier-Stokes simulation. As a first step, the large eddy simulation was verified by analyzing internal dissipative losses within the pump. Then, the pump characteristics and mean and turbulent viscous shear stresses were compared between the two simulation methods. The verification showed that the large eddy simulation is able to reproduce the significant portion of dissipative losses, which is a global indication that the equivalent viscous shear stresses are adequately resolved. The comparison with the unsteady Reynolds-averaged Navier-Stokes simulation revealed that the hydraulic parameters were in agreement, but differences for the shear stresses were found. The results show the potential of the large eddy simulation as a high-quality comparative case to check the suitability of a chosen Reynolds-averaged Navier-Stokes setup and turbulence model. Furthermore, the results suggest that large eddy simulations are superior to unsteady Reynolds-averaged Navier-Stokes simulations when instantaneous stresses are applied for blood damage prediction.

  7. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.
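
    A toy illustration of the optimistic mechanics being instrumented: each processing element runs events eagerly in local timestamp order, and a message arriving in its past forces a rollback, which is exactly the kind of metric such a tool counts. The class below is a self-contained sketch, not the ROSS API; state saving is per-event for simplicity.

        # Toy optimistic (Time Warp-style) processing element with rollback counting.
        import heapq

        class OptimisticPE:
            def __init__(self):
                self.queue = []      # pending event timestamps
                self.lvt = 0.0       # local virtual time
                self.trail = []      # (timestamp, saved_state) checkpoints
                self.state = 0
                self.rollbacks = 0   # the kind of metric such a tool collects

            def schedule(self, ts):
                if ts < self.lvt:                    # straggler in the past
                    self.rollbacks += 1
                    while self.trail and self.trail[-1][0] > ts:
                        undone_ts, saved = self.trail.pop()
                        self.state = saved           # undo optimistic work
                        heapq.heappush(self.queue, undone_ts)
                    self.lvt = ts
                heapq.heappush(self.queue, ts)

            def run(self):
                while self.queue:
                    ts = heapq.heappop(self.queue)
                    self.trail.append((ts, self.state))   # checkpoint, then act
                    self.state += 1                       # toy event handler
                    self.lvt = ts

        pe = OptimisticPE()
        for t in (1.0, 3.0, 5.0):
            pe.schedule(t)
        pe.run()
        pe.schedule(2.0)   # arrives in the past: one rollback, events at 3.0
        pe.run()           # and 5.0 are undone and re-executed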

  8. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    PubMed

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to use high-fidelity patient simulators, which present simulations that are closer to real-life situations.

  9. Simulation-Based Training Platforms for Arthroscopy: A Randomized Comparison of Virtual Reality Learning to Benchtop Learning.

    PubMed

    Middleton, Robert M; Alvand, Abtin; Garfjeld Roberts, Patrick; Hargrove, Caroline; Kirby, Georgina; Rees, Jonathan L

    2017-05-01

    To determine whether a virtual reality (VR) arthroscopy simulator or benchtop (BT) arthroscopy simulator showed superiority as a training tool. Arthroscopic novices were randomized to a training program on a BT or a VR knee arthroscopy simulator. The VR simulator provided user performance feedback. Individuals performed a diagnostic arthroscopy on both simulators before and after the training program. Performance was assessed using wireless objective motion analysis and a global rating scale. The groups (8 in the VR group, 9 in the BT group) were well matched at baseline across all parameters (P > .05). Training on each simulator resulted in significant performance improvements across all parameters (P < .05). BT training conferred a significant improvement in all parameters when trainees were reassessed on the VR simulator (P < .05). In contrast, VR training did not confer improvement in performance when trainees were reassessed on the BT simulator (P > .05). BT-trained subjects outperformed VR-trained subjects in all parameters during final assessments on the BT simulator (P < .05). There was no difference in objective performance between VR-trained and BT-trained subjects on final VR simulator wireless objective motion analysis assessment (P > .05). Both simulators delivered improvements in arthroscopic skills. BT training led to skills that readily transferred to the VR simulator. Skills acquired after VR training did not transfer as readily to the BT simulator. Despite trainees receiving automated metric feedback from the VR simulator, the results suggest a greater gain in psychomotor skills for BT training. Further work is required to determine if this finding persists in the operating room. This study suggests that there are differences in skills acquired on different simulators and skills learnt on some simulators may be more transferable. Further work in identifying user feedback metrics that enhance learning is also required. Copyright © 2016 Arthroscopy Association of North America. All rights reserved.

  10. The effect of simulation courseware on critical thinking in undergraduate nursing students: multi-site pre-post study.

    PubMed

    Shin, Hyunsook; Ma, Hyunhee; Park, Jiyoung; Ji, Eun Sun; Kim, Dong Hee

    2015-04-01

    The use of simulations has been considered as opportunities for students to enhance their critical thinking (CT), but previous studies were limited because they did not provide in-depth information on the working dynamics of simulation or on the effects of the number of simulation exposures on CT. This study examined the effect of an integrated pediatric nursing simulation used in a nursing practicum on students' CT abilities and identified the effects of differing numbers of simulation exposures on CT in a multi-site environment. The study used a multi-site, pre-test, post-test design. A total of 237 nursing students at three universities enrolled in a pediatric practicum participated in this study from February to December 2013. All three schools used the same simulation courseware, including the same simulation scenarios, evaluation tools, and simulation equipment. The courseware incorporated high-fidelity simulators and standardized patients. Students at school A completed one simulation session, whereas students at schools B and C completed two and three simulation sessions, respectively. Yoon's Critical Thinking Disposition tool (2008) was used to measure students' CT abilities. The gains in students' CT scores varied according to their numbers of exposures to the simulation courseware. With a single exposure, there were no statistically significant gains in CT, whereas three exposures to the courseware produced significant gains in CT. In seven subcategories of critical thinking, three exposures to the simulation courseware produced CT gains in the prudence and intellectual eagerness subcategories, and the overall simulation experience produced CT gains in the prudence, systematicity, healthy skepticism, and intellectual eagerness subcategories. Simulation courseware may produce positive learning outcomes for prudence in nursing education. In addition, the findings from the multi-site comparative study may contribute to greater understanding of how patient simulation experiences impact students' CT abilities. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. MCC level C formulation requirements. Shuttle TAEM guidance and flight control, STS-1 baseline

    NASA Technical Reports Server (NTRS)

    Carman, G. L.; Montez, M. N.

    1980-01-01

    The TAEM guidance and body rotational dynamics models required for the MCC simulation of the TAEM mission phase are defined. This simulation begins at the end of the entry phase and terminates at TAEM autoland interface. The logic presented is the required configuration for the first shuttle orbital flight (STS-1). The TAEM guidance is simulated in detail. The rotational dynamics simulation is a simplified model that assumes that the commanded rotational rates can be achieved in the integration interval. Thus, the rotational dynamics simulation is essentially a simulation of the autopilot commanded rates and integration of these rates to determine orbiter attitude. The rotational dynamics simulation also includes a simulation of the speedbrake deflection. The body flap and elevon deflections are computed in the orbiter aerodynamic simulation.

  12. Simulator for heterogeneous dataflow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    1993-01-01

    A new simulator is developed to simulate the execution of an algorithm graph in accordance with the Algorithm to Architecture Mapping Model (ATAMM) rules. ATAMM is a Petri net model which describes the periodic execution of large-grained, data-independent dataflow graphs and which provides predictable, steady-state, time-optimized performance. This simulator extends the ATAMM simulation capability from a heterogeneous set of resources, or functional units, to a more general heterogeneous architecture. Simulation test cases show that the simulator accurately executes the ATAMM rules for both a heterogeneous architecture and a homogeneous architecture, which is the special case of only one processor type. The simulator forms one tool in an ATAMM Integrated Environment, which contains other tools for graph entry, graph modification for performance optimization, and playback of simulations for analysis.
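
    For a flavour of large-grained dataflow execution, the sketch below fires a node once every input edge has delivered a token and accumulates per-node finish times. It is a generic token-firing toy under invented latencies, not the ATAMM rule set.

        # Toy dataflow-graph executor: a node fires when all inputs hold tokens.
        from collections import defaultdict

        graph = {"read": [], "filter": ["read"], "fft": ["read"],
                 "fuse": ["filter", "fft"]}            # node -> input nodes
        latency = {"read": 1, "filter": 3, "fft": 5, "fuse": 2}

        tokens = defaultdict(int)
        finish = {}
        ready = [n for n, deps in graph.items() if not deps]
        while ready:
            node = ready.pop(0)
            start = max((finish[d] for d in graph[node]), default=0)
            finish[node] = start + latency[node]
            for succ, deps in graph.items():           # deliver output tokens
                if node in deps:
                    tokens[succ] += 1
                    if tokens[succ] == len(deps):      # all inputs present
                        ready.append(succ)
        print(finish)   # {'read': 1, 'filter': 4, 'fft': 6, 'fuse': 8}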

  13. An agent-based stochastic Occupancy Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

    Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with a probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.
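
    The random-movement component in (2) is easy to picture as a small worked example: a homogeneous Markov chain whose transition-matrix rows give, for the current space, the probability of the occupant's next location. The spaces and probabilities below are invented for illustration, not values from the Occupancy Simulator.

        # Homogeneous Markov-chain movement sketch; probabilities are made up.
        import numpy as np

        rng = np.random.default_rng(42)
        spaces = ["own office", "other office", "meeting room"]
        P = np.array([[0.90, 0.06, 0.04],      # rows: current space
                      [0.30, 0.60, 0.10],      # cols: next space
                      [0.20, 0.05, 0.75]])     # each row sums to 1

        def movement_trace(start=0, n_steps=48):
            """One occupant's location at each 10-minute step of a workday."""
            loc, trace = start, []
            for _ in range(n_steps):
                loc = rng.choice(len(spaces), p=P[loc])
                trace.append(spaces[loc])
            return trace

        schedule = movement_trace()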

  14. An agent-based stochastic Occupancy Simulator

    DOE PAGES

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

    2017-06-01

    Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with a probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.

  15. Simulation of transmission electron microscope images of biological specimens.

    PubMed

    Rullgård, H; Ofverstedt, L-G; Masich, S; Daneholt, B; Oktem, O

    2011-09-01

    We present a new approach to simulate electron cryo-microscope images of biological specimens. The framework for simulation consists of two parts: the first is a phantom generator that generates a model of a specimen suitable for simulation; the second is a transmission electron microscope simulator. The phantom generator calculates the scattering potential of an atomic structure in aqueous buffer and allows the user to define the distribution of molecules in the simulated image. The simulator includes a well-defined electron-specimen interaction model based on the scalar Schrödinger equation, the contrast transfer function for optics, and a noise model that includes shot noise as well as detector noise including detector blurring. To enable optimal performance, the simulation framework also includes a calibration protocol for setting simulation parameters. To test the accuracy of the new framework, we compare simulated images to experimental images recorded of the Tobacco Mosaic Virus (TMV) in vitreous ice. The simulated and experimental images show good agreement with respect to contrast variations depending on dose and defocus. Furthermore, random fluctuations present in experimental and simulated images exhibit similar statistical properties. The simulator has been designed to provide a platform for development of new instrumentation and image processing procedures in single-particle electron microscopy, two-dimensional crystallography and electron tomography, with well-documented protocols and an open source code into which new improvements and extensions are easily incorporated. © 2011 The Authors Journal of Microscopy © 2011 Royal Microscopical Society.

  16. Quantitative Technique for Comparing Simulant Materials through Figures of Merit

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; Hoelzer, Hans; Fourroux, Kathy; Owens, Charles; McLemore, Carole; Fikes, John

    2007-01-01

    The 1989 workshop report entitled Workshop on Production and Uses of Simulated Lunar Materials and the NASA Technical Publication Lunar Regolith Simulant Materials: Recommendations for Standardization, Production, and Usage both identified and reinforced a need for a set of standards and requirements for the production and usage of Lunar simulant materials. As NASA prepares to return to the Moon, and set out for Mars, a set of early requirements has been developed for simulant materials, and the initial methods to produce and measure those simulants have been defined. Addressed in the requirements document are: 1) a method for evaluating the quality of any simulant of a regolith, 2) the minimum characteristics for simulants of Lunar regolith, and 3) a method to produce simulants needed for NASA's Exploration mission. As an extension of the requirements document, a method to evaluate new and current simulants has been rigorously defined through the mathematics of Figures of Merit (FoM). Requirements and techniques have been developed that allow the simulant provider to compare their product to a standard reference material through Figures of Merit. Standard reference material may be physical material, such as the Apollo core samples, or material properties predicted for any landing site. The simulant provider is not restricted to providing a single "high fidelity" simulant, which may be costly to produce. The provider can now develop "lower fidelity" simulants for engineering applications such as drilling and mobility.
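
    The report defines its Figures of Merit rigorously; purely as an illustration of the idea, the sketch below scores how closely a simulant's composition distribution overlaps a reference distribution, with 1 meaning identical and 0 disjoint. The overlap formula and the mineral fractions are assumptions for illustration, not the NASA FoM algebra.

        # Illustrative distribution-overlap score, not the official FoM definition.
        import numpy as np

        def figure_of_merit(simulant, reference):
            p = np.asarray(simulant, float); p /= p.sum()
            q = np.asarray(reference, float); q /= q.sum()
            return 1.0 - 0.5 * np.abs(p - q).sum()   # 1 = identical, 0 = disjoint

        # Hypothetical modal mineralogy (plagioclase, pyroxene, olivine, glass).
        reference = [0.45, 0.25, 0.10, 0.20]         # e.g. a reference regolith
        simulant = [0.50, 0.20, 0.10, 0.20]
        print(f"FoM = {figure_of_merit(simulant, reference):.3f}")   # FoM = 0.950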

  17. iCrowd: agent-based behavior modeling and crowd simulator

    NASA Astrophysics Data System (ADS)

    Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.

    2016-05-01

    Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos, has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing the latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high performance and stability. Its primary goal is to deliver an abstract platform to facilitate implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) Crowd behavior simulation during [in/out] door evacuation. (ii) Non-Player Character AI for Game-oriented applications and Gamification activities. (iii) Vessel traffic modeling and simulation for Maritime Security and Surveillance applications. (iv) Urban and Highway Traffic and Transportation Simulations. (v) Social Behavior Simulation and Modeling.

  18. Software for Engineering Simulations of a Spacecraft

    NASA Technical Reports Server (NTRS)

    Shireman, Kirk; McSwain, Gene; McCormick, Bernell; Fardelos, Panayiotis

    2005-01-01

    Spacecraft Engineering Simulation II (SES II) is a C-language computer program for simulating diverse aspects of operation of a spacecraft characterized by either three or six degrees of freedom. A functional model in SES can include a trajectory flight plan; a submodel of a flight computer running navigational and flight-control software; and submodels of the environment, the dynamics of the spacecraft, and sensor inputs and outputs. SES II features a modular, object-oriented programming style. SES II supports event-based simulations, which, in turn, create an easily adaptable simulation environment in which many different types of trajectories can be simulated by use of the same software. The simulation output consists largely of flight data. SES II can be used to perform optimization and Monte Carlo dispersion simulations. It can also be used to perform simulations for multiple spacecraft. In addition to its generic simulation capabilities, SES offers special capabilities for space-shuttle simulations: for this purpose, it incorporates submodels of the space-shuttle dynamics and a C-language version of the guidance, navigation, and control components of the space-shuttle flight software.

  19. Pre-simulation orientation for medical trainees: An approach to decrease anxiety and improve confidence and performance.

    PubMed

    Bommer, Cassidy; Sullivan, Sarah; Campbell, Krystle; Ahola, Zachary; Agarwal, Suresh; O'Rourke, Ann; Jung, Hee Soo; Gibson, Angela; Leverson, Glen; Liepert, Amy E

    2018-02-01

    We assessed the effect of basic orientation to the simulation environment on anxiety, confidence, and clinical decision making. Twenty-four graduating medical students participated in a two-week surgery preparatory curriculum, including three simulations. Baseline anxiety was assessed pre-course. Scenarios were completed on day 2 and day 9. Prior to the first simulation, participants were randomly divided into two groups. Only one group received a pre-simulation orientation. Before the second simulation, all students received the same orientation. Learner anxiety was reported immediately preceding and following each simulation. Confidence was assessed post-simulation. Performance was evaluated by surgical faculty. The oriented group experienced decreased anxiety following the first simulation (p = 0.003); the control group did not. Compared to the control group, the oriented group reported less anxiety and greater confidence and received higher performance scores following all three simulations (all p < 0.05). Pre-simulation orientation reduces anxiety while increasing confidence and improving performance. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Holistic Nursing Simulation: A Concept Analysis.

    PubMed

    Cohen, Bonni S; Boni, Rebecca

    2018-03-01

    Simulation as a technology and holistic nursing care as a philosophy are two components within nursing programs that have merged during the process of knowledge and skill acquisition in the care of patients as whole beings. Simulation provides opportunities to apply knowledge and skill through the use of simulators, standardized patients, and virtual settings. Concerns with simulation have been raised regarding the integration of the nursing process and recognition of the totality of the human being. Though simulation is useful as a technology, the nursing profession places importance on patient care, drawing on knowledge, theories, and expertise to deliver that care. There is a need to promptly and comprehensively define the concept of holistic nursing simulation to provide consistency and a basis for quality application within nursing curricula. This concept analysis uses Walker and Avant's approach to define holistic nursing simulation by defining antecedents, consequences, and empirical referents. The concept of holism and the practice of holistic nursing incorporated into simulation require an analysis of the concept of holistic nursing simulation, developing a language and model to provide direction for educators in the design and development of holistic nursing simulation.

  1. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost of a large-scale simulation. To improve the computational efficiency of large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABM and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on real demographic data from Saskatchewan, Canada. The first simulation used the SRA, which processed each postal code subregion in turn. The second simulation processed the entire population simultaneously. It was concluded that the parallelizable SRA showed computational time savings with comparable results in a province-wide simulation. Using the same method, the SRA can be generalized for performing a country-wide simulation. Thus, this parallel algorithm enables the possibility of using ABMs for large-scale simulation with limited computational resources.
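
    A minimal sketch of the region-wise parallelism, assuming agents only interact within their own subregion during a step: group agents by subregion and hand each group to a worker process. The field names and the toy infection rule are invented for illustration, not the paper's SRA.

        # Region-wise parallel step for an ABM; interactions are assumed local
        # to each subregion (e.g. a postal code) so regions can run concurrently.
        from collections import defaultdict
        from multiprocessing import Pool

        def simulate_region(item):
            region, agents = item
            infected = any(a["infected"] for a in agents)
            for a in agents:
                if infected and not a["infected"]:
                    a["infected"] = (a["id"] * 2654435761) % 100 < 5   # toy 5% rule
            return agents

        def step(population):
            by_region = defaultdict(list)
            for a in population:
                by_region[a["region"]].append(a)
            with Pool() as pool:        # call under `if __name__ == "__main__"`
                groups = pool.map(simulate_region, by_region.items())
            return [a for g in groups for a in g]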

  2. A “Skylight” Simulator for HWIL Simulation of Hyperspectral Remote Sensing

    PubMed Central

    Zhao, Huijie; Cui, Bolun; Li, Xudong; Zhang, Chao; Zhang, Xinyang

    2017-01-01

    Even though digital simulation technology has been widely used in the last two decades, hardware-in-the-loop (HWIL) simulation is still an indispensable method for spectral uncertainty research of ground targets. However, previous facilities mainly focus on the simulation of panchromatic imaging. Therefore, neither the spectral nor the spatial performance is enough for hyperspectral simulation. To improve the accuracy of illumination simulation, a new dome-like skylight simulator is designed and developed to fit the spatial distribution and spectral characteristics of a real skylight for the wavelength from 350 nm to 2500 nm. The simulator’s performance was tested using a spectroradiometer with different accessories. The spatial uniformity is greater than 0.91. The spectral mismatch decreases to 1/243 of the spectral mismatch of the Imagery Simulation Facility (ISF). The spatial distribution of radiance can be adjusted, and the accuracy of the adjustment is greater than 0.895. The ability of the skylight simulator is also demonstrated by comparing radiometric quantities measured in the skylight simulator with those in a real skylight in Beijing. PMID:29211004

  3. Development of a device to simulate tooth mobility.

    PubMed

    Erdelt, Kurt-Jürgen; Lamper, Timea

    2010-10-01

    The testing of new materials under simulation of oral conditions is essential in medicine. For the simulation of fracture strength, different simulation devices are used in test set-ups. The results of these in vitro tests differ because there is no standardization of tooth mobility in simulation devices. The aim of this study is to develop a simulation device that depicts the tooth mobility curve as accurately as possible and creates reproducible and scalable mobility curves. With the aid of published literature and with the help of dentists, average forms of the tooth classes were generated. Based on these tooth data, different abutment tooth shapes and different simulation devices were designed with a CAD system and were produced with a rapid prototyping system. Then, for all simulation devices, the displacement curves were created with a universal testing machine and compared with the tooth mobility curve. With this new information, an improved, adapted simulation device was constructed. A simulation device was developed that is able to simulate the mobility curve of natural teeth with high accuracy, and whose mobility is reproducible and scalable.

  4. Prototype software model for designing intruder detection systems with simulation

    NASA Astrophysics Data System (ADS)

    Smith, Jeffrey S.; Peters, Brett A.; Curry, James C.; Gupta, Dinesh

    1998-08-01

    This article explores using discrete-event simulation for the design and control of defence-oriented, fixed-sensor-based detection systems in a facility housing items of significant interest to enemy forces. The key issues discussed include software development, simulation-based optimization within a modeling framework, and the expansion of the framework to create real-time control tools and training simulations. The software discussed in this article is a flexible simulation environment in which the data for the simulation are stored in an external database and the simulation logic is implemented using a commercial simulation package. The simulation assesses the overall security level of a building against various intruder scenarios. A series of simulation runs with different inputs can determine the change in security level with changes in the sensor configuration, building layout, and intruder/guard strategies. In addition, the simulation model developed for the design stage of the project can be modified to produce a control tool for the testing, training, and real-time control of systems with humans and sensor hardware in the loop.

  5. Simulation in surgery: a review.

    PubMed

    Tan, Shaun Shi Yan; Sarker, Sudip K

    2011-05-01

    The ability to acquire surgical skills requires consistent practice, and evidence suggests that many of these technical skills can be learnt away from the operating theatre. The aim of this review article is to discuss the importance of surgical simulation today and its various types, exploring the effectiveness of simulation in the clinical setting and its challenges for the future. Surgical simulation offers the opportunity for trainees to practise their surgical skills prior to entering the operating theatre, allowing detailed feedback and objective assessment of their performance. This enables better patient safety and standards of care. Surgical simulators can be divided into organic or inorganic simulators. Organic simulators, consisting of live animal and fresh human cadaver models, are considered to be of high-fidelity. Inorganic simulators comprise virtual reality simulators and synthetic bench models. Current evidence suggests that skills acquired through training with simulators, positively transfers to the clinical setting and improves operative outcome. The major challenge for the future revolves around understanding the value of this new technology and developing an educational curriculum that can incorporate surgical simulators.

  6. Enhancing the Simulation Speed of Sensor Network Applications by Asynchronization of Interrupt Service Routines

    PubMed Central

    Joe, Hyunwoo; Woo, Duk-Kyun; Kim, Hyungshin

    2013-01-01

    Sensor network simulations require high fidelity and timing accuracy to be used as an implementation and evaluation tool. The cycle-accurate, instruction-level simulator is the known solution for these purposes. However, this type of simulation incurs a high computation cost since it has to model not only the instruction-level behavior but also the synchronization between multiple sensors for their causality. This paper presents a novel technique that exploits asynchronous simulation of interrupt service routines (ISRs). We can avoid the synchronization overheads when the interrupt service routines are simulated without preemption. If causality errors occur, we devise a rollback procedure to restore the original synchronized simulation. This concept can be extended to any instruction-level sensor network simulator. Evaluation results show our method can enhance the simulation speed by up to 52% in our experiments. For applications with longer interrupt service routines and a smaller number of preemptions, the speedup becomes greater. In addition, our simulator is 2 to 11 times faster than a well-known sensor network simulator. PMID:23966200

  7. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in a method of performing object-oriented simulation and a system having interconnected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.

  8. Implementation of extended Lagrangian dynamics in GROMACS for polarizable simulations using the classical Drude oscillator model.

    PubMed

    Lemkul, Justin A; Roux, Benoît; van der Spoel, David; MacKerell, Alexander D

    2015-07-15

    Explicit treatment of electronic polarization in empirical force fields used for molecular dynamics simulations represents an important advancement in simulation methodology. A straightforward means of treating electronic polarization in these simulations is the inclusion of Drude oscillators, which are auxiliary, charge-carrying particles bonded to the cores of atoms in the system. The additional degrees of freedom make these simulations more computationally expensive relative to simulations using traditional fixed-charge (additive) force fields. Thus, efficient tools are needed for conducting these simulations. Here, we present the implementation of highly scalable algorithms in the GROMACS simulation package that allow for the simulation of polarizable systems using extended Lagrangian dynamics with a dual Nosé-Hoover thermostat as well as simulations using a full self-consistent field treatment of polarization. The performance of systems of varying size is evaluated, showing that the present code parallelizes efficiently and is the fastest implementation of the extended Lagrangian methods currently available for simulations using the Drude polarizable force field. © 2015 Wiley Periodicals, Inc.
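
    For intuition, the essentials of the Drude model fit in a few lines: an auxiliary charge q_D tethered to the atomic core by a spring of stiffness k, so that in the self-consistent limit the displacement balances the field and the atomic polarizability is q_D^2/k. Units and parameter values below are arbitrary illustrations, not actual Drude force-field parameters.

        # Classical Drude oscillator essentials (illustrative units and values).
        import numpy as np

        def drude_self_energy(r_core, r_drude, k):
            d = r_drude - r_core
            return 0.5 * k * np.dot(d, d)        # harmonic tether energy

        def induced_dipole(r_core, r_drude, q_d):
            return q_d * (r_drude - r_core)      # mu = q_D * d

        # Self-consistent-field limit: k * d = q_D * E  =>  d = q_D * E / k,
        # so the effective polarizability is alpha = q_D**2 / k.
        q_d, k = -1.0, 1000.0
        E = np.array([0.0, 0.0, 0.01])
        d_scf = q_d * E / k
        alpha = q_d ** 2 / k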

  9. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

    In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, we present a systematic review of the current models of simulation and discuss the state-of-the-art and future directions for simulation in neurosurgery. Retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, proving high-fidelity patient-specific models to complement residency surgical learning.

  10. Simulation of networks of spiking neurons: A review of tools and strategies

    PubMed Central

    Brette, Romain; Rudolph, Michelle; Carnevale, Ted; Hines, Michael; Beeman, David; Bower, James M.; Diesmann, Markus; Morrison, Abigail; Goodman, Philip H.; Harris, Frederick C.; Zirpe, Milind; Natschläger, Thomas; Pecevski, Dejan; Ermentrout, Bard; Djurfeldt, Mikael; Lansner, Anders; Rochel, Olivier; Vieville, Thierry; Muller, Eilif; Davison, Andrew P.; El Boustani, Sami

    2009-01-01

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with an aim to allow the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin–Huxley type, integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models are implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks. PMID:17629781
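
    As a taste of the clock-driven, current-based benchmark class the review describes, the sketch below advances a small leaky integrate-and-fire network on a fixed time step. Parameters are generic textbook values, not those of the review's benchmark suite.

        # Clock-driven LIF network with instantaneous current-based synapses.
        import numpy as np

        rng = np.random.default_rng(0)
        n, dt, t_end = 100, 1e-4, 0.5               # neurons, step (s), run (s)
        tau, v_rest, v_th, v_reset = 20e-3, -70e-3, -50e-3, -65e-3
        w = rng.normal(0.0, 0.5e-3, (n, n))         # weights, volts per spike
        v = np.full(n, v_rest)
        drive = 22e-3                               # tonic drive above threshold

        spikes = []
        for step in range(int(t_end / dt)):
            fired = v >= v_th
            spikes.extend((step * dt, i) for i in np.where(fired)[0])
            v[fired] = v_reset
            v += (dt / tau) * (v_rest - v + drive) + w @ fired.astype(float)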

  11. Hybrid Eulerian and Lagrangian Simulation of Steep and Breaking Waves and Surface Fluxes in High Winds

    DTIC Science & Technology

    2010-09-30

    [Abstract recovered only in fragments from the report text] The project develops an advanced simulation tool for violent, multi-fluid free-surface flows, combining several Eulerian and Lagrangian methods for free-surface turbulence and wave simulation, and shows the importance of wave breaking in energy transport; the coupled WIND–SNOW solver is used for the wind-wave simulations.

  12. Integrating simulation training into the nursing curriculum.

    PubMed

    Wilford, Amanda; Doyle, Thomas J

    The use of simulation is gaining momentum in nurse education across the UK. The Nursing and Midwifery Council is currently investigating the use of simulation in pre-registration nursing. This article gives a brief history of simulation, discusses competence issues and why simulation is best placed to teach nurses in today's health service. An innovative approach to implementing simulation into the nursing curriculum is introduced.

  13. Military Training: Observations on Efforts to Prepare Personnel to Survive Helicopter Crashes into Water

    DTIC Science & Technology

    2014-07-14

    [Table fragment recovered from the report] Air Force environmental-conditions simulation equipment: equipment that simulates conditions such as waves, wind, rain, thunder, lightning, and combat sounds, including items such as wave generators, heavy-duty fans to simulate high winds, strobe lights to simulate lightning, and water spray and injection systems.

  14. Effects of water-management alternatives on streamflow in the Ipswich River basin, Massachusetts

    USGS Publications Warehouse

    Zarriello, Philip J.

    2001-01-01

    Management alternatives that could help mitigate the effects of water withdrawals on streamflow in the Ipswich River Basin were evaluated by simulation with a calibrated Hydrologic Simulation Program--Fortran (HSPF) model. The effects of management alternatives on streamflow were simulated for a 35-year period (1961-95). Most alternatives examined increased low flows compared to the base simulation of average 1989-93 withdrawals. Only the simulation of no septic-effluent inflow, and the simulation of a 20-percent increase in withdrawals, further lowered flows or caused the river to stop flowing for longer periods of time than the simulation of average 1989-93 withdrawals. Simulations of reduced seasonal withdrawals by 20 percent, and by 50 percent, resulted in a modest increase in low flow in a critical habitat reach (model reach 8 near the Reading town well field); log-Pearson Type III analysis of simulated daily-mean flow indicated that under these reduced withdrawals, model reach 8 would stop flowing for a period of seven consecutive days about every other year, whereas under average 1989-93 withdrawals this reach would stop flowing for a seven-consecutive-day period almost every year. Simulations of no seasonal withdrawals, and simulations that stopped streamflow depletion when flow in model reach 19 was below 22 cubic feet per second, indicated flow would be maintained in model reach 8 at all times. Simulations indicated wastewater-return flows would augment low flow in proportion to the rate of return flow. Simulations of a 1.5 million gallons per day return flow rate indicated model reach 8 would stop flowing for a period of seven consecutive days about once every 5 years; simulated return flow rates of 1.1 million gallons per day indicated that model reach 8 would stop flowing for a period of seven consecutive days about every other year. Simulation of reduced seasonal withdrawals, combined with no septic effluent return flow, indicated only a slight increase in low flow compared to low flows simulated under average 1989-93 withdrawals. Simulation of reduced seasonal withdrawal, combined with 2.6 million gallons per day wastewater-return flows, provided more flow in model reach 8 than that simulated under no withdrawals.
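
    The recurrence statements above come from frequency analysis of annual 7-day low flows. A sketch of that post-processing step, assuming a hypothetical array of simulated daily flows and using scipy's Pearson III distribution on log-transformed flows (the zero-flow threshold is an illustrative stand-in):

        # 7-day low-flow frequency sketch (log-Pearson Type III).
        import numpy as np
        from scipy import stats

        def seven_day_low_flows(daily_flow, days_per_year=365):
            smooth = np.convolve(daily_flow, np.ones(7) / 7, mode='valid')
            n_years = len(smooth) // days_per_year
            return np.array([smooth[y * days_per_year:(y + 1) * days_per_year].min()
                             for y in range(n_years)])

        lows = seven_day_low_flows(simulated_daily_flow)   # hypothetical model output
        log_lows = np.log10(np.clip(lows, 1e-3, None))     # guard zero-flow days
        skew, loc, scale = stats.pearson3.fit(log_lows)

        # Probability that the annual 7-day low flow is (near) zero, and the
        # corresponding average recurrence interval in years.
        p_dry = stats.pearson3.cdf(np.log10(1e-3), skew, loc, scale)
        recurrence = 1.0 / max(p_dry, 1e-9)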

  15. Simulation System Fidelity Assessment at the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Beard, Steven D.; Reardon, Scott E.; Tobias, Eric L.; Aponso, Bimal L.

    2013-01-01

    Fidelity is a word that is often used but rarely understood when talking about ground-based simulation. Assessing the cueing fidelity of a ground-based flight simulator requires a comparison to actual flight data, either directly or indirectly. Two experiments were conducted at the Vertical Motion Simulator using the GenHel UH-60A Black Hawk helicopter math model that was directly compared to flight data. Prior to the experiment, the simulator's motion and visual system frequency responses were measured, the aircraft math model was adjusted to account for the simulator motion system delays, and the motion system gains and washouts were tuned for the individual tasks. The tuned motion system fidelity was then assessed against the modified Sinacori criteria. The first experiment showed handling qualities ratings (HQRs) similar to actual flight for bob-up and sidestep maneuvers. The second experiment showed equivalent HQRs between flight and simulation for the ADS-33 slalom maneuver for the two pilot participants. The ADS-33 vertical maneuver HQRs were mixed, with one pilot rating the flight and simulation the same while the second pilot rated the simulation worse. In addition to recording HQRs in the second experiment, an experimental Simulation Fidelity Rating (SFR) scale developed by the University of Liverpool was tested for applicability to engineering simulators. A discussion of the SFR scale for use on the Vertical Motion Simulator is included in this paper.

  16. NOTE: Implementation of angular response function modeling in SPECT simulations with GATE

    NASA Astrophysics Data System (ADS)

    Descourt, P.; Carlier, T.; Du, Y.; Song, X.; Buvat, I.; Frey, E. C.; Bardies, M.; Tsui, B. M. W.; Visvikis, D.

    2010-05-01

    Among Monte Carlo simulation codes in medical imaging, the GATE simulation platform is widely used today given its flexibility and accuracy, despite long run times, which in SPECT simulations are mostly spent in tracking photons through the collimators. In this work, a tabulated model of the collimator/detector response was implemented within the GATE framework to significantly reduce the simulation times in SPECT. This implementation uses the angular response function (ARF) model. The performance of the implemented ARF approach has been compared to standard SPECT GATE simulations in terms of the ARF tables' accuracy, overall SPECT system performance and run times. Considering the simulation of the Siemens Symbia T SPECT system using high-energy collimators, differences of less than 1% were measured between the ARF-based and the standard GATE-based simulations, while considering the same noise level in the projections, acceleration factors of up to 180 were obtained when simulating a planar 364 keV source seen with the same SPECT system. The ARF-based and the standard GATE simulation results also agreed very well when considering a four-head SPECT simulation of a realistic Jaszczak phantom filled with iodine-131, with a resulting acceleration factor of 100. In conclusion, the implementation of an ARF-based model of collimator/detector response for SPECT simulations within GATE significantly reduces the simulation run times without compromising accuracy.
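
    The gist of the ARF acceleration can be shown with a toy table: rather than tracking each photon through the collimator septa, look up a precomputed detection weight as a function of incidence angle (real ARF tables also depend on energy and position). The Gaussian-shaped table below is invented for illustration.

        # Tabulated angular-response lookup in place of collimator tracking.
        import numpy as np

        theta_grid = np.linspace(0.0, np.radians(5.0), 50)        # incidence angle
        arf_table = np.exp(-(theta_grid / np.radians(1.0)) ** 2)  # toy response

        def arf_weight(direction, detector_normal):
            cos_t = np.clip(np.dot(direction, detector_normal), -1.0, 1.0)
            theta = np.arccos(abs(cos_t))
            return np.interp(theta, theta_grid, arf_table, right=0.0)

        w = arf_weight(np.array([0.0175, 0.0, 0.9998]),   # ~1 degree off-axis
                       np.array([0.0, 0.0, 1.0]))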

  17. Performance evaluation of an agent-based occupancy simulation model

    DOE PAGES

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing; ...

    2017-01-17

    Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types weremore » first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.« less

  18. Numerical simulation and characterization of trapping noise in InGaP-GaAs heterojunctions devices at high injection

    NASA Astrophysics Data System (ADS)

    Nallatamby, Jean-Christophe; Abdelhadi, Khaled; Jacquet, Jean-Claude; Prigent, Michel; Floriot, Didier; Delage, Sylvain; Obregon, Juan

    2013-03-01

    Commercially available simulators present considerable advantages in performing accurate DC, AC and transient simulations of semiconductor devices, including many fundamental and parasitic effects that are not generally taken into account in in-house simulators. Nevertheless, while the publicly available TCAD simulators we have tested give accurate results for the simulation of diffusion noise, none of them simulates trap-assisted GR noise accurately. In order to overcome this problem we propose a robust solution to accurately simulate GR noise due to traps. It is based on numerical processing of the output data of one of the publicly available simulators, namely SENTAURUS (from Synopsys). We have linked together, through a dedicated Data Access Component (DAC), the deterministic output data available from SENTAURUS and a powerful, customizable post-processing tool developed on the mathematical SCILAB software package. Thus, robust simulations of GR noise in semiconductor devices can be performed by using GR Langevin sources associated with the scalar Green function responses of the device. Our method takes advantage of the accuracy of the deterministic simulations of electronic devices obtained with SENTAURUS. A comparison between 2-D simulations and measurements of low-frequency noise on InGaP-GaAs heterojunctions, at low as well as high injection levels, demonstrates the validity of the proposed simulation tool.
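
    For reference, the trap-assisted GR noise that such Langevin sources represent is, for a single trap level, conventionally described by a Lorentzian spectrum. In the standard textbook form (stated here as background, not taken from this record),

      S_{\delta n}(f) = 4\,\overline{\delta n^{2}}\,\frac{\tau}{1+(2\pi f\tau)^{2}},

    where \tau is the trap capture/emission time constant and \overline{\delta n^{2}} is the variance of the carrier-number fluctuation.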

  19. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework.

    PubMed

    Tay, Charison; Khajuria, Ankur; Gupte, Chinmay

    2014-01-01

    Traditional orthopaedic training has followed an apprenticeship model whereby trainees enhance their skills by operating under guidance. However, the introduction of limitations on training hours and shorter training programmes means that alternative training strategies are required. The aims were to perform a literature review on simulation training in arthroscopy and to devise a framework that structures the different simulation techniques that could be used in arthroscopic training. A systematic search of Medline, Embase, Google Scholar and the Cochrane Databases was performed. Search terms included "virtual reality OR simulator OR simulation" and "arthroscopy OR arthroscopic". Fourteen studies evaluating simulators in knee, shoulder and hip arthroscopy were included. The majority of the studies demonstrated construct and transference validity, but only one showed concurrent validity. More studies are required to assess the simulators' potential as training and assessment tools, skills transference between simulators, and the extent of skills decay from prolonged delays in training. We also devised a "ladder of arthroscopic simulation" that provides a competency-based framework for implementing different simulation strategies. The incorporation of simulation into an orthopaedic curriculum will depend on a coordinated approach between many bodies, but the successful integration of simulators in other areas of surgery supports a possible role for simulation in advancing orthopaedic education. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  1. Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner

    NASA Astrophysics Data System (ADS)

    Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.

    2015-02-01

    Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.
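
    For context, the scatter fraction measured by the NEMA NU-2 protocol referenced here is conventionally defined (stated as background, not from this record) as

      \mathrm{SF} = \frac{C_{\mathrm{scatter}}}{C_{\mathrm{scatter}} + C_{\mathrm{true}}},

    the ratio of scattered coincidences to total (scattered plus true) coincidences, one of the quantities compared between SimSET, GATE and experiment above.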

  2. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  3. Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Malsbury, T.; Atencio, A., Jr.

    1992-01-01

    A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.

  4. Computer-generated forces in distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Petty, Mikel D.

    1995-04-01

    Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.

  5. An electrical circuit model for simulation of indoor radon concentration.

    PubMed

    Musavi Nasab, S M; Negarestani, A

    2013-01-01

    In this study, a new model based on electric circuit theory was introduced to simulate the behaviour of indoor radon concentration. In this model, a voltage source simulates radon generation in walls, conductivity simulates migration through walls and the voltage across a capacitor simulates the radon concentration in a room. This simulation considers migration of radon through walls by a diffusion mechanism in one-dimensional geometry. Data reported for a typical Greek house were employed to examine the application of this simulation technique to the behaviour of radon.
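
    Read literally, the analogy implies the familiar first-order charging law for the capacitor voltage, i.e., for the indoor concentration approaching its steady state (a minimal reading of the model, assuming a single effective RC time constant for wall migration):

      V_C(t) = V_S\left(1 - e^{-t/RC}\right),

    with V_S the source (generation) term; the mapping of R and C onto the diffusion parameters is as defined by the authors.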

  6. An Innovative and Successful Simulation Day.

    PubMed

    Bowling, Ann M; Eismann, Michelle

    This article discusses the development of a creative and innovative plan to incorporate independent activities, including skill reviews and scenarios, into a single eight-hour day, using small student groups to enhance the learning process for pediatric nursing students. The simulation day consists of skills activities and pediatric simulation scenarios using the human patient simulator. Using small student groups in simulation captures the students' attention and enhances motivation to learn. The simulation day is a work in progress; appropriate changes are continually being made to improve the simulation experience for students.

  7. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation of and procurement processes for simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of reduced variations limited to cost and customer service or to cost and technical stability decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the process of simulator purchase using a standardized framework. Sensitivity of the tool improved when factors extended beyond traditionally targeted factors. We propose that the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and applications of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Limits to high-speed simulations of spiking neural networks using general-purpose computers.

    PubMed

    Zenke, Friedemann; Gerstner, Wulfram

    2014-01-01

    To understand how the central nervous system performs computations using recurrent neuronal circuitry, simulations have become an indispensable tool for theoretical neuroscience. To study neuronal circuits and their ability to self-organize, increasing attention has been directed toward synaptic plasticity. In particular, spike-timing-dependent plasticity (STDP) creates specific demands for simulations of spiking neural networks. On the one hand, a high temporal resolution is required to capture the millisecond timescale of typical STDP windows. On the other hand, network simulations have to evolve over hours up to days, to capture the timescale of long-term plasticity. To do this efficiently, fast simulation speed is the crucial ingredient rather than large neuron numbers. Using different medium-sized network models consisting of several thousands of neurons and off-the-shelf hardware, we compare the simulation speed of the simulators Brian, NEST and Neuron as well as our own simulator Auryn. Our results show that real-time simulations of different plastic network models are possible in parallel simulations in which numerical precision is not a primary concern. Even so, the speed-up margin of parallelism is limited and boosting simulation speeds beyond one tenth of real-time is difficult. By profiling simulation code we show that the run times of typical plastic network simulations encounter a hard boundary. This limit is partly due to latencies in the inter-process communications and thus cannot be overcome by increased parallelism. Overall, these results show that to study plasticity in medium-sized spiking neural networks, adequate simulation tools are readily available which run efficiently on small clusters. However, to run simulations substantially faster than real-time, special hardware is a prerequisite.
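
    To illustrate why STDP forces millisecond resolution: a pair-based STDP rule keeps exponentially decaying pre- and postsynaptic traces and updates the weight at spike times. The sketch below uses illustrative textbook parameters, not those of Brian, NEST, Neuron or Auryn.

      # Pair-based STDP with exponential traces (illustrative parameters).
      tau_pre, tau_post = 20.0, 20.0    # trace time constants (ms)
      a_plus, a_minus = 0.01, 0.012     # potentiation / depression amplitudes
      dt = 0.1                          # simulation time step (ms)

      x_pre = x_post = 0.0              # synaptic traces
      w = 0.5                           # synaptic weight

      def step(pre_spike: bool, post_spike: bool):
          """Advance traces and weight by one time step."""
          global x_pre, x_post, w
          x_pre += -dt / tau_pre * x_pre        # Euler decay of traces
          x_post += -dt / tau_post * x_post
          if pre_spike:
              x_pre += 1.0
              w -= a_minus * x_post             # pre after post -> depression
          if post_spike:
              x_post += 1.0
              w += a_plus * x_pre               # post after pre -> potentiation
          w = min(max(w, 0.0), 1.0)

      # Pre spike at t=0, post spike 5 ms later -> net potentiation.
      step(True, False)
      for _ in range(49):
          step(False, False)
      step(False, True)
      print(round(w, 4))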

  9. Simulating adverse event spontaneous reporting systems as preferential attachment networks: application to the Vaccine Adverse Event Reporting System.

    PubMed

    Scott, J; Botsis, T; Ball, R

    2014-01-01

    Spontaneous Reporting Systems [SRS] are critical tools in the post-licensure evaluation of medical product safety. Regulatory authorities use a variety of data mining techniques to detect potential safety signals in SRS databases. Assessing the performance of such signal detection procedures requires simulated SRS databases, but simulation strategies proposed to date each have limitations. We sought to develop a novel SRS simulation strategy based on plausible mechanisms for the growth of databases over time. We developed a simulation strategy based on the network principle of preferential attachment. We demonstrated how this strategy can be used to create simulations based on specific databases of interest, and provided an example of using such simulations to compare signal detection thresholds for a popular data mining algorithm. The preferential attachment simulations were generally structurally similar to our targeted SRS database, although they had fewer nodes of very high degree. The approach was able to generate signal-free SRS simulations, as well as mimicking specific known true signals. Explorations of different reporting thresholds for the FDA Vaccine Adverse Event Reporting System suggested that using proportional reporting ratio [PRR] > 3.0 may yield better signal detection operating characteristics than the more commonly used PRR > 2.0 threshold. The network analytic approach to SRS simulation based on the principle of preferential attachment provides an attractive framework for exploring the performance of safety signal detection algorithms. This approach is potentially more principled and versatile than existing simulation approaches. The utility of network-based SRS simulations needs to be further explored by evaluating other types of simulated signals with a broader range of data mining approaches, and comparing network-based simulations with other simulation strategies where applicable.
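
    For reference, the proportional reporting ratio on which the thresholds above are defined is computed from the standard 2 x 2 report contingency table; a minimal sketch with illustrative counts:

      # a: reports with the product of interest AND the event of interest
      # b: reports with the product of interest, other events
      # c: reports with other products AND the event of interest
      # d: reports with other products, other events
      def prr(a: int, b: int, c: int, d: int) -> float:
          """PRR = [a / (a + b)] / [c / (c + d)]."""
          return (a / (a + b)) / (c / (c + d))

      # Illustrative counts only:
      value = prr(a=30, b=970, c=80, d=8920)
      print(f"PRR = {value:.2f}; signal at >2.0: {value > 2.0}; at >3.0: {value > 3.0}")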

  10. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.

  11. Translational simulation: not 'where?' but 'why?' A functional view of in situ simulation.

    PubMed

    Brazil, Victoria

    2017-01-01

    Healthcare simulation has been widely adopted for health professional education at all stages of training and practice and across cognitive, procedural, communication and teamwork domains. Recent enthusiasm for in situ simulation-delivered in the real clinical environment-cites improved transfer of knowledge and skills into real-world practice, as well as opportunities to identify latent safety threats and other workplace-specific issues. However, describing simulation type according to place may not be helpful. Instead, I propose the term translational simulation as a functional term for how simulation may be connected directly with health service priorities and patient outcomes, through interventional and diagnostic functions, independent of the location of the simulation activity.

  12. Mosquito population dynamics from cellular automata-based simulation

    NASA Astrophysics Data System (ADS)

    Syafarina, Inna; Sadikin, Rifki; Nuraini, Nuning

    2016-02-01

    In this paper we present an innovative model for simulating mosquito-vector population dynamics. The simulation consists of two stages: demography and dispersal dynamics. For the demography simulation, we follow an existing model of the mosquito life cycle. For dispersal, we use a cellular automata-based model of the vector. In the simulation, each individual vector is able to move to other grid cells via a random walk. Our model is also capable of representing an immunity factor for each grid cell. We simulate the model to evaluate its correctness. Based on the simulations, we conclude that the model behaves correctly; however, it needs to be calibrated with realistic parameters to match real data.
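
    A minimal sketch of the dispersal stage, assuming a simple 4-neighbour random walk on a wrapping grid (grid size and population are illustrative, not the paper's parameters):

      import random

      SIZE = 50
      moves = [(0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)]  # step or stay

      def disperse(positions, rng=random):
          """One dispersal step: every mosquito moves to a random neighbouring cell."""
          new_positions = []
          for x, y in positions:
              dx, dy = rng.choice(moves)
              new_positions.append(((x + dx) % SIZE, (y + dy) % SIZE))  # wrap at edges
          return new_positions

      population = [(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(200)]
      for _ in range(100):            # 100 time steps of dispersal
          population = disperse(population)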

  13. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both a steady and a perturbed state must be represented. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  14. The ATLAS Simulation Infrastructure

    DOE PAGES

    Aad, G.; Abbott, B.; Abdallah, J.; ...

    2010-09-25

    The simulation software for the ATLAS Experiment at the Large Hadron Collider is being used for large-scale production of events on the LHC Computing Grid. This simulation requires many components, from the generators that simulate particle collisions, through packages simulating the response of the various detectors and triggers. All of these components come together under the ATLAS simulation infrastructure. In this paper, that infrastructure is discussed, including that supporting the detector description, interfacing the event generation, and combining the GEANT4 simulation of the response of the individual detectors. Also described are the tools allowing the software validation, performance testing, and the validation of the simulated output against known physics processes.

  15. Microgrid and Inverter Control and Simulator Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-13

    A collection of software that can simulate the operation of an inverter on a microgrid or control a real inverter. In addition, it can simulate the control of multiple nodes on a microgrid. Application: simulation of inverters and microgrids; control of inverters on microgrids. The MMI submodule is designed to control custom inverter hardware, and to simulate that hardware. The INVERTER submodule is only the simulator code, and is of an earlier generation than the simulator in MMI. The MICROGRID submodule is an agent-based simulator of multiple nodes on a microgrid which presents a web interface. The WIND submodule produces movies of wind data with a web interface.

  16. COCOA: Simulating Observations of Star Cluster Simulations

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Dalessandro, Emanuele

    2017-03-01

    COCOA (Cluster simulatiOn Comparison with ObservAtions) creates idealized mock photometric observations using results from numerical simulations of star cluster evolution. COCOA is able to present the output of realistic numerical simulations of star clusters carried out using Monte Carlo or N-body codes in a way that is useful for direct comparison with photometric observations. The code can simulate optical observations from simulation snapshots in which positions and magnitudes of objects are known. The parameters for simulating the observations can be adjusted to mimic telescopes of various sizes. COCOA also has a photometry pipeline that can use standalone versions of DAOPHOT (ascl:1104.011) and ALLSTAR to produce photometric catalogs for all observed stars.
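
    Conceptually, producing an idealized mock observation from a snapshot in which positions and magnitudes are known reduces to rendering point sources with a chosen PSF and photometric zero point onto a pixel grid. The sketch below illustrates that step only (field size, PSF width and zero point are arbitrary placeholders), not COCOA's pipeline:

      import numpy as np

      # Render a snapshot (positions, magnitudes) as Gaussian PSFs on a grid.
      N, FWHM, ZP = 512, 3.0, 25.0
      sigma = FWHM / 2.3548                      # FWHM -> Gaussian sigma
      yy, xx = np.mgrid[0:N, 0:N]

      rng = np.random.default_rng(0)
      stars = rng.uniform(0, N, size=(100, 2))   # star x, y positions (pixels)
      mags = rng.uniform(18.0, 24.0, size=100)   # apparent magnitudes

      image = np.zeros((N, N))
      for (x, y), m in zip(stars, mags):
          flux = 10.0 ** (-0.4 * (m - ZP))       # counts via the zero point
          image += flux * np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma ** 2))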

  17. An Orion/Ares I Launch and Ascent Simulation: One Segment of the Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Chung, Victoria I.; Crues, Edwin Z.; Blum, Mike G.; Alofs, Cathy; Busto, Juan

    2007-01-01

    This paper describes the architecture and implementation of a distributed launch and ascent simulation of NASA's Orion spacecraft and Ares I launch vehicle. This simulation is one segment of the Distributed Space Exploration Simulation (DSES) Project. The DSES project is a research and development collaboration between NASA centers which investigates technologies and processes for distributed simulation of complex space systems in support of NASA's Exploration Initiative. DSES is developing an integrated end-to-end simulation capability to support NASA development and deployment of new exploration spacecraft and missions. This paper describes the first in a collection of simulation capabilities that DSES will support.

  18. Cosimulation of embedded system using RTOS software simulator

    NASA Astrophysics Data System (ADS)

    Wang, Shihao; Duan, Zhigang; Liu, Mingye

    2003-09-01

    Embedded system design often employs co-simulation to verify a system's function; one efficient software verification tool is the Instruction Set Simulator (ISS). As a full functional model of the target CPU, an ISS interprets the instructions of the embedded software step by step, which is usually time-consuming since it simulates at a low level. Hence the ISS often becomes the bottleneck of co-simulation in a complicated system. In this paper, a new software verification tool, the RTOS software simulator (RSS), is presented, and the mechanism of its operation is described in full detail. In the RSS method, the RTOS API is extended and a hardware simulator driver is adopted to deal with data exchange and synchronization between the two simulators.

  19. Gravitational Reference Sensor Front-End Electronics Simulator for LISA

    NASA Astrophysics Data System (ADS)

    Meshksar, Neda; Ferraioli, Luigi; Mance, Davor; ten Pierick, Jan; Zweifel, Peter; Giardini, Domenico; LISA Pathfinder collaboration

  20. Medical Simulation Practices 2010 Survey Results

    NASA Technical Reports Server (NTRS)

    McCrindle, Jeffrey J.

    2011-01-01

    Medical Simulation Centers are an essential component of our learning infrastructure to prepare doctors and nurses for their careers. Unlike the military and aerospace simulation industry, very little has been published regarding the best practices currently in use within medical simulation centers. This survey attempts to provide insight into the current simulation practices at medical schools, hospitals, university nursing programs and community college nursing programs. Students within the MBA program at Saint Joseph's University conducted a survey of medical simulation practices during the summer 2010 semester. A total of 115 institutions responded to the survey. The survey results discuss the overall effectiveness of current simulation centers as well as the tools and techniques used to conduct the simulation activity.

  1. Development of IR imaging system simulator

    NASA Astrophysics Data System (ADS)

    Xiang, Xinglang; He, Guojing; Dong, Weike; Dong, Lu

    2017-02-01

    To overcome the disadvantages of traditional semi-physical simulation and injection simulation equipment in the performance evaluation of infrared imaging systems (IRIS), a low-cost and reconfigurable IRIS simulator, which can simulate the realistic physical process of infrared imaging, is proposed to test and evaluate the performance of the IRIS. According to the theoretical simulation framework and the theoretical models of the IRIS, the architecture of the IRIS simulator is constructed. The 3D scenes are generated and the infrared atmospheric transmission effects are simulated in real time on the computer using OGRE technology. The physical effects of the IRIS are classified as the signal response characteristic, modulation transfer characteristic and noise characteristic, and they are simulated in real time on a single-board signal processing platform built around an FPGA core processor using a high-speed parallel computation method.

  2. Acoustic Parametric Array for Identifying Standoff Targets

    NASA Astrophysics Data System (ADS)

    Hinders, M. K.; Rudd, K. E.

    2010-02-01

    An integrated simulation method for investigating nonlinear sound beams and 3D acoustic scattering from any combination of complicated objects is presented. A standard finite-difference simulation method is used to model pulsed nonlinear sound propagation from a source to a scattering target via the KZK equation. Then, a parallel 3D acoustic simulation method based on the finite integration technique is used to model the acoustic wave interaction with the target. Any combination of objects and material layers can be placed into the 3D simulation space to study the resulting interaction. Several example simulations are presented to demonstrate the simulation method and 3D visualization techniques. The combined simulation method is validated by comparing experimental and simulation data and a demonstration of how this combined simulation method assisted in the development of a nonlinear acoustic concealed weapons detector is also presented.
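
    For reference, the KZK equation used for the nonlinear beam stage is commonly written, in retarded time \tau = t - z/c_0, as

      \frac{\partial^{2} p}{\partial z\,\partial\tau}
        = \frac{c_{0}}{2}\,\nabla_{\perp}^{2} p
        + \frac{\delta}{2c_{0}^{3}}\,\frac{\partial^{3} p}{\partial\tau^{3}}
        + \frac{\beta}{2\rho_{0}c_{0}^{3}}\,\frac{\partial^{2} p^{2}}{\partial\tau^{2}},

    combining diffraction, thermoviscous absorption (diffusivity \delta) and quadratic nonlinearity (coefficient \beta); this is the standard textbook form, and the coefficients of the paper's particular finite-difference scheme are not given in this record.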

  3. Compound simulator IR radiation characteristics test and calibration

    NASA Astrophysics Data System (ADS)

    Li, Yanhong; Zhang, Li; Li, Fan; Tian, Yi; Yang, Yang; Li, Zhuo; Shi, Rui

    2015-10-01

    Hardware-in-the-loop simulation can reproduce in the laboratory the physical radiation of the target and interference and the interception process of a product in flight. Simulating the environment is particularly difficult when the radiation energy is high and the interference model is complicated. Here, the development of an IR scene generator based on a fiber array imaging transducer with circumferential lamp spot sources is introduced. The IR simulation capability includes effective simulation of aircraft signatures and point-source IR countermeasures. Two point sources acting as interference can move in random directions in two dimensions. To simulate the process of interference release, the radiation and motion characteristics were tested. Through zero calibration of the simulator's optical axis, the radiation can be projected accurately onto the product's detector. The test and calibration results show that the new compound simulator can be used in hardware-in-the-loop simulation trials.

  4. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  5. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  6. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  7. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  8. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  9. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  10. Simulating neural systems with Xyce.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiek, Richard Louis; Thornquist, Heidi K.; Mei, Ting

    2012-12-01

    Sandia's parallel circuit simulator, Xyce, can address large-scale neuron simulations in a new way, extending the range within which one can perform high-fidelity, multi-compartment neuron simulations. This report documents the implementation of neuron devices in Xyce and their use in the simulation and analysis of neuron systems.

  11. Some Dimensions of Simulation.

    ERIC Educational Resources Information Center

    Beck, Isabel; Monroe, Bruce

    Beginning with definitions of "simulation" (a methodology for testing alternative decisions under hypothetical conditions), this paper focuses on the use of simulation as an instructional method, pointing out the relationships and differences between role playing, games, and simulation. The term "simulation games" is explored with an analysis of…

  12. Computer Simulation in Tomorrow's Schools.

    ERIC Educational Resources Information Center

    Foster, David

    1984-01-01

    Suggests use of simulation as an educational strategy has promise for the school of the future; discusses specific advantages of simulations over alternative educational methods, role of microcomputers in educational simulation, and past obstacles and future promise of microcomputer simulations; and presents a literature review on effectiveness of…

  13. 78 FR 30956 - Cruise Vessel Security and Safety Training Provider Certification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-23

    ..., practical demonstration, or simulation program. A detailed instructor manual must be submitted. Submissions... simulation programs to be used. If a simulator or simulation program is to be used, include technical... lessons and, if appropriate, for practical demonstrations or simulation exercises and assessments...

  14. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  15. The Many Faces of Patient-Centered Simulation: Implications for Researchers.

    PubMed

    Arnold, Jennifer L; McKenzie, Frederic Rick D; Miller, Jane Lindsay; Mancini, Mary E

    2018-06-01

    Patient-centered simulation for nonhealthcare providers is an emerging and innovative application for healthcare simulation. Currently, no consensus exists on what patient-centered simulation encompasses, and outcomes research in this area is limited. Conceptually, patient-centered simulation aligns with the principles of patient- and family-centered care, bringing this educational tool directly to patients and caregivers with the potential to improve patient care and outcomes. This descriptive article is a summary of findings presented at the 2nd International Meeting for Simulation in Healthcare Research Summit. Experts in the field delineated a categorization for better describing patient-centered simulation and reviewed the literature to identify a research agenda. Three types of patient-centered simulation (patient-directed, patient-driven, and patient-specific) are presented, with research priorities identified for each. Patient-centered simulation has been shown to be an effective educational tool and has the potential to directly improve patient care outcomes. Presenting a typology for patient-centered simulation provides direction for future research.

  16. VASA: Interactive Computational Steering of Large Asynchronous Simulation Pipelines for Societal Infrastructure.

    PubMed

    Ko, Sungahn; Zhao, Jieqiong; Xia, Jing; Afzal, Shehzad; Wang, Xiaoyu; Abram, Greg; Elmqvist, Niklas; Kne, Len; Van Riper, David; Gaither, Kelly; Kennedy, Shaun; Tolone, William; Ribarsky, William; Ebert, David S

    2014-12-01

    We present VASA, a visual analytics platform consisting of a desktop application, a component model, and a suite of distributed simulation components for modeling the impact of societal threats such as weather, food contamination, and traffic on critical infrastructure such as supply chains, road networks, and power grids. Each component encapsulates a high-fidelity simulation model that together form an asynchronous simulation pipeline: a system of systems of individual simulations with a common data and parameter exchange format. At the heart of VASA is the Workbench, a visual analytics application providing three distinct features: (1) low-fidelity approximations of the distributed simulation components using local simulation proxies to enable analysts to interactively configure a simulation run; (2) computational steering mechanisms to manage the execution of individual simulation components; and (3) spatiotemporal and interactive methods to explore the combined results of a simulation run. We showcase the utility of the platform using examples involving supply chains during a hurricane as well as food contamination in a fast food restaurant chain.

  17. Real-time visual simulation of APT system based on RTW and Vega

    NASA Astrophysics Data System (ADS)

    Xiong, Shuai; Fu, Chengyu; Tang, Tao

    2012-10-01

    The Matlab/Simulink simulation model of an APT (acquisition, pointing and tracking) system is analyzed and established. The model's C code, which can be used for real-time simulation, is then generated by RTW (Real-Time Workshop). Practical experiments show that running the C code yields the same simulation results as running the Simulink model directly in the Matlab environment. MultiGen-Vega is a real-time 3D scene simulation software system. With it and OpenGL, the APT scene simulation platform is developed and used to render and display the virtual scenes of the APT system. To add the necessary graphics effects to the virtual scenes in real time, GLSL (OpenGL Shading Language) shaders running on a programmable GPU are used. By calling the C code, the scene simulation platform can adjust the system parameters on-line and obtain the APT system's real-time simulation data to drive the scenes. Practical application shows that this visual simulation platform has high efficiency, low cost and good simulation results.

  18. Role of a cumulus parameterization scheme in simulating atmospheric circulation and rainfall in the nine-layer Goddard Laboratory for Atmospheres General Circulation Model

    NASA Technical Reports Server (NTRS)

    Sud, Y. C.; Chao, Winston C.; Walker, G. K.

    1992-01-01

    The influence of a cumulus convection scheme on the simulated atmospheric circulation and hydrologic cycle is investigated by means of a coarse version of the GCM. Two sets of integrations, each containing an ensemble of three summer simulations, were produced. The ensemble sets of control and experiment simulations are compared and differentially analyzed to determine the influence of the cumulus convection scheme on the simulated circulation and hydrologic cycle. The results show that cumulus parameterization has a very significant influence on the simulated circulation and precipitation. The upper-level condensation heating over the ITCZ is much smaller in the experiment simulations than in the control simulations; correspondingly, the Hadley and Walker cells in the experiment simulations are also weaker and are accompanied by a weaker Ferrel cell in the Southern Hemisphere. Overall, the difference fields show that the experiment simulations (without cumulus convection) produce a cooler and less energetic atmosphere.

  19. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. The article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with the problems amenable to each and relevant examples from emergency care. Also discussed are an introduction to available software modeling platforms, how to explore their use for research, and a research agenda for computer simulation modeling. Through this article, our goal is to enhance the adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
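
    As a flavor of the discrete-event approach named above, the sketch below simulates a single-provider emergency department queue (an M/M/1 system). Rates and the FIFO discipline are illustrative only; the article's agenda covers far richer models.

      import heapq
      import random

      random.seed(1)
      ARRIVAL_RATE, SERVICE_RATE = 4.0, 5.0     # patients per hour
      HORIZON = 1000.0                          # simulated hours

      events = [(random.expovariate(ARRIVAL_RATE), "arrival")]
      busy_until, waits = 0.0, []

      while events:
          t, kind = heapq.heappop(events)
          if t > HORIZON:
              break
          if kind == "arrival":
              start = max(t, busy_until)        # wait if the provider is busy
              waits.append(start - t)
              busy_until = start + random.expovariate(SERVICE_RATE)
              heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))

      print(f"mean wait: {sum(waits) / len(waits):.2f} h over {len(waits)} patients")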

  20. Benefits of full scope simulators during solar thermal power plants design and construction

    NASA Astrophysics Data System (ADS)

    Gallego, José F.; Gil, Elena; Rey, Pablo

    2017-06-01

    In order to efficiently develop high-precision dynamic simulators for solar thermal power plants, Tecnatom adapted its simulation technology to incorporate solar thermal models. This effort and the excellent response of the simulation market have allowed Tecnatom to develop simulators for both parabolic trough and solar power tower technologies, including molten salt energy storage. These simulators may pursue different objectives, giving rise to training or engineering simulators. The solar thermal power market combines the need for operator training with the potential benefits of improved plant design. This fact, along with the simulation capabilities enabled by current technology and Tecnatom's broad experience, makes the development of a combined engineering and training simulator a very advantageous option. This paper describes the challenge of developing and integrating a full scope simulator during the design and construction stages of a solar thermal power plant, showing the added value to the different engineering areas.

  1. Virtual reality simulation: basic concepts and use in endoscopic neurosurgery training.

    PubMed

    Cohen, Alan R; Lohani, Subash; Manjila, Sunil; Natsupakpong, Suriya; Brown, Nathan; Cavusoglu, M Cenk

    2013-08-01

    Virtual reality simulation is a promising alternative for training surgical residents outside the operating room. It is also a useful aid to anatomic study, residency training, surgical rehearsal, credentialing, and recertification. Surgical simulation is based on a virtual reality with varying degrees of immersion and realism. Simulators provide a no-risk environment for harmless and repeatable practice. Virtual reality has three main components of simulation: graphics/volume rendering, model behavior/tissue deformation, and haptic feedback. The challenge of accurately simulating the forces and tactile sensations experienced in neurosurgery limits the sophistication of a virtual simulator. The limited haptic feedback available in minimally invasive neurosurgery makes it a favorable subject for simulation. Virtual simulators with realistic graphics and force feedback have been developed for ventriculostomy, intraventricular surgery, and transsphenoidal pituitary surgery, thus allowing preoperative study of the individual anatomy and increasing the safety of the procedure. The authors also present experiences with their own virtual simulation of endoscopic third ventriculostomy.

  2. Evaluation and development of the routing protocol of a fully functional simulation environment for VANETs

    NASA Astrophysics Data System (ADS)

    Ali, Azhar Tareq; Warip, Mohd Nazri Mohd; Yaakob, Naimah; Abduljabbar, Waleed Khalid; Atta, Abdu Mohammed Ali

    2017-11-01

    Vehicular Ad-hoc Networks (VANETs) are an area of wireless technologies that is attracting a great deal of interest. Several areas of VANETs, such as security, routing protocols and medium access control, still lack large amounts of research. There is also a lack of freely available simulators that can quickly and accurately simulate VANETs. The main goal of this paper is to develop a freely available VANETs simulator and to evaluate popular mobile ad-hoc network routing protocols in several VANETs scenarios. The VANETs simulator consists of a network simulator and a traffic (mobility) simulator, kept in sync by a client-server application. The simulator also models buildings to create a more realistic wireless network environment. Ad hoc On-demand Distance Vector routing (AODV), Dynamic Source Routing (DSR) and Dynamic MANET On-demand (DYMO) were initially simulated in city, country, and highway environments to provide an overall evaluation.

  3. A path-level exact parallelization strategy for sequential simulation

    NASA Astrophysics Data System (ADS)

    Peredo, Oscar F.; Baeza, Daniel; Ortiz, Julián M.; Herrero, José R.

    2018-01-01

    Sequential Simulation is a well known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, the Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or for classes defined by K different thresholds (continuous case). Similarly, the Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelizing SIS and SGS is presented. A first stage re-arranges the simulation path, and a second stage simulates non-conflicting nodes in parallel. A key advantage of the proposed parallelization method is that it generates realizations identical to those of the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedups in the best scenarios using 16 threads of execution on a single machine.
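
    The core idea, as summarized above, is that nodes far enough apart do not appear in each other's kriging neighbourhoods and can therefore be simulated concurrently. The greedy grouping below only illustrates that distance test; it is not the paper's re-arrangement algorithm, and batch_path and min_dist are hypothetical names.

      import math

      def batch_path(path, min_dist):
          """Greedily group path nodes into batches of mutually distant nodes."""
          batches = []
          for node in path:
              for batch in batches:
                  # A node joins a batch only if it is farther than min_dist
                  # from every node already in it.
                  if all(math.dist(node, other) >= min_dist for other in batch):
                      batch.append(node)
                      break
              else:
                  batches.append([node])
          return batches

      path = [(i % 10, i // 10) for i in range(100)]   # a simple 10x10 visiting order
      for i, b in enumerate(batch_path(path, min_dist=3.0)):
          print(f"batch {i}: {len(b)} nodes simulated in parallel")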

  4. Innovations in surgery simulation: a review of past, current and future techniques.

    PubMed

    Badash, Ido; Burtt, Karen; Solorzano, Carlos A; Carey, Joseph N

    2016-12-01

    As a result of recent work-hours limitations and concerns for patient safety, innovations in extraclinical surgical simulation have become a desired part of residency education. Current simulation models, including cadaveric, animal, bench-top, virtual reality (VR) and robotic simulators are increasingly used in surgical training programs. Advances in telesurgery, three-dimensional (3D) printing, and the incorporation of patient-specific anatomy are paving the way for simulators to become integral components of medical training in the future. Evidence from the literature highlights the benefits of including simulations in surgical training; skills acquired through simulations translate into improvements in operating room performance. Moreover, simulations are rapidly incorporating new medical technologies and offer increasingly high-fidelity recreations of procedures. As a result, both novice and expert surgeons are able to benefit from their use. As dedicated, structured curricula are developed that incorporate simulations into daily resident training, simulated surgeries will strengthen the surgeon's skill set, decrease hospital costs, and improve patient outcomes.

  5. Integrating Medical Simulation Into the Physician Assistant Physiology Curriculum.

    PubMed

    Li, Lixin; Lopes, John; Zhou, Joseph Yi; Xu, Biao

    2016-12-01

    Medical simulation has recently been used in medical education, and evidence indicates that it is a valuable tool for teaching and evaluation. Very few studies have evaluated the integration of medical simulation in medical physiology education, particularly in PA programs. This study was designed to assess the value of integrating medical simulation into the PA physiology curriculum. Seventy-five students from the PA program at Central Michigan University participated in this study. Mannequin-based simulation was used to simulate a patient with hemorrhagic shock and congestive heart failure to demonstrate the Frank-Starling force and cardiac function curve. Before and after the medical simulation, students completed a questionnaire as a self-assessment. A knowledge test was also delivered after the simulation. Our study demonstrated a significant improvement in student confidence in understanding congestive heart failure, hemorrhagic shock, and the Frank-Starling curve after the simulation. Medical simulation may be an effective way to enhance basic science learning experiences for students and an ideal supplement to traditional, lecture-based teaching in PA education.

  6. Mathematical simulation of power conditioning systems. Volume 1: Simulation of elementary units. Report on simulation methodology

    NASA Technical Reports Server (NTRS)

    Prajous, R.; Mazankine, J.; Ippolito, J. C.

    1978-01-01

    Methods and algorithms used for the simulation of elementary power conditioning units (buck, boost, and buck-boost, as well as shunt PWM) are described. Definitions are given of similar converters and reduced parameters. The various parts of the simulation are dealt with: local stability, corrective networks, measurement of input-output impedance, and global stability. A simulation example is given.
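
    As an illustration of simulating one such elementary unit, the sketch below integrates the state-averaged model of a buck converter, L di/dt = d*Vin - v and C dv/dt = i - v/R, with forward Euler. Component values and the duty cycle are illustrative, not taken from the report.

      Vin, L, C, R, duty = 12.0, 100e-6, 220e-6, 5.0, 0.5
      dt, steps = 1e-6, 20000

      i, v = 0.0, 0.0                     # inductor current, capacitor voltage
      for _ in range(steps):
          di = (duty * Vin - v) / L       # averaged inductor voltage / L
          dv = (i - v / R) / C            # capacitor current / C
          i += di * dt
          v += dv * dt

      print(f"steady-state output = {v:.2f} V (ideal: {duty * Vin:.2f} V)")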

  7. Modeling, Simulation and Design of Plasmonic Interconnects for On-Chip Signal Processing

    DTIC Science & Technology

    2011-02-14

    integration and computation can be achieved by using the photonic detection devices such as the ultrafast photodetectors and nanowire field transistors... infrared to optical frequencies, and their FDTD simulation results are shown in the middle diagram. In the rightmost diagram, the HSPICE simulation...FDTD simulation. The results tally very well to affirm that plasmonic nanowires can be simulated using circuit simulators like HSPICE to combine the

  8. Program For Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.

    1991-01-01

    The user does not have to add any special logic to aid in synchronization. The Time Warp Operating System (TWOS) computer program is a special-purpose operating system designed to support parallel discrete-event simulation. It is a complete implementation of the Time Warp mechanism and supports only simulations and other computations designed for virtual time. The Time Warp Simulator (TWSIM) subdirectory contains a sequential simulation engine that is interface-compatible with TWOS. TWOS and TWSIM are written in, and support simulations in, the C programming language.

  9. OpenSimulator Interoperability with DRDC Simulation Tools: Compatibility Study

    DTIC Science & Technology

    2014-09-01

    into two components: (1) backend data services consisting of user accounts, login service, assets, and inventory; and (2) the simulator server which...components are combined into a single OpenSimulator process. In grid mode, the two components are separated, placing the backend services into a ROBUST... mobile devices. Potential points of compatibility between Unity and OpenSimulator include: a Unity-based desktop computer OpenSimulator viewer; a

  11. Multiscale optical simulation settings: challenging applications handled with an iterative ray-tracing FDTD interface method.

    PubMed

    Leiner, Claude; Nemitz, Wolfgang; Schweitzer, Susanne; Kuna, Ladislav; Wenzl, Franz P; Hartmann, Paul; Satzinger, Valentin; Sommer, Christian

    2016-03-20

    We show that with an appropriate combination of two optical simulation techniques-classical ray-tracing and the finite difference time domain method-an optical device containing multiple diffractive and refractive optical elements can be accurately simulated in an iterative simulation approach. We compare the simulation results with experimental measurements of the device to discuss the applicability and accuracy of our iterative simulation procedure.

  12. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar

    2018-03-19

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.
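
    A SED-ML document is plain XML, so its five ingredients can be sketched directly. The snippet below is illustrative only: the element names follow the SED-ML specification informally, the model file name and XPath target are hypothetical, and a real document would be produced with a dedicated library such as libsedml rather than by hand.

```python
# A minimal sketch of the five ingredients a SED-ML document encodes.
import xml.etree.ElementTree as ET

root = ET.Element("sedML", level="1", version="3")

models = ET.SubElement(root, "listOfModels")
model = ET.SubElement(models, "model", id="m1",                 # (i) which model to use
                      language="urn:sedml:language:sbml", source="oscillator.xml")
changes = ET.SubElement(model, "listOfChanges")
ET.SubElement(changes, "changeAttribute",                       # (ii) pre-simulation change
              target="/sbml:sbml//sbml:parameter[@id='k1']/@value", newValue="0.5")

sims = ET.SubElement(root, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse", id="sim1",             # (iii) simulation procedure
              initialTime="0", outputStartTime="0",
              outputEndTime="100", numberOfPoints="1000")

tasks = ET.SubElement(root, "listOfTasks")
ET.SubElement(tasks, "task", id="t1", modelReference="m1", simulationReference="sim1")

# (iv) post-processing and (v) plotting/reporting would be declared in
# listOfDataGenerators and listOfOutputs, omitted here for brevity.
print(ET.tostring(root, encoding="unicode"))
```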

  13. [Current and future use of surgical skills simulation in gynecologic resident education: a French national survey].

    PubMed

    Crochet, P; Aggarwal, R; Berdah, S; Yaribakht, S; Boubli, L; Gamerre, M; Agostini, A

    2014-05-01

    Simulation is a promising method to enhance surgical education in gynecology. The purpose of this study was to provide baseline information on the current use of simulators across French academic schools. Two questionnaires were created, one specifically for residents and one for professors. Main issues included the types of simulators used and how they were used for training purposes. Opinions and levels of agreement about the use of simulators were also sought. Twenty-six percent of residents (258/998) and 24% of professors (29/122) answered the questionnaire. Sixty-five percent of residents (167/258) had experienced simulators. Laparoscopic pelvic-trainers (84%) and sessions on live pigs (63%) were most commonly used. Residents reported access to simulators most commonly during introductory sessions (51%) and days of academic workshops (38%). Residents considered simulators very useful for training. Professors agreed that simulators should become a required part of residency training, but were less enthusiastic about simulation becoming a part of certification for practice. Surgical skills simulators have already been experienced by a majority of French gynecologic residents. However, the use of these educational tools varies among surgical schools and remains occasional for the majority of residents. There was strong agreement that simulation technology should be a component of training. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  14. Virtual reality simulators and training in laparoscopic surgery.

    PubMed

    Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos

    2015-01-01

    Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, the relevant evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance of advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training is not different from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to differ from that of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual-spatial perception and stress-coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research should focus on the effect of virtual reality simulation on performance of advanced surgical procedures, on standardization of training, on the possible synergistic effect of virtual reality simulation combined with mental training, and on personalized training. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  15. Taming Wild Horses: The Need for Virtual Time-based Scheduling of VMs in Network Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoginath, Srikanth B; Perumalla, Kalyan S; Henz, Brian J

    2012-01-01

    The next generation of scalable network simulators employ virtual machines (VMs) to act as high-fidelity models of traffic producer/consumer nodes in simulated networks. However, network simulations can be inaccurate if VMs are not scheduled according to virtual time, especially when many VMs are hosted per simulator core in a multi-core simulator environment. Since VMs are by default free-running, at the outset it is not clear if, and to what extent, their untamed execution affects the results of simulated scenarios. Here, we provide the first quantitative basis for establishing the need for generalized virtual-time scheduling of VMs in network simulators, based on actual prototype implementations. To exercise breadth, our system is tested with multiple disparate applications: (a) a set of message-passing parallel programs, (b) a computer worm propagation phenomenon, and (c) a mobile ad-hoc wireless network simulation. We define and use error metrics and benchmarks in scaled tests to empirically report the poor match of traditional, fairness-based VM scheduling to VM-based network simulation, and also clearly show the better performance of our simulation-specific scheduler, with up to 64 VMs hosted on a 12-core simulator node.

  16. Is There Bias against Simulation in Microsurgery Training?

    PubMed

    Theman, Todd A; Labow, Brian I

    2016-09-01

    Background While other surgical specialties have embraced virtual reality simulation for training and recertification, microsurgery has lagged. This study aims to assess the opinions of microsurgeons on the role of simulation in microsurgery assessment and training. Methods We surveyed faculty members of the American Society of Reconstructive Microsurgery to ascertain opinions on their use of simulation in training and opinions about the utility of simulation for skills acquisition, teaching, and skills assessment. The 21-question survey was disseminated online to 675 members. Results Eighty-nine members completed the survey for a 13.2% response rate. Few microsurgeons have experience with high-fidelity simulation, and opinions on its utility are internally inconsistent. Although 84% of respondents could not identify a reason why simulation would not be useful, only 24% believed simulation is a useful measure of clinical performance. Nearly three-fourths of respondents were skeptical that simulation would improve their skills. Ninety-four percent had no experience with simulator-based assessment. Conclusion Simulation has been shown to improve skills acquisition in microsurgery, but our survey suggests that unfamiliarity may foster bias against the technology. Failure to incorporate simulation may adversely affect training and may put surgeons at a disadvantage should these technologies be adopted for recertification by regulatory agencies.

  17. A Method for Functional Task Alignment Analysis of an Arthrocentesis Simulator.

    PubMed

    Adams, Reid A; Gilbert, Gregory E; Buckley, Lisa A; Nino Fong, Rodolfo; Fuentealba, I Carmen; Little, Erika L

    2018-05-16

    During simulation-based education, simulators are subjected to procedures composed of a variety of tasks and processes. Simulators should functionally represent a patient in response to the physical actions of these tasks. The aim of this work was to describe a method for determining whether a simulator has sufficient functional task alignment (FTA) to be used in a simulation. Potential performance checklist items were gathered from published arthrocentesis guidelines and aggregated into a performance checklist using Lawshe's method. An expert panel used this performance checklist and an FTA analysis questionnaire to evaluate a simulator's ability to respond to the physical actions required by the performance checklist. Thirteen items, from a pool of 39, were included on the performance checklist. Experts had mixed reviews of the simulator's FTA and its suitability for use in simulation. Unexpectedly, some positive FTA was found for several tasks where the simulator lacked functionality. By developing a detailed list of the specific tasks required to complete a clinical procedure, and surveying experts on the simulator's response to those actions, educators can gain insight into a simulator's clinical accuracy and suitability. The unexpected positive FTA ratings for functional deficits suggest that further revision of the survey method is required.
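
    Lawshe's method reduces panel ratings to a content validity ratio (CVR) per item, CVR = (n_e - N/2) / (N/2), where n_e of N panelists rate the item "essential". The sketch below uses toy ratings of our own; the 0.62 cutoff shown is the commonly tabulated critical value for a 10-member panel, and illustrates how a 39-item pool would be filtered to the retained items.

```python
# A minimal sketch of Lawshe's content validity ratio (illustrative ratings).
def cvr(n_essential: int, n_panelists: int) -> float:
    """CVR = (n_e - N/2) / (N/2); ranges from -1 (none essential) to +1 (all)."""
    half = n_panelists / 2
    return (n_essential - half) / half

panel_size = 10
ratings = {"palpate landmarks": 9, "sterilize site": 10, "warm the probe": 3}

scores = {item: cvr(n, panel_size) for item, n in ratings.items()}
kept = {item: s for item, s in scores.items() if s > 0.62}   # critical CVR, N=10
print(kept)   # items that survive the content-validity cut
```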

  18. Sensitivity of polar ozone recovery predictions of the GMI 3D CTM to GCM and DAS dynamics

    NASA Astrophysics Data System (ADS)

    Considine, D.; Connell, P.; Strahan, S.; Douglass, A.; Rotman, D.

    2003-04-01

    The Global Modeling Initiative (GMI) 3-D chemistry and transport model has been used to generate 2 simulations of the 1995-2030 time period. The 36-year simulations both used the source gas and aerosol boundary conditions of the 2002 World Meteorological Organization assessment exercise MA2. The first simulation was based on a single year of meteorological data (winds, temperatures) generated by the new Goddard Space Flight Center "Finite Volume" General Circulation Model (FVGCM), repeated for each year of the simulation. The second simulation used a year of meteorological data generated by a new data assimilation system based on the FVGCM (FVDAS), using observations for July 1, 1999 - June 30, 2000. All other aspects of the two simulations were identical. The increase in vortex-averaged south polar springtime ozone concentrations in the lower stratosphere over the course of the simulations is more robust in the simulation driven by the GCM meteorological data than in the simulation driven by DAS winds. At the same time, the decrease in estimated chemical springtime ozone loss is similar. We thus attribute the differences between the two simulations to differences in the representations of polar dynamics which reduce the sensitivity of the simulation driven by DAS winds to changes in vortex chemistry. We also evaluate the representations in the two simulations of trace constituent distributions in the current polar lower stratosphere using various observations. In these comparisons the GCM-based simulation often is in better agreement with the observations than the DAS-based simulation.

  19. The effects of simulated patients and simulated gynecologic models on student anxiety in providing IUD services.

    PubMed

    Khadivzadeh, Talat; Erfanian, Fatemeh

    2012-10-01

    Midwifery students experience high levels of stress during their initial clinical practice. Addressing the learner's sources of anxiety and discomfort can ease the learning experience and lead to better outcomes. The aim of this study was to find out the effect of a simulation-based course, using simulated patients and simulated gynecologic models, on student anxiety and comfort while practicing to provide intrauterine device (IUD) services. Fifty-six eligible midwifery students were randomly allocated into simulation-based and traditional training groups. They participated in a 12-hour workshop on providing IUD services. The simulation group was trained through an educational program including simulated gynecologic models and simulated patients. The students in both groups then practiced IUD consultation and insertion with real patients in the clinic. The students' anxiety in IUD insertion was assessed using the "Spielberger anxiety test" and the "comfort in providing IUD services" questionnaire. There were significant differences between the simulation and traditional groups in two aspects of anxiety, state (P < 0.001) and trait (P = 0.024), and in the level of comfort (P < 0.001) in providing IUD services. "Fear of uterine perforation during insertion" was the most important cause of students' anxiety in providing IUD services, reported by 74.34% of students. Simulated patients and simulated gynecologic models are effective in reducing students' anxiety levels when practicing to deliver IUD services. Therefore, it is recommended that simulated patients and simulated gynecologic models be used before engaging students in real clinical practice.

  20. Mobile Simulation Unit: taking simulation to the surgical trainee.

    PubMed

    Pena, Guilherme; Altree, Meryl; Babidge, Wendy; Field, John; Hewett, Peter; Maddern, Guy

    2015-05-01

    Simulation-based training has become an increasingly accepted part of surgical training. However, simulators are still not widely available to surgical trainees. Some factors that hinder the widespread implementation of simulation-based training are the lack of standardized methods and equipment, costs and time constraints. We have developed a Mobile Simulation Unit (MSU) that enables trainees to access modern simulation equipment tailored to the needs of the learner at the trainee's workplace. From July 2012 to December 2012, the MSU visited six hospitals in South Australia, four in metropolitan and two in rural areas. Resident Medical Officers, surgical trainees, Fellows and International Medical Graduates were invited to voluntarily utilize a variety of surgical simulators on offer. Participants were asked to complete a survey about the accessibility of simulation equipment at their workplace, environment of the MSU, equipment available and instruction received. Utilization data were collected. The MSU was available for a total of 303 h over 52 days. Fifty-five participants were enrolled in the project and each spent on average 118 min utilizing the simulators. The utilization of the total available time was 36%. Participants reported having a poor access to simulation at their workplace and overwhelmingly gave positive feedback regarding their experience in the MSU. The use of the MSU to provide simulation-based education in surgery is feasible and practical. The MSU provides consistent simulation training at the surgical trainee's workplace, regardless of geographic location, and it has the potential to increase participation in simulation programmes. © 2014 Royal Australasian College of Surgeons.

  1. Visualization and simulation techniques for surgical simulators using actual patient's data.

    PubMed

    Radetzky, Arne; Nürnberger, Andreas

    2002-11-01

    Because of the increasing complexity of surgical interventions, research in surgical simulation has become more and more important in recent years. However, the simulation of tissue deformation is still a challenging problem, mainly due to the short response times that are required for real-time interaction. The demands on hardware and software are even greater if not only modeled human anatomy is used but also the anatomy of actual patients. This is required if the surgical simulator is to be used as a training medium for expert surgeons rather than students. In this article, suitable visualization and simulation methods for surgical simulation utilizing actual patients' datasets are described. The advantages and disadvantages of direct and indirect volume rendering for visualization are discussed, and a neuro-fuzzy system is described that can be used for the simulation of interactive tissue deformations. The neuro-fuzzy system makes it possible to define the deformation behavior based on a linguistic description of the tissue characteristics or to learn the dynamics by using measured data from real tissue. Furthermore, a simulator for minimally invasive neurosurgical interventions is presented that utilizes the described visualization and simulation methods. The structure of the simulator is described in detail, and the results of a system evaluation by an experienced neurosurgeon--a quantitative comparison between different methods of virtual endoscopy as well as a comparison between real brain images and virtual endoscopies--are given. The evaluation showed that the simulator provides higher realism of visualization and simulation than other currently available simulators. Copyright 2002 Elsevier Science B.V.

  2. Bringing consistency to simulation of population models--Poisson simulation as a bridge between micro and macro simulation.

    PubMed

    Gustafsson, Leif; Sternad, Mikael

    2007-10-01

    Population models concern collections of discrete entities such as atoms, cells, humans, animals, etc., where the focus is on the number of entities in a population. Because of the complexity of such models, simulation is usually needed to reproduce their complete dynamic and stochastic behaviour. Two main types of simulation models are used for different purposes, namely micro-simulation models, where each individual is described with its particular attributes and behaviour, and macro-simulation models based on stochastic differential equations, where the population is described in aggregated terms by the number of individuals in different states. Consistency between micro- and macro-models is a crucial but often neglected aspect. This paper demonstrates how the Poisson Simulation technique can be used to produce a population macro-model consistent with the corresponding micro-model. This is accomplished by defining Poisson Simulation in strictly mathematical terms as a series of Poisson processes that generate sequences of Poisson distributions with dynamically varying parameters. The method can be applied to any population model. It provides the unique stochastic and dynamic macro-model consistent with a correct micro-model. The paper also presents a general macro form for stochastic and dynamic population models. In an appendix Poisson Simulation is compared with Markov Simulation showing a number of advantages. Especially aggregation into state variables and aggregation of many events per time-step makes Poisson Simulation orders of magnitude faster than Markov Simulation. Furthermore, you can build and execute much larger and more complicated models with Poisson Simulation than is possible with the Markov approach.
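
    The core of the technique is easy to state: at each time step, the number of transition events is drawn from a Poisson distribution whose parameter is the current (dynamically varying) rate times the step length. A minimal sketch on a toy SIR epidemic model follows; the model and rates are our own illustration, not an example from the paper.

```python
# A minimal sketch of Poisson Simulation: macro-level state variables updated
# by Poisson-distributed event counts with dynamically varying parameters.
import numpy as np

rng = np.random.default_rng(1)
S, I, R = 990, 10, 0                  # susceptible / infected / recovered counts
beta, gamma, dt = 0.3, 0.1, 0.1      # infection rate, recovery rate, time step

for _ in range(1000):
    n = S + I + R
    infections = min(rng.poisson(beta * S * I / n * dt), S)   # Po(rate * dt) events
    recoveries = min(rng.poisson(gamma * I * dt), I)          # clipped to the pool
    S -= infections
    I += infections - recoveries
    R += recoveries

print(f"final state: S={S}, I={I}, R={R}")
```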

  3. Laparoscopic skills acquisition: a study of simulation and traditional training.

    PubMed

    Marlow, Nicholas; Altree, Meryl; Babidge, Wendy; Field, John; Hewett, Peter; Maddern, Guy J

    2014-12-01

    Training in basic laparoscopic skills can be undertaken using traditional methods, where trainees are educated by experienced surgeons through a process of graduated responsibility, or by simulation-based training. This study aimed to assess whether simulation-trained individuals reach the same level of proficiency in basic laparoscopic skills as traditionally trained participants when assessed in a simulated environment. A prospective study was undertaken. Participants were allocated to one of two cohorts according to surgical experience. Participants from the inexperienced cohort were randomized to receive training in basic laparoscopic skills on either a box trainer or a virtual reality simulator. They were then assessed on the simulator on which they did not receive training. Participants from the experienced cohort, considered to have received traditional training in basic laparoscopic skills, did not receive simulation training and were randomized to either the box trainer or the virtual reality simulator for skills assessment. The assessment scores of the different cohorts on either simulator were then compared. A total of 138 participants completed the assessment session, 101 in the inexperienced, simulation-trained cohort and 37 in the experienced, traditionally trained cohort. There was no statistically significant difference between the training outcomes of simulation-trained and traditionally trained participants, irrespective of the simulator type used. The results demonstrated that participants trained on either a box trainer or a virtual reality simulator achieved a level of basic laparoscopic skills, assessed in a simulated environment, that was not significantly different from participants who had been traditionally trained in basic laparoscopic skills. © 2013 Royal Australasian College of Surgeons.

  4. Surgical simulators in urological training--views of UK Training Programme Directors.

    PubMed

    Forster, James A; Browning, Anthony J; Paul, Alan B; Biyani, C Shekhar

    2012-09-01

    What's known on the subject? and What does the study add? The role of surgical simulators is currently being debated in urological and other surgical specialties. Simulators are not presently implemented in the UK urology training curriculum. The availability of simulators and the opinions of Training Programme Directors' (TPD) on their role have not been described. In the present questionnaire-based survey, the trainees of most, but not all, UK TPDs had access to laparoscopic simulators, and that all responding TPDs thought that simulators improved laparoscopic training. We hope that the present study will be a positive step towards making an agreement to formally introduce simulators into the UK urology training curriculum. To discuss the current situation on the use of simulators in surgical training. To determine the views of UK Urology Training Programme Directors (TPDs) on the availability and use of simulators in Urology at present, and to discuss the role that simulators may have in future training. An online-questionnaire survey was distributed to all UK Urology TPDs. In all, 16 of 21 TPDs responded. All 16 thought that laparoscopic simulators improved the quality of laparoscopic training. The trainees of 13 TPDs had access to a laparoscopic simulator (either in their own hospital or another hospital in the deanery). Most TPDs thought that trainees should use simulators in their free time, in quiet time during work hours, or in teaching sessions (rather than incorporated into the weekly timetable). We feel that the current apprentice-style method of training in urological surgery is out-dated. We think that all TPDs and trainees should have access to a simulator, and that a formal competency based simulation training programme should be incorporated into the urology training curriculum, with trainees reaching a minimum proficiency on a simulator before undertaking surgical procedures. © 2012 THE AUTHORS. BJU INTERNATIONAL © 2012 BJU INTERNATIONAL.

  5. Design and Test of Advanced Thermal Simulators for an Alkali Metal-Cooled Reactor Simulator

    NASA Technical Reports Server (NTRS)

    Garber, Anne E.; Dickens, Ricky E.

    2011-01-01

    The Early Flight Fission Test Facility (EFF-TF) at NASA Marshall Space Flight Center (MSFC) has as one of its primary missions the development and testing of fission reactor simulators for space applications. A key component in these simulated reactors is the thermal simulator, designed to closely mimic the form and function of a nuclear fuel pin using electric heating. Continuing effort has been made to design simple, robust, inexpensive thermal simulators that closely match the steady-state and transient performance of a nuclear fuel pin. A series of these simulators have been designed, developed, fabricated and tested individually and in a number of simulated reactor systems at the EFF-TF. The purpose of the thermal simulators developed under the Fission Surface Power (FSP) task is to ensure that non-nuclear testing can be performed at sufficiently high fidelity to allow a cost-effective qualification and acceptance strategy to be used. Prototype thermal simulator design is founded on the baseline Fission Surface Power reactor design. Recent efforts have been focused on the design, fabrication and test of a prototype thermal simulator appropriate for use in the Technology Demonstration Unit (TDU). While designing the thermal simulators described in this paper, efforts were made to improve the axial power profile matching of the thermal simulators. Simultaneously, a search was conducted for graphite materials with higher resistivities than had been employed in the past. The combination of these two efforts resulted in the creation of thermal simulators with power capacities of 2300-3300 W per unit. Six of these elements were installed in a simulated core and tested in the alkali metal-cooled Fission Surface Power Primary Test Circuit (FSP-PTC) at a variety of liquid metal flow rates and temperatures. This paper documents the design of the thermal simulators, the test program, and the test results.

  6. FERN - a Java framework for stochastic simulation and evaluation of reaction networks.

    PubMed

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-08-29

    Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications, or c) do not allow users to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
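
    FERN's exact algorithms are variants of Gillespie's stochastic simulation algorithm. As a rough sketch of what such a simulator does internally (our toy code, not FERN's Java API), the direct method for a single reversible reaction A <-> B looks like the following; the rate constants are illustrative.

```python
# A minimal sketch of Gillespie's direct method for A <-> B (illustrative rates).
import random

random.seed(0)
a, b = 100, 0                # molecule counts for A and B
k_f, k_r = 1.0, 0.5          # forward / reverse rate constants
t, t_end = 0.0, 10.0

while t < t_end:
    propensities = [k_f * a, k_r * b]
    total = sum(propensities)
    if total == 0:
        break
    t += random.expovariate(total)               # exponential waiting time
    if random.random() * total < propensities[0]:
        a, b = a - 1, b + 1                      # A -> B fired
    else:
        a, b = a + 1, b - 1                      # B -> A fired

print(f"t={t:.2f}: A={a}, B={b}")                # expect A:B near k_r:k_f = 1:2
```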

  7. Mobile in Situ Simulation as a Tool for Evaluation and Improvement of Trauma Treatment in the Emergency Department.

    PubMed

    Amiel, Imri; Simon, Daniel; Merin, Ofer; Ziv, Amitai

    2016-01-01

    Medical simulation is an increasingly recognized tool for teaching, coaching, training, and examining practitioners in the medical field. For many years, simulation has been used to improve trauma care and teamwork. Despite technological advances in trauma simulators, including better means of mobilization and control, most reported simulation-based trauma training has been conducted inside simulation centers, and the practice of mobile simulation in hospitals' trauma rooms has not been investigated fully. The emergency department personnel from a second-level trauma center in Israel were evaluated. Divided into randomly formed trauma teams, they were reviewed twice using in situ mobile simulation training at the hospital's trauma bay. In all, 4 simulations were held before and 4 simulations were held after a structured learning intervention. The intervention included a 1-day simulation-based training conducted at the Israel Center for Medical Simulation (MSR), which included video-based debriefing facilitated by the hospital's 4 trauma team leaders who completed a 2-day simulation-based instructors' course before the start of the study. The instructors were also trained on performance rating and thus were responsible for the assessment of their respective teams in real time as well as through reviewing of the recorded videos; thus enabling a comparison of the performances in the mobile simulation exercise before and after the educational intervention. The internal reliability of the experts' evaluation calculated in the Cronbach α model was found to be 0.786. Statistically significant improvement was observed in 4 of 10 parameters, among which were teamwork (29.64%) and communication (24.48%) (p = 0.00005). The mobile in situ simulation-based training demonstrated efficacy both as an assessment tool for trauma teams' function and an educational intervention when coupled with in vitro simulation-based training, resulting in a significant improvement of the teams' function in various aspects of treatment. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  8. Fluid, solid and fluid-structure interaction simulations on patient-based abdominal aortic aneurysm models.

    PubMed

    Kelly, Sinead; O'Rourke, Malachy

    2012-04-01

    This article describes the use of fluid, solid and fluid-structure interaction simulations on three patient-based abdominal aortic aneurysm geometries. All simulations were carried out using OpenFOAM, which uses the finite volume method to solve both fluid and solid equations. Initially a fluid-only simulation was carried out on a single patient-based geometry and results from this simulation were compared with experimental results. There was good qualitative and quantitative agreement between the experimental and numerical results, suggesting that OpenFOAM is capable of predicting the main features of unsteady flow through a complex patient-based abdominal aortic aneurysm geometry. The intraluminal thrombus and arterial wall were then included, and solid stress and fluid-structure interaction simulations were performed on this, and two other patient-based abdominal aortic aneurysm geometries. It was found that the solid stress simulations resulted in an under-estimation of the maximum stress by up to 5.9% when compared with the fluid-structure interaction simulations. In the fluid-structure interaction simulations, flow induced pressure within the aneurysm was found to be up to 4.8% higher than the value of peak systolic pressure imposed in the solid stress simulations, which is likely to be the cause of the variation in the stress results. In comparing the results from the initial fluid-only simulation with results from the fluid-structure interaction simulation on the same patient, it was found that wall shear stress values varied by up to 35% between the two simulation methods. It was concluded that solid stress simulations are adequate to predict the maximum stress in an aneurysm wall, while fluid-structure interaction simulations should be performed if accurate prediction of the fluid wall shear stress is necessary. Therefore, the decision to perform fluid-structure interaction simulations should be based on the particular variables of interest in a given study.

  9. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    PubMed Central

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined. PMID:22172142

  10. A Primer on Simulation and Gaming.

    ERIC Educational Resources Information Center

    Barton, Richard F.

    In a primer intended for the administrative professions, for the behavioral sciences, and for education, simulation and its various aspects are defined, illustrated, and explained. Man-model simulation, man-computer simulation, all-computer simulation, and analysis are discussed as techniques for studying object systems (parts of the "real…

  11. A Study of Umbilical Communication Interface of Simulator Kernel to Enhance Visibility and Controllability

    NASA Astrophysics Data System (ADS)

    Koo, Cheol Hea; Lee, Hoon Hee; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

    In aerospace research and practical development, the use of simulation in software development, component design, and system operation continues to grow, and the pace of that growth is increasing. This trend stems from the ease of handling simulations and the power of their output. Simulation brings many benefits owing to the following characteristics: it is easy to handle (it is never broken or damaged by mistake), it never wears out (it never gets old), and it is cost effective (once built, it can be distributed to 100 to 1000 people). GenSim (Generic Simulator), which is being developed by KARI and is compatible with the ESA SMP standard, provides such a simulation platform to support flight software validation and mission operation verification. The user interface of GenSim is shown in Figure 1 [1,2]. As shown in Figure 1, GenSim has, as most simulation platforms do, a GRD (Graphical Display) and an AND (Alpha-Numeric Display). But actual system validation, for example of mission operations, frequently requires more complex and powerful handling of the simulated data. In Figure 2, a system simulation result of COMS (Communication, Ocean, and Meteorological Satellite, launched on June 28, 2008) is drawn by the Celestia 3D program. In this case, the data needed by Celestia are provided by one of the simulation models resident in the system simulator, through a UDP network connection. But the required display format, data size, and communication rate vary, so the developer has to manage the connection protocol manually each time and for each case. This brings chaos to the simulation model design and development, and ultimately raises performance issues. A performance issue arises when the volume of required data exceeds the capacity of the simulation kernel to process it safely. The problem is that the data sent to a visualization tool such as Celestia are produced by a simulation model, not by the kernel. Because the simulation model has no way to know the simulation kernel's load in processing simulation events, the model sends the data as frequently as needed. This can create many potential problems, such as lack of response, missed deadlines, and data-integrity problems with the model data during the simulation. SIMSAT and EuroSim give a warning message if a user-requested event, such as printing a log, cannot be processed as planned or requested. As a consequence the requested event will be delayed or not processed at all, which may violate the planned deadline. In most soft real-time simulation this can be neglected and causes only minor inconvenience to users. But it should be noted that if user requests are not managed properly in critical situations, the simulation may end in disarray. Having traced the disadvantages of letting the simulation model service such user requests, we conclude that the simulation model is not the appropriate place to provide this service, and this kind of work should be minimized as much as possible.
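
    The remedy the authors argue for, moving responsibility for external data delivery from the model to the kernel, can be sketched in a few lines. The code below is purely our illustration and does not show GenSim's actual interfaces: the model only enqueues telemetry, and the kernel drains the queue at a rate it can afford, dropping stale packets rather than missing deadlines. The destination address and per-step budget are hypothetical.

```python
# A minimal sketch (not GenSim's API) of kernel-managed visualization traffic:
# the model enqueues, the kernel decides when and how much to send over UDP.
import socket, time
from collections import deque

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
DEST = ("127.0.0.1", 5005)          # a hypothetical Celestia-style viewer endpoint
outbox = deque(maxlen=100)          # model side: enqueue only, never send

def model_step(t):
    outbox.append(f"state t={t:.2f}".encode())   # model publishes, kernel delivers

def kernel_loop(steps, budget_per_step=2):
    for step in range(steps):
        model_step(step * 0.1)
        # The kernel drains at most `budget_per_step` packets per slice; the
        # bounded deque silently drops the oldest data when the viewer lags,
        # so the simulation deadline is never sacrificed to visualization.
        for _ in range(min(budget_per_step, len(outbox))):
            sock.sendto(outbox.popleft(), DEST)
        time.sleep(0.01)            # stand-in for the kernel's scheduling slice

kernel_loop(50)
```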

  12. Simulating Issue Networks in Small Classes using the World Wide Web.

    ERIC Educational Resources Information Center

    Josefson, Jim; Casey, Kelly

    2000-01-01

    Provides background information on simulations and active learning. Discusses the use of simulations in political science courses. Describes a simulation exercise where students performed specific institutional role playing, simulating the workings of a single congressional issue network, based on the reauthorization of the Endangered Species Act.…

  13. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or D...

  14. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or D...

  15. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or D...

  16. Do Simulations Enhance Student Learning? An Empirical Evaluation of an IR Simulation

    ERIC Educational Resources Information Center

    Shellman, Stephen M.; Turan, Kursad

    2006-01-01

    There is a nascent literature on the question of whether active learning methods, and in particular simulation methods, enhance student learning. In this article, the authors evaluate the utility of an international relations simulation in enhancing learning objectives. Student evaluations provide evidence that the simulation process enhances…

  17. 14 CFR 121.921 - Training devices and simulators.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... devices and simulators. (a) Each flight training device or airplane simulator that will be used in an AQP... device or flight simulator qualification level: (1) Required evaluation of individual or crew proficiency... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Training devices and simulators. 121.921...

  18. 14 CFR 121.921 - Training devices and simulators.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... devices and simulators. (a) Each flight training device or airplane simulator that will be used in an AQP... device or flight simulator qualification level: (1) Required evaluation of individual or crew proficiency... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Training devices and simulators. 121.921...

  19. 14 CFR 121.921 - Training devices and simulators.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... devices and simulators. (a) Each flight training device or airplane simulator that will be used in an AQP... device or flight simulator qualification level: (1) Required evaluation of individual or crew proficiency... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Training devices and simulators. 121.921...

  20. 14 CFR 121.921 - Training devices and simulators.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... devices and simulators. (a) Each flight training device or airplane simulator that will be used in an AQP... device or flight simulator qualification level: (1) Required evaluation of individual or crew proficiency... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Training devices and simulators. 121.921...

  1. An Investigation of Computer-based Simulations for School Crises Management.

    ERIC Educational Resources Information Center

    Degnan, Edward; Bozeman, William

    2001-01-01

    Describes development of a computer-based simulation program for training school personnel in crisis management. Addresses the data collection and analysis involved in developing a simulated event, the systems requirements for simulation, and a case study of application and use of the completed simulation. (Contains 21 references.) (Authors/PKP)

  2. Simulation, Gaming, and Conventional Instruction: An Experimental Comparison.

    ERIC Educational Resources Information Center

    Fennessey, Gail M.; And Others

    An environmental problems unit was organized to be taught with three approaches. One approach contained two simulation exercises, one contained a simulation game and a simulation exercise, and one contained no simulations. These approaches were compared for their effectiveness for teaching facts and relationships and for producing favorable…

  3. Incorporating Haptic Feedback in Simulation for Learning Physics

    ERIC Educational Resources Information Center

    Han, Insook; Black, John B.

    2011-01-01

    The purpose of this study was to investigate the effectiveness of a haptic augmented simulation in learning physics. The results indicate that haptic augmented simulations, both the force and kinesthetic and the purely kinesthetic simulations, were more effective than the equivalent non-haptic simulation in providing perceptual experiences and…

  4. Optical simulations for experimental networks: lessons from MONET

    NASA Astrophysics Data System (ADS)

    Richards, Dwight H.; Jackel, Janet L.; Goodman, Matthew S.; Roudas, Ioannis; Wagner, Richard E.; Antoniades, Neophytos

    1999-08-01

    We have used optical simulations as a means of setting component requirements, assessing component compatibility, and designing experiments in the MONET (Multiwavelength Optical Networking) Project. This paper reviews the simulation method, gives some examples of the types of simulations that have been performed, and discusses the validation of the simulations.

  5. A Simulation Game for the Study of State Policies.

    ERIC Educational Resources Information Center

    Enzer, Selwyn; And Others

    A simulated planning conference used both informed experts (simulating state planners and societal groups) and "Monte Carlo" procedures (simulating random events) to identify some possible futures for the state of Connecticut and to examine rational planning behavior patterns. In the simulation the participants were divided into two…

  6. Two Applications of Simulation in the Educational Environment. Tech Memo.

    ERIC Educational Resources Information Center

    Thomas, David B.

    Two educational computer simulations are described in this paper. One of the simulations is STATSIM, a series of exercises applicable to statistical instruction. The content of the other simulation is comprised of mathematical learning models. Student involvement, the interactive nature of the simulations, and terminal display of materials are…

  7. Combining Interactive Thermodynamics Simulations with Screencasts and Conceptests

    ERIC Educational Resources Information Center

    Falconer, John L.

    2016-01-01

    More than 40 interactive "Mathematica" simulations were prepared for chemical engineering thermodynamics, screencasts were prepared that explain how to use each simulation, and more than 100 ConcepTests were prepared that utilize the simulations. They are located on www.LearnChemE.com. The purposes of these simulations are to clarify…

  8. Problem-Solving in the Pre-Clinical Curriculum: The Uses of Computer Simulations.

    ERIC Educational Resources Information Center

    Michael, Joel A.; Rovick, Allen A.

    1986-01-01

    Promotes the use of computer-based simulations in the pre-clinical medical curriculum as a means of providing students with opportunities for problem solving. Describes simple simulations of skeletal muscle loads, complex simulations of major organ systems and comprehensive simulation models of the entire human body. (TW)

  9. Exploring Simulation Utilization and Simulation Evaluation Practices and Approaches in Undergraduate Nursing Education

    ERIC Educational Resources Information Center

    Zitzelsberger, Hilde; Coffey, Sue; Graham, Leslie; Papaconstantinou, Efrosini; Anyinam, Charles

    2017-01-01

    Simulation-based learning (SBL) is rapidly becoming one of the most significant teaching-learning-evaluation strategies available in undergraduate nursing education. While there is indication within the literature and anecdotally about the benefits of simulation, abundant and strong evidence that supports the effectiveness of simulation for…

  10. A Unique Software System For Simulation-to-Flight Research

    NASA Technical Reports Server (NTRS)

    Chung, Victoria I.; Hutchinson, Brian K.

    2001-01-01

    "Simulation-to-Flight" is a research development concept to reduce costs and increase testing efficiency of future major aeronautical research efforts at NASA. The simulation-to-flight concept is achieved by using common software and hardware, procedures, and processes for both piloted-simulation and flight testing. This concept was applied to the design and development of two full-size transport simulators, a research system installed on a NASA B-757 airplane, and two supporting laboratories. This paper describes the software system that supports the simulation-to-flight facilities. Examples of various simulation-to-flight experimental applications were also provided.

  11. Free-energy analyses of a proton transfer reaction by simulated-tempering umbrella sampling and first-principles molecular dynamics simulations.

    PubMed

    Mori, Yoshiharu; Okamoto, Yuko

    2013-02-01

    A simulated tempering method, which is referred to as simulated-tempering umbrella sampling, for calculating the free energy of chemical reactions is proposed. First principles molecular dynamics simulations with this simulated tempering were performed to study the intramolecular proton transfer reaction of malonaldehyde in an aqueous solution. Conformational sampling in reaction coordinate space can be easily enhanced with this method, and the free energy along a reaction coordinate can be calculated accurately. Moreover, the simulated-tempering umbrella sampling provides trajectory data more efficiently than the conventional umbrella sampling method.
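
    In simulated-tempering umbrella sampling, the umbrella-window index becomes a dynamical variable updated by Metropolis moves alongside the coordinates, so a single walker migrates across the reaction coordinate. The sketch below is a 1D double-well toy of our own, with all parameters illustrative and the per-window weight factors omitted; the paper's system (malonaldehyde in water, first-principles forces) is far richer.

```python
# A minimal sketch of simulated-tempering umbrella sampling on a double well:
# Metropolis moves in x are interleaved with Metropolis moves in the window index.
import math, random

random.seed(2)
U = lambda x: (x**2 - 1.0)**2                     # double well, minima at x = +/-1
centers = [-1.0 + 0.25 * i for i in range(9)]     # umbrella window centers
k_umb, beta = 10.0, 4.0                           # umbrella stiffness, inverse temp.

def energy(x, m):
    return U(x) + 0.5 * k_umb * (x - centers[m])**2

def accept(dE):
    return dE <= 0 or random.random() < math.exp(-beta * dE)

x, m = -1.0, 0
visits = [0] * len(centers)
for step in range(200_000):
    x_new = x + random.uniform(-0.2, 0.2)         # trial move in the coordinate
    if accept(energy(x_new, m) - energy(x, m)):
        x = x_new
    if step % 10 == 0:                            # trial move in the window index
        m_new = m + random.choice((-1, 1))
        if 0 <= m_new < len(centers) and accept(energy(x, m_new) - energy(x, m)):
            m = m_new
    visits[m] += 1

# A production run would add per-window weight factors so all windows are
# visited uniformly; they are omitted here for brevity.
print([round(v / sum(visits), 2) for v in visits])
```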

  12. Variance in binary stellar population synthesis

    NASA Astrophysics Data System (ADS)

    Breivik, Katelyn; Larson, Shane L.

    2016-03-01

    In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations in less than a week, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
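
    The rapid technique amounts to vectorized draws from assumed initial distributions followed by cheap selection cuts, so each population costs seconds rather than CPU-days. The heavily simplified sketch below is our own toy; the distributions and the "compact" cut are illustrative, not the paper's binary-evolution model.

```python
# A minimal sketch of rapid Monte Carlo population synthesis (toy distributions).
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
m1 = (rng.pareto(1.35, N) + 1.0) * 0.8         # primary mass [Msun], IMF-like power law
q = rng.uniform(0.1, 1.0, N)                   # mass ratio m2/m1
log_a = rng.uniform(-1.0, 4.0, N)              # log10 separation [Rsun], flat in log

compact = (m1 > 8.0) & (m1 * q > 1.4) & (10**log_a < 10.0)   # toy selection cut
print(f"{compact.sum()} of {N} systems pass the toy compact-binary cut")
```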

  13. Studying Variance in the Galactic Ultra-compact Binary Population

    NASA Astrophysics Data System (ADS)

    Larson, Shane L.; Breivik, Katelyn

    2017-01-01

    In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations on week-long timescales, thus allowing a full exploration of the variance associated with a binary stellar evolution model.

  14. Simulation as a vehicle for enhancing collaborative practice models.

    PubMed

    Jeffries, Pamela R; McNelis, Angela M; Wheeler, Corinne A

    2008-12-01

    Clinical simulation used in a collaborative practice approach is a powerful tool to prepare health care providers for shared responsibility for patient care. Clinical simulations are being used increasingly in professional curricula to prepare providers for quality practice. Little is known, however, about how these simulations can be used to foster collaborative practice across disciplines. This article provides an overview of what simulation is, what collaborative practice models are, and how to set up a model using simulations. An example of a collaborative practice model is presented, and nursing implications of using a collaborative practice model in simulations are discussed.

  15. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  16. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The discussion covers the Qualitative Simulation Tool (QST), an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and the integration of QST and SP with the more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  17. A New Simulation Framework for Autonomy in Robotic Missions

    NASA Technical Reports Server (NTRS)

    Flueckiger, Lorenzo; Neukom, Christian

    2003-01-01

    Autonomy is a key factor in remote robotic exploration, and there is significant activity addressing the application of autonomy to remote robots. It has become increasingly important to have simulation tools available to test autonomy algorithms. While industrial robotics benefits from a variety of high-quality simulation tools, researchers developing autonomous software are still dependent primarily on block-world simulations. The Mission Simulation Facility (MSF) project addresses this shortcoming with a simulation toolkit that will enable developers of autonomous control systems to test their system's performance against a set of integrated, standardized simulations of NASA mission scenarios. MSF provides a distributed architecture that connects the autonomous system to a set of simulated components replacing the robot hardware and its environment.

  18. Large-Eddy Simulation of Wind-Plant Aerodynamics: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Churchfield, M. J.; Lee, S.; Moriarty, P. J.

    In this work, we present results of a large-eddy simulation of the 48 multi-megawatt turbines composing the Lillgrund wind plant. Turbulent inflow wind is created by performing an atmospheric boundary layer precursor simulation and turbines are modeled using a rotating, variable-speed actuator line representation. The motivation for this work is that few others have done wind plant large-eddy simulations with a substantial number of turbines, and the methods for carrying out the simulations are varied. We wish to draw upon the strengths of the existing simulations and our growing atmospheric large-eddy simulation capability to create a sound methodology for performing this type of simulation. We have used the OpenFOAM CFD toolbox to create our solver.

  19. Summarizing Simulation Results using Causally-relevant States

    PubMed Central

    Parikh, Nidhi; Marathe, Madhav; Swarup, Samarth

    2016-01-01

    As increasingly large-scale multiagent simulations are being implemented, new methods are becoming necessary to make sense of the results of these simulations. Even concisely summarizing the results of a given simulation run is a challenge. Here we pose this as the problem of simulation summarization: how to extract the causally-relevant descriptions of the trajectories of the agents in the simulation. We present a simple algorithm to compress agent trajectories through state space by identifying the state transitions which are relevant to determining the distribution of outcomes at the end of the simulation. We present a toy example to illustrate the working of the algorithm, and then apply it to a complex simulation of a major disaster in an urban area. PMID:28042620
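
    To make the transition-scoring idea concrete, here is a minimal Python sketch that compresses trajectories by keeping only transitions whose conditional outcome distribution diverges from the overall outcome distribution. The KL-divergence score, the threshold, and the toy evacuation data are illustrative choices, not the authors' exact algorithm.

    from collections import Counter, defaultdict
    import math

    def outcome_distribution(outcomes):
        counts = Counter(outcomes)
        total = sum(counts.values())
        return {k: v / total for k, v in counts.items()}

    def kl_divergence(p, q, eps=1e-9):
        # KL(p || q) over the union of outcome labels, with smoothing.
        keys = set(p) | set(q)
        return sum(p.get(k, eps) * math.log(p.get(k, eps) / q.get(k, eps)) for k in keys)

    def compress(trajectories, outcomes, threshold=0.1):
        """Keep only transitions whose conditional outcome distribution
        diverges from the baseline by more than `threshold`."""
        baseline = outcome_distribution(outcomes)
        by_transition = defaultdict(list)
        for traj, outcome in zip(trajectories, outcomes):
            for a, b in zip(traj, traj[1:]):
                by_transition[(a, b)].append(outcome)
        relevant = {t for t, outs in by_transition.items()
                    if kl_divergence(outcome_distribution(outs), baseline) > threshold}
        # Compress each trajectory down to its outcome-relevant transitions.
        return [[t for t in zip(traj, traj[1:]) if t in relevant] for traj in trajectories]

    # Toy data: agents move through states and end 'safe' or 'injured'.
    trajs = [["home", "road", "shelter"], ["home", "road", "blast"], ["home", "shelter", "shelter"]]
    outs = ["safe", "injured", "safe"]
    print(compress(trajs, outs))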

  20. Molecular dynamics simulations of amphiphilic graft copolymer molecules at a water/air interface.

    PubMed

    Anderson, Philip M; Wilson, Mark R

    2004-11-01

    Fully atomistic molecular dynamics simulations of amphiphilic graft copolymer molecules have been performed at a range of surface concentrations at a water/air interface. These simulations are compared to experimental results from a corresponding system over a similar range of surface concentrations. Neutron reflectivity data calculated from the simulation trajectories agrees well with experimentally acquired profiles. In particular, excellent agreement in neutron reflectivity is found for lower surface concentration simulations. A simulation of a poly(ethylene oxide) (PEO) chain in aqueous solution has also been performed. This simulation allows the conformational behavior of the free PEO chain and those tethered to the interface in the previous simulations to be compared. (c) 2004 American Institute of Physics.

  1. Statistical Emulator for Expensive Classification Simulators

    NASA Technical Reports Server (NTRS)

    Ross, Jerret; Samareh, Jamshid A.

    2016-01-01

    Expensive simulators prevent meaningful analysis from being performed on the phenomena they model. To get around this problem, the concept of using a statistical emulator as a surrogate representation of the simulator was introduced in the 1980s. Simulators have since become more and more complex and, as a result, a single run can take days, weeks, or even months. Many new techniques, termed criteria, have been introduced that sequentially select the next best (most informative to the emulator) point to run on the simulator. These criteria methods allow for the creation of an emulator with only a small number of simulator runs. We follow and extend this framework to expensive classification simulators.
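
    As a schematic of the sequential-design loop described above, the following Python sketch (assuming scikit-learn) trains a Gaussian-process emulator for a binary classification simulator and repeatedly queries the candidate whose predicted class probability is closest to 0.5. The simulator stub and the maximum-uncertainty criterion are illustrative assumptions, not the authors' specific method.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessClassifier

    def expensive_simulator(x):
        # Stand-in for a run that could take days: a hidden circular class boundary.
        return int(x[0] ** 2 + x[1] ** 2 > 1.0)

    rng = np.random.default_rng(0)
    # Small initial design; the two fixed points guarantee both classes appear.
    X = np.vstack([[0.0, 0.0], [1.5, 1.5], rng.uniform(-2, 2, size=(8, 2))])
    y = np.array([expensive_simulator(x) for x in X])

    candidates = rng.uniform(-2, 2, size=(500, 2))   # cheap-to-evaluate pool
    for _ in range(20):                              # sequential design loop
        emulator = GaussianProcessClassifier().fit(X, y)
        p = emulator.predict_proba(candidates)[:, 1]
        i = int(np.argmin(np.abs(p - 0.5)))          # most uncertain candidate
        X = np.vstack([X, candidates[i]])
        y = np.append(y, expensive_simulator(candidates[i]))
    print(f"emulator built from only {len(X)} simulator runs")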

  2. Simulant Development for LAWPS Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russell, Renee L.; Schonewill, Philip P.; Burns, Carolyn A.

    2017-05-23

    This report describes simulant development work that was conducted to support the technology maturation of the LAWPS facility. Desired simulant physical properties (density, viscosity, solids concentration, solid particle size), sodium concentrations, and general anion identifications were provided by WRPS. The simulant recipes, particularly a “nominal” 5.6M Na simulant, are intended to be tested at several scales, ranging from bench-scale (500 mL) to full-scale. Each simulant formulation was selected to be chemically representative of the waste streams anticipated to be fed to the LAWPS system, and used the current version of the LAWPS waste specification as a formulation basis. After simulant development iterations, four simulants of varying sodium concentration (5.6M, 6.0M, 4.0M, and 8.0M) were prepared and characterized. The formulation basis, development testing, and final simulant recipes and characterization data for these four simulants are presented in this report.

  3. The role of simulation training in anesthesiology resident education.

    PubMed

    Yunoki, Kazuma; Sakai, Tetsuro

    2018-06-01

    An increasing number of reports indicate the efficacy of simulation training in anesthesiology resident education. Simulation education helps learners to acquire clinical skills in a safe learning environment without putting real patients at risk. This useful tool allows anesthesiology residents to obtain medical knowledge and both technical and non-technical skills. For faculty members, simulation-based settings provide the valuable opportunity to evaluate residents' performance in scenarios including airway management and regional, cardiac, and obstetric anesthesiology. However, it is still unclear what types of simulators should be used or how to incorporate simulation education effectively into education curriculums. Whether simulation training improves patient outcomes has not been fully determined. The goal of this review is to provide an overview of the status of simulation in anesthesiology resident education, encourage more anesthesiologists to get involved in simulation education to propagate its influence, and stimulate future research directed toward improving resident education and patient outcomes.

  4. Ignaz Semmelweis redux?

    PubMed

    Raemer, Daniel B

    2014-06-01

    The story of Ignaz Semmelweis suggests a lesson to beware of unintended consequences, especially with in situ simulation. In situ simulation offers many important advantages over center-based simulation, such as learning about the real setting, putting participants at ease, saving travel time, minimizing space requirements, and involving patients and families. Some substantial disadvantages include frequent distractions, lack of privacy, logistics of setup, availability of technology, and supply costs. Importantly, in situ simulation amplifies some of the safety hazards of simulation itself, including maintaining control of simulated medications and equipment, limiting the use of valuable hospital resources, preventing incorrect learning from simulation shortcuts, and profoundly upsetting patients and their families. Mitigating these hazards by labeling effectively, publishing policies and procedures, securing simulation supplies and equipment, educating simulation staff, and informing participants of the risks are all methods that may lessen the potential for an accident. Each requires a serious effort of analysis, design, and implementation.

  5. Accomplishments and challenges of surgical simulation.

    PubMed

    Satava, R M

    2001-03-01

    For nearly a decade, advanced computer technologies have created extraordinary educational tools using three-dimensional (3D) visualization and virtual reality. Pioneering efforts in surgical simulation with these tools have resulted in a first generation of simulators for surgical technical skills. Accomplishments include simulations with 3D models of anatomy for practice of surgical tasks, initial assessment of student performance in technical skills, and awareness by professional societies of potential in surgical education and certification. However, enormous challenges remain, which include improvement of technical fidelity, standardization of accurate metrics for performance evaluation, integration of simulators into a robust educational curriculum, stringent evaluation of simulators for effectiveness and value added to surgical training, determination of simulation application to certification of surgical technical skills, and a business model to implement and disseminate simulation successfully throughout the medical education community. This review looks at the historical progress of surgical simulators, their accomplishments, and the challenges that remain.

  6. Building a Community of Practice for Researchers: The International Network for Simulation-Based Pediatric Innovation, Research and Education.

    PubMed

    Cheng, Adam; Auerbach, Marc; Calhoun, Aaron; Mackinnon, Ralph; Chang, Todd P; Nadkarni, Vinay; Hunt, Elizabeth A; Duval-Arnould, Jordan; Peiris, Nicola; Kessler, David

    2018-06-01

    The scope and breadth of simulation-based research is growing rapidly; however, few mechanisms exist for conducting multicenter, collaborative research. Failure to foster collaborative research efforts is a critical gap that lies in the path of advancing healthcare simulation. The 2017 Research Summit hosted by the Society for Simulation in Healthcare highlighted how simulation-based research networks can produce studies that positively impact the delivery of healthcare. In 2011, the International Network for Simulation-based Pediatric Innovation, Research and Education (INSPIRE) was formed to facilitate multicenter, collaborative simulation-based research with the aim of developing a community of practice for simulation researchers. Since its formation, the network has successfully completed and published numerous collaborative research projects. In this article, we describe INSPIRE's history, structure, and internal processes with the goal of highlighting the community of practice model for other groups seeking to form a simulation-based research network.

  7. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  8. RANS simulation of cavitation and hull pressure fluctuation for marine propeller operating behind-hull condition

    NASA Astrophysics Data System (ADS)

    Paik, Kwang-Jun; Park, Hyung-Gil; Seo, Jongsoo

    2013-12-01

    Simulations of cavitation flow and hull pressure fluctuation for a marine propeller operating behind a hull using the unsteady Reynolds-averaged Navier-Stokes equations (RANS) are presented. A full hull body submerged under the free surface is modeled in the computational domain to simulate directly the wake field of the ship at the propeller plane. Simulations are performed in design and ballast draught conditions to study the effect of cavitation number, and two propellers with slightly different geometry are simulated to validate the ability of the numerical simulation to discriminate between them. All simulations are performed using the commercial CFD software FLUENT. Cavitation patterns of the simulations show good agreement with the experimental results carried out in the Samsung Cavitation Tunnel (SCAT). The simulation results for the hull pressure fluctuation induced by the propeller are also compared with the experimental results, showing good agreement in tendency and amplitude, especially for the first blade frequency.

  9. Designing a SCADA system simulator for fast breeder reactor

    NASA Astrophysics Data System (ADS)

    Nugraha, E.; Abdullah, A. G.; Hakim, D. L.

    2016-04-01

    A SCADA (Supervisory Control and Data Acquisition) system simulator is Human Machine Interface-based software that is able to visualize the process of a plant. This study describes the design of a SCADA system simulator that aims to help the operator in monitoring, controlling, handling alarms, and accessing historical data and historical trends in a Nuclear Power Plant (NPP) of the Fast Breeder Reactor (FBR) type. This research simulated the Kalpakkam FBR NPP in India. The simulator was developed using Wonderware InTouch 10 software and is equipped with a main menu, plant overview, area graphics, control display, set point display, alarm system, real-time trending, historical trending, and a security system. This simulator can properly simulate the principle of energy flow and the energy conversion process in an FBR NPP, and it can be used as a training medium for prospective FBR NPP operators.

  10. Tissue simulating gel for medical research

    NASA Technical Reports Server (NTRS)

    Companion, John A. (Inventor)

    1991-01-01

    A tissue simulating gel and a method for preparing the tissue simulating gel are disclosed. The tissue simulating gel is prepared by a process using water, gelatin, ethylene glycol, and a cross-linking agent. In order to closely approximate the characteristics of the type of tissue being simulated, other material has been added to change the electrical, sound conducting, and wave scattering properties of the tissue simulating gel. The result of the entire process is a formulation that will not melt at the elevated temperatures involved in hyperthermia medical research. Furthermore, the tissue simulating gel will not support mold or bacterial growth, is of a sufficient mechanical strength to maintain a desired shape without a supporting shell, and is non-hardening and non-drying. Substances have been injected into the tissue simulating gel prior to the setting-up thereof just as they could be injected into actual tissue, and the tissue simulating gel is translucent so as to permit visual inspection of its interior. A polyurethane spray often used for coating circuit boards can be applied to the surface of the tissue simulating gel to give a texture similar to human skin, making the tissue simulating gel easier to handle and contributing to its longevity.

  11. Mouse Acetylcholinesterase Unliganded and in Complex with Huperzine A: A Comparison of Molecular Dynamics Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tara, Sylvia; Straatsma, TP; Mccammon, Andy

    1999-06-01

    A 1 ns molecular dynamics simulation of unliganded mouse acetylcholinesterase (AChE) is compared to a previous simulation of mouse AChE complexed with Huperzine A (HupA). Several common features are observed. In both simulations, the active site gorge fluctuates in size during the 1 ns trajectory, and is completely pinched off several times. Many of the residues in the gorge that formed hydrogen bonds with HupA in the simulation of the complex, now form hydrogen bonds with other protein residues and water molecules in the gorge. The opening of a "backdoor" entrance to the active site that was found in the simulation of the complex is also observed in the unliganded simulation. Differences between the two simulations include overall lower structural RMS deviations for residues in the gorge in the unliganded simulation, a smaller diameter of the gorge in the absence of HupA, and the disappearance of a side channel that was frequently present in the liganded simulation. The differences between the two simulations can be attributed, in part, to the interaction of AChE with HupA.

  12. Synchronization Algorithms for Co-Simulation of Power Grid and Communication Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ciraci, Selim; Daily, Jeffrey A.; Agarwal, Khushbu

    2014-09-11

    The ongoing modernization of power grids consists of integrating them with communication networks in order to achieve robust and resilient control of grid operations. To understand the operation of the new smart grid, one approach is to use simulation software. Unfortunately, current power grid simulators at best utilize inadequate approximations to simulate communication networks, if at all. Cooperative simulation of specialized power grid and communication network simulators promises to more accurately reproduce the interactions of real smart grid deployments. However, co-simulation is a challenging problem. A co-simulation must manage the exchange of information, including the synchronization of simulator clocks, between all simulators while maintaining adequate computational performance. This paper describes two new conservative algorithms for reducing the overhead of time synchronization, namely Active Set Conservative and Reactive Conservative. We provide a detailed analysis of their performance characteristics with respect to the current state of the art, including both conservative and optimistic synchronization algorithms. In addition, we provide guidelines for selecting the appropriate synchronization algorithm based on the requirements of the co-simulation. The newly proposed algorithms are shown to achieve as much as 14% and 63% improvement, respectively, over the existing conservative algorithm.
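
    For orientation, here is a minimal Python sketch of the baseline conservative scheme that such algorithms improve upon: each simulator advances only to the peer's clock plus the peer's declared lookahead, so no incoming message can ever arrive in its past. The two-simulator loop, event times, and lookahead values are illustrative; the paper's Active Set Conservative and Reactive Conservative algorithms refine this basic idea.

    import heapq

    class Sim:
        def __init__(self, name, lookahead):
            self.name, self.lookahead, self.clock = name, lookahead, 0.0
            self.events = []          # min-heap of (timestamp, payload)

        def next_event_time(self):
            return self.events[0][0] if self.events else float("inf")

        def advance_to(self, t_limit):
            # Process every event that is provably safe (at or before t_limit).
            while self.events and self.events[0][0] <= t_limit:
                t, payload = heapq.heappop(self.events)
                self.clock = t
                print(f"{self.name} @ {t:.2f}: {payload}")
            self.clock = max(self.clock, t_limit)

    grid, net = Sim("grid", lookahead=0.5), Sim("net", lookahead=0.1)
    heapq.heappush(grid.events, (0.3, "voltage step"))
    heapq.heappush(net.events, (0.4, "deliver control message"))
    heapq.heappush(grid.events, (0.9, "breaker trip"))

    while grid.events or net.events:
        # Each simulator may only advance to the peer's clock plus the peer's
        # lookahead: the peer guarantees it will send no message earlier than
        # that bound, so processing local events up to it cannot violate causality.
        grid_bound = net.clock + net.lookahead
        net_bound = grid.clock + grid.lookahead
        grid.advance_to(min(grid.next_event_time(), grid_bound))
        net.advance_to(min(net.next_event_time(), net_bound))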

  13. Next Generation Simulation Framework for Robotic and Human Space Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Jonathan M.; Balaram, J.; Jain, Abhinandan; Kuo, Calvin; Lim, Christopher; Myint, Steven

    2012-01-01

    The Dartslab team at NASA's Jet Propulsion Laboratory (JPL) has a long history of developing physics-based simulations based on the Darts/Dshell simulation framework that have been used to simulate many planetary robotic missions, such as the Cassini spacecraft and the rovers that are currently driving on Mars. Recent collaboration efforts between the Dartslab team at JPL and the Mission Operations Directorate (MOD) at NASA Johnson Space Center (JSC) have led to significant enhancements to the Dartslab DSENDS (Dynamics Simulator for Entry, Descent and Surface landing) software framework. The new version of DSENDS is now being used for new planetary mission simulations at JPL. JSC is using DSENDS as the foundation for a suite of software known as COMPASS (Core Operations, Mission Planning, and Analysis Spacecraft Simulation) that is the basis for their new human space mission simulations and analysis. In this paper, we will describe the collaborative process with the JPL Dartslab and the JSC MOD team that resulted in the redesign and enhancement of the DSENDS software. We will outline the improvements in DSENDS that simplify creation of new high-fidelity robotic/spacecraft simulations. We will illustrate how DSENDS simulations are assembled and show results from several mission simulations.

  14. Simulation and Digitization of a Gas Electron Multiplier Detector Using Geant4 and an Object-Oriented Digitization Program

    NASA Astrophysics Data System (ADS)

    McMullen, Timothy; Liyanage, Nilanga; Xiong, Weizhi; Zhao, Zhiwen

    2017-01-01

    Our research has focused on simulating the response of a Gas Electron Multiplier (GEM) detector using computational methods. GEM detectors provide a cost effective solution for radiation detection in high rate environments. A detailed simulation of GEM detector response to radiation is essential for the successful adaption of these detectors to different applications. Using Geant4 Monte Carlo (GEMC), a wrapper around Geant4 which has been successfully used to simulate the Solenoidal Large Intensity Device (SoLID) at Jefferson Lab, we are developing a simulation of a GEM chamber similar to the detectors currently used in our lab. We are also refining an object-oriented digitization program, which translates energy deposition information from GEMC into electronic readout which resembles the readout from our physical detectors. We have run the simulation with beta particles produced by the simulated decay of a 90Sr source, as well as with a simulated bremsstrahlung spectrum. Comparing the simulation data with real GEM data taken under similar conditions is used to refine the simulation parameters. Comparisons between results from the simulations and results from detector tests will be presented.

  15. Color visual simulation applications at the Defense Mapping Agency

    NASA Astrophysics Data System (ADS)

    Simley, J. D.

    1984-09-01

    The Defense Mapping Agency (DMA) produces the Digital Landmass System data base to provide culture and terrain data in support of numerous aircraft simulators. In order to conduct data base and simulation quality control and requirements analysis, DMA has developed the Sensor Image Simulator which can rapidly generate visual and radar static scene digital simulations. The use of color in visual simulation allows the clear portrayal of both landcover and terrain data, whereas the initial black and white capabilities were restricted in this role and thus found limited use. Color visual simulation has many uses in analysis to help determine the applicability of current and prototype data structures to better meet user requirements. Color visual simulation is also significant in quality control since anomalies can be more easily detected in natural appearing forms of the data. The realism and efficiency possible with advanced processing and display technology, along with accurate data, make color visual simulation a highly effective medium in the presentation of geographic information. As a result, digital visual simulation is finding increased potential as a special purpose cartographic product. These applications are discussed and related simulation examples are presented.

  16. Tissue simulating gel for medical research

    NASA Technical Reports Server (NTRS)

    Companion, John A. (Inventor)

    1989-01-01

    A tissue simulating gel and a method for preparing the tissue simulating gel are disclosed. The tissue simulating gel is prepared by a process using water, gelatin, ethylene glycol, and a cross-linking agent. In order to closely approximate the characteristics of the type of tissue being simulated, other material has been added to change the electrical, sound conducting, and wave scattering properties of the tissue simulating gel. The result of the entire process is a formulation that will not melt at the elevated temperatures involved in hyperthermia medical research. Furthermore, the tissue simulating gel will not support mold or bacterial growth, is of a sufficient mechanical strength to maintain a desired shape without a supporting shell, and is non-hardening and non-drying. Substances were injected into the tissue simulating gel prior to the setting-up thereof just as they could be injected into actual tissue, and the tissue simulating gel is translucent so as to permit visual inspection of its interior. A polyurethane spray often used for coating circuit boards can be applied to the surface of the tissue simulating gel to give a texture similar to human skin, making the tissue simulating gel easier to handle and contributing to its longevity.

  17. The future vision of simulation in health care

    PubMed Central

    Gaba, D

    2004-01-01

    Simulation is a technique—not a technology—to replace or amplify real experiences with guided experiences that evoke or replicate substantial aspects of the real world in a fully interactive manner. The diverse applications of simulation in health care can be categorised by 11 dimensions: aims and purposes of the simulation activity; unit of participation; experience level of participants; health care domain; professional discipline of participants; type of knowledge, skill, attitudes, or behaviours addressed; the simulated patient's age; technology applicable or required; site of simulation; extent of direct participation; and method of feedback used. Using simulation to improve safety will require full integration of its applications into the routine structures and practices of health care. The costs and benefits of simulation are difficult to determine, especially for the most challenging applications, where long term use may be required. Various driving forces and implementation mechanisms can be expected to propel simulation forward, including professional societies, liability insurers, health care payers, and ultimately the public. The future of simulation in health care depends on the commitment and ingenuity of the health care simulation community to see that improved patient safety using this tool becomes a reality. PMID:15465951

  18. Measurement of the Solar Absorptance and Thermal Emittance of Lunar Simulants

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Street, Kenneth W.; Gustafson, Robert J.

    2010-01-01

    The first comparative study of the reflectance spectra of lunar simulants is presented. All of the simulants except one had a wavelength-dependent reflectivity, rho(lambda), near 0.10 over the wavelength range of 8 to 25 microns, so they are highly emitting at room temperature and lower. The 300 K emittance, epsilon, of all the lunar simulants except one ranged from 0.884 to 0.906. The 300 K epsilon of JSC Mars-1 simulant was 0.927. There was considerably more variation in the lunar simulant reflectance in the solar spectral range (250 to 2500 nm) than in the thermal infrared. Larger particle size simulants reflected much less than those with smaller particle size. As expected, the lunar highlands simulants were more reflective in this wavelength range than the lunar mare simulants. The integrated solar absorptance, alpha, of the simulants ranged from 0.413 to 0.817 for those with smaller particles, and 0.669 to 0.906 for those with larger particles. Although spectral differences were observed, the alpha for the simulants appears to be similar to that of lunar soils (0.65 to 0.88). These data are now available to be used in modeling the effects of dust on thermal control surfaces.

  19. Autoshaping and automaintenance: a neural-network approach.

    PubMed

    Burgos, José E

    2007-07-01

    This article presents an interpretation of autoshaping, and positive and negative automaintenance, based on a neural-network model. The model makes no distinction between operant and respondent learning mechanisms, and takes into account knowledge of hippocampal and dopaminergic systems. Four simulations were run, each one using an A-B-A design and four instances of feedforward architectures. In A, networks received a positive contingency between inputs that simulated a conditioned stimulus (CS) and an input that simulated an unconditioned stimulus (US). Responding was simulated as an output activation that was neither elicited by nor required for the US. B was an omission-training procedure. Response directedness was defined as sensory feedback from responding, simulated as a dependence of other inputs on responding. In Simulation 1, the phenomena were simulated with a fully connected architecture and maximally intense response feedback. The other simulations used a partially connected architecture without competition between CS and response feedback. In Simulation 2, a maximally intense feedback resulted in substantial autoshaping and automaintenance. In Simulation 3, eliminating response feedback interfered substantially with autoshaping and automaintenance. In Simulation 4, intermediate autoshaping and automaintenance resulted from an intermediate response feedback. Implications for the operant-respondent distinction and the behavior-neuroscience relation are discussed.

  20. Autoshaping and Automaintenance: A Neural-Network Approach

    PubMed Central

    Burgos, José E

    2007-01-01

    This article presents an interpretation of autoshaping, and positive and negative automaintenance, based on a neural-network model. The model makes no distinction between operant and respondent learning mechanisms, and takes into account knowledge of hippocampal and dopaminergic systems. Four simulations were run, each one using an A-B-A design and four instances of feedforward architectures. In A, networks received a positive contingency between inputs that simulated a conditioned stimulus (CS) and an input that simulated an unconditioned stimulus (US). Responding was simulated as an output activation that was neither elicited by nor required for the US. B was an omission-training procedure. Response directedness was defined as sensory feedback from responding, simulated as a dependence of other inputs on responding. In Simulation 1, the phenomena were simulated with a fully connected architecture and maximally intense response feedback. The other simulations used a partially connected architecture without competition between CS and response feedback. In Simulation 2, a maximally intense feedback resulted in substantial autoshaping and automaintenance. In Simulation 3, eliminating response feedback interfered substantially with autoshaping and automaintenance. In Simulation 4, intermediate autoshaping and automaintenance resulted from an intermediate response feedback. Implications for the operant–respondent distinction and the behavior–neuroscience relation are discussed. PMID:17725055
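
    The sketch below is only a loose, schematic analogue of the A-B-A arrangement: a single sigmoid unit trained by a delta rule acquires responding when the US follows the CS (phase A) and partially loses it under omission training (phase B). Burgos's model is a richer multilayer network informed by hippocampal and dopaminergic systems, so the learning rule, learning rate, and response threshold here are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    w = 0.0           # CS -> response connection weight
    lr = 0.2          # learning rate (illustrative)

    def trial(omission):
        global w
        response = 1.0 / (1.0 + np.exp(-(w - 1.0)))   # output activation to the CS
        responded = rng.random() < response
        # Phase A: the US follows the CS regardless of responding (autoshaping).
        # Phase B: omission training -- responding cancels the US.
        us = 0.0 if (omission and responded) else 1.0
        w += lr * (us - response)                      # delta-rule update
        return response

    for phase, omission in (("A", False), ("B", True), ("A", False)):
        for _ in range(100):
            r = trial(omission)
        print(f"end of phase {phase}: response strength {r:.2f}")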

  1. Measurement of the Solar Absorptance and Thermal Emittance of Lunar Simulants

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Street, Kenneth W.; Gustafson, Robert J.

    2010-01-01

    The first comparative study of the reflectance spectra of lunar simulants is presented. All of the simulants except one had a wavelength-dependent reflectivity, rho(lambda), near 0.10 over the wavelength range of 8 to 25 microns, so they are highly emitting at room temperature and lower. The 300 K emittance, epsilon, of all the lunar simulants except one ranged from 0.884 to 0.906. The 300 K epsilon of JSC Mars-1 simulant was 0.927. There was considerably more variation in the lunar simulant reflectance in the solar spectral range (250 to 2500 nm) than in the thermal infrared. Larger particle size simulants reflected much less than those with smaller particle size. As expected, the lunar highlands simulants were more reflective in this wavelength range than the lunar mare simulants. The alpha of the simulants ranged from 0.413 to 0.817 for those with smaller particles and 0.669 to 0.906 for large particles. Although spectral differences were observed, the total integrated alpha for the simulants appears to be similar to that of lunar soils (0.65 to 0.88). These data are now available to be used in modeling the effects of dust on thermal control surfaces.
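
    For reference, the integrated solar absorptance is the solar-spectrum-weighted average alpha = integral[(1 - rho(lambda)) S(lambda) dlambda] / integral[S(lambda) dlambda], where S(lambda) is the solar spectral irradiance. A minimal Python sketch of that quadrature follows; the reflectance and irradiance tables are invented placeholders, not the paper's measurements.

    import numpy as np

    wavelength_nm = np.linspace(250, 2500, 10)
    reflectance = np.array([0.05, 0.08, 0.12, 0.20, 0.30,
                            0.35, 0.38, 0.40, 0.42, 0.45])   # rho(lambda), made up
    solar_irradiance = np.array([0.1, 0.6, 1.3, 1.8, 1.4,
                                 0.9, 0.5, 0.3, 0.2, 0.1])   # W m^-2 nm^-1, schematic

    # alpha = integral[(1 - rho) * S] / integral[S], by trapezoidal quadrature.
    alpha = (np.trapz((1 - reflectance) * solar_irradiance, wavelength_nm)
             / np.trapz(solar_irradiance, wavelength_nm))
    print(f"integrated solar absorptance alpha = {alpha:.3f}")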

  2. Simulation in Canadian postgraduate emergency medicine training - a national survey.

    PubMed

    Russell, Evan; Hall, Andrew Koch; Hagel, Carly; Petrosoniak, Andrew; Dagnone, Jeffrey Damon; Howes, Daniel

    2018-01-01

    Simulation-based education (SBE) is an important training strategy in emergency medicine (EM) postgraduate programs. This study sought to characterize the use of simulation in FRCPC-EM residency programs across Canada. A national survey was administered to residents and knowledgeable program representatives (PRs) at all Canadian FRCPC-EM programs. Survey question themes included simulation program characteristics, the frequency of resident participation, the location and administration of SBE, institutional barriers, interprofessional involvement, content, assessment strategies, and attitudes about SBE. Resident and PR response rates were 63% (203/321) and 100% (16/16), respectively. Residents reported a median of 20 (range 0-150) hours of annual simulation training, with 52% of residents indicating that the time dedicated to simulation training met their needs. PRs reported the frequency of SBE sessions ranging from weekly to every 6 months, with 15 (94%) programs having an established simulation curriculum. Two (13%) of the programs used simulation for resident assessment, although 15 (94%) of PRs indicated that they would be comfortable with simulation-based assessment. The most common PR-identified barriers to administering simulation were a lack of protected faculty time (75%) and a lack of faculty experience with simulation (56%). Interprofessional involvement in simulation was strongly valued by both residents and PRs. SBE is frequently used by Canadian FRCPC-EM residency programs. However, there exists considerable variability in the structure, frequency, and timing of simulation-based activities. As programs transition to competency-based medical education, national organizations and collaborations should consider the variability in how SBE is administered.

  3. Point-of-care ultrasound education: the increasing role of simulation and multimedia resources.

    PubMed

    Lewiss, Resa E; Hoffmann, Beatrice; Beaulieu, Yanick; Phelan, Mary Beth

    2014-01-01

    This article reviews the current technology, literature, teaching models, and methods associated with simulation-based point-of-care ultrasound training. Patient simulation appears particularly well suited for learning point-of-care ultrasound, which is a required core competency for emergency medicine and other specialties. Work hour limitations have reduced the opportunities for clinical practice, and simulation enables practicing a skill multiple times before it may be used on patients. Ultrasound simulators can be categorized into 2 groups: low and high fidelity. Low-fidelity simulators are usually static simulators, meaning that they have nonchanging anatomic examples for sonographic practice. Advantages are that the model may be reused over time, and some simulators can be homemade. High-fidelity simulators are usually high-tech and frequently consist of many computer-generated cases of virtual sonographic anatomy that can be scanned with a mock probe. This type of equipment is produced commercially and is more expensive. High-fidelity simulators provide students with an active and safe learning environment and make a reproducible standardized assessment of many different ultrasound cases possible. The advantages and disadvantages of using low- versus high-fidelity simulators are reviewed. An additional concept used in simulation-based ultrasound training is blended learning. Blended learning may include face-to-face or online learning often in combination with a learning management system. Increasingly, with simulation and Web-based learning technologies, tools are now available to medical educators for the standardization of both ultrasound skills training and competency assessment.

  4. Bringing good teaching cases "to life": a simulator-based medical education service.

    PubMed

    Gordon, James A; Oriol, Nancy E; Cooper, Jeffrey B

    2004-01-01

    Realistic medical simulation has expanded worldwide over the last decade. Such technology is playing an increasing role in medical education not merely because simulator sessions are enjoyable, but because they can provide an enhanced environment for experiential learning and reflective thought. High-fidelity patient simulators allow students of all levels to "practice" medicine without risk, providing a natural framework for the integration of basic and clinical science in a safe environment. Often described as "flight simulation for doctors," the rationale, utility, and range of medical simulations have been described elsewhere, yet the challenges of integrating this technology into the medical school curriculum have received little attention. The authors report how Harvard Medical School established an on-campus simulator program for students in 2001, building on the work of the Center for Medical Simulation in Boston. As an overarching structure for the process, faculty and residents developed a simulator-based "medical education service"-like any other medical teaching service, but designed exclusively to help students learn on the simulator alongside a clinician-mentor, on demand. Initial evaluations among both preclinical and clinical students suggest that simulation is highly accepted and increasingly demanded. For some learners, simulation may allow complex information to be understood and retained more efficiently than can occur with traditional methods. Moreover, the process outlined here suggests that simulation can be integrated into existing curricula of almost any medical school or teaching hospital in an efficient and cost-effective manner.

  5. Simulation reframed.

    PubMed

    Kneebone, Roger L

    2016-01-01

    Simulation is firmly established as a mainstay of clinical education, and extensive research has demonstrated its value. Current practice uses inanimate simulators (with a range of complexity, sophistication and cost) to address the patient 'as body' and trained actors or lay people (Simulated Patients) to address the patient 'as person'. These approaches are often separate. Healthcare simulation to date has been largely for the training and assessment of clinical 'insiders', simulating current practices. A close coupling with the clinical world restricts access to the facilities and practices of simulation, often excluding patients, families and publics. Yet such perspectives are an essential component of clinical practice. This paper argues that simulation offers opportunities to move outside a clinical 'insider' frame and create connections with other individuals and groups. Simulation becomes a bridge between experts whose worlds do not usually intersect, inviting an exchange of insights around embodied practices (the 'doing' of medicine) without jeopardising the safety of actual patients. Healthcare practice and education take place within a clinical frame that often conceals parallels with other domains of expert practice. Valuable insights emerge by viewing clinical practice not only as the application of medical science but also as performance and craftsmanship. Such connections require a redefinition of simulation. Its essence is not expensive elaborate facilities. Developments such as hybrid, distributed and sequential simulation offer examples of how simulation can combine 'patient as body' with 'patient as person' at relatively low cost, democratising simulation and exerting traction beyond the clinical sphere. The essence of simulation is a purposeful design, based on an active process of selection from an originary world, abstraction of what is criterial and re-presentation in another setting for a particular purpose or audience. This may be done within traditional simulation centres, or outside in local communities, public spaces or arts and performance venues. Simulation has established a central role in clinical education but usually focuses on learning to do things as they are already done. Imaginatively designed, simulation offers untapped potential for deep engagement with patients, publics and experts outside medicine.

  6. Evaluation of cloud-resolving model simulations of midlatitude cirrus with ARM and A-train observations

    DOE PAGES

    Muhlbauer, A.; Ackerman, T. P.; Lawson, R. P.; ...

    2015-07-14

    Cirrus clouds are ubiquitous in the upper troposphere and still constitute one of the largest uncertainties in climate predictions. Our paper evaluates cloud-resolving model (CRM) and cloud system-resolving model (CSRM) simulations of a midlatitude cirrus case with comprehensive observations collected under the auspices of the Atmospheric Radiation Measurements (ARM) program and with spaceborne observations from the National Aeronautics and Space Administration A-train satellites. The CRM simulations are driven with periodic boundary conditions and ARM forcing data, whereas the CSRM simulations are driven by the ERA-Interim product. Vertical profiles of temperature, relative humidity, and wind speeds are reasonably well simulated by the CSRM and CRM, but there are remaining biases in the temperature, wind speeds, and relative humidity, which can be mitigated through nudging the model simulations toward the observed radiosonde profiles. Simulated vertical velocities are underestimated in all simulations except in the CRM simulations with grid spacings of 500 m or finer, which suggests that turbulent vertical air motions in cirrus clouds need to be parameterized in general circulation models and in CSRM simulations with horizontal grid spacings on the order of 1 km. The simulated ice water content and ice number concentrations agree with the observations in the CSRM but are underestimated in the CRM simulations. The underestimation of ice number concentrations is consistent with the overestimation of radar reflectivity in the CRM simulations and suggests that the model produces too many large ice particles, especially toward the cloud base. Simulated cloud profiles are rather insensitive to perturbations in the initial conditions or the dimensionality of the model domain, but the treatment of the forcing data has a considerable effect on the outcome of the model simulations. Despite considerable progress in observations and microphysical parameterizations, simulating the microphysical, macrophysical, and radiative properties of cirrus remains challenging. Comparing model simulations with observations from multiple instruments and observational platforms is important for revealing model deficiencies and for providing rigorous benchmarks. But there still is considerable need for reducing observational uncertainties and providing better observations, especially for relative humidity and for the size distribution and chemical composition of aerosols in the upper troposphere.

  7. Comparison of virtual patient simulation with mannequin-based simulation for improving clinical performances in assessing and managing clinical deterioration: randomized controlled trial.

    PubMed

    Liaw, Sok Ying; Chan, Sally Wai-Chi; Chen, Fun-Gee; Hooi, Shing Chuan; Siau, Chiang

    2014-09-17

    Virtual patient simulation has grown substantially in health care education. A virtual patient simulation was developed as a refresher training course to reinforce nursing clinical performance in assessing and managing deteriorating patients. The objective of this study was to describe the development of the virtual patient simulation and evaluate its efficacy, by comparing with a conventional mannequin-based simulation, for improving the nursing students' performances in assessing and managing patients with clinical deterioration. A randomized controlled study was conducted with 57 third-year nursing students who were recruited through email. After a baseline evaluation of all participants' clinical performance in a simulated environment, the experimental group received a 2-hour fully automated virtual patient simulation while the control group received 2-hour facilitator-led mannequin-based simulation training. All participants were then re-tested one day (first posttest) and 2.5 months (second posttest) after the intervention. The participants from the experimental group completed a survey to evaluate their learning experiences with the newly developed virtual patient simulation. Compared to their baseline scores, both experimental and control groups demonstrated significant improvements (P<.001) in first and second post-test scores. While the experimental group had significantly lower (P<.05) second post-test scores compared with the first post-test scores, no significant difference (P=.94) was found between these two scores for the control group. The scores between groups did not differ significantly over time (P=.17). The virtual patient simulation was rated positively. A virtual patient simulation for a refreshing training course on assessing and managing clinical deterioration was developed. Although the randomized controlled study did not show that the virtual patient simulation was superior to mannequin-based simulation, both simulations have demonstrated to be effective refresher learning strategies for improving nursing students' clinical performance. Given the greater resource requirements of mannequin-based simulation, the virtual patient simulation provides a more promising alternative learning strategy to mitigate the decay of clinical performance over time.

  8. LOADING SIMULATION PROGRAM C

    EPA Pesticide Factsheets

    LSPC is the Loading Simulation Program in C++, a watershed modeling system that includes streamlined Hydrologic Simulation Program Fortran (HSPF) algorithms for simulating hydrology, sediment, and general water quality.

  9. Simulation-based training for prostate surgery.

    PubMed

    Khan, Raheej; Aydin, Abdullatif; Khan, Muhammad Shamim; Dasgupta, Prokar; Ahmed, Kamran

    2015-10-01

    To identify and review the currently available simulators for prostate surgery and to explore the evidence supporting their validity for training purposes. A review of the literature between 1999 and 2014 was performed. The search terms included a combination of urology, prostate surgery, robotic prostatectomy, laparoscopic prostatectomy, transurethral resection of the prostate (TURP), simulation, virtual reality, animal model, human cadavers, training, assessment, technical skills, validation and learning curves. Furthermore, relevant abstracts from the American Urological Association, European Association of Urology, British Association of Urological Surgeons and World Congress of Endourology meetings, between 1999 and 2013, were included. Only studies related to prostate surgery simulators were included; studies regarding other urological simulators were excluded. A total of 22 studies that carried out a validation study were identified. Five validated models and/or simulators were identified for TURP, one for photoselective vaporisation of the prostate, two for holmium enucleation of the prostate, three for laparoscopic radical prostatectomy (LRP) and four for robot-assisted surgery. Of the TURP simulators, all five have demonstrated content validity, three face validity and four construct validity. The GreenLight laser simulator has demonstrated face, content and construct validities. The Kansai HoLEP Simulator has demonstrated face and content validity whilst the UroSim HoLEP Simulator has demonstrated face, content and construct validity. All three animal models for LRP have been shown to have construct validity whilst the chicken skin model was also content valid. Only two robotic simulators were identified with relevance to robot-assisted laparoscopic prostatectomy, both of which demonstrated construct validity. A wide range of different simulators are available for prostate surgery, including synthetic bench models, virtual-reality platforms, animal models, human cadavers, distributed simulation and advanced training programmes and modules. The currently validated simulators can be used by healthcare organisations to provide supplementary training sessions for trainee surgeons. Further research should be conducted to validate simulated environments, to determine which simulators have greater efficacy than others and to assess the cost-effectiveness of the simulators and the transferability of skills learnt. With surgeons investigating new possibilities for easily reproducible and valid methods of training, simulation offers great scope for implementation alongside traditional methods of training. © 2014 The Authors BJU International © 2014 BJU International Published by John Wiley & Sons Ltd.

  10. The utility of simulation in medical education: what is the evidence?

    PubMed

    Okuda, Yasuharu; Bryson, Ethan O; DeMaria, Samuel; Jacobson, Lisa; Quinones, Joshua; Shen, Bing; Levine, Adam I

    2009-08-01

    Medical schools and residencies are currently facing a shift in their teaching paradigm. The increasing amount of medical information and research makes it difficult for medical education to stay current in its curriculum. As patients become increasingly concerned that students and residents are "practicing" on them, clinical medicine is becoming focused more on patient safety and quality than on bedside teaching and education. Educators have faced these challenges by restructuring curricula, developing small-group sessions, and increasing self-directed learning and independent research. Nevertheless, a disconnect still exists between the classroom and the clinical environment. Many students feel that they are inadequately trained in history taking, physical examination, diagnosis, and management. Medical simulation has been proposed as a technique to bridge this educational gap. This article reviews the evidence for the utility of simulation in medical education. We conducted a MEDLINE search of original articles and review articles related to simulation in education with key words such as simulation, mannequin simulator, partial task simulator, graduate medical education, undergraduate medical education, and continuing medical education. Articles, related to undergraduate medical education, graduate medical education, and continuing medical education were used in the review. One hundred thirteen articles were included in this review. Simulation-based training was demonstrated to lead to clinical improvement in 2 areas of simulation research. Residents trained on laparoscopic surgery simulators showed improvement in procedural performance in the operating room. The other study showed that residents trained on simulators were more likely to adhere to the advanced cardiac life support protocol than those who received standard training for cardiac arrest patients. In other areas of medical training, simulation has been demonstrated to lead to improvements in medical knowledge, comfort in procedures, and improvements in performance during retesting in simulated scenarios. Simulation has also been shown to be a reliable tool for assessing learners and for teaching topics such as teamwork and communication. Only a few studies have shown direct improvements in clinical outcomes from the use of simulation for training. Multiple studies have demonstrated the effectiveness of simulation in the teaching of basic science and clinical knowledge, procedural skills, teamwork, and communication as well as assessment at the undergraduate and graduate medical education levels. As simulation becomes increasingly prevalent in medical school and resident education, more studies are needed to see if simulation training improves patient outcomes.

  11. Surrogate model approach for improving the performance of reactive transport simulations

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2016-04-01

    Reactive transport models serve a large number of important geoscientific applications involving underground resources in industry and scientific research. A reactive transport simulation commonly consists of at least two coupled simulation models. The first is a hydrodynamics simulator responsible for simulating the flow of groundwater and the transport of solutes. Hydrodynamics simulators are well-established technology and can be very efficient; when run without coupled geochemistry, their spatial geometries can span millions of elements even on desktop workstations. The second is a geochemical simulation model coupled to the hydrodynamics simulator. Geochemical simulation models are much more computationally costly, which makes reactive transport simulations spanning millions of spatial elements very difficult to achieve. To address this problem we propose replacing the coupled geochemical simulation model with a surrogate model: a statistical model created to include only the necessary subset of simulator complexity for a particular scenario. To demonstrate the viability of this approach we tested it on a published benchmark problem involving 1D calcite transport (Kolditz, 2012). We trained a number of statistical models, available through the caret and DiceEval packages for R, as surrogate models on a randomly sampled subset of the input-output data from the geochemical simulation model used in the original reactive transport simulation. For validation we used each surrogate model to predict the simulator output on the sampled input data that was not used for training. For this scenario we find that the multivariate adaptive regression splines (MARS) method provides the best trade-off between speed and accuracy. This proof of concept forms an essential step towards building an interactive visual analytics system to enable user-driven, systematic creation of geochemical surrogate models. Such a system would enable reactive transport simulations with unprecedented spatial and temporal detail. References: Kolditz, O., Görke, U.J., Shao, H. and Wang, W., 2012. Thermo-hydro-mechanical-chemical processes in porous media: benchmarks and examples (Vol. 86). Springer Science & Business Media.
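
    A minimal Python sketch of the surrogate workflow follows: sample simulator input-output pairs, fit a statistical model on part of them, and validate on the held-out remainder. The authors worked in R (caret, DiceEval) and found MARS best; here scikit-learn's gradient-boosted trees stand in for MARS, and a toy function stands in for the geochemical simulator.

    import numpy as np
    from sklearn.ensemble import HistGradientBoostingRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    def geochemical_simulator(x):
        # Toy stand-in for the expensive geochemistry step.
        return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1])

    rng = np.random.default_rng(42)
    X = rng.uniform(0, 1, size=(2000, 2))       # sampled simulator inputs
    y = geochemical_simulator(X)                # expensive step in reality

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                        random_state=0)
    surrogate = HistGradientBoostingRegressor().fit(X_train, y_train)
    print(f"held-out R^2: {r2_score(y_test, surrogate.predict(X_test)):.3f}")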

  12. Range Finding with a Plenoptic Camera

    DTIC Science & Technology

    2014-03-27

    [No abstract recovered; the excerpt contains only table-of-contents fragments naming these sections: Experimental Results; Simulated Camera Analysis: Varying Lens Diameter; Simulated Camera Analysis: Varying Detector Size; Matching Framework; Simulated Camera Performance with SIFT.]

  13. Space Simulation, 7th. [facilities and testing techniques

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Space simulation facilities and techniques are outlined that encompass thermal scale modeling, computerized simulations, reentry materials, spacecraft contamination, solar simulation, vacuum tests, and heat transfer studies.

  14. Validation of newly developed physical laparoscopy simulator in transabdominal preperitoneal (TAPP) inguinal hernia repair.

    PubMed

    Nishihara, Yuichi; Isobe, Yoh; Kitagawa, Yuko

    2017-12-01

    A realistic simulator for transabdominal preperitoneal (TAPP) inguinal hernia repair would enhance surgeons' training experience before they enter the operating theater. The purpose of this study was to create a novel physical simulator for TAPP inguinal hernia repair and obtain surgeons' opinions regarding its efficacy. Our novel TAPP inguinal hernia repair simulator consists of a physical laparoscopy simulator and a handmade organ replica model. The physical laparoscopy simulator was created by three-dimensional (3D) printing technology, and it represents the trunk of the human body and the bendability of the abdominal wall under pneumoperitoneal pressure. The organ replica model was manually created by assembling materials. The TAPP inguinal hernia repair simulator allows for the performance of all procedures required in TAPP inguinal hernia repair. Fifteen general surgeons performed TAPP inguinal hernia repair using our simulator. Their opinions were scored on a 5-point Likert scale. All participants strongly agreed that the 3D-printed physical simulator and organ replica model were highly useful for TAPP inguinal hernia repair training (median, 5 points) and TAPP inguinal hernia repair education (median, 5 points). They felt that the simulator would be effective for TAPP inguinal hernia repair training before entering the operating theater. All surgeons considered that this simulator should be introduced in the residency curriculum. We successfully created a physical simulator for TAPP inguinal hernia repair training using 3D printing technology and a handmade organ replica model created with inexpensive, readily accessible materials. Preoperative TAPP inguinal hernia repair training using this simulator and organ replica model may be of benefit in the training of all surgeons. All general surgeons involved in the present study felt that this simulator and organ replica model should be used in their residency curriculum.

  15. Simulation Training in Obstetrics and Gynaecology Residency Programs in Canada.

    PubMed

    Sanders, Ari; Wilson, R Douglas

    2015-11-01

    The integration of simulation into residency programs has been slower in obstetrics and gynaecology than in other surgical specialties. The goal of this study was to evaluate the current use of simulation in obstetrics and gynaecology residency programs in Canada. A 19-question survey was developed and distributed to all 16 active and accredited obstetrics and gynaecology residency programs in Canada. The survey was sent to program directors initially, but on occasion was redirected to other faculty members involved in resident education or to senior residents. Survey responses were collected over an 18-month period. Twelve programs responded to the survey (11 complete responses). Eleven programs (92%) reported introducing an obstetrics and gynaecology simulation curriculum into their residency education. All respondents (100%) had access to a simulation centre. Simulation was used to teach various obstetrical and gynaecological skills using different simulation modalities. Barriers to simulation integration were primarily the costs of equipment and space and the need to ensure dedicated time for residents and educators. The majority of programs indicated that it was a priority for them to enhance their simulation curriculum and transition to competency-based resident assessment. Simulation training has increased in obstetrics and gynaecology residency programs. The development of formal simulation curricula for use in obstetrics and gynaecology resident education is in early development. A standardized national simulation curriculum would help facilitate the integration of simulation into obstetrics and gynaecology resident education and aid in the shift to competency-based resident assessment. Obstetrics and gynaecology residency programs need national collaboration (between centres and specialties) to develop a standardized simulation curriculum for use in obstetrics and gynaecology residency programs in Canada.

  16. Large-Eddy Simulation of Wind-Plant Aerodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Churchfield, M. J.; Lee, S.; Moriarty, P. J.

    In this work, we present results of a large-eddy simulation of the 48 multi-megawatt turbines composing the Lillgrund wind plant. Turbulent inflow wind is created by performing an atmospheric boundary layer precursor simulation, and turbines are modeled using a rotating, variable-speed actuator line representation. The motivation for this work is that few others have performed large-eddy simulations of wind plants with a substantial number of turbines, and the methods for carrying out the simulations vary. We wish to draw upon the strengths of the existing simulations and our growing atmospheric large-eddy simulation capability to create a sound methodology for performing this type of simulation. We used the OpenFOAM CFD toolbox to create our solver. The simulated time-averaged power production of the turbines in the plant agrees well with field observations, except for the sixth turbine and beyond in each wind-aligned column. The power produced by each of those turbines is overpredicted by 25-40%. A direct comparison between simulated and field data is difficult because we simulate one wind direction with a speed and turbulence intensity characteristic of Lillgrund, whereas the field observations were taken over a year of varying conditions. The simulation captures the significant 60-70% decrease in the performance of turbines behind the front row in this plant, which has a spacing of 4.3 rotor diameters in this direction. The overall plant efficiency is well predicted. This work shows the importance of using local grid refinement to simultaneously capture the meter-scale details of the turbine wake and the kilometer-scale turbulent atmospheric structures. Although this work illustrates the power of large-eddy simulation in producing a time-accurate solution, it required about one million processor-hours, showing the significant cost of large-eddy simulation.
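
    For intuition about the row-wise power drop reported above, a simple engineering wake model can be sketched in a few lines. The following Python fragment uses the classic Jensen (Park) wake model with sum-of-squares deficit combination; this is a rough approximation for illustration only, not the paper's actuator-line LES, and the thrust coefficient Ct and wake decay constant k are assumed values.

      # Jensen (Park) wake model for one wind-aligned column of turbines.
      # Illustrative engineering approximation, NOT the paper's LES method.
      # Spacing of 4.3 rotor diameters follows the Lillgrund figure above;
      # Ct and k are assumed values.
      import numpy as np

      def jensen_column_power(n=8, spacing_D=4.3, Ct=0.8, k=0.04):
          """Relative power of each turbine in a wind-aligned column."""
          power = np.ones(n)
          for i in range(1, n):
              d2 = 0.0
              for j in range(i):  # combine upstream deficits in quadrature
                  x = (i - j) * spacing_D            # distance in diameters
                  a = (1.0 - np.sqrt(1.0 - Ct)) / (1.0 + 2.0 * k * x) ** 2
                  d2 += a ** 2
              power[i] = (1.0 - np.sqrt(d2)) ** 3    # power ~ velocity cubed
          return power

      print(jensen_column_power())
      # front turbine 1.00; second ~0.33; deep-array turbines ~0.2-0.3,
      # i.e. roughly the 60-70% drop behind the front row noted above.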

  17. Multi-pass Monte Carlo simulation method in nuclear transmutations.

    PubMed

    Mateescu, Liviu; Kadambi, N Prasad; Ravindra, Nuggehalli M

    2016-12-01

    Monte Carlo methods, in their direct brute-force simulation incarnation, yield realistic results if the probabilities involved, be they geometrical or otherwise, remain constant for the duration of the simulation. However, there are physical setups where the evolution of the simulation represents a modification of the simulated system itself. Chief among such evolving simulated systems are activation/transmutation setups. That is, the simulation starts with a given set of probabilities, determined by the geometry of the system, its components and the microscopic interaction cross-sections, but the relative weights of the components of the system change as the simulation progresses. A natural approach would be to adjust the probabilities after every step of the simulation. On the other hand, the physical system typically has a number of components on the order of Avogadro's number, usually 10^25 or 10^26 members. A simulation step changes the characteristics of just a few of these members; a probability will therefore shift by a quantity on the order of 1/10^25. Such a change cannot be accounted for within a simulation, because the simulation would then need at least 10^28 steps to have any significance. This is not feasible, of course. For our computing devices, a simulation of one million steps is comfortable, but a further order of magnitude becomes too big a stretch for the computing resources. We propose here a method of dealing with the changing probabilities that leads to increased precision. This method is intended as a fast approximating approach, and also as a simple introduction (for the benefit of students) to the highly branched subject of Monte Carlo simulations vis-à-vis nuclear reactors. Copyright © 2016 Elsevier Ltd. All rights reserved.
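
    The multi-pass idea described in this abstract can be sketched compactly: freeze the interaction probabilities for one pass of many histories, then re-derive them from the updated composition between passes. The sketch below is a toy Python illustration under stated assumptions (hypothetical cross sections, a scaled-down inventory, and a two-nuclide chain); it is not the authors' code.

      # Toy multi-pass Monte Carlo transmutation loop: probabilities are
      # frozen within a pass and recomputed from the updated composition
      # between passes. All numbers are hypothetical illustration values;
      # a real inventory would be ~1e25 atoms, scaled down here to 1e6.
      import random

      sigma = {"U238": 2.7, "Pu239": 1.8}        # cross sections (barns), assumed
      n_atoms = {"U238": 1.0e6, "Pu239": 0.0}

      def pass_probabilities():
          """Per-nuclide interaction probability, fixed for one pass."""
          w = {k: sigma[k] * n for k, n in n_atoms.items()}
          total = sum(w.values())
          return {k: v / total for k, v in w.items()}

      for p in range(10):                         # 10 passes
          probs = pass_probabilities()            # recomputed ONCE per pass
          for _ in range(100_000):                # histories within the pass
              r, acc = random.random(), 0.0
              for nuclide, prob in probs.items():
                  acc += prob
                  if r < acc:
                      if nuclide == "U238":       # toy capture: U238 -> Pu239
                          n_atoms["U238"] -= 1
                          n_atoms["Pu239"] += 1
                      break
          print(p, {k: round(v) for k, v in n_atoms.items()})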

  18. Thermal Optical Properties of Lunar Dust Simulants and Their Constituents

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Ellis, Shaneise; Hanks, Nichole

    2011-01-01

    The total reflectance spectra of lunar simulant dusts (< 20 micron particles) were measured in order to determine their integrated solar absorptance (alpha) and their thermal emittance (epsilon) for the purpose of analyzing the effect of dust on the performance of thermal control surfaces. All of the simulants except one had a wavelength-dependent reflectivity (rho(lambda)) near 0.10 over the wavelength range of 8 to 25 microns and so are highly emitting at room temperature and lower. The 300 K emittance (epsilon) of all the lunar simulants except one ranged from 0.78 to 0.92. The exception was Minnesota Lunar Simulant 1 (MLS-1), which has little or no glassy component. In all cases the epsilon was lower for the < 20 micron particles than for larger particles reported earlier. There was considerably more variation in the lunar simulant reflectance in the solar spectral range (250 to 2500 nm) than in the thermal infrared. As expected, the lunar highlands simulants were more reflective in this wavelength range than the lunar mare simulants. The integrated solar absorptance (alpha) of the simulants ranged from 0.39 to 0.75. This is lower than values reported earlier for larger particles of the same simulants (0.41 to 0.82), and for representative mare and highlands lunar soils (0.74 to 0.91). Since the alpha of some mare simulants more closely matched that of highlands lunar soils, it is recommended that alpha and epsilon values be the criteria for choosing a simulant for assessing the effects of dust on thermal control surfaces, rather than whether a simulant has been formulated as a highlands or a mare simulant.
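
    For reference, the two figures of merit used in this record are conventionally defined as spectrum-weighted averages of the measured reflectance; a standard formulation (ours, not quoted from the report) is

      \alpha = \frac{\int \left[1-\rho(\lambda)\right] S(\lambda)\, d\lambda}{\int S(\lambda)\, d\lambda},
      \qquad
      \epsilon(T) = \frac{\int \left[1-\rho(\lambda)\right] B(\lambda,T)\, d\lambda}{\int B(\lambda,T)\, d\lambda},

    where S(lambda) is the solar spectral irradiance (integrated here over 250 to 2500 nm), B(lambda,T) is the Planck blackbody spectrum at T = 300 K (weighting the 8 to 25 micron band), and 1 - rho(lambda) serves for both absorptance and emittance via Kirchhoff's law for an opaque sample.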

  19. Thermal Optical Properties of Lunar Dust Simulants and Their Constituents

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Ellis, Shaneise; Hanks, Nichole

    2011-01-01

    The total reflectance spectra of lunar simulant dusts (less than 20 micrometer particles) were measured in order to determine their integrated solar absorptance (alpha) and their thermal emittance (epsilon) for the purpose of analyzing the effect of dust on the performance of thermal control surfaces. All of the simulants except one had a wavelength-dependent reflectivity (rho(lambda)) near 0.10 over the wavelength range of 8 to 25 micrometers, and so are highly emitting at room temperature and lower. The 300 K emittance (epsilon) of all the lunar simulants except one ranged from 0.78 to 0.92. The exception was Minnesota Lunar Simulant 1 (MLS-1), which has little or no glassy component. In all cases the epsilon was lower for the less than 20 micrometer particles than for larger particles reported earlier. There was considerably more variation in the lunar simulant reflectance in the solar spectral range (250 to 2500 nanometers) than in the thermal infrared. As expected, the lunar highlands simulants were more reflective in this wavelength range than the lunar mare simulants. The integrated solar absorptance (alpha) of the simulants ranged from 0.39 to 0.75. This is lower than values reported earlier for larger particles of the same simulants (0.41 to 0.82), and for representative mare and highlands lunar soils (0.74 to 0.91). Since the alpha of some mare simulants more closely matched that of highlands lunar soils, it is recommended that alpha and epsilon values be the criteria for choosing a simulant for assessing the effects of dust on thermal control surfaces, rather than whether a simulant has been formulated as a highlands or a mare simulant.

  20. Simulation of temperature field for temperature-controlled radio frequency ablation using a hyperbolic bioheat equation and temperature-varied voltage calibration: a liver-mimicking phantom study.

    PubMed

    Zhang, Man; Zhou, Zhuhuang; Wu, Shuicai; Lin, Lan; Gao, Hongjian; Feng, Yusheng

    2015-12-21

    This study aims at improving the accuracy of temperature simulation for temperature-controlled radio frequency ablation (RFA). We proposed a new voltage-calibration method in the simulation and investigated the feasibility of a hyperbolic bioheat equation (HBE) in the RFA simulation with longer durations and higher power. A total of 40 RFA experiments were conducted in a liver-mimicking phantom. Four mathematical models with multipolar electrodes were developed by the finite element method in COMSOL software: HBE with/without voltage calibration, and the Pennes bioheat equation (PBE) with/without voltage calibration. The temperature-varied voltage calibration used in the simulation was calculated from an experimental power output and the temperature-dependent resistance of liver tissue. We employed the HBE in the simulation by considering a delay time τ of 16 s. First, for simulations by each kind of bioheat equation (PBE or HBE), we compared the differences between the temperature-varied voltage-calibration and the fixed-voltage values used in the simulations. Then, comparisons were conducted between the PBE and the HBE in the simulations with temperature-varied voltage calibration. We verified the simulation results by experimental temperature measurements at nine specific points of the tissue phantom. The results showed that: (1) the proposed voltage-calibration method improved the simulation accuracy of temperature-controlled RFA for both the PBE and the HBE, and (2) for temperature-controlled RFA simulation with the temperature-varied voltage calibration, the HBE method was 0.55 °C more accurate than the PBE method. The proposed temperature-varied voltage calibration may be useful in temperature field simulations of temperature-controlled RFA. In addition, the HBE may be used as an alternative in the simulation of long-duration, high-power RFA.
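
    For context, the two bioheat equations compared in this record are commonly written as follows (a standard thermal-wave form with the perfusion term dropped, appropriate for a nonperfused phantom; this is our restatement, not necessarily the paper's exact model):

      \rho c \,\frac{\partial T}{\partial t} = \nabla\cdot(k\nabla T) + Q \quad \text{(PBE)},

      \tau \rho c \,\frac{\partial^2 T}{\partial t^2} + \rho c \,\frac{\partial T}{\partial t} = \nabla\cdot(k\nabla T) + Q + \tau \frac{\partial Q}{\partial t} \quad \text{(HBE)},

    where rho, c and k are the tissue density, specific heat and thermal conductivity, Q is the radio frequency heat source, and tau is the thermal relaxation (delay) time, taken as 16 s in the study above.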

  1. Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.

    PubMed

    Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew

    2017-09-01

    Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and its key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. Articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on the numbers of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved a moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggests an increased interest in simulation modelling in healthcare.

  2. Study on efficiency of time computation in x-ray imaging simulation base on Monte Carlo algorithm using graphics processing unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com; Suprijadi; Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Jalan Ganesha 10 Bandung, 40132

    Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and the comparison of image quality resulting from simulation on the GPU and the CPU are evaluated in this paper. The simulations were run on a CPU in serial, and on two GPUs with 384 cores and 2304 cores. In simulation using a GPU, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that simulation times on the GPU were significantly accelerated compared to the CPU. Simulations on the 2304-core GPU were performed about 64-114 times faster than on the CPU, while simulations on the 384-core GPU were performed about 20-31 times faster than on a single core of the CPU. Another result shows that optimum image quality was obtained with histories starting from 10^8 and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.
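
    The core reason for the reported speed-ups is that photon histories are independent, so each GPU core can track its own photon. The toy Python/NumPy sketch below illustrates this embarrassingly parallel structure with an attenuation-only slab model (the attenuation coefficient and geometry are assumed values; MC-GPU itself is CUDA code with full interaction physics):

      # Toy attenuation-only photon Monte Carlo: all histories advance in
      # parallel NumPy lanes, mimicking the one-photon-per-GPU-thread mapping.
      # mu and thickness are assumed illustration values.
      import numpy as np

      rng = np.random.default_rng(0)
      n_photons = 1_000_000
      mu = 0.2                          # linear attenuation coefficient (1/cm)
      thickness = 10.0                  # slab thickness (cm)

      # Sample exponential free path lengths for every photon at once.
      path = rng.exponential(scale=1.0 / mu, size=n_photons)
      transmitted = path > thickness    # uncollided photons crossing the slab

      print("simulated transmission:", transmitted.mean())
      print("analytic exp(-mu*x):   ", np.exp(-mu * thickness))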

  3. Gamma Ray Observatory (GRO) dynamics simulator requirements and mathematical specifications, revision 1

    NASA Technical Reports Server (NTRS)

    Harman, R.; Blejer, D.

    1990-01-01

    The requirements and mathematical specifications for the Gamma Ray Observatory (GRO) Dynamics Simulator are presented. The complete simulator system, which consists of the profile subsystem, simulation control and input/output subsystem, truth model subsystem, onboard computer model subsystem, and postprocessor, is described. The simulator will be used to evaluate and test the attitude determination and control models to be used on board GRO under conditions that simulate the expected in-flight environment.

  4. Modeling and Simulation With Operational Databases to Enable Dynamic Situation Assessment & Prediction

    DTIC Science & Technology

    2010-11-01

    subsections discuss the design of the simulations. 3.12.1 Lanchester5D Simulation A Lanchester simulation was developed to conduct performance...benchmarks using the WarpIV Kernel and HyperWarpSpeed. The Lanchester simulation contains a user-definable number of grid cells in which blue and red...forces engage in battle using Lanchester equations. Having a user-definable number of grid cells enables the simulation to be stressed with high entity

  5. Development of automation and robotics for space via computer graphic simulation methods

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken

    1988-01-01

    A robot simulation system has been developed to perform automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.
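
    As a flavor of the kinematics such a simulator generates, the fragment below computes the forward kinematics of a hypothetical two-link planar arm in Python; it is a minimal stand-in for illustration, not the procedure-oriented solid modeling language described in the record.

      # Forward kinematics of a hypothetical two-link planar arm.
      import math

      def forward_kinematics(theta1, theta2, l1=1.0, l2=0.8):
          """End-effector (x, y) for joint angles in radians."""
          x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
          y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
          return x, y

      print(forward_kinematics(math.radians(30), math.radians(45)))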

  6. Advanced simulation model for IPM motor drive with considering phase voltage and stator inductance

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Myung; Park, Hyun-Jong; Lee, Ju

    2016-10-01

    This paper proposes an advanced simulation model of the driving system for Interior Permanent Magnet (IPM) BrushLess Direct Current (BLDC) motors driven by the 120-degree conduction method (two-phase conduction method, TPCM), which is widely used for sensorless control of BLDC motors. BLDC motors can be classified as SPM (Surface-mounted Permanent Magnet) or IPM motors. The simulation model of a driving system with SPM motors is simple, because the stator inductance is constant regardless of rotor position, and such models have been proposed in many studies. On the other hand, simulation models for IPM driving systems built with graphic-based simulation tools such as Matlab/Simulink have not been proposed. Simulating the driving system of IPMs with TPCM is complex because the stator inductances of an IPM vary with rotor position, as the permanent magnets are embedded in the rotor. To develop sensorless schemes or improve control performance, development of control algorithms through simulation study is essential, and a simulation model that accurately reflects the characteristics of the IPM is required. Therefore, this paper presents an advanced simulation model of the IPM driving system that takes into account the unique characteristic of the IPM arising from its position-dependent inductances. The validity of the proposed simulation model is confirmed by comparison with experimental and simulation results using an IPM with the TPCM control scheme.
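
    The modeling difficulty described above comes from the saliency of the IPM rotor. A common way to capture it is a two-term Fourier approximation of the phase self-inductances, sketched below in Python with illustrative (assumed) values; for an SPM machine the saliency term L2 vanishes and the model collapses to a constant inductance.

      # Position-dependent stator self-inductances of an IPM machine,
      # standard two-term Fourier approximation. L0, L2 are assumed values.
      import numpy as np

      L0, L2 = 1.5e-3, 0.4e-3   # mean and saliency inductance (H), assumed

      def phase_inductances(theta_e):
          """Self-inductances of phases a, b, c vs electrical rotor angle."""
          return {
              "La": L0 + L2 * np.cos(2 * theta_e),
              "Lb": L0 + L2 * np.cos(2 * (theta_e - 2 * np.pi / 3)),
              "Lc": L0 + L2 * np.cos(2 * (theta_e + 2 * np.pi / 3)),
          }

      print(phase_inductances(0.0))  # with L2 = 0 this is the simple SPM case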

  7. Validation of Supersonic Film Cooling Modeling for Liquid Rocket Engine Applications

    NASA Technical Reports Server (NTRS)

    Morris, Christopher I.; Ruf, Joseph H.

    2010-01-01

    Topics include: upper stage engine key requirements and design drivers; Calspan "stage 1" results, He slot injection into hypersonic flow (air); test articles for shock generator diagram, slot injector details, and instrumentation positions; test conditions; modeling approach; 2-d grid used for film cooling simulations of test article; heat flux profiles from 2-d flat plate simulations (run #4); heat flux profiles from 2-d backward facing step simulations (run #43); isometric sketch of single coolant nozzle, and x-z grid of half-nozzle domain; comparison of 2-d and 3-d simulations of coolant nozzles (run #45); flowfield properties along coolant nozzle centerline (run #45); comparison of 3-d CFD nozzle flow calculations with experimental data; nozzle exit plane reduced to linear profile for use in 2-d film-cooling simulations (run #45); synthetic Schlieren image of coolant injection region (run #45); axial velocity profiles from 2-d film-cooling simulation (run #45); coolant mass fraction profiles from 2-d film-cooling simulation (run #45); heat flux profiles from 2-d film cooling simulations (run #45); heat flux profiles from 2-d film cooling simulations (runs #47, #45, and #47); 3-d grid used for film cooling simulations of test article; heat flux contours from 3-d film-cooling simulation (run #45); and heat flux profiles from 3-d and 2-d film cooling simulations (runs #44, #46, and #47).
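
    A quantity that ties these film-cooling runs together is the adiabatic film-cooling effectiveness, the standard figure of merit in this field (our restatement, not quoted from the presentation):

      \eta = \frac{T_{aw} - T_\infty}{T_c - T_\infty},

    where T_aw is the adiabatic wall temperature, T_c the coolant injection temperature, and T_infty the freestream recovery temperature; eta = 1 means the wall sees pure coolant, while eta = 0 means the film has fully mixed out.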

  8. The effect of self-directed virtual reality simulation on dissection training performance in mastoidectomy.

    PubMed

    Andersen, Steven Arild Wuyts; Foghsgaard, Søren; Konge, Lars; Cayé-Thomasen, Per; Sørensen, Mads Sølvsten

    2016-08-01

    To establish the effect of self-directed virtual reality (VR) simulation training on cadaveric dissection training performance in mastoidectomy and the transferability of skills acquired in VR simulation training to the cadaveric dissection training setting. Prospective study. Two cohorts of 20 novice otorhinolaryngology residents received either self-directed VR simulation training before cadaveric dissection training or vice versa. Cadaveric and VR simulation performances were assessed using final-product analysis with three blinded expert raters. The group receiving VR simulation training before cadaveric dissection had a mean final-product score of 14.9 (95% confidence interval [CI], 12.9-16.9) compared with 9.8 (95% CI, 8.4-11.1) in the group not receiving VR simulation training before cadaveric dissection. This 52% increase in performance was statistically significant (P < 0.0001). A single dissection mastoidectomy did not increase VR simulation performance (P = 0.22). Two hours of self-directed VR simulation training was effective in increasing cadaveric dissection mastoidectomy performance, suggesting that mastoidectomy skills are transferable from VR simulation to the traditional dissection setting. Virtual reality simulation training can therefore be employed to optimize training, and can spare the use of donated material and instructional resources for more advanced training after basic competencies have been acquired in the VR simulation environment. NA. Laryngoscope, 126:1883-1888, 2016. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.

  9. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175
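
    The staged-submission idea can be sketched generically. NeuroManager itself is a MATLAB engine with 22 workflow stages; the short Python stand-in below uses hypothetical stage names and a pluggable handler per stage purely to illustrate the structure, and is not the actual NeuroManager interface.

      # Generic staged simulation-submission workflow (hypothetical stages).
      import datetime

      STAGES = ["validate_params", "stage_input_files", "submit_job",
                "poll_status", "collect_results", "timestamp_and_archive"]

      def run_workflow(job, handlers):
          """Run each stage in order, logging progress with timestamps."""
          for stage in STAGES:
              print(f"[{datetime.datetime.now():%H:%M:%S}] {stage}: {job['name']}")
              handlers[stage](job)   # simulator-specific step plugged in here

      job = {"name": "purkinje_cell_run42", "params": {"dt": 0.025}}
      handlers = {s: (lambda j: None) for s in STAGES}  # no-op demo handlers
      run_workflow(job, handlers)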

  10. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  11. Computer simulation of on-orbit manned maneuvering unit operations

    NASA Technical Reports Server (NTRS)

    Stuart, G. M.; Garcia, K. D.

    1986-01-01

    Simulation of spacecraft on-orbit operations is discussed in reference to Martin Marietta's Space Operations Simulation laboratory's use of computer software models to drive a six-degree-of-freedom moving base carriage and two target gimbal systems. In particular, key simulation issues and related computer software models associated with providing real-time, man-in-the-loop simulations of the Manned Maneuvering Unit (MMU) are addressed, with special attention given to how effectively these models and motion systems simulate the MMU's actual on-orbit operations. The weightless effects of the space environment require the development of entirely new devices for locomotion. Since access to space is very limited, it is necessary to design, build, and test these new devices within the physical constraints of earth using simulators. The simulation method discussed here is the technique of using computer software models to drive a Moving Base Carriage (MBC) that is capable of providing simultaneous six-degree-of-freedom motions. This method, utilized at Martin Marietta's Space Operations Simulation (SOS) laboratory, provides the ability to simulate the operation of manned spacecraft, provides the pilot with proper three-dimensional visual cues, and allows training of on-orbit operations. The purpose here is to discuss significant MMU simulation issues, the related models that were developed in response to these issues, and how effectively these models simulate the MMU's actual on-orbit operations.

  12. The perceived value of using BIM for energy simulation

    NASA Astrophysics Data System (ADS)

    Lewis, Anderson M.

    Building Information Modeling (BIM) is becoming an increasingly important tool in the Architectural, Engineering & Construction (AEC) industries. Some of the benefits associated with BIM include, but are not limited to, cost and time savings through greater trade and design coordination, and more accurate estimating take-offs. BIM is a virtual 3D, parametric design software that stores information about a model and can be used as a communication platform between project stakeholders. Likewise, energy simulation is an integral tool for predicting and optimizing a building's performance during design. Creating energy models and running energy simulations can be time consuming due to the large number of parameters and assumptions that must be addressed to achieve reasonably accurate results. However, leveraging information embedded within Building Information Models (BIMs) has the potential to increase accuracy and reduce the time required to run energy simulations, and can facilitate continuous energy simulation throughout the design process, thus optimizing building performance. Although some literature exists on how design stakeholders perceive the benefits associated with leveraging BIM for energy simulation, little is known about how these perceptions differ between various green design stakeholder user groups. Through an e-survey instrument, this study seeks to determine how perceptions of using BIMs to inform energy simulation differ among distinct design stakeholder groups: BIM-only users, energy simulation-only users, and BIM and energy simulation users. Additionally, this study seeks to determine what design stakeholders perceive as the main barriers to and benefits of implementing BIM-based energy simulation. Results from this study suggest that little to no correlation exists between green design stakeholders' perceptions of the value of using information from BIMs to inform energy simulation and their engagement level with BIM and/or energy simulation. However, these perceptions and engagement levels may differ between user groups (i.e., BIM-only users, energy simulation-only users, and BIM and energy simulation users). For example, the BIM-only user group showed a strong positive correlation between perceptions of the value of using information from BIMs to inform energy simulation and engagement with BIM. Additionally, this study suggests that the top perceived benefits of using BIMs to inform energy simulations among green design stakeholders are facilitation of communication, reduction of process-related costs, and the ability to examine more design options. The main perceived barrier was a lack of BIM standards for model integration across multidisciplinary teams. Results from this study will help readers understand how to better implement BIM-based energy simulation while mitigating barriers and optimizing benefits. Additionally, examining discrepancies between user groups can lead to the identification and improvement of shortfalls in current BIM-based energy simulation processes. Understanding how perceptions and engagement levels differ among software user groups will help in developing strategies for implementing BIM-based energy simulation that are tailored to each specific user group.

  13. Turbofan Engine Post-Instability Behavior - Computer Simulations, Test Validation, and Application of Simulations,

    DTIC Science & Technology

    COMPRESSORS, *AIR FLOW, TURBOFAN ENGINES, TRANSIENTS, SURGES, STABILITY, COMPUTERIZED SIMULATION, EXPERIMENTAL DATA, VALIDATION, DIGITAL SIMULATION, INLET GUIDE VANES, ROTATION, STALLING, RECOVERY, HYSTERESIS

  14. Volume I: Select Papers

    DTIC Science & Technology

    2010-08-01

    Table of contents excerpt: 3.1 Pressurization Simulations; 3.2 NVT Uniaxial Strain Simulations; 3.3 Stacking Mismatch Simulations. Figure 2: Pressure versus normalized volume; circles are simulation results.

  15. Interactive Graphics Simulator: Design, Development, and Effectiveness/Cost Evaluation. Final Report.

    ERIC Educational Resources Information Center

    Pieper, William J.; And Others

    This study was initiated to design, develop, implement, and evaluate a videodisc-based simulator system, the Interactive Graphics Simulator (IGS) for 6883 Converter Flight Control Test Station training at Lowry Air Force Base, Colorado. The simulator provided a means for performing task analysis online, developing simulations from the task…

  16. Developing a Problem-Based Learning Simulation: An Economics Unit on Trade

    ERIC Educational Resources Information Center

    Maxwell, Nan L.; Mergendoller, John R.; Bellisimo, Yolanda

    2004-01-01

    This article argues that the merger of simulations and problem-based learning (PBL) can enhance both active-learning strategies. Simulations benefit by using a PBL framework to promote student-directed learning and problem-solving skills to explain a simulated dilemma with multiple solutions. PBL benefits because simulations structure the…

  17. Dynamic Systems for Individual Tracking via Heterogeneous Information Integration and Crowd Source Distributed Simulation

    DTIC Science & Technology

    2015-12-04

    6.6 Power Consumption: Communications ... simulations executing on mobile computing platforms, an area not widely studied to date in the distributed simulation research community. ... These initial studies focused on two conservative synchronization algorithms widely used in the distributed simulation field

  18. SimulCAT: Windows Software for Simulating Computerized Adaptive Test Administration

    ERIC Educational Resources Information Center

    Han, Kyung T.

    2012-01-01

    Most, if not all, computerized adaptive testing (CAT) programs use simulation techniques to develop and evaluate CAT program administration and operations, but such simulation tools are rarely available to the public. Up to now, several software tools have been available to conduct CAT simulations for research purposes; however, these existing…

  19. Reconsidering Simulations in Science Education at a Distance: Features of Effective Use

    ERIC Educational Resources Information Center

    Blake, C.; Scanlon, E.

    2007-01-01

    This paper proposes a reconsideration of use of computer simulations in science education. We discuss three studies of the use of science simulations for undergraduate distance learning students. The first one, "The Driven Pendulum" simulation is a computer-based experiment on the behaviour of a pendulum. The second simulation, "Evolve" is…

  20. Simulations Build Efficacy: Empirical Results from a Four-Week Congressional Simulation

    ERIC Educational Resources Information Center

    Mariani, Mack; Glenn, Brian J.

    2014-01-01

    This article describes a four-week congressional committee simulation implemented in upper level courses on Congress and the Legislative process at two liberal arts colleges. We find that the students participating in the simulation possessed high levels of political knowledge and confidence in their political skills prior to the simulation. An…

  1. Games and Simulations in the Community College Classroom.

    ERIC Educational Resources Information Center

    Butler, J. Thomas

    This discussion of the use of games and simulations in instruction includes a number of examples of activities that can be used in the community college classroom. Section I assesses the value of games and simulations as an approach to learning; defines games, simulations, and non-simulation games; considers the advantages and disadvantages of the…

  2. Practical and Creative Simulations for Training Personnel in Deafblindness.

    ERIC Educational Resources Information Center

    Olson, Joyce; Grondin, Jennifer

    This monograph describes how to conduct simulations that allow individuals to experience what it is like to have deafblindness. It begins by discussing the philosophy and benefits of simulations and explains the two different types of simulations. The first type of simulation gives a generic overview of the impact of deafblindness on learning and…

  3. Simulation Gaming: A New Teaching Strategy in Nursing Education

    ERIC Educational Resources Information Center

    Clark, Carolyn Chambers

    1976-01-01

    Defines simulation gaming and differentiates it from role playing. The author suggests that educators need to be aware of its advantages and disadvantages and to know how to evaluate the potential effectiveness of a particular simulation game. An example simulation game is provided, with guidelines for developing more simulation games in nursing.…

  4. 21 CFR 352.71 - Light source (solar simulator).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 5 2010-04-01 2010-04-01 false Light source (solar simulator). 352.71 Section 352... Procedures § 352.71 Light source (solar simulator). A solar simulator used for determining the SPF of a... nanometers. In addition, a solar simulator should have no significant time-related fluctuations in radiation...

  5. 21 CFR 352.71 - Light source (solar simulator).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 5 2011-04-01 2011-04-01 false Light source (solar simulator). 352.71 Section 352... Procedures § 352.71 Light source (solar simulator). A solar simulator used for determining the SPF of a... nanometers. In addition, a solar simulator should have no significant time-related fluctuations in radiation...

  6. Application of the Environmental Sensation Learning Vehicle Simulation Platform in Virtual Reality

    ERIC Educational Resources Information Center

    Hsu, Kuei-Shu; Jiang, Jinn-Feng; Wei, Hung-Yuan; Lee, Tsung-Han

    2016-01-01

    The use of simulation technologies in learning has received considerable attention in recent years, but few studies to date have focused on vehicle driving simulation systems. In this study, a vehicle driving simulation system was developed to support novice drivers in practicing their skills. Specifically, the vehicle driving simulation system…

  7. Effects of Thinking Style on Design Strategies: Using Bridge Construction Simulation Programs

    ERIC Educational Resources Information Center

    Sun, Chuen-Tsai; Wang, Dai-Yi; Chang, Yu-Yeh

    2013-01-01

    Computer simulation users can freely control operational factors and simulation results, repeat processes, make changes, and learn from simulation environment feedback. The focus of this paper is on simulation-based design tools and their effects on student learning processes in a group of 101 Taiwanese senior high school students. Participants…

  8. Simulation Methodology in Nursing Education and Adult Learning Theory

    ERIC Educational Resources Information Center

    Rutherford-Hemming, Tonya

    2012-01-01

    Simulation is often used in nursing education as a teaching methodology. Simulation is rooted in adult learning theory. Three learning theories, cognitive, social, and constructivist, explain how learners gain knowledge with simulation experiences. This article takes an in-depth look at each of these three theories as each relates to simulation.…

  9. Simulation of CIFF (Centralized IFF) remote control displays

    NASA Astrophysics Data System (ADS)

    Tucker, D. L.; Leibowitz, L. M.

    1986-06-01

    This report presents the software simulation of the Remote-Control-Display (RCS) proposed to be used in the Centralized IFF (CIFF) system. A description of the simulation programs along with simulated menu formats are presented. A sample listing of the simulation programs and a brief description of the program operation are also included.

  10. Modeling and Simulation: PowerBoosting Productivity with Simulation.

    ERIC Educational Resources Information Center

    Riley, Suzanne

    Minnesota high school students and teachers are learning the technology of simulation and integrating it into business and industrial technology courses. Modeling and simulation is the science of using software to construct a system within an organization and then running simulations of proposed changes to assess results before funds are spent. In…

  11. The Impact of Human Patient Simulation on the Attainment of Learning Outcomes

    ERIC Educational Resources Information Center

    Re, Antonio

    2011-01-01

    Human patient simulation, and more specifically, high fidelity patient simulation is a growing teaching technique that enables students in medical and health related professions to learn through interacting with a simulator. This study examined the uses of high fidelity simulation with 106 students enrolled in nursing and respiratory therapist…

  12. Investigation of Propagation in Foliage Using Simulation Techniques

    DTIC Science & Technology

    2011-12-01

    simulation models provide a rough approximation to radiowave propagation in an actual rainforest environment. Based on the simulated results, the path... Contents excerpt: 1. Rainforest; 2. Electrical Properties of a Forest; B. Objectives of...

  13. Reducing EnergyPlus Run Time For Code Compliance Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Athalye, Rahul A.; Gowri, Krishnan; Schultz, Robert W.

    2014-09-12

    Integration of the EnergyPlus™ simulation engine into performance-based code compliance software raises a concern about simulation run time, which impacts timely feedback of compliance results to the user. EnergyPlus annual simulations for proposed and code-baseline building models, together with mechanical equipment sizing, result in simulation run times beyond acceptable limits. This paper presents a study that compares the results of a shortened simulation period using 4 weeks of hourly weather data (one per quarter) to an annual simulation using the full 52 weeks of hourly weather data. Three representative building types based on DOE Prototype Building Models and three climate zones were used for determining the validity of a shortened simulation run period. Further sensitivity analysis and run time comparisons were made to evaluate the robustness and run time savings of this approach. The results of this analysis show that the shortened simulation run period provides compliance index calculations within 1% of those predicted using annual simulation results, and typically saves about 75% of simulation run time.
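
    The scaling logic behind the 4-week approach is simple enough to sketch: simulate one representative week per quarter and extrapolate to the year. In the Python toy below, weekly_energy() is a hypothetical stand-in for a full EnergyPlus weekly run and the chosen weeks are assumptions; with well-chosen weeks the study above reports errors within 1% at roughly 75% run time savings.

      # Shortened-run-period estimate: 4 representative weeks -> annual total.
      def weekly_energy(week):        # stand-in for an EnergyPlus weekly run
          return 100.0 + 30.0 * abs(week - 26) / 26.0  # fake seasonal shape (kWh)

      quarter_weeks = [7, 20, 33, 46] # one assumed representative week per quarter
      estimate = sum(weekly_energy(w) for w in quarter_weeks) * (52 / 4)
      full_year = sum(weekly_energy(w) for w in range(52))

      print(f"4-week estimate: {estimate:.0f} kWh, full year: {full_year:.0f} kWh")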

  14. American Society of Composites, 32nd Technical Conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aitharaju, Venkat; Wollschlager, Jeffrey; Plakomytis, Dimitrios

    This paper will present a general methodology by which weave draping manufacturing simulation results can be utilized to include the effects of weave draping and scissor angle in a structural multiscale simulation. While the methodology developed is general in nature, this paper will specifically demonstrate the methodology applied to a truncated pyramid, utilizing manufacturing simulation weave draping results from ESI PAM-FORM, and multiscale simulation using Altair Multiscale Designer (MDS) and OptiStruct. From a multiscale simulation perspective, the weave draping manufacturing simulation results will be used to develop a series of woven unit cells that cover the range of weave scissor angles existing within the part. For each unit cell, a multiscale material model will be developed and applied to the corresponding spatial locations within the structural simulation mesh. In addition, the principal material orientation will be mapped from the weave draping manufacturing simulation mesh to the structural simulation mesh using Altair HyperMesh mapping technology. Results of the coupled simulation will be compared and verified against corresponding experimental data available via the General Motors (GM) Department of Energy (DOE) project.
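
    The draping-to-multiscale hand-off described above amounts to binning per-element scissor angles into a small set of unit-cell material models. The Python sketch below shows the bookkeeping with assumed bin edges and element data; it illustrates the mapping step only, not the PAM-FORM/Multiscale Designer tool chain itself.

      # Bin draping-simulation scissor angles into unit-cell material ids.
      # Bin edges and per-element angles are illustrative assumptions.
      import numpy as np

      scissor_deg = np.array([2.0, 11.5, 24.0, 37.8, 45.1])      # from draping
      bin_edges = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])  # unit-cell coverage
      unit_cell_id = np.digitize(scissor_deg, bin_edges)         # ids 1..5

      for elem, (angle, mat) in enumerate(zip(scissor_deg, unit_cell_id)):
          print(f"element {elem}: scissor angle {angle:5.1f} deg -> unit cell {mat}")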

  15. Effectiveness of online simulation training: Measuring faculty knowledge, perceptions, and intention to adopt.

    PubMed

    Kim, Sujeong; Park, Chang; O'Rourke, Jennifer

    2017-04-01

    Best practice standards of simulation recommend standardized simulation training for nursing faculty. Online training may offer an effective and more widely available alternative to in-person training. Using the Theory of Planned Behavior, this study evaluated the effectiveness of an online simulation training program, examining faculty's foundational knowledge of simulation as well as perceptions and intention to adopt. One-group pretest-posttest design. A large school of nursing with a main campus and five regional campuses in the Midwestern United States. Convenience sample of 52 faculty participants. Knowledge of foundational simulation principles was measured by pre/post-training module quizzes. Perceptions and the intention to adopt simulation were measured using the Faculty Attitudes and Intent to Use Related to the Human Patient Simulator questionnaire. There was a significant improvement in faculty knowledge after training and observable improvements in attitudes. Attitudes significantly influenced the intention to adopt simulation (B=2.54, p<0.001). Online simulation training provides an effective alternative for training large numbers of nursing faculty who seek to implement best practice of standards within their institutions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. SimulaTE: simulating complex landscapes of transposable elements of populations.

    PubMed

    Kofler, Robert

    2018-04-15

    Estimating the abundance of transposable elements (TEs) in populations (or tissues) promises to answer many open research questions. However, progress is hampered by the lack of concordance between different approaches for TE identification and thus potentially unreliable results. To address this problem, we developed SimulaTE, a tool that generates TE landscapes for populations using a newly developed domain-specific language (DSL). The simple syntax of our DSL allows for easily building even complex TE landscapes that have, for example, nested, truncated and highly diverged TE insertions. Reads may be simulated for the populations using different sequencing technologies (PacBio, Illumina paired-ends) and strategies (sequencing individuals and pooled populations). The comparison between the expected (i.e. simulated) and the observed results will guide researchers in finding the most suitable approach for a particular research question. SimulaTE is implemented in Python and available at https://sourceforge.net/projects/simulates/. Manual: https://sourceforge.net/p/simulates/wiki/Home/#manual; Test data and tutorials: https://sourceforge.net/p/simulates/wiki/Home/#walkthrough; Validation: https://sourceforge.net/p/simulates/wiki/Home/#validation. Contact: robert.kofler@vetmeduni.ac.at.

  17. From Simulation to Real Robots with Predictable Results: Methods and Examples

    NASA Astrophysics Data System (ADS)

    Balakirsky, S.; Carpin, S.; Dimitoglou, G.; Balaguer, B.

    From a theoretical perspective, one may easily argue (as we will in this chapter) that simulation accelerates the algorithm development cycle. However, in practice many in the robotics development community share the sentiment that “Simulation is doomed to succeed” (Brooks, R., Matarić, M., Robot Learning, Kluwer Academic Press, Hingham, MA, 1993, p. 209). This comes in large part from the fact that many simulation systems are brittle; they do a fair-to-good job of simulating the expected, and fail to simulate the unexpected. It is the authors' belief that a simulation system is only as good as its models, and that deficiencies in these models lead to the majority of these failures. This chapter will attempt to address these deficiencies by presenting a systematic methodology with examples for the development of both simulated mobility models and sensor models for use with one of today's leading simulation engines. Techniques for using simulation for algorithm development leading to real-robot implementation will be presented, as well as opportunities for involvement in international robotics competitions based on these techniques.

  18. Genetic Simulation Resources: a website for the registration and discovery of genetic data simulators

    PubMed Central

    Peng, Bo; Chen, Huann-Sheng; Mechanic, Leah E.; Racine, Ben; Clarke, John; Clarke, Lauren; Gillanders, Elizabeth; Feuer, Eric J.

    2013-01-01

    Summary: Many simulation methods and programs have been developed to simulate genetic data of the human genome. These data have been widely used, for example, to predict properties of populations retrospectively or prospectively according to mathematically intractable genetic models, and to assist the validation, statistical inference and power analysis of a variety of statistical models. However, owing to the differences in type of genetic data of interest, simulation methods, evolutionary features, input and output formats, terminologies and assumptions for different applications, choosing the right tool for a particular study can be a resource-intensive process that usually involves searching, downloading and testing many different simulation programs. Genetic Simulation Resources (GSR) is a website provided by the National Cancer Institute (NCI) that aims to help researchers compare and choose the appropriate simulation tools for their studies. This website allows authors of simulation software to register their applications and describe them with well-defined attributes, thus allowing site users to search and compare simulators according to specified features. Availability: http://popmodels.cancercontrol.cancer.gov/gsr. Contact: gsr@mail.nih.gov PMID:23435068

  19. Judicious use of simulation technology in continuing medical education.

    PubMed

    Curtis, Michael T; DiazGranados, Deborah; Feldman, Moshe

    2012-01-01

    Use of simulation-based training is fast becoming a vital source of experiential learning in medical education. Although simulation is a common tool for undergraduate and graduate medical education curricula, the utilization of simulation in continuing medical education (CME) is still an area of growth. As more CME programs turn to simulation to address their training needs, it is important to highlight concepts of simulation technology that can help to optimize learning outcomes. This article discusses the role of fidelity in medical simulation. It provides support from a cross section of simulation training domains for determining the appropriate levels of fidelity, and it offers guidelines for creating an optimal balance of skill practice and realism for efficient training outcomes. After defining fidelity, 3 dimensions of fidelity, drawn from the human factors literature, are discussed in terms of their relevance to medical simulation. From this, research-based guidelines are provided to inform CME providers regarding the use of simulation in CME training. Copyright © 2012 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.

  20. Large eddy simulation for atmospheric boundary layer flow over flat and complex terrains

    NASA Astrophysics Data System (ADS)

    Han, Yi; Stoellinger, Michael; Naughton, Jonathan

    2016-09-01

    In this work, we present Large Eddy Simulation (LES) results of atmospheric boundary layer (ABL) flow over complex terrain with neutral stratification using the OpenFOAM-based simulator for on/offshore wind farm applications (SOWFA). The complete work flow to investigate the LES for the ABL over real complex terrain is described including meteorological-tower data analysis, mesh generation and case set-up. New boundary conditions for the lateral and top boundaries are developed and validated to allow inflow and outflow as required in complex terrain simulations. The turbulent inflow data for the terrain simulation is generated using a precursor simulation of a flat and neutral ABL. Conditionally averaged met-tower data is used to specify the conditions for the flat precursor simulation and is also used for comparison with the simulation results of the terrain LES. A qualitative analysis of the simulation results reveals boundary layer separation and recirculation downstream of a prominent ridge that runs across the simulation domain. Comparisons of mean wind speed, standard deviation and direction between the computed results and the conditionally averaged tower data show a reasonable agreement.
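
    The precursor simulation mentioned above is typically tuned so that its mean profile matches the neutral surface-layer log law. The short Python sketch below evaluates that profile with assumed friction velocity and roughness length (illustrative values, not the paper's met-tower fit):

      # Logarithmic law of the wall for a neutral surface layer.
      # u_star and z0 are assumed illustrative values.
      import numpy as np

      kappa, u_star, z0 = 0.41, 0.45, 0.1  # von Karman const., friction velocity (m/s), roughness length (m)

      def log_law(z):
          """Mean wind speed (m/s) at height z (m)."""
          return (u_star / kappa) * np.log(z / z0)

      for z in (10, 40, 80, 120):
          print(f"z = {z:4d} m : U = {log_law(z):.1f} m/s")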
