Sample records for "event realistically simulated"

  1. Novel high-fidelity realistic explosion damage simulation for urban environments

    NASA Astrophysics Data System (ADS)

    Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya

    2010-04-01

    Realistic building damage simulation has a significant impact on modern modeling and simulation systems, especially across the wide range of military and civil applications where such systems are used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and surrounding entities. However, none of the existing building damage simulation systems achieves the degree of realism required for effective military applications. In this paper, we present a novel physics-based, high-fidelity, runtime-efficient explosion simulation system that realistically simulates destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions based on static and/or dynamic detonation conditions. The system also accounts for rubble pile formation, applies a generic, scalable, multi-component object representation to describe scene entities, and uses a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system uses a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system can realistically simulate rubble generation, rubble flyout, and their primary and secondary impacts on surrounding objects, including buildings, other structures, vehicles, and pedestrians, in clusters of sequential and parallel damage events.

  2. Evaluating average and atypical response in radiation effects simulations

    NASA Astrophysics Data System (ADS)

    Weller, R. A.; Sternberg, A. L.; Massengill, L. W.; Schrimpf, R. D.; Fleetwood, D. M.

    2003-12-01

    We examine the limits of performing single-event simulations using pre-averaged radiation events. Geant4 simulations show the necessity, for future devices, to supplement current methods with ensemble averaging of device-level responses to physically realistic radiation events. Initial Monte Carlo simulations have generated a significant number of extremal events in local energy deposition. These simulations strongly suggest that proton strikes of sufficient energy, even those that initiate purely electronic interactions, can initiate device response capable in principle of producing single event upset or microdose damage in highly scaled devices.
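
    The distinction drawn above between pre-averaged events and ensemble averaging of device-level responses can be illustrated with a toy Monte Carlo sketch. The threshold response function and lognormal event distribution below are illustrative assumptions, not the paper's device model or Geant4 output:

```python
import numpy as np

rng = np.random.default_rng(0)

def device_response(energy_mev, threshold=1.5):
    """Hypothetical nonlinear device response: an upset (1.0) occurs only
    when deposited energy exceeds a critical threshold (illustrative)."""
    return np.where(np.asarray(energy_mev) > threshold, 1.0, 0.0)

# Monte Carlo ensemble of energy-deposition events; the heavy-tailed
# lognormal stands in for physically realistic event-to-event variation.
events = rng.lognormal(mean=-0.5, sigma=1.0, size=100_000)

# Pre-averaged approach: respond to the single mean event, which sits
# below threshold and so misses every extremal deposition.
pre_averaged = float(device_response(events.mean()))

# Ensemble approach: average the per-event responses, which captures
# the tail events that can trigger an upset.
ensemble = float(device_response(events).mean())

print(pre_averaged, ensemble)
```

Because the response is nonlinear, the response to the average event and the average of the per-event responses differ, which is the paper's motivation for ensemble averaging.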

  3. Assessment of upper-ocean variability and the Madden-Julian Oscillation in extended-range air-ocean coupled mesoscale simulations

    NASA Astrophysics Data System (ADS)

    Hong, Xiaodong; Reynolds, Carolyn A.; Doyle, James D.; May, Paul; O'Neill, Larry

    2017-06-01

    Atmosphere-ocean interaction, particularly the ocean response to strong atmospheric forcing, is a fundamental component of the Madden-Julian Oscillation (MJO). In this paper, we examine how model errors in previous MJO events can affect the simulation of subsequent MJO events through increased errors that develop in the upper ocean before the MJO initiation stage. Two fully coupled numerical simulations with 45-km and 27-km horizontal resolutions were integrated for a two-month period from November to December 2011 using the Navy's limited-area Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS®). Three MJO events occurred during the simulation period, in early November, mid-November, and mid-December. The 45-km simulation shows excessive warming of sea surface temperatures (SSTs) during the suppressed phase that occurs before the initiation of the second MJO event, due to erroneously strong surface net heat fluxes. The simulated second MJO event stalls over the Maritime Continent, which prevents the recovery of the deep mixed layer and associated barrier layer. Cross-wavelet analysis of solar radiation and SSTs reveals that diurnal warming is absent during the second suppressed phase after the second MJO event. The mixed layer heat budget indicates that the cooling is primarily caused by horizontal advection associated with the stalling of the second MJO event, and the cool SSTs fail to initiate the third MJO event. When the horizontal resolution is increased to 27 km, all three MJOs are simulated and compare well with observations on multi-month timescales. The higher-resolution simulation of the second MJO event and the more realistic upper-ocean response promote the onset of the third MJO event.
Simulations performed with analyzed SSTs indicate that the stalling of the second MJO in the 45-km run is a robust feature, regardless of ocean forcing, while the diurnal cycle analysis indicates that both 45-km and 27-km ocean resolutions respond realistically when provided with realistic atmospheric forcing. Thus, the problem in the 45-km simulation appears to originate in the atmosphere. Additional simulations show that while the details of the simulations are sensitive to small changes in the initial integration time, the large differences between the 45-km and 27-km runs during the suppressed phase in early December are robust.

  4. A Madden-Julian oscillation event realistically simulated by a global cloud-resolving model.

    PubMed

    Miura, Hiroaki; Satoh, Masaki; Nasuno, Tomoe; Noda, Akira T; Oouchi, Kazuyoshi

    2007-12-14

    A Madden-Julian Oscillation (MJO) is a massive weather event consisting of deep convection coupled with atmospheric circulation, moving slowly eastward over the Indian and Pacific Oceans. Despite its enormous influence on many weather and climate systems worldwide, it has proven very difficult to simulate an MJO because of assumptions about cumulus clouds in global meteorological models. Using a model that allows direct coupling of the atmospheric circulation and clouds, we successfully simulated the slow eastward migration of an MJO event. Topography, the zonal sea surface temperature gradient, and interplay between eastward- and westward-propagating signals controlled the timing of the eastward transition of the convective center. Our results demonstrate the potential for making month-long MJO predictions when global cloud-resolving models with realistic initial conditions are used.

  5. A seat cushion to provide realistic acceleration cues for aircraft simulators

    NASA Technical Reports Server (NTRS)

    Ashworth, B. R.

    1976-01-01

    A seat cushion to provide acceleration cues for aircraft simulator pilots was built, performance tested, and evaluated. The four-cell seat, using a thin air cushion with highly responsive pressure control, attempts to reproduce the same events which occur in an aircraft seat under acceleration loading. The pressure controller provides seat cushion responses which are considered adequate for current high-performance aircraft simulations. The initial tests of the seat cushions have resulted in excellent pilot opinion of the cushion's ability to provide realistic and useful cues to the simulator pilot.

  6. Radiation Damage to Nervous System: Designing Optimal Models for Realistic Neuron Morphology in Hippocampus

    NASA Astrophysics Data System (ADS)

    Batmunkh, Munkhbaatar; Bugay, Alexander; Bayarchimeg, Lkhagvaa; Lkhagva, Oidov

    2018-02-01

    The present study is focused on the development of optimal models of neuron morphology for Monte Carlo microdosimetry simulations of initial radiation-induced events of heavy charged particles in specific types of cells of the hippocampus, the most radiation-sensitive structure of the central nervous system. The neuron geometry and particle track structures were simulated with the Geant4/Geant4-DNA Monte Carlo toolkits. The calculations were made for beams of protons and heavy ions with different energies and doses corresponding to real fluxes of galactic cosmic rays. A simple compartmental model and a complex model with realistic morphology extracted from experimental data were constructed and compared. We estimated the distribution of energy deposition events and the production of reactive chemical species within the developed models of CA3/CA1 pyramidal neurons and DG granule cells of the rat hippocampus under exposure to different particles at the same dose. Similar distributions of energy deposition events and concentrations of some oxidative radical species were obtained in both the simplified and realistic neuron models.

  7. Fully 3D modeling of tokamak vertical displacement events with realistic parameters

    NASA Astrophysics Data System (ADS)

    Pfefferle, David; Ferraro, Nathaniel; Jardin, Stephen; Bhattacharjee, Amitava

    2016-10-01

    In this work, we model the complex multi-domain and highly non-linear physics of Vertical Displacement Events (VDEs), one of the most damaging off-normal events in tokamaks, with the implicit 3D extended MHD code M3D-C1. The code has recently acquired the capability to include finite-thickness conducting structures within the computational domain. By exploiting the possibility of running a linear 3D calculation on top of a non-linear 2D simulation, we monitor the non-axisymmetric stability and assess the eigen-structure of kink modes as the simulation proceeds. Once a stability boundary is crossed, a fully 3D non-linear calculation is launched for the remainder of the simulation, starting from an earlier time of the 2D run. This procedure, along with adaptive zoning, greatly increases the efficiency of the calculation and makes it possible to perform VDE simulations with realistic parameters and high resolution. Simulations are being validated with NSTX data, where both axisymmetric (toroidally averaged) and non-axisymmetric induced and conductive (halo) currents have been measured. This work is supported by US DOE Grant DE-AC02-09CH11466.

  8. Challenges to the development of complex virtual reality surgical simulations.

    PubMed

    Seymour, N E; Røtnes, J S

    2006-11-01

    Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as a smaller potential return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated to establish their usefulness in formative training and skill assessment.

  9. Intervention: Simulating the War on Global Terrorism

    ERIC Educational Resources Information Center

    Steinbrink, John E.; Helmer, Joel W.

    2004-01-01

    Students analyze a contemporary geopolitical event from a comprehensive geographic perspective using role play simulation, discussion, and decision-making. The three-day activity provides teachers with a realistic, ready-made classroom lesson that combines powerful conceptual learning with drama and surprise. The task of the teacher is to…

  10. A geostatistical extreme-value framework for fast simulation of natural hazard events

    PubMed Central

    Stephenson, David B.

    2016-01-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
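
    The general idea of the framework (generalized Pareto margins linked by a t-process for spatial dependence) can be sketched with a small copula-style simulation. The site layout, correlation length, degrees of freedom, and GPD parameters below are all illustrative assumptions, not the fitted values from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Three hypothetical gauge sites with an exponential spatial correlation.
coords = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 2.0]])
dists = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
corr = np.exp(-dists / 2.0)          # assumed correlation length of 2

df = 5                                # t-process degrees of freedom (assumed)
n_events = 10_000

# Multivariate-t draws: correlated normals divided by a chi-square scale.
z = rng.multivariate_normal(np.zeros(3), corr, size=n_events)
g = rng.chisquare(df, size=(n_events, 1))
t_samples = z / np.sqrt(g / df)

# Transform to uniforms via the t CDF, then to GPD margins
# (shape 0.1, scale 5 m/s above a 20 m/s threshold -- illustrative values).
u = stats.t.cdf(t_samples, df)
gusts = 20.0 + stats.genpareto.ppf(u, c=0.1, scale=5.0)

print(gusts.shape)  # one simulated gust footprint per row
```

Because the whole event set is drawn in one vectorized pass, many thousands of years of synthetic footprints are cheap to generate, which is the efficiency property the abstract highlights.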

  11. Evaluation of cool season precipitation event characteristics over the Northeast US in a suite of downscaled climate model hindcasts

    NASA Astrophysics Data System (ADS)

    Loikith, Paul C.; Waliser, Duane E.; Kim, Jinwon; Ferraro, Robert

    2017-08-01

    Cool season precipitation event characteristics are evaluated across a suite of downscaled climate models over the northeastern US. Downscaled hindcast simulations are produced by dynamically downscaling the Modern-Era Retrospective Analysis for Research and Applications version 2 (MERRA2) using the National Aeronautics and Space Administration (NASA)-Unified Weather Research and Forecasting (NU-WRF) regional climate model (RCM) and the Goddard Earth Observing System Model, Version 5 (GEOS-5) global climate model. NU-WRF RCM simulations are produced at 24-, 12-, and 4-km horizontal resolutions using a range of spectral nudging schemes, while the MERRA2 global downscaled run is provided at 12.5 km. All model runs are evaluated using four metrics designed to capture key features of precipitation events: event frequency, event intensity, event total, and event duration. Overall, the downscaling approaches result in a reasonable representation of many of the key features of precipitation events over the region; however, considerable biases exist in the magnitude of each metric. Based on this evaluation, there is no clear indication that higher-resolution simulations produce more realistic results in general; however, many small-scale features, such as orographic enhancement of precipitation, are only captured at higher resolutions, suggesting some added value over coarser resolutions. While the differences between simulations produced with and without nudging are small, there is some improvement in model fidelity when nudging is introduced, especially at a cutoff wavelength of 600 km compared to 2000 km. Based on the results of this evaluation, dynamical regional downscaling using NU-WRF results in a more realistic representation of precipitation event climatology than the global downscaling of MERRA2 using GEOS-5.
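
    The four event metrics named above can be computed from a precipitation time series with a simple run-length scan. The wet threshold and hourly timestep here are illustrative choices, not those used in the evaluation:

```python
import numpy as np

def precip_event_metrics(precip, wet_threshold=0.1):
    """Compute event frequency, mean intensity, mean event total, and mean
    duration from an hourly precipitation series (mm/h). An 'event' is a
    maximal run of values above wet_threshold (illustrative definition)."""
    p = np.asarray(precip, dtype=float)
    wet = p > wet_threshold
    # Run boundaries: starts where dry->wet, ends where wet->dry.
    edges = np.diff(np.concatenate(([0], wet.astype(int), [0])))
    starts = np.flatnonzero(edges == 1)
    ends = np.flatnonzero(edges == -1)
    if len(starts) == 0:
        return {"frequency": 0, "intensity": 0.0, "total": 0.0, "duration": 0.0}
    durations = ends - starts                       # hours per event
    totals = np.array([p[s:e].sum() for s, e in zip(starts, ends)])
    return {
        "frequency": int(len(starts)),              # number of events
        "intensity": float(np.mean(totals / durations)),  # mm/h per event
        "total": float(totals.mean()),              # mm per event
        "duration": float(durations.mean()),        # hours per event
    }

series = [0.0, 0.5, 1.5, 0.0, 0.0, 2.0, 0.0]
print(precip_event_metrics(series))
```

Applied to a model run and to MERRA2 at matched grid points, differences in these four numbers would quantify the biases the abstract describes.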

  12. Intensification of convective extremes driven by cloud-cloud interaction

    NASA Astrophysics Data System (ADS)

    Moseley, Christopher; Hohenegger, Cathy; Berg, Peter; Haerter, Jan O.

    2016-10-01

    In a changing climate, a key role may be played by the response of convective-type cloud and precipitation to temperature changes. Yet, it is unclear if convective precipitation intensities will increase mainly due to thermodynamic or dynamical processes. Here we perform large eddy simulations of convection by imposing a realistic diurnal cycle of surface temperature. We find convective events to gradually self-organize into larger cloud clusters and those events occurring late in the day to produce the highest precipitation intensities. Tracking rain cells throughout their life cycles, we show that events which result from collisions respond strongly to changes in boundary conditions, such as temperature changes. Conversely, events not resulting from collisions remain largely unaffected by the boundary conditions. Increased surface temperature indeed leads to more interaction between events and stronger precipitation extremes. However, comparable intensification occurs when leaving temperature unchanged but simply granting more time for self-organization. These findings imply that the convective field as a whole acquires a memory of past precipitation and inter-cloud dynamics, driving extremes. For global climate model projections, our results suggest that the interaction between convective clouds must be incorporated to simulate convective extremes and the diurnal cycle more realistically.

  13. Using remotely sensed data and stochastic models to simulate realistic flood hazard footprints across the continental US

    NASA Astrophysics Data System (ADS)

    Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.

    2017-12-01

    Remotely sensed data have transformed the field of large-scale hydraulic modelling. New digital elevation, hydrography, and river width data have allowed such models to be created for the first time, and remotely sensed observations of water height, slope, and water extent have allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental, or even global scales. However, continental-scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1 in 100 year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60 km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event 'footprint'. Despite this, many national-scale risk analyses still use 'constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge-to-gauge correlations.
We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in space. We undertake a number of quality checks of the stochastic model and compare real and simulated footprints to show that the method is able to re-create realistic patterns even at continental scales where there is large variation in flood generating mechanisms. We then show how these patterns can be used to drive a large-scale 2D hydraulic model to predict regional-scale flooding.

  14. Assessing methane emission estimation methods based on atmospheric measurements from oil and gas production using LES simulations

    NASA Astrophysics Data System (ADS)

    Saide, P. E.; Steinhoff, D.; Kosovic, B.; Weil, J.; Smith, N.; Blewitt, D.; Delle Monache, L.

    2017-12-01

    A wide variety of methods have been proposed and used to estimate methane emissions from oil and gas production by using air composition and meteorology observations in conjunction with dispersion models. Although there has been some verification of these methodologies using controlled releases and concurrent atmospheric measurements, it is difficult to assess the accuracy of these methods for more realistic scenarios considering factors such as terrain, emissions from multiple components within a well pad, and time-varying emissions representative of typical operations. In this work we use a large-eddy simulation (LES) to generate controlled but realistic synthetic observations, which can be used to test multiple source term estimation methods, an approach also known as an Observing System Simulation Experiment (OSSE). The LES is based on idealized simulations of the Weather Research & Forecasting (WRF) model at 10 m horizontal grid spacing covering an 8 km by 7 km domain with terrain representative of a region located in the Barnett shale. Well pads are set up in the domain following a realistic distribution, and emissions are prescribed every second for the components of each well pad (e.g., chemical injection pump, pneumatics, compressor, tanks, and dehydrator) using a simulator driven by oil and gas production volume, composition, and realistic operational conditions. The system is set up to allow assessments under different scenarios such as normal operations, liquids unloading events, or other prescribed operational upset events. Methane and meteorology model output are sampled following the specifications of the emission estimation methodologies and considering typical instrument uncertainties, resulting in realistic observations (see Figure 1). We will show the evaluation of several emission estimation methods, including the EPA Other Test Method 33A and estimates using the EPA AERMOD regulatory model.
We will also show source estimation results from advanced methods such as variational inverse modeling, and Bayesian inference and stochastic sampling techniques. Future directions including other types of observations, other hydrocarbons being considered, and assessment of additional emission estimation methods will be discussed.

  15. Modeling of extreme freshwater outflow from the north-eastern Japanese river basins to western Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Troselj, Josko; Sayama, Takahiro; Varlamov, Sergey M.; Sasaki, Toshiharu; Racault, Marie-Fanny; Takara, Kaoru; Miyazawa, Yasumasa; Kuroki, Ryusuke; Yamagata, Toshio; Yamashiki, Yosuke

    2017-12-01

    This study demonstrates the importance of accurate extreme discharge input in combined hydrological and oceanographic modeling by examining two extreme typhoon events. We investigated the effects of extreme freshwater outflow events from river mouths on the sea surface salinity (SSS) distribution in the coastal zone of north-eastern Japan. Previous studies have used observed discharge at the river mouth, as well as seasonally averaged inter-annual, annual, monthly, or daily simulated data. Here, we reproduced the hourly peak discharge during two typhoon events for a targeted set of nine rivers and compared their impact on SSS in the coastal zone based on observed, climatological, and simulated freshwater outflows, in conjunction with verification of the results using satellite remote-sensing data. We created a set of hourly simulated freshwater outflow data from nine first-class Japanese river basins flowing into the western Pacific Ocean for the two targeted typhoon events (Chataan and Roke) and used it with the integrated hydrological (CDRMV3.1.1) and oceanographic (JCOPE-T) models to compare the case using climatological mean monthly discharges as freshwater input from rivers with the case using discharges simulated by our hydrological model. By using the CDRMV model optimized with the SCE-UA method, we successfully reproduced hindcasts of peak discharges of extreme typhoon events at the river mouths and could consider multiple river basin locations. Modeled SSS results were verified by comparison with the Chlorophyll-a distribution observed by satellite remote sensing. The projection of SSS in the coastal zone became more realistic than without including extreme freshwater outflow. These results suggest that our hydrological models, with optimized model parameters calibrated to the Typhoon Roke and Chataan cases, can be successfully used to predict runoff values from other extreme precipitation events with similar physical characteristics.
Proper simulation of extreme typhoon events provides more realistic coastal SSS and may allow a different scenario analysis with various precipitation inputs for developing a nowcasting analysis in the future.

  16. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
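
    A minimal sketch of the Monte Carlo approach such tools apply to patient flow and wait time is a single-server clinic with exponential interarrival and service draws. The parameters and hypothetical `simulate_clinic` helper are illustrative, not from the article:

```python
import random

def simulate_clinic(n_patients=1000, arrival_mean=10.0, service_mean=8.0, seed=1):
    """Single-server discrete-event sketch: patients arrive, wait if the
    clinician is busy, then are served. All times are in minutes; the
    exponential distributions are an illustrative modeling assumption."""
    rng = random.Random(seed)
    clock = server_free_at = 0.0
    waits = []
    for _ in range(n_patients):
        clock += rng.expovariate(1.0 / arrival_mean)   # next arrival time
        start = max(clock, server_free_at)             # queue if server busy
        waits.append(start - clock)                    # time spent waiting
        server_free_at = start + rng.expovariate(1.0 / service_mean)
    return sum(waits) / len(waits)                     # mean wait (minutes)

print(f"mean wait: {simulate_clinic():.1f} minutes")
```

Running such a model repeatedly with different staffing or arrival parameters is how the facility planning and staffing questions mentioned above are typically explored.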

  17. Modeling Supermassive Black Holes in Cosmological Simulations

    NASA Astrophysics Data System (ADS)

    Tremmel, Michael

    My thesis work has focused on improving the implementation of supermassive black hole (SMBH) physics in cosmological hydrodynamic simulations. SMBHs are ubiquitous in massive galaxies, as well as bulge-less galaxies and dwarfs, and are thought to be a critical component of massive galaxy evolution. Still, much is unknown about how SMBHs form, grow, and affect their host galaxies. Cosmological simulations are an invaluable tool for understanding the formation of galaxies, self-consistently tracking their evolution with realistic merger and gas accretion histories. SMBHs are often modeled in these simulations (generally as a necessity to produce realistic massive galaxies), but their implementations are commonly simplified in ways that can limit what can be learned. Current and future observations are opening new windows into the lifecycle of SMBHs and their host galaxies, but require more detailed, physically motivated simulations. Within the novel framework I have developed, SMBHs 1) are seeded at early times without a priori assumptions of galaxy occupation, 2) grow in a way that accounts for the angular momentum of gas, and 3) experience realistic orbital evolution. I show how this model, properly tuned with a novel parameter optimization technique, results in realistic galaxies and SMBHs. Utilizing the unique ability of these simulations to capture the dynamical evolution of SMBHs, I present the first self-consistent prediction for the formation timescales of close SMBH pairs, precursors to SMBH binaries and merger events potentially detected by future gravitational wave experiments.

  18. The Umbra Simulation and Integration Framework Applied to Emergency Response Training

    NASA Technical Reports Server (NTRS)

    Hamilton, Paul Lawrence; Britain, Robert

    2010-01-01

    The Mine Emergency Response Interactive Training Simulation (MERITS) is intended to prepare personnel to manage an emergency in an underground coal mine. The creation of an effective training environment required realistic emergent behavior in response to simulation events and trainee interventions, exploratory modification of miner behavior rules, realistic physics, and incorporation of legacy code. It also required the ability to add rich media to the simulation without conflicting with normal desktop security settings. Our Umbra Simulation and Integration Framework facilitated agent-based modeling of miners and rescuers and made it possible to work with subject matter experts to quickly adjust behavior through script editing, rather than through lengthy programming and recompilation. Integration of Umbra code with the WebKit browser engine allowed the use of JavaScript-enabled local web pages for media support. This project greatly extended the capabilities of Umbra in support of training simulations and has implications for simulations that combine human behavior, physics, and rich media.

  19. Track reconstruction and particle identification developments for a study of event-by-event fluctuations in heavy ion collisions at NICA

    NASA Astrophysics Data System (ADS)

    Mudrokh, A. A.; Zinchenko, A. I.

    2017-01-01

    A Monte Carlo simulation of heavy ion collisions (Au+Au) has been performed for MPD (Multi Purpose Detector) at NICA (Dubna) for a study of the possible critical point in the phase diagram of hot nuclear matter. The simulation took into account real detector effects in as much detail as possible to properly describe the apparatus response. Particle identification (PID) has been tuned to account for modifications in track reconstruction. Some results on hadron identification in the TPC and TOF (Time Of Flight) detectors with realistically simulated response have also been obtained.

  20. Seat cushion to provide realistic acceleration cues to aircraft simulator pilot

    NASA Technical Reports Server (NTRS)

    Ashworth, B. R. (Inventor)

    1979-01-01

    Seat cushions, each including an air cell with a non-compressible surface, are disclosed. The apparatus are provided for initially controlling the air pressure in the air cells to allow the two main support areas of the simulator pilot to touch the non-compressible surface and thus begin to compress the flesh near these areas. During a simulated flight the apparatus control the air pressure in the cells to simulate the events that occur in a seat cushion during actual flight.

  1. A parallel computational model for GATE simulations.

    PubMed

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
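
    The "equivalent but not equal" comparison described above can be sketched with SciPy's Mann-Whitney U test on stand-in tally counts. The Poisson model and sample sizes are illustrative placeholders, not actual GATE output:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)

# Stand-ins for per-detector tally counts from a sequential and a parallel
# run: drawn from the same distribution (statistically equivalent) but not
# identical, since a parallel run consumes random-number streams in a
# different order than a sequential one.
sequential = rng.poisson(lam=100, size=500)
parallel = rng.poisson(lam=100, size=500)

stat, p_value = mannwhitneyu(sequential, parallel, alternative="two-sided")

# A large p-value means the two outputs cannot be distinguished.
print(f"U={stat:.0f}, p={p_value:.3f}")
```

The nonparametric test is a natural choice here because tally distributions need not be normal, matching the kind of equivalence check the abstract reports.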

  2. Mesoscale Simulations of a Florida Sea Breeze Using the PLACE Land Surface Model Coupled to a 1.5-Order Turbulence Parameterization

    NASA Technical Reports Server (NTRS)

    Lynn, Barry H.; Stauffer, David R.; Wetzel, Peter J.; Tao, Wei-Kuo; Perlin, Natal; Baker, R. David; Munoz, Ricardo; Boone, Aaron; Jia, Yiqin

    1999-01-01

    A sophisticated land-surface model, PLACE, the Parameterization for Land Atmospheric Convective Exchange, has been coupled to a 1.5-order turbulent kinetic energy (TKE) turbulence sub-model. Both have been incorporated into the Penn State/National Center for Atmospheric Research (PSU/NCAR) mesoscale model MM5. Such model improvements should have their greatest effect in conditions where surface contrasts dominate over dynamic processes, such as the simulation of warm-season convective events. A validation study used the newly coupled model, MM5 TKE-PLACE, to simulate the evolution of Florida sea-breeze moist convection during the Convection and Precipitation Electrification Experiment (CaPE). Overall, eight simulations tested the sensitivity of the MM5 model to combinations of the new and default model physics and the initialization of soil moisture and temperature. The TKE-PLACE model produced more realistic surface sensible heat flux, lower biases for surface variables, and more realistic rainfall and cloud cover than the default model. Of the eight simulations with different factors (i.e., model physics or initialization), TKE-PLACE compared very well when each simulation was ranked in terms of biases of the surface variables and rainfall, and the percent and root mean square of cloud cover. A factor separation analysis showed that a successful simulation required the inclusion of a multi-layered land surface soil-vegetation model, realistic initial soil moisture, and higher-order closure of the planetary boundary layer (PBL). These were needed to realistically model the effects of individual, joint, and synergistic contributions from the land surface and PBL on the CaPE sea breeze, Lake Okeechobee lake breeze, and moist convection.

  3. Turbulent Extreme Event Simulations for Lidar-Assisted Wind Turbine Control

    NASA Astrophysics Data System (ADS)

    Schlipf, David; Raach, Steffen

    2016-09-01

This work presents a wind field generator that allows wind fields to be shaped in the time domain while maintaining their spectral properties. This is done by iteratively generating wind fields and minimizing the error between the wind characteristics of the generated fields and the desired values. The method leads toward realistic ultimate load calculations for lidar-assisted control. This is demonstrated by fitting a turbulent wind field to an Extreme Operating Gust. The wind field is then used to compare a baseline feedback controller alone against a combined feedback and feedforward controller using simulated lidar measurements. The comparison confirms that the lidar-assisted controller is still able to significantly reduce the ultimate loads on the tower base under these more realistic conditions.
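The iteration described above, alternating between a time-domain step toward the desired gust shape and a frequency-domain step that restores the prescribed amplitude spectrum, can be sketched as follows. This is a minimal illustration with an assumed relaxation step, not the authors' implementation:

```python
import numpy as np

def shape_wind_field(target, n_iter=50, seed=0):
    """Iteratively fit a random time series to a target gust shape while
    preserving an amplitude spectrum (here, the target's own spectrum).
    The 0.5 relaxation factor is an illustrative assumption."""
    rng = np.random.default_rng(seed)
    n = len(target)
    spectrum = np.abs(np.fft.rfft(target))            # prescribed amplitudes
    phases = rng.uniform(0.0, 2.0 * np.pi, spectrum.shape)
    u = np.fft.irfft(spectrum * np.exp(1j * phases), n)  # random initial field
    for _ in range(n_iter):
        u = u + 0.5 * (target - u)                    # nudge toward gust shape
        U = np.fft.rfft(u)
        u = np.fft.irfft(spectrum * np.exp(1j * np.angle(U)), n)  # restore spectrum
    return u
```

After each pass the amplitude spectrum is exactly the prescribed one, so the minimization acts only on the phases, which is the sense in which the shaped field keeps its spectral (turbulence) properties.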

  4. Refined lateral energy correction functions for the KASCADE-Grande experiment based on Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2015-02-01

    In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.

  5. Consistent simulation of nonresonant diphoton production in hadron collisions including associated jet production up to two jets

    NASA Astrophysics Data System (ADS)

    Odaka, Shigeru; Kurihara, Yoshimasa

    2016-12-01

An event generator for diphoton (γγ) production in hadron collisions that includes associated jet production up to two jets has been developed using a subtraction method based on the limited leading-log subtraction. The parton shower (PS) simulation to restore the subtracted divergent components involves both quantum electrodynamic (QED) and quantum chromodynamic radiation, and QED radiation at very small Q² is simulated by referring to a fragmentation function (FF). The PS/FF simulation has the ability to enforce the radiation of a given number of energetic photons. The generated events can be fed to PYTHIA to obtain particle (hadron) level event information, which enables us to perform realistic simulations of photon isolation and hadron-jet reconstruction. The simulated events, in which the loop-mediated gg → γγ process is involved, reasonably reproduce the diphoton kinematics measured at the LHC. Using the developed simulation, we found that the two-jet processes significantly contribute to diphoton production. A large two-jet contribution can be considered a common feature of electroweak-boson production in hadron collisions, although the reason is yet to be understood. Discussion concerning the treatment of the underlying events in photon isolation is necessary for future higher-precision measurements.

  6. Realistic training scenario simulations and simulation techniques

    DOEpatents

    Dunlop, William H.; Koncher, Tawny R.; Luke, Stanley John; Sweeney, Jerry Joseph; White, Gregory K.

    2017-12-05

    In one embodiment, a system includes a signal generator operatively coupleable to one or more detectors; and a controller, the controller being both operably coupled to the signal generator and configured to cause the signal generator to: generate one or more signals each signal being representative of at least one emergency event; and communicate one or more of the generated signal(s) to a detector to which the signal generator is operably coupled. In another embodiment, a method includes: receiving data corresponding to one or more emergency events; generating at least one signal based on the data; and communicating the generated signal(s) to a detector.

  7. Simulation and testing of a multichannel system for 3D sound localization

    NASA Astrophysics Data System (ADS)

    Matthews, Edward Albert

    Three-dimensional (3D) audio involves the ability to localize sound anywhere in a three-dimensional space. 3D audio can be used to provide the listener with the perception of moving sounds and can provide a realistic listening experience for applications such as gaming, video conferencing, movies, and concerts. The purpose of this research is to simulate and test 3D audio by incorporating auditory localization techniques in a multi-channel speaker system. The objective is to develop an algorithm that can place an audio event in a desired location by calculating and controlling the gain factors of each speaker. A MATLAB simulation displays the location of the speakers and perceived sound, which is verified through experimentation. The scenario in which the listener is not equidistant from each of the speakers is also investigated and simulated. This research is envisioned to lead to a better understanding of human localization of sound, and will contribute to a more realistic listening experience.
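A common way to compute such per-speaker gain factors is tangent-law amplitude panning between a speaker pair, with a simple inverse-distance correction for the non-equidistant listener case. This is a generic sketch of the technique, not the algorithm developed in this research:

```python
import math

def pair_gains(azimuth_deg, spread_deg=30.0):
    """Tangent-law amplitude panning for a symmetric speaker pair at
    +/- spread_deg; positive azimuth pans toward the right speaker.
    Returns power-normalized (g_left, g_right)."""
    s = math.tan(math.radians(azimuth_deg)) / math.tan(math.radians(spread_deg))
    gl, gr = (1.0 - s) / 2.0, (1.0 + s) / 2.0
    norm = math.hypot(gl, gr)          # constant-power normalization
    return gl / norm, gr / norm

def distance_compensation(gain, d, d_ref=1.0):
    """Boost a speaker that sits farther than the reference distance by
    d/d_ref (inverse-distance law); an assumed correction for the
    non-equidistant listener scenario."""
    return gain * (d / d_ref)
```

At zero azimuth both speakers receive equal gain; at the speaker angle itself all power goes to that speaker, which matches the intuition that a perceived source slides between the pair as the gain ratio changes.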

  8. Advances in computer simulation of genome evolution: toward more realistic evolutionary genomics analysis by approximate bayesian computation.

    PubMed

    Arenas, Miguel

    2015-04-01

Next-generation sequencing (NGS) technologies enable fast and cheap generation of genomic data. Nevertheless, ancestral genome inference is not straightforward, owing to complex evolutionary processes acting on this material, such as inversions, translocations, and other genome rearrangements that, in addition to their implicit complexity, can co-occur and confound ancestral inferences. Recently, models of genome evolution that accommodate such complex genomic events are emerging. This letter explores these novel evolutionary models and proposes their incorporation into robust statistical approaches based on computer simulations, such as approximate Bayesian computation, that may produce a more realistic evolutionary analysis of genomic data. Advantages and pitfalls of these analytical methods are discussed. Potential applications of these ancestral genomic inferences are also pointed out.

  9. Advanced Simulation of Coupled Earthquake and Tsunami Events

    NASA Astrophysics Data System (ADS)

    Behrens, Joern

    2013-04-01

    Tsunami-Earthquakes represent natural catastrophes threatening lives and well-being of societies in a solitary and unexpected extreme event as tragically demonstrated in Sumatra (2004), Samoa (2009), Chile (2010), or Japan (2011). Both phenomena are consequences of the complex system of interactions of tectonic stress, fracture mechanics, rock friction, rupture dynamics, fault geometry, ocean bathymetry, and coastline geometry. The ASCETE project forms an interdisciplinary research consortium that couples the most advanced simulation technologies for earthquake rupture dynamics and tsunami propagation to understand the fundamental conditions of tsunami generation. We report on the latest research results in physics-based dynamic rupture and tsunami wave propagation simulation, using unstructured and adaptive meshes with continuous and discontinuous Galerkin discretization approaches. Coupling both simulation tools - the physics-based dynamic rupture simulation and the hydrodynamic tsunami wave propagation - will give us the possibility to conduct highly realistic studies of the interaction of rupture dynamics and tsunami impact characteristics.

  10. Modeling the Magnetopause Shadowing Loss during the October 2012 Dropout Event

    NASA Astrophysics Data System (ADS)

    Tu, Weichao; Cunningham, Gregory

    2017-04-01

The relativistic electron fluxes in Earth's outer radiation belt are observed to drop by orders of magnitude on timescales of a few hours; such events are called radiation belt dropouts. Where do the electrons go during the dropouts? This is one of the most important outstanding questions in radiation belt studies. Radiation belt electrons can be lost either by precipitation into the atmosphere or by transport across the magnetopause into interplanetary space. The latter mechanism is called magnetopause shadowing and is usually combined with outward radial diffusion of electrons due to the sharp radial gradient it creates. In order to quantify the relative contribution of these two mechanisms to radiation belt dropouts, we performed an event study of the October 2012 dropout event observed by the Van Allen Probes. First, the precipitating MeV electrons observed by multiple NOAA POES satellites at low altitude did not show evidence of enhanced precipitation during the dropout, suggesting that precipitation was not the dominant loss mechanism for the event. Then, in order to simulate the magnetopause shadowing loss and outward radial diffusion during the dropout, we applied a radial diffusion model with electron lifetimes on the order of electron drift periods outside the last closed drift shell. In addition, realistic and event-specific inputs of radial diffusion coefficients (DLL) and the last closed drift shell (LCDS) were implemented in the model. Specifically, we used the new DLL developed by Cunningham [JGR, 2016], which was estimated in the realistic TS04 [Tsyganenko and Sitnov, JGR, 2005] storm-time magnetic field model and includes a physical dependence on K (the second adiabatic invariant), i.e., on pitch angle. An event-specific LCDS traced in the TS04 model with realistic K dependence was also implemented. Our simulation results showed that these event-specific inputs are critical to explaining the electron dropout during the event. The new DLL greatly improved the model performance in the low L* region (L* < 3.6) compared with the empirical Kp-dependent DLL [Brautigam and Albert, JGR, 2000] used in previous radial diffusion models. Combining the event-specific DLL and LCDS, our model captured the magnetopause shadowing loss well and reproduced the electron dropout at L* = 4.0-4.5. In addition, we found that the K-dependent LCDS is critical for reproducing the pitch angle dependence of the observed electron dropout.
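A minimal sketch of such a radial diffusion model, with a short electron lifetime outside an assumed last closed drift shell standing in for magnetopause shadowing, might look like this. The grid, DLL, and lifetimes below are illustrative stand-ins, not the event-specific inputs of the study:

```python
import numpy as np

def radial_diffusion(f, L, dll, tau, dt, n_steps):
    """Explicit finite-difference integration of
        df/dt = L^2 d/dL ( DLL / L^2 * df/dL ) - f / tau,
    where a short lifetime tau (of order the drift period) beyond the
    last closed drift shell mimics magnetopause shadowing loss."""
    dL = L[1] - L[0]
    g = dll / L**2
    for _ in range(n_steps):
        # diffusive fluxes at cell interfaces
        flux = 0.5 * (g[1:] + g[:-1]) * np.diff(f) / dL
        dfdt = np.zeros_like(f)
        dfdt[1:-1] = L[1:-1] ** 2 * np.diff(flux) / dL   # diffusion term
        f = f + dt * (dfdt - f / tau)                    # add lifetime loss
    return f
```

With, say, a uniform initial flux, constant DLL, and a one-time-unit lifetime beyond an assumed LCDS at L = 5, the flux outside the boundary is rapidly depleted while the flux well inside is barely affected, reproducing the qualitative dropout signature.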

  11. BlackMax: A black-hole event generator with rotation, recoil, split branes, and brane tension

    NASA Astrophysics Data System (ADS)

    Dai, De-Chang; Starkman, Glenn; Stojkovic, Dejan; Issever, Cigdem; Rizvi, Eram; Tseng, Jeff

    2008-04-01

    We present a comprehensive black-hole event generator, BlackMax, which simulates the experimental signatures of microscopic and Planckian black-hole production and evolution at the LHC in the context of brane world models with low-scale quantum gravity. The generator is based on phenomenologically realistic models free of serious problems that plague low-scale gravity, thus offering more realistic predictions for hadron-hadron colliders. The generator includes all of the black-hole gray-body factors known to date and incorporates the effects of black-hole rotation, splitting between the fermions, nonzero brane tension, and black-hole recoil due to Hawking radiation (although not all simultaneously). The generator can be interfaced with Herwig and Pythia. The main code can be downloaded from http://www-pnp.physics.ox.ac.uk/~issever/BlackMax/blackmax.html.

  12. Complex discrete dynamics from simple continuous population models.

    PubMed

    Gamarra, Javier G P; Solé, Ricard V

    2002-05-01

Nonoverlapping generations have classically been modelled as difference equations in order to account for the discrete nature of reproductive events. However, other events, such as resource consumption or mortality, are continuous and take place in within-generation time. We therefore consider a more realistic hybrid model: a two-dimensional ODE system for resources and consumers, punctuated by discrete reproduction events. Numerical and analytical approaches showed that the resulting dynamics resemble a Ricker map, including the period-doubling route to chaos. Stochastic simulations showed that a handling-time parameter for indirect competition among juveniles may affect the qualitative behaviour of the model.
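The hybrid structure, continuous within-generation dynamics punctuated by a discrete reproduction event, can be illustrated with a minimal sketch. The equations and parameter values below are hypothetical stand-ins, not the paper's model:

```python
def generation(n, r=1.0, a=1.0, mu=0.5, T=1.0, dt=1e-3):
    """One generation of a toy hybrid model: resource consumption and
    consumer mortality run continuously within the generation
    (forward-Euler ODE steps), then a single discrete reproduction
    event converts total intake into next-generation consumers."""
    R, N = 1.0, n            # resource resets each generation; consumers carry over
    intake = 0.0
    for _ in range(int(T / dt)):
        eaten = a * N * R * dt   # continuous resource consumption
        R -= eaten
        intake += eaten
        N -= mu * N * dt         # continuous within-generation mortality
    return r * intake            # discrete reproduction event
```

Iterating `generation` yields the between-generation map; per-capita intake saturates as consumer density grows, which is the within-generation competition that the paper shows can generate Ricker-like overcompensatory dynamics in its full model.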

  13. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sezen, Halil; Aldemir, Tunc; Denning, R.

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  14. Seismic shaking scenarios in realistic 3D crustal model of Northern Italy

    NASA Astrophysics Data System (ADS)

    Molinari, I.; Morelli, A.; Basini, P.; Berbellini, A.

    2013-12-01

    Simulation of seismic wave propagation in realistic crustal structures is a fundamental tool to evaluate earthquake-generated ground shaking and assess seismic hazard. Current-generation numerical codes, and modern HPC infrastructures, allow for realistic simulations in complex 3D geologic structures. We apply such methodology to the Po Plain in Northern Italy -- a region with relatively rare earthquakes but having large property and industrial exposure, as it became clear during the two M~6 events of May 20-29, 2012. Historical seismicity is well known in this region, with maximum magnitudes estimates reaching M~7, and wave field amplitudes may be significantly amplified by the presence of the very thick sedimentary basin. Our goal is to produce estimates of expected ground shaking in Northern Italy through detailed deterministic simulations of ground motion due to expected earthquakes. We defined a three-dimensional model of the earth's crust using geo-statistical tools to merge the abundant information existing in the form of borehole data and seismic reflection profiles that had been shot in the '70s and the '80s for hydrocarbon exploration. Such information, that has been used by geologists to infer the deep structural setup, had never been merged to build a 3D model to be used for seismological simulations. We implement the model in SPECFEM3D_Cartesian and a hexahedral mesh with elements of ~2km, that allows us to simulate waves with minimum period of ~2 seconds. The model has then been optimized through comparison between simulated and recorded seismograms for the ~20 moderate-magnitude events (Mw > 4.5) that have been instrumentally recorded in the last 15 years. Realistic simulations in the frequency band of most common engineering relevance -- say, ~1 Hz -- at such a large scale would require an extremely detailed structural model, currently not available, and prohibitive computational resources. 
However, interest is growing in longer-period ground motion -- which impacts the seismic response of taller structures (Cauzzi and Faccioli, 2008) -- and it is not unusual to consider the wave field up to 20 s. In this period range, our Po Plain structural model has been shown to reproduce well the basin resonance and amplification effects at stations bordering the sedimentary plain. We then simulate seismic shaking scenarios for possible sources tied to devastating historical earthquakes that are known to have occurred in the region -- such as the M~6 event that hit Modena in 1501, and the M~6.7 Verona quake of 1117, which caused well-documented strong effects in an unusually wide area with a radius of hundreds of kilometers. We explore different source geometries and rupture histories for each earthquake. We mainly focus our attention on the synthesis of the prominent surface waves that are highly amplified in deep sedimentary basin structures (e.g., Smerzini et al., 2011; Koketsu and Miyake, 2008). Such simulations hold high relevance because of the large local property exposure due to extensive industrial and touristic infrastructure. We show that deterministic ground motion calculations can indeed provide information to be actively used to mitigate the effects of destructive earthquakes on critical infrastructure.

  15. Nucleation and arrest of slow slip earthquakes: mechanisms and nonlinear simulations using realistic fault geometries and heterogeneous medium properties

    NASA Astrophysics Data System (ADS)

    Alves da Silva Junior, J.; Frank, W.; Campillo, M.; Juanes, R.

    2017-12-01

Current models for slow slip earthquakes (SSE) assume a simplified fault embedded in a homogeneous half-space. In these models, SSE events nucleate at the transition from velocity strengthening (VS) to velocity weakening (VW) down-dip from the trench and propagate towards the base of the seismogenic zone, where high normal effective stress is assumed to arrest slip. Here, we investigate SSE nucleation and arrest using quasi-static finite element simulations, with rate-and-state friction, on a domain with heterogeneous properties and realistic fault geometry. We use the fault geometry of the Guerrero Gap in the Cocos subduction zone, where SSE events occur every 4 years, as a proxy for a subduction zone. Our model is calibrated using surface displacements from GPS observations. We apply boundary conditions according to the plate convergence rate and impose a depth-dependent pore pressure on the fault. Our simulations indicate that the fault geometry and the elastic properties of the medium play a key role in the arrest of SSE events at the base of the seismogenic zone. SSE arrest occurs due to aseismic deformation of the domain that results in areas of elevated effective stress. SSE nucleation occurs at the transition from VS to VW and propagates as a crack-like expansion with increasing nucleation length prior to dynamic instability. Our simulations, encompassing multiple seismic cycles, indicate SSE recurrence times between 1 and 10 years and, importantly, a systematic increase of rupture area prior to dynamic instability, followed by a hiatus in SSE occurrence. We hypothesize that these SSE characteristics, if confirmed by GPS observations in different subduction zones, can add to the understanding of the nucleation of large earthquakes in the seismogenic zone.

  16. Cybersim: geographic, temporal, and organizational dynamics of malware propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santhi, Nandakishore; Yan, Guanhua; Eidenbenz, Stephan

    2010-01-01

Cyber-infractions into a nation's strategic security envelope pose a constant and daunting challenge. We present the modular CyberSim tool, which has been developed in response to the need to realistically simulate, at a national level, software vulnerabilities and the resulting malware propagation in online social networks. The CyberSim suite (a) can generate realistic scale-free networks from a database of geocoordinated computers to closely model social networks arising from personal and business email contacts and online communities; (b) maintains for each host a list of installed software, along with the latest published vulnerabilities; (c) allows designated initial nodes where malware gets introduced; (d) simulates, using distributed discrete event-driven technology, the spread of malware exploiting a specific vulnerability, with packet delay and user online behavior models; and (e) provides the analyst with a graphical visualization of the spread of infection, its severity, the businesses affected, etc. We present sample simulations on a national-level network with millions of computers.

  17. Mechanisms controlling primary and new production in a global ecosystem model - Part I: Validation of the biological simulation

    NASA Astrophysics Data System (ADS)

    Popova, E. E.; Coward, A. C.; Nurser, G. A.; de Cuevas, B.; Fasham, M. J. R.; Anderson, T. R.

    2006-12-01

    A global general circulation model coupled to a simple six-compartment ecosystem model is used to study the extent to which global variability in primary and export production can be realistically predicted on the basis of advanced parameterizations of upper mixed layer physics, without recourse to introducing extra complexity in model biology. The "K profile parameterization" (KPP) scheme employed, combined with 6-hourly external forcing, is able to capture short-term periodic and episodic events such as diurnal cycling and storm-induced deepening. The model realistically reproduces various features of global ecosystem dynamics that have been problematic in previous global modelling studies, using a single generic parameter set. The realistic simulation of deep convection in the North Atlantic, and lack of it in the North Pacific and Southern Oceans, leads to good predictions of chlorophyll and primary production in these contrasting areas. Realistic levels of primary production are predicted in the oligotrophic gyres due to high frequency external forcing of the upper mixed layer (accompanying paper Popova et al., 2006) and novel parameterizations of zooplankton excretion. Good agreement is shown between model and observations at various JGOFS time series sites: BATS, KERFIX, Papa and HOT. One exception is the northern North Atlantic where lower grazing rates are needed, perhaps related to the dominance of mesozooplankton there. The model is therefore not globally robust in the sense that additional parameterizations are needed to realistically simulate ecosystem dynamics in the North Atlantic. Nevertheless, the work emphasises the need to pay particular attention to the parameterization of mixed layer physics in global ocean ecosystem modelling as a prerequisite to increasing the complexity of ecosystem models.

  18. BlackMax: A black-hole event generator with rotation, recoil, split branes, and brane tension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai Dechang; Starkman, Glenn; Stojkovic, Dejan

    2008-04-01

We present a comprehensive black-hole event generator, BlackMax, which simulates the experimental signatures of microscopic and Planckian black-hole production and evolution at the LHC in the context of brane world models with low-scale quantum gravity. The generator is based on phenomenologically realistic models free of serious problems that plague low-scale gravity, thus offering more realistic predictions for hadron-hadron colliders. The generator includes all of the black-hole gray-body factors known to date and incorporates the effects of black-hole rotation, splitting between the fermions, nonzero brane tension, and black-hole recoil due to Hawking radiation (although not all simultaneously). The generator can be interfaced with Herwig and Pythia. The main code can be downloaded from http://www-pnp.physics.ox.ac.uk/~issever/BlackMax/blackmax.html.

  19. The Attentional Demand of Automobile Driving Revisited: Occlusion Distance as a Function of Task-Relevant Event Density in Realistic Driving Scenarios.

    PubMed

    Kujala, Tuomo; Mäkelä, Jakke; Kotilainen, Ilkka; Tokkonen, Timo

    2016-02-01

    We studied the utility of occlusion distance as a function of task-relevant event density in realistic traffic scenarios with self-controlled speed. The visual occlusion technique is an established method for assessing visual demands of driving. However, occlusion time is not a highly informative measure of environmental task-relevant event density in self-paced driving scenarios because it partials out the effects of changes in driving speed. Self-determined occlusion times and distances of 97 drivers with varying backgrounds were analyzed in driving scenarios simulating real Finnish suburban and highway traffic environments with self-determined vehicle speed. Occlusion distances varied systematically with the expected environmental demands of the manipulated driving scenarios whereas the distributions of occlusion times remained more static across the scenarios. Systematic individual differences in the preferred occlusion distances were observed. More experienced drivers achieved better lane-keeping accuracy than inexperienced drivers with similar occlusion distances; however, driving experience was unexpectedly not a major factor for the preferred occlusion distances. Occlusion distance seems to be an informative measure for assessing task-relevant event density in realistic traffic scenarios with self-controlled speed. Occlusion time measures the visual demand of driving as the task-relevant event rate in time intervals, whereas occlusion distance measures the experienced task-relevant event density in distance intervals. The findings can be utilized in context-aware distraction mitigation systems, human-automated vehicle interaction, road speed prediction and design, as well as in the testing of visual in-vehicle tasks for inappropriate in-vehicle glancing behaviors in any dynamic traffic scenario for which appropriate individual occlusion distances can be defined. © 2015, Human Factors and Ergonomics Society.

  20. Modeling surface backgrounds from radon progeny plate-out

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumpilly, G.; Guiseppe, V. E.; Snyder, N.

    2013-08-08

The next generation of low-background detectors operating deep underground aims for unprecedented low levels of radioactive backgrounds. The surface deposition and subsequent implantation of radon progeny in detector materials will be a source of energetic background events. We investigate Monte Carlo and model-based simulations to understand the surface implantation profile of radon progeny. Depending on the material and the region of interest of a rare event search, these partial energy depositions can be problematic. Motivated by the use of Ge crystals for the detection of neutrinoless double-beta decay, we wish to understand the detector response to surface backgrounds from radon progeny. We look at the simulation of surface decays using a validated implantation distribution based on nuclear recoils and a realistic surface texture. Results of the simulations and measured α spectra are presented.

  1. Practical Sequential Design Procedures for Submarine ASW Search Operational Testing: A Simulation Study

    DTIC Science & Technology

    1998-10-01

The efficient design of a free-play, 24-hour-per-day operational test (OT) of an ASW search system remains a challenge to the OT community. It will...efficient, realistic, free-play, 24-hour-per-day OT. The basic test control premise described here is to stop the test event if the time without a

  2. Extreme Landfalling Atmospheric River Events in Arizona: Possible Future Changes

    NASA Astrophysics Data System (ADS)

    Singh, I.; Dominguez, F.

    2016-12-01

A changing climate could affect the frequency and intensity of extreme atmospheric river (AR) events. This can have important consequences for regions like the Southwestern United States that rely upon AR-related precipitation to meet their water demand and are prone to AR-related flooding. This study investigates the effects of climate change on extreme AR events in the Salt and Verde river basins in Central Arizona using a pseudo global warming (PGW) method. First, the five most extreme events that affected the region were selected. High-resolution control simulations of these events using the Weather Research and Forecasting (WRF) model realistically captured the magnitude and spatial distribution of precipitation. Subsequently, following the PGW approach, the WRF initial and lateral boundary conditions were perturbed. The perturbation signals were obtained from an ensemble of 9 general circulation models for two warming scenarios: Representative Concentration Pathway (RCP) 4.5 and RCP 8.5. Several simulations were conducted changing the temperature and relative humidity fields. The PGW simulations reveal that while the overall dynamics of the storms did not change significantly, there was a marked strengthening of the associated integrated vapor transport (IVT) plumes. There was a general increase in precipitation over the basins due to increased moisture availability, but the spatial changes were heterogeneous. Additionally, no significant changes in the strength of the pre-cold-frontal low-level jet were observed in the future simulations.

  3. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    NASA Astrophysics Data System (ADS)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies a pre-determined termination criterion. The effectiveness of the proposed approach is illustrated by an example, which yields results closer to optimal, with much shorter solving times, than those obtained from a conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing a real supply chain network. It also provides the possibility to model and solve more realistic problems that incorporate dynamism and uncertainty.
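The iterate-until-converged scheme can be sketched generically; the two callables below are user-supplied stand-ins for the analytical optimization and the discrete-event simulation, and the scalar decision variable is an illustrative simplification:

```python
def hybrid_optimize(analytic_solve, simulate, x0, tol=1e-6, max_iter=100):
    """Alternate between a simulation that estimates uncertain parameters
    for the current design and an analytic model re-optimized with those
    parameters, stopping when successive solutions differ by less than tol."""
    x = x0
    for _ in range(max_iter):
        params = simulate(x)            # simulation evaluates the current design
        x_new = analytic_solve(params)  # analytic model re-optimized
        if abs(x_new - x) < tol:        # pre-determined termination criterion
            return x_new
        x = x_new
    return x
```

For example, with the toy pair `analytic_solve = lambda p: p / 2.0` and `simulate = lambda x: x + 1.0`, the loop converges to the fixed point x = 1.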

  4. Joint independent component analysis for simultaneous EEG-fMRI: principle and simulation.

    PubMed

    Moosmann, Matthias; Eichele, Tom; Nordby, Helge; Hugdahl, Kenneth; Calhoun, Vince D

    2008-03-01

    An optimized scheme for the fusion of electroencephalography and event related potentials with functional magnetic resonance imaging (BOLD-fMRI) data should simultaneously assess all available electrophysiologic and hemodynamic information in a common data space. In doing so, it should be possible to identify features of latent neural sources whose trial-to-trial dynamics are jointly reflected in both modalities. We present a joint independent component analysis (jICA) model for analysis of simultaneous single trial EEG-fMRI measurements from multiple subjects. We outline the general idea underlying the jICA approach and present results from simulated data under realistic noise conditions. Our results indicate that this approach is a feasible and physiologically plausible data-driven way to achieve spatiotemporal mapping of event related responses in the human brain.

  5. Synthetic earthquake catalogs simulating seismic activity in the Corinth Gulf, Greece, fault system

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Carluccio, Roberto; Papadimitriou, Eleftheria; Karakostas, Vassilis

    2015-01-01

    The characteristic earthquake hypothesis is the basis of time-dependent modeling of earthquake recurrence on major faults. However, the hypothesis is not strongly supported by observational data: few fault segments have long historical or paleoseismic records of individually dated ruptures, and when data and parameter uncertainties are allowed for, the form of the recurrence distribution is difficult to establish. This is the case, for instance, for the Corinth Gulf Fault System (CGFS), for which documentary records of strong earthquakes exist for at least 2,000 years, although they can be considered complete for M ≥ 6.0 only for the latest 300 years, during which only a few characteristic earthquakes are reported for individual fault segments. The use of a physics-based earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 500,000 events of magnitude ≥ 4.0. The main features of our simulation algorithm are (1) an average slip rate released by earthquakes for every segment in the investigated fault system, (2) heuristic procedures for rupture growth and arrest, leading to a self-organized earthquake magnitude distribution, (3) interaction between earthquake sources, and (4) the effect of minor earthquakes in redistributing stress. The application of our simulation algorithm to the CGFS has shown realistic features in the time, space, and magnitude behavior of the seismicity, including long-term periodicity of strong earthquakes, short-term clustering of both strong and smaller events, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the higher-magnitude range.

  6. A predictive model of nuclear power plant crew decision-making and performance in a dynamic simulation environment

    NASA Astrophysics Data System (ADS)

    Coyne, Kevin Anthony

    The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. 
The ADS-IDAC computer code was improved to support additional branching events and provide a better representation of the IDAC cognitive model. An operator decision-making engine capable of responding to dynamic changes in situational context was implemented. The IDAC human performance model was fully integrated with a detailed nuclear plant model in order to realistically simulate plant accident scenarios. Finally, the improved ADS-IDAC model was calibrated, validated, and updated using actual nuclear plant crew performance data. This research led to the following general conclusions: (1) A relatively small number of branching rules can efficiently capture a wide spectrum of crew-to-crew variability. (2) Compared to traditional static risk assessment methods, ADS-IDAC can provide a more realistic and integrated assessment of human error events by directly determining the effect of operator behaviors on plant thermal-hydraulic parameters. (3) The ADS-IDAC approach provides an efficient framework for capturing actual operator performance data such as the timing of operator actions, mental models, and decision-making activities.

  7. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and cluster analysis. We quantify the observed qualitative audit event sequences via quantification method IV and group similar sequences together using cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in realistic cases where normal and attack activities are intermingled.
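As a rough illustration of the pipeline (quantify qualitative event sequences, then cluster them), the sketch below substitutes classical multidimensional scaling of a similarity matrix for quantification method IV, which is closely related, and a small hand-rolled k-means for the cluster analysis. The audit sequences are invented; this is not the paper's data or exact method.

```python
import numpy as np

normal = [["login", "read", "logout"], ["login", "write", "logout"],
          ["login", "read", "write", "logout"], ["login", "read", "logout"]]
attack = [["login", "scan", "exploit"], ["scan", "exploit", "escalate"],
          ["login", "scan", "escalate"], ["scan", "exploit", "escalate"]]
seqs = normal + attack

vocab = sorted({e for s in seqs for e in s})
V = np.array([[s.count(e) for e in vocab] for s in seqs], dtype=float)

# Quantification step: double-center the similarity matrix and keep the
# top-2 axes (classical scaling, standing in for quantification method IV)
Sim = V @ V.T
n = len(seqs)
H = np.eye(n) - np.ones((n, n)) / n
w, E = np.linalg.eigh(H @ Sim @ H)
coords = E[:, -2:] * np.sqrt(np.clip(w[-2:], 0, None))

# Clustering step: two-cluster Lloyd's k-means with deterministic seeds
centers = np.stack([coords[0], coords[-1]])
for _ in range(20):
    labels = np.argmin(((coords[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.stack([coords[labels == k].mean(axis=0)
                        if np.any(labels == k) else centers[k] for k in (0, 1)])

print(labels)
```

With clearly distinct activity vocabularies, the normal sequences fall into one group and the attack sequences into the other, which is the intermingled-activity separation the abstract targets.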

  8. Simulation of monsoon intraseasonal oscillations in a coarse-resolution aquaplanet GCM

    NASA Astrophysics Data System (ADS)

    Ajayamohan, R. S.; Khouider, Boualem; Majda, Andrew J.

    2014-08-01

    The ability of global climate models (GCMs) to realistically simulate monsoon intraseasonal oscillations (MISOs) is related to the sensitivity of their convective parameterization schemes. Here we show that by coupling a simple multicloud parameterization to a coarse-resolution aquaplanet GCM, realistic MISOs can be simulated. We conduct three simulations with a fixed nonhomogeneous sea surface temperature mimicking the Indian Ocean/western Pacific warm pool (WP), centered at latitudes of 5°N, 10°N, and 15°N, respectively, to replicate the seasonal migration of the Tropical Convergence Zone (TCZ). This generates a mean circulation resembling the monsoonal flow pattern in boreal summer. A succession of eastward-propagating Madden-Julian Oscillation (MJO) disturbances with phase speed, amplitude, and structure similar to summer MJOs is simulated when the WP is at 5°N. When the WP is located over 10°N, northward- and eastward-propagating MISOs are simulated; this case captures the meridional seesaw of convection between the continental and oceanic TCZ observed during boreal summer over South Asia. Westward-propagating Rossby wave-like disturbances are simulated when the WP is over 15°N, consistent with the synoptic disturbances seen over the monsoon trough. The initiation of intraseasonal oscillations in the model can occur internally through the organization of convective events above the WP associated with internal dynamics.

  9. Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.

    2010-12-01

    Fusing the results of simulation tools with statistical analysis methods has contributed to our better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons-Coulomb stick-slip friction law in order to drive the earthquake process by means of a back-slip model, where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. To investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determines whether events on any given fault element show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular "earthquake" along the entire fault length. 
Results are then tabulated and differenced with an expected correlation, calculated by assuming a uniform distribution of events in time. We generate a correlation score matrix, which indicates how weakly or strongly correlated each fault element is to every other over the course of the VC simulation. We calculate correlation scores by summing the difference between the actual and expected correlations over all time window lengths and normalizing by the time window size. The correlation score matrix can focus attention on the most interesting areas for more in-depth analysis of event correlation vs. time. The previous study included 59 faults (639 elements) in the model, comprising all the faults except the creeping section of the San Andreas, and the analysis spanned 40,000 years of Virtual California-generated earthquake data. The newly revised VC model includes 70 faults and 8,720 fault elements and spans 110,000 years. Due to computational considerations, we will evaluate the elements comprising the southern California region, which our previous study indicated showed interesting fault interactions and event triggering/quiescence relationships.
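The correlation score described above can be reproduced on synthetic catalogs. The window lengths, catalog size, and triggering delay below are invented for illustration: element B is deliberately "triggered" a few years after element A, while element C is an independent background.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 40000.0                                    # catalog length (years)
t_a = np.sort(rng.uniform(0, T, 200))
t_b = np.sort(t_a + rng.uniform(0, 5, 200))    # follows A within 5 yr
t_c = np.sort(rng.uniform(0, T, 200))          # uncorrelated background

def actual_corr(src, dst, w):
    """Number of src events followed by at least one dst event within w."""
    idx = np.searchsorted(dst, src)
    ok = (idx < len(dst)) & (dst[np.minimum(idx, len(dst) - 1)] <= src + w)
    return int(np.count_nonzero(ok))

def score(src, dst, windows):
    """Sum (actual - expected)/w over window lengths; the expectation
    assumes events uniformly distributed in time."""
    s = 0.0
    for w in windows:
        expected = len(src) * len(dst) * w / T
        s += (actual_corr(src, dst, w) - expected) / w
    return s

windows = [5.0, 10.0, 20.0, 50.0]
score_ab = score(t_a, t_b, windows)
score_ac = score(t_a, t_c, windows)
print(round(score_ab, 1), round(score_ac, 1))
```

The triggered pair scores far above zero while the independent pair scores near zero, which is exactly the contrast the correlation score matrix is meant to surface.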

  10. Simulation of patient flow in multiple healthcare units using process and data mining techniques for model identification.

    PubMed

    Kovalchuk, Sergey V; Funkner, Anastasia A; Metsker, Oleg G; Yakovlev, Aleksey N

    2018-06-01

    An approach to building a hybrid simulation of patient flow is introduced with a combination of data-driven methods for automation of model identification. The approach is described with a conceptual framework and basic methods for the combination of different techniques. The implementation of the proposed approach for simulation of the acute coronary syndrome (ACS) was developed and used in an experimental study. A combination of data, text, and process mining techniques and machine learning approaches for the analysis of electronic health records (EHRs), with discrete-event simulation (DES) and queueing theory for the simulation of patient flow, was proposed. The performed analysis of EHRs for ACS patients enabled identification of several classes of clinical pathways (CPs), which were used to implement a more realistic simulation of the patient flow. The developed solution was implemented using Python libraries (SimPy, SciPy, and others). The proposed approach enables a more realistic and detailed simulation of the patient flow within a group of related departments. An experimental study shows an improved simulation of patient length of stay for ACS patient flow obtained from EHRs in Almazov National Medical Research Centre in Saint Petersburg, Russia. The proposed approach, methods, and solutions provide a conceptual, methodological, and programming framework for the implementation of a simulation of complex and diverse scenarios within a flow of patients for different purposes: decision making, training, management optimization, and others. Copyright © 2018 Elsevier Inc. All rights reserved.
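The study implemented its patient-flow model with SimPy; the dependency-free sketch below shows the same discrete-event core in pure Python. The two units, bed counts, stay lengths, and arrival rate are invented toy parameters, not values from the paper.

```python
import heapq, random

random.seed(42)

N_PATIENTS = 200
BEDS = {"admission": 2, "treatment": 3}        # limited beds per unit
MEAN_STAY = {"admission": 1.0, "treatment": 2.0}

free = dict(BEDS)
queues = {u: [] for u in BEDS}                 # FIFO queues of patient ids
arrive_t, discharge_t = {}, {}
events, seq = [], 0                            # (time, seq, kind, unit, pid)

def push(t, kind, unit, pid):
    global seq
    heapq.heappush(events, (t, seq, kind, unit, pid)); seq += 1

def start_service(t, unit, pid):
    free[unit] -= 1
    push(t + random.expovariate(1.0 / MEAN_STAY[unit]), "done", unit, pid)

t = 0.0
for pid in range(N_PATIENTS):
    t += random.expovariate(1.0)               # Poisson arrivals, mean gap 1.0
    push(t, "arrive", "admission", pid)
    arrive_t[pid] = t

while events:
    t, _, kind, unit, pid = heapq.heappop(events)
    if kind == "arrive":
        if free[unit] > 0:
            start_service(t, unit, pid)
        else:
            queues[unit].append(pid)           # wait for a bed
    else:                                      # "done": release bed, move on
        free[unit] += 1
        if queues[unit]:
            start_service(t, unit, queues[unit].pop(0))
        if unit == "admission":
            push(t, "arrive", "treatment", pid)
        else:
            discharge_t[pid] = t

los = [discharge_t[p] - arrive_t[p] for p in range(N_PATIENTS)]
mean_los = sum(los) / len(los)
print(round(mean_los, 2))
```

Length of stay emerges from waiting plus service across the chained units; in the paper, mined clinical pathways would replace the fixed admission-to-treatment route used here.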

  11. Modeling flow around bluff bodies and predicting urban dispersion using large eddy simulation.

    PubMed

    Tseng, Yu-Heng; Meneveau, Charles; Parlange, Marc B

    2006-04-15

    Modeling air pollutant transport and dispersion in urban environments is especially challenging due to complex ground topography. In this study, we describe a large eddy simulation (LES) tool, including a new dynamic subgrid closure and boundary treatment, to model urban dispersion problems. The numerical model is developed, validated, and extended to a realistic urban layout. In such applications, fairly coarse grids must be used, in which each building can be represented by only a few grid points. By carrying out LES of flow around a square cylinder and of flow over surface-mounted cubes, the coarsest resolution required to resolve the bluff body's cross section while still producing meaningful results is established. Specifically, we perform grid refinement studies showing that at least 6-8 grid points across the bluff body are required for reasonable results. The performance of several subgrid models is also compared. Although effects of the subgrid models on the mean flow are found to be small, dynamic Lagrangian models give a physically more realistic subgrid-scale (SGS) viscosity field. When scale dependence is taken into consideration, these models lead to more realistic resolved fluctuating velocities and spectra. These results set the minimum grid resolution and subgrid model requirements needed to apply LES in simulations of neutral atmospheric boundary layer flow and scalar transport over a realistic urban geometry. The results also illustrate the advantages of LES over traditional modeling approaches, particularly its ability to take into account the complex boundary details and the unsteady nature of atmospheric boundary layer flow. Thus LES can be used to evaluate probabilities of extreme events (such as probabilities of exceeding threshold pollutant concentrations). Some comments about the computer resources required for LES are also included.

  12. Bayesian imperfect information analysis for clinical recurrent data

    PubMed Central

    Chang, Chih-Kuang; Chang, Chi-Chang

    2015-01-01

    In medical research, clinical practice must often be undertaken with imperfect information from limited resources. This study applied Bayesian imperfect-information value analysis to a clinical decision-making problem for recurrent events, producing likelihood functions and posterior distributions for realistic situations. Three kinds of failure models are considered, and our methods are illustrated with an analysis of imperfect information from a trial of immunotherapy in the treatment of chronic granulomatous disease. In addition, we present evidence toward a better understanding of the differing behaviors along with concomitant variables. Based on simulation results, the imperfect-information value of the concomitant variables was evaluated, and different realistic situations were compared to see which could yield more accurate results for medical decision-making. PMID:25565853
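A hedged sketch of one ingredient of such an analysis (not the paper's model): a conjugate gamma-Poisson update of a recurrent-event rate, showing how an imperfect prior is revised into a posterior that can flip a treatment decision. All numbers and the decision rule are invented.

```python
# Prior belief about the recurrence rate (events per unit follow-up time):
# Gamma(a, b) with mean a/b.
prior_a, prior_b = 2.0, 4.0          # prior mean rate 0.5
n_events, exposure = 9, 6.0          # observed recurrences and follow-up time

# Gamma-Poisson conjugacy: posterior is Gamma(a + n, b + t)
post_a = prior_a + n_events
post_b = prior_b + exposure

prior_mean = prior_a / prior_b       # 0.5
post_mean = post_a / post_b          # 1.1

threshold = 0.8                      # toy rule: treat if rate exceeds this
decision_prior = prior_mean > threshold
decision_post = post_mean > threshold
print(post_mean, decision_prior, decision_post)
```

The observed recurrences shift the expected rate from 0.5 to 1.1 events per unit time, so the decision changes once the imperfect prior information is combined with the data.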

  13. Global Response to Local Ionospheric Mass Ejection

    NASA Technical Reports Server (NTRS)

    Moore, T. E.; Fok, M.-C.; Delcourt, D. C.; Slinker, S. P.; Fedder, J. A.

    2010-01-01

    We revisit a reported "Ionospheric Mass Ejection," using prior event observations to guide a global simulation of local ionospheric outflows, global magnetospheric circulation, and plasma sheet pressurization, and comparing our results with the observed global response. Our simulation framework is based on test-particle motions in the Lyon-Fedder-Mobarry (LFM) global circulation model electromagnetic fields. The inner magnetosphere is simulated with the Comprehensive Ring Current Model (CRCM) of Fok and Wolf, driven by the transpolar potential developed by the LFM magnetosphere, and includes an embedded plasmaspheric simulation. Global circulation is stimulated using the observed solar wind conditions for the period 24-25 September 1998. This period begins with the arrival of a coronal mass ejection, initially with northward, but later with southward, interplanetary magnetic field. Test particles are launched from the ionosphere with fluxes specified by local empirical relationships of outflow to the electrodynamics and particle precipitation imposed by the MHD simulation. Particles are tracked, using the full equations of motion, until they are lost from the system downstream or into the atmosphere. Results are compared with the observed ring current and with a simulation of polar and auroral wind outflows driven globally by solar wind dynamic pressure. We find good quantitative agreement with the observed ring current and reasonable qualitative agreement with earlier simulation results, suggesting that the solar wind-driven global simulation generates realistic energy dissipation in the ionosphere and that the Strangeway relations provide a realistic local outflow description.

  14. Stress response and communication in surgeons undergoing training in endoscopic management of major vessel hemorrhage: a mixed methods study.

    PubMed

    Jukes, Alistair K; Mascarenhas, Annika; Murphy, Jae; Stepan, Lia; Muñoz, Tamara N; Callejas, Claudio A; Valentine, Rowan; Wormald, P J; Psaltis, Alkis J

    2017-06-01

    Major vessel hemorrhage in endoscopic, endonasal skull-base surgery is a rare but potentially fatal event. Surgical simulation models have been developed to train surgeons in the techniques required to manage this complication. This mixed-methods study aims to quantify the stress responses the model induces, determine how realistic the experience is, and how it changes the confidence levels of surgeons in their ability to deal with major vascular injury in an endoscopic setting. Forty consultant surgeons and surgeons in training underwent training on an endoscopic sheep model of jugular vein and carotid artery injury. Pre-course and post-course questionnaires providing demographics, experience level, confidence, and realism scores were taken, based on a 5-point Likert scale. Objective markers of stress response including blood pressure, heart rate, and salivary alpha-amylase levels were measured. Mean "realism" score assessed posttraining showed the model to be perceived as highly realistic by the participants (score 4.02). Difference in participant self-rated pre-course and post-course confidence levels was significant (p < 0.0001): mean pre-course confidence level 1.66 (95% confidence interval [CI], 1.43 to 1.90); mean post-course confidence level 3.42 (95% CI, 3.19 to 3.65). Differences in subjects' heart rates (HRs) and mean arterial blood pressures (MAPs) were significant between injury models (p = 0.0008, p = 0.0387, respectively). No statistically significant difference in salivary alpha-amylase levels pretraining and posttraining was observed. Results from this study indicate that this highly realistic simulation model provides surgeons with an increased level of confidence in their ability to deal with the rare but potentially catastrophic event of major vessel injury in endoscopic skull-base surgery. © 2017 ARS-AAOA, LLC.

  15. MJO-Related Tropical Convection Anomalies Lead to More Accurate Stratospheric Vortex Variability in Subseasonal Forecast Models.

    PubMed

    Garfinkel, C I; Schwartz, C

    2017-10-16

    The effect of the Madden-Julian Oscillation (MJO) on the Northern Hemisphere wintertime stratospheric polar vortex in the period preceding stratospheric sudden warmings is evaluated in operational subseasonal forecasting models. Reforecasts which simulate stronger MJO-related convection in the Tropical West Pacific also simulate enhanced heat flux in the lowermost stratosphere and a more realistic vortex evolution. The time scale on which vortex predictability is enhanced lies between 2 and 4 weeks for nearly all cases. Those stratospheric sudden warmings that were preceded by a strong MJO event are more predictable at ∼20 day leads than stratospheric sudden warmings not preceded by a MJO event. Hence, knowledge of the MJO can contribute to enhanced predictability, at least in a probabilistic sense, of the Northern Hemisphere polar stratosphere.

  16. Event detection and localization for small mobile robots using reservoir computing.

    PubMed

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system, operating at the edge of stability, in which only a linear static readout layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks based solely on a few short-range, high-noise distance sensors. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization simply by processing the input stream of distance sensors. These techniques are demonstrated both in a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
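The RC recipe the abstract summarizes (fixed random recurrent reservoir near the edge of stability, linear readout fit by least squares) can be sketched in a few lines. The task, sizes, and spectral radius below are generic illustrative choices, not the paper's robot setup; the sinusoidal input merely stands in for a sensor stream.

```python
import numpy as np

rng = np.random.default_rng(3)
n_in, n_res, T = 1, 100, 1000

# Fixed random reservoir, rescaled so its spectral radius is 0.9
# (i.e., operating near the edge of stability, as in RC practice)
W_res = rng.standard_normal((n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

u = np.sin(np.arange(T + 1) * 0.2)[:, None]   # toy input stream
target = u[1:, 0]                             # task: one-step-ahead prediction

states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_res @ x + W_in @ u[t])      # reservoir update (untrained)
    states[t] = x

washout = 100                                 # discard initial transient
X, y = states[washout:], target[washout:]
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)  # only the readout is trained
mse = float(np.mean((X @ W_out - y) ** 2))
print(mse)
```

Only `W_out` is learned; the reservoir weights stay fixed, which is what makes training reduce to ordinary linear regression over the collected states.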

  17. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2010-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring. WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed WP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce a realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  18. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2011-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring, WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed NP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce e realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  19. A New Look at Stratospheric Sudden Warmings. Part II: Evaluation of Numerical Model Simulations

    NASA Technical Reports Server (NTRS)

    Charlton, Andrew J.; Polvani, Lorenza M.; Perlwitz, Judith; Sassi, Fabrizio; Manzini, Elisa; Shibata, Kiyotaka; Pawson, Steven; Nielsen, J. Eric; Rind, David

    2007-01-01

    The simulation of major midwinter stratospheric sudden warmings (SSWs) in six stratosphere-resolving general circulation models (GCMs) is examined. The GCMs are compared to a new climatology of SSWs, based on the dynamical characteristics of the events. First, the number, type, and temporal distribution of SSW events are evaluated. Most of the models show a lower frequency of SSW events than the climatology, which has a mean frequency of 6.0 SSWs per decade. Statistical tests show that three of the six models produce significantly fewer SSWs than the climatology, between 1.0 and 2.6 SSWs per decade. Second, four process-based diagnostics are calculated for all of the SSW events in each model. It is found that SSWs in the GCMs compare favorably with the dynamical benchmarks for SSWs established in the first part of the study. These results indicate that GCMs are capable of quite accurately simulating the dynamics required to produce SSWs, but with lower frequency than the climatology. Further dynamical diagnostics hint that, in at least one case, this is due to a lack of meridional heat flux in the lower stratosphere. Even though the SSWs simulated by most GCMs are dynamically realistic when compared to the NCEP-NCAR reanalysis, the reasons for the relative paucity of SSWs in GCMs remain an important and open question.
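A toy example of the kind of frequency test described: is a model's SSW count significantly below the climatological rate of 6.0 per decade? The model count and record length below are illustrative, not the paper's data, and a one-sided Poisson test stands in for whatever test the authors actually used.

```python
import math

clim_rate = 6.0            # SSWs per decade (climatological mean)
decades = 4.0              # length of the model record (invented)
model_count = 8            # SSWs the model produced, i.e., 2.0 per decade

lam = clim_rate * decades  # expected count under the climatological rate

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i)
               for i in range(k + 1))

# One-sided p-value: probability of seeing this few SSWs or fewer
p_value = poisson_cdf(model_count, lam)
print(p_value)
```

A tiny p-value here means the deficit (2.0 vs. 6.0 per decade) is far larger than Poisson counting noise over the record, which is the sense in which a model "produces significantly fewer SSWs than the climatology."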

  20. Integrated Medical Model (IMM) 4.0 Enhanced Functionalities

    NASA Technical Reports Server (NTRS)

    Young, M.; Keenan, A. B.; Saile, L.; Boley, L. A.; Walton, M. E.; Shah, R. V.; Kerstman, E. L.; Myers, J. G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic simulation model that uses input data on 100 medical conditions to simulate expected medical events, the resources required to treat them, and the resulting impact on the mission for specific crew and mission characteristics. The newest development version, IMM v4.0, adds capabilities that remove some of the conservative assumptions underlying the current operational version, IMM v3. While IMM v3 provides the framework to simulate whether a medical event occurred, IMM v4 also simulates when the event occurred during a mission timeline, allowing more accurate estimation of mission time lost and resource utilization. In addition to the mission timeline, IMM v4.0 features two enhancements that address IMM v3 assumptions regarding medical event treatment. Medical events in IMM v3 are assigned the untreated outcome if any resource required to treat the event is unavailable; IMM v4 allows for partially treated outcomes that are proportional to the amount of required resources available, removing this dichotomous treatment assumption. An additional capability of IMM v4 is the use of an alternative medical resource when the primary resource assigned to the condition is depleted, more accurately reflecting the real-world system. The additional capabilities defining IMM v4.0 (the mission timeline, partial treatment, and alternate resources) result in more realistic predicted mission outcomes. The primary model outcomes of IMM v4.0 for the ISS6 mission, including mission time lost, probability of evacuation, and probability of loss of crew life, are compared to those produced by the current operational version of IMM to showcase the enhanced prediction capabilities.
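A hypothetical sketch of the v4.0-style behavior described: events get a time on the mission timeline, treatment is proportional to available resources, and an alternate resource is used when the primary runs out. The condition names, probabilities, and supplies are invented for illustration and are not IMM data.

```python
import random

random.seed(7)

MISSION_DAYS = 180
# condition: (per-day incidence, primary resource, alternate resource, dose)
conditions = {
    "condition_A": (0.05, "drug_A", "drug_B", 1),
    "condition_B": (0.02, "drug_B", None, 2),
}
supply = {"drug_A": 5, "drug_B": 8}

# Place events on the mission timeline
events = []
for day in range(MISSION_DAYS):
    for name, (p, primary, alternate, dose) in conditions.items():
        if random.random() < p:
            events.append((day, name, primary, alternate, dose))

treated_fraction = []
for day, name, primary, alternate, dose in sorted(events):
    # fall back to the alternate resource once the primary is depleted
    resource = primary if supply.get(primary, 0) > 0 else alternate
    if resource is None or supply.get(resource, 0) <= 0:
        treated_fraction.append(0.0)             # untreated outcome
        continue
    used = min(dose, supply[resource])
    supply[resource] -= used
    treated_fraction.append(used / dose)         # partial treatment allowed

mean_treated = sum(treated_fraction) / max(len(treated_fraction), 1)
print(len(events), round(mean_treated, 2))
```

Outcomes here are fractional rather than all-or-nothing, and resource substitution delays the first untreated event, which is the qualitative effect of the two v4.0 enhancements.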

  1. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    NASA Astrophysics Data System (ADS)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher-resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM had been used to demonstrate realistic simulations of intra-seasonal oscillations, including the Madden-Julian oscillation (MJO), but only as case studies. Thanks to the big leap in the computational performance of the K computer, we could greatly increase the number of MJO events simulated, as well as the integration time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with the relatively coarse operational models currently in use. The impacts of the sub-kilometer-resolution simulation and of the multi-decadal simulations using NICAM are also reviewed.

  2. A Prototype Two-Decade Fully-Coupled Fine-Resolution CCSM Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClean, Julie L.; Bader, David C; Bryan, Frank O.

    2011-01-01

    A fully coupled global simulation using the Community Climate System Model (CCSM) was configured with grid resolutions of 0.1° for the ocean and sea ice and 0.25° for the atmosphere and land, and was run under present-day greenhouse gas conditions for 20 years. It represents one of the first efforts to simulate the planetary system at such high horizontal resolution. The climatology of the circulation of the atmosphere and the upper ocean was compared with observational data and reanalysis products to identify persistent mean climate biases. Intensified and contracted polar vortices, and too-cold sea surface temperatures (SSTs) in the subpolar and mid-latitude Northern Hemisphere, were the dominant biases produced by the model. Intense category 4 cyclones formed spontaneously in the tropical North Pacific. A case study of the ocean response to one such event shows the realistic formation of a cold SST wake, mixed-layer deepening, and warming below the mixed layer. Too many tropical cyclones formed in the North Pacific, however, due to too-high SSTs in the tropical eastern Pacific. In the North Atlantic, anomalously low SSTs led to a dearth of hurricanes. Agulhas eddy pathways are more realistic than in equivalent stand-alone ocean simulations forced with atmospheric reanalysis.

  3. On the contributions of diffusion and thermal activation to electron transfer between Phormidium laminosum plastocyanin and cytochrome f: Brownian dynamics simulations with explicit modeling of nonpolar desolvation interactions and electron transfer events.

    PubMed

    Gabdoulline, Razif R; Wade, Rebecca C

    2009-07-08

    The factors that determine the extent to which diffusion and thermal activation processes govern electron transfer (ET) between proteins are debated. The process of ET between plastocyanin (PC) and cytochrome f (CytF) from the cyanobacterium Phormidium laminosum was initially thought to be diffusion-controlled but later was found to be under activation control (Schlarb-Ridley, B. G.; et al. Biochemistry 2005, 44, 6232). Here we describe Brownian dynamics simulations of the diffusional association of PC and CytF, from which ET rates were computed using a detailed model of ET events that was applied to all of the generated protein configurations. The proteins were modeled as rigid bodies represented in atomic detail. In addition to electrostatic forces, which were modeled as in our previous simulations of protein-protein association, the proteins interacted by a nonpolar desolvation (hydrophobic) force whose derivation is described here. The simulations yielded close to realistic residence times of transient protein-protein encounter complexes of up to tens of microseconds. The activation barrier for individual ET events derived from the simulations was positive. Whereas the electrostatic interactions between P. laminosum PC and CytF are weak, simulations for a second cyanobacterial PC-CytF pair, that from Nostoc sp. PCC 7119, revealed ET rates influenced by stronger electrostatic interactions. In both cases, the simulations imply significant contributions to ET from both diffusion and thermal activation processes.

  4. A Computer Model of Drafting Effects on Collective Behavior in Elite 10,000-m Runners.

    PubMed

    Trenchard, Hugh; Renfree, Andrew; Peters, Derek M

    2017-03-01

    Drafting in cycling influences the collective behavior of pelotons. Although evidence for collective behavior in competitive running events exists, it is not clear whether this results from energetic savings conferred by drafting. This study modeled the effects of drafting on behavior in elite 10,000-m runners. Using performance data from a men's elite 10,000-m track running event, computer simulations were constructed using NetLogo 5.1 to test the effects of 3 different drafting quantities on collective behavior: no drafting, drafting up to 3 m behind with up to ~8% energy savings (a realistic running draft), and drafting up to 3 m behind with up to 38% energy savings (a realistic cycling draft). Three measures of collective behavior were analyzed in each condition: mean speed, mean group stretch (distance between the first- and last-placed runner), and the runner-convergence ratio (RCR), which represents the degree of drafting benefit obtained by the follower in a pair of coupled runners. Mean speeds were 6.32 ± 0.28, 5.57 ± 0.18, and 5.51 ± 0.13 m/s in the cycling-draft, runner-draft, and no-draft conditions, respectively (all P < .001). RCR was lower in the cycling-draft condition but did not differ between the other 2. Mean stretch did not differ between conditions. Collective behaviors observed in running events cannot be fully explained by the energetic savings conferred by realistic drafting benefits. They may therefore result from other, possibly psychological, processes. The benefits or otherwise of engaging in such behavior are as yet unclear.
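
    A minimal sketch of how a distance-dependent drafting benefit of this kind might be modeled. The linear decay to zero at 3 m and the function name are assumptions for illustration, not the study's NetLogo implementation:

```python
def drafting_savings(gap_m, max_gap_m=3.0, max_savings=0.08):
    """Fractional energy savings for a follower at gap_m metres behind
    a leader. Savings decay linearly from max_savings at zero gap to
    zero at max_gap_m; max_savings=0.08 mimics the 'realistic running
    draft' condition (0.38 would mimic the cycling draft)."""
    if gap_m < 0 or gap_m > max_gap_m:
        return 0.0
    return max_savings * (1.0 - gap_m / max_gap_m)
```

    A follower halfway back (1.5 m) would then receive half the maximum benefit under this assumed linear profile.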

  5. Evaluation of a low-cost 3D sound system for immersive virtual reality training systems.

    PubMed

    Doerr, Kai-Uwe; Rademacher, Holger; Huesgen, Silke; Kubbat, Wolfgang

    2007-01-01

    Since head-mounted displays (HMDs), datagloves, tracking systems, and powerful computer graphics resources are nowadays in an affordable price range, the use of PC-based "Virtual Training Systems" has become very attractive. However, due to the limited field of view of HMD devices, additional modalities have to be provided to benefit from 3D environments. A 3D sound simulation can improve the capabilities of VR systems dramatically. Unfortunately, realistic 3D sound simulations are expensive and demand a tremendous amount of computational power to calculate reverberation, occlusion, and obstruction effects. To use 3D sound in a PC-based training system as a way to direct and guide trainees to observe specific events in 3D space, a cheaper alternative has to be provided so that a broader range of applications can take advantage of this modality. To address this issue, we focus in this paper on the evaluation of a low-cost 3D sound simulation that is capable of providing traceable 3D sound events. We describe our experimental system setup using conventional stereo headsets in combination with a tracked HMD device and present our results with regard to precision, speed, and the signal types used for localizing simulated sound events in a virtual training environment.

  6. HiL simulation in biomechanics: a new approach for testing total joint replacements.

    PubMed

    Herrmann, Sven; Kaehler, Michael; Souffrant, Robert; Rachholz, Roman; Zierath, János; Kluess, Daniel; Mittelmeier, Wolfram; Woernle, Christoph; Bader, Rainer

    2012-02-01

    Instability of artificial joints is still one of the most prevalent reasons for revision surgery caused by various influencing factors. In order to investigate instability mechanisms such as dislocation under reproducible, physiologically realistic boundary conditions, a novel test approach is introduced by means of a hardware-in-the-loop (HiL) simulation involving a highly flexible mechatronic test system. In this work, the underlying concept and implementation of all required units is presented enabling comparable investigations of different total hip and knee replacements, respectively. The HiL joint simulator consists of two units: a physical setup composed of a six-axes industrial robot and a numerical multibody model running in real-time. Within the multibody model, the anatomical environment of the considered joint is represented such that the soft tissue response is accounted for during an instability event. Hence, the robot loads and moves the real implant components according to the information provided by the multibody model while transferring back the position and resisting moment recorded. Functionality of the simulator is proved by testing the underlying control principles, and verified by reproducing the dislocation process of a standard total hip replacement. HiL simulations provide a new biomechanical testing tool for analyzing different joint replacement systems with respect to their instability behavior under realistic movements and physiological load conditions. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  7. Inferring Lower Boundary Driving Conditions Using Vector Magnetic Field Observations

    NASA Technical Reports Server (NTRS)

    Schuck, Peter W.; Linton, Mark; Leake, James; MacNeice, Peter; Allred, Joel

    2012-01-01

    Low-beta coronal MHD simulations of realistic CME events require the detailed specification of the magnetic fields, velocities, densities, temperatures, etc., in the low corona. Presently, the most accurate estimates of solar vector magnetic fields are made in the high-beta photosphere. Several techniques have been developed that provide accurate estimates of the associated photospheric plasma velocities, such as the Differential Affine Velocity Estimator for Vector Magnetograms and the Poloidal/Toroidal Decomposition. Nominally, these velocities are consistent with the evolution of the radial magnetic field. To evolve the tangential magnetic field, radial gradients must be specified. In addition to estimating the photospheric vector magnetic and velocity fields, a further challenge involves incorporating these fields into an MHD simulation. The simulation boundary must be driven, consistent with the numerical boundary equations, with the goal of accurately reproducing the observed magnetic fields and estimated velocities at some height within the simulation. Even if this goal is achieved, many unanswered questions remain. How can the photospheric magnetic fields and velocities be propagated to the low corona through the transition region? At what cadence must we observe the photosphere to realistically simulate the corona? How do we model the magnetic fields and plasma velocities in the quiet Sun? How sensitive are the solutions to other unknowns that must be specified, such as the global solar magnetic field, and the photospheric temperature and density?
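
    Velocity estimators of this kind are built around consistency with the normal (radial) component of the ideal induction equation; a standard form of that constraint, paraphrased from the general literature rather than quoted from this abstract, is:

```latex
\frac{\partial B_n}{\partial t} + \nabla_h \cdot \left( B_n \mathbf{V}_h - V_n \mathbf{B}_h \right) = 0
```

    Here \(B_n, V_n\) are the normal components and \(\mathbf{B}_h, \mathbf{V}_h\) the tangential (horizontal) components of the magnetic and velocity fields; the tangential field evolution additionally requires the radial gradients mentioned above.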

  8. Fabrication of An Inexpensive but Effective Colonoscopic Simulator.

    PubMed

    Jones, Mark W; Deere, Matthew J; Harris, Justin R; Chen, Anthony J; Henning, Werner H

    2017-01-01

    Because of increasing requirements for simulator training before actual clinical endoscopies, the demand for realistic, inexpensive endoscopic simulators is increasing. We describe the steps involved in the design and fabrication of an effective and realistic mechanical colonoscopic simulator.

  9. Land surface modeling in convection permitting simulations

    NASA Astrophysics Data System (ADS)

    van Heerwaarden, Chiel; Benedict, Imme

    2017-04-01

    The next generation of weather and climate models permits convection, albeit at a grid spacing that is not sufficient to resolve all details of the clouds. Whereas much attention is being devoted to the correct simulation of convective clouds and associated precipitation, the role of the land surface has received far less interest. In our view, convection-permitting simulations pose a set of problems that need to be solved before accurate weather and climate prediction is possible. The heart of the problem lies in the direct runoff and in the nonlinearity of the surface stress as a function of soil moisture. In coarse-resolution simulations, where convection is not permitted, precipitation that reaches the land surface is uniformly distributed over the grid cell. Subsequently, a fraction of this precipitation is intercepted by vegetation or leaves the grid cell via direct runoff, whereas the remainder infiltrates into the soil. As soon as we move to convection-permitting simulations, this precipitation often falls locally in large amounts. If the same land-surface model is used as in simulations with parameterized convection, this leads to an increase in direct runoff. Furthermore, spatially non-uniform infiltration leads to a very different surface stress when scaled up to the coarse resolution of simulations without convection. Based on large-eddy simulations of realistic convection events on a large domain, this study presents a quantification of the errors made at the land surface in convection-permitting simulations. It compares the magnitude of these errors to those made in the convection itself due to the coarse resolution of the simulation. We find that convection-permitting simulations have less evaporation than simulations with parameterized convection, resulting in an unrealistic drying of the atmosphere. We present solutions to resolve this problem.
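
    The runoff argument can be illustrated with a toy infiltration-capacity (Hortonian) model; the capacity and rainfall numbers below are invented for illustration:

```python
def direct_runoff(rain_per_cell, infil_capacity=10.0):
    """Hortonian direct runoff: rain above the infiltration capacity
    (mm per time step, per sub-cell) runs off instead of entering the soil."""
    return sum(max(r - infil_capacity, 0.0) for r in rain_per_cell)

total = 40.0                           # mm falling over 4 sub-cells
uniform = [total / 4] * 4              # parameterized convection: spread evenly
convective = [total, 0.0, 0.0, 0.0]    # convection-permitting: one local downpour

assert direct_runoff(uniform) == 0.0      # 10 mm per cell all infiltrates
assert direct_runoff(convective) == 30.0  # 40 - 10 mm runs off locally
```

    The same grid-cell total thus produces very different runoff (and hence soil moisture and surface stress) depending on how it is distributed, which is the core of the problem described above.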

  10. Super-Eddington Accretion in Tidal Disruption Events: the Impact of Realistic Fallback Rates on Accretion Rates

    NASA Astrophysics Data System (ADS)

    Wu, Samantha; Coughlin, Eric R.; Nixon, Chris

    2018-04-01

    After the tidal disruption of a star by a massive black hole, disrupted stellar debris can fall back to the hole at a rate significantly exceeding its Eddington limit. To understand how black hole mass affects the duration of super-Eddington accretion in tidal disruption events, we first run a suite of simulations of the disruption of a Solar-like star by a supermassive black hole of varying mass to directly measure the fallback rate onto the hole, and we compare these fallback rates to the analytic predictions of the "frozen-in" model. Then, adopting a Zero-Bernoulli Accretion flow as an analytic prescription for the accretion flow around the hole, we investigate how the accretion rate onto the black hole evolves with the more accurate fallback rates calculated from the simulations. We find that numerically-simulated fallback rates yield accretion rates onto the hole that can, depending on the black hole mass, be nearly an order of magnitude larger than those predicted by the frozen-in approximation. Our results place new limits on the maximum black hole mass for which super-Eddington accretion occurs in tidal disruption events.
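
    The "frozen-in" scalings referenced above can be combined into a rough estimate of how long fallback stays super-Eddington; the numerical coefficients below are order-of-magnitude textbook values for a solar-type star, not results from this paper:

```python
import math

def super_eddington_duration(m6, m_star=1.0, eta=0.1):
    """Rough duration (yr) of super-Eddington fallback in the frozen-in
    picture, where Mdot ~ Mdot_peak * (t/t_fb)**(-5/3) for t > t_fb.
    m6 is the black-hole mass in units of 1e6 Msun; coefficients are
    illustrative."""
    t_fb = 0.11 * math.sqrt(m6)            # yr; ~40 d at 1e6 Msun
    mdot_peak = (m_star / 3.0) / t_fb      # Msun/yr; ~1/3 of star falls back
    mdot_edd = 2.2e-2 * m6 * (0.1 / eta)   # Msun/yr; Eddington rate
    if mdot_peak <= mdot_edd:
        return 0.0                         # never super-Eddington
    # Solve mdot_peak * (t/t_fb)**(-5/3) = mdot_edd for t:
    return t_fb * (mdot_peak / mdot_edd) ** 0.6
```

    Because the peak fallback rate falls with black-hole mass while the Eddington rate rises with it, this toy estimate reproduces the qualitative result that sufficiently massive holes never accrete above Eddington.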

  11. Modeling the 2004 Indian Ocean Tsunami for Introductory Physics Students

    NASA Astrophysics Data System (ADS)

    DiLisi, Gregory A.; Rarick, Richard A.

    2006-12-01

    In this paper we develop materials to address student interest in the Indian Ocean tsunami of December 2004. We discuss the physical characteristics of tsunamis and some of the specific data regarding the 2004 event. Finally, we create an easy-to-make tsunami tank to run simulations in the classroom. The simulations exhibit three dramatic signatures of tsunamis, namely, as a tsunami moves into shallow water its amplitude increases, its wavelength and speed decrease, and its leading edge becomes increasingly steep as if to "break" or "crash." Using our tsunami tank, these realistic features were easy to observe in the classroom and evoked an enthusiastic response from our students.
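
    The shoaling signatures described above follow from the long-wave phase speed c = sqrt(gh) and Green's law for amplitude; a minimal sketch:

```python
import math

def shallow_water_speed(depth_m, g=9.81):
    """Long-wave (tsunami) phase speed c = sqrt(g * h); since the wave
    period is conserved, the wavelength shrinks in proportion to c."""
    return math.sqrt(g * depth_m)

def shoaled_amplitude(a0, h0, h1):
    """Green's law: amplitude scales as depth**(-1/4), so the wave grows
    as it moves from depth h0 into shallower depth h1."""
    return a0 * (h0 / h1) ** 0.25

c_deep = shallow_water_speed(4000.0)             # ~198 m/s in the open ocean
a_coast = shoaled_amplitude(1.0, 4000.0, 10.0)   # ~4.5x growth near shore
```

    A 1 m open-ocean wave over 4000 m of water thus slows dramatically and roughly quadruples in height by 10 m depth, matching the classroom observations.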

  12. A formal language for the specification and verification of synchronous and asynchronous circuits

    NASA Technical Reports Server (NTRS)

    Russinoff, David M.

    1993-01-01

    A formal hardware description language for the intended application of verifiable asynchronous communication is described. The language is developed within the logical framework of the Nqthm system of Boyer and Moore and is based on the event-driven behavioral model of VHDL, including the basic VHDL signal propagation mechanisms, the notion of simulation deltas, and the VHDL simulation cycle. A core subset of the language corresponds closely with a subset of VHDL and is adequate for the realistic gate-level modeling of both combinational and sequential circuits. Various extensions to this subset provide means for convenient expression of behavioral circuit specifications.

  13. Virtual reality in urban water management: communicating urban flooding with particle-based CFD simulations.

    PubMed

    Winkler, Daniel; Zischg, Jonatan; Rauch, Wolfgang

    2018-01-01

    For communicating urban flood risk to authorities and the public, a realistic three-dimensional visual display is frequently more suitable than detailed flood maps. Virtual reality could also serve to plan short-term flooding interventions. We introduce here an alternative approach for simulating three-dimensional flooding dynamics in large- and small-scale urban scenes by reaching out to computer graphics. This approach, denoted 'particle in cell', is a particle-based CFD method that is used to predict physically plausible results instead of accurate flow dynamics. We exemplify the approach for the real flooding event in July 2016 in Innsbruck.

  14. ENSO Diversity Changes Due To Global Warming In CESM-LE

    NASA Astrophysics Data System (ADS)

    Carreric, A.; Dewitte, B.; Guemas, V.

    2017-12-01

    The El Niño Southern Oscillation (ENSO) is projected to change under global warming, based on the CMIP3 and CMIP5 databases. In particular, the frequency of occurrence of extreme Eastern Pacific El Niño events is projected to double in the future in response to the increase in greenhouse gases. Such projections rely, however, on state-of-the-art models that still exhibit mean-state biases and do not realistically simulate key features of El Niño, such as its diversity, which reflects the existence of at least two types of events: the Eastern Pacific (EP) El Niño and the Central Pacific (CP) El Niño. Here we take advantage of the Community Earth System Model (CESM) Large Ensemble (LE), which provides 35 realizations of the climate of the 1920-2100 period under a combination of natural and anthropogenic climate forcing factors, to explore, on the one hand, methods for detecting changes in ENSO statistics and, on the other hand, changes in thermodynamical processes associated with the increased oceanic stratification owing to global warming. The CESM simulates realistically many aspects of ENSO diversity, in particular the non-linear evolution of the phase space of the first two EOF modes of Sea Surface Temperature (SST) anomalies in the tropical Pacific. Based on indices accounting for the two ENSO regimes used in the literature, we show that, although there are no statistically significant (i.e., confidence level > 95%) changes in the occurrence of El Niño types from the present to the future climate, the estimated changes are sensitive to the definition of the ENSO indices used. In particular, the simulated increase in the occurrence of extreme El Niño events can vary by 28% from one method to another.
It is shown that the seasonal evolution of EP El Niño events is modified from the present to the future climate, with in particular a larger occurrence of events peaking in austral summer in the warmer climate compared to events peaking in austral winter. The ENSO non-linearity is also shown to increase, which is interpreted as resulting from the increased stratification, based on the analysis of the control experiment and an estimate of the oceanic mixed-layer heat budget. Implications for understanding the processes associated with ENSO change in a warmer climate are discussed.
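
    A generic way to obtain the first two EOF modes of an SST anomaly field, as referenced above, is via singular value decomposition; this is a standard textbook sketch on synthetic data, not the authors' code:

```python
import numpy as np

def eofs(anom, n_modes=2):
    """EOF analysis of a (time, space) anomaly matrix via SVD.
    Returns spatial patterns, principal-component time series,
    and the fraction of variance explained by each mode."""
    u, s, vt = np.linalg.svd(anom - anom.mean(axis=0), full_matrices=False)
    var_frac = s**2 / np.sum(s**2)
    pcs = u[:, :n_modes] * s[:n_modes]   # time series of each mode
    patterns = vt[:n_modes]              # spatial structures
    return patterns, pcs, var_frac[:n_modes]

rng = np.random.default_rng(0)
sst = rng.standard_normal((120, 64))     # toy (time, space) anomalies
patterns, pcs, var_frac = eofs(sst)
```

    Scatter plots of the first two PCs against each other give the phase-space view of ENSO diversity whose non-linear evolution is discussed in the abstract.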

  15. Quantifying the added value of convection-permitting climate simulations in complex terrain: a systematic evaluation of WRF over the Himalayas

    NASA Astrophysics Data System (ADS)

    Karki, Ramchandra; Hasson, Shabeh ul; Gerlitz, Lars; Schickhoff, Udo; Scholten, Thomas; Böhner, Jürgen

    2017-07-01

    Mesoscale dynamical refinements of global climate models or atmospheric reanalyses have shown their potential to resolve intricate atmospheric processes, their land surface interactions, and subsequently, realistic distributions of climatic fields in complex terrain. Given that such potential is yet to be explored within the central Himalayan region of Nepal, we investigate the skill of the Weather Research and Forecasting (WRF) model with different spatial resolutions in reproducing the spatial, seasonal, and diurnal characteristics of the near-surface air temperature and precipitation, as well as the spatial shifts in the diurnal monsoonal precipitation peak over the Khumbu (Everest), Rolwaling, and adjacent southern areas. Therefore, the ERA-Interim (0.75°) reanalysis has been dynamically refined to 25, 5, and 1 km (D1, D2, and D3) for one complete hydrological year (October 2014-September 2015), using the one-way nested WRF model run with mild nudging and parameterized convection for the outer but explicitly resolved convection for the inner domains. Our results suggest that D3 realistically reproduces the monsoonal precipitation, as compared to its underestimation by D1 but overestimation by D2. All three resolutions, however, overestimate precipitation from the westerly disturbances, owing to the simulation of anomalously high intensities in a few intermittent events. Temperatures are generally reproduced well at all resolutions; however, the winter and pre-monsoon seasons feature a strong cold bias at high elevations, while lower elevations show a simultaneous warm bias. Unlike the higher resolutions, D1 fails to realistically reproduce the regional-scale nocturnal monsoonal peak precipitation observed in the Himalayan foothills and its diurnal shift towards high elevations, whereas D2 resolves these characteristics but exhibits limited skill in reproducing such a peak at the river-valley scale due to the limited representation of the narrow valleys at 5 km resolution.
Nonetheless, featuring a substantial skill over D1 and D2, D3 simulates almost realistic shapes of the seasonal and diurnal precipitation and the peak timings even on valley scales. These findings clearly suggest an added value of the convective-scale resolutions in realistically resolving the topoclimates over the central Himalayas, which in turn allows simulating their interactions with the synoptic-scale weather systems prevailing over high Asia.

  16. Impact of Assimilation on Heavy Rainfall Simulations Using WRF Model: Sensitivity of Assimilation Results to Background Error Statistics

    NASA Astrophysics Data System (ADS)

    Rakesh, V.; Kantharao, B.

    2017-03-01

    Data assimilation is considered one of the most effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing the data assimilation methodology. One of the critical components that determines the amount and propagation of observation information into the analysis is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation impact the simulation of heavy rainfall events over Karnataka, a southern state in India. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall was verified against high-resolution rain-gauge observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation improved the heavy rainfall simulations. Our results showed that the experiment using regional BES outperformed the one using global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically with regional BES than with global BES. These results have important practical implications for the design of forecast platforms supporting decision-making during extreme weather events.
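
    As a minimal sketch of why the background error statistics matter, here is a toy optimal-interpolation analysis step in its generic textbook form (not the WRF-Var implementation); the off-diagonal structure of B controls how far an observation's influence spreads:

```python
import numpy as np

def analysis_update(xb, y, H, B, R):
    """Optimal-interpolation / 3D-Var analysis step:
        x_a = x_b + K (y - H x_b),  K = B H^T (H B H^T + R)^{-1}
    B (background error covariance) weights and spreads the observation
    increment; R is the observation error covariance."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

xb = np.zeros(2)                         # background: two grid points
H = np.array([[1.0, 0.0]])               # observe only point 0
y = np.array([1.0])                      # one observation
B = np.array([[1.0, 0.8], [0.8, 1.0]])   # correlated background errors
R = np.array([[1.0]])
xa = analysis_update(xb, y, H, B, R)     # point 1 is nudged via B's correlation
```

    With these numbers the observed point moves to 0.5 and the unobserved neighbor to 0.4; replacing the 0.8 correlation with 0 would leave the neighbor untouched, which is the sense in which regional versus global BES change the analysis.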

  17. Cloud life cycle investigated via high resolution and full microphysics simulations in the surroundings of Manaus, Central Amazonia

    NASA Astrophysics Data System (ADS)

    Pauliquevis, T.; Gomes, H. B.; Barbosa, H. M.

    2014-12-01

    In this study we evaluate the skill of the WRF model in simulating the observed diurnal cycle of convection in the Amazon basin. Models typically fail to simulate the well-documented cycle of (1) shallow cumulus in the morning, (2) a towering process around noon, and (3) a shallow-to-deep convection transition with rain around 14h (local time). This failure is explained by the typical size of shallow cumulus (~0.5-2.0 km) relative to the coarse resolution of models using convection parameterization (> 20 km). In this study we employed high spatial resolution (Dx = 0.625 km) to reach the shallow cumulus scale. The simulations correspond to a dynamical downscaling of ERA-Interim from 25 to 28 February 2013 with 40 vertical levels, 30-minute outputs, and three nested grids (10 km, 2.5 km, 0.625 km). Improved vegetation (USGS + PROVEG), albedo, and green-vegetation fraction (computed from MODIS-NDVI + the LEAF-2 land surface parameterization), as well as a pseudo-analysis of soil moisture, were used as input data sets, resulting in more realistic precipitation fields when compared to observations in sensitivity tests. Convective parameterization was switched off for the 2.5/0.625 km grids, where cloud formation was resolved solely by the microphysics module (the WSM6 scheme, which provided the better results). The results showed a significantly improved capability of the model to simulate the diurnal cycle. Shallow cumulus begin to appear in the first hours of the morning, followed by a towering process that culminates in precipitation in the early afternoon, a behavior well described by observations but rarely obtained in models. Rain volumes were also realistic (~20 mm for single events) when compared to typical events during the period, which is in the core of the wet season. The evolution of the cloud fields also differed with respect to the Amazon River bank, a clear signature of the interaction between the river breeze and the large-scale circulation.

  18. Predicting Pilot Performance in Off-Nominal Conditions: A Meta-Analysis and Model Validation

    NASA Technical Reports Server (NTRS)

    Wickens, C.D.; Hooey, B.L.; Gore, B.F.; Sebok, A.; Koenecke, C.; Salud, E.

    2009-01-01

    Pilot response to off-nominal (very rare) events represents a critical component to understanding the safety of next generation airspace technology and procedures. We describe a meta-analysis designed to integrate the existing data regarding pilot accuracy of detecting rare, unexpected events such as runway incursions in realistic flight simulations. Thirty-five studies were identified and pilot responses were categorized by expectancy, event location, and whether the pilot was flying with a highway-in-the-sky display. All three dichotomies produced large, significant effects on event miss rate. A model of human attention and noticing, N-SEEV, was then used to predict event noticing performance as a function of event salience and expectancy, and retinal eccentricity. Eccentricity is predicted from steady state scanning by the SEEV model of attention allocation. The model was used to predict miss rates for the expectancy, location and highway-in-the-sky (HITS) effects identified in the meta-analysis. The correlation between model-predicted results and data from the meta-analysis was 0.72.
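
    The SEEV component allocates attention across areas of interest from Salience, Effort, Expectancy, and Value; a schematic additive form is sketched below. The additive weighting and all coefficients here are illustrative assumptions, not the calibrated N-SEEV parameters from this work:

```python
def seev_weight(salience, effort, expectancy, value,
                s=1.0, ef=1.0, ex=1.0, v=1.0):
    """Schematic additive SEEV attention weight: salience, expectancy,
    and value attract attention, while access effort deters it.
    Coefficients (s, ef, ex, v) are illustrative placeholders."""
    return s * salience - ef * effort + ex * expectancy + v * value

def allocation(aois):
    """Normalize non-negative weights across areas of interest (AOIs)
    into dwell probabilities for a steady-state scan."""
    w = [max(seev_weight(*a), 0.0) for a in aois]
    total = sum(w)
    return [wi / total for wi in w] if total > 0 else w

# Two hypothetical AOIs: (salience, effort, expectancy, value)
probs = allocation([(1.0, 0.0, 1.0, 1.0), (0.2, 0.5, 0.2, 0.2)])
```

    A rarely attended AOI then has high eccentricity relative to the current fixation, which in N-SEEV feeds the miss-rate prediction for unexpected events appearing there.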

  19. Characteristics of atmospheric circulation patterns associated with extreme temperatures over North America in observations and climate models

    NASA Astrophysics Data System (ADS)

    Loikith, Paul C.

    Motivated by a desire to understand the physical mechanisms involved in future anthropogenic changes in extreme temperature events, the key atmospheric circulation patterns associated with extreme daily temperatures over North America in the current climate are identified. Several novel metrics are used to systematically identify and describe these patterns for the entire continent. The orientation, physical characteristics, and spatial scale of these circulation patterns vary based on latitude, season, and proximity to important geographic features (e.g., mountains, coastlines). The anomaly patterns associated with extreme cold events tend to be similar to, but opposite in sign of, those associated with extreme warm events, especially within the westerlies, and tend to scale with temperature in the same locations. The influence of the Pacific North American (PNA) pattern, the Northern Annular Mode (NAM), and the El Niño-Southern Oscillation (ENSO) on extreme temperature days and months shows that associations between extreme temperatures and the PNA and NAM are stronger than associations with ENSO. In general, the association with extremes tends to be stronger on monthly than daily time scales. Extreme temperatures are associated with the PNA and NAM in locations typically influenced by these circulation patterns; however, many extremes still occur on days when the amplitude and polarity of these patterns do not favor their occurrence. In winter, synoptic-scale, transient weather disturbances are important drivers of extreme temperature days; however, these smaller-scale events are often concurrent with amplified PNA or NAM patterns. Associations are weaker in summer, when other physical mechanisms affecting the surface energy balance, such as anomalous soil moisture content, are associated with extreme temperatures.
Analysis of historical runs from seventeen climate models from the CMIP5 database suggests that most models simulate realistic circulation patterns associated with extreme temperature days in most places. Model-simulated patterns tend to resemble observed patterns better in the winter than the summer and at 500 hPa than at the surface. There is substantial variability among the suite of models analyzed and most models simulate circulation patterns more realistically away from influential features such as large bodies of water and complex topography.

  20. Simulation for ward processes of surgical care.

    PubMed

    Pucher, Philip H; Darzi, Ara; Aggarwal, Rajesh

    2013-07-01

    The role of simulation in surgical education, initially confined to technical skills and procedural tasks, increasingly includes training nontechnical skills including communication, crisis management, and teamwork. Research suggests that many preventable adverse events can be attributed to nontechnical error occurring within a ward context. Ward rounds represent the primary point of interaction between patient and physician but take place without formalized training or assessment. The simulated ward should provide an environment in which processes of perioperative care can be performed safely and realistically, allowing multidisciplinary assessment and training of full ward rounds. We review existing literature and describe our experience in setting up our ward simulator. We examine the facilities, equipment, cost, and personnel required for establishing a surgical ward simulator and consider the scenario development, assessment, and feedback tools necessary to integrate it into a surgical curriculum. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Modeling of long range frequency sweeping for energetic particle modes

    NASA Astrophysics Data System (ADS)

    Nyqvist, R. M.; Breizman, B. N.

    2013-04-01

    Long range frequency sweeping events are simulated numerically within a one-dimensional, electrostatic bump-on-tail model with fast particle sources and collisions. The numerical solution accounts for fast particle trapping and detrapping in an evolving wave field with a fixed wavelength, and it includes three distinct collision operators: drag (dynamical friction on the background electrons), Krook-type collisions, and velocity-space diffusion. The effects of particle trapping and diffusion on the evolution of holes and clumps are investigated, and the occurrence of non-monotonic (hooked) frequency sweeping and asymptotically steady holes is discussed. The presented solution constitutes a step towards predictive modeling of frequency sweeping events in more realistic geometries.

  2. Reconstruction of p̄p events in PANDA

    NASA Astrophysics Data System (ADS)

    Spataro, S.

    2012-08-01

    The PANDA experiment will study antiproton-proton and antiproton-nucleus collisions in the HESR complex of the FAIR facility, in a beam momentum range from 2 GeV/c up to 15 GeV/c. In preparation for the experiment, a software framework based on ROOT (PandaRoot) is being developed for the simulation, reconstruction, and analysis of physics events, running also on a GRID infrastructure. Detailed geometry descriptions and different realistic reconstruction algorithms are implemented and currently used for the realization of the Technical Design Reports. This contribution reports on the reconstruction capabilities of the PANDA spectrometer, focusing mainly on the performance of the tracking system and the results of the analysis of physics benchmark channels.

  3. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
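
    The combination described here, event-tree branching embedded in Monte Carlo simulation so that each scenario accrues a likelihood from its branches, can be sketched as follows. The scenario structure and branch probabilities are invented for illustration and are not taken from the OBEST study:

```python
import random
from collections import Counter

def run_scenario(rng):
    """One Monte Carlo walk through a tiny, hypothetical runway-incursion
    event tree; each random branch contributes to scenario likelihood."""
    path = []
    if rng.random() < 0.05:            # aircraft enters runway uncleared
        path.append("incursion")
        if rng.random() < 0.7:         # controller detects in time
            path.append("detected")
        else:
            path.append("undetected")
    else:
        path.append("normal_ops")
    return tuple(path)

rng = random.Random(42)
counts = Counter(run_scenario(rng) for _ in range(100_000))
# The empirical frequency of each path estimates its likelihood,
# e.g. ('incursion', 'undetected') should approach 0.05 * 0.3 = 1.5%.
```

    In OBEST proper the branching lives inside object definitions rather than a single function, which is what removes the need to enumerate event orderings a priori.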

  4. The Role of Temporal Evolution in Modeling Atmospheric Emissions from Tropical Fires

    NASA Technical Reports Server (NTRS)

    Marlier, Miriam E.; Voulgarakis, Apostolos; Shindell, Drew T.; Faluvegi, Gregory S.; Henry, Candise L.; Randerson, James T.

    2014-01-01

    Fire emissions associated with tropical land use change and maintenance influence atmospheric composition, air quality, and climate. In this study, we explore the effects of representing fire emissions at daily versus monthly resolution in a global composition-climate model. We find that simulations of aerosols are impacted more by the temporal resolution of fire emissions than trace gases such as carbon monoxide or ozone. Daily-resolved datasets concentrate emissions from fire events over shorter time periods and allow them to more realistically interact with model meteorology, reducing how often emissions are concurrently released with precipitation events and in turn increasing peak aerosol concentrations. The magnitude of this effect varies across tropical ecosystem types, ranging from smaller changes in modeling the low intensity, frequent burning typical of savanna ecosystems to larger differences when modeling the short-term, intense fires that characterize deforestation events. The utility of modeling fire emissions at a daily resolution also depends on the application, such as modeling exceedances of particulate matter concentrations over air quality guidelines or simulating regional atmospheric heating patterns.

  5. Markov state modeling of sliding friction

    NASA Astrophysics Data System (ADS)

    Pellegrini, F.; Landes, François P.; Laio, A.; Prestipino, S.; Tosatti, E.

    2016-11-01

    Markov state modeling (MSM) has recently emerged as one of the key techniques for the discovery of collective variables and the analysis of rare events in molecular simulations. In biochemistry in particular, this approach is successfully exploited to find the metastable states of complex systems and their evolution in thermal equilibrium, including rare events such as a protein undergoing folding. The physics of sliding friction and its atomistic simulation under external forces constitute a nonequilibrium field where the relevant variables are in principle unknown and where a proper theory describing violent and rare events such as stick-slip is still lacking. Here we show that MSM can be extended to the study of nonequilibrium phenomena, and in particular friction. The approach is benchmarked on the Frenkel-Kontorova model, used here as a test system whose properties are well established. We demonstrate that the method allows the least prejudiced identification of a minimal basis of natural microscopic variables necessary for the description of the forced dynamics of sliding, through their probabilistic evolution. The steps necessary for the application to realistic frictional systems are highlighted.
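    The core MSM step underlying such an analysis — counting transitions between discretized microstates at a fixed lag time and row-normalizing — can be sketched as follows (function name and toy trajectory are illustrative, not from the paper):

    ```python
    def estimate_transition_matrix(traj, n_states, lag=1):
        """Row-stochastic transition matrix estimated by counting observed
        state-to-state transitions at a fixed lag and normalizing each row."""
        counts = [[0] * n_states for _ in range(n_states)]
        for a, b in zip(traj, traj[lag:]):
            counts[a][b] += 1
        T = []
        for row in counts:
            total = sum(row)
            T.append([c / total if total else 0.0 for c in row])
        return T

    # Toy two-state trajectory (e.g. "stick" = 0, "slip" = 1 microstates).
    traj = [0, 0, 0, 1, 1, 1, 1, 0, 0, 1, 1, 0, 0, 0, 1]
    T = estimate_transition_matrix(traj, n_states=2)
    ```

    The eigenvectors of such a matrix are what MSM analyses use to identify slow collective variables; for a nonequilibrium system like forced sliding, the matrix is generally not reversible.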

  6. A task-related and resting state realistic fMRI simulator for fMRI data validation

    NASA Astrophysics Data System (ADS)

    Hill, Jason E.; Liu, Xiangyu; Nutter, Brian; Mitra, Sunanda

    2017-02-01

    After more than 25 years of published functional magnetic resonance imaging (fMRI) studies, careful scrutiny reveals that most of the reported results lack fully decisive validation. The complex nature of fMRI data generation and acquisition results in unavoidable uncertainties in the estimation and interpretation of both task-related activation maps and resting-state functional connectivity networks, despite the use of various statistical data analysis methodologies. The goal of developing the proposed STANCE (Spontaneous and Task-related Activation of Neuronally Correlated Events) simulator is to generate realistic task-related and/or resting-state 4D blood oxygenation level dependent (BOLD) signals, given the experimental paradigm and scan protocol, by using digital phantoms of twenty normal brains available from BrainWeb (http://brainweb.bic.mni.mcgill.ca/brainweb/). The proposed simulator will include estimated system noise and modelled physiological noise, as well as motion, to serve as a reference for measured brain activities. In its current form, STANCE is a MATLAB toolbox with command line functions serving as an open-source add-on to SPM8 (http://www.fil.ion.ucl.ac.uk/spm/software/spm8/). The STANCE simulator has been designed in a modular framework so that the hemodynamic response (HR) and the various noise models can be iteratively improved to incorporate evolving knowledge about such models.
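    The basic task-related signal generation at the heart of such a simulator — convolving a stimulus train with a canonical hemodynamic response — can be sketched as follows. This is a simplified stand-in, not STANCE's actual implementation; the double-gamma parameters are the conventional SPM-style defaults, used here only for illustration.

    ```python
    import math

    def double_gamma_hrf(t, a1=6.0, a2=16.0, ratio=1.0 / 6.0):
        """Canonical double-gamma hemodynamic response: a peak gamma
        minus a scaled undershoot gamma."""
        if t <= 0:
            return 0.0
        gpdf = lambda t, a: t ** (a - 1) * math.exp(-t) / math.gamma(a)
        return gpdf(t, a1) - ratio * gpdf(t, a2)

    def bold_signal(stimulus, dt):
        """Noise-free task-related BOLD time course: discrete convolution
        of a binary stimulus train with the sampled HRF (32 s support)."""
        hrf = [double_gamma_hrf(i * dt) for i in range(int(32 / dt))]
        out = [0.0] * len(stimulus)
        for i, s in enumerate(stimulus):
            if s:
                for j, h in enumerate(hrf):
                    if i + j < len(out):
                        out[i + j] += h * dt
        return out

    stim = [1 if i == 0 else 0 for i in range(40)]  # a single event at t = 0 s
    y = bold_signal(stim, dt=1.0)
    ```

    The response peaks a few seconds after the stimulus and shows the characteristic post-stimulus undershoot; a full simulator layers system and physiological noise, motion, and anatomy on top of this clean time course.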

  7. Classifier for gravitational-wave inspiral signals in nonideal single-detector data

    NASA Astrophysics Data System (ADS)

    Kapadia, S. J.; Dent, T.; Dal Canton, T.

    2017-11-01

    We describe a multivariate classifier for candidate events in a templated search for gravitational-wave (GW) inspiral signals from neutron-star-black-hole (NS-BH) binaries, in data from ground-based detectors where sensitivity is limited by non-Gaussian noise transients. The standard signal-to-noise ratio (SNR) and chi-squared test for inspiral searches use only properties of a single matched filter at the time of an event; instead, we propose a classifier using features derived from a bank of inspiral templates around the time of each event, and also from a search using approximate sine-Gaussian templates. The classifier thus extracts additional information from strain data to discriminate inspiral signals from noise transients. We evaluate a random forest classifier on a set of single-detector events obtained from realistic simulated advanced LIGO data, using simulated NS-BH signals added to the data. The new classifier detects a factor of 1.5-2 more signals at low false positive rates as compared to the standard "reweighted SNR" statistic, and does not require the chi-squared test to be computed. Conversely, if only the SNR and chi-squared values of single-detector events are available, random forest classification performs nearly identically to the reweighted SNR.

  8. Simulation study of pedestrian flow in a station hall during the Spring Festival travel rush

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Qian; Cai, Yun; Zhang, Jianlin; Ma, Qingguo

    2013-05-01

    The Spring Festival is the most important festival in China. How can passengers get home smoothly and quickly during the Spring Festival travel rush, especially when emergencies such as severe winter weather occur? By modifying the social force model, we simulated the pedestrian flow in a station hall. The simulation revealed that casualties occur when panicked passengers try to escape through crowd turbulence. The results suggest that passenger numbers, ticket-checking patterns, baggage volumes, and anxiety all affect the speed of passing through the waiting corridor. Our approach is helpful for understanding the features of crowd movement and can serve to reproduce mass events. It therefore not only provides a realistic model of pedestrian flow but is also important for better emergency-management preparation.
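    The kind of social-force update underlying such simulations can be sketched as follows. This is a simplified Helbing-style model with illustrative parameter values, not the modified model or calibration used in the study: each pedestrian relaxes toward a desired velocity while being exponentially repelled by nearby pedestrians.

    ```python
    import math

    def social_force_step(pos, vel, goal_dir, others, dt=0.1,
                          v0=1.3, tau=0.5, A=2.0, B=0.3):
        """One explicit-Euler update of a single pedestrian: relaxation
        toward the desired velocity v0 * goal_dir, plus exponential
        repulsion from other pedestrians. Parameters are illustrative."""
        # Driving force toward the desired velocity.
        fx = (v0 * goal_dir[0] - vel[0]) / tau
        fy = (v0 * goal_dir[1] - vel[1]) / tau
        # Repulsive forces from the other pedestrians.
        for ox, oy in others:
            dx, dy = pos[0] - ox, pos[1] - oy
            d = math.hypot(dx, dy)
            if d > 1e-9:
                mag = A * math.exp(-d / B)
                fx += mag * dx / d
                fy += mag * dy / d
        vel = (vel[0] + fx * dt, vel[1] + fy * dt)
        pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
        return pos, vel

    # A pedestrian heading +x, with one other pedestrian directly ahead.
    p, v = social_force_step(pos=(0.0, 0.0), vel=(0.0, 0.0),
                             goal_dir=(1.0, 0.0), others=[(0.5, 0.0)])
    ```

    Crowd-level phenomena such as turbulence and clogging at ticket gates emerge when many such agents are updated together in a constrained geometry.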

  9. Fast Low-to-High Confinement Mode Bifurcation Dynamics in a Tokamak Edge Plasma Gyrokinetic Simulation.

    PubMed

    Chang, C S; Ku, S; Tynan, G R; Hager, R; Churchill, R M; Cziegler, I; Greenwald, M; Hubbard, A E; Hughes, J W

    2017-04-28

    Transport barrier formation and its relation to sheared flows in fluids and plasmas are of fundamental interest in various natural and laboratory observations and of critical importance in achieving an economical energy production in a magnetic fusion device. Here we report the first observation of an edge transport barrier formation event in an electrostatic gyrokinetic simulation carried out in a realistic diverted tokamak edge geometry under strong forcing by a high rate of heat deposition. The results show that turbulent Reynolds-stress-driven sheared E×B flows act in concert with neoclassical orbit loss to quench turbulent transport and form a transport barrier just inside the last closed magnetic flux surface.

  10. A continuous analog of run length distributions reflecting accumulated fractionation events.

    PubMed

    Yu, Zhe; Sankoff, David

    2016-11-11

    We propose a new, continuous model of the fractionation process (duplicate gene deletion after polyploidization) on the real line. The aim is to infer how much DNA is deleted at a time, based on segment lengths for alternating deleted (invisible) and undeleted (visible) regions. After deriving a number of analytical results for "one-sided" fractionation, we undertake a series of simulations that help us identify the distribution of segment lengths as a gamma with shape and rate parameters evolving over time. This leads to an inference procedure based on observed length distributions for visible and invisible segments. We suggest extensions of this mathematical and simulation work to biologically realistic discrete models, including two-sided fractionation.
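    The inference step described — recovering the gamma shape and rate parameters from observed segment lengths — can be sketched with a method-of-moments fit. This is a simplified stand-in for the authors' procedure, using synthetic lengths with known parameters:

    ```python
    import random

    def gamma_moment_fit(lengths):
        """Method-of-moments estimates of gamma shape and rate parameters
        (mean = shape/rate, variance = shape/rate**2)."""
        n = len(lengths)
        mean = sum(lengths) / n
        var = sum((x - mean) ** 2 for x in lengths) / n
        return mean ** 2 / var, mean / var

    # Synthetic segment lengths from a known gamma (shape 2, rate 0.5);
    # random.gammavariate takes (shape, scale), with scale = 1/rate.
    random.seed(1)
    sample = [random.gammavariate(2.0, 2.0) for _ in range(20000)]
    shape, rate = gamma_moment_fit(sample)
    ```

    Applied separately to the visible and invisible segment-length distributions, estimates like these are what let the evolving shape and rate parameters be tracked over time.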

  11. A design of hardware haptic interface for gastrointestinal endoscopy simulation.

    PubMed

    Gu, Yunjin; Lee, Doo Yong

    2011-01-01

    Gastrointestinal endoscopy simulations have been developed to train endoscopic procedures, which require hundreds of practice sessions to reach competence. Even though realistic haptic feedback is important for providing a realistic sensation to the user, most previous simulations, including commercial ones, have mainly focused on providing realistic visual feedback. In this paper, we propose a novel design for a portable haptic interface, providing 2-DOF force feedback, for gastrointestinal endoscopy simulation. The haptic interface consists of translational and rotational force-feedback mechanisms, which are completely decoupled, and a gripping mechanism that controls the connection between the endoscope and the force-feedback mechanism.

  12. The "Virtual ChemLab" Project: A Realistic and Sophisticated Simulation of Organic Synthesis and Organic Qualitative Analysis

    ERIC Educational Resources Information Center

    Woodfield, Brian F.; Andrus, Merritt B.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Allen, Rob; Bodily, Greg; Andersen, Tricia; Miller, Jordan; Simmons, Bryon; Stanger, Richard

    2005-01-01

    A set of sophisticated and realistic laboratory simulations, called 'Virtual ChemLab', was created for use in freshman- and sophomore-level chemistry classes and laboratories. A detailed assessment of student responses is provided and the simulation's pedagogical utility is described using the organic simulation.

  13. The May 17, 2012 Solar Event: Back-Tracing Analysis and Flux Reconstruction with PAMELA

    NASA Technical Reports Server (NTRS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; hide

    2016-01-01

    The PAMELA space experiment is providing the first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data from the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  14. MRXCAT: Realistic numerical phantoms for cardiovascular magnetic resonance

    PubMed Central

    2014-01-01

    Background: Computer simulations are important for validating novel image acquisition and reconstruction strategies. In cardiovascular magnetic resonance (CMR), numerical simulations need to combine anatomical information and the effects of cardiac and/or respiratory motion. To this end, a framework for realistic CMR simulations is proposed and its use for image reconstruction from undersampled data is demonstrated. Methods: The extended Cardiac-Torso (XCAT) anatomical phantom framework with various motion options was used as a basis for the numerical phantoms. Different tissue, dynamic contrast and signal models, multiple receiver coils and noise are simulated. Arbitrary trajectories and undersampled acquisition can be selected. The utility of the framework is demonstrated for accelerated cine and first-pass myocardial perfusion imaging using k-t PCA and k-t SPARSE. Results: MRXCAT phantoms allow for realistic simulation of CMR including optional cardiac and respiratory motion. Example reconstructions from simulated undersampled k-t parallel imaging demonstrate the feasibility of simulated acquisition and reconstruction using the presented framework. Myocardial blood flow assessment from simulated myocardial perfusion images highlights the suitability of MRXCAT for quantitative post-processing simulation. Conclusion: The proposed MRXCAT phantom framework enables versatile and realistic simulations of CMR including breathhold and free-breathing acquisitions. PMID:25204441

  15. Stochastic summation of empirical Green's functions

    USGS Publications Warehouse

    Wennerberg, Leif

    1990-01-01

    Two simple strategies are presented that use random delay times for repeatedly summing the record of a relatively small earthquake to simulate the effects of a larger earthquake. The simulations do not assume any fault plane geometry or rupture dynamics, but rely only on the ω−2 spectral model of an earthquake source and elementary notions of source complexity. The strategies simulate ground motions for all frequencies within the bandwidth of the record of the event used as a summand. The first strategy, which introduces the basic ideas, is a single-stage procedure that consists of simply adding many small events with random time delays. The probability distribution for delays has the property that its amplitude spectrum is determined by the ratio of ω−2 spectra, and its phase spectrum is identically zero. A simple expression is given for the computation of this zero-phase scaling distribution. The moment rate function resulting from the single-stage simulation is quite simple and hence is probably not realistic for high-frequency (>1 Hz) ground motion of events larger than ML ∼ 4.5 to 5. The second strategy is a two-stage summation that simulates source complexity with a few random subevent delays determined using the zero-phase scaling distribution, and then clusters energy around these delays to obtain an ω−2 spectrum for the sum. Thus, the two-stage strategy allows simulations of complex events of any size for which the ω−2 spectral model applies. Interestingly, a single-stage simulation with too few ω−2 records to fit an ω−2 large-event target spectrum well yields a record whose spectral asymptotes are consistent with the ω−2 model, but which includes a region in its spectrum, between the corner frequencies of the larger and smaller events, reasonably approximated by a power-law trend.
This spectral feature has also been discussed as reflecting the process of partial stress release (Brune, 1970), an asperity failure (Boatwright, 1984), or the breakdown of ω−2 scaling due to rupture significantly longer than the width of the seismogenic zone (Joyner, 1984).
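    The single-stage strategy can be sketched as follows. For simplicity, this sketch draws the delays uniformly rather than from the zero-phase scaling distribution derived in the paper, and uses a toy record; it shows only the mechanics of summing randomly delayed copies of a small-event seismogram.

    ```python
    import random

    def stochastic_sum(record, n_events, max_delay, dt):
        """Single-stage summation: add n_events copies of a small-event
        record, each shifted by a random delay. Delays here are uniform on
        [0, max_delay] for illustration; in the paper they are drawn from
        the zero-phase scaling distribution fixed by the ratio of the
        large- and small-event omega^-2 spectra."""
        n_delay = int(max_delay / dt)
        out = [0.0] * (len(record) + n_delay)
        for _ in range(n_events):
            shift = random.randint(0, n_delay)
            for i, x in enumerate(record):
                out[shift + i] += x
        return out

    random.seed(0)
    small = [0.0, 1.0, -1.0, 0.5, 0.0]          # toy small-earthquake record
    big = stochastic_sum(small, n_events=50, max_delay=2.0, dt=0.5)
    ```

    Summation is linear, so the simulated record's total area is exactly n_events times that of the summand; what the choice of delay distribution controls is how the spectrum scales between the two events' corner frequencies.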

  16. Comparative Study of the Effectiveness of Three Learning Environments: Hyper-Realistic Virtual Simulations, Traditional Schematic Simulations and Traditional Laboratory

    ERIC Educational Resources Information Center

    Martinez, Guadalupe; Naranjo, Francisco L.; Perez, Angel L.; Suero, Maria Isabel; Pardo, Pedro J.

    2011-01-01

    This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output.…

  17. Overcoming the Time Limitation in Molecular Dynamics Simulation of Crystal Nucleation: A Persistent-Embryo Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yang; Song, Huajing; Zhang, Feng

    The crystal nucleation from liquid in most cases is too rare to be accessed within the limited time scales of the conventional molecular dynamics (MD) simulation. Here, we developed a “persistent embryo” method to facilitate crystal nucleation in MD simulations by preventing small crystal embryos from melting using external spring forces. We applied this method to the pure Ni case for a moderate undercooling where no nucleation can be observed in the conventional MD simulation, and obtained a nucleation rate in good agreement with the experimental data. Moreover, the method is applied to simulate an even more sluggish event: the nucleation of the B2 phase in a strong glass-forming Cu-Zr alloy. The nucleation rate was found to be 8 orders of magnitude smaller than Ni at the same undercooling, which well explains the good glass formability of the alloy. In conclusion, our work opens a new avenue to study solidification under realistic experimental conditions via atomistic computer simulation.

  18. Overcoming the Time Limitation in Molecular Dynamics Simulation of Crystal Nucleation: A Persistent-Embryo Approach

    DOE PAGES

    Sun, Yang; Song, Huajing; Zhang, Feng; ...

    2018-02-23

    The crystal nucleation from liquid in most cases is too rare to be accessed within the limited time scales of the conventional molecular dynamics (MD) simulation. Here, we developed a “persistent embryo” method to facilitate crystal nucleation in MD simulations by preventing small crystal embryos from melting using external spring forces. We applied this method to the pure Ni case for a moderate undercooling where no nucleation can be observed in the conventional MD simulation, and obtained a nucleation rate in good agreement with the experimental data. Moreover, the method is applied to simulate an even more sluggish event: the nucleation of the B2 phase in a strong glass-forming Cu-Zr alloy. The nucleation rate was found to be 8 orders of magnitude smaller than Ni at the same undercooling, which well explains the good glass formability of the alloy. In conclusion, our work opens a new avenue to study solidification under realistic experimental conditions via atomistic computer simulation.

  19. Overcoming the Time Limitation in Molecular Dynamics Simulation of Crystal Nucleation: A Persistent-Embryo Approach

    NASA Astrophysics Data System (ADS)

    Sun, Yang; Song, Huajing; Zhang, Feng; Yang, Lin; Ye, Zhuo; Mendelev, Mikhail I.; Wang, Cai-Zhuang; Ho, Kai-Ming

    2018-02-01

    The crystal nucleation from liquid in most cases is too rare to be accessed within the limited time scales of the conventional molecular dynamics (MD) simulation. Here, we developed a "persistent embryo" method to facilitate crystal nucleation in MD simulations by preventing small crystal embryos from melting using external spring forces. We applied this method to the pure Ni case for a moderate undercooling where no nucleation can be observed in the conventional MD simulation, and obtained a nucleation rate in good agreement with the experimental data. Moreover, the method is applied to simulate an even more sluggish event: the nucleation of the B2 phase in a strong glass-forming Cu-Zr alloy. The nucleation rate was found to be 8 orders of magnitude smaller than Ni at the same undercooling, which well explains the good glass formability of the alloy. Thus, our work opens a new avenue to study solidification under realistic experimental conditions via atomistic computer simulation.

  20. Overcoming the Time Limitation in Molecular Dynamics Simulation of Crystal Nucleation: A Persistent-Embryo Approach.

    PubMed

    Sun, Yang; Song, Huajing; Zhang, Feng; Yang, Lin; Ye, Zhuo; Mendelev, Mikhail I; Wang, Cai-Zhuang; Ho, Kai-Ming

    2018-02-23

    The crystal nucleation from liquid in most cases is too rare to be accessed within the limited time scales of the conventional molecular dynamics (MD) simulation. Here, we developed a "persistent embryo" method to facilitate crystal nucleation in MD simulations by preventing small crystal embryos from melting using external spring forces. We applied this method to the pure Ni case for a moderate undercooling where no nucleation can be observed in the conventional MD simulation, and obtained a nucleation rate in good agreement with the experimental data. Moreover, the method is applied to simulate an even more sluggish event: the nucleation of the B2 phase in a strong glass-forming Cu-Zr alloy. The nucleation rate was found to be 8 orders of magnitude smaller than Ni at the same undercooling, which well explains the good glass formability of the alloy. Thus, our work opens a new avenue to study solidification under realistic experimental conditions via atomistic computer simulation.
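    The essential ingredient of the persistent-embryo method — a harmonic restraint tethering embryo atoms to their crystal sites, weakened as the nucleus grows and released once it is large enough — can be sketched as follows. The linear ramp and all names are illustrative assumptions, not the paper's exact protocol.

    ```python
    def embryo_restraint_forces(positions, anchors, n_crystal, n0, k0):
        """Harmonic spring forces tethering embryo atoms to their ideal
        crystal sites. The spring constant is ramped down as the crystalline
        cluster grows (a linear ramp is assumed here for illustration) and
        vanishes once the cluster reaches the target size n0, so the final
        nucleation step proceeds unbiased."""
        k = max(0.0, k0 * (1.0 - n_crystal / n0))
        forces = [(-k * (x - ax), -k * (y - ay), -k * (z - az))
                  for (x, y, z), (ax, ay, az) in zip(positions, anchors)]
        return forces, k

    # Toy check: one embryo atom displaced 1.0 along x from its anchor site,
    # with the crystalline cluster at half the target size.
    f, k = embryo_restraint_forces([(1.0, 0.0, 0.0)], [(0.0, 0.0, 0.0)],
                                   n_crystal=5, n0=10, k0=2.0)
    ```

    In an actual MD run this force would be added to the interatomic forces at every step, with n_crystal re-measured periodically by a bond-order-style crystallinity criterion.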

  1. Development and validation of an artificial wetlab training system for the lumbar discectomy.

    PubMed

    Adermann, Jens; Geissler, Norman; Bernal, Luis E; Kotzsch, Susanne; Korb, Werner

    2014-09-01

    Initial research indicated that realistic haptic simulators with an adapted training concept are needed to enhance training for spinal surgery. A cognitive task analysis (CTA) was performed to define a realistic and helpful scenario-based simulation, and based on the results a simulator for lumbar discectomy was developed. Additionally, a realistic training operating room was built for a pilot study, and the results were validated. The CTA showed a need for realistic scenario-based training in spine surgery. The developed simulator consists of synthetic bone structures, synthetic soft tissue and an advanced bleeding system. Thanks to close interdisciplinary cooperation between surgeons, engineers and psychologists, the iterative multicentre validation showed that the simulator is visually and haptically realistic. The simulator offers integrated sensors for evaluating the traction and compression applied during surgery. The participating surgeons in the pilot workshop rated the simulator and the training concept as very useful for improving their surgical skills. In the context of the present work, a precise definition of the simulator and training concept was developed. The additional implementation of sensors allows objective evaluation of the surgical training by the trainer. Compared to other training simulators and concepts, this high degree of objectivity strengthens the acceptance of the feedback. The measured data on nerve-root tension and dural compression can be used for intraoperative control and detailed postoperative evaluation.

  2. What can we learn from simulating Stratospheric Sudden Warming periods with the Thermosphere-Ionosphere-Mesosphere-Electrodynamics GCM?

    NASA Astrophysics Data System (ADS)

    Maute, A. I.; Hagan, M. E.; Roble, R. G.; Richmond, A. D.; Yudin, V. A.; Liu, H.; Goncharenko, L. P.; Burns, A. G.; Maruyama, N.

    2013-12-01

    The ionosphere-thermosphere system is influenced not only from geospace but also by meteorological variability. Ionospheric observations of GPS TEC during the current solar cycle have shown that meteorological variability is important during solar minimum, but can also have significant ionospheric effects during moderate to maximum solar conditions. Numerical models can be used to help understand the mechanisms that couple the lower and upper atmosphere over the solar cycle. Numerical modelers invoke different methods to simulate realistic, specified events of meteorological variability, e.g. specifying the lower-boundary forcing, nudging the middle atmosphere, or assimilating data. To study the vertical coupling, we first need to assess the numerical models and the various methods used to simulate realistic events with respect to the dynamics of the mesosphere-lower thermosphere (MLT) region, the electrodynamics, and the ionosphere. This study focuses on Stratospheric Sudden Warming (SSW) periods, since these are associated with a strongly disturbed middle atmosphere whose effects can extend up to the ionosphere. We will use the NCAR Thermosphere-Ionosphere-Mesosphere-Electrodynamics General Circulation Model (TIME-GCM) to examine several recent SSW periods, e.g. 2009, 2012, and 2013. The SSW period in TIME-GCM will be specified in three different ways: 1. using reanalysis data to specify the lower boundary; 2. nudging the neutral atmosphere (temperature and winds) with Whole Atmosphere Community Climate Model (WACCM)/Goddard Earth Observing System Model, Version 5 (GEOS-5) results; 3. nudging the background atmosphere (temperature and winds) with WACCM/GEOS-5 results. The different forcing methods will be evaluated with respect to the dynamics of the MLT region, the low-latitude vertical drift changes, and the ionospheric effects for the different SSW periods. 
With the help of ionospheric data at different longitudinal sectors it will be possible to assess the simulations of the SSW periods and provide guidance for future studies.

  3. Intercomparison of oceanic and atmospheric forced and coupled mesoscale simulations. Part I: Surface fluxes

    NASA Astrophysics Data System (ADS)

    Josse, P.; Caniaux, G.; Giordani, H.; Planton, S.

    1999-04-01

    A mesoscale non-hydrostatic atmospheric model has been coupled with a mesoscale oceanic model. The case study is a four-day simulation of a strong storm event observed during the SEMAPHORE experiment over a 500 × 500 km² domain. This domain encompasses a thermohaline front associated with the Azores current. In order to analyze the effect of mesoscale coupling, three simulations are compared: the first with the atmospheric model forced by realistic sea surface temperature analyses; the second with the ocean model forced by atmospheric fields derived from weather forecast re-analyses; the third with the two models coupled. For all three simulations the surface fluxes were computed with the same bulk parametrization. All three simulations represent well the main oceanic and atmospheric features observed during the storm. Comparison of surface fields with in situ observations reveals that the winds of the fine-mesh atmospheric model are more realistic than those of the weather forecast re-analyses. The low-level winds simulated with the atmospheric model in the forced and coupled simulations are appreciably stronger than the re-analyzed winds, and they also generate stronger fluxes. The coupled simulation has the strongest surface heat fluxes: the difference in the net heat budget with respect to the ocean-forced simulation reaches 50 W m−2 on average over the simulation period. Sea-surface-temperature cooling is too weak in both simulations, but is improved in the coupled run and better matches the cooling observed with drifters. The spatial distributions of sea-surface-temperature cooling and surface fluxes are strongly inhomogeneous over the simulation domain. The amplitude of the flux variation is greatest in the coupled run. 
Moreover the weak correlation between the cooling and heat flux patterns indicates that the surface fluxes are not responsible for the whole cooling and suggests that the response of the ocean mixed layer to the atmosphere is highly non-local and enhanced in the coupled simulation.

  4. Realistic Simulations of Coronagraphic Observations with Future Space Telescopes

    NASA Astrophysics Data System (ADS)

    Rizzo, M. J.; Roberge, A.; Lincowski, A. P.; Zimmerman, N. T.; Juanola-Parramon, R.; Pueyo, L.; Hu, M.; Harness, A.

    2017-11-01

    We present a framework to simulate realistic observations of future space-based coronagraphic instruments. The framework gathers state-of-the-art scientific and instrumental expertise, allowing robust characterization of future instrument concepts.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gisler, Galen R.; Weaver, R. P.; Mader, Charles L.

    Kick-em Jenny, in the Eastern Caribbean, is a submerged volcanic cone that has erupted a dozen or more times since its discovery in 1939. The most likely hazard posed by this volcano is to shipping in the immediate vicinity (through volcanic missiles or loss of buoyancy), but it is of interest to estimate upper limits on tsunamis that might be produced by a catastrophic explosive eruption. To this end, we have performed two-dimensional simulations of such an event in a geometry resembling that of Kick-em Jenny with our SAGE adaptive mesh Eulerian multifluid compressible hydrocode. We use realistic equations of state for air, water, and basalt, and follow the event from the initial explosive eruption, through the generation of a transient water cavity, to the propagation of waves away from the site. We find that even for extremely catastrophic explosive eruptions, tsunamis from Kick-em Jenny are unlikely to pose significant danger to nearby islands. For comparison, we have also performed simulations of explosive eruptions at the much larger shield volcano Vailuluu in the Samoan chain, where the greater energy available can produce a more impressive wave. In general, however, we conclude that explosive eruptions do not couple well to water waves. The waves produced by such events are turbulent and highly dissipative, and do not propagate well. This is consistent with what we have found previously in simulations of asteroid-impact generated tsunamis. Non-explosive events, however, such as landslides or gas hydrate releases, do couple well to waves, and our simulations of tsunamis generated by subaerial and sub-aqueous landslides demonstrate this.

  6. Spontaneous abrupt climate change due to an atmospheric blocking-sea-ice-ocean feedback in an unforced climate model simulation.

    PubMed

    Drijfhout, Sybren; Gleeson, Emily; Dijkstra, Henk A; Livina, Valerie

    2013-12-03

    Abrupt climate change is abundant in geological records, but climate models rarely have been able to simulate such events in response to realistic forcing. Here we report on a spontaneous abrupt cooling event, lasting for more than a century, with a temperature anomaly similar to that of the Little Ice Age. The event was simulated in the preindustrial control run of a high-resolution climate model, without imposing external perturbations. Initial cooling started with a period of enhanced atmospheric blocking over the eastern subpolar gyre. In response, a southward progression of the sea-ice margin occurred, and the sea-level pressure anomaly was locked to the sea-ice margin through thermal forcing. The cold-core high steered more cold air to the area, reinforcing the sea-ice concentration anomaly east of Greenland. The sea-ice surplus was carried southward by ocean currents around the tip of Greenland. South of 70 °N, sea ice already started melting and the associated freshwater anomaly was carried to the Labrador Sea, shutting off deep convection. There, surface waters were exposed longer to atmospheric cooling and sea surface temperature dropped, causing an even larger thermally forced high above the Labrador Sea. In consequence, east of Greenland, anomalous winds changed from north to south, terminating the event with similar abruptness to its onset. Our results imply that only climate models that possess sufficient resolution to correctly represent atmospheric blocking, in combination with a sensitive sea-ice model, are able to simulate this kind of abrupt climate change.

  7. Spontaneous abrupt climate change due to an atmospheric blocking–sea-ice–ocean feedback in an unforced climate model simulation

    PubMed Central

    Drijfhout, Sybren; Gleeson, Emily; Dijkstra, Henk A.; Livina, Valerie

    2013-01-01

    Abrupt climate change is abundant in geological records, but climate models rarely have been able to simulate such events in response to realistic forcing. Here we report on a spontaneous abrupt cooling event, lasting for more than a century, with a temperature anomaly similar to that of the Little Ice Age. The event was simulated in the preindustrial control run of a high-resolution climate model, without imposing external perturbations. Initial cooling started with a period of enhanced atmospheric blocking over the eastern subpolar gyre. In response, a southward progression of the sea-ice margin occurred, and the sea-level pressure anomaly was locked to the sea-ice margin through thermal forcing. The cold-core high steered more cold air to the area, reinforcing the sea-ice concentration anomaly east of Greenland. The sea-ice surplus was carried southward by ocean currents around the tip of Greenland. South of 70°N, sea ice already started melting and the associated freshwater anomaly was carried to the Labrador Sea, shutting off deep convection. There, surface waters were exposed longer to atmospheric cooling and sea surface temperature dropped, causing an even larger thermally forced high above the Labrador Sea. In consequence, east of Greenland, anomalous winds changed from north to south, terminating the event with similar abruptness to its onset. Our results imply that only climate models that possess sufficient resolution to correctly represent atmospheric blocking, in combination with a sensitive sea-ice model, are able to simulate this kind of abrupt climate change. PMID:24248352

  8. The Effects of 3D Computer Simulation on Biology Students' Achievement and Memory Retention

    ERIC Educational Resources Information Center

    Elangovan, Tavasuria; Ismail, Zurida

    2014-01-01

    A quasi experimental study was conducted for six weeks to determine the effectiveness of two different 3D computer simulation based teaching methods, that is, realistic simulation and non-realistic simulation on Form Four Biology students' achievement and memory retention in Perak, Malaysia. A sample of 136 Form Four Biology students in Perak,…

  9. MHD Calculation of halo currents and vessel forces in NSTX VDEs

    NASA Astrophysics Data System (ADS)

    Breslau, J. A.; Strauss, H. R.; Paccagnella, R.

    2012-10-01

    Research tokamaks such as ITER must be designed to tolerate a limited number of disruptions without sustaining significant damage. It is therefore vital to have numerical tools that can accurately predict the effects of these events. The 3D nonlinear extended MHD code M3D [1] can be used to simulate disruptions and calculate the associated wall currents and forces. It has now been validated against halo current data from NSTX experiments in which vertical displacement events (VDEs) were deliberately induced by turning off vertical feedback control. The results of high-resolution numerical simulations at realistic Lundquist numbers show reasonable agreement with the data, supporting a model in which the most dangerously asymmetric currents and heat loads, and the largest horizontal forces, arise in situations where a fast-growing ideal 2,1 external kink mode is destabilized by the scraping-off of flux surfaces with safety factor q>2 during the course of the VDE. [4pt] [1] W. Park, et al., Phys. Plasmas 6 (1999) 1796.

  10. The Evolution of On-Board Emergency Training for the International Space Station Crew

    NASA Technical Reports Server (NTRS)

    LaBuff, Skyler

    2015-01-01

    The crew of the International Space Station (ISS) receives extensive ground training in order to safely and effectively respond to any potential emergency event while on orbit, but few people realize that their training does not conclude when they launch into space. The evolution of the emergency On-Board Training events (OBTs) has recently moved from paper "scripts" to an intranet-based software simulation that allows the crew, as well as the flight control teams in Mission Control Centers across the world, to share in an improved and more realistic training event. This emergency OBT simulator ensures that the participants experience the training event as it unfolds, completely unaware of the type, location, or severity of the simulated emergency until the scenario begins. The crew interfaces with the simulation software via iPads that they keep with them as they translate through the ISS modules, receiving prompts and information as they proceed through the response. Personnel in the control centers bring up the simulation via an intranet browser at their console workstations, and can view additional telemetry signatures in simulated ground displays in order to assist the crew and communicate vital information to them as applicable. The Chief Training Officers and emergency instructors set the simulation in motion, choosing the type of emergency (rapid depressurization, fire, or toxic atmosphere) and specific initial conditions to emphasize the desired training objectives. Project development, testing, and implementation were a collaborative effort between ISS emergency instructors, Chief Training Officers, Flight Directors, and the Crew Office, using commercial off-the-shelf (COTS) hardware along with simulation software created in-house.
Due to the success of the Emergency OBT simulator, the already-developed software has been leveraged and repurposed to develop a new emulator used during fire response ground training to deliver the data that the crew receives from the handheld Compound Specific Analyzer for Combustion Products (CSA-CP). This CSA-CP emulator makes use of a portion of the codebase from the Emergency OBT simulator dealing with atmospheric contamination during fire scenarios, and feeds various data signatures to the crew via an iPod Touch with a flight-like CSA-CP display. These innovative simulations, which make use of COTS hardware with custom in-house software, have yielded drastic improvements to emergency training effectiveness and risk reduction for ISS crew and flight control teams during on-orbit and ground training events.

  11. Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements

    NASA Astrophysics Data System (ADS)

    Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.

    2017-12-01

    Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on the quantitative roles of various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, which are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and non-storm-time acceleration events, respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.
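    The 3D diffusion simulation mentioned above evolves electron phase space density under diffusion coefficients; its radial-transport ingredient can be illustrated with a heavily simplified 1-D explicit step. The grid, the power-law D_LL model, and the fixed boundaries below are illustrative assumptions, not the setup used in the study, which also includes energy and pitch-angle diffusion.

```python
def radial_diffusion_step(psd, L, dt, D_LL):
    """One explicit finite-difference step of 1-D radial diffusion,
    d(psd)/dt = L^2 d/dL [ D_LL(L) / L^2 * d(psd)/dL ],
    holding the two boundary values fixed. Purely illustrative:
    real radiation-belt codes add energy/pitch-angle diffusion,
    source and loss terms, and implicit time stepping."""
    dL = L[1] - L[0]
    new = psd[:]
    for i in range(1, len(L) - 1):
        Lp = 0.5 * (L[i] + L[i + 1])  # interface L-values
        Lm = 0.5 * (L[i] + L[i - 1])
        flux_p = D_LL(Lp) / Lp ** 2 * (psd[i + 1] - psd[i]) / dL
        flux_m = D_LL(Lm) / Lm ** 2 * (psd[i] - psd[i - 1]) / dL
        new[i] = psd[i] + dt * L[i] ** 2 * (flux_p - flux_m) / dL
    return new

# Illustrative grid L = 3..5 and an assumed power-law D_LL.
L_grid = [3.0 + 0.2 * i for i in range(11)]
bump = [0.0] * 11
bump[5] = 1.0  # a localized enhancement in phase space density
smoothed = radial_diffusion_step(bump, L_grid, 0.01, lambda L: 1e-3 * L ** 4)
```

    A single step spreads the localized enhancement toward neighboring L-shells while leaving a uniform profile unchanged, which is the qualitative behavior radial diffusion contributes to the full 3D model.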

  12. Ulysses Observations of Tripolar Guide-Magnetic Field Perturbations Across Solar Wind Reconnection Exhausts

    NASA Astrophysics Data System (ADS)

    Eriksson, S.; Peng, B.; Markidis, S.; Gosling, J. T.; McComas, D. J.; Lapenta, G.; Newman, D. L.

    2014-12-01

    We report observations from 15 solar wind reconnection exhausts encountered along the Ulysses orbit beyond 4 AU in 1996-1999 and 2002-2005. The events, which lasted between 17 and 45 min, were found at heliospheric latitudes between -36° and 21°, with one event detected as high as 58°. All events shared a common characteristic of a tripolar guide-magnetic field perturbation detected across the observed exhausts. The signature consists of an enhanced guide field magnitude within the exhaust center and two regions of significantly depressed guide fields adjacent to the center region. The events displayed magnetic field shear angles as low as 37° with a mean of 89°. This corresponds to a strong external guide field relative to the anti-parallel reconnecting component of the magnetic field, with a mean ratio of 1.3 and a maximum ratio of 3.1. A 2-D kinetic reconnection simulation for realistic solar wind conditions reveals that for such strong guide fields, tripolar guide fields form at current sheets in the presence of multiple X-lines as two magnetic islands interact with one another. The Ulysses observations are also compared with the results of a 3-D kinetic simulation of multiple flux ropes in a strong guide field.

  13. Modeling solar energetic particle events using ENLIL heliosphere simulations

    NASA Astrophysics Data System (ADS)

    Luhmann, J. G.; Mays, M. L.; Odstrcil, D.; Li, Yan; Bain, H.; Lee, C. O.; Galvin, A. B.; Mewaldt, R. A.; Cohen, C. M. S.; Leske, R. A.; Larson, D.; Futaana, Y.

    2017-07-01

    Solar energetic particle (SEP) event modeling has gained renewed attention in part because of the availability of a decade of multipoint measurements from STEREO and L1 spacecraft at 1 AU. These observations are coupled with improving simulations of the geometry and strength of heliospheric shocks obtained by using coronagraph images to send erupted material into realistic solar wind backgrounds. The STEREO and ACE measurements in particular have highlighted the sometimes surprisingly widespread nature of SEP events. It is thus an opportune time for testing SEP models, which typically focus on protons 1-100 MeV, toward both physical insight to these observations and potentially useful space radiation environment forecasting tools. Some approaches emphasize the concept of particle acceleration and propagation from close to the Sun, while others emphasize the local field line connection to a traveling, evolving shock source. Among the latter is the previously introduced SEPMOD treatment, based on the widely accessible and well-exercised WSA-ENLIL-cone model. SEPMOD produces SEP proton time profiles at any location within the ENLIL domain. Here we demonstrate a SEPMOD version that accommodates multiple, concurrent shock sources occurring over periods of several weeks. The results illustrate the importance of considering longer-duration time periods and multiple CME contributions in analyzing, modeling, and forecasting SEP events.

  14. A Physics-Based Vibrotactile Feedback Library for Collision Events.

    PubMed

    Park, Gunhyuk; Choi, Seungmoon

    2017-01-01

    We present PhysVib: a software solution on the mobile platform extending an open-source physics engine in a multi-rate rendering architecture for automatic vibrotactile feedback upon collision events. PhysVib runs concurrently with a physics engine at a low update rate and generates vibrotactile feedback commands at a high update rate based on the simulation results of the physics engine, using an exponentially decaying sinusoidal model. We demonstrate through a user study that this vibration model is more appropriate to our purpose in terms of perceptual quality than more complex models based on sound synthesis. We also evaluated the perceptual performance of PhysVib by comparing eight vibrotactile rendering methods. Experimental results suggested that PhysVib enables more realistic vibrotactile feedback than the other methods as to perceived similarity to the visual events. PhysVib is an effective solution for providing physically plausible vibrotactile responses while greatly reducing application development time.
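    The exponentially decaying sinusoid underlying such feedback commands can be sketched in a few lines; the function name and parameter values below are illustrative assumptions, not PhysVib's actual API.

```python
import math

def collision_vibration(t, amplitude=1.0, decay=30.0, freq_hz=150.0):
    """Impact-style vibrotactile waveform: A * exp(-d*t) * sin(2*pi*f*t).
    Amplitude, decay rate, and frequency are illustrative values; in a
    PhysVib-like system they would be driven by the collision impulse
    reported by the physics engine."""
    return amplitude * math.exp(-decay * t) * math.sin(2 * math.pi * freq_hz * t)

# Sample 100 ms of the waveform at 1 kHz.
samples = [collision_vibration(n / 1000.0) for n in range(100)]
```

    One plausible way to link the physics engine to the vibration command stream is to map stronger collision impulses to a larger amplitude and, optionally, a faster decay.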

  15. Asynchronous discrete event schemes for PDEs

    NASA Astrophysics Data System (ADS)

    Stone, D.; Geiger, S.; Lord, G. J.

    2017-08-01

    A new class of asynchronous discrete-event simulation schemes for advection-diffusion-reaction equations is introduced, based on the principle of allowing quanta of mass to pass through faces of a (regular, structured) Cartesian finite volume grid. The timescales of these events are linked to the flux on the face. The resulting schemes are self-adaptive and local in both time and space. Experiments are performed on realistic physical systems related to porous media flow applications, including a large 3D advection-diffusion equation and advection-diffusion-reaction systems. The results are compared to highly accurate reference solutions where the temporal evolution is computed with exponential integrator schemes using the same finite volume discretisation. This allows a reliable estimation of the solution error. Our results indicate first-order convergence of the error as a control parameter is decreased, and we outline a framework for analysis.
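    The quanta-of-mass idea can be illustrated with a deliberately minimal sketch for pure diffusion on a short 1-D chain of cells. The event-time rule dt = quantum/|flux| and all names here are assumptions for illustration, not the scheme from the paper, and the event management is cruder than a production implementation would allow.

```python
import heapq

def async_diffusion(mass, D=1.0, dx=1.0, quantum=1.0, t_end=1.0):
    """Asynchronous discrete-event sketch: a quantum of mass crosses a
    cell face at a time set by the local diffusive flux
    F = D * (m_i - m_{i+1}) / dx, i.e. after dt = quantum / |F|.
    Stale events are tolerated (a face with no remaining gradient
    simply does nothing when its event fires)."""
    mass = list(mass)
    events = []  # heap of (event_time, face_index)

    def schedule(face, now):
        flux = D * (mass[face] - mass[face + 1]) / dx
        if flux != 0.0:
            heapq.heappush(events, (now + quantum / abs(flux), face))

    t = 0.0
    for face in range(len(mass) - 1):
        schedule(face, t)
    while events:
        t, face = heapq.heappop(events)
        if t > t_end:
            break
        # Move one quantum of mass down-gradient across this face.
        if mass[face] > mass[face + 1]:
            mass[face] -= quantum
            mass[face + 1] += quantum
        elif mass[face] < mass[face + 1]:
            mass[face] += quantum
            mass[face + 1] -= quantum
        # Reschedule this face and its neighbours, whose fluxes changed.
        for f in (face - 1, face, face + 1):
            if 0 <= f < len(mass) - 1:
                schedule(f, t)
    return mass

final = async_diffusion([10.0, 0.0], t_end=5.0)  # relaxes to [5.0, 5.0]
```

    Mass is conserved exactly because transfers happen only in whole quanta, and each face advances on its own timescale, which is the locality-in-time property the abstract highlights.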

  16. Fast low-to-high confinement mode bifurcation dynamics in a tokamak edge plasma gyrokinetic simulation

    DOE PAGES

    Chang, C. S.; Ku, S.; Tynan, G. R.; ...

    2017-04-25

    Transport barrier formation and its relation to sheared flows in fluids and plasmas are of fundamental interest in various natural and laboratory observations and of critical importance in achieving economical energy production in a magnetic fusion device. Here we report the first observation of an edge transport barrier formation event in an electrostatic gyrokinetic simulation carried out in a realistic diverted tokamak edge geometry under strong forcing by a high rate of heat deposition. The results show that turbulent Reynolds-stress-driven sheared E x B flows act in concert with neoclassical orbit loss to quench turbulent transport and form a transport barrier just inside the last closed magnetic flux surface.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, A. J.

    This is the final report for United States Geological Survey (USGS) National Earthquake Hazard Reduction Program (NEHRP) Project 08HQGR0022, entitled “Quantifying Uncertainties in Ground Motion Simulations for Scenario Earthquakes on the Hayward-Rodgers Creek Fault System Using the USGS 3D Seismic Velocity Model and Realistic Pseudodynamic Ruptures”. Work for this project involved three-dimensional (3D) simulations of ground motions for Hayward Fault (HF) earthquakes. We modeled moderate events on the HF and used them to evaluate the USGS 3D model of the San Francisco Bay Area. We also contributed to a ground motion modeling effort for a large suite of scenario earthquakes on the HF. Results were presented at conferences (see appendix) and in one peer-reviewed publication (Aagaard et al., 2010).

  18. Does preliminary optimisation of an anatomically correct skull-brain model using simple simulants produce clinically realistic ballistic injury fracture patterns?

    PubMed

    Mahoney, P F; Carr, D J; Delaney, R J; Hunt, N; Harrison, S; Breeze, J; Gibb, I

    2017-07-01

    Ballistic head injury remains a significant threat to military personnel. Studying such injuries requires a model that can be used with a military helmet. This paper describes further work on a skull-brain model using skulls made from three different polyurethane plastics and a series of skull 'fills' to simulate brain (3, 5, 7 and 10% gelatine by mass and PermaGel™). The models were subjected to ballistic impact from 7.62 × 39 mm mild steel core bullets. The first part of the work compares the different polyurethanes (mean bullet muzzle velocity of 708 m/s), and the second part compares the different fills (mean bullet muzzle velocity of 680 m/s). The impact events were filmed using high speed cameras. The resulting fracture patterns in the skulls were reviewed and scored by five clinicians experienced in assessing penetrating head injury. In over half of the models, one or more assessors felt aspects of the fracture pattern were close to real injury. Limitations of the model include the skull being manufactured in two parts and the lack of a realistic skin layer. Further work is ongoing to address these.

  19. Forecasting and visualization of wildfires in a 3D geographical information system

    NASA Astrophysics Data System (ADS)

    Castrillón, M.; Jorge, P. A.; López, I. J.; Macías, A.; Martín, D.; Nebot, R. J.; Sabbagh, I.; Quintana, F. M.; Sánchez, J.; Sánchez, A. J.; Suárez, J. P.; Trujillo, A.

    2011-03-01

    This paper describes a wildfire forecasting application based on a 3D virtual environment and a fire simulation engine. A novel open-source framework is presented for the development of 3D graphics applications over large geographic areas, offering high-performance 3D visualization and powerful interaction tools for the Geographic Information Systems (GIS) community. The application includes a remote module that allows simultaneous connections of several users for monitoring a real wildfire event. The system is able to make a realistic composition of what is actually happening in the area of the wildfire, with dynamic 3D objects and the real-time location of human and material resources, providing a new perspective for analyzing the wildfire information. The user can simulate and visualize the propagation of a fire on the terrain, integrating spatial information on topography and vegetation types with weather and wind data. The application communicates with a remote web service that is in charge of the simulation task. The user may specify several parameters through a friendly interface before the application sends the information to the remote server responsible for carrying out the wildfire forecast using the FARSITE simulation model. During the process, the server connects to different external resources to obtain up-to-date meteorological data. The client application implements a realistic 3D visualization of the fire evolution on the landscape. A Level of Detail (LOD) strategy helps improve the performance of the visualization system.

  20. A Mw 6.3 earthquake scenario in the city of Nice (southeast France): ground motion simulations

    NASA Astrophysics Data System (ADS)

    Salichon, Jérome; Kohrs-Sansorny, Carine; Bertrand, Etienne; Courboulex, Françoise

    2010-07-01

    The southern Alps-Ligurian basin junction is one of the most seismically active zones of western Europe. Constant microseismicity and moderate-size events (3.5 < M < 5) are regularly recorded. The last reported historical event took place in February 1887 and reached an estimated magnitude between 6 and 6.5, causing human losses and extensive damage (intensity X, Medvedev-Sponheuer-Karnik). Such an event, occurring nowadays, could have critical consequences given the high density of population living on the French and Italian Riviera. We study the case of an offshore Mw 6.3 earthquake located at the place where two moderate-size events (Mw 4.5) occurred recently and where a morphotectonic feature has been detected by a bathymetric survey. We used a stochastic empirical Green’s functions (EGFs) summation method to produce a population of realistic accelerograms on rock and soil sites in the city of Nice. The ground motion simulations are calibrated on a rock site with a set of ground motion prediction equations (GMPEs) in order to estimate a reasonable stress-drop ratio between the February 25th, 2001, Mw 4.5 event, taken as an EGF, and the target earthquake. Our results show that the combination of the GMPE and EGF techniques is an interesting tool for site-specific strong ground motion estimation.

  1. Development of a three-dimensional transient code for reactivity-initiated events of BWRs (boiling water reactors) - Models and code verifications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uematsu, Hitoshi; Yamamoto, Toru; Izutsu, Sadayuki

    1990-06-01

    A reactivity-initiated event is a design-basis accident for the safety analysis of boiling water reactors. It is defined as a rapid transient of reactor power caused by a reactivity insertion of over $1.0 due to a postulated drop or abnormal withdrawal of the control rod from the core. Strong space-dependent feedback effects are associated with the local power increase due to control rod movement. A realistic treatment of the core status in a transient by a code with a detailed core model is recommended in evaluating this event. A three-dimensional transient code, ARIES, has been developed to meet this need. The code simulates the event with three-dimensional neutronics, coupled with multichannel thermal hydraulics, based on a nonequilibrium separated flow model. The experimental data obtained in reactivity accident tests performed with the SPERT III-E core are used to verify the entire code, including thermal-hydraulic models.

  2. Inter-annual variability of the Mediterranean thermohaline circulation in Med-CORDEX simulations

    NASA Astrophysics Data System (ADS)

    Vittoria Struglia, Maria; Adani, Mario; Carillo, Adriana; Pisacane, Giovanna; Sannino, Gianmaria; Beuvier, Jonathan; Lovato, Tomas; Sevault, Florence; Vervatis, Vassilios

    2016-04-01

    Recent atmospheric reanalysis products, such as ERA40 and ERA-Interim, and their regional dynamical downscaling prompted the HyMeX/Med-CORDEX community to perform hind-cast simulations of the Mediterranean Sea, giving the opportunity to evaluate the response of different ocean models to realistic inter-annual atmospheric forcing. Ocean numerical modeling studies have been steadily improving over the last decade through hind-cast processing, and are complementary to observations in studying the relative importance of the mechanisms playing a role in ocean variability, whether external forcing or internal ocean variability. This work presents a review and an inter-comparison of the most recent hind-cast simulations of the Mediterranean Sea circulation, produced in the framework of the Med-CORDEX initiative, at resolutions spanning from 1/8° to 1/16°. The richness of the simulations available for this study is exploited to address the effects of increasing resolution of both models and forcing, the initialization procedure, and the prescription of the atmospheric boundary conditions, which are particularly relevant for modeling a realistic thermohaline circulation (THC) in the perspective of fully coupled regional ocean-atmosphere models. The mean circulation is well reproduced by all the simulations. However, the horizontal resolution of both the atmospheric forcing and the ocean model plays a fundamental role in the reproduction of some specific features of both sub-basins, and important differences can be observed between low- and high-resolution atmospheric forcing. We analyze the mean circulation on both long-term and decadal time scales, and the represented inter-annual variability of intermediate and deep water mass formation processes in both the Eastern and Western sub-basins, finding that models agree with observations for specific events, such as the 1992-1993 Eastern Mediterranean Transient and the 2005-2006 event in the Gulf of Lion.
Long-term trends of the hydrological properties have been investigated at the sub-basin scale and interpreted in terms of the response to forcing and boundary conditions, with detectable differences resulting mainly from either the different initialization and spin-up procedures or the different prescriptions of Atlantic boundary conditions.

  3. Impact of three task demand factors on simulated unmanned system intelligence, surveillance, and reconnaissance operations.

    PubMed

    Abich, Julian; Reinerman-Jones, Lauren; Matthews, Gerald

    2017-06-01

    The present study investigated how three task demand factors influenced the performance, subjective workload and stress of novice intelligence, surveillance, and reconnaissance operators within a simulation of an unmanned ground vehicle. The manipulations were task type, dual-tasking and event rate. Participants were required to discriminate human targets within a street scene from a direct video feed (threat detection [TD] task) and detect changes in symbols presented in a map display (change detection [CD] task). Dual-tasking elevated workload and distress, and impaired performance for both tasks. However, with increasing event rate, performance on the CD task deteriorated while TD performance improved. Thus, standard workload models provide a better guide to evaluating the demands of abstract symbols than to processing realistic human characters. Assessment of stress and workload may be especially important in the design and evaluation of systems in which critical signals involving human characters must be detected in video images. Practitioner Summary: This experiment assessed subjective workload and stress during threat detection and change detection tasks performed alone and in combination. Results indicated that an increase in event rate led to significant improvements in performance during TD but decrements during CD, yet both had associated increases in workload and engagement.

  4. Using discrete-event simulation in strategic capacity planning for an outpatient physical therapy service.

    PubMed

    Rau, Chi-Lun; Tsai, Pei-Fang Jennifer; Liang, Sheau-Farn Max; Tan, Jhih-Cian; Syu, Hong-Cheng; Jheng, Yue-Ling; Ciou, Ting-Syuan; Jaw, Fu-Shan

    2013-12-01

    This study uses a simulation model as a tool for strategic capacity planning for an outpatient physical therapy clinic in Taipei, Taiwan. The clinic provides a wide range of physical treatments, with 6 full-time therapists in each session. We constructed a discrete-event simulation model to study the dynamics of patient mixes with realistic treatment plans, and to estimate the practical capacity of the physical therapy room. The changes in time-related and space-related performance measurements were used to evaluate the impact of various strategies on the capacity of the clinic. The simulation results confirmed that the clinic is extremely patient-oriented, with a bottleneck occurring at the traction units for Intermittent Pelvic Traction (IPT), with usage at 58.9%. Sensitivity analysis showed that attending to more patients would significantly increase the number of patients staying for overtime sessions. We found that pooling the therapists produced beneficial results: the average waiting time per patient could be reduced by 45% when we pooled 2 therapists. We found that treating up to 12 new patients per session had no significant negative impact on returning patients. Moreover, the average waiting time for new patients decreased if they were given priority over returning patients when called by the therapists.
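    As a flavor of what such a discrete-event model looks like, here is a minimal single-queue sketch with a pooled group of therapists. The exponential arrival and treatment distributions and all parameter values are illustrative assumptions, not the clinic's calibrated data or the authors' model.

```python
import heapq
import random

def simulate_clinic(n_patients=200, n_therapists=2,
                    mean_interarrival=5.0, mean_treatment=8.0, seed=1):
    """Minimal discrete-event sketch: patients join one FIFO queue
    served by a pool of therapists; returns the average waiting time.
    Times are in minutes; all distributions/values are illustrative."""
    random.seed(seed)
    arrivals, t = [], 0.0
    for _ in range(n_patients):
        t += random.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    free_at = [0.0] * n_therapists  # next time each therapist is free
    heapq.heapify(free_at)
    total_wait = 0.0
    for arrive in arrivals:
        start = max(arrive, heapq.heappop(free_at))  # earliest free therapist
        total_wait += start - arrive
        heapq.heappush(free_at, start + random.expovariate(1.0 / mean_treatment))
    return total_wait / n_patients

avg_wait_2 = simulate_clinic(n_therapists=2)
avg_wait_3 = simulate_clinic(n_therapists=3)
```

    Comparing runs with different n_therapists reproduces the qualitative pooling effect reported above: a larger pooled group tends to cut the average wait per patient.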

  5. Investigating the Sensitivity of Model Intraseasonal Variability to Minimum Entrainment

    NASA Astrophysics Data System (ADS)

    Hannah, W. M.; Maloney, E. D.

    2008-12-01

    Previous studies have shown that using a Relaxed Arakawa-Schubert (RAS) convective parameterization with appropriate convective triggers and assumptions about rain re-evaporation produces realistic intraseasonal variability. RAS represents convection with an ensemble of clouds detraining at different heights, each with different entrainment rate, the highest clouds having the lowest entrainment rates. If tropospheric temperature gradients are weak and boundary layer moist static energy is relatively constant, then by limiting the minimum entrainment rate deep convection is suppressed in the presence of dry tropospheric air. This allows moist static energy to accumulate and be discharged during strong intraseasonal convective events, which is consistent with the discharge/recharge paradigm. This study will examine the sensitivity of intra-seasonal variability to changes in minimum entrainment rate in the NCAR-CAM3 with the RAS scheme. Simulations using several minimum entrainment rate thresholds will be investigated. A frequency-wavenumber analysis will show the improvement of the MJO signal as minimum entrainment rate is increased. The spatial and vertical structure of MJO-like disturbances will be examined, including an analysis of the time evolution of vertical humidity distribution for each simulation. Simulated results will be compared to observed MJO events in NCEP-1 reanalysis and CMAP precipitation.

  6. Atmospheric Rivers Induced Heavy Precipitation and Flooding in the Western U.S. Simulated by the WRF Regional Climate Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Lai R.; Qian, Yun

    2009-02-12

    Twenty years of regional climate simulated by the Weather Research and Forecasting model for North America has been analyzed to study the influence of the atmospheric rivers and the role of the land surface on heavy precipitation and flooding in the western U.S. Compared to observations, the simulation realistically captured the 95th percentile extreme precipitation, mean precipitation intensity, as well as the mean precipitation and temperature anomalies of all the atmospheric river events between 1980-1999. Contrasting the 1986 President Day and 1997 New Year Day atmospheric river events, differences in atmospheric stability are found to have an influence on the spatial distribution of precipitation in the Coastal Range of northern California. Although both cases yield similar amounts of heavy precipitation, the 1997 case was found to produce more runoff compared to the 1986 case. Antecedent soil moisture, the ratio of snowfall to total precipitation (which depends on temperature), and existing snowpack all seem to play a role, leading to a higher runoff to precipitation ratio simulated for the 1997 case. This study underscores the importance of characterizing or simulating atmospheric rivers and the land surface conditions for predicting floods, and for assessing the potential impacts of climate change on heavy precipitation and flooding in the western U.S.

  7. Comparison of LOPES measurements with CoREAS and REAS 3.11 simulations

    NASA Astrophysics Data System (ADS)

    Ludwig, M.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bähren, L.; Bekk, K.; Bertaina, M.; Biermann, P. L.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Chiavassa, A.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Falcke, H.; Fuchs, B.; Fuhrmann, D.; Gemmeke, H.; Grupen, C.; Haug, M.; Haungs, A.; Heck, D.; Hörandel, J. R.; Horneffer, A.; Huber, D.; Huege, T.; Isar, P. G.; Kampert, K.-H.; Kang, D.; Krömer, O.; Kuijpers, J.; Link, K.; Łuczak, P.; Mathes, H. J.; Melissas, M.; Morello, C.; Oehlschläger, J.; Palmieri, N.; Pierog, T.; Rautenberg, J.; Rebel, H.; Roth, M.; Rühle, C.; Saftoiu, A.; Schieler, H.; Schmidt, A.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Weindl, A.; Wochele, J.; Zabierowski, J.; Zensus, J. A.

    2013-05-01

    In recent years, LOPES emerged as a very successful experiment measuring the radio emission from air showers in the MHz frequency range. In parallel, the theoretical description of radio emission was developed further and REAS became a widely used Monte Carlo simulation code. REAS 3 as well as CoREAS are based on the endpoint formalism, i.e. they calculate the emission of the air shower without assuming specific emission mechanisms. While REAS 3 is based on histograms derived from CORSIKA simulations, CoREAS is directly implemented into CORSIKA without loss of information due to histogramming of the particle distributions. In contrast to the earlier versions of REAS, the newest version, REAS 3.11, and CoREAS take into account a realistic atmospheric refractive index. To improve the understanding of the emission processes and judge the quality of the simulations, we compare their predictions with high-quality events measured by LOPES. We present results concerning the lateral distribution measured with 30 east-west aligned LOPES antennas. Only the simulation codes including the refractive index (REAS 3.11 and CoREAS) are able to reproduce the slope of the measured lateral distributions; REAS 3.0 predicts lateral distributions that are too steep and does not reproduce the rising lateral distributions seen in a few LOPES events. Moreover, REAS 3.11 predicts an absolute amplitude compatible with the LOPES measurements.

  8. An analysis of simulated and observed storm characteristics

    NASA Astrophysics Data System (ADS)

    Benestad, R. E.

    2010-09-01

    A calculus-based cyclone identification (CCI) method has been applied to the most recent re-analysis (ERAINT) from the European Centre for Medium-range Weather Forecasts and to results from regional climate model (RCM) simulations. The storm frequency for events with central pressure below a threshold value of 960-990 hPa was examined, and the gradient wind from the simulated storm systems was compared with corresponding estimates from the re-analysis. The analysis also yielded estimates for the spatial extent of the storm systems, which were included in the regional climate model cyclone evaluation. A comparison is presented between a number of RCMs and the ERAINT re-analysis in terms of their description of the gradient winds, number of cyclones, and spatial extent. Furthermore, a comparison between geostrophic winds estimated through triangles of interpolated or station measurements of SLP is presented. Wind still represents one of the more challenging variables to model realistically.
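The triangle method mentioned here estimates the geostrophic wind from sea-level pressure at three stations by fitting a plane to the pressures and inserting the gradient into the geostrophic relation. A sketch of that idea; the density and Coriolis parameter values are assumptions, and this is not the paper's actual code:

```python
import numpy as np

RHO = 1.25     # air density, kg m^-3 (assumed)
F = 1.2e-4     # Coriolis parameter near 55 N, s^-1 (assumed)

def geostrophic_wind(xy_km, slp_hpa, rho=RHO, f=F):
    """Geostrophic wind from SLP at three stations forming a triangle.
    Fits a plane p = a + b*x + c*y to the pressures, then applies
    u_g = -(1/(rho*f)) dp/dy, v_g = (1/(rho*f)) dp/dx."""
    xy = np.asarray(xy_km, float) * 1e3        # km -> m
    p = np.asarray(slp_hpa, float) * 100.0     # hPa -> Pa
    A = np.column_stack([np.ones(3), xy[:, 0], xy[:, 1]])
    _, dpdx, dpdy = np.linalg.solve(A, p)      # plane coefficients
    ug = -dpdy / (rho * f)
    vg = dpdx / (rho * f)
    return ug, vg

# 4 hPa drop over 400 km in y yields a westerly geostrophic wind of ~7 m/s
ug, vg = geostrophic_wind([(0, 0), (400, 0), (0, 400)], [1000.0, 1000.0, 996.0])
```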

  9. Experiences with a Simulated Learning Environment--The SimuScape©: Virtual Environments in Medical Education

    ERIC Educational Resources Information Center

    Thies, Anna-Lena; Weissenstein, Anne; Haulsen, Ivo; Marschall, Bernhard; Friederichs, Hendrik

    2014-01-01

    Simulation as a tool for medical education has gained considerable importance in the past years. Various studies have shown that the mastering of basic skills happens best if taught in a realistic and workplace-based context. It is necessary that simulation itself takes place in the realistic background of a genuine clinical or in an accordingly…

  10. Interactive Web-based Floodplain Simulation System for Realistic Experiments of Flooding and Flood Damage

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2013-12-01

    Recent developments in web technologies make it easy to manage and visualize large data sets and share them with the general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based, interactive 3D flood simulation environment for creating real-world flooding scenarios. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment in which children and adults can learn about flooding, flood damage, and the effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities for various flooding and land use scenarios.

  11. Decentralized real-time simulation of forest machines

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Adam, Frank; Hoffmann, Katharina; Rossmann, Juergen; Kraemer, Michael; Schluse, Michael

    2000-10-01

    Developing realistic forest machine simulators is a demanding task. A useful simulator has to provide a close-to-reality simulation of the forest environment as well as a simulation of the physics of the vehicle. Customers demand a highly realistic three-dimensional forestry landscape and a realistic simulation of the complex motion of the vehicle even in rough terrain, in order to be able to use the simulator for operator training under close-to-reality conditions. The realistic simulation of the vehicle, especially with the driver's seat mounted on a motion platform, greatly improves the effect of immersion into the virtual reality of a simulated forest and the achievable level of education of the driver. Thus, the connection of the real control devices of forest machines to the simulation system has to be supported, i.e. real control devices such as the joysticks or the board computer system that controls the crane, the aggregate, etc. In addition, the fusion of the board computer system and the simulation system is realized by means of sensors, i.e. digital and analog signals. The decentralized system structure allows several virtual reality systems to evaluate and visualize the information of the control devices and the sensors. So, while the driver is practicing, the instructor can immerse himself in the same virtual forest to monitor the session from his own viewpoint. In this paper, we describe the realized structure as well as the necessary software and hardware components and application experiences.

  12. Evidence for the need of realistic radio communications for airline pilot simulator training and evaluation

    DOT National Transportation Integrated Search

    2003-11-05

    This paper presents arguments in favor of realistic representation of radio communications during training and evaluation of airline pilots in the simulator. A survey of airlines showed that radio communications are mainly role-played by Instructor/E...

  13. Tropical Cyclones in the 7-km NASA Global Nature Run for Use in Observing System Simulation Experiments

    NASA Technical Reports Server (NTRS)

    Reale, Oreste; Achuthavarier, Deepthi; Fuentes, Marangelly; Putman, William M.; Partyka, Gary

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Nature Run (NR), released for use in Observing System Simulation Experiments (OSSEs), is a 2-year-long global non-hydrostatic free-running simulation at a horizontal resolution of 7 km, forced by observed sea-surface temperatures (SSTs) and sea ice, and inclusive of interactive aerosols and trace gases. This article evaluates the NR with respect to tropical cyclone (TC) activity. It is emphasized that to serve as a NR, a long-term simulation must be able to produce realistic TCs, which arise out of realistic large-scale forcings. The presence in the NR of the realistic, relevant dynamical features over the African Monsoon region and the tropical Atlantic is confirmed, along with realistic African Easterly Wave activity. The NR Atlantic TC seasons, produced with 2005 and 2006 SSTs, show interannual variability consistent with observations, with much stronger activity in 2005. An investigation of TC activity over all the other basins (eastern and western North Pacific, North and South Indian Ocean, and Australian region), together with relevant elements of the atmospheric circulation, such as the Somali Jet and westerly bursts, reveals that the model captures the fundamental aspects of TC seasons in every basin, producing a realistic number of TCs with realistic tracks, life spans, and structures. This confirms that the NASA NR is a very suitable tool for OSSEs targeting TCs and represents an improvement with respect to previous long simulations that have served the global atmospheric OSSE community.

  14. Tropical Cyclones in the 7km NASA Global Nature Run for use in Observing System Simulation Experiments

    PubMed Central

    Reale, Oreste; Achuthavarier, Deepthi; Fuentes, Marangelly; Putman, William M.; Partyka, Gary

    2018-01-01

    The National Aeronautics and Space Administration (NASA) Nature Run (NR), released for use in Observing System Simulation Experiments (OSSEs), is a 2-year-long global non-hydrostatic free-running simulation at a horizontal resolution of 7 km, forced by observed sea-surface temperatures (SSTs) and sea ice, and inclusive of interactive aerosols and trace gases. This article evaluates the NR with respect to tropical cyclone (TC) activity. It is emphasized that to serve as a NR, a long-term simulation must be able to produce realistic TCs, which arise out of realistic large-scale forcings. The presence in the NR of the realistic, relevant dynamical features over the African Monsoon region and the tropical Atlantic is confirmed, along with realistic African Easterly Wave activity. The NR Atlantic TC seasons, produced with 2005 and 2006 SSTs, show interannual variability consistent with observations, with much stronger activity in 2005. An investigation of TC activity over all the other basins (eastern and western North Pacific, North and South Indian Ocean, and Australian region), together with relevant elements of the atmospheric circulation, such as the Somali Jet and westerly bursts, reveals that the model captures the fundamental aspects of TC seasons in every basin, producing a realistic number of TCs with realistic tracks, life spans, and structures. This confirms that the NASA NR is a very suitable tool for OSSEs targeting TCs and represents an improvement with respect to previous long simulations that have served the global atmospheric OSSE community. PMID:29674806

  15. Tropical Cyclones in the 7km NASA Global Nature Run for use in Observing System Simulation Experiments.

    PubMed

    Reale, Oreste; Achuthavarier, Deepthi; Fuentes, Marangelly; Putman, William M; Partyka, Gary

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Nature Run (NR), released for use in Observing System Simulation Experiments (OSSEs), is a 2-year-long global non-hydrostatic free-running simulation at a horizontal resolution of 7 km, forced by observed sea-surface temperatures (SSTs) and sea ice, and inclusive of interactive aerosols and trace gases. This article evaluates the NR with respect to tropical cyclone (TC) activity. It is emphasized that to serve as a NR, a long-term simulation must be able to produce realistic TCs, which arise out of realistic large-scale forcings. The presence in the NR of the realistic, relevant dynamical features over the African Monsoon region and the tropical Atlantic is confirmed, along with realistic African Easterly Wave activity. The NR Atlantic TC seasons, produced with 2005 and 2006 SSTs, show interannual variability consistent with observations, with much stronger activity in 2005. An investigation of TC activity over all the other basins (eastern and western North Pacific, North and South Indian Ocean, and Australian region), together with relevant elements of the atmospheric circulation, such as the Somali Jet and westerly bursts, reveals that the model captures the fundamental aspects of TC seasons in every basin, producing a realistic number of TCs with realistic tracks, life spans, and structures. This confirms that the NASA NR is a very suitable tool for OSSEs targeting TCs and represents an improvement with respect to previous long simulations that have served the global atmospheric OSSE community.

  16. Finite element analyses of continuous filament ties for masonry applications : final report for the Arquin Corporation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinones, Armando, Sr.; Bibeau, Tiffany A.; Ho, Clifford Kuofei

    2008-08-01

    Finite-element analyses were performed to simulate the response of a hypothetical vertical masonry wall subject to different lateral loads with and without continuous horizontal filament ties laid between rows of concrete blocks. A static loading analysis and cost comparison were also performed to evaluate optimal materials and designs for the spacers affixed to the filaments. Results showed that polypropylene, ABS, and polyethylene (high density) were suitable materials for the spacers based on performance and cost, and the short T-spacer design was optimal based on its performance and functionality. Simulations of vertical walls subject to static loads representing 100 mph winds (0.2 psi) and a seismic event (0.66 psi) showed that the simulated walls performed similarly and adequately when subject to these loads with and without the ties. Additional simulations and tests are required to assess the performance of actual walls with and without the ties under greater loads and more realistic conditions (e.g., cracks, non-linear response).
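The quoted wind load can be sanity-checked against the dynamic pressure of a 100 mph wind, q = ½ρv², which comes out just under the 0.2 psi used in the simulations (design loads typically round up). A minimal check, with sea-level air density assumed:

```python
# Dynamic pressure of a 100 mph wind, compared with the 0.2 psi load
# used in the report's simulations.
MPH_TO_MS = 0.44704          # exact mph -> m/s conversion
PA_TO_PSI = 1.0 / 6894.757   # Pa -> psi
rho_air = 1.225              # kg m^-3 at sea level (assumed)

v = 100.0 * MPH_TO_MS        # wind speed in m/s
q_pa = 0.5 * rho_air * v**2  # dynamic pressure in Pa (~1.2 kPa)
q_psi = q_pa * PA_TO_PSI     # ~0.18 psi, consistent with the quoted 0.2 psi
```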

  17. Overcoming time scale and finite size limitations to compute nucleation rates from small scale well tempered metadynamics simulations.

    PubMed

    Salvalaglio, Matteo; Tiwary, Pratyush; Maggioni, Giovanni Maria; Mazzotti, Marco; Parrinello, Michele

    2016-12-07

    Condensation of a liquid droplet from a supersaturated vapour phase is initiated by a prototypical nucleation event, and as such it is challenging to compute its rate from atomistic molecular dynamics simulations. In fact, at realistic supersaturation conditions, condensation occurs on time scales that far exceed what can be reached with conventional molecular dynamics methods. Another known problem in this context is the distortion of the free energy profile associated with nucleation due to the small, finite size of typical simulation boxes. In this work the problem of time scale is addressed with a recently developed enhanced sampling method while simultaneously correcting for finite size effects. We demonstrate our approach by studying the condensation of argon, showing that characteristic nucleation times on the order of hours can be reliably calculated. Nucleation rates spanning a range of 10 orders of magnitude are computed at moderate supersaturation levels, thus bridging the gap between what standard molecular dynamics simulations can do and real physical systems.
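In the enhanced-sampling approach referred to here, each biased transition time is rescaled by its bias acceleration factor, and the rate follows from the mean of the unbiased times, which should be exponentially distributed if nucleation is a Poisson process. A hedged sketch with invented toy numbers, not the authors' actual workflow:

```python
import numpy as np

def nucleation_rate(biased_times, acceleration_factors):
    """Rate estimate from metadynamics-style rescaling: unbiased transition
    time = biased time x acceleration factor; rate = 1 / mean unbiased time.
    A real analysis would also test the times for exponentiality."""
    t_unbiased = np.asarray(biased_times, float) * np.asarray(acceleration_factors, float)
    tau = t_unbiased.mean()          # characteristic nucleation time
    return 1.0 / tau

# toy numbers: biased times in ns, acceleration factors of order 1e6
rate = nucleation_rate([10.0, 25.0, 5.0], [2e6, 1e6, 4e6])  # rate in 1/ns
```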

  18. Overcoming time scale and finite size limitations to compute nucleation rates from small scale well tempered metadynamics simulations

    NASA Astrophysics Data System (ADS)

    Salvalaglio, Matteo; Tiwary, Pratyush; Maggioni, Giovanni Maria; Mazzotti, Marco; Parrinello, Michele

    2016-12-01

    Condensation of a liquid droplet from a supersaturated vapour phase is initiated by a prototypical nucleation event, and as such it is challenging to compute its rate from atomistic molecular dynamics simulations. In fact, at realistic supersaturation conditions, condensation occurs on time scales that far exceed what can be reached with conventional molecular dynamics methods. Another known problem in this context is the distortion of the free energy profile associated with nucleation due to the small, finite size of typical simulation boxes. In this work the problem of time scale is addressed with a recently developed enhanced sampling method while simultaneously correcting for finite size effects. We demonstrate our approach by studying the condensation of argon, showing that characteristic nucleation times on the order of hours can be reliably calculated. Nucleation rates spanning a range of 10 orders of magnitude are computed at moderate supersaturation levels, thus bridging the gap between what standard molecular dynamics simulations can do and real physical systems.

  19. Dynamical Scaling Relations and the Angular Momentum Problem in the FIRE Simulations

    NASA Astrophysics Data System (ADS)

    Schmitz, Denise; Hopkins, Philip F.; Quataert, Eliot; Keres, Dusan; Faucher-Giguere, Claude-Andre

    2015-01-01

    Simulations are an extremely important tool with which to study galaxy formation and evolution. However, even state-of-the-art simulations still fail to accurately predict important galaxy properties such as star formation rates and dynamical scaling relations. One possible explanation is the inadequacy of sub-grid models to capture the range of stellar feedback mechanisms which operate below the resolution limit of simulations. FIRE (Feedback in Realistic Environments) is a set of high-resolution cosmological galaxy simulations run using the code GIZMO. It includes more realistic models for various types of feedback including radiation pressure, supernovae, stellar winds, and photoionization and photoelectric heating. Recent FIRE results have demonstrated good agreement with the observed stellar mass-halo mass relation as well as more realistic star formation histories than previous simulations. We investigate the effects of FIRE's improved feedback prescriptions on the simulation "angular momentum problem," i.e., whether FIRE can reproduce observed scaling relations between galaxy stellar mass and rotational/dispersion velocities.

  20. End-to-end simulation and verification of GNC and robotic systems considering both space segment and ground segment

    NASA Astrophysics Data System (ADS)

    Benninghoff, Heike; Rems, Florian; Risse, Eicke; Brunner, Bernhard; Stelzer, Martin; Krenn, Rainer; Reiner, Matthias; Stangl, Christian; Gnat, Marcin

    2018-01-01

    In the framework of a project called on-orbit servicing end-to-end simulation, the final approach and capture of a tumbling client satellite in an on-orbit servicing mission are simulated. The necessary components are developed and the entire end-to-end chain is tested and verified. This involves both on-board and on-ground systems. The space segment comprises a passive client satellite and an active service satellite with its rendezvous and berthing payload. The space segment is simulated using a software satellite simulator and two robotic, hardware-in-the-loop test beds, the European Proximity Operations Simulator (EPOS) 2.0 and the OOS-Sim. The ground segment is established as for a real servicing mission, such that realistic operations can be performed from the different consoles in the control room. During the simulation of the telerobotic operation, it is important to provide a realistic communication environment with parameters as they occur in the real world (realistic delay and jitter, for example).

  1. Large-Eddy Simulation of the Impact of Great Garuda Project on Wind and Thermal Environment over Built-Up Area in Jakarta

    NASA Astrophysics Data System (ADS)

    Yucel, M.; Sueishi, T.; Inagaki, A.; Kanda, M.

    2017-12-01

    `Great Garuda' is an eagle-shaped offshore structure with 17 artificial islands. The project has been designed for the coastal protection and land reclamation of Jakarta in response to catastrophic flooding in the city. In addition to its water-safety goal, it offers urban regeneration for 300,000 inhabitants and 600,000 workers. A broad coalition of Indonesian scientists has criticized the project for its negative impacts on the surrounding environment. Despite extensive research by Indonesian scientists on the maritime environment, studies on the wind and thermal environment over the built-up area are still lacking. The construction of the various islands off the coast may result in changes to wind patterns and the thermal environment, owing to the alteration of the coastline and urbanization in Jakarta Bay. It is therefore important to understand the airflow within the urban canopy in case of unpredictable gust events, which may arise among closely packed high-rise buildings and endanger pedestrians. Accordingly, we used numerical simulations to investigate the impact of the sea wall and the artificial islands on the built-up area and the intensity of wind gusts at pedestrian level. Because turbulent organized structures are large, a sufficiently large computational domain is required; a 19.2 km × 4.8 km × 1.0 km domain with 2-m resolution in all directions was therefore created to explicitly resolve the detailed shapes of buildings and the flow at pedestrian level. This complex computation was accomplished by implementing a large-eddy simulation (LES) model. Two case studies were conducted considering the effect of realistic surface roughness and upward heat flux: Case_1 is based on the current built environment, and Case_2 investigates the effect of the project on the chosen coastal region of the city.
Fig.1 illustrates the schematic of the large-eddy simulation domains of the two cases, with and without the Great Garuda Sea Wall and 17 artificial islands; the 3D model of Great Garuda is shown in Fig.2. In addition to the cases mentioned above, a simulation will be generated with more realistic heat-flux outputs from an energy balance model and with inflow boundary conditions coupled to a mesoscale model (the Weather Research and Forecasting model).
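The quoted domain and resolution imply an enormous grid; a back-of-envelope count (assuming the stated uniform 2-m spacing in all directions, with no grid stretching) shows why this computation is demanding:

```python
# Grid-cell count for a 19.2 km x 4.8 km x 1.0 km LES domain at 2-m spacing.
dx = 2.0                     # m, uniform spacing in all directions (as stated)
nx = int(19.2e3 / dx)        # 9600 cells in x
ny = int(4.8e3 / dx)         # 2400 cells in y
nz = int(1.0e3 / dx)         # 500 cells in z
n_cells = nx * ny * nz       # ~1.15e10 grid cells
```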

  2. Regional variability of the frequency distribution of daily precipitation and the synoptic characteristics of heavy precipitation events in present and future climate simulations

    NASA Astrophysics Data System (ADS)

    DeAngelis, Anthony M.

    Changes in the characteristics of daily precipitation in response to global warming may have serious impacts on human life and property. An analysis of precipitation in climate models is performed to evaluate how well the models simulate the present climate and how precipitation may change in the future. Models participating in phases 3 and 5 of the Coupled Model Intercomparison Project (CMIP3 and CMIP5) have substantial biases in their simulation of heavy precipitation intensity over parts of North America during the 20th century. Despite these biases, the large-scale atmospheric circulation accompanying heavy precipitation is either simulated realistically or the strength of the circulation is overestimated. The biases are not related to the large-scale flow in a simple way, pointing toward the importance of other model deficiencies, such as coarse horizontal resolution and convective parameterizations, for the accurate simulation of intense precipitation. Although the models may not sufficiently simulate the intensity of precipitation, their realistic portrayal of the large-scale circulation suggests that projections of future precipitation may be reliable. In the CMIP5 ensemble, the distribution of daily precipitation is projected to undergo substantial changes in response to future atmospheric warming. The regional distribution of these changes was investigated, revealing that dry days and days with heavy-extreme precipitation are projected to increase at the expense of light-moderate precipitation over much of the middle and low latitudes. Such projections have serious implications for future impacts from flood and drought events. In other places, changes in the daily precipitation distribution are characterized by a shift toward either wetter or drier conditions in the future, with heavy-extreme precipitation projected to increase in all but the driest subtropical subsidence regions.
Further analysis shows that increases in heavy precipitation in midlatitudes are largely explained by thermodynamics, including increases in atmospheric water vapor. However, in low latitudes and northern high latitudes, changes in vertical velocity accompanying heavy precipitation are also important. The strength of the large-scale atmospheric circulation is projected to change in accordance with vertical velocity in many places, though the circulation patterns, and therefore physical mechanisms that generate heavy precipitation, may remain the same.
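The thermodynamic contribution mentioned above is commonly quantified by the Clausius-Clapeyron rate: the atmosphere's saturation vapor pressure rises by roughly 6-7% per kelvin of warming. A minimal check using the Bolton (1980) approximation; the choice of formula and reference temperature are ours, not the dissertation's:

```python
import math

def sat_vapor_pressure_hpa(t_celsius):
    """Bolton (1980) approximation to saturation vapor pressure over water."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# Fractional increase in water-holding capacity for 1 K of warming at 15 C:
# approximately 6-7% per kelvin, the Clausius-Clapeyron scaling rate.
cc_rate = sat_vapor_pressure_hpa(16.0) / sat_vapor_pressure_hpa(15.0) - 1.0
```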

  3. The Seismicity of the Central Apennines Region Studied by Means of a Physics-Based Earthquake Simulator

    NASA Astrophysics Data System (ADS)

    Console, R.; Vannoli, P.; Carluccio, R.

    2016-12-01

    The application of a physics-based earthquake simulation algorithm to the central Apennines region, where the 24 August 2016 Amatrice earthquake occurred, allowed the compilation of a synthetic seismic catalog lasting 100 ky and containing more than 500,000 M ≥ 4.0 events, without the limitations that real catalogs suffer in terms of completeness, homogeneity and time duration. The algorithm on which this simulator is based is constrained by several physical elements, such as: (a) an average slip rate for every single fault in the investigated fault systems, (b) the process of rupture growth and termination, leading to a self-organized earthquake magnitude distribution, and (c) interaction between earthquake sources, including small magnitude events. Events nucleated in one fault are allowed to expand into neighboring faults, even those belonging to a different fault system, if they are separated by less than a given maximum distance. The seismogenic model upon which we applied the simulator code was derived from the DISS 3.2.0 database (http://diss.rm.ingv.it/diss/), selecting all the fault systems that are recognized in the central Apennines region, for a total of 24 fault systems. The application of our simulation algorithm provides typical features in the time, space and magnitude behavior of the seismicity, which are comparable with those of real observations. These features include long-term periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the linear Gutenberg-Richter distribution in the moderate and higher magnitude range. The statistical distribution of earthquakes with M ≥ 6.0 on single faults exhibits a fairly clear pseudo-periodic behavior, with a coefficient of variation Cv of the order of 0.3-0.6. We found in our synthetic catalog a clear trend of long-term acceleration of seismic activity preceding M ≥ 6.0 earthquakes and quiescence following those earthquakes.
Lastly, as an example of a possible use of synthetic catalogs, an attenuation law was applied to all the events reported in the synthetic catalog to produce maps showing the exceedance probability of given values of peak acceleration (PGA) on the territory under investigation.
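The coefficient of variation quoted for the recurrence of M ≥ 6.0 events is simply the standard deviation of inter-event times divided by their mean (Cv → 0 for periodic recurrence, Cv = 1 for a Poisson process). A sketch with a toy, nearly periodic catalog, not the simulator's output:

```python
import numpy as np

def recurrence_cv(event_times):
    """Coefficient of variation of inter-event (recurrence) times:
    Cv = std/mean of the intervals between successive events."""
    dt = np.diff(np.sort(np.asarray(event_times, float)))
    return dt.std() / dt.mean()

# toy catalog: events recurring every ~1000 years with small jitter
times = np.cumsum([950.0, 1050.0, 1000.0, 980.0, 1020.0])
cv = recurrence_cv(times)   # much smaller than 1, i.e. pseudo-periodic
```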

  4. Detection methods for non-Gaussian gravitational wave stochastic backgrounds

    NASA Astrophysics Data System (ADS)

    Drasco, Steve; Flanagan, Éanna É.

    2003-04-01

    A gravitational wave stochastic background can be produced by a collection of independent gravitational wave events. There are two classes of such backgrounds, one for which the ratio of the average time between events to the average duration of an event is small (i.e., many events are on at once), and one for which the ratio is large. In the first case the signal is continuous, sounds something like a constant hiss, and has a Gaussian probability distribution. In the second case, the discontinuous or intermittent signal sounds something like popcorn popping, and is described by a non-Gaussian probability distribution. In this paper we address the issue of finding an optimal detection method for such a non-Gaussian background. As a first step, we examine the idealized situation in which the event durations are short compared to the detector sampling time, so that the time structure of the events cannot be resolved, and we assume white, Gaussian noise in two collocated, aligned detectors. For this situation we derive an appropriate version of the maximum likelihood detection statistic. We compare the performance of this statistic to that of the standard cross-correlation statistic both analytically and with Monte Carlo simulations. In general the maximum likelihood statistic performs better than the cross-correlation statistic when the stochastic background is sufficiently non-Gaussian, resulting in a gain factor in the minimum gravitational-wave energy density necessary for detection. This gain factor ranges roughly between 1 and 3, depending on the duty cycle of the background, for realistic observing times and signal strengths for both ground and space based detectors. The computational cost of the statistic, although significantly greater than that of the cross-correlation statistic, is not unreasonable. 
Before the statistic can be used in practice with real detector data, further work is required to generalize our analysis to accommodate separated, misaligned detectors with realistic, colored, non-Gaussian noise.
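The standard cross-correlation statistic that the maximum likelihood statistic is compared against is, for collocated aligned detectors with white noise, just the sample mean of the product of the two detector outputs. A toy Monte Carlo sketch of that baseline, with invented duty cycle and amplitudes (not the paper's parameters), showing a "popcorn" background lifting the statistic above its noise-only value:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Common non-Gaussian "popcorn" background: rare bursts present in both
# detectors (duty cycle 1%, burst amplitude assumed), plus independent
# unit-variance Gaussian noise in each detector.
burst = (rng.random(N) < 0.01) * rng.normal(0.0, 2.0, N)
h1 = burst + rng.normal(0.0, 1.0, N)   # detector 1 output
h2 = burst + rng.normal(0.0, 1.0, N)   # detector 2 output

# Cross-correlation statistic: <h1*h2> estimates the common signal power
# (here E[burst^2] = 0.01 * 4 = 0.04), since the noises are independent.
cc = np.mean(h1 * h2)

# Noise-only reference: the same statistic with no common signal.
noise_only = np.mean(rng.normal(0.0, 1.0, N) * rng.normal(0.0, 1.0, N))
```

For a strongly intermittent background like this, the paper's maximum likelihood statistic exploits the non-Gaussian amplitude distribution that this simple product statistic ignores.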

  5. Impact of realistic soil moisture initialization on the representation of extreme events in the western Mediterranean

    NASA Astrophysics Data System (ADS)

    Helgert, Sebastian; Khodayar, Samiro

    2017-04-01

In a warmer Mediterranean climate, an increase in the intensity and frequency of extreme events such as floods, droughts and extreme heat is expected. Predicting such events remains a great challenge and introduces many uncertainties into weather forecasts and climate predictions. Missing knowledge about soil moisture-atmosphere interactions and their representation in models has been identified as one of the main sources of uncertainty. In this context, soil moisture (SM) plays an important role in the partitioning of sensible and latent heat fluxes at the surface and consequently influences boundary-layer stability and precipitation formation. The aim of this research work is to assess the influence of soil moisture-atmosphere interactions on the initiation and development of extreme events in the western Mediterranean (WMED). In this respect, the impact of realistic SM initialization on the model representation of extreme events is investigated. High-resolution simulations of different regions in the WMED, covering climate zones from moderate to arid, are conducted with the atmospheric COSMO (Consortium for Small-scale Modeling) model in numerical weather prediction and climate modes. A multiscale temporal and spatial approach is used (days to years, 7 km to 2.8 km grid spacing). Observational data provided within the framework of the HYdrological cycle in the Mediterranean EXperiment (HyMeX), as well as satellite data such as precipitation from CMORPH (CPC MORPHing technique), evapotranspiration from the Land Surface Analysis Satellite Applications Facility (LSA-SAF) and atmospheric moisture from MODIS (Moderate Resolution Imaging Spectroradiometer), are used for process understanding and model validation. To select extremely dry and wet periods, the Effective Drought Index (EDI) is calculated.
In these periods, sensitivity studies with extreme SM initialization scenarios are performed to assess a possible impact of soil moisture on precipitation in the WMED. For the realistic SM initialization, different state-of-the-art high-resolution SM products (25 km down to 1 km grid spacing) from the Soil Moisture and Ocean Salinity (SMOS) mission are examined. A CDF-matching method is applied to reduce the bias between the model and SMOS satellite observations. Moreover, techniques to estimate the initial soil moisture profile from satellite data are tested.
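The CDF-matching step described above can be sketched as a simple empirical quantile mapping. This is a generic illustration, not the study's COSMO/SMOS implementation, and the sample data are invented:

```python
import numpy as np

def cdf_match(satellite_sm, model_sm):
    """Map satellite soil-moisture values onto the model's climatological
    distribution by matching empirical CDFs (quantile mapping)."""
    sat_sorted = np.sort(satellite_sm)
    mod_sorted = np.sort(model_sm)
    # Plotting-position probabilities for each sample
    sat_probs = (np.arange(1, sat_sorted.size + 1) - 0.5) / sat_sorted.size
    mod_probs = (np.arange(1, mod_sorted.size + 1) - 0.5) / mod_sorted.size
    # Empirical CDF value of each satellite observation ...
    sat_cdf = np.interp(satellite_sm, sat_sorted, sat_probs)
    # ... inverted through the model's empirical CDF
    return np.interp(sat_cdf, mod_probs, mod_sorted)

# Example: satellite retrievals biased dry relative to the model climatology
rng = np.random.default_rng(0)
model = rng.normal(0.30, 0.05, 1000)      # model volumetric SM
satellite = rng.normal(0.20, 0.04, 1000)  # dry-biased retrievals
matched = cdf_match(satellite, model)
```

After matching, the corrected satellite values share the model's distribution (and hence its bias characteristics are removed) while preserving the rank order of the retrievals.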

  6. Perceptual evaluation of visual alerts in surveillance videos

    NASA Astrophysics Data System (ADS)

    Rogowitz, Bernice E.; Topkara, Mercan; Pfeiffer, William; Hampapur, Arun

    2015-03-01

Visual alerts are commonly used in video monitoring and surveillance systems to mark events, presumably making them more salient to human observers. Surprisingly, the effectiveness of computer-generated alerts in improving human performance has not been widely studied. To address this gap, we have developed a tool for simulating different alert parameters in a realistic visual monitoring situation, and have measured human detection performance under conditions that emulated different set-points in a surveillance algorithm. In the High-Sensitivity condition, the simulated alerts identified 100% of the events with many false alarms. In the Lower-Sensitivity condition, the simulated alerts correctly identified 70% of the targets, with fewer false alarms. In the control condition, no simulated alerts were provided. To explore the effects of learning, subjects performed these tasks in three sessions, on separate days, in a counterbalanced, within-subject design. We explore these results within the context of cognitive models of human attention and learning. We found that human observers were more likely to respond to events when marked by a visual alert. Learning played a major role in the two alert conditions. In the first session, observers generated almost twice as many False Alarms as in the No-Alert condition, as the observers responded pre-attentively to the computer-generated false alarms. However, this rate dropped equally dramatically in later sessions, as observers learned to discount the false cues. Highest observer Precision, Hits/(Hits + False Alarms), was achieved in the High-Sensitivity condition, but only after training. The successful evaluation of surveillance systems depends on understanding human attention and performance.
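The Precision measure used in the study can be computed directly from per-segment outcomes. The data below are hypothetical, purely to illustrate the formula:

```python
def detection_metrics(events, responses):
    """Per-segment booleans: did an event occur, did the observer respond.
    Hits are responses on true events; false alarms are responses where no
    event occurred. Precision = Hits / (Hits + False Alarms)."""
    hits = sum(1 for e, r in zip(events, responses) if e and r)
    false_alarms = sum(1 for e, r in zip(events, responses) if not e and r)
    total = hits + false_alarms
    precision = hits / total if total else 0.0
    return hits, false_alarms, precision

# Hypothetical session: 6 video segments, 3 containing true events
events    = [True, True, True, False, False, False]
responses = [True, True, False, True, False, False]
hits, fas, precision = detection_metrics(events, responses)
```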

  7. Improving atomic displacement and replacement calculations with physically realistic damage models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordlund, Kai; Zinkle, Steven J.; Sand, Andrea E.

Atomic collision processes are fundamental to numerous advanced materials technologies such as electron microscopy, semiconductor processing and nuclear power generation. Extensive experimental and computer simulation studies over the past several decades provide the physical basis for understanding the atomic-scale processes occurring during primary displacement events. The current international standard for quantifying this energetic particle damage, the Norgett-Robinson-Torrens displacements per atom (NRT-dpa) model, has nowadays several well-known limitations. In particular, the number of radiation defects produced in energetic cascades in metals is only ~1/3 the NRT-dpa prediction, while the number of atoms involved in atomic mixing is about a factor of 30 larger than the dpa value. Here we propose two new complementary displacement production estimators (athermal recombination corrected dpa, arc-dpa) and atomic mixing (replacements per atom, rpa) functions that extend the NRT-dpa by providing more physically realistic descriptions of primary defect creation in materials and may become additional standard measures for radiation damage quantification.

  8. Improving atomic displacement and replacement calculations with physically realistic damage models

    DOE PAGES

    Nordlund, Kai; Zinkle, Steven J.; Sand, Andrea E.; ...

    2018-03-14

Atomic collision processes are fundamental to numerous advanced materials technologies such as electron microscopy, semiconductor processing and nuclear power generation. Extensive experimental and computer simulation studies over the past several decades provide the physical basis for understanding the atomic-scale processes occurring during primary displacement events. The current international standard for quantifying this energetic particle damage, the Norgett-Robinson-Torrens displacements per atom (NRT-dpa) model, has nowadays several well-known limitations. In particular, the number of radiation defects produced in energetic cascades in metals is only ~1/3 the NRT-dpa prediction, while the number of atoms involved in atomic mixing is about a factor of 30 larger than the dpa value. Here we propose two new complementary displacement production estimators (athermal recombination corrected dpa, arc-dpa) and atomic mixing (replacements per atom, rpa) functions that extend the NRT-dpa by providing more physically realistic descriptions of primary defect creation in materials and may become additional standard measures for radiation damage quantification.

  9. Improving atomic displacement and replacement calculations with physically realistic damage models.

    PubMed

    Nordlund, Kai; Zinkle, Steven J; Sand, Andrea E; Granberg, Fredric; Averback, Robert S; Stoller, Roger; Suzudo, Tomoaki; Malerba, Lorenzo; Banhart, Florian; Weber, William J; Willaime, Francois; Dudarev, Sergei L; Simeone, David

    2018-03-14

    Atomic collision processes are fundamental to numerous advanced materials technologies such as electron microscopy, semiconductor processing and nuclear power generation. Extensive experimental and computer simulation studies over the past several decades provide the physical basis for understanding the atomic-scale processes occurring during primary displacement events. The current international standard for quantifying this energetic particle damage, the Norgett-Robinson-Torrens displacements per atom (NRT-dpa) model, has nowadays several well-known limitations. In particular, the number of radiation defects produced in energetic cascades in metals is only ~1/3 the NRT-dpa prediction, while the number of atoms involved in atomic mixing is about a factor of 30 larger than the dpa value. Here we propose two new complementary displacement production estimators (athermal recombination corrected dpa, arc-dpa) and atomic mixing (replacements per atom, rpa) functions that extend the NRT-dpa by providing more physically realistic descriptions of primary defect creation in materials and may become additional standard measures for radiation damage quantification.
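For reference, the NRT model discussed above is conventionally written as follows (standard form from the radiation-damage literature, with T_dam the damage energy and E_d the threshold displacement energy; the arc-dpa correction is given in the form commonly quoted from this line of work, where b and c are material-specific fitting constants):

```latex
N_{\mathrm{NRT}}(T_{\mathrm{dam}}) =
\begin{cases}
0, & T_{\mathrm{dam}} < E_d \\[2pt]
1, & E_d \le T_{\mathrm{dam}} < 2E_d/0.8 \\[2pt]
\dfrac{0.8\,T_{\mathrm{dam}}}{2E_d}, & T_{\mathrm{dam}} \ge 2E_d/0.8
\end{cases}
\qquad
N_{\mathrm{arc}} = N_{\mathrm{NRT}}\,\xi_{\mathrm{arc}}(T_{\mathrm{dam}}),
\quad
\xi_{\mathrm{arc}}(T_{\mathrm{dam}}) =
\frac{1-c}{(2E_d/0.8)^{b}}\,T_{\mathrm{dam}}^{\,b} + c
```

The efficiency function ξ_arc tends to 1 at low damage energies (recovering NRT) and to the asymptotic defect-survival fraction c (~1/3 in many metals, per the abstract) in large cascades.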

  10. Describing Site Amplification for Surface Waves in Realistic Basins

    NASA Astrophysics Data System (ADS)

    Bowden, D. C.; Tsai, V. C.

    2017-12-01

Standard characterizations of site-specific response assume a vertically incident shear wave; given a 1D velocity profile, amplification and resonances can be calculated from conservation of energy. A similar approach can be applied to surface waves, yielding an estimate of amplification relative to a hard-rock site that differs in both the amount of amplification and its frequency dependence. This prediction of surface-wave site amplification has been well validated through simple simulations, and in this presentation we explore the extent to which a 1D profile can explain observed amplifications in more realistic scenarios. Comparisons of various simple 2D and 3D simulations, for example, allow us to explore the effect of different basin shapes and the relative importance of effects such as focusing, conversion of wave types and lateral surface-wave resonances. Additionally, the 1D estimates for vertically incident shear waves and for surface waves are compared to spectral ratios of historic events in deep sedimentary basins to demonstrate the appropriateness of the two different predictions. The difference in amplification responses between the wave types implies that a single measurement of site response, whether analytically calculated from 1D models or empirically observed, is insufficient for regions where surface waves play a strong role.
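For a single soft layer over rock, the energy-conservation estimate for a vertically incident shear wave reduces to the square root of the impedance ratio. This is the textbook form, not the authors' surface-wave extension, and the densities and velocities below are invented for illustration:

```python
import math

def sh_amplification(rho_rock, beta_rock, rho_soil, beta_soil):
    """Energy-conserving amplification of a vertically incident SH wave
    relative to a hard-rock site: A = sqrt(rho_r * beta_r / (rho_s * beta_s)).
    Densities in kg/m^3, shear velocities in m/s."""
    return math.sqrt((rho_rock * beta_rock) / (rho_soil * beta_soil))

# Hypothetical basin: soft sediments (beta ~ 400 m/s) over hard rock
A = sh_amplification(rho_rock=2700.0, beta_rock=3000.0,
                     rho_soil=1900.0, beta_soil=400.0)
```

The corresponding surface-wave estimate discussed in the abstract replaces the body-wave impedance with the surface-wave energy flux of the layered profile, which is why its amplification and frequency dependence differ.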

  11. Dimits shift in realistic gyrokinetic plasma-turbulence simulations.

    PubMed

    Mikkelsen, D R; Dorland, W

    2008-09-26

In simulations of turbulent plasma transport due to long wavelength (k_⊥ρ_i ≤ 1) electrostatic drift-type instabilities, we find a persistent nonlinear up-shift of the effective threshold. Next-generation tokamaks will likely benefit from the higher effective threshold for turbulent transport, and transport models should incorporate suitable corrections to linear thresholds. The gyrokinetic simulations reported here are more realistic than previous reports of a Dimits shift because they include nonadiabatic electron dynamics, strong collisional damping of zonal flows, and finite electron and ion collisionality together with realistic shaped magnetic geometry. Reversing previously reported results based on idealized adiabatic electrons, we find that increasing collisionality reduces the heat flux because collisionality reduces the nonadiabatic electron microinstability drive.

  12. Atmospheric River Frequency and Intensity Changes in CMIP5 Climate Model Projections

    NASA Astrophysics Data System (ADS)

    Warner, M.; Mass, C.; Salathe, E. P., Jr.

    2012-12-01

Most extreme precipitation events that occur along the North American west coast are associated with narrow plumes of above-average water vapor concentration that stretch from the tropics or subtropics to the West Coast. These events generally occur during the wet season (October-March) and are referred to as atmospheric rivers (ARs). ARs can cause major river management problems, damage from flooding or landslides, and loss of life. It is currently unclear how these events will change in frequency and intensity as a result of climate change in the coming century. While climate models' global mean precipitation matches observations reasonably well in historical runs, precipitation frequency and intensity are generally poorly represented at local scales; however, synoptic-scale features are more realistically simulated by climate models, and AR events can be identified by extremely high values of integrated water vapor flux at points near the West Coast. There have been many recent studies indicating changes in synoptic-scale features under climate change that could have meaningful impacts on the frequency and intensity of ARs. In this study, a suite of CMIP5 models is used to analyze predicted changes in frequency and intensity of AR events impacting the West Coast from the contemporary period (1970-1999) to the end of this century (2070-2099). Generally, integrated water vapor is predicted to increase in these models (both the mean and extremes) while low-level wind decreases and upper-level wind increases. This study aims to determine the influence of these changes on precipitation intensity in AR events in future climate simulations.
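Identifying AR events by integrated water vapor flux, as described above, amounts to computing the integrated vapor transport IVT = (1/g)|∫ q V dp| and thresholding it; a value around 250 kg m⁻¹ s⁻¹ is a commonly used cutoff in the AR literature, though the study's exact criterion is not stated here. The sounding below is idealized:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m s^-2

def column_integral(f, p):
    """Trapezoidal integral of f over pressure levels p (Pa)."""
    dp = np.diff(p)
    return np.sum(0.5 * (f[:-1] + f[1:]) * dp)

def ivt(q, u, v, p):
    """Integrated water vapor transport magnitude (kg m^-1 s^-1).
    q: specific humidity (kg/kg); u, v: wind (m/s); p: pressure (Pa),
    surface to top, all on the same levels."""
    qu = column_integral(q * u, p) / G
    qv = column_integral(q * v, p) / G
    return np.hypot(qu, qv)

# Idealized moist sounding with a strong southwesterly low-level jet
p = np.array([100000., 85000., 70000., 50000., 30000.])
q = np.array([0.012, 0.009, 0.005, 0.002, 0.0005])
u = np.array([15., 25., 20., 15., 10.])
v = np.array([15., 25., 20., 15., 10.])
ivt_value = ivt(q, u, v, p)  # compare against an AR threshold of ~250
```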

  13. Gyrokinetic Simulations of Transport Scaling and Structure

    NASA Astrophysics Data System (ADS)

    Hahm, Taik Soo

    2001-10-01

There is accumulating evidence from global gyrokinetic particle simulations with profile variations and experimental fluctuation measurements that microturbulence, whose time-averaged eddy size scales with the ion gyroradius, can cause ion thermal transport which deviates from the gyro-Bohm scaling. The physics here can be best addressed by large scale (rho* = rho_i/a = 0.001) full torus gyrokinetic particle-in-cell turbulence simulations using our massively parallel, general geometry gyrokinetic toroidal code with field-aligned mesh. Simulation results from device-size scans for realistic parameters show that the ``wave transport'' mechanism is not the dominant contribution for this Bohm-like transport and that transport is mostly diffusive, driven by microscopic scale fluctuations in the presence of self-generated zonal flows. In this work, we analyze the turbulence and zonal flow statistics from simulations and compare to nonlinear theoretical predictions including the radial decorrelation of the transport events by zonal flows and the resulting probability distribution function (PDF). In particular, possible deviation of the characteristic radial size of transport processes from the time-averaged radial size of the density fluctuation eddies will be critically examined.

  14. Uncovering molecular processes in crystal nucleation and growth by using molecular simulation.

    PubMed

    Anwar, Jamshed; Zahn, Dirk

    2011-02-25

Exploring nucleation processes by molecular simulation provides a mechanistic understanding at the atomic level and also enables kinetic and thermodynamic quantities to be estimated. However, whilst the potential for modeling crystal nucleation and growth processes is immense, there are specific technical challenges to modeling. In general, rare events such as nucleation cannot be simulated using a direct "brute force" molecular dynamics approach. The limited time and length scales that are accessible by conventional molecular dynamics simulations have inspired a number of advances to tackle problems that were considered outside the scope of molecular simulation. While general insights and features could be explored from efficient generic models, new methods paved the way to realistic crystal nucleation scenarios. The association of single ions in solvent environments, the mechanisms of motif formation, ripening reactions, and the self-organization of nanocrystals can now be investigated at the molecular level. The analysis of interactions with growth-controlling additives gives a new understanding of functionalized nanocrystals and the precipitation of composite materials. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Immersive Learning Technologies: Realism and Online Authentic Learning

    ERIC Educational Resources Information Center

    Herrington, Jan; Reeves, Thomas C.; Oliver, Ron

    2007-01-01

    The development of immersive learning technologies in the form of virtual reality and advanced computer applications has meant that realistic creations of simulated environments are now possible. Such simulations have been used to great effect in training in the military, air force, and in medical training. But how realistic do problems need to be…

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Prescott, Steven; Coleman, Justin

This report describes the current progress and status related to the Industry Application #2 focusing on External Hazards. For this industry application within the Light Water Reactor Sustainability (LWRS) Program Risk-Informed Safety Margin Characterization (RISMC) R&D Pathway, we will create the Risk-Informed Margin Management (RIMM) approach to represent meaningful (i.e., realistic facility representation) event scenarios and consequences by using an advanced 3D facility representation that will evaluate external hazards such as flooding and earthquakes in order to identify, model and analyze the appropriate physics that needs to be included to determine plant vulnerabilities related to external events; manage the communication and interactions between different physics modeling and analysis technologies; and develop the computational infrastructure through tools related to plant representation, scenario depiction, and physics prediction. One of the unique aspects of the RISMC approach is how it couples probabilistic approaches (the scenario) with mechanistic phenomena representation (the physics) through simulation. This simulation-based modeling allows decision makers to focus on a variety of safety, performance, or economic metrics. In this report, we describe the evaluation of various physics toolkits related to flooding representation. Ultimately, we will be coupling the flooding representation with other events such as earthquakes in order to provide coupled physics analysis for scenarios where interactions exist.

  17. Exploring transmembrane transport through alpha-hemolysin with grid-steered molecular dynamics.

    PubMed

    Wells, David B; Abramkina, Volha; Aksimentiev, Aleksei

    2007-09-28

The transport of biomolecules across cell boundaries is central to cellular function. While structures of many membrane channels are known, the permeation mechanism is known only for a select few. Molecular dynamics (MD) is a computational method that can provide an accurate description of permeation events at the atomic level, which is required for understanding the transport mechanism. However, due to the relatively short time scales accessible to this method, it is of limited utility. Here, we present a method for all-atom simulation of electric field-driven transport of large solutes through membrane channels, which in tens of nanoseconds can provide a realistic account of a permeation event that would require a millisecond simulation using conventional MD. In this method, the average distribution of the electrostatic potential in a membrane channel under a transmembrane bias of interest is determined first from an all-atom MD simulation. This electrostatic potential, defined on a grid, is subsequently applied to a charged solute to steer its permeation through the membrane channel. We apply this method to investigate permeation of DNA strands, DNA hairpins, and alpha-helical peptides through alpha-hemolysin. To test the accuracy of the method, we computed the relative permeation rates of DNA strands having different sequences and global orientations. The results of the grid-steered MD (G-SMD) simulations were found to be in good agreement with experiment.
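The core idea above is steering a solute with forces derived from a precomputed potential grid rather than from a concurrent electrostatics calculation. A toy sketch of that step follows, using an invented harmonic potential and a nearest-grid-point lookup rather than the interpolation scheme of the actual G-SMD implementation:

```python
import numpy as np

def steering_forces(potential, spacing):
    """Precompute the force field F = -grad(U) on a regular 3D grid
    by central finite differences."""
    grads = np.gradient(potential, spacing)
    return -np.stack(grads, axis=-1)

def force_at(forces, pos, spacing):
    """Nearest-grid-point lookup of the steering force for a particle
    at Cartesian position pos (same length units as spacing)."""
    idx = tuple(int(round(x / spacing)) for x in pos)
    return forces[idx]

# Toy potential: harmonic well, U = 0.5*k*r^2, centered in a 21^3 grid
n, h, k = 21, 0.5, 2.0
ax = np.arange(n) * h
c = (n // 2) * h
X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
U = 0.5 * k * ((X - c) ** 2 + (Y - c) ** 2 + (Z - c) ** 2)
F = steering_forces(U, h)
f = force_at(F, (6.0, 5.0, 5.0), h)  # 1 unit off-center in x: expect ~(-k, 0, 0)
```

In an MD integration loop this force would simply be added to the conventional force-field forces acting on the steered atoms at each timestep.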

  18. Gravitational Reference Sensor Front-End Electronics Simulator for LISA

    NASA Astrophysics Data System (ADS)

    Meshksar, Neda; Ferraioli, Luigi; Mance, Davor; ten Pierick, Jan; Zweifel, Peter; Giardini, Domenico; LISA Pathfinder collaboration

  19. Evacuation Simulation in Kalayaan Residence Hall, UP Diliman Using GAMA Simulation Software

    NASA Astrophysics Data System (ADS)

    Claridades, A. R. C.; Villanueva, J. K. S.; Macatulad, E. G.

    2016-09-01

    Agent-Based Modeling (ABM) has recently been adopted in studies that model events as a dynamic system given a set of agents and parameters. In principle, ABM assigns individual agents attributes and behaviors and simulates their interaction with their environment and with other agents. This can be a useful tool in both micro- and macro-scale applications. In this study, a model initially created and applied to an academic building was implemented in a dormitory. In particular, this research integrates a three-dimensional Geographic Information System (GIS) with GAMA as the multi-agent-based evacuation simulation and is implemented in Kalayaan Residence Hall. A three-dimensional GIS model is created based on the floor plans and demographic data of the dormitory, including the respective pathways as networks, rooms, floors, exits and appropriate attributes. This model is then re-implemented in GAMA. Different states of the agents and their effect on evacuation time were then observed. GAMA simulations with varying path widths were also run. It was found that, compared to their original states, panic, eating and studying hasten evacuation, whereas sleeping and being in the bathrooms impede it. It is also concluded that evacuation time is halved when path widths are doubled; however, it is recommended that further studies model pathways as spaces instead of lines. A more scientific basis for predicting agent behavior in these states is also recommended for more realistic results.

  20. A systematic comparison of recurrent event models for application to composite endpoints.

    PubMed

    Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine

    2018-01-04

Many clinical trials focus on the comparison of the treatment effect between two or more groups concerning a rarely occurring event. In this situation, showing a relevant effect with an acceptable power requires the observation of a large number of patients over a long period of time. For feasibility reasons, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints could be analyzed by models for recurrent events. There exist a number of such models, e.g. regression models based on count data or Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods have already been compared in the literature, there exists no systematic investigation of the special requirements regarding composite endpoints. Within this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators which can considerably deviate under commonly met data scenarios. Based on the conducted simulation study, this paper helps to understand the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
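The underlying distinction — time-to-first-event analysis discards recurrences, while count-based models in the spirit of Andersen-Gill use every event — can be illustrated with a toy Poisson simulation. The rates and sample sizes below are invented, not the paper's scenarios:

```python
import numpy as np

rng = np.random.default_rng(42)
n, T = 5000, 2.0  # patients per arm, follow-up time in years

# Recurrent events per patient as homogeneous Poisson processes
control = rng.poisson(1.0 * T, n)  # rate 1.0 events/year
treated = rng.poisson(0.5 * T, n)  # rate 0.5 events/year

# Count-based analysis (Poisson model, Andersen-Gill-like):
# the rate ratio uses every observed event
rate_ratio = treated.mean() / control.mean()

# Time-to-first-event analysis: only whether a first event occurred.
# With censoring at T, P(any event) = 1 - exp(-rate*T), so the implied
# ratio of exponential rates is:
p_c = (control > 0).mean()
p_t = (treated > 0).mean()
first_event_ratio = np.log(1.0 - p_t) / np.log(1.0 - p_c)
```

Both estimators recover the true rate ratio of 0.5 in this simple setting, but the count-based estimate draws on roughly twice as many events and is correspondingly less variable, which is the information loss the abstract refers to.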

  1. Knowledge-based simulation using object-oriented programming

    NASA Technical Reports Server (NTRS)

    Sidoran, Karen M.

    1993-01-01

    Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.

  2. Virtual reality in neurosurgical education: part-task ventriculostomy simulation with dynamic visual and haptic feedback.

    PubMed

    Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T

    2007-07-01

    Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical resident operative exposure. There is need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient safety constraints. Computer-based, virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic, three-dimensional stereoscopic visualization, and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof-of-concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.

  3. How sleep problems contribute to simulator sickness: Preliminary results from a realistic driving scenario.

    PubMed

    Altena, Ellemarije; Daviaux, Yannick; Sanz-Arigita, Ernesto; Bonhomme, Emilien; de Sevin, Étienne; Micoulaud-Franchi, Jean-Arthur; Bioulac, Stéphanie; Philip, Pierre

    2018-04-17

Virtual reality and simulation tools enable us to assess daytime functioning in environments that simulate real life as closely as possible. Simulator sickness, however, poses a problem in the application of these tools, and has been related to pre-existing health problems. How sleep problems contribute to simulator sickness has not yet been investigated. In the current study, 20 female chronic insomnia patients and 32 female age-matched controls drove in a driving simulator covering realistic city, country and highway scenes. Fifty percent of the insomnia patients, as opposed to 12.5% of controls, reported excessive simulator sickness leading to experiment withdrawal. In the remaining participants, patients with insomnia showed overall increased levels of oculomotor symptoms even before driving, while nausea symptoms further increased after driving. These results, as well as the realistic simulation paradigm developed, give more insight into how vestibular, oculomotor and interoceptive functions are affected in insomnia. Importantly, our results have direct implications for both the actual driving experience and the wider context of deploying simulation techniques to mimic real-life functioning, in particular in those professions often exposed to sleep problems. © 2018 European Sleep Research Society.

  4. Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments

    PubMed Central

    Slater, Mel

    2009-01-01

    In this paper, I address the question as to why participants tend to respond realistically to situations and events portrayed within an immersive virtual reality system. The idea is put forward, based on the experience of a large number of experimental studies, that there are two orthogonal components that contribute to this realistic response. The first is ‘being there’, often called ‘presence’, the qualia of having a sensation of being in a real place. We call this place illusion (PI). Second, plausibility illusion (Psi) refers to the illusion that the scenario being depicted is actually occurring. In the case of both PI and Psi the participant knows for sure that they are not ‘there’ and that the events are not occurring. PI is constrained by the sensorimotor contingencies afforded by the virtual reality system. Psi is determined by the extent to which the system can produce events that directly relate to the participant, the overall credibility of the scenario being depicted in comparison with expectations. We argue that when both PI and Psi occur, participants will respond realistically to the virtual reality. PMID:19884149

  5. Virtual Reality Simulation Training for Ebola Deployment.

    PubMed

    Ragazzoni, Luca; Ingrassia, Pier Luigi; Echeverri, Lina; Maccapani, Fabio; Berryman, Lizzy; Burkle, Frederick M; Della Corte, Francesco

    2015-10-01

    Both virtual and hybrid simulation training offer a realistic and effective educational framework and opportunity to provide virtual exposure to operational public health skills that are essential for infection control and Ebola treatment management. This training is designed to increase staff safety and create a safe and realistic environment where trainees can gain essential basic and advanced skills.

  6. On the accuracy of mass measurement for microlensing black holes as seen by Gaia and OGLE

    NASA Astrophysics Data System (ADS)

    Rybicki, Krzysztof A.; Wyrzykowski, Łukasz; Klencki, Jakub; de Bruijne, Jos; Belczyński, Krzysztof; Chruślińska, Martyna

    2018-05-01

We investigate the impact of combining Gaia astrometry from space with precise, high cadence OGLE photometry from the ground. For the archival event OGLE3-ULENS-PAR-02, which is likely a black hole, we simulate a realistic astrometric time series of Gaia measurements and combine it with the real photometric data collected by the OGLE project. We predict that at the end of the nominal 5 yr of the Gaia mission, for events brighter than G ≈ 15.5 mag at the baseline, caused by objects heavier than 10 M⊙, it will be possible to unambiguously derive masses of the lenses, with accuracy between a few and 15 per cent. We find that fainter events (G < 17.5) can still have their lens masses determined, provided that they are heavier than 30 M⊙. We estimate that the rate of astrometric microlensing events caused by stellar-origin black holes is ≈4 × 10⁻⁷ yr⁻¹, which implies that after 5 yr of Gaia operation and ≈5 × 10⁶ bright sources in Gaia, it will be possible to identify a few such events in the Gaia final catalogues.
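The mass determination itself rests on the standard microlensing relation combining the angular Einstein radius θ_E (from astrometry) with the microlens parallax π_E (from photometry): M = θ_E/(κ π_E), with κ = 4G/(c² au) ≈ 8.144 mas M⊙⁻¹. The numbers below are illustrative, not values from the event:

```python
KAPPA = 8.144  # mas per solar mass; kappa = 4G / (c^2 * 1 au)

def lens_mass(theta_e_mas, pi_e):
    """Microlens mass in solar masses from the angular Einstein radius
    theta_E (mas) and the dimensionless microlens parallax pi_E."""
    return theta_e_mas / (KAPPA * pi_e)

# Illustrative black-hole-like lens: large theta_E, small parallax
mass = lens_mass(theta_e_mas=2.0, pi_e=0.025)  # roughly 10 solar masses
```

This is why the combination matters: photometry alone typically constrains π_E (and the event timescale), while the astrometric wobble measured by Gaia supplies θ_E, closing the system for the mass.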

  7. Attribution of precipitation changes in African rainforest

    NASA Astrophysics Data System (ADS)

    Otto, F. E. L.; Allen, M. R.; Bowery, A.; Imbers, J.; Jones, R.; Massey, N.; Miller, J.; Rosier, S.; Rye, C.; Thurston, M.; Wilson, S.; Yamazaki, H.

    2012-04-01

    Global climate change is almost certainly affecting the magnitude and frequency of extreme weather and hydrological events. However, whether and to what extend the occurrence of such an event can be attributed to climate change remains a challenge that relies on good observations as well as climate modelling. A number of recent studies have attempted to quantify the role of human influence on climate in observed weather events as e.g. the 2010 Russian heat wave (Dole et al, 2011; Rahmstorf and Coumou, 2011; Otto et al, 2012). The overall approach is to simulate, with as realistic a model as possible and accounting as far as possible for modelling uncertainties, both the statistics of observed weather and the statistics of the weather that would have obtained had specific external drivers of climate change been absent. This approach requires a large ensemble size to provide results from which the statistical significance and the shape of the distribution of key variables can be assessed. Also, a sufficiently long period of time must be simulated to evaluate model bias and whether the model captures the observed distribution. The weatherathome.net within the climateprediction.net projects provides such an ensemble with many hundred ensemble members per year via volunteer distributed computing. Most previous attribution studies have been about European extreme weather events but the most vulnerable regions to climate change are in Asia and Africa. One of the most complex hydrological systems is the tropical rainforest, which is expected to react highly sensible to a changing climate. Analysing the weatherathome.net results we find that conditions which are too dry for rainforests to sustain without damages occurred more frequently and more severe in recent years. Furthermore the changes in precipitation in that region can be linked to El Nino/ La Nina events. 
Linking extreme weather events to large-scale teleconnections helps to understand the occurrence of these events and provides insights for developing forecast methods, including in regions with sparse observational data. We present an important step towards quantifying the link between climate change and extreme weather, which is central both to the formulation of evidence-based adaptation policies and to a realistic assessment of the true cost of greenhouse gas emissions, other forms of pollution and land-use change. Dole, R., M. Hoerling, J. Perlwitz, J. Eischeid, P. Pegion, T. Zhang, X.-W. Quan, T. Xu, and D. Murray (2011): Was there a basis for anticipating the 2010 Russian Heat Wave?, GRL 38:L06702. Otto, F. E. L., N. Massey, R. Jones, G. J. van Oldenborgh, and M. R. Allen (2012): Reconciling two approaches to attribution of the 2010 Russian heat wave, GRL, under revision. Rahmstorf, S., and D. Coumou (2011): Increase of extreme events in a warming world, PNAS, early edition.
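
    The core attribution statistic in such studies is the change in the probability of exceeding an observed event magnitude between the factual and counterfactual ensembles. A minimal sketch in Python, with synthetic stand-ins for the two ensembles (the distributions and threshold below are illustrative, not from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two large single-season ensembles: "actual"
# (all forcings) and "natural" (counterfactual) seasonal-mean temperature
# anomalies in degC. Distributions and threshold are illustrative only.
actual = rng.normal(1.0, 1.5, size=5000)
natural = rng.normal(0.0, 1.5, size=5000)
threshold = 3.0  # e.g. the anomaly observed during a heat wave

p1 = np.mean(actual >= threshold)   # exceedance probability, factual world
p0 = np.mean(natural >= threshold)  # exceedance probability without human influence

risk_ratio = p1 / p0
far = 1.0 - p0 / p1                 # fraction of attributable risk
print(f"P1 = {p1:.4f}, P0 = {p0:.4f}, RR = {risk_ratio:.2f}, FAR = {far:.2f}")
```

    The risk ratio P1/P0 and the fraction of attributable risk 1 - P0/P1 are the quantities typically reported; large ensembles are needed precisely so that P0 and P1 for rare thresholds are estimated from adequate event counts.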

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ku, S.; Chang, C. S.; Hager, R.

    Here, a fast edge turbulence suppression event has been simulated in the electrostatic version of the gyrokinetic particle-in-cell code XGC1 in a realistic diverted tokamak edge geometry under neutral particle recycling. The results show that turbulent Reynolds stress followed by neoclassical ion orbit loss together conspire to form the sustaining radial electric field shear and to quench turbulent transport just inside the last closed magnetic flux surface. As a result, the main suppression action is located in a thin radial layer around ψN ≃ 0.96–0.98, where ψN is the normalized poloidal flux, with a time scale of ~0.1 ms.

  9. Simulation of Combustion Systems with Realistic g-jitter

    NASA Technical Reports Server (NTRS)

    Mell, William E.; McGrattan, Kevin B.; Baum, Howard R.

    2003-01-01

    In this project a transient, fully three-dimensional computer simulation code was developed to simulate the effects of realistic g-jitter on a number of combustion systems. The simulation code is capable of simulating flame spread over a solid, and nonpremixed or premixed gaseous combustion, in nonturbulent flow with simple combustion models. Simple combustion models were used to preserve computational efficiency, since this is meant to be an engineering code. Also, the use of sophisticated turbulence models was not pursued (a simple Smagorinsky-type model can be implemented if deemed appropriate) because, if flow velocities are large enough for turbulence to develop in a reduced-gravity combustion scenario, it is unlikely that g-jitter disturbances (in NASA's reduced-gravity facilities) will play an important role in the flame dynamics. Acceleration disturbances of realistic orientation, magnitude, and time dependence can be easily included in the simulation. The simulation algorithm was based on techniques used in an existing large eddy simulation code which has successfully simulated fire dynamics in complex domains. A series of simulations with measured and predicted acceleration disturbances on the International Space Station (ISS) is presented. The results of this series of simulations suggested that a passive isolation system and appropriate scheduling of crew activity would provide a sufficiently "quiet" acceleration environment for spherical diffusion flames.

  10. Magnetopause Losses of Radiation Belt Electrons During a Recent Magnetic Storm

    NASA Astrophysics Data System (ADS)

    Lemon, C. L.; Chen, M.; Roeder, J. L.; Fennell, J. F.; Mulligan, T. L.; Claudepierre, S. G.

    2013-12-01

    We present results from Van Allen Probes observations during the magnetic storm of June 1, 2013, and compare them with simulations of the same event using the RCM-E model. The RCM-E calculates ion and electron transport in self-consistently computed electric and magnetic fields. We examine the effect of the perturbed ring current magnetic field on the transport of energetic electrons, and the significance of this transport for explaining the observed evolution of radiation belt fluxes during this event. The event is notable because it is a relatively simple storm in which strong convection persists for approximately 7 hours, injecting a moderately strong ring current (minimum Dst of -120 nT); convection then quickly shuts off, leading to a long and smooth recovery phase. We use RCM-E simulations, constrained by Van Allen Probes data, to assess the rate of magnetopause losses of electrons (magnetopause shadowing), and to calculate electron drift times and the evolution of electron phase space densities during the storm event. We recently modified the RCM-E plasma drift calculations to include relativistic treatment of electrons and a more realistic electron loss model. The new electron loss model, although still somewhat simplistic, gives much more accurate loss rates in the inner magnetosphere (including the radiation belts), which significantly affects the resulting electron fluxes compared to previous simulations. This, in turn, modifies the transport of ions and electrons via feedback with both the electric and magnetic fields. Our results highlight the effect of the ring current on the evolution of the radiation belt electrons, with particular emphasis on the role that magnetopause losses play in the observed variation of radiation belt electron fluxes during the storm.

  11. Weather extremes in very large, high-resolution ensembles: the weatherathome experiment

    NASA Astrophysics Data System (ADS)

    Allen, M. R.; Rosier, S.; Massey, N.; Rye, C.; Bowery, A.; Miller, J.; Otto, F.; Jones, R.; Wilson, S.; Mote, P.; Stone, D. A.; Yamazaki, Y. H.; Carrington, D.

    2011-12-01

    Resolution and ensemble size are often seen as alternatives in climate modelling. Models with sufficient resolution to simulate many classes of extreme weather cannot normally be run often enough to assess the statistics of rare events, still less how these statistics may be changing. As a result, assessments of the impact of external forcing on regional climate extremes must be based either on statistical downscaling from relatively coarse-resolution models, or on statistical extrapolation from 10-year to 100-year events. Under the weatherathome experiment, part of the climateprediction.net initiative, we have compiled the Met Office Regional Climate Model HadRM3P to run at 25 and 50 km resolution on personal computers volunteered by the general public, embedded within the HadAM3P global atmosphere model. With a global network of about 50,000 volunteers, this allows us to run time-slice ensembles of essentially unlimited size, exploring the statistics of extreme weather under a range of scenarios for surface forcing and atmospheric composition, allowing for uncertainty in both boundary conditions and model parameters. Current experiments, developed with the support of Microsoft Research, focus on three regions: the Western USA, Europe and Southern Africa. We initially simulate the period 1959-2010 to establish which variables are realistically simulated by the model and on what scales. Our next experiments focus on the Event Attribution problem, exploring how the probability of various types of extreme weather would have been different over the recent past in a world unaffected by human influence, following the design of Pall et al. (2011) but extended to a longer period and higher spatial resolution. We will present the first results of this unique, global, participatory experiment and discuss the implications for the attribution of recent weather events to anthropogenic influence on climate.
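
    The advantage of a very large time-slice ensemble is that return periods of rare events can be read off empirically rather than extrapolated. A hedged sketch, using a synthetic Gumbel-distributed ensemble in place of actual HadRM3P output:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical large time-slice ensemble: one seasonal maximum per member
# (synthetic Gumbel values standing in for model output).
ens = rng.gumbel(loc=30.0, scale=2.0, size=20000)

# Each member is one plausible realisation of a single season, so the
# exceedance probability of the m-th ranked value is ~ m / (n + 1).
x = np.sort(ens)[::-1]           # ranked largest-first
n = x.size
m = np.arange(1, n + 1)
return_period = (n + 1) / m      # in "seasons"

# e.g. the magnitude of the ~100-season event, read off directly
idx = np.argmin(np.abs(return_period - 100.0))
print(f"~100-season event magnitude: {x[idx]:.1f}")
```

    With 20,000 members, the 100-season event is estimated from roughly 200 exceedances rather than from a fitted tail, which is the point of running ensembles of "essentially unlimited size".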

  12. Realistic full wave modeling of focal plane array pixels

    DOE PAGES

    Campione, Salvatore; Warne, Larry K.; Jorgenson, Roy E.; ...

    2017-11-01

    Here, we investigate full-wave simulations of realistic implementations of multifunctional nanoantenna enabled detectors (NEDs). We focus on a 2x2 pixelated array structure that supports two wavelengths of operation. We design each resonating structure independently using full-wave simulations with periodic boundary conditions mimicking the whole infinite array. We then construct a supercell made of a 2x2 pixelated array with periodic boundary conditions mimicking the full NED; in this case, however, each pixel comprises 10-20 antennas per side. In this way, the cross-talk between contiguous pixels is accounted for in our simulations. We observe that, even though there are finite extent effects, the pixels work as designed, each responding at the respective wavelength of operation. This allows us to stress that realistic simulations of multifunctional NEDs need to be performed to verify the design functionality by taking into account finite extent and cross-talk effects.

  13. Full Quantum Dynamics Simulation of a Realistic Molecular System Using the Adaptive Time-Dependent Density Matrix Renormalization Group Method.

    PubMed

    Yao, Yao; Sun, Ke-Wei; Luo, Zhen; Ma, Haibo

    2018-01-18

    The accurate theoretical interpretation of ultrafast time-resolved spectroscopy experiments relies on full quantum dynamics simulations of the investigated system, which are nevertheless computationally prohibitive for realistic molecular systems with a large number of electronic and/or vibrational degrees of freedom. In this work, we propose a unitary transformation approach for realistic vibronic Hamiltonians, which can then be treated with the adaptive time-dependent density matrix renormalization group (t-DMRG) method to efficiently evolve the nonadiabatic dynamics of a large molecular system. We demonstrate the accuracy and efficiency of this approach with an example of simulating the exciton dissociation process within an oligothiophene/fullerene heterojunction, indicating that t-DMRG can be a promising method for full quantum dynamics simulation in large chemical systems. Moreover, it is also shown that the proper vibronic features in the ultrafast electronic process can be obtained by simulating the two-dimensional (2D) electronic spectrum by virtue of the high computational efficiency of the t-DMRG method.

  14. Characterizing differences in precipitation regimes of extreme wet and dry years: implications for climate change experiments.

    PubMed

    Knapp, Alan K; Hoover, David L; Wilcox, Kevin R; Avolio, Meghan L; Koerner, Sally E; La Pierre, Kimberly J; Loik, Michael E; Luo, Yiqi; Sala, Osvaldo E; Smith, Melinda D

    2015-02-03

    Climate change is intensifying the hydrologic cycle and is expected to increase the frequency of extreme wet and dry years. Beyond precipitation amount, extreme wet and dry years may differ in other ways, such as the number of precipitation events, event size, and the time between events. We assessed 1614 long-term (100 year) precipitation records from around the world to identify key attributes of precipitation regimes, besides amount, that distinguish statistically extreme wet from extreme dry years. In general, in regions where mean annual precipitation (MAP) exceeded 1000 mm, precipitation amounts in extreme wet and dry years differed from average years by ~40% and 30%, respectively. The magnitude of these deviations increased to >60% for dry years and to >150% for wet years in arid regions (MAP<500 mm). Extreme wet years were primarily distinguished from average and extreme dry years by the presence of multiple extreme (large) daily precipitation events (events >99th percentile of all events); these occurred twice as often in extreme wet years compared to average years. In contrast, these large precipitation events were rare in extreme dry years. Less important for distinguishing extreme wet from dry years were mean event size and frequency, or the number of dry days between events. However, extreme dry years were distinguished from average years by an increase in the number of dry days between events. These precipitation regime attributes consistently differed between extreme wet and dry years across 12 major terrestrial ecoregions from around the world, from deserts to the tropics. Thus, we recommend that climate change experiments and model simulations incorporate these differences in key precipitation regime attributes, as well as amount, into treatments. This will allow experiments to more realistically simulate extreme precipitation years and more accurately assess the ecological consequences. © 2015 John Wiley & Sons Ltd.
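
    The event-based attributes discussed here (event number and size, extreme events above the 99th percentile, dry-day intervals) can all be extracted from a daily record with a few lines of array code. A sketch with a synthetic one-year record standing in for a station gauge (note that Knapp et al. compute the 99th percentile from the full long-term record, not a single year as done here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily precipitation record (mm/day) for one year: wet days
# occur with probability 0.3 and wet-day amounts are gamma-distributed.
wet = rng.random(365) < 0.3
daily = np.where(wet, rng.gamma(0.7, 12.0, 365), 0.0)

events = daily[daily > 0.0]
annual_total = daily.sum()
n_events = events.size
mean_event_size = events.mean()

# "Extreme" daily events: larger than the 99th percentile of all events.
p99 = np.percentile(events, 99)
n_extreme_events = int((events > p99).sum())

# Dry-spell lengths: runs of consecutive dry days between events.
dry = np.concatenate(([0], (daily == 0.0).astype(int), [0]))
edges = np.diff(dry)
spell_lengths = np.flatnonzero(edges == -1) - np.flatnonzero(edges == 1)

print(annual_total, n_events, mean_event_size, n_extreme_events, spell_lengths.mean())
```

    Comparing these attributes between the wettest and driest years of a record, rather than totals alone, is what the authors recommend for designing realistic rainfall-manipulation treatments.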

  15. sedFlow - a tool for simulating fractional bedload transport and longitudinal profile evolution in mountain streams

    NASA Astrophysics Data System (ADS)

    Heimann, F. U. M.; Rickenmann, D.; Turowski, J. M.; Kirchner, J. W.

    2015-01-01

    Especially in mountainous environments, the prediction of sediment dynamics is important for managing natural hazards, assessing in-stream habitats and understanding geomorphic evolution. We present the new modelling tool sedFlow for simulating fractional bedload transport dynamics in mountain streams. sedFlow is a one-dimensional model that aims to realistically reproduce the total transport volumes and overall morphodynamic changes resulting from sediment transport events such as major floods. The model is intended for temporal scales from the individual event (several hours to a few days) up to the longer-term evolution of stream channels (several years). The envisaged spatial scale covers complete catchments at a spatial discretisation of several tens of metres to a few hundreds of metres. sedFlow can deal with the effects of streambeds that slope uphill in a downstream direction and uses recently proposed and tested approaches for quantifying macro-roughness effects in steep channels. sedFlow offers different options for bedload transport equations, flow-resistance relationships and other elements, which can be selected to fit the application in a particular catchment. Local grain-size distributions are dynamically adjusted according to the transport dynamics of each grain-size fraction. sedFlow features fast calculations and straightforward pre- and postprocessing of simulation data. The high simulation speed allows for simulations of several years, which can be used, e.g., to assess the long-term impact of river engineering works or climate change effects. In combination with the straightforward pre- and postprocessing, the fast calculations facilitate efficient workflows for the simulation of individual flood events, because the modeller gets immediate results as direct feedback on the selected parameter inputs.
The model is provided together with its complete source code free of charge under the terms of the GNU General Public License (GPL) (www.wsl.ch/sedFlow). Examples of the application of sedFlow are given in a companion article by Heimann et al. (2015).
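
    As an illustration of the kind of computation a one-dimensional bedload model performs at its core, the sketch below advances a bed profile with the Exner equation and a Meyer-Peter and Mueller transport rate. This is a generic single-fraction toy under a constant flow depth, not sedFlow's actual numerics or code:

```python
import numpy as np

# Toy 1-D bed evolution: Meyer-Peter & Mueller bedload driven by a
# depth-slope Shields stress, coupled to the Exner equation. Flow depth is
# held constant -- a strong simplification adopted only for illustration.
g, s, d = 9.81, 2.65, 0.02       # gravity (m/s^2), specific gravity, grain size (m)
porosity, theta_c = 0.4, 0.047   # bed porosity, critical Shields stress
dx, dt = 50.0, 10.0              # grid spacing (m), time step (s)
h = 1.0                          # constant flow depth (m), assumed

z = np.linspace(10.0, 0.0, 21)   # initial bed elevation (m): uniform slope
z[10] += 0.5                     # a bump that transport should smooth out

for _ in range(1000):
    slope = np.clip(-np.diff(z) / dx, 0.0, None)   # face slopes (adverse -> 0)
    theta = h * slope / ((s - 1.0) * d)            # Shields stress
    qb = 8.0 * np.clip(theta - theta_c, 0.0, None) ** 1.5 * np.sqrt((s - 1.0) * g * d**3)
    qb = np.concatenate(([qb[0]], qb, [qb[-1]]))   # zero-gradient boundary fluxes
    z -= dt / (1.0 - porosity) * np.diff(qb) / dx  # Exner: bed change from flux divergence

print(f"bump height after 1000 steps: {z[10] - 5.0:+.3f} m")
```

    A production model such as sedFlow layers fractional transport, hiding functions, macro-roughness corrections and adaptive hydraulics on top of this basic flux-divergence update.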

  16. The response of an ocean general circulation model to surface wind stress produced by an atmospheric general circulation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, B.; Schneider, E.K.

    1995-10-01

    Two surface wind stress datasets for 1979-91, one based on observations and the other from an integration of the COLA atmospheric general circulation model (AGCM) with prescribed SST, are used to drive the GFDL ocean general circulation model. These two runs are referred to as the control and COLA experiments, respectively. Simulated SST and upper-ocean heat contents (HC) in the tropical Pacific Ocean are compared with observations and between experiments. Both simulations reproduced the observed mean SST and HC fields as well as their annual cycles realistically. Major errors common to both runs are colder than observed SST in the eastern equatorial ocean and HC in the western Pacific south of the equator, with errors generally larger in the COLA experiment. New errors arising from the AGCM wind forcing include higher SST near the South American coast throughout the year and weaker HC gradients along the equator in boreal spring. The former is associated with suppressed coastal upwelling due to weak alongshore AGCM winds, and the latter is caused by weaker equatorial easterlies in boreal spring. The low-frequency ENSO fluctuations are also realistic in both runs. Correlations between the observed and simulated SST anomalies from the COLA simulation are as high as those from the control run in the central equatorial Pacific. A major problem in the COLA simulation is the appearance of unrealistic tropical cold anomalies during the boreal spring of mature El Niño years. These anomalies propagate along the equator from the western Pacific to the eastern coast in about three months, and temporarily eliminate the warm SST and HC anomalies in the eastern Pacific. This erroneous oceanic response in the COLA simulation is caused by a reversal of the westerly wind anomalies on the equator, associated with an unrealistic southward shift of the ITCZ in boreal spring during El Niño events. 66 refs., 16 figs.

  17. Realistic micromechanical modeling and simulation of two-phase heterogeneous materials

    NASA Astrophysics Data System (ADS)

    Sreeranganathan, Arun

    This dissertation research focuses on micromechanical modeling and simulations of two-phase heterogeneous materials exhibiting anisotropic and non-uniform microstructures with long-range spatial correlations. Completed work involves development of methodologies for realistic micromechanical analyses of materials using a combination of stereological techniques, two- and three-dimensional digital image processing, and finite-element-based modeling tools. The methodologies are developed via their application to two technologically important material systems, namely, discontinuously reinforced aluminum composites containing silicon carbide particles as reinforcement, and boron-modified titanium alloys containing in situ formed titanium boride whiskers. Microstructural attributes such as the shape, size, volume fraction, and spatial distribution of the reinforcement phase in these materials were incorporated in the models without any simplifying assumptions. Instrumented indentation was used to determine the constitutive properties of individual microstructural phases. Micromechanical analyses were performed using realistic 2D and 3D models, and the results were compared with experimental data. Results indicated that 2D models fail to capture the deformation behavior of these materials and that 3D analyses are required for realistic simulations. The effect of clustering of silicon carbide particles and associated porosity on the mechanical response of discontinuously reinforced aluminum composites was investigated using 3D models. Parametric studies were carried out using computer-simulated microstructures incorporating realistic microstructural attributes.
The intrinsic merit of this research is the development and integration of the required enabling techniques and methodologies for representation, modeling, and simulations of complex geometry of microstructures in two- and three-dimensional space facilitating better understanding of the effects of microstructural geometry on the mechanical behavior of materials.

  18. The Electrostatic Instability for Realistic Pair Distributions in Blazar/EBL Cascades

    NASA Astrophysics Data System (ADS)

    Vafin, S.; Rafighi, I.; Pohl, M.; Niemiec, J.

    2018-04-01

    This work revisits the electrostatic instability for blazar-induced pair beams propagating through the intergalactic medium (IGM) using linear analysis and PIC simulations. We study the impact of the realistic distribution function of pairs resulting from the interaction of high-energy gamma-rays with the extragalactic background light. We present analytical and numerical calculations of the linear growth rate of the instability for arbitrary orientation of the wave vectors. Our results explicitly demonstrate that the finite angular spread of the beam dramatically affects the growth rate of the waves, leading to the fastest growth for wave vectors quasi-parallel to the beam direction and a growth rate at oblique directions that is only a factor of 2–4 smaller compared to the maximum. To study the nonlinear beam relaxation, we performed PIC simulations that take into account a realistic wide-energy distribution of beam particles. The parameters of the simulated beam-plasma system provide an adequate physical picture that can be extrapolated to realistic blazar-induced pairs. In our simulations, the beam loses only 1% of its energy, and we analytically estimate that the beam would lose its total energy over about 100 simulation times. An analytical scaling is then used to extrapolate the parameters of realistic blazar-induced pair beams. We find that they can dissipate their energy slightly faster by the electrostatic instability than through inverse-Compton scattering. The uncertainties arising from, e.g., details of the primary gamma-ray spectrum are too large to make firm statements for individual blazars, and an analysis based on their specific properties is required.
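
    For orientation, the characteristic scale of such instabilities can be estimated from the cold-beam reactive two-stream growth rate, γ_max = (√3/2^(4/3)) (n_b/n_p)^(1/3) ω_p; the kinetic calculation in the paper modifies this substantially for warm, dilute relativistic beams. The densities below are assumed, order-of-magnitude IGM-like values, not numbers taken from the paper:

```python
import numpy as np

# Order-of-magnitude sketch (not the paper's kinetic calculation):
# maximum growth rate of the cold-beam reactive two-stream instability.
e = 1.602e-19     # elementary charge (C)
eps0 = 8.854e-12  # vacuum permittivity (F/m)
m_e = 9.109e-31   # electron mass (kg)

n_p = 1e-7 * 1e6   # IGM electron density (m^-3), assumed
n_b = 1e-22 * 1e6  # pair-beam density (m^-3), assumed

omega_p = np.sqrt(n_p * e**2 / (eps0 * m_e))  # background plasma frequency (rad/s)
gamma_max = (np.sqrt(3.0) / 2.0**(4.0 / 3.0)) * (n_b / n_p)**(1.0 / 3.0) * omega_p
print(f"omega_p = {omega_p:.3e} rad/s, gamma_max = {gamma_max:.3e} 1/s")
```

    The cube-root dependence on the density ratio is why even extremely dilute pair beams can grow waves on astrophysically short timescales.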

  19. Clathrate structure-type recognition: Application to hydrate nucleation and crystallisation

    NASA Astrophysics Data System (ADS)

    Lauricella, Marco; Meloni, Simone; Liang, Shuai; English, Niall J.; Kusalik, Peter G.; Ciccotti, Giovanni

    2015-06-01

    For clathrate-hydrate polymorphic structure-type (sI versus sII), geometric recognition criteria have been developed and validated. These are applied to the study of the rich interplay and development of both sI and sII motifs in a variety of hydrate-nucleation events for methane and H2S hydrate studied by direct and enhanced-sampling molecular dynamics (MD) simulations. In the case of nucleation of methane hydrate from enhanced-sampling simulation, we notice that already at the transition state, ˜80% of the enclathrated CH4 molecules are contained in a well-structured (sII) clathrate-like crystallite. For direct MD simulation of nucleation of H2S hydrate, some sI/sII polymorphic diversity was encountered, and it was found that a realistic dissipation of the nucleation energy (in view of non-equilibrium relaxation to either microcanonical (NVE) or isothermal-isobaric (NPT) distributions) is important to determine the relative propensity to form sI versus sII motifs.

  20. Astrophysical neutrino production diagnostics with the Glashow resonance

    NASA Astrophysics Data System (ADS)

    Biehl, Daniel; Fedynitch, Anatoli; Palladino, Andrea; Weiler, Tom J.; Winter, Walter

    2017-01-01

    We study the Glashow resonance ν̄_e + e⁻ → W⁻ → hadrons at 6.3 PeV as a diagnostic of the production processes of ultra-high-energy neutrinos. The focus lies on describing the physics of neutrino production from pion decay as accurately as possible by including the kinematics of weak decays and Monte Carlo simulations of pp and pγ interactions. We discuss optically thick (to photohadronic interactions) sources, sources of cosmic-ray "nuclei" and muon-damped sources. Even in the proposed upgrade IceCube-Gen2, a discrimination of scenarios such as pp versus pγ is extremely challenging under realistic assumptions. Nonetheless, the Glashow resonance can serve as a smoking-gun signature of neutrino production from photohadronic (Aγ) interactions of heavier nuclei, as the expected Glashow event rate exceeds that of pp interactions. We finally quantify the exposures for which the non-observation of Glashow events exerts pressure on certain scenarios.
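
    The 6.3 PeV figure follows directly from resonance kinematics: an antineutrino striking an electron at rest produces an on-shell W when s = 2 m_e E_ν = M_W². A one-line check:

```python
# Resonant neutrino energy for nu_e-bar + e- -> W- on an electron at rest.
M_W = 80.379e9   # W boson mass (eV)
m_e = 0.511e6    # electron mass (eV)

E_res = M_W**2 / (2.0 * m_e)  # resonant neutrino energy (eV)
print(f"E_res = {E_res / 1e15:.2f} PeV")  # ~6.3 PeV
```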

  1. The scientific challenges to forecasting the propagation of space weather through the heliosphere (Invited)

    NASA Astrophysics Data System (ADS)

    van der Holst, B.; Manchester, W.; Sokolov, I.; Toth, G.; Gombosi, T. I.

    2013-12-01

    Coronal mass ejections (CMEs) are a major source of potentially destructive space weather conditions. Understanding and forecasting these events are of utmost importance. In this presentation we discuss progress towards a physics-based predictive capability within the Space Weather Modeling Framework (SWMF). We demonstrate our latest developments in the AWSoM (Alfvén Wave Solar Model) global model of the solar corona and inner heliosphere. This model accounts for the coupled thermodynamics of the electrons and protons via single-fluid magnetohydrodynamics. The coronal heating and solar wind acceleration are addressed with Alfvén wave turbulence. The realistic 3D magnetic field is simulated using data from photospheric magnetic field measurements. The AWSoM model serves as a workhorse for modeling CMEs from initial eruption to prediction at 1 AU. With selected events we will demonstrate the complexity and challenges associated with CME propagation.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biehl, Daniel; Fedynitch, Anatoli; Winter, Walter

    We study the Glashow resonance ν̄_e + e⁻ → W⁻ → hadrons at 6.3 PeV as a diagnostic of the production processes of ultra-high-energy neutrinos. The focus lies on describing the physics of neutrino production from pion decay as accurately as possible by including the kinematics of weak decays and Monte Carlo simulations of pp and pγ interactions. We discuss optically thick (to photohadronic interactions) sources, sources of cosmic-ray "nuclei" and muon-damped sources. Even in the proposed upgrade IceCube-Gen2, a discrimination of scenarios such as pp versus pγ is extremely challenging under realistic assumptions. Nonetheless, the Glashow resonance can serve as a smoking-gun signature of neutrino production from photohadronic (Aγ) interactions of heavier nuclei, as the expected Glashow event rate exceeds that of pp interactions. We finally quantify the exposures for which the non-observation of Glashow events exerts pressure on certain scenarios.

  3. GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments.

    PubMed

    Monroy, Javier; Hernandez-Bennets, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier

    2017-06-23

    This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment.
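
    For context, the filament dispersion idea underlying such simulators can be sketched in a few lines: gas is represented as discrete filaments that are advected by the wind, jittered to mimic turbulence, and grown by diffusion, with the concentration at a sensor obtained by summing Gaussian filament contributions. All parameters below are illustrative assumptions, not GADEN's defaults or code:

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal filament-dispersion sketch: 100 filaments released at the origin.
wind = np.array([0.5, 0.0, 0.0])  # wind vector (m/s), assumed uniform
dt, steps = 0.1, 200
growth = 0.001                    # squared-radius growth rate (m^2/s), assumed

pos = np.zeros((100, 3))          # filament centres (m)
sigma2 = np.full(100, 0.01)       # squared filament radius (m^2)

for _ in range(steps):
    pos += wind * dt + rng.normal(0.0, 0.05, pos.shape)  # advection + turbulent jitter
    sigma2 += growth * dt                                # diffusive growth

# Relative concentration at a probe point: sum of Gaussian contributions.
probe = np.array([10.0, 0.0, 0.0])
r2 = np.sum((pos - probe) ** 2, axis=1)
conc = np.sum(np.exp(-r2 / (2.0 * sigma2)) / (2.0 * np.pi * sigma2) ** 1.5)
print(f"relative concentration at probe: {conc:.3e}")
```

    A full simulator like GADEN replaces the uniform wind with a CFD flow field, adds obstacle interactions, and feeds the resulting concentrations through sensor models.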

  4. GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments

    PubMed Central

    Hernandez-Bennetts, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier

    2017-01-01

    This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment. PMID:28644375

  5. Realistic modeling of neurons and networks: towards brain simulation.

    PubMed

    D'Angelo, Egidio; Solinas, Sergio; Garrido, Jesus; Casellato, Claudia; Pedrocchi, Alessandra; Mapelli, Jonathan; Gandolfi, Daniela; Prestori, Francesca

    2013-01-01

    Realistic modeling is a new advanced methodology for investigating brain functions. Realistic modeling is based on a detailed biophysical description of neurons and synapses, which can be integrated into microcircuits. The latter can, in turn, be further integrated to form large-scale brain networks and eventually to reconstruct complex brain systems. Here we provide a review of the realistic simulation strategy and use the cerebellar network as an example. This network has been carefully investigated at molecular and cellular level and has been the object of intense theoretical investigation. The cerebellum is thought to lie at the core of the forward controller operations of the brain and to implement timing and sensory prediction functions. The cerebellum is well described and provides a challenging field in which one of the most advanced realistic microcircuit models has been generated. We illustrate how these models can be elaborated and embedded into robotic control systems to gain insight into how the cellular properties of cerebellar neurons emerge in integrated behaviors. Realistic network modeling opens up new perspectives for the investigation of brain pathologies and for the neurorobotic field.

  6. Realistic modeling of neurons and networks: towards brain simulation

    PubMed Central

    D’Angelo, Egidio; Solinas, Sergio; Garrido, Jesus; Casellato, Claudia; Pedrocchi, Alessandra; Mapelli, Jonathan; Gandolfi, Daniela; Prestori, Francesca

    Summary Realistic modeling is a new advanced methodology for investigating brain functions. Realistic modeling is based on a detailed biophysical description of neurons and synapses, which can be integrated into microcircuits. The latter can, in turn, be further integrated to form large-scale brain networks and eventually to reconstruct complex brain systems. Here we provide a review of the realistic simulation strategy and use the cerebellar network as an example. This network has been carefully investigated at molecular and cellular level and has been the object of intense theoretical investigation. The cerebellum is thought to lie at the core of the forward controller operations of the brain and to implement timing and sensory prediction functions. The cerebellum is well described and provides a challenging field in which one of the most advanced realistic microcircuit models has been generated. We illustrate how these models can be elaborated and embedded into robotic control systems to gain insight into how the cellular properties of cerebellar neurons emerge in integrated behaviors. Realistic network modeling opens up new perspectives for the investigation of brain pathologies and for the neurorobotic field. PMID:24139652

  7. DEVELOPMENT OF USER-FRIENDLY SIMULATION SYSTEM OF EARTHQUAKE INDUCED URBAN SPREADING FIRE

    NASA Astrophysics Data System (ADS)

    Tsujihara, Osamu; Gawa, Hidemi; Hayashi, Hirofumi

    In the simulation of earthquake-induced urban spreading fire, construction of an analytical model of the target area is required, as well as analysis of the fire spread and presentation of the results. In order to promote use of the simulation, it is important that the simulation system be easy to operate and that the analysis results be demonstrated through realistic presentation. In this study, a simulation system is developed based on the Petri-net algorithm, in which easy operation is realized in modeling the target area and the analytical results are presented through realistic 3-D animation.

  8. Multi-year global climatic effects of atmospheric dust from large bolide impacts

    NASA Technical Reports Server (NTRS)

    Thompson, Starley L.

    1988-01-01

    The global climatic effects of dust generated by the impact of a 10-km-diameter bolide, previously simulated with the one-dimensional (vertical-only), globally averaged climate model of Pollack et al., are revisited here. The goal of the present simulation is to examine the regional climatic effects, including the possibility of coastal refugia, generated by a global dust cloud in a model having realistic geographic resolution. The climate model assumes the instantaneous appearance of a global stratospheric dust cloud with an initial optical depth of 10,000. The optical depth decreases according to the detailed calculations of Pollack et al., reaching unity at day 160 and falling thereafter with an e-folding time of 1 year. The simulation is carried out for three years in order to examine the atmospheric effects and recovery over several seasons. The simulation does not include any effects of NOx, CO2, or wildfire smoke injections that may accompany the creation of the dust cloud. The global distribution of surface temperature changes, freezing events, precipitation and soil moisture effects, and sea ice increases will be discussed.
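
    The decay law quoted above (optical depth of unity at day 160, then a one-year e-folding time) can be made concrete with a short sketch. This is an illustrative reconstruction of the exponential phase only, not the tabulated Pollack et al. calculations that govern the earlier decline from 10,000:

```python
import math

def optical_depth(t_days, tau_160=1.0, efold_days=365.0):
    """Optical depth of the stratospheric dust cloud after day 160, when it
    has thinned to tau = 1 and decays exponentially with a one-year
    e-folding time. The earlier decline from tau = 10,000 follows detailed
    calculations not modeled here."""
    if t_days < 160.0:
        raise ValueError("exponential phase begins at day 160")
    return tau_160 * math.exp(-(t_days - 160.0) / efold_days)
```

One e-folding time after day 160, the cloud's optical depth has fallen to 1/e, i.e. it remains climatically significant well into the multi-year simulation.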

  9. Composite Study Of Aerosol Long-Range Transport Events From East Asia And North America

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Waliser, D. E.; Guan, B.; Xavier, P.; Petch, J.; Klingaman, N. P.; Woolnough, S.

    2011-12-01

    While the Madden-Julian Oscillation (MJO) exerts pronounced influences on global climate and weather systems, current general circulation models (GCMs) exhibit rather limited capability in representing this prominent mode of tropical variability. Meanwhile, the fundamental physics of the MJO are still elusive. Given the central role of diabatic heating in prevailing MJO theories and the demand for reducing model deficiencies in simulating the MJO, a global model inter-comparison project on diabatic processes and vertical heating structure associated with the MJO has been coordinated through a joint effort by the WCRP-WWRP/THORPEX YOTC MJO Task Force and the GEWEX GASS Program. In this presentation, progress of this model inter-comparison project will be reported, with a main focus on climate simulations from about 27 atmosphere-only and coupled GCMs. Vertical structures of heating and diabatic processes associated with the MJO based on multi-model simulations will be presented along with their reanalysis and satellite-estimate counterparts. Key processes possibly responsible for a realistic simulation of the MJO, including moisture-convection interaction, gross moist stability, ocean coupling, and surface heat flux, will be discussed.

  10. Impact of the LHC beam abort kicker prefire on high luminosity insertion and CMS detector performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.I. Drozhdin, N.V. Mokhov and M. Huhtinen

    1999-04-13

    The effect of possible accidental beam loss in the LHC on the IP5 insertion elements and the CMS detector is studied via realistic Monte Carlo simulations. Such beam loss could be the consequence of an unsynchronized abort or, in the worst case, an accidental prefire of one of the abort kicker modules. Simulations with the STRUCT code show that these beam losses would take place in the IP5 inner and outer triplets. MARS simulations of the hadronic and electromagnetic cascades induced in such an event indicate severe heating of the inner triplet quadrupoles. In order to protect the IP5 elements, two methods are proposed: a set of shadow collimators in the outer triplet, and compensation of a prefired module using a special module charged with an opposite voltage (antikicker). The remnants of the accidental beam loss entering the experimental hall have been used as input for FLUKA simulations of the CMS detector. It is shown that it is vital to take measures to reliably protect the expensive CMS tracker components.

  11. 3D Hybrid Simulations of Interactions of High-Velocity Plasmoids with Obstacles

    NASA Astrophysics Data System (ADS)

    Omelchenko, Y. A.; Weber, T. E.; Smith, R. J.

    2015-11-01

    Interactions of fast plasma streams and objects with magnetic obstacles (dipoles, mirrors, etc.) lie at the core of many space and laboratory plasma phenomena, ranging from magnetoshells and solar wind interactions with planetary magnetospheres to compact fusion plasmas (spheromaks and FRCs) and astrophysics-in-lab experiments. Properly modeling ion kinetic, finite-Larmor-radius and Hall effects is essential for describing large-scale plasma dynamics, turbulence and heating in complex magnetic field geometries. Using an asynchronous parallel hybrid code, HYPERS, we conduct 3D hybrid (particle-in-cell ion, fluid electron) simulations of such interactions under realistic conditions that include magnetic flux coils, ion-ion collisions and the Chodura resistivity. HYPERS does not step simulation variables synchronously in time but instead performs time integration by executing asynchronous discrete events: updates of particles and fields carried out as frequently as dictated by local physical time scales. Simulations are compared with data from the MSX experiment, which studies the physics of magnetized collisionless shocks through the acceleration and subsequent stagnation of FRC plasmoids against a strong magnetic mirror and flux-conserving boundary.
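
    The asynchronous discrete-event stepping described above can be sketched as a priority queue of per-cell update times, where each pop advances only the cell whose local clock is earliest. This toy loop is an illustration of the scheduling idea, not the HYPERS scheduler; the cell names and fixed per-cell time steps are assumptions:

```python
import heapq

def async_integrate(dts, t_end):
    """Event-driven time integration sketch: each cell has its own local
    time step; a min-heap keyed on next-update time decides which cell to
    advance next. Returns how many updates each cell received by t_end."""
    queue = [(dt, name) for name, dt in dts.items()]
    heapq.heapify(queue)
    counts = {name: 0 for name in dts}
    while queue and queue[0][0] <= t_end:
        t, name = heapq.heappop(queue)
        counts[name] += 1                      # advance only this cell
        heapq.heappush(queue, (t + dts[name], name))
    return counts
```

A cell with a short local time scale is updated many times while a slow cell is touched rarely, which is exactly the efficiency argument for asynchronous stepping.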

  12. Evaluation of simulation training in cardiothoracic surgery: the Senior Tour perspective.

    PubMed

    Fann, James I; Feins, Richard H; Hicks, George L; Nesbitt, Jonathan C; Hammon, John W; Crawford, Fred A

    2012-02-01

    The study objective was to introduce senior surgeons, referred to as members of the "Senior Tour," to simulation-based learning and evaluate ongoing simulation efforts in cardiothoracic surgery. Thirteen senior cardiothoracic surgeons participated in a 2½-day Senior Tour Meeting. Of 12 simulators, each participant focused on 6 cardiac (small vessel anastomosis, aortic cannulation, cardiopulmonary bypass, aortic valve replacement, mitral valve repair, and aortic root replacement) or 6 thoracic surgical simulators (hilar dissection, esophageal anastomosis, rigid bronchoscopy, video-assisted thoracoscopic surgery lobectomy, tracheal resection, and sleeve resection). The participants provided critical feedback regarding the realism and utility of the simulators, which served as the basis for a composite assessment of the simulators. All participants acknowledged that simulation may not provide a wholly immersive experience. For small vessel anastomosis, the portable chest model is less realistic compared with the porcine model, but is valuable in teaching anastomosis mechanics. The aortic cannulation model allows multiple cannulations and can serve as a thoracic aortic surgery model. The cardiopulmonary bypass simulator provides crisis management experience. The porcine aortic valve replacement, mitral valve annuloplasty, and aortic root models are realistic and permit standardized training. The hilar dissection model is subject to variability of porcine anatomy and fragility of the vascular structures. The realistic esophageal anastomosis simulator presents various approaches to esophageal anastomosis. The exercise associated with the rigid bronchoscopy model is brief, and adding additional procedures should be considered. The tracheal resection, sleeve resection, and video-assisted thoracoscopic surgery lobectomy models are highly realistic and simulate advanced maneuvers. 
By providing the necessary tools, such as task trainers and assessment instruments, the Senior Tour may be one means to enhance simulation-based learning in cardiothoracic surgery. The Senior Tour members can provide regular programmatic evaluation and critical analyses to ensure that proposed simulators are of educational value. Published by Mosby, Inc.

  13. Model simulations of dense bottom currents in the Western Baltic Sea

    NASA Astrophysics Data System (ADS)

    Burchard, Hans; Janssen, Frank; Bolding, Karsten; Umlauf, Lars; Rennau, Hannes

    2009-01-01

    Only recently have medium-intensity inflow events into the Baltic Sea gained more attention, because of their potential to ventilate intermediate layers in the Southern Baltic Sea basins. With the present high-resolution model study of the Western Baltic Sea, a first attempt is made to obtain model-based realistic estimates of turbulent mixing in this area, where dense bottom currents resulting from medium-intensity inflow events are weakened by turbulent entrainment. The numerical model simulation, carried out using the General Estuarine Transport Model (GETM) over nine months in 2003 and 2004, is first validated by means of three automatic stations at the Drogden and Darss Sills and in the Arkona Sea. In order to obtain good agreement between observations and model results, the 0.5×0.5 nautical mile bathymetry had to be adjusted to account for the fact that, even at that scale, many relevant topographic features are not resolved. Current velocity, salinity and turbulence observations during a medium-intensity inflow event through the Øresund are then compared to the model results. Given the general problems of point-to-point comparisons between observations and model simulations, the agreement is fairly good, with the characteristic features of the inflow event well represented by the model simulations. Two different bulk measures for mixing activity are then introduced: the vertically integrated decay of salinity variance, which is equal to the production of micro-scale salinity variance, and the vertically integrated turbulent salt flux, which is related to an increase of potential energy due to vertical mixing of stably stratified flow. Both measures give qualitatively similar results and identify the Drogden and Darss Sills as well as the Bornholm Channel as mixing hot spots. 
Further regions of strong mixing are the dense bottom current pathways from these sills into the Arkona Sea, areas around Kriegers Flak (a shoal in the western Arkona Sea) and north-west of the island of Rügen.
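
    The second bulk measure above, the vertically integrated turbulent salt flux, has the form of a depth integral of eddy diffusivity times the vertical salinity gradient. The following discretization (trapezoidal rule) is an illustrative sketch of that diagnostic, not the GETM implementation; the variable names are assumptions:

```python
import numpy as np

def integrated_salt_flux(z, S, Kv):
    """Vertically integrated turbulent salt flux: the integral over depth
    of eddy diffusivity Kv times the vertical salinity gradient dS/dz,
    evaluated with a trapezoidal rule on the discrete profile."""
    flux = np.asarray(Kv, dtype=float) * np.gradient(np.asarray(S, dtype=float), z)
    dz = np.diff(z)
    return float(np.sum(0.5 * (flux[1:] + flux[:-1]) * dz))
```

For a linearly stratified column the integral reduces to Kv times the salinity gradient times the water depth, which provides a quick sanity check.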

  14. A Low-Cost Simulation Model for R-Wave Synchronized Atrial Pacing in Pediatric Patients with Postoperative Junctional Ectopic Tachycardia

    PubMed Central

    Michel, Miriam; Egender, Friedemann; Heßling, Vera; Dähnert, Ingo; Gebauer, Roman

    2016-01-01

    Background Postoperative junctional ectopic tachycardia (JET) occurs frequently after pediatric cardiac surgery. R-wave synchronized atrial (AVT) pacing is used to re-establish atrioventricular synchrony. AVT pacing is complex, with technical pitfalls. We sought to establish and to test a low-cost simulation model suitable for training and analysis in AVT pacing. Methods A simulation model was developed based on a JET simulator, a simulation doll, a cardiac monitor, and a pacemaker. A computer program simulated electrocardiograms. Ten experienced pediatric cardiologists tested the model. Their performance was analyzed using a testing protocol with 10 working steps. Results Four testers found the simulation model realistic; 6 found it very realistic. Nine claimed that the trial had improved their skills. All testers considered the model useful in teaching AVT pacing. The simulation test identified 5 working steps in which major mistakes in performance may impede safe and effective AVT pacing, thus permitting specific training. The components of the model (excluding the monitor and pacemaker) cost less than $50. Assembly and training-session expenses were trivial. Conclusions A realistic, low-cost simulation model of AVT pacing is described. The model is suitable for teaching and analyzing AVT pacing technique. PMID:26943363

  15. GPU-based efficient realistic techniques for bleeding and smoke generation in surgical simulators.

    PubMed

    Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu

    2010-12-01

    In actual surgery, smoke and bleeding due to cauterization processes provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated the effects of bleeding and smoke generation, they are not realistic due to the requirement of real-time performance. To be interactive, visual updates must be performed at a rate of at least 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is therefore either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available Central Processing Unit (CPU) resources. In this study we developed a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators, which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. The smoke and bleeding simulations were implemented as part of a laparoscopic adjustable gastric banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an input/output (I/O) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale). 
Based on the performance results and subject study, both bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators. Copyright © 2010 John Wiley & Sons, Ltd.
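
    The interactivity requirements quoted above (visual updates at 30 Hz or better, haptics at 1 kHz) translate directly into hard per-update time budgets; a trivial helper, included only to make those numbers concrete:

```python
def update_budget_ms(rate_hz):
    """Time available per update at a given refresh rate, in milliseconds.
    At 30 Hz the renderer has about 33 ms per frame; at 1 kHz the haptic
    loop has only 1 ms, which is why smoke/bleeding work is offloaded."""
    return 1000.0 / rate_hz
```

The three-orders-of-magnitude gap between the two budgets is the reason the CPU cannot afford additional fluid computations, motivating the GPU offload described in the abstract.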

  16. GPU-based Efficient Realistic Techniques for Bleeding and Smoke Generation in Surgical Simulators

    PubMed Central

    Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu

    2010-01-01

    Background In actual surgery, smoke and bleeding due to cautery processes provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated effects of bleeding and smoke generation, they are not realistic due to the requirement of real-time performance. To be interactive, visual updates must be performed at a rate of at least 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is therefore either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available CPU resources. Methods In this work, we develop a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. Results The smoke and bleeding simulations were implemented as part of a Laparoscopic Adjustable Gastric Banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an input/output (I/O) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale). 
Conclusions Based on the performance results and subject study, both bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators. PMID:20878651

  17. Mesoscale Convective Systems in SCSMEX: Simulated by a Regional Climate Model and a Cloud Resolving Model

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Wang, Y.; Qian, I.; Lau, W.; Shie, C.-L.; Starr, David (Technical Monitor)

    2002-01-01

    A Regional Land-Atmosphere Climate Simulation (RELACS) System is being developed and implemented at NASA Goddard Space Flight Center. One of the major goals of RELACS is to use a regional-scale model with improved physical processes, in particular land-related processes, to understand the role of the land surface and its interaction with convection and radiation, as well as the water and energy cycles, in Indo-China/South China Sea (SCS)/China, N. America and S. America. The Penn State/NCAR MM5 atmospheric modeling system, a state-of-the-art atmospheric numerical model designed to simulate regional weather and climate, has been successfully coupled to the Goddard Parameterization for Land-Atmosphere-Cloud Exchange (PLACE) land surface model. PLACE allows for the effects of vegetation, and thus important physical processes such as evapotranspiration and interception are included. The PLACE model incorporates vegetation type and has been shown in international comparisons to accurately predict evapotranspiration and runoff over a wide variety of land surfaces. The coupling of MM5 and PLACE creates a numerical modeling system with the potential to more realistically simulate the atmosphere and land surface processes, including land-sea interaction, regional circulations such as monsoons, and flash flood events. RELACS has been used to simulate the onset of the South China Sea Monsoon in 1986, 1997 and 1998. Sensitivity tests on various land surface models, cumulus parameterization schemes (CPSs), sea surface temperature (SST) variations and midlatitude influences have been performed. These tests have indicated that the land surface model has a major impact on the circulation over the South China Sea. CPSs can affect the precipitation pattern, while SST variations can affect the precipitation amounts over both land and ocean. 
RELACS has also been used to understand the soil-precipitation interaction and feedback associated with a flood event that occurred in and around China's Yangtze River during 1998. The exact location (region) of the flooding can be affected by the soil-rainfall feedback. Also, the Goddard Cumulus Ensemble (GCE) model, which allows for realistic moist processes as well as explicit interactions between cloud and radiation and between cloud and surface processes, will be used to simulate convective systems associated with the onset of the South China Sea Monsoon in 1998. The GCE model also includes the same PLACE and radiation schemes used in RELACS. A detailed comparison between the results from the GCE model and RELACS will be performed.

  18. Mesoscale Convective Systems in SCSMEX: Simulated by a Regional Climate Model and a Cloud Resolving Model

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Wang, Y.; Lau, W.; Jia, Y.; Johnson, D.; Shie, C.-L.; Einaudi, Franco (Technical Monitor)

    2001-01-01

    A Regional Land-Atmosphere Climate Simulation (RELACS) System is being developed and implemented at NASA Goddard Space Flight Center. One of the major goals of RELACS is to use a regional-scale model with improved physical processes, in particular land-related processes, to understand the role of the land surface and its interaction with convection and radiation, as well as the water and energy cycles, in Indo-China/South China Sea (SCS)/China, North America and South America. The Penn State/NCAR MM5 atmospheric modeling system, a state-of-the-art atmospheric numerical model designed to simulate regional weather and climate, has been successfully coupled to the Goddard Parameterization for Land-Atmosphere-Cloud Exchange (PLACE) land surface model. PLACE allows for the effects of vegetation, and thus important physical processes such as evapotranspiration and interception are included. The PLACE model incorporates vegetation type and has been shown in international comparisons to accurately predict evapotranspiration and runoff over a wide variety of land surfaces. The coupling of MM5 and PLACE creates a numerical modeling system with the potential to more realistically simulate the atmosphere and land surface processes, including land-sea interaction, regional circulations such as monsoons, and flash flood events. RELACS has been used to simulate the onset of the South China Sea Monsoon in 1986, 1991 and 1998. Sensitivity tests on various land surface models, cumulus parameterization schemes (CPSs), sea surface temperature (SST) variations and midlatitude influences have been performed. These tests have indicated that the land surface model has a major impact on the circulation over the South China Sea. CPSs can affect the precipitation pattern, while SST variations can affect the precipitation amounts over both land and ocean. 
RELACS has also been used to understand the soil-precipitation interaction and feedback associated with a flood event that occurred in and around China's Yangtze River during 1998. The exact location (region) of the flooding can be affected by the soil-rainfall feedback. Also, the Goddard Cumulus Ensemble (GCE) model, which allows for realistic moist processes as well as explicit interactions between cloud and radiation and between cloud and surface processes, will be used to simulate convective systems associated with the onset of the South China Sea Monsoon in 1998. The GCE model also includes the same PLACE and radiation schemes used in RELACS. A detailed comparison between the results from the GCE model and RELACS will be performed.

  19. A model-based Bayesian solution for characterization of complex damage scenarios in aerospace composite structures.

    PubMed

    Reed, H; Leckey, Cara A C; Dick, A; Harvey, G; Dobson, J

    2018-01-01

    Ultrasonic damage detection and characterization is commonly used in nondestructive evaluation (NDE) of aerospace composite components. In recent years there has been increased development of guided-wave-based methods. In real materials and structures, these dispersive waves exhibit complicated behavior in the presence of complex damage scenarios. Model-based characterization methods utilize accurate three-dimensional finite element models (FEMs) of guided wave interaction with realistic damage scenarios to aid in defect identification and classification. This work describes an inverse solution for realistic composite damage characterization by comparing the wavenumber-frequency spectra of experimental and simulated ultrasonic inspections. The composite laminate material properties are first verified through a Bayesian solution (Markov chain Monte Carlo), enabling uncertainty quantification surrounding the characterization. The FEM is then parameterized with a damage model capable of describing the typical complex damage created by impact events in composites, and a study is undertaken to assess the efficacy of the proposed damage model and comparative metrics between the experimental and simulated output. The damage is characterized through a transdimensional Markov chain Monte Carlo solution, enabling a flexible damage model capable of adapting to the complex damage geometry investigated here. The posterior probability distributions of the individual delamination petals as well as the overall envelope of the damage site are determined. Copyright © 2017 Elsevier B.V. All rights reserved.
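
    A minimal random-walk Metropolis sampler conveys the flavor of the Bayesian (Markov chain Monte Carlo) estimation step described above. This sketch estimates a single scalar parameter from a log-posterior and is not the paper's transdimensional sampler; the step size and seed are arbitrary choices:

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis sampler: propose a Gaussian perturbation of
    the current state and accept it with probability min(1, posterior
    ratio). Returns the chain of sampled states."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:   # accept/reject
            x, lp = cand, lp_cand
        chain.append(x)
    return chain
```

Run against a known unimodal log-posterior, the post-burn-in chain mean approximates the posterior mean, which is the kind of uncertainty-quantified estimate the abstract refers to.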

  20. Evolution of Our Understanding of the Solar Dynamo During Solar Cycle 24

    NASA Astrophysics Data System (ADS)

    Munoz-Jaramillo, A.

    2017-12-01

    Solar cycle 24 has been an exciting cycle for our understanding of the solar dynamo: 1. It was the first cycle for which dynamo-based predictions were ever used, teaching us valuable lessons. 2. It has given us the opportunity to observe a deep minimum and a weak cycle with a high level of observational detail. 3. It is full of breakthroughs in anelastic MHD dynamo simulations (regular cycles, buoyant flux tubes, Maunder-like events). 4. It has seen the creation of bridges between the kinematic flux-transport and anelastic MHD approaches. 5. It has ushered in a new generation of realistic surface flux-transport simulations. 6. We have achieved significant observational progress in our understanding of solar cycle propagation. The objective of this talk is to highlight some of the most important results, with special emphasis on what they have taught us about solar cycle predictability.

  1. Representation of Stormflow and a More Responsive Water Table in a TOPMODEL-Based Hydrology Model

    NASA Technical Reports Server (NTRS)

    Shaman, Jeffrey; Stieglitz, Marc; Engel, Victor; Koster, Randal; Stark, Colin; Houser, Paul R. (Technical Monitor)

    2001-01-01

    This study presents two new modeling strategies. First, a methodology for representing the physical process of stormflow within a TOPMODEL framework is developed. In using this approach, discharge at quickflow time scales is simulated and a fuller depiction of hydrologic activity is brought about. Discharge of water from the vadose zone is permitted in a physically realistic manner without a priori assumption of the level within the soil column at which stormflow saturation can take place. Determination of the stormflow contribution to discharge is made using the equation for groundwater flow. No new parameters are needed. Instead, regions of near saturation that develop during storm events, producing vertical recharge, are allowed to contribute to soil column discharge. These stormflow contributions to river runoff, as for groundwater flow contributions, are a function of catchment topography and local hydraulic conductivity at the depth of these regions of near saturation. The second approach improves groundwater flow response through a reduction of porosity and field capacity with depth in the soil column. Large storm events are better captured and a more dynamic water table develops with application of this modified soil column profile (MSCP). The MSCP predominantly reflects soil depth differences in upland and lowland regions of a watershed. Combined, these two approaches - stormflow and the MSCP - provide a more accurate representation of the time scales at which soil column discharge responds and a more complete depiction of hydrologic activity. Storm events large and small are better simulated, and some of the biases previously evident in TOPMODEL simulations are reduced.

  2. Constraints on Cumulus Parameterization from Simulations of Observed MJO Events

    NASA Technical Reports Server (NTRS)

    Del Genio, Anthony; Wu, Jingbo; Wolf, Audrey B.; Chen, Yonghua; Yao, Mao-Sung; Kim, Daehyun

    2015-01-01

    Two recent activities offer an opportunity to test general circulation model (GCM) convection and its interaction with large-scale dynamics for observed Madden-Julian oscillation (MJO) events. This study evaluates the sensitivity of the Goddard Institute for Space Studies (GISS) GCM to entrainment, rain evaporation, downdrafts, and cold pools. Single Column Model versions that restrict weakly entraining convection produce the most realistic dependence of convection depth on column water vapor (CWV) during the Atmospheric Radiation Measurement MJO Investigation Experiment at Gan Island. Differences among models are primarily at intermediate CWV where the transition from shallow to deeper convection occurs. GCM 20-day hindcasts during the Year of Tropical Convection that best capture the shallow–deep transition also produce strong MJOs, with significant predictability compared to Tropical Rainfall Measuring Mission data. The dry anomaly east of the disturbance on hindcast day 1 is a good predictor of MJO onset and evolution. Initial CWV there is near the shallow–deep transition point, implicating premature onset of deep convection as a predictor of a poor MJO simulation. Convection weakly moistens the dry region in good MJO simulations in the first week; weakening of large-scale subsidence over this time may also affect MJO onset. Longwave radiation anomalies are weakest in the worst model version, consistent with previous analyses of cloud/moisture greenhouse enhancement as the primary MJO energy source. The authors’ results suggest that both cloud-/moisture-radiative interactions and convection–moisture sensitivity are required to produce a successful MJO simulation.

  3. Spline Laplacian estimate of EEG potentials over a realistic magnetic resonance-constructed scalp surface model.

    PubMed

    Babiloni, F; Babiloni, C; Carducci, F; Fattorini, L; Onorati, P; Urbano, A

    1996-04-01

    This paper presents a realistic Laplacian (RL) estimator based on a tensorial formulation of the surface Laplacian (SL) that uses the 2-D thin plate spline function to obtain a mathematical description of a realistic scalp surface. Because of this tensorial formulation, the RL does not need an orthogonal reference frame placed on the realistic scalp surface. In simulation experiments the RL was estimated with an increasing number of "electrodes" (up to 256) on a mathematical scalp model, the analytic Laplacian being used as a reference. Second and third order spherical spline Laplacian estimates were examined for comparison. Noise of increasing magnitude and spatial frequency was added to the simulated potential distributions. Movement-related potentials and somatosensory evoked potentials sampled with 128 electrodes were used to estimate the RL on a realistically shaped, MR-constructed model of the subject's scalp surface. The RL was also estimated on a mathematical spherical scalp model computed from the real scalp surface. Simulation experiments showed that the performances of the RL estimator were similar to those of the second and third order spherical spline Laplacians. Furthermore, the information content of scalp-recorded potentials was clearly better when the RL estimator computed the SL of the potential on an MR-constructed scalp surface model.
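
    The 2-D thin-plate spline mentioned above is built from the radial basis function U(r) = r² ln r. A minimal sketch of that kernel, with the conventional U(0) = 0 limit (illustrative only; the paper's tensorial Laplacian estimator involves considerably more machinery):

```python
import math

def tps_kernel(r):
    """Radial basis of the 2-D thin-plate spline, U(r) = r^2 * log(r).
    U(0) = 0 by continuity (r^2 log r -> 0 as r -> 0)."""
    return 0.0 if r == 0.0 else r * r * math.log(r)
```

A thin-plate spline surface is then a weighted sum of these kernels centered on the electrode positions plus an affine term, which is what allows the scalp potential to be differentiated analytically for the Laplacian estimate.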

  4. Realistic and efficient 2D crack simulation

    NASA Astrophysics Data System (ADS)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

    Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive, and runtime-efficient 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level of detail. The generated binary decomposition tree also provides an efficient neighbor-retrieval mechanism used for mesh element splitting and merging with minimal memory requirements, essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors, including impact angle, impact energy, and material properties, are taken into account to produce criteria for crack initialization, propagation, and termination, leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn optimal values for these variables/parameters from prior benchmark data generated by off-line physics-based simulation solutions that produce accurate fractures/shards, though at a highly non-real-time pace. Crack/fracture simulation has been conducted on various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shard generation) with diverse potential in military and civil M&S applications such as training and mission planning.
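
    The binary decomposition underlying Peano-Cesaro-style triangulation can be illustrated with longest-edge bisection of a right triangle: each split introduces the midpoint of the hypotenuse, and the recursion tree is exactly a binary decomposition tree. This toy routine (the vertex ordering convention is an assumption) is a sketch of the idea, not the paper's mesher:

```python
def bisect(tri, depth):
    """Recursively bisect triangle (a, b, c), where edge a-b is taken to be
    the hypotenuse and c the right-angle vertex. Each split adds the
    midpoint m of a-b; the two children keep the same hypotenuse-first
    ordering, with their new right angle at m. Returns the leaf triangles."""
    a, b, c = tri
    if depth == 0:
        return [tri]
    m = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)  # hypotenuse midpoint
    return bisect((a, c, m), depth - 1) + bisect((b, c, m), depth - 1)
```

Each level doubles the element count, so depth n yields 2^n congruent-up-to-similarity leaf triangles, giving the local multi-resolution refinement the abstract describes.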

  5. The Response of Extreme Precipitation to Climate Change in the North American Monsoon Region

    NASA Astrophysics Data System (ADS)

    Pascale, S.; Bordoni, S.; Kapnick, S. B.; Delworth, T. L.; Murakami, H.

    2017-12-01

    Gulf of California moisture surges (GoC surges) transport lower-level moisture into the southwestern United States and can trigger widespread convective bursts during the summertime North American monsoon (NAM). The intensity of such bursts varies over a wide spectrum, from drier-than-average to extremely intense and persistent events. In this study we use a 50 km-horizontal-resolution global coupled model (FLOR), developed at the NOAA Geophysical Fluid Dynamics Laboratory and featuring a realistic simulation of GoC surges. We evaluate the model's ability to reproduce the intensity of precipitation during GoC surge and non-surge periods under present and doubled-CO2 climatic conditions. We find that the mean number of GoC surge events per monsoon season (approximately 15) is not significantly affected by CO2 forcing. Nevertheless, when SST biases are minimized through flux adjustment, FLOR predicts a reduction in monsoonal precipitation over the southwestern United States. Our simulations further suggest that surge-related rainfall shifts towards lower and higher percentiles, while becoming less important at intermediate values. Convective precipitation not occurring during GoC surges is, in contrast, not coherently affected by doubled CO2. Finally, the influence of CO2 forcing on the large-scale drivers of monsoonal precipitation during GoC surge events, such as the position of the monsoonal ridge, is investigated and related to precipitation changes.

  6. Contribution of land use changes to meteorological parameters in Greater Jakarta: Case 17 January 2014

    NASA Astrophysics Data System (ADS)

    Nuryanto, D. E.; Pawitan, H.; Hidayat, R.; Aldrian, E.

    2018-05-01

    The impact of land use changes on meteorological parameters during a heavy rainfall event on 17 January 2014 in Greater Jakarta (GJ) was examined using the Weather Research and Forecasting (WRF) model. Two simulation experiments were performed: the first WRF simulation uses the default land use (CTL), while the second changes the extent of urban and built-up land use (SCE). Global Forecast System (GFS) data are used to provide realistic initial and boundary conditions for the nested model domains (3 km, 1 km). The simulations were initialized at 00:00 UTC on 13 January 2014 and run for six days. The simulated air temperature and precipitation patterns in GJ show good agreement with observations. The results show a consistent, significant contribution of urban development and the accompanying land use changes to air temperature and precipitation. According to the model simulation, urban and built-up land contributed about 6% of the heavy rainfall and about 0.2 degrees of the morning air temperature. The simulations indicate that new urban developments led to an intensification and expansion of the rain area. These results can support decision-making in flood and watershed management.

  7. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model.

    PubMed

    Liu, Fang; Velikina, Julia V; Block, Walter F; Kijowski, Richard; Samsonov, Alexey A

    2017-02-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphics processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches, including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing a generalized tissue model with multiple exchanging water and macromolecular proton pools, rather than the system of independent proton isochromats typically used in previous simulators. The computational power needed to simulate biologically relevant tissue models in large 3D objects is gained using parallelized execution on the GPU. Three simulated MRI experiments and one actual MRI experiment were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and to demonstrate the detrimental effects of the simplified treatment of tissue micro-organization adopted in previous simulators. GPU execution allowed a ∼200× improvement in computational speed over a standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to quantitatively infer tissue composition and microstructure.
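
    The "multiple exchanging proton pools" description can be illustrated with a two-pool Bloch-McConnell saturation-recovery sketch; this is not MRiLab code, and all tissue parameters (relaxation times, pool fractions, exchange rates) are illustrative:

    ```python
    # Longitudinal magnetization of two exchanging pools a (water) and b
    # (macromolecular), integrated with explicit Euler after full saturation.
    # Detailed balance (kab*M0a = kba*M0b) makes (M0a, M0b) the fixed point.

    T1a, T1b = 1.0, 0.3          # longitudinal relaxation times (s)
    M0a, M0b = 0.9, 0.1          # equilibrium pool fractions
    kab = 2.0                    # exchange rate a -> b (1/s)
    kba = kab * M0a / M0b        # detailed balance fixes the reverse rate

    def recover(t_end, dt=1e-4):
        """Integrate dM/dt after full saturation (Ma = Mb = 0 at t = 0)."""
        Ma = Mb = 0.0
        for _ in range(int(t_end / dt)):
            dMa = (M0a - Ma) / T1a - kab * Ma + kba * Mb
            dMb = (M0b - Mb) / T1b + kab * Ma - kba * Mb
            Ma += dt * dMa
            Mb += dt * dMb
        return Ma, Mb

    Ma, Mb = recover(10.0)
    # After ~10 T1 both pools have relaxed back to equilibrium.
    assert abs(Ma - M0a) < 1e-3 and abs(Mb - M0b) < 1e-3
    ```

    A simulator built on isochromats would drop the kab/kba coupling terms entirely, which is exactly the simplification whose detrimental effects the abstract mentions.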

  8. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model

    PubMed Central

    Velikina, Julia V.; Block, Walter F.; Kijowski, Richard; Samsonov, Alexey A.

    2017-01-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphics processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches, including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing a generalized tissue model with multiple exchanging water and macromolecular proton pools, rather than the system of independent proton isochromats typically used in previous simulators. The computational power needed to simulate biologically relevant tissue models in large 3D objects is gained using parallelized execution on the GPU. Three simulated MRI experiments and one actual MRI experiment were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and to demonstrate the detrimental effects of the simplified treatment of tissue micro-organization adopted in previous simulators. GPU execution allowed a ∼200× improvement in computational speed over a standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to quantitatively infer tissue composition and microstructure. PMID:28113746

  9. SPH simulations of WBC adhesion to the endothelium: the role of haemodynamics and endothelial binding kinetics.

    PubMed

    Gholami, Babak; Comerford, Andrew; Ellero, Marco

    2015-11-01

    A multiscale Lagrangian particle solver introduced in our previous work is extended to model physiologically realistic near-wall cell dynamics. Three-dimensional simulation of particle trajectories is combined with realistic receptor-ligand adhesion behaviour to cover full cell interactions in the vicinity of the endothelium. The selected stochastic adhesion model, which is based on a Monte Carlo acceptance-rejection method, fits into our Lagrangian framework and does not compromise performance. Additionally, appropriate inflow/outflow boundary conditions are implemented for our SPH solver to enable realistic pulsatile flow simulation. The model is tested against in vitro data from a 3D geometry with a stenosis and sudden expansion. In both steady and pulsatile flow conditions, the results show close agreement with the experimental ones. Furthermore, we demonstrate, in agreement with experimental observations, that haemodynamics alone does not account for the adhesion of white blood cells, in this case U937 monocytic human cells. Our findings suggest that the current framework is fully capable of modelling cell dynamics in large arteries in a realistic and efficient manner.
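
    The stochastic adhesion step described above can be sketched as a Monte Carlo acceptance-rejection draw per time step: a candidate receptor-ligand bond forms with probability p_on = 1 - exp(-k_on·dt) and is accepted or rejected against a uniform random number. This is a generic sketch, not the paper's model, and the rate constants are illustrative:

    ```python
    import math
    import random

    def bond_forms(k_on, dt, rng):
        """Accept a bond-formation event with probability 1 - exp(-k_on*dt)."""
        p_on = 1.0 - math.exp(-k_on * dt)
        return rng.random() < p_on

    rng = random.Random(42)          # seeded for reproducibility
    k_on, dt, trials = 50.0, 1e-3, 200_000

    # Empirical acceptance frequency should match the analytic probability.
    formed = sum(bond_forms(k_on, dt, rng) for _ in range(trials))
    p_emp = formed / trials
    p_exact = 1.0 - math.exp(-k_on * dt)
    assert abs(p_emp - p_exact) < 5e-3
    ```

    Because each draw is an independent cheap comparison, this step adds essentially no cost per particle per time step, which is consistent with the abstract's claim that the adhesion model does not compromise performance.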

  10. Computing return times or return periods with rare event algorithms

    NASA Astrophysics Data System (ADS)

    Lestang, Thibault; Ragone, Francesco; Bréhier, Charles-Edouard; Herbert, Corentin; Bouchet, Freddy

    2018-04-01

    The average time between two occurrences of the same event, referred to as its return time (or return period), is a useful statistical concept for practical applications. For instance, insurers or public agencies may be interested in the return time of a 10 m flood of the Seine river in Paris. However, due to their scarcity, reliably estimating return times for rare events is very difficult using either observational data or direct numerical simulations. For rare events, an estimator for return times can be built from the extrema of the observable on trajectory blocks. Here, we show that this estimator can be improved to remain accurate for return times of the order of the block size. More importantly, we show that this approach can be generalised to estimate return times from numerical algorithms specifically designed to sample rare events; so far, those algorithms have typically computed probabilities rather than return times. The approach we propose provides an extremely efficient way to estimate numerically the return times of rare events for a dynamical system, reducing computational cost by several orders of magnitude. We illustrate the method on two kinds of observables, instantaneous and time-averaged, using two different rare event algorithms, for a simple stochastic process, the Ornstein–Uhlenbeck process. As an example of realistic applications to complex systems, we finally discuss extreme values of the drag on an object in a turbulent flow.
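
    The block-extrema estimator mentioned above can be sketched on an Ornstein-Uhlenbeck process: cut one long trajectory into blocks of length ΔT, let q(a) be the fraction of blocks whose maximum exceeds the level a, and estimate the return time as r(a) = -ΔT / ln(1 - q(a)). The process parameters, block length, and thresholds below are illustrative:

    ```python
    import math
    import random

    def ou_trajectory(t_total, dt, rng):
        """Euler-Maruyama for dX = -X dt + sqrt(2) dW (stationary variance 1)."""
        x, xs = 0.0, []
        for _ in range(int(t_total / dt)):
            x += -x * dt + math.sqrt(2.0 * dt) * rng.gauss(0.0, 1.0)
            xs.append(x)
        return xs

    def return_time(xs, dt, block_t, a):
        """Block-maximum estimator r(a) = -block_t / ln(1 - q(a))."""
        n = int(block_t / dt)                        # samples per block
        blocks = [xs[i:i + n] for i in range(0, len(xs) - n + 1, n)]
        q = sum(max(b) > a for b in blocks) / len(blocks)
        if q == 0.0:
            return float("inf")                      # level never reached
        return -block_t / math.log(1.0 - q)

    xs = ou_trajectory(t_total=2000.0, dt=0.01, rng=random.Random(0))
    r1 = return_time(xs, 0.01, block_t=5.0, a=1.0)
    r2 = return_time(xs, 0.01, block_t=5.0, a=2.0)
    # Rarer events have longer return times.
    assert 0.0 < r1 < r2
    ```

    The logarithm is what keeps the estimator accurate when r(a) is comparable to the block length; for very rare events (q small) it reduces to the familiar r(a) ≈ ΔT/q.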

  11. Combining Gravitational Wave Events with their Electromagnetic Counterparts: A Realistic Joint False-Alarm Rate

    NASA Astrophysics Data System (ADS)

    Ackley, Kendall; Eikenberry, Stephen; Klimenko, Sergey; LIGO Team

    2017-01-01

    We present a false-alarm rate for a joint detection of gravitational wave (GW) events and associated electromagnetic (EM) counterparts for Advanced LIGO and Virgo (LV) observations during the first years of operation. Using simulated GW events and their reconstructed probability skymaps, we tile over the error regions using sets of archival wide-field telescope survey images and recover the number of astrophysical transients to be expected during LV-EM follow-up. Using the known GW event injection coordinates, we inject artificial EM sources at those sites on a one-to-one basis, based on theoretical and observational models. We calculate the EM false-alarm probability using an unsupervised machine learning algorithm based on shapelet analysis, which has been shown to be a strong discriminator between astrophysical transients and image artifacts while reducing the set of transients to be manually vetted by five orders of magnitude. We also place the performance of our method in context with other machine-learned transient classification and reduction algorithms, showing comparable performance without the need for a large training set and opening the possibility for next-generation telescopes to take advantage of this pipeline for LV-EM follow-up missions.

  12. The Importance of Electron Source Population to the Remarkable Enhancement of Radiation belt Electrons during the October 2012 Storm

    NASA Astrophysics Data System (ADS)

    Tu, W.; Cunningham, G.; Reeves, G. D.; Chen, Y.; Henderson, M. G.; Blake, J. B.; Baker, D. N.; Spence, H.

    2013-12-01

    During the 8-9 October 2012 storm, the MeV electron fluxes in the heart of the outer radiation belt were first wiped out and then exhibited a three-orders-of-magnitude increase on a timescale of hours, as observed by the MagEIS and REPT instruments aboard the Van Allen Probes. There is strong observational evidence that the remarkable enhancement was due to local acceleration by chorus waves, as shown in the recent Science paper by Reeves et al. [1]. However, the importance of the dynamic electron source population, transported in from the plasma sheet, to the observed enhancement has not been studied. We illustrate the importance of the source population with our simulation of the event using the DREAM 3D diffusion model. Three new modifications have been implemented in the model: 1) incorporating a realistic, time-dependent low-energy boundary condition at 100 keV obtained from the MagEIS data; 2) utilizing event-specific chorus wave distributions derived from the low-energy electron precipitation observed by POES and validated against the in situ wave data from EMFISIS; 3) using an 'open' boundary condition at L*=11 and implementing electron lifetimes on the order of the drift period outside the solar-wind-driven last closed drift shell. The model quantitatively reproduces the MeV electron dynamics during this event, including the fast dropout at the start of 8 October, the low electron flux during the first Dst dip, and the remarkable enhancement peaking at L*=4.2 during the second Dst dip. By comparing the model results with a realistic source population against those with a constant low-energy boundary (see figure), we find that the realistic electron source population is critical to reproducing the observed fast and significant increase of MeV electrons. [1] Reeves, G. D., et al. (2013), Electron Acceleration in the Heart of the Van Allen Radiation Belts, Science, DOI:10.1126/science.1237743.
    Figure caption: Comparison between data and model results during the October 2012 storm for electrons at μ=3168 MeV/G and K=0.1 G^1/2 R_E. Top: electron phase space density measured by the two Van Allen Probes; middle: DREAM 3D diffusion model results with a realistic electron source population derived from MagEIS data; bottom: model results with a constant source population.

  13. Simulation of realistic retinoscopic measurement

    NASA Astrophysics Data System (ADS)

    Tan, Bo; Chen, Ying-Ling; Baker, K.; Lewis, J. W.; Swartz, T.; Jiang, Y.; Wang, M.

    2007-03-01

    Realistic simulation of ophthalmic measurements on normal and diseased eyes is presented. We use clinical data from ametropic and keratoconus patients to construct anatomically accurate three-dimensional eye models and simulate the measurement of a streak retinoscope with all of its optical elements. The results reproduce the clinical observations, including the anomalous motion in high myopia and the scissors reflex in keratoconus. The demonstrated technique can be applied to other ophthalmic instruments and to other, more severely abnormal eye conditions. It provides promising features for medical training and for evaluating and developing ocular instruments.

  14. Recent developments in track reconstruction and hadron identification at MPD

    NASA Astrophysics Data System (ADS)

    Mudrokh, A.; Zinchenko, A.

    2017-03-01

    To study the detector performance, a Monte Carlo simulation of real detector effects in as much detail as possible has been carried out in place of a simplified Geant point-smearing approach. Results of realistic simulation of the MPD TPC (Time Projection Chamber), including digitization, in central Au+Au collisions have been obtained. Particle identification (PID) has been tuned to account for modifications in the track reconstruction. Results on hadron identification in the TPC and TOF (Time Of Flight) detectors with realistically simulated response have also been obtained.

  15. Development of synthetic simulators for endoscope-assisted repair of metopic and sagittal craniosynostosis.

    PubMed

    Eastwood, Kyle W; Bodani, Vivek P; Haji, Faizal A; Looi, Thomas; Naguib, Hani E; Drake, James M

    2018-06-01

    OBJECTIVE Endoscope-assisted repair of craniosynostosis is a safe and efficacious alternative to open techniques. However, this procedure is challenging to learn, and there is significant variation in both its execution and outcomes. Surgical simulators may allow trainees to learn and practice this procedure prior to operating on an actual patient. The purpose of this study was to develop a realistic, relatively inexpensive simulator for endoscope-assisted repair of metopic and sagittal craniosynostosis and to evaluate the models' fidelity and teaching content. METHODS Two separate, 3D-printed, plastic powder-based replica skulls exhibiting metopic (age 1 month) and sagittal (age 2 months) craniosynostosis were developed. These models were made into consumable skull "cartridges" that insert into a reusable base resembling an infant's head. Each cartridge consists of a multilayer scalp (skin, subcutaneous fat, galea, and periosteum); cranial bones with accurate landmarks; and the dura mater. Data related to model construction, use, and cost were collected. Eleven novice surgeons (residents), 9 experienced surgeons (fellows), and 5 expert surgeons (attendings) performed a simulated metopic and sagittal craniosynostosis repair using a neuroendoscope, high-speed drill, rongeurs, lighted retractors, and suction/irrigation. All participants completed a 13-item questionnaire (using 5-point Likert scales) to rate the realism and utility of the models for teaching endoscope-assisted strip suturectomy. RESULTS The simulators are compact, robust, and relatively inexpensive. They can be rapidly reset for repeated use and contain a minimal amount of consumable material while providing a realistic simulation experience. More than 80% of participants agreed or strongly agreed that the models' anatomical features, including surface anatomy, subgaleal and subperiosteal tissue planes, anterior fontanelle, and epidural spaces, were realistic and contained appropriate detail. 
More than 90% of participants indicated that handling the endoscope and the instruments was realistic, and also that the steps required to perform the procedure were representative of the steps required in real life. CONCLUSIONS Both the metopic and sagittal craniosynostosis simulators were developed using low-cost methods and were successfully designed to be reusable. The simulators were found to realistically represent the surgical procedure and can be used to develop the technical skills required for performing an endoscope-assisted craniosynostosis repair.

  16. The Fogo's Collapse-triggered Megatsunami: Evidence-calibrated Numerical Simulations of Tsunamigenic Potential and Coastal Impact

    NASA Astrophysics Data System (ADS)

    Omira, Rachid; Ramalho, Ricardo S.; Quartau, Rui; Ramalho, Inês; Madeira, José; Baptista, Maria Ana

    2017-04-01

    Volcanic ocean islands are very prominent and dynamic features involving several constructive and destructive phases during their life cycles. Large-scale gravitational flank collapses are one of the most destructive processes and can present a major source of hazard, since it has been shown that these events are capable of triggering megatsunamis with significant coastal impact. The Fogo volcanic island, Cape Verde, presents evidence for giant edifice mass-wasting, as attested by both onshore and offshore evidence. A recent study by Ramalho et al. (2015) revealed the presence of tsunamigenic deposits that attest to the generation of a megatsunami with devastating impact on the nearby Santiago Island, following Fogo's catastrophic collapse. Evidence from northern Santiago implies local minimum run-ups of 270 m, providing a unique physical framework to test collapse-triggered tsunami numerical simulations. In this study, we investigate the tsunamigenic potential associated with Fogo's flank collapse, and its impact on the islands of the Cape Verde archipelago, using field-evidence-calibrated numerical simulations. We first reconstruct the pre-event island morphology, and then employ a multilayer numerical model to simulate the flank failure flow towards and under the sea, and the ensuing tsunami generation, propagation and coastal impact. We use a digital elevation model that considers the coastline configuration and the sea level at the time of the event. Preliminary numerical modeling results suggest that collapsed volumes of 90-150 km³ in a single event generate numerical solutions that are compatible with the field evidence. Our simulations suggest that Fogo's collapse triggered a megatsunami that reached the coast of Santiago in 8 min, with wave heights in excess of 250 m. The tsunami waves propagated with lower amplitudes towards the Cape Verde islands located north of Fogo.
    This study contributes to a more realistic assessment of the scale of risks associated with these extremely rare but very high impact natural disasters. This work is supported by the EU project ASTARTE (Grant 603839, 7th FP, ENV.2013, 6.4-3), the EU project TSUMAPS-NEAM (Agreement Number ECHO/SUB/2015/718568/PREV26), and the FCT project IF/01641/2015 MEGAWAVE.

  17. A multidisciplinary approach to teach responses to weapons of mass destruction and terrorism using combined simulation modalities.

    PubMed

    Kyle, Richard R; Via, Darin K; Lowy, R Joel; Madsen, James M; Marty, Aileen M; Mongan, Paul D

    2004-03-01

    To reinforce concepts presented in the lectures; to understand the complexity and speed of casualty and information generation during a Weapons of Mass Destruction and Terrorism (WMD/T) event; to experience the novelty of combined weapons' effects; to recognize the time course of the various chemical, biological, and radiation agents; and to make challenging decisions with incomplete and conflicting information. Two environments were simulated simultaneously: one a major trauma center emergency room (ER) with two patient simulators and several human actors, the other an Emergency Operations Command Center (EOC). Students for this course included clinicians, scientists, military and intelligence officers, lawyers, administrators, and logistics personnel whose jobs involve planning and executing emergency response plans for WMD/T. SIMULATION SCRIPT: A WMD/T attack in Washington, D.C., has occurred. Clinical students performed their real-life roles in the simulated ER, while nonclinical students did the same in the simulated EOC. Six ER casualties with combined WMD/T injuries were presented and treated over 40 minutes. In the EOC, each person was given his or her role title with an identification tag. The EOC scenario took cues from the action in the ER via two television (TV) news feeds and telephone calls from other Emergency Operations Assets. PERFORMANCE EXPECTATIONS: Students were expected to actively engage in their roles. Student performances were self-evaluated during the debriefing. DEBRIEFING: The two groups were reunited and debriefed utilizing disaster crisis resource management tools. ASSESSMENT OF EFFECTIVENESS: Students answered an 18-point questionnaire to help evaluate the usefulness and acceptance of multimodality patient simulation. Large-scale multimodality patient simulation can be used to train both clinicians and nonclinicians for future WMD/T events.
    Students accepted the simulation experience and thought that the scenario was appropriately realistic, complex, and overwhelming. Difficulties include the extensive man-hours involved in designing and presenting the live simulations. EOC-only sessions could be staged with only a few video cassette recorders, TVs, telephones, and callers.

  18. Semi-Autonomous Control with Cyber-Pain for Artificial Muscles and Smart Structures

    DTIC Science & Technology

    2010-09-15

    …(local controller) to avoid some key failure modes. Our approach has built on our developments in dynamic self-sensing and realistic simulation of DEA electromechanics… …strains [4]. In its natural state, long polymer backbones are entangled with intermittent cross-links tying neighbouring backbones together. The soft…

  19. Realistic Modeling of Multi-Scale MHD Dynamics of the Solar Atmosphere

    NASA Technical Reports Server (NTRS)

    Kitiashvili, Irina; Mansour, Nagi N.; Wray, Alan; Couvidat, Sebastian; Yoon, Seokkwan; Kosovichev, Alexander

    2014-01-01

    Realistic 3D radiative MHD simulations open new perspectives for understanding the turbulent dynamics of the solar surface, its coupling to the atmosphere, and the physical mechanisms of generation and transport of non-thermal energy. Traditionally, plasma eruptions and wave phenomena in the solar atmosphere are modeled by prescribing artificial driving mechanisms using magnetic or gas pressure forces that might arise from magnetic field emergence or reconnection instabilities. In contrast, our 'ab initio' simulations provide a realistic description of solar dynamics naturally driven by solar energy flow. By simulating the upper convection zone and the solar atmosphere, we can investigate in detail the physical processes of turbulent magnetoconvection, generation and amplification of magnetic fields, excitation of MHD waves, and plasma eruptions. We present recent simulation results of the multi-scale dynamics of quiet-Sun regions, and energetic effects in the atmosphere and compare with observations. For the comparisons we calculate synthetic spectro-polarimetric data to model observational data of SDO, Hinode, and New Solar Telescope.

  20. First Results of an “Artificial Retina” Processor Prototype

    DOE PAGES

    Cenci, Riccardo; Bedeschi, Franco; Marino, Pietro; ...

    2016-11-15

    We report on the performance of a specialized processor capable of reconstructing charged-particle tracks in a realistic LHC silicon tracker detector, at the same speed as the readout and with sub-microsecond latency. The processor is based on an innovative pattern-recognition algorithm, called the “artificial retina algorithm”, inspired by the vision system of mammals. A prototype of the processor has been designed, simulated, and implemented on Tel62 boards equipped with high-bandwidth Altera Stratix III FPGA devices. The prototype is a first step towards a real-time track reconstruction device aimed at processing complex events of high-luminosity LHC experiments at a 40 MHz crossing rate.

  1. Modelling remediation scenarios in historical mining catchments.

    PubMed

    Gamarra, Javier G P; Brewer, Paul A; Macklin, Mark G; Martin, Katherine

    2014-01-01

    Local remediation measures, particularly those undertaken in historical mining areas, can often be ineffective or even deleterious because erosion and sedimentation processes operate at spatial scales beyond those typically used in point-source remediation. Based on realistic simulations of a hybrid landscape evolution model combined with stochastic rainfall generation, we demonstrate that similar remediation strategies may result in differing effects across three contrasting European catchments depending on their topographic and hydrologic regimes. Based on these results, we propose a conceptual model of catchment-scale remediation effectiveness based on three basic catchment characteristics: the degree of contaminant source coupling, the ratio of contaminated to non-contaminated sediment delivery, and the frequency of sediment transport events.

  2. A fast low-to-high confinement mode bifurcation dynamics in the boundary-plasma gyrokinetic code XGC1

    NASA Astrophysics Data System (ADS)

    Ku, S.; Chang, C. S.; Hager, R.; Churchill, R. M.; Tynan, G. R.; Cziegler, I.; Greenwald, M.; Hughes, J.; Parker, S. E.; Adams, M. F.; D'Azevedo, E.; Worley, P.

    2018-05-01

    A fast edge turbulence suppression event has been simulated in the electrostatic version of the gyrokinetic particle-in-cell code XGC1 in a realistic diverted tokamak edge geometry under neutral particle recycling. The results show that turbulent Reynolds stress followed by neoclassical ion orbit-loss driven flows together conspire to form the sustaining radial electric field shear and to quench turbulent transport just inside the last closed magnetic flux surface. The main suppression action is located in a thin radial layer around ψ_N ≈ 0.96-0.98, where ψ_N is the normalized poloidal flux, with a time scale of ~0.1 ms.

  3. First Results of an “Artificial Retina” Processor Prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cenci, Riccardo; Bedeschi, Franco; Marino, Pietro

    We report on the performance of a specialized processor capable of reconstructing charged-particle tracks in a realistic LHC silicon tracker detector, at the same speed as the readout and with sub-microsecond latency. The processor is based on an innovative pattern-recognition algorithm, called the “artificial retina algorithm”, inspired by the vision system of mammals. A prototype of the processor has been designed, simulated, and implemented on Tel62 boards equipped with high-bandwidth Altera Stratix III FPGA devices. The prototype is a first step towards a real-time track reconstruction device aimed at processing complex events of high-luminosity LHC experiments at a 40 MHz crossing rate.

  4. Dry matter partitioning models for the simulation of individual fruit growth in greenhouse cucumber canopies

    PubMed Central

    Wiechers, Dirk; Kahlen, Katrin; Stützel, Hartmut

    2011-01-01

    Background and Aims Growth imbalances between individual fruits are common in indeterminate plants such as cucumber (Cucumis sativus). In this species, these imbalances can be related to differences in two growth characteristics: fruit growth duration until reaching a given size, and fruit abortion. Both are related to assimilate distribution, and environmental factors as well as canopy architecture play a key role in their differentiation. Furthermore, events in which a fruit reaches its harvestable size before, or simultaneously with, a prior fruit can be observed. Functional–structural plant models (FSPMs) allow for interactions between environmental factors, canopy architecture and physiological processes. Here, we tested hypotheses which account for these interactions by introducing dominance and abortion thresholds for the partitioning of assimilates between growing fruits. Methods Using the L-System formalism, an FSPM was developed which combined a model for architectural development, a biochemical model of photosynthesis and a model for assimilate partitioning, the last including a fruit growth model based on a size-related potential growth rate (RP). Starting from a distribution proportional to RP, the model was extended by including abortion and dominance. Abortion was related to source strength and dominance to sink strength. Both thresholds were varied to test their influence on fruit growth characteristics. Simulations were conducted for a dense row and a sparse isometric canopy. Key Results The simple partitioning models failed to simulate individual fruit growth realistically. The introduction of abortion and dominance thresholds gave the best results. Simulated fruit growth durations and abortion rates were in line with measurements, and events in which a fruit was harvestable earlier than an older fruit were reproduced. Conclusions Dominance and abortion events need to be considered when simulating typical fruit growth traits.
    By integrating environmental factors, the FSPM can be a valuable tool for analysing and improving existing knowledge about the dynamics of assimilate partitioning. PMID:21715366
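
    The threshold-based partitioning hypothesis can be sketched as a single allocation step: shares start proportional to each fruit's potential growth rate RP, a fruit whose relative sink strength exceeds a dominance threshold pre-empts supply, and fruits whose supply/demand ratio falls below an abortion threshold are dropped. This is a hypothetical illustration, not the paper's FSPM code, and all numbers and threshold values are invented for the example:

    ```python
    def partition(rp, supply, dominance=0.5, abortion=0.2):
        """Return per-fruit assimilate allocation; aborted fruits get 0."""
        total_rp = sum(rp)
        alloc = [0.0] * len(rp)
        remaining = supply
        # Dominant fruits (large relative sink strength) are served first, up to demand.
        order = sorted(range(len(rp)), key=lambda i: rp[i], reverse=True)
        for i in order:
            if rp[i] / total_rp >= dominance:
                alloc[i] = min(rp[i], remaining)
                remaining -= alloc[i]
        # Remaining fruits share what is left in proportion to RP.
        rest = [i for i in order if alloc[i] == 0.0]
        rest_rp = sum(rp[i] for i in rest)
        for i in rest:
            alloc[i] = remaining * rp[i] / rest_rp if rest_rp else 0.0
        # Source limitation: fruits far below their demand are aborted.
        return [a if rp[i] == 0 or a / rp[i] >= abortion else 0.0
                for i, a in enumerate(alloc)]

    low = partition([5.0, 2.0, 1.0], 2.0)    # scarce supply: dominant fruit pre-empts, others abort
    ample = partition([2.0, 2.0, 2.0], 6.0)  # ample supply: proportional shares, no abortion
    assert low == [2.0, 0.0, 0.0]
    assert ample == [2.0, 2.0, 2.0]
    ```

    In the full FSPM this step would run each time step, with RP driven by fruit size and supply by the photosynthesis model, so that dominance and abortion emerge dynamically rather than from one static allocation.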

  5. On the Lack of Stratospheric Dynamical Variability in Low-top Versions of the CMIP5 Models

    NASA Technical Reports Server (NTRS)

    Charlton-Perez, Andrew J.; Baldwin, Mark P.; Birner, Thomas; Black, Robert X.; Butler, Amy H.; Calvo, Natalia; Davis, Nicholas A.; Gerber, Edwin P.; Gillett, Nathan; Hardiman, Steven; hide

    2013-01-01

    We describe the main differences in simulations of stratospheric climate and variability by models within the fifth Coupled Model Intercomparison Project (CMIP5) that have a model top above the stratopause and relatively fine stratospheric vertical resolution (high-top), and those that have a model top below the stratopause (low-top). Although the simulation of mean stratospheric climate by the two model ensembles is similar, the low-top model ensemble has very weak stratospheric variability on daily and interannual time scales. The frequency of major sudden stratospheric warming events is strongly underestimated by the low-top models with less than half the frequency of events observed in the reanalysis data and high-top models. The lack of stratospheric variability in the low-top models affects their stratosphere-troposphere coupling, resulting in short-lived anomalies in the Northern Annular Mode, which do not produce long-lasting tropospheric impacts, as seen in observations. The lack of stratospheric variability, however, does not appear to have any impact on the ability of the low-top models to reproduce past stratospheric temperature trends. We find little improvement in the simulation of decadal variability for the high-top models compared to the low-top, which is likely related to the fact that neither ensemble produces a realistic dynamical response to volcanic eruptions.

  6. The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)

    NASA Astrophysics Data System (ADS)

    Moradi, I.; Prive, N.; McCarty, W.; Errico, R. M.; Gelaro, R.

    2017-12-01

    This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations, e.g., adding errors to simulated observations, are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluation of the impact of simulated observations on forecast skill.
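The error-addition step mentioned above can be illustrated in miniature: "perfect" observations drawn from a nature run are perturbed with a systematic bias plus Gaussian random error so that the simulated observations carry realistic error statistics. The numbers and function names below are invented for illustration, not GMAO's.

```python
import random

def perturb(obs, bias=0.1, sigma=0.5, seed=42):
    """Add a constant bias plus Gaussian random error to each observation."""
    rng = random.Random(seed)
    return [o + bias + rng.gauss(0.0, sigma) for o in obs]

# hypothetical "truth" values, e.g. brightness temperatures in kelvin
truth = [250.0, 251.5, 249.8]
simulated_obs = perturb(truth)
residuals = [s - t for s, t in zip(simulated_obs, truth)]
```

In a real OSSE the error model is far richer (correlated, channel- and situation-dependent), but the principle is the same: the assimilation system should see observation-minus-truth statistics resembling those of real instruments.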

  7. The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)

    NASA Technical Reports Server (NTRS)

    Moradi, Isaac; Prive, Nikki; McCarty, Will; Errico, Ronald M.; Gelaro, Ron

    2017-01-01

    This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations, e.g., adding errors to simulated observations, are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluation of the impact of simulated observations on forecast skill.

  8. The VIIRS Ocean Data Simulator Enhancements and Results

    NASA Technical Reports Server (NTRS)

    Robinson, Wayne D.; Patt, Fredrick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.

    2011-01-01

    The VIIRS Ocean Science Team (VOST) has been developing an Ocean Data Simulator to create realistic VIIRS SDR datasets based on MODIS water-leaving radiances. The simulator is helping to assess instrument performance and scientific processing algorithms. Several changes were made in the last two years to complete the simulator and broaden its usefulness. The simulator is now fully functional and includes all sensor characteristics measured during prelaunch testing, including electronic and optical crosstalk influences, polarization sensitivity, and relative spectral response. Also included is the simulation of cloud and land radiances to make more realistic data sets and to understand their important influence on nearby ocean color data. The atmospheric tables used in the processing, including aerosol and Rayleigh reflectance coefficients, have been modeled using VIIRS relative spectral responses. The capabilities of the simulator were expanded to work in an unaggregated sample mode and to produce scans with additional samples beyond the standard scan. These features improve the capability to realistically add artifacts which act upon individual instrument samples prior to aggregation and which may originate from beyond the actual scan boundaries. The simulator was expanded to simulate all 16 M-bands and the EDR processing was improved to use these bands to make an SST product. The simulator is being used to generate global VIIRS data from and in parallel with the MODIS Aqua data stream. Studies have been conducted using the simulator to investigate the impact of instrument artifacts. This paper discusses the simulator improvements and results from the artifact impact studies.

  9. The VIIRS ocean data simulator enhancements and results

    NASA Astrophysics Data System (ADS)

    Robinson, Wayne D.; Patt, Frederick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.

    2011-10-01

    The VIIRS Ocean Science Team (VOST) has been developing an Ocean Data Simulator to create realistic VIIRS SDR datasets based on MODIS water-leaving radiances. The simulator is helping to assess instrument performance and scientific processing algorithms. Several changes were made in the last two years to complete the simulator and broaden its usefulness. The simulator is now fully functional and includes all sensor characteristics measured during prelaunch testing, including electronic and optical crosstalk influences, polarization sensitivity, and relative spectral response. Also included is the simulation of cloud and land radiances to make more realistic data sets and to understand their important influence on nearby ocean color data. The atmospheric tables used in the processing, including aerosol and Rayleigh reflectance coefficients, have been modeled using VIIRS relative spectral responses. The capabilities of the simulator were expanded to work in an unaggregated sample mode and to produce scans with additional samples beyond the standard scan. These features improve the capability to realistically add artifacts which act upon individual instrument samples prior to aggregation and which may originate from beyond the actual scan boundaries. The simulator was expanded to simulate all 16 M-bands and the EDR processing was improved to use these bands to make an SST product. The simulator is being used to generate global VIIRS data from and in parallel with the MODIS Aqua data stream. Studies have been conducted using the simulator to investigate the impact of instrument artifacts. This paper discusses the simulator improvements and results from the artifact impact studies.

  10. Modeling of ultrasonic wave propagation in composite laminates with realistic discontinuity representation.

    PubMed

    Zelenyak, Andreea-Manuela; Schorer, Nora; Sause, Markus G R

    2018-02-01

    This paper presents a method for embedding realistic defect geometries of a fiber reinforced material in a finite element modeling environment in order to simulate active ultrasonic inspection. When ultrasonic inspection is used experimentally to investigate the presence of defects in composite materials, the microscopic defect geometry may cause signal characteristics that are difficult to interpret. Hence, modeling of this interaction is key to improving our understanding and interpretation of the acquired ultrasonic signals. To model the true interaction of the ultrasonic wave field with such defect structures as pores, cracks or delamination, a realistic three-dimensional geometry reconstruction is required. We present a 3D-image-based reconstruction process which converts computed tomography data into adequate surface representations ready to be embedded for processing with finite element methods. Subsequent modeling using these geometries uses a multi-scale and multi-physics simulation approach which results in quantitative A-scan ultrasonic signals which can be directly compared with experimental signals. Therefore, besides the properties of the composite material, a full transducer implementation, piezoelectric conversion and simultaneous modeling of the attached circuit are applied. Comparison between simulated and experimental signals provides very good agreement in electrical voltage amplitude and signal arrival time and thus validates the proposed modeling approach. Simulating ultrasound wave propagation in a medium with a realistic geometry clearly shows a difference in how the disturbance of the waves takes place and finally allows more realistic modeling of A-scans. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Random species loss underestimates dilution effects of host diversity on foliar fungal diseases under fertilization.

    PubMed

    Liu, Xiang; Chen, Fei; Lyu, Shengman; Sun, Dexin; Zhou, Shurong

    2018-02-01

    With increasing attention being paid to the consequences of global biodiversity losses, several recent studies have demonstrated that realistic species losses can have larger impacts than random species losses on community productivity and resilience. However, little is known about the effects of the order in which species are lost on biodiversity-disease relationships. Using a multiyear nitrogen addition and artificial warming experiment in natural assemblages of alpine meadow vegetation on the Qinghai-Tibetan Plateau, we inferred the sequence of plant species losses under fertilization/warming. This sequence was then used to simulate species loss orders (both realistic and random) in an adjacent removal experiment manipulating plot-level plant diversity. We explicitly compared the effect sizes of random versus realistic species losses simulated from fertilization/warming on plant foliar fungal diseases. We found that realistic species losses simulated from fertilization had greater effects than random losses on fungal diseases, and that species identity drove the diversity-disease relationship. Moreover, the plant species most prone to foliar fungal diseases were also the least vulnerable to extinction under fertilization, demonstrating the importance of protecting low-competence species (those with a low ability to maintain and transmit fungal infections) to impede the spread of infectious disease. In contrast, there was no difference between random and realistic species loss scenarios simulated from experimental warming (or the combination of warming and fertilization) on the diversity-disease relationship, indicating that the functional consequences of species losses may vary under different drivers.
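The realistic-versus-random contrast can be sketched with a toy community. All of the numbers here are invented: each species gets a disease "competence" value, the realistic scenario removes the least-competent species first (as the abstract reports happens under fertilization), and the random scenario averages over shuffled removal orders.

```python
import random
import statistics

# hypothetical per-species disease competence (ability to maintain/transmit infection)
competence = {"A": 0.9, "B": 0.7, "C": 0.4, "D": 0.2, "E": 0.1}
# realistic order: least-competent species are lost first
realistic_order = sorted(competence, key=competence.get)

def community_disease(lost):
    """Mean competence of remaining species, a crude disease-risk proxy."""
    left = [v for k, v in competence.items() if k not in lost]
    return statistics.mean(left) if left else 0.0

def scenario(order, n_lost):
    return community_disease(set(order[:n_lost]))

rng = random.Random(0)
random_orders = [rng.sample(list(competence), len(competence)) for _ in range(200)]
realistic = scenario(realistic_order, 2)                       # lose the 2 least-competent species
randomised = statistics.mean(scenario(o, 2) for o in random_orders)
```

Because the low-competence species go extinct first, the remaining community's mean competence (`realistic`) exceeds the random-loss expectation (`randomised`), mirroring the result that random loss underestimates the effect on disease.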

  12. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    PubMed

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

    Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible enough to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry- and mesh-handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model-building interface and written in a low-level language for computational efficiency. The connection to the geometry-handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much like an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines.
Since the overall design effectively hides the complexity of managing the geometry and meshes, newly developed methods may be tested in a realistic setting already at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.
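The core idea behind the RDME formalism can be shown in a minimal sketch: a Gillespie-style stochastic simulation on a tiny one-dimensional lattice, where molecules of a single species either degrade (rate k per molecule) or jump to a neighbouring voxel (rate d per molecule per direction). URDME itself handles unstructured 3-D meshes and arbitrary reaction networks; this toy, with invented rates, shows only the event-selection loop.

```python
import random

def rdme_ssa(n=(0, 50, 0), k=0.1, d=1.0, t_end=5.0, seed=1):
    """Exact SSA for degradation + diffusion on a 1-D voxel lattice."""
    rng = random.Random(seed)
    n = list(n)
    t = 0.0
    while t < t_end:
        # build propensities: degradation per voxel, jumps left/right
        a, events = [], []
        for i, ni in enumerate(n):
            a.append(k * ni); events.append(("deg", i, None))
            if i > 0:
                a.append(d * ni); events.append(("jump", i, i - 1))
            if i < len(n) - 1:
                a.append(d * ni); events.append(("jump", i, i + 1))
        a0 = sum(a)
        if a0 == 0.0:
            break                          # nothing left to happen
        t += rng.expovariate(a0)           # exponential time to next event
        r = rng.uniform(0.0, a0)           # choose event weighted by propensity
        acc = 0.0
        for prop, (kind, i, j) in zip(a, events):
            acc += prop
            if r <= acc:
                n[i] -= 1
                if kind == "jump":
                    n[j] += 1
                break
    return n

final = rdme_ssa()
```

Production codes avoid rebuilding the full propensity list at every step (e.g. with dependency graphs and the next-subvolume method), which is part of why the abstract stresses computational efficiency.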

  13. Simulation training tools for nonlethal weapons using gaming environments

    NASA Astrophysics Data System (ADS)

    Donne, Alexsana; Eagan, Justin; Tse, Gabriel; Vanderslice, Tom; Woods, Jerry

    2006-05-01

    Modern simulation techniques have a growing role for evaluating new technologies and for developing cost-effective training programs. A mission simulator facilitates the productive exchange of ideas by demonstration of concepts through compellingly realistic computer simulation. Revolutionary advances in 3D simulation technology have made it possible for desktop computers to process strikingly realistic and complex interactions with results depicted in real-time. Computer games now allow for multiple real human players and "artificially intelligent" (AI) simulated robots to play together. Advances in computer processing power have compensated for the inherent intensive calculations required for complex simulation scenarios. The main components of the leading game engines have been released for user modifications, enabling game enthusiasts and amateur programmers to advance the state-of-the-art in AI and computer simulation technologies. It is now possible to simulate sophisticated and realistic conflict situations in order to evaluate the impact of non-lethal devices as well as conflict resolution procedures using such devices. Simulations can reduce training costs as end users: learn what a device does and doesn't do prior to use, understand responses to the device prior to deployment, determine if the device is appropriate for their situational responses, and train with new devices and techniques before purchasing hardware. This paper will present the status of SARA's mission simulation development activities, based on the Half-Life game engine, for the purpose of evaluating the latest non-lethal weapon devices, and for developing training tools for such devices.

  14. Creating a Realistic Weather Environment for Motion-Based Piloted Flight Simulation

    NASA Technical Reports Server (NTRS)

    Daniels, Taumi S.; Schaffner, Philip R.; Evans, Emory T.; Neece, Robert T.; Young, Steve D.

    2012-01-01

    A flight simulation environment is being enhanced to facilitate experiments that evaluate research prototypes of advanced onboard weather radar, hazard/integrity monitoring (HIM), and integrated alerting and notification (IAN) concepts in adverse weather conditions. The simulation environment uses weather data based on real weather events to support operational scenarios in a terminal area. A simulated atmospheric environment was realized by using numerical weather data sets. These were produced from the High-Resolution Rapid Refresh (HRRR) model hosted and run by the National Oceanic and Atmospheric Administration (NOAA). To align with the planned flight simulation experiment requirements, several HRRR data sets were acquired courtesy of NOAA. These data sets coincided with severe weather events at the Memphis International Airport (MEM) in Memphis, TN. In addition, representative flight tracks for approaches and departures at MEM were generated and used to develop and test simulations of (1) what onboard sensors such as the weather radar would observe; (2) what datalinks of weather information would provide; and (3) what atmospheric conditions the aircraft would experience (e.g. turbulence, winds, and icing). The simulation includes a weather radar display that provides weather and turbulence modes, derived from the modeled weather along the flight track. The radar capabilities and the pilot's controls simulate current-generation commercial weather radar systems. Appropriate data-linked weather advisories (e.g., SIGMET) were derived from the HRRR weather models and provided to the pilot consistent with NextGen concepts of use for Aeronautical Information Service (AIS) and Meteorological (MET) data link products.
The net result of this simulation development was the creation of an environment that supports investigations of new flight deck information systems, methods for incorporation of better weather information, and pilot interface and operational improvements for better aviation safety. This research is part of a larger effort at NASA to study the impact of the growing complexity of operations, information, and systems on crew decision-making and response effectiveness; and then to recommend methods for improving future designs.

  15. A fast analytical undulator model for realistic high-energy FEL simulations

    NASA Astrophysics Data System (ADS)

    Tatchyn, R.; Cremer, T.

    1997-02-01

    A number of leading FEL simulation codes used for modeling gain in the ultralong undulators required for SASE saturation in the <100 Å range employ simplified analytical models both for field and error representations. Although it is recognized that both the practical and theoretical validity of such codes could be enhanced by incorporating realistic undulator field calculations, the computational cost of doing this can be prohibitive, especially for point-to-point integration of the equations of motion through each undulator period. In this paper we describe a simple analytical model suitable for modeling realistic permanent magnet (PM), hybrid/PM, and non-PM undulator structures, and discuss selected techniques for minimizing computation time.

  16. Investigation of nozzle flow and cavitation characteristics in a diesel injector.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Som, S.; Ramirez, A.; Aggarwal, S.

    2010-04-01

    Cavitation and turbulence inside a diesel injector play a critical role in primary spray breakup and development processes. The study of cavitation in realistic injectors is challenging, both theoretically and experimentally, since the associated two-phase flow field is turbulent and highly complex, characterized by large pressure gradients and small orifice geometries. We report herein a computational investigation of the internal nozzle flow and cavitation characteristics in a diesel injector. A mixture-based model in FLUENT V6.2 software is employed for simulations. In addition, a new criterion for cavitation inception based on the total stress is implemented, and its effectiveness in predicting cavitation is evaluated. Results indicate that under realistic diesel engine conditions, cavitation patterns inside the orifice are influenced by the new cavitation criterion. Simulations are validated using the available two-phase nozzle flow data and the rate of injection measurements at various injection pressures (800-1600 bar) from the present study. The computational model is then used to characterize the effects of important injector parameters on the internal nozzle flow and cavitation behavior, as well as on flow properties at the nozzle exit. The parameters include injection pressure, needle lift position, and fuel type. The propensity of cavitation for different on-fleet diesel fuels is compared with that for n-dodecane, a diesel fuel surrogate. Results indicate that the cavitation characteristics of n-dodecane are significantly different from those of the other three fuels investigated. The effect of needle movement on cavitation is investigated by performing simulations at different needle lift positions. Cavitation patterns are seen to shift dramatically as the needle lift position is changed during an injection event.
The region of significant cavitation shifts from the top of the orifice to the bottom as the needle position is changed from fully open (0.275 mm) to nearly closed (0.1 mm), and this behavior can be attributed to the effect of needle position on flow patterns upstream of the orifice. The results demonstrate the capability of the cavitation model to predict cavitating nozzle flows in realistic diesel injectors and to provide boundary conditions, in terms of vapor fraction, velocity, and turbulence parameters at the nozzle exit, which can be coupled with the primary breakup simulation.
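The influence of injection pressure on cavitation propensity is often summarised with the standard cavitation number, CN = (p_inj − p_back)/(p_back − p_vap); this is the conventional definition, not necessarily the total-stress criterion the paper introduces, and the back pressure and vapor pressure below are assumed values.

```python
def cavitation_number(p_inj, p_back, p_vap=2.3e3):
    """CN = (p_inj - p_back) / (p_back - p_vap), pressures in Pa.
    Higher CN indicates a greater propensity for nozzle cavitation."""
    return (p_inj - p_back) / (p_back - p_vap)

# endpoints of the study's 800-1600 bar injection-pressure range,
# with an assumed 60 bar back pressure
cn_low = cavitation_number(800e5, 60e5)
cn_high = cavitation_number(1600e5, 60e5)
```

Doubling the injection pressure roughly doubles CN here, consistent with the general expectation that higher rail pressures promote cavitation inception in the orifice.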

  17. Contingency Support Simulation for the Tracking and Data Relay Satellite System (TDRSS)

    NASA Technical Reports Server (NTRS)

    Dykes, Andy; Dunham, Joan; Ward, Douglas T.; Robertson, Mika; Nesbit, Gary

    2007-01-01

    In March 2006, the Tracking and Data Relay Satellite (TDRS)-3 experienced an unexpected thrusting event, which caused significant changes to its orbit. Recovery from this anomaly was protracted, raising concerns during the Independent Review Team (IRT) investigation of the anomaly regarding the contingency response readiness. The simulations and readiness exercises discussed in this paper were part of the response to the IRT concerns. This paper explains the various levels of simulation needed to enhance the proficiency of the Flight Dynamics Facility (FDF) and supporting elements in recovery from a TDRS contingency situation. The main emergency to address is when a TDRS has experienced uncommanded, unreported, or misreported thrusting, causing a ground station to lose the ability to acquire the spacecraft, as happened in 2006. The following levels of simulation are proposed: 1) Tests that would be performed by the individual support sites to verify that internal procedures and tools are in place and up to date; 2) Tabletop simulations that would involve all of the key support sites talking through their respective operating procedures to ensure that proper notifications are made and communications links are established; and 3) Comprehensive simulations that would be infrequent, but realistic, involving data exchanges between ground sites and voice and electronic communications among the supporting elements.

  18. Raising Money and Cultivating Donors through Special Events.

    ERIC Educational Resources Information Center

    Harris, April L.

    This book provides "how to" information for those who have never managed a fund-raising event for a college or university, and gives advice to help the experienced professional fund-raiser fine-tune traditional or new events. It covers the essential steps in event planning such as developing a realistic budget, and includes a budget planning…

  19. Impacts of Realistic Urban Heating, Part I: Spatial Variability of Mean Flow, Turbulent Exchange and Pollutant Dispersion

    NASA Astrophysics Data System (ADS)

    Nazarian, Negin; Martilli, Alberto; Kleissl, Jan

    2018-03-01

    As urbanization progresses, more realistic methods are required to analyze the urban microclimate. However, given the complexity and computational cost of numerical models, the effects of realistic representations should be evaluated to identify the level of detail required for an accurate analysis. We consider the realistic representation of surface heating in an idealized three-dimensional urban configuration, and evaluate the spatial variability of flow statistics (mean flow and turbulent fluxes) in urban streets. Large-eddy simulations coupled with an urban energy balance model are employed, and the heating distribution of urban surfaces is parametrized using sets of horizontal and vertical Richardson numbers, characterizing thermal stratification and heating orientation with respect to the wind direction. For all studied conditions, the thermal field is strongly affected by the orientation of heating with respect to the airflow. The modification of airflow by the horizontal heating is also pronounced for strongly unstable conditions. The formation of the canyon vortices is affected by the three-dimensional heating distribution in both spanwise and streamwise street canyons, such that the secondary vortex is seen adjacent to the windward wall. For the dispersion field, however, the overall heating of urban surfaces, and more importantly, the vertical temperature gradient, dominate the distribution of concentration and the removal of pollutants from the building canyon. Accordingly, the spatial variability of concentration is not significantly affected by the detailed heating distribution. The analysis is extended to assess the effects of three-dimensional surface heating on turbulent transfer. 
Quadrant analysis reveals that the differential heating also affects the dominance of ejection and sweep events and the efficiency of turbulent transfer (exuberance) within the street canyon and at the roof level, while the vertical variation of these parameters is less dependent on the detailed heating of urban facets.
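The thermal-stratification parameter used above can be illustrated with a bulk Richardson number of the form Ri = g·H·Δθ/(θ0·U²). The paper distinguishes horizontal and vertical Richardson numbers to encode heating orientation relative to the wind; the single bulk form and the numbers below are a simplified, assumed illustration.

```python
def bulk_richardson(dtheta, theta0=300.0, height=20.0, u=2.0, g=9.81):
    """Bulk Richardson number Ri = g*H*dtheta / (theta0*u^2).

    dtheta: potential-temperature difference (air aloft minus surface), K.
    Negative Ri -> unstable (heated surface); magnitude -> stratification strength.
    """
    return g * height * dtheta / (theta0 * u * u)

ri_unstable = bulk_richardson(dtheta=-5.0)  # surface 5 K warmer than the air
ri_neutral = bulk_richardson(dtheta=0.0)
```

A strongly negative Ri corresponds to the "strongly unstable" cases in which the abstract reports the largest modification of the canyon flow.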

  20. Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow

    NASA Astrophysics Data System (ADS)

    Gupta, Atma Ram; Kumar, Ashwani

    2017-12-01

    Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and temporal factors such as the season of the year, the day of the week or the time of day. For deterministic radial distribution load flow studies the load is taken as constant, but load varies continually with a high degree of uncertainty, so there is a need to model a probable realistic load. Monte-Carlo simulation is used to model the probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load, and by solving a deterministic radial load flow with these values. The probabilistic solution is reconstructed from the deterministic data obtained for each simulation. The main contributions of the work are: finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; finding the impact of probable realistic ZIP load modeling on unbalanced radial distribution load flow; and comparing the voltage profile and losses under probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
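The Monte-Carlo procedure described above can be sketched directly: sample active power for each bus from its mean and standard deviation, run a deterministic load flow per sample, and build the probabilistic solution from the ensemble. The `toy_load_flow` below is a deliberately crude stand-in (voltage drop proportional to accumulated load) for a real backward/forward-sweep radial solver, and all numbers are invented.

```python
import random
import statistics

def toy_load_flow(p_loads, v_source=1.0, r=0.01):
    """Stand-in deterministic solver: per-unit voltage drops along the feeder."""
    v, volts = v_source, []
    for p in p_loads:
        v -= r * p          # crude cumulative drop, NOT a real power-flow solution
        volts.append(v)
    return volts

def probabilistic_load_flow(means, sigmas, n_runs=1000, seed=7):
    """Sample loads, solve deterministically each time, aggregate end-bus voltage."""
    rng = random.Random(seed)
    end_bus = []
    for _ in range(n_runs):
        sample = [rng.gauss(m, s) for m, s in zip(means, sigmas)]
        end_bus.append(toy_load_flow(sample)[-1])
    return statistics.mean(end_bus), statistics.stdev(end_bus)

v_mean, v_std = probabilistic_load_flow([1.0, 1.5, 2.0], [0.1, 0.15, 0.2])
```

The ensemble mean recovers the deterministic answer for the mean load, while the spread quantifies the voltage uncertainty that a constant-load study would miss.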

  1. High Order Accurate Finite Difference Modeling of Seismo-Acoustic Wave Propagation in a Moving Atmosphere and a Heterogeneous Earth Model Coupled Across a Realistic Topography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petersson, N. Anders; Sjogreen, Bjorn

    Here, we develop a numerical method for simultaneously simulating acoustic waves in a realistic moving atmosphere and seismic waves in a heterogeneous earth model, where the motions are coupled across a realistic topography. We model acoustic wave propagation by solving the linearized Euler equations of compressible fluid mechanics. The seismic waves are modeled by the elastic wave equation in a heterogeneous anisotropic material. The motion is coupled by imposing continuity of normal velocity and normal stresses across the topographic interface. Realistic topography is resolved on a curvilinear grid that follows the interface. The governing equations are discretized using high order accurate finite difference methods that satisfy the principle of summation by parts. We apply the energy method to derive the discrete interface conditions and to show that the coupled discretization is stable. The implementation is verified by numerical experiments, and we demonstrate a simulation of coupled wave propagation in a windy atmosphere and a realistic earth model with non-planar topography.
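The summation-by-parts (SBP) property mentioned above can be demonstrated with the simplest second-order SBP first-derivative operator: central differences in the interior, one-sided stencils at the boundaries, together with a diagonal norm H = h·diag(1/2, 1, …, 1, 1/2). The discrete identity uᵀHDv + (Du)ᵀHv = u_N·v_N − u_0·v_0 mimics integration by parts, which is what makes energy-stable interface coupling possible. This is a textbook sketch, not the paper's high-order operators.

```python
def sbp_d1(u, h):
    """Second-order SBP first derivative: central interior, one-sided boundaries."""
    n = len(u)
    du = [0.0] * n
    du[0] = (u[1] - u[0]) / h
    du[-1] = (u[-1] - u[-2]) / h
    for i in range(1, n - 1):
        du[i] = (u[i + 1] - u[i - 1]) / (2.0 * h)
    return du

def sbp_weights(n, h):
    """Diagonal norm H: trapezoidal-rule quadrature weights."""
    w = [h] * n
    w[0] = w[-1] = 0.5 * h
    return w

# verify the SBP identity on arbitrary grid functions
u = [0.3, 1.1, -0.4, 2.0, 0.7, -1.2]
v = [1.0, 0.5, -0.2, 0.9, -1.5, 0.4]
h = 0.2
du, dv, w = sbp_d1(u, h), sbp_d1(v, h), sbp_weights(len(u), h)
lhs = sum(wi * (ui * dvi + dui * vi)
          for wi, ui, dvi, dui, vi in zip(w, u, dv, du, v))
rhs = u[-1] * v[-1] - u[0] * v[0]   # only boundary terms survive
```

For this operator the identity holds exactly (to rounding), so discrete "energy" can only enter or leave through the boundaries, mirroring the energy-method stability argument in the abstract.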

  2. High Order Accurate Finite Difference Modeling of Seismo-Acoustic Wave Propagation in a Moving Atmosphere and a Heterogeneous Earth Model Coupled Across a Realistic Topography

    DOE PAGES

    Petersson, N. Anders; Sjogreen, Bjorn

    2017-04-18

    Here, we develop a numerical method for simultaneously simulating acoustic waves in a realistic moving atmosphere and seismic waves in a heterogeneous earth model, where the motions are coupled across a realistic topography. We model acoustic wave propagation by solving the linearized Euler equations of compressible fluid mechanics. The seismic waves are modeled by the elastic wave equation in a heterogeneous anisotropic material. The motion is coupled by imposing continuity of normal velocity and normal stresses across the topographic interface. Realistic topography is resolved on a curvilinear grid that follows the interface. The governing equations are discretized using high order accurate finite difference methods that satisfy the principle of summation by parts. We apply the energy method to derive the discrete interface conditions and to show that the coupled discretization is stable. The implementation is verified by numerical experiments, and we demonstrate a simulation of coupled wave propagation in a windy atmosphere and a realistic earth model with non-planar topography.

  3. Variable rainfall intensity and tillage effects on runoff, sediment, and carbon losses from a loamy sand under simulated rainfall.

    PubMed

    Truman, C C; Strickland, T C; Potter, T L; Franklin, D H; Bosch, D D; Bednarz, C W

    2007-01-01

    The low-carbon, intensively cropped Coastal Plain soils of Georgia are susceptible to runoff, soil loss, and drought. Reduced tillage systems offer the best management tool for sustained row crop production. Understanding runoff, sediment, and chemical losses from conventional and reduced tillage systems is expected to improve if the effect of a variable rainfall intensity storm is quantified. Our objective was to quantify and compare effects of a constant (Ic) intensity pattern and a more realistic, observed, variable (Iv) rainfall intensity pattern on runoff (R), sediment (E), and carbon losses (C) from a Tifton loamy sand cropped to conventional-till (CT) and strip-till (ST) cotton (Gossypium hirsutum L.). Four treatments were evaluated: CT-Ic, CT-Iv, ST-Ic, and ST-Iv, each replicated three times. Field plots (n=12), each 2 by 3 m, were established on each treatment. Each 6-m2 field plot received simulated rainfall at a constant (57 mm h(-1)) or variable rainfall intensity pattern for 70 min (12-run ave.=1402 mL; CV=3%). The Iv pattern represented the most frequently occurring intensity pattern for spring storms in the region. Compared with CT, ST decreased R by 2.5-fold, E by 3.5-fold, and C by 7-fold. Maximum runoff values for Iv events were 1.6-fold higher than those for Ic events and occurred 38 min earlier. Values for Etot and Ctot for Iv events were 19-36% and 1.5-fold higher than corresponding values for Ic events. Values for Emax and Cmax for Iv events were 3-fold and 4-fold higher than corresponding values for Ic events. Carbon enrichment ratios (CER) were >=1.0 for CT plots (except for the first 20 min). Maximum CER for CT-Ic, CT-Iv, ST-Ic, and ST-Iv were 2.0, 2.2, 1.0, and 1.2, respectively. Transport of sediment, carbon, and agrichemicals would be better understood if variable rainfall intensity patterns derived from natural rainfall were used in rainfall simulations to evaluate their fate and transport from CT and ST systems.

  4. SOFIA tracking image simulation

    NASA Astrophysics Data System (ADS)

    Taylor, Charles R.; Gross, Michael A. K.

    2016-09-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) tracking camera simulator is a component of the Telescope Assembly Simulator (TASim). TASim is a software simulation of the telescope optics, mounting, and control software. Currently in its fifth major version, TASim is relied upon for telescope operator training, mission planning and rehearsal, and mission control and science instrument software development and testing. TASim has recently been extended for hardware-in-the-loop operation in support of telescope and camera hardware development and control and tracking software improvements. All three SOFIA optical tracking cameras are simulated, including the Focal Plane Imager (FPI), which has recently been upgraded to the status of a science instrument that can be used on its own or in parallel with one of the seven infrared science instruments. The simulation includes tracking camera image simulation of starfields based on the UCAC4 catalog at real-time rates of 4-20 frames per second. For its role in training and planning, it is important for the tracker image simulation to provide images with a realistic appearance and response to changes in operating parameters. For its role in tracker software improvements, it is vital to have realistic signal and noise levels and precise star positions. The design of the software simulation for precise subpixel starfield rendering (including radial distortion), realistic point-spread function as a function of focus, tilt, and collimation, and streaking due to telescope motion will be described. The calibration of the simulation for light sensitivity, dark and bias signal, and noise will also be presented.
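The subpixel rendering idea can be illustrated by accumulating a Gaussian point-spread function centred at a fractional pixel position on a small grid. The real simulator additionally applies radial distortion, focus-dependent PSFs, motion streaks, dark/bias signal and calibrated noise; the grid size, flux and PSF width below are invented.

```python
import math

def render_star(width, height, x0, y0, flux, sigma):
    """Sample a Gaussian PSF, centred at subpixel position (x0, y0), on a pixel grid."""
    img = [[0.0] * width for _ in range(height)]
    norm = flux / (2.0 * math.pi * sigma * sigma)  # 2-D Gaussian normalisation
    for y in range(height):
        for x in range(width):
            r2 = (x - x0) ** 2 + (y - y0) ** 2
            img[y][x] = norm * math.exp(-r2 / (2.0 * sigma * sigma))
    return img

img = render_star(9, 9, 4.3, 4.7, flux=1000.0, sigma=1.2)
total = sum(sum(row) for row in img)          # ~flux, minus truncation at the edges
peak = max(max(row) for row in img)           # brightest pixel near (4, 5)
```

Because the centre (4.3, 4.7) falls between pixel centres, the flux splits asymmetrically over neighbouring pixels; fitting this shape is what lets a tracker recover star positions to a fraction of a pixel.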

  5. Building the evidence on simulation validity: comparison of anesthesiologists' communication patterns in real and simulated cases.

    PubMed

    Weller, Jennifer; Henderson, Robert; Webster, Craig S; Shulruf, Boaz; Torrie, Jane; Davies, Elaine; Henderson, Kaylene; Frampton, Chris; Merry, Alan F

    2014-01-01

    Effective teamwork is important for patient safety, and verbal communication underpins many dimensions of teamwork. The validity of the simulated environment would be supported if it elicited similar verbal communications to the real setting. The authors hypothesized that anesthesiologists would exhibit similar verbal communication patterns in routine operating room (OR) cases and routine simulated cases. The authors further hypothesized that anesthesiologists would exhibit different communication patterns in routine cases (real or simulated) and simulated cases involving a crisis. Key communications relevant to teamwork were coded from video recordings of anesthesiologists in the OR, in a routine simulation, and in a crisis simulation, and percentages were compared. The authors recorded comparable videos of 20 anesthesiologists in the two simulations, and of 17 of these anesthesiologists in the OR, generating 400 coded events in the OR, 683 in the routine simulation, and 1,419 in the crisis simulation. The authors found no significant differences in communication patterns between the OR and the routine simulations. The authors did find significant differences in communication patterns between the crisis simulation and both the OR and the routine simulations. Participants rated team communication as realistic and considered that their communications occurred with a similar frequency in the simulations as in comparable cases in the OR. The similarity of teamwork-related communications elicited from anesthesiologists in simulated cases and the real setting lends support to the ecological validity of the simulation environment and its value in teamwork training. Different communication patterns and frequencies under the challenge of a crisis support the use of simulation to assess crisis management skills.

  6. A piezoelectric shock-loading response simulator for piezoelectric-based device developers

    NASA Astrophysics Data System (ADS)

    Rastegar, J.; Feng, Z.

    2017-04-01

    Pulsed loading of piezoelectric transducers occurs in many applications, such as munitions firing, or when a mechanical system is subjected to impact-type loading. In this paper, an electronic simulator is presented that can be programmed to generate the electrical charges that a piezoelectric transducer produces as it is subjected to various shock-loading profiles. The piezoelectric output simulator provides close-to-realistic outputs so that circuit designers can test the developed system under close-to-realistic conditions, without the costly and time-consuming process of performing actual tests. The design of the electronic simulator and the results of its testing are presented.

  7. Gene family evolution: an in-depth theoretical and simulation analysis of non-linear birth-death-innovation models.

    PubMed

    Karev, Georgy P; Wolf, Yuri I; Berezovskaya, Faina S; Koonin, Eugene V

    2004-09-09

    The size distribution of gene families in a broad range of genomes is well approximated by a generalized Pareto function. Evolution of ensembles of gene families can be described with Birth, Death, and Innovation Models (BDIMs). Analysis of the properties of different versions of BDIMs has the potential of revealing important features of genome evolution. In this work, we extend our previous analysis of stochastic BDIMs. In addition to the previously examined rational BDIMs, we introduce potentially more realistic logistic BDIMs, in which birth/death rates are limited for the largest families, and show that their properties are similar to those of models that include no such limitation. We show that the mean time required for the formation of the largest gene families detected in eukaryotic genomes is limited by the mean number of duplications per gene and does not increase indefinitely with the model degree. Instead, this time reaches a minimum value, which corresponds to a non-linear rational BDIM with the degree of approximately 2.7. Even for this BDIM, the mean time of the largest family formation is orders of magnitude greater than any realistic estimates based on the timescale of life's evolution. We employed the embedding chains technique to estimate the expected number of elementary evolutionary events (gene duplications and deletions) preceding the formation of gene families of the observed size and found that the mean number of events exceeds the family size by orders of magnitude, suggesting a highly dynamic process of genome evolution. The variance of the time required for the formation of the largest families was found to be extremely large, with the coefficient of variation > 1. This indicates that some gene families might grow much faster than the mean rate such that the minimal time required for family formation is more relevant for a realistic representation of genome evolution than the mean time. 
We determined this minimal time using Monte Carlo simulations of family growth from an ensemble of simultaneously evolving singletons. In these simulations, the time elapsed before the formation of the largest family was much shorter than the estimated mean time and was compatible with the timescale of evolution of eukaryotes. The analysis of stochastic BDIMs presented here shows that non-linear versions of such models can well approximate not only the size distribution of gene families but also the dynamics of their formation during genome evolution. The fact that only higher degree BDIMs are compatible with the observed characteristics of genome evolution suggests that the growth of gene families is self-accelerating, which might reflect differential selective pressure acting on different genes.
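
    A toy Monte Carlo of a linear birth-death-innovation process, a drastic simplification of the BDIMs analysed in the paper; the rates, ensemble size, and step count below are arbitrary assumptions, not fitted values:

```python
import random

# Toy sketch of a birth-death-innovation ensemble: innovation seeds new
# singleton families, birth duplicates a gene in a random family, death
# removes one. Rates are invented for illustration only.

def simulate_bdim(n_families=200, steps=20000, birth=0.5, death=0.5,
                  innovation=0.05, rng=None):
    """Return the list of surviving family sizes after `steps` events."""
    rng = rng or random.Random(42)   # seeded for reproducibility
    families = [1] * n_families
    for _ in range(steps):
        r = rng.random()
        if r < innovation:
            families.append(1)                     # innovation: new singleton
        elif families:
            k = rng.randrange(len(families))
            if r < innovation + birth * (1 - innovation):
                families[k] += 1                   # gene duplication
            else:
                families[k] -= 1                   # gene deletion
                if families[k] == 0:
                    families.pop(k)                # family goes extinct
    return families

sizes = simulate_bdim()
largest = max(sizes) if sizes else 0
```

    Real BDIM analyses fit the stationary family-size distribution to a generalized Pareto function and make the birth/death rates size-dependent; this sketch only illustrates the event-by-event duplication, deletion, and innovation bookkeeping that the embedding-chains estimates count.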

  8. Atomistic simulations of materials: Methods for accurate potentials and realistic time scales

    NASA Astrophysics Data System (ADS)

    Tiwary, Pratyush

    This thesis deals with achieving more realistic atomistic simulations of materials, by developing accurate and robust force-fields, and algorithms for practical time scales. I develop a formalism for generating interatomic potentials for simulating atomistic phenomena occurring at energy scales ranging from lattice vibrations to crystal defects to high-energy collisions. This is done by fitting against an extensive database of ab initio results, as well as to experimental measurements for mixed oxide nuclear fuels. The applicability of these interactions to a variety of mixed environments beyond the fitting domain is also assessed. The employed formalism makes these potentials applicable across all interatomic distances without the need for any ambiguous splining to the well-established short-range Ziegler-Biersack-Littmark universal pair potential. We expect these to be reliable potentials for carrying out damage simulations (and molecular dynamics simulations in general) in nuclear fuels of varying compositions for all relevant atomic collision energies. A hybrid stochastic and deterministic algorithm is proposed that while maintaining fully atomistic resolution, allows one to achieve milliseconds and longer time scales for several thousands of atoms. The method exploits the rare event nature of the dynamics like other such methods, but goes beyond them by (i) not having to pick a scheme for biasing the energy landscape, (ii) providing control on the accuracy of the boosted time scale, (iii) not assuming any harmonic transition state theory (HTST), and (iv) not having to identify collective coordinates or interesting degrees of freedom. The method is validated by calculating diffusion constants for vacancy-mediated diffusion in iron metal at low temperatures, and comparing against brute-force high temperature molecular dynamics. 
We also calculate diffusion constants for vacancy diffusion in tantalum metal, where we compare against low-temperature HTST as well. The robustness of the algorithm with respect to the only free parameter it involves is ascertained. The method is then applied to perform tensile tests on gold nanopillars at strain rates as low as 100/s, bringing out the perils of high strain-rate molecular dynamics calculations. We also calculate temperature and stress dependence of activation free energy for surface nucleation of dislocations in pristine gold nanopillars under realistic loads. While maintaining fully atomistic resolution, we reach the fraction-of-a-second time scale regime. It is found that the activation free energy depends significantly and nonlinearly on the driving force (stress or strain) and temperature, leading to very high activation entropies for surface dislocation nucleation.

  9. High accuracy mantle convection simulation through modern numerical methods - II: realistic models and problems

    NASA Astrophysics Data System (ADS)

    Heister, Timo; Dannberg, Juliane; Gassmöller, Rene; Bangerth, Wolfgang

    2017-08-01

    Computations have helped elucidate the dynamics of Earth's mantle for several decades already. The numerical methods that underlie these simulations have greatly evolved within this time span, and today include dynamically changing and adaptively refined meshes, sophisticated and efficient solvers, and parallelization to large clusters of computers. At the same time, many of the methods - discussed in detail in a previous paper in this series - were developed and tested primarily using model problems that lack many of the complexities that are common to the realistic models our community wants to solve today. With several years of experience solving complex and realistic models, we here revisit some of the algorithm designs of the earlier paper and discuss the incorporation of more complex physics. In particular, we re-consider time stepping and mesh refinement algorithms, evaluate approaches to incorporate compressibility, and discuss dealing with strongly varying material coefficients, latent heat, and how to track chemical compositions and heterogeneities. Taken together and implemented in a high-performance, massively parallel code, the techniques discussed in this paper then allow for high resolution, 3-D, compressible, global mantle convection simulations with phase transitions, strongly temperature dependent viscosity and realistic material properties based on mineral physics data.

  10. Numerical Study of Solar Storms from the Sun to Earth

    NASA Astrophysics Data System (ADS)

    Feng, Xueshang; Jiang, Chaowei; Zhou, Yufen

    2017-04-01

    As solar storms sweep past the Earth, adverse changes occur in the geospace environment. How humans can mitigate and avoid the destructive damage caused by solar storms has become an important frontier issue of the high-tech era. It is of scientific significance to understand the dynamic processes of solar storms propagating through interplanetary space, and of practical value to conduct physics-based numerical studies of the three-dimensional evolution of solar storms in interplanetary space, with the aid of powerful computing capacity, in order to predict the arrival times, intensities, and probable geoeffectiveness of solar storms at the Earth. So far, numerical studies based on magnetohydrodynamics (MHD) have progressed from initial qualitative, proof-of-principle research to systematic quantitative studies of concrete events and numerical predictions. The numerical modeling community has a common goal to develop an end-to-end physics-based modeling system for forecasting the Sun-Earth relationship. The transition of these models to operational use depends on the availability of computational resources at reasonable cost, and it is hoped that the models' prediction capabilities can be improved by incorporating observational findings and constraints into physics-based models, combining observations, empirical models and MHD simulations in organic ways. In this talk, we briefly focus on our recent progress in using solar observations to produce realistic magnetic configurations of CMEs as they leave the Sun, and in coupling data-driven simulations of CMEs to heliospheric simulations that then propagate the CME configuration to 1 AU, and we outline important numerical issues and their possible solutions in numerical space weather modeling from the Sun to Earth for future research.

  11. Event attribution using data assimilation in an intermediate complexity atmospheric model

    NASA Astrophysics Data System (ADS)

    Metref, Sammy; Hannart, Alexis; Ruiz, Juan; Carrassi, Alberto; Bocquet, Marc; Ghil, Michael

    2016-04-01

    A new approach, coined DADA (Data Assimilation for Detection and Attribution), has recently been introduced by Hannart et al. (2015) and is potentially useful for near-real-time, systematic causal attribution of weather and climate-related events. The method is purposely designed to allow its operability at meteorological centers by synergizing causal attribution with Data Assimilation (DA) methods usually designed to deal with large nonlinear models. In Hannart et al. (2015), the DADA proposal is illustrated in the context of a low-order nonlinear model (the forced three-variable Lorenz model), which is of course not realistic enough to represent the events considered. As a continuation of this stream of work, we therefore propose an implementation of the DADA approach in a realistic intermediate complexity atmospheric model (ICTP AGCM, nicknamed SPEEDY). The SPEEDY model is based on a spectral dynamical core developed at the Geophysical Fluid Dynamics Laboratory (see Held and Suarez 1994). It is a hydrostatic, σ-coordinate, spectral-transform model in the vorticity-divergence form described by Bourke (1974). A synthetic dataset of observations of an extreme precipitation event over Southeastern South America is extracted from a long SPEEDY simulation under present climatic conditions (i.e. factual conditions). Then, following the DADA approach, observations of this event are assimilated twice in the SPEEDY model: first in the factual configuration of the model and second under its counterfactual, pre-industrial configuration. We show that attribution can be performed based on the likelihood ratio as in Hannart et al. (2015), but we further extend this result by showing that the likelihood can be split across space, time, and variables in order to help identify the specific physical features of the event that bear the causal signature. References: Hannart A., A. Carrassi, M. Bocquet, M. Ghil, P. Naveau, M. Pulido, J. Ruiz, P. Tandeo (2015): DADA: Data assimilation for the detection and attribution of weather and climate-related events. Climatic Change (in press). Held I. M. and M. J. Suarez (1994): A Proposal for the Intercomparison of the Dynamical Cores of Atmospheric General Circulation Models. Bull. Amer. Meteor. Soc., 75, 1825-1830. Bourke W. (1974): A multi-level spectral model. I. Formulation and hemispheric integrations. Mon. Wea. Rev., 102, 687-701.
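
    The attribution step compares the evidence (likelihood of the observations) from the factual and counterfactual assimilation runs. A minimal sketch of that comparison; the log-evidence values below are invented placeholders, not results from the study:

```python
import math

# Hedged sketch of likelihood-ratio attribution: two data assimilation
# runs yield a marginal log-likelihood of the observed event under the
# factual and counterfactual worlds. The numbers are made up.

def likelihood_ratio(log_ev_factual, log_ev_counterfactual):
    """Ratio p1/p0 of the observation likelihoods under the two worlds."""
    return math.exp(log_ev_factual - log_ev_counterfactual)

def fraction_attributable_risk(log_ev_factual, log_ev_counterfactual):
    """FAR = 1 - p0/p1; values near 1 indicate strong attribution."""
    return 1.0 - math.exp(log_ev_counterfactual - log_ev_factual)

lr = likelihood_ratio(-120.4, -126.9)      # hypothetical log-evidences
far = fraction_attributable_risk(-120.4, -126.9)
```

    Splitting the log-likelihood into per-gridpoint, per-time, or per-variable contributions, as the abstract proposes, amounts to decomposing the exponent above into additive terms.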

  12. Generating Virtual Patients by Multivariate and Discrete Re-Sampling Techniques.

    PubMed

    Teutonico, D; Musuamba, F; Maas, H J; Facius, A; Yang, S; Danhof, M; Della Pasqua, O

    2015-10-01

    Clinical Trial Simulations (CTS) are a valuable tool for decision-making during drug development. However, to obtain realistic simulation scenarios, the patients included in the CTS must be representative of the target population. This is particularly important when covariate effects exist that may affect the outcome of a trial. The objective of our investigation was to evaluate and compare CTS results using re-sampling from a population pool and multivariate distributions to simulate patient covariates. COPD was selected as the paradigm disease for the purposes of our analysis, FEV1 was used as the response measure, and the effects of a hypothetical intervention were evaluated in different populations in order to assess the predictive performance of the two methods. Our results show that the multivariate distribution method produces realistic covariate correlations, comparable to the real population. Moreover, it allows simulation of patient characteristics beyond the limits of inclusion and exclusion criteria in historical protocols. Both methods, discrete re-sampling and multivariate distributions, generate realistic pools of virtual patients. However, the use of a multivariate distribution enables more flexible simulation scenarios, since it is not necessarily bound to the existing covariate combinations in the available clinical data sets.
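
    A minimal sketch of the multivariate-distribution method using NumPy: virtual patients are drawn from a multivariate normal so that covariate correlations are preserved. The covariate names, means, and covariance below are invented for illustration and are not fitted to any trial data:

```python
import numpy as np

# Illustrative sketch: draw virtual COPD patients from a multivariate
# normal. Means, covariance, and the assumed correlation signs (older
# patients slightly heavier, lower FEV1 with age) are made up.

rng = np.random.default_rng(0)
names = ["age_yr", "weight_kg", "baseline_fev1_L"]
mean = np.array([65.0, 75.0, 1.4])
cov = np.array([[64.0,  12.0, -0.8],
                [12.0, 144.0,  0.5],
                [-0.8,   0.5,  0.09]])   # positive definite by construction

patients = rng.multivariate_normal(mean, cov, size=5000)
sample_corr = np.corrcoef(patients, rowvar=False)
```

    Unlike discrete re-sampling, draws from the fitted distribution can produce covariate combinations absent from the historical pool, which is what makes extrapolation beyond past inclusion/exclusion criteria possible.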

  13. Forecasting and modelling ice layer formation on the snowpack due to freezing precipitations in the Pyrenees

    NASA Astrophysics Data System (ADS)

    Quéno, Louis; Vionnet, Vincent; Cabot, Frédéric; Vrécourt, Dominique; Dombrowski-Etchevers, Ingrid

    2017-04-01

    In the Pyrenees, freezing precipitation at altitude occurs at least once per winter, leading to the formation of a pure ice layer on the surface of the snowpack. Such layers can lead to accidents and fatalities among mountaineers and skiers, sometimes with a higher human toll than avalanches, yet these events are not predicted by the current operational systems for snow and avalanche hazard forecasting. A crowd-sourced database of surface ice layer occurrences is first built up, using reports from Internet mountaineering and ski-touring communities, to mitigate the lack of observations from conventional observation networks. A simple diagnostic of freezing precipitation is then developed, based on the cloud water content and screen temperature forecast by the Numerical Weather Prediction model AROME, operating at 2.5-km resolution. The performance of this diagnostic is assessed for the event of 5-6 January 2012, with a good representation of the altitudinal and spatial distributions of the ice layer. An evaluation of the diagnostic for major events over five winters shows good detection skill compared to the occurrences reported in the observation database. A new model of ice formation on the surface of the snowpack due to impinging supercooled water is added to the detailed snowpack model Crocus. It is combined with the atmospheric diagnostic of freezing precipitation, and the resulting snowpack simulations over a winter season capture the formation of the main ice layers well. Their influence on the snowpack stratigraphy is also realistically simulated. These simple methods make it possible to forecast the occurrence of surface ice layer formation with good confidence and to simulate its evolution within the snowpack, even if an accurate estimation of freezing precipitation amounts remains the main challenge.
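
    In the spirit of the diagnostic described (cloud water content plus screen temperature from AROME), a minimal boolean sketch; the threshold values are invented placeholders, not those used in the study:

```python
# Hedged sketch of a freezing-precipitation diagnostic: flag a grid point
# when supercooled liquid water is present while the screen-level air is
# below freezing. Both thresholds are invented illustrative values.

def freezing_precip_diagnostic(cloud_liquid_water_kg_kg, screen_temp_c,
                               lwc_threshold=1e-4, temp_threshold=0.0):
    """Return True where the (assumed) conditions for freezing rain hold."""
    return (cloud_liquid_water_kg_kg >= lwc_threshold
            and screen_temp_c < temp_threshold)

# supercooled cloud + subfreezing air -> flagged; warm air or dry column -> not
flags = [freezing_precip_diagnostic(lwc, t)
         for lwc, t in [(2e-4, -3.0), (2e-4, 2.0), (1e-6, -3.0)]]
```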

  14. Characterizing Wheel-Soil Interaction Loads Using Meshfree Finite Element Methods: A Sensitivity Analysis for Design Trade Studies

    NASA Technical Reports Server (NTRS)

    Contreras, Michael T.; Trease, Brian P.; Bojanowski, Cezary; Kulak, Ronald F.

    2013-01-01

    A wheel experiencing sinkage and slippage events poses a high risk to planetary rover missions, as evidenced by the mobility challenges endured by the Mars Exploration Rover (MER) project. Current wheel design practice utilizes loads derived from a series of events in the life cycle of the rover which do not include (1) failure metrics related to wheel sinkage and slippage and (2) performance trade-offs based on grouser placement/orientation. Wheel designs are rigorously tested experimentally through a variety of drive scenarios and simulated soil environments; however, a robust simulation capability is still in development due to the myriad of complex interaction phenomena that contribute to wheel sinkage and slippage conditions, such as soil composition, large-deformation soil behavior, wheel geometry, nonlinear contact forces, terrain irregularity, etc. For the purposes of modeling wheel sinkage and slippage at an engineering scale, meshfree finite element approaches enable simulations that capture sufficient detail of wheel-soil interaction while remaining computationally feasible. This study implements the JPL wheel-soil benchmark problem in a commercial code environment utilizing the large-deformation modeling capability of Smoothed Particle Hydrodynamics (SPH) meshfree methods. The nominal, benchmark wheel-soil interaction model that produces numerically stable and physically realistic results is presented, and simulations are shown for both wheel traverse and wheel sinkage cases. A sensitivity analysis developing the capability and framework for future flight applications is conducted to illustrate the importance of perturbations to critical material properties and parameters. Implementation of the proposed soil-wheel interaction simulation capability and associated sensitivity framework has the potential to reduce experimentation cost and improve the early-stage wheel design process.

  15. The importance of wind-flux feedbacks during the November CINDY-DYNAMO MJO event

    NASA Astrophysics Data System (ADS)

    Riley Dellaripa, Emily; Maloney, Eric; van den Heever, Susan

    2015-04-01

    High-resolution, large-domain cloud resolving model (CRM) simulations probing the importance of wind-flux feedbacks to Madden-Julian Oscillation (MJO) convection are performed for the November 2011 CINDY-DYNAMO MJO event. The work is motivated by observational analysis from RAMA buoys in the Indian Ocean and TRMM precipitation retrievals that show a positive correlation between MJO precipitation and wind-induced surface fluxes, especially latent heat fluxes, during and beyond the CINDY-DYNAMO time period. Simulations are done using Colorado State University's Regional Atmospheric Modeling System (RAMS). The domain setup is oceanic and spans 1000 km x 1000 km with 1.5 km horizontal resolution and 65 stretched vertical levels, centered on the location of Gan Island - one of the major CINDY-DYNAMO observation points. The model is initialized with ECMWF reanalysis and Aqua MODIS sea surface temperatures. Nudging from ECMWF reanalysis is applied at the domain periphery to encourage realistic evolution of MJO convection. The control experiment is run for the entire month of November so that suppressed and active, as well as transitional, phases of the MJO are modeled. In the control experiment, wind-induced surface fluxes are activated through the surface bulk aerodynamic formula and allowed to evolve organically. Sensitivity experiments are done by restarting the control run one week into the simulation and controlling the wind-induced flux feedbacks. In one sensitivity experiment, wind-induced surface flux feedbacks are completely denied, while in another the winds are kept constant at the control simulation's mean surface wind speed. The evolution of convection, especially on the mesoscale, is compared between the control and sensitivity simulations.
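
    The wind-induced latent heat flux enters such simulations through the standard bulk aerodynamic formula, LH = rho * Lv * C_E * |U| * (q_s - q_a). A small sketch with illustrative tropical-ocean values; the transfer coefficient and the sample wind speeds and humidities are assumptions, not CINDY-DYNAMO data:

```python
# Sketch of the bulk aerodynamic formula for the latent heat flux.
# All constants and sample values are illustrative assumptions.

RHO_AIR = 1.2        # air density, kg m^-3
LV = 2.5e6           # latent heat of vaporisation, J kg^-1
C_E = 1.2e-3         # bulk transfer coefficient for moisture (assumed)

def latent_heat_flux(wind_speed, q_surface, q_air):
    """Latent heat flux (W m^-2); wind in m/s, specific humidities in kg/kg."""
    return RHO_AIR * LV * C_E * wind_speed * (q_surface - q_air)

weak = latent_heat_flux(2.0, 0.022, 0.016)    # light suppressed-phase winds
strong = latent_heat_flux(8.0, 0.022, 0.016)  # active-phase westerlies
```

    Because the flux scales linearly with wind speed at fixed humidity contrast, fourfold stronger active-phase winds give a fourfold larger flux, which is the feedback the sensitivity experiments switch off or hold constant.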

  16. Numerical Simulation of Intense Precipitation Events South of the Alps: Sensitivity to Initial Conditions and Horizontal Resolution

    NASA Astrophysics Data System (ADS)

    Cacciamani, C.; Cesari, D.; Grazzini, F.; Paccagnella, T.; Pantone, M.

    In this paper we describe the results of several numerical experiments performed with the limited area model LAMBO, based on a 1989 version of the NCEP (National Center for Environmental Prediction) ETA model, operational at ARPA-SMR since 1993. The experiments have been designed to assess the impact of different horizontal resolutions and initial conditions on the quality and detail of the forecast, especially as regards the precipitation field in the case of severe flood events. For initial conditions we developed a mesoscale data assimilation scheme based on the nudging technique. The scheme makes use of upper air and surface meteorological observations to modify ECMWF (European Centre for Medium-Range Weather Forecasts) operational analyses, used as first-guess fields, in order to better describe smaller-scale features, mainly in the lower troposphere. Three flood cases in the Alpine and Mediterranean regions have been simulated with LAMBO, using a horizontal grid spacing of 15 and 5 km and starting either from the ECMWF initialised analysis or from the result of our mesoscale analysis procedure. The results show that increasing the resolution generally improves the forecast, bringing the precipitation peaks in the flooded areas close to the observed values without producing many spurious precipitation patterns. The use of mesoscale analysis produces a more realistic representation of precipitation patterns, giving a further improvement to the precipitation forecast. Furthermore, when simulations are started from the mesoscale analysis, some model-simulated thermodynamic indices show greater vertical instability precisely in the regions where the strongest precipitation occurred.
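
    The nudging technique at the core of such assimilation schemes is Newtonian relaxation: the model state is pulled toward observations with an e-folding timescale. A minimal single-variable sketch; the state, timescale, and time step are invented illustrative values:

```python
# Minimal sketch of Newtonian nudging: one explicit step of
#   dx/dt = -(x - x_obs) / tau
# applied repeatedly over an assimilation window. Values are invented.

def nudge(x_model, x_obs, dt, tau):
    """Relax the model state toward the observed value over timescale tau."""
    return x_model + dt * (x_obs - x_model) / tau

x, x_obs = 290.0, 285.0          # e.g. a lower-troposphere temperature (K)
dt, tau = 60.0, 3600.0           # 1-min step, 1-h relaxation timescale
for _ in range(180):             # a 3-h assimilation window
    x = nudge(x, x_obs, dt, tau)
```

    After three hours the model value has decayed most of the way to the observation, the behaviour that lets the first-guess ECMWF analysis acquire observed lower-troposphere structure.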

  17. Timing performance of the silicon PET insert probe

    PubMed Central

    Studen, A.; Burdette, D.; Chesi, E.; Cindro, V.; Clinthorne, N. H.; Cochran, E.; Grošičar, B.; Kagan, H.; Lacasta, C.; Linhart, V.; Mikuž, M.; Stankova, V.; Weilhammer, P.; Žontar, D.

    2010-01-01

    Simulation indicates that the PET image could be improved by upgrading a conventional ring with a probe placed close to the imaged object. In this paper, timing issues related to a PET probe using high-resistivity silicon as a detector material are addressed. The final probe will consist of several (four to eight) 1-mm thick layers of silicon detectors, segmented into 1 × 1 mm² pads, each pad equivalent to an independent p+nn+ diode. A proper matching of events in silicon with events of the external ring can be achieved with a good timing resolution. To estimate the timing performance, measurements were performed on a simplified model probe, consisting of a single 1-mm thick detector with 256 square pads (1.4 mm side), coupled with two VATAGP7s, application-specific integrated circuits. The detector material and electronics are the same that will be used for the final probe. The model was exposed to 511 keV annihilation photons from an 22Na source, and a scintillator (LYSO)–PMT assembly was used as a timing reference. Results were compared with the simulation, consisting of four parts: (i) GEANT4-implemented realistic tracking of electrons excited by annihilation photon interactions in silicon, (ii) calculation of propagation of secondary ionisation (electron–hole pairs) in the sensor, (iii) estimation of the shape of the current pulse induced on surface electrodes and (iv) simulation of the first electronics stage. A very good agreement between the simulation and the measurements was found. Both indicate reliable performance of the final probe at timing windows down to 20 ns. PMID:20215445
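
    A toy sketch of the timing-window idea (not the GEANT4-based simulation described above): probe and ring timestamps are jittered by Gaussian noise and paired when their difference falls inside the coincidence window. The 3 ns per-detector jitter is an assumption:

```python
import random

# Toy sketch: fraction of true coincidences that survive a timing window
# when both detectors add independent Gaussian timestamp jitter.
# The per-detector jitter value is an invented assumption.

def coincidence_fraction(n_events, sigma_ns, window_ns, seed=7):
    """Fraction of event pairs whose timestamp difference fits the window."""
    rng = random.Random(seed)            # seeded for reproducibility
    hits = 0
    for _ in range(n_events):
        dt = rng.gauss(0.0, sigma_ns) - rng.gauss(0.0, sigma_ns)
        if abs(dt) <= window_ns / 2.0:   # symmetric window around zero
            hits += 1
    return hits / n_events

frac_20ns = coincidence_fraction(20000, 3.0, 20.0)
```

    With a few nanoseconds of jitter per detector, nearly all true pairs fall inside a 20 ns window, consistent with the reliable matching the measurements indicate at that window size.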

  19. Evaluation of WRF PBL parameterization schemes against direct observations during a dry event over the Ganges valley

    NASA Astrophysics Data System (ADS)

    Sathyanadh, Anusha; Prabha, Thara V.; Balaji, B.; Resmi, E. A.; Karipot, Anandakumar

    2017-09-01

    Accurate representations of the planetary boundary layer (PBL) are important in all weather forecast systems, especially in simulations of turbulence, wind and air quality in the lower atmosphere. In the present study, detailed observations from the Cloud Aerosol Interaction and Precipitation Enhancement Experiment - Integrated Ground based Observational Campaign (CAIPEEX-IGOC) 2014, comprising the complete surface energy budget and detailed boundary layer observations, are used to validate Advanced Research Weather Research and Forecasting (WRF) model simulations over diverse terrain in the Ganges valley region, Uttar Pradesh, India. A drying event in June 2014 associated with a heat wave is selected for validation. Six local and nonlocal PBL schemes from WRF at 1 km resolution are compared with hourly observations during the diurnal cycle. Near-surface observations of weather parameters, radiation components and eddy covariance fluxes from a micrometeorological tower, and profiles of variables from microwave radiometer and radiosonde observations, are used for model evaluation. The models produce a warmer, drier surface layer with higher wind speed, sensible heat flux and temperature than observed. Layered boundary layer dynamics, including the residual layer structure illustrated in the observations over the Ganges valley, are missed in the model, which leads to deeper mixed layers and excessive drying. Although it is difficult to identify any single scheme as the best, the qualitative and quantitative analyses for the entire study period and the overall reproducibility of the observations indicate that the MYNN2 simulations show lower errors and a more realistic simulation of spatio-temporal variations in the boundary layer height.

  20. The diagnosis of severe thunderstorms with high-resolution WRF model

    NASA Astrophysics Data System (ADS)

    Litta, A. J.; Mohanty, U. C.; Idicula, Sumam Mary

    2012-04-01

    Thunderstorms, resulting from vigorous convective activity, are among the most spectacular weather phenomena in the atmosphere. A common feature of the weather during the pre-monsoon season over the Indo-Gangetic Plain and northeast India is the outburst of severe local convective storms, commonly known as `Nor'westers' (as they move from northwest to southeast). These severe thunderstorms, associated with thunder, squall lines, lightning and hail, cause extensive agricultural losses, damage to structures and also loss of life. In this paper, sensitivity experiments have been conducted with the Non-hydrostatic Mesoscale Model (NMM) to test the impact of three microphysical schemes in capturing the severe thunderstorm event that occurred over Kolkata on 15 May 2009. The results show that the WRF-NMM model with the Ferrier microphysical scheme appears to reproduce the cloud and precipitation processes more realistically than the other schemes. Also, we have made an attempt to diagnose four severe thunderstorms that occurred during the pre-monsoon seasons of 2006, 2007 and 2008 through the simulated radar reflectivity fields from the NMM model with the Ferrier microphysics scheme, and validated the model results with Kolkata Doppler Weather Radar (DWR) observations. Composite radar reflectivity simulated by the WRF-NMM model clearly shows the severe thunderstorm movement as observed in the DWR imagery, but fails to capture the intensity seen in the observations. These analyses demonstrate the capability of the high-resolution WRF-NMM model in the simulation of severe thunderstorm events and show that the 3 km model improves upon current abilities when it comes to simulating severe thunderstorms over the east Indian region.

  1. Disentangling the impacts of heat wave magnitude, duration and timing on the structure and diversity of sessile marine assemblages

    PubMed Central

    Yunnie, Anna L.E.; Vance, Thomas; Widdicombe, Stephen

    2015-01-01

    Extreme climatic events, including heat waves (HWs) and severe storms, influence the structure of marine and terrestrial ecosystems. Despite growing consensus that anthropogenic climate change will increase the frequency, duration and magnitude of extreme events, current understanding of their impact on communities and ecosystems is limited. Here, we used sessile invertebrates on settlement panels as model assemblages to examine the influence of HW magnitude, duration and timing on marine biodiversity patterns. Settlement panels were deployed in a marina in southwest UK for ≥5 weeks, to allow sufficient time for colonisation and development of sessile fauna, before being subjected to simulated HWs in a mesocosm facility. Replicate panel assemblages were held at ambient sea temperature (∼17 °C), or at +3 °C or +5 °C, for a period of 1 or 2 weeks, before being returned to the marina for a recovery phase of 2–3 weeks. The 10-week experiment was repeated 3 times, staggered throughout summer, to examine the influence of HW timing on community impacts. Contrary to our expectations, the warming events had no clear, consistent impacts on the abundance of species or the structure of sessile assemblages. With the exception of 1 high-magnitude, long-duration HW event, warming did not alter assemblage structure, favour non-native species, or lead to changes in richness, abundance or biomass of sessile faunal assemblages. The observed lack of effect may have been caused by a combination of (1) the use of relatively low-magnitude, realistic heat wave treatments compared to previous studies, (2) the greater resilience of mature adult sessile fauna compared to recruits and juveniles, and (3) the high thermal tolerance of the model organisms (i.e., temperate fouling species, principally bryozoans and ascidians).
Our study demonstrates the importance of using realistic treatments when manipulating climate change variables, and also suggests that biogeographical context may influence community-level responses to short-term warming events, which are predicted to increase in severity in the future. PMID:25834773

  2. Improving stamping simulation accuracy by accounting for realistic friction and lubrication conditions: Application to the door-outer of the Mercedes-Benz C-class Coupé

    NASA Astrophysics Data System (ADS)

    Hol, J.; Wiebenga, J. H.; Stock, J.; Wied, J.; Wiegand, K.; Carleer, B.

    2016-08-01

    In the stamping of automotive parts, friction and lubrication play a key role in achieving high-quality products. In the development process of new automotive parts, it is therefore crucial to accurately account for these effects in sheet metal forming simulations. Only then can one obtain reliable and realistic simulation results that correspond to the actual try-out and mass production conditions. In this work, the TriboForm software is used to accurately account for tribology, friction, and lubrication conditions in stamping simulations. The enhanced stamping simulations are applied and validated for the door-outer of the Mercedes-Benz C-Class Coupé. The project results demonstrate the improved prediction accuracy of stamping simulations with respect to both part quality and actual stamping process conditions.

  3. Laboratory simulation of space plasma phenomena*

    NASA Astrophysics Data System (ADS)

    Amatucci, B.; Tejero, E. M.; Ganguli, G.; Blackwell, D.; Enloe, C. L.; Gillman, E.; Walker, D.; Gatling, G.

    2017-12-01

    Laboratory devices, such as the Naval Research Laboratory's Space Physics Simulation Chamber, are large-scale experiments dedicated to the creation of large-volume plasmas with parameters realistically scaled to those found in various regions of the near-Earth space plasma environment. Such devices make valuable contributions to the understanding of space plasmas by investigating phenomena under carefully controlled, reproducible conditions, allowing for the validation of theoretical models being applied to space data. By working in collaboration with in situ experimentalists to create realistic conditions scaled to those found during the observations of interest, the microphysics responsible for the observed events can be investigated in detail not possible in space. To date, numerous investigations of phenomena such as plasma waves, wave-particle interactions, and particle energization have been successfully performed in the laboratory. In addition to investigations such as plasma wave and instability studies, the laboratory devices can also make valuable contributions to the development and testing of space plasma diagnostics. One example is the plasma impedance probe developed at NRL. Originally developed as a laboratory diagnostic, the sensor has now been flown on a sounding rocket, is included on a CubeSat experiment, and will be included on the DoD Space Test Program's STP-H6 experiment on the International Space Station. In this presentation, we will describe several examples of the laboratory investigation of space plasma waves and instabilities and diagnostic development. *This work supported by the NRL Base Program.

  4. SU-E-I-88: Realistic Pathological Simulations of the NCAT and Zubal Anthropomorphic Models, Based on Clinical PET/CT Data.

    PubMed

    Papadimitroulas, P; Loudos, G; Le Maitre, A; Efthimiou, N; Visvikis, D; Nikiforidis, G; Kagadis, G C

    2012-06-01

    In the present study a patient-specific dataset of realistic PET simulations was created, taking into account the variability of clinical oncology data. Tumor variability was tested in the simulated results. The produced simulated data were compared to clinical PET/CT data for the validation and evaluation of the procedure. Clinical PET/CT data of oncology patients were used as the basis of the simulated variability, inserting patient-specific characteristics into the NCAT and Zubal anthropomorphic phantoms. The GATE Monte Carlo toolkit was used to simulate a commercial PET scanner. The standard computational anthropomorphic phantoms were adapted to the CT data (organ shapes) using a fitting algorithm. The activity map was derived from the PET images. Patient tumors were segmented and inserted in the phantom using different activity distributions. The produced simulated data were reconstructed using the STIR open-source software and compared to the original clinical data. The accuracy of the procedure was tested in four different oncology cases. Each pathological situation was simulated as (a) a healthy body, (b) insertion of the clinical tumor with homogeneous activity, and (c) insertion of the clinical tumor with variable (voxel-by-voxel) activity based on the clinical PET data. The accuracy of the presented dataset was compared to the original PET/CT data. Partial Volume Correction (PVC) was also applied to the simulated data. In this study patient-specific characteristics were used in computational anthropomorphic models to simulate realistic pathological patients. Voxel-by-voxel activity distribution with PVC within the tumor gives the most accurate results. Radiotherapy applications can utilize the benefits of accurate realistic imaging simulations, using the anatomical and biological information of each patient. 
Further work will incorporate the development of analytical anthropomorphic models with motion and cardiac correction, combined with pathological patients to achieve high accuracy in tumor imaging. This research was supported by the Joint Research and Technology Program between Greece and France; 2009-2011 (protocol ID: 09FR103). © 2012 American Association of Physicists in Medicine.
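
    The (a)/(b)/(c) tumor-insertion scheme above can be sketched in miniature. The following toy example (a 1-D activity map with invented values, not the NCAT/Zubal phantoms) contrasts homogeneous lesion activity with voxel-by-voxel clinical values:

```python
# Minimal sketch of inserting a segmented tumor into a phantom activity map
# (toy 1-D "image"; real phantoms are 3-D voxel grids, values are invented).
phantom = [1.0] * 10          # (a) background activity only, arbitrary units
tumor_mask = [4, 5, 6]        # voxel indices of the segmented lesion

# (b) homogeneous lesion activity: one value for every tumor voxel
homogeneous = list(phantom)
for i in tumor_mask:
    homogeneous[i] = 5.0

# (c) voxel-by-voxel activity taken from the clinical PET image
clinical_voxels = {4: 4.2, 5: 6.1, 6: 5.3}   # hypothetical PET-derived values
heterogeneous = list(phantom)
for i, a in clinical_voxels.items():
    heterogeneous[i] = a

print(homogeneous[5], heterogeneous[5])  # 5.0 6.1
```

Case (c) is the variant the study found most accurate once partial volume correction was applied.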

  5. A study of long-term trends in mineral dust aerosol distributions in Asia using a general circulation model

    NASA Astrophysics Data System (ADS)

    Mukai, Makiko; Nakajima, Teruyuki; Takemura, Toshihiko

    2004-10-01

    Dust events have been observed in Japan with high frequency since 2000. On the other hand, the frequency of dust storms is said to have decreased in the desert regions of China since about the middle of the 1970s. This study simulates dust storms and the transport of mineral dust aerosols in the east Asia region from 1981 to 2001 using an aerosol transport model, the Spectral Radiation-Transport Model for Aerosol Species (SPRINTARS), implemented in the Center for Climate System Research/National Institute for Environmental Studies atmospheric global circulation model, in order to investigate the main factors that control a dust event and its long-term variation. The model was constrained to follow the real atmospheric state by a nudging technique using European Centre for Medium-Range Weather Forecasts reanalysis data for wind velocities, temperature, specific humidity, soil wetness, and snow depth. From a comparison between the long-term change in dust emission and the model parameters, it is found that the wind speed near the surface had a significant influence on dust emission, and that snow is also an important factor for early spring dust emission. The simulated results suggest that dust emissions from northeast China have a great impact on dust mass concentration in downwind regions, such as the cities of northeastern China, Korea, and Japan. When the frequency of dust events was high in Japan, a low-pressure system tended to develop over the northeast China region, causing strong winds. From 2000 to 2001 the simulated dust emission flux decreased in the Taklimakan desert and the northwestern part of China, while it increased in the Gobi desert and the northeastern part of China. Consequently, dust particles appear to be transported more from the latter region by the prevailing springtime westerlies to downwind areas, as actually observed. 
In spite of the similarity, however, there is still a large disagreement between observed and simulated dust frequencies and concentrations. A more realistic land surface and uplift mechanism of dust particles should be modeled to improve the model simulation. Desertification of the northeastern China region may be another reason for this disagreement.
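
    The nudging technique mentioned above is Newtonian relaxation: an extra tendency term pulls each prognostic field toward the reanalysis value on a chosen timescale. A minimal sketch, with invented values and a relaxation timescale tau that is purely illustrative:

```python
# Minimal sketch of Newtonian relaxation ("nudging"): at each step the model
# state is pulled toward the reanalysis value with relaxation timescale tau.
def nudge_step(u, u_reanalysis, tendency, dt, tau):
    """One forward-Euler step: model tendency plus the nudging term -(u - u_ref)/tau."""
    return u + dt * (tendency - (u - u_reanalysis) / tau)

u = 10.0       # model wind speed (m/s), hypothetical initial state
u_ref = 6.0    # reanalysis value toward which the model is relaxed
for _ in range(100):
    u = nudge_step(u, u_ref, tendency=0.0, dt=3600.0, tau=6 * 3600.0)
print(u)  # converges toward the reanalysis value of 6.0
```

In the full model the physical tendency is nonzero, so the nudged run stays close to, but is not identical to, the reanalysis.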

  6. Real-time, haptics-enabled simulator for probing ex vivo liver tissue.

    PubMed

    Lister, Kevin; Gao, Zhan; Desai, Jaydev P

    2009-01-01

    The advent of complex surgical procedures has driven the need for realistic surgical training simulators. Comprehensive simulators that provide realistic visual and haptic feedback during surgical tasks are required to familiarize surgeons with the procedures they are to perform. Complex organ geometry inherent to biological tissues and intricate material properties drive the need for finite element methods to assure accurate tissue displacement and force calculations. Advances in real-time finite element methods have not reached the state where they are applicable to soft tissue surgical simulation. Therefore a real-time, haptics-enabled simulator for probing of soft tissue has been developed which utilizes preprocessed finite element data (derived from accurate constitutive model of the soft-tissue obtained from carefully collected experimental data) to accurately replicate the probing task in real-time.

  7. Design of Accelerator Online Simulator Server Using Structured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Guobao (Brookhaven); Chu, Chungming

    2012-07-06

    Model based control plays an important role for a modern accelerator during beam commissioning, beam study, and even daily operation. With a realistic model, beam behaviour can be predicted and therefore effectively controlled. The approach used by most current high level application environments is to use a built-in simulation engine and feed a realistic model into that simulation engine. Instead of this traditional monolithic structure, a new approach using a client-server architecture is under development. An on-line simulator server is accessed via network accessible structured data. With this approach, a user can easily access multiple simulation codes. This paper describes the design, implementation, and current status of PVData, which defines the structured data, and PVAccess, which provides network access to the structured data.
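
    The client-server idea can be illustrated with a toy exchange of structured data. The sketch below is NOT the PVData/PVAccess API; the record layout, field names and JSON transport are invented stand-ins for the kind of labeled table a simulator server might return to a client:

```python
# Illustrative only: a toy structured-data exchange mimicking the client-server
# architecture described above. Names and layout are invented, not pvAccess.
import json

def serve_twiss(request: str) -> str:
    """Toy 'simulator server': returns structured lattice data for a request."""
    req = json.loads(request)
    # A structured record: labeled columns, as a table-style record would hold.
    table = {
        "labels": ["element", "s", "beta_x"],
        "value": {
            "element": ["Q1", "Q2"],
            "s": [0.5, 1.5],
            "beta_x": [12.3, 8.7],
        },
    }
    return json.dumps({"model": req["model"], "twiss": table})

# A "client" sends a structured request and parses the structured reply.
reply = json.loads(serve_twiss(json.dumps({"model": "design", "query": "twiss"})))
print(reply["twiss"]["labels"])  # ['element', 's', 'beta_x']
```

Because the data, not the simulation engine, crosses the network boundary, the same client could talk to multiple simulation codes behind the server.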

  8. Cost assessment and ecological effectiveness of nutrient reduction options for mitigating Phaeocystis colony blooms in the Southern North Sea: an integrated modeling approach.

    PubMed

    Lancelot, Christiane; Thieu, Vincent; Polard, Audrey; Garnier, Josette; Billen, Gilles; Hecq, Walter; Gypens, Nathalie

    2011-05-01

    Nutrient reduction measures have already been taken by wealthier countries to decrease nutrient loads to coastal waters; in most cases, however, without having properly assessed their ecological effectiveness and economic costs. In this paper we describe an original integrated impact assessment methodology to estimate the direct cost and the ecological performance of realistic nutrient reduction options to be applied in the Southern North Sea watershed to decrease eutrophication, visible as Phaeocystis blooms and foam deposits on the beaches. The mathematical tool couples the idealized biogeochemical GIS-based model of the river system (SENEQUE-RIVERSTRAHLER), implemented in the Eastern Channel/Southern North Sea watershed, to the biogeochemical MIRO model describing Phaeocystis blooms in the marine domain. Model simulations explore how nutrient reduction options regarding diffuse and/or point sources in the watershed would affect the spreading of Phaeocystis colonies in the coastal area. The reference and prospective simulations are performed for the year 2000, characterized by mean meteorological conditions, and the nutrient reduction scenarios include and compare upgrading of wastewater treatment plants and changes in agricultural practices, including an idealized shift towards organic farming. A direct cost assessment is performed for each realistic nutrient reduction scenario. Further, the reduction obtained for Phaeocystis blooms is assessed by comparison with ecological indicators (bloom magnitude and duration), and the cost of reducing foam events on the beaches is estimated. Uncertainty brought by the added effect of meteorological conditions (rainfall) on coastal eutrophication is discussed. 
It is concluded that the reduction obtained by implementing realistic environmental measures in the short term is costly and insufficient to restore well-balanced nutrient conditions in the coastal area, while the replacement of conventional agriculture by organic farming might be an option to consider in the near future. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Bivariate frequency analysis of rainfall intensity and duration for urban stormwater infrastructure design

    NASA Astrophysics Data System (ADS)

    Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo

    2017-10-01

    This study presents a storm-event-based bivariate frequency analysis approach to determine design rainfalls in which the number, intensity and duration of actual rainstorm events are considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used from three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach gives a more realistic description of the rainfall characteristics of rainstorm events and design rainfalls. As a result, the design rainfall quantities derived from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This stems from accounting for the occurrence probability of each rainstorm event, a different angle on rainfall frequency analysis; it offers an alternative way of describing extreme rainfall properties and could potentially help improve the hydrologic design of stormwater management facilities in urban areas.
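
    As a sketch of the copula step described above, the following example computes the joint exceedance probability and an "AND" joint return period from a Gumbel copula. The marginal probabilities, dependence parameter theta, and the mean annual number of events are illustrative assumptions, not the paper's fitted values:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_exceedance(u, v, theta):
    """P(I > i AND D > d) from the marginal non-exceedance probabilities u, v."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

# Hypothetical event: intensity and duration each at their 99th percentile,
# with moderate dependence (theta = 2).
u = v = 0.99
p_and = joint_exceedance(u, v, theta=2.0)
mu = 40.0                # assumed mean number of independent rainstorm events/year
T = 1.0 / (mu * p_and)   # return period (years) of the joint "AND" event
print(p_and, T)
```

Because the copula captures the dependence between intensity and duration, the joint return period differs from what independent marginals would give, which is the mechanism behind the lower event-based design rainfalls.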

  10. Observation and Numerical Simulation of Cavity Mode Oscillations Excited by an Interplanetary Shock

    NASA Astrophysics Data System (ADS)

    Takahashi, Kazue; Lysak, Robert; Vellante, Massimo; Kletzing, Craig A.; Hartinger, Michael D.; Smith, Charles W.

    2018-03-01

    Cavity mode oscillations (CMOs) are basic magnetohydrodynamic eigenmodes in the magnetosphere predicted by theory and are expected to occur following the arrival of an interplanetary shock. However, observational studies of shock-induced CMOs have been sparse. We present a case study of a dayside ultralow-frequency wave event that exhibited CMO properties. The event occurred immediately following the arrival of an interplanetary shock at 0829 UT on 15 August 2015. The shock was observed in the solar wind by the Time History of Events and Macroscale Interactions during Substorms-B and -C spacecraft, and magnetospheric ultralow-frequency waves were observed by multiple spacecraft including the Van Allen Probe-A and Van Allen Probe-B spacecraft, which were located in the dayside plasmasphere at L ~ 1.4 and L ~ 2.4, respectively. Both Van Allen Probes spacecraft detected compressional poloidal mode oscillations at ~13 mHz (fundamental) and ~26 mHz (second harmonic). At both frequencies, the azimuthal component of the electric field (Eϕ) lagged behind the compressional component of the magnetic field (Bμ) by ~90°. The frequencies and the Eϕ-Bμ relative phase are in good agreement with the CMOs generated in a dipole magnetohydrodynamic simulation that incorporates a realistic plasma mass density distribution and ionospheric boundary condition. The oscillations were also detected on the ground by the European quasi-Meridional Magnetometer Array, which was located near the magnetic field footprints of the Van Allen Probes spacecraft.
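
    The reported ~90° lag between Eϕ and Bμ is the kind of relationship one extracts from the cross-spectrum at the oscillation frequency. A minimal sketch with synthetic 13 mHz signals (the sampling interval and record length here are arbitrary choices, not the instrument settings):

```python
import cmath, math

# Synthetic 13 mHz signals: E_phi lags B_mu by 90 degrees, as in the CMO event.
f = 0.013            # oscillation frequency, Hz
dt = 1.0             # sampling interval, s
n = 2000             # record length (an integer number of cycles at f)
b = [math.cos(2 * math.pi * f * i * dt) for i in range(n)]
e = [math.cos(2 * math.pi * f * i * dt - math.pi / 2) for i in range(n)]

# Single-frequency DFT at f: the phase of the cross-spectrum E * conj(B)
# gives the relative phase of the two signals at that frequency.
w = [cmath.exp(-2j * math.pi * f * i * dt) for i in range(n)]
B = sum(bi * wi for bi, wi in zip(b, w))
E = sum(ei * wi for ei, wi in zip(e, w))
lag_deg = math.degrees(cmath.phase(E * B.conjugate()))
print(round(lag_deg))  # -90: E_phi lags B_mu by a quarter cycle
```

Applied to the measured Eϕ and Bμ time series, the same cross-spectral phase at 13 and 26 mHz is what identifies the standing (cavity-mode) character of the oscillation.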

  11. Imaging an Event Horizon: Mitigation of Source Variability of Sagittarius A*

    NASA Astrophysics Data System (ADS)

    Lu, Ru-Sen; Roelofs, Freek; Fish, Vincent L.; Shiokawa, Hotaka; Doeleman, Sheperd S.; Gammie, Charles F.; Falcke, Heino; Krichbaum, Thomas P.; Zensus, J. Anton

    2016-02-01

    The black hole in the center of the Galaxy, associated with the compact source Sagittarius A* (Sgr A*), is predicted to cast a shadow upon the emission of the surrounding plasma flow, which encodes the influence of general relativity (GR) in the strong-field regime. The Event Horizon Telescope (EHT) is a Very Long Baseline Interferometry (VLBI) network with a goal of imaging nearby supermassive black holes (in particular Sgr A* and M87) with angular resolution sufficient to observe strong gravity effects near the event horizon. General relativistic magnetohydrodynamic (GRMHD) simulations show that radio emission from Sgr A* exhibits variability on timescales of minutes, much shorter than the duration of a typical VLBI imaging experiment, which usually takes several hours. A changing source structure during the observations, however, violates one of the basic assumptions needed for aperture synthesis in radio interferometry imaging to work. By simulating realistic EHT observations of a model movie of Sgr A*, we demonstrate that an image of the average quiescent emission, featuring the characteristic black hole shadow and photon ring predicted by GR, can nonetheless be obtained by observing over multiple days and subsequent processing of the visibilities (scaling, averaging, and smoothing) before imaging. Moreover, it is shown that this procedure can be combined with an existing method to mitigate the effects of interstellar scattering. Taken together, these techniques allow the black hole shadow in the Galactic center to be recovered on the reconstructed image.
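
    The core of the mitigation strategy, averaging calibrated visibilities over multiple days so that source variability cancels while the quiescent structure survives, can be sketched on a single baseline. The numbers below are illustrative; the variable term is given an evenly wandering phase so the cancellation is exact:

```python
import cmath, math

# Hypothetical single-baseline visibility: a constant quiescent term plus a
# day-to-day variable term whose phase wanders with no preferred direction.
v_quiescent = cmath.rect(1.0, 0.3)
n_days = 10
days = [v_quiescent + 0.4 * cmath.exp(2j * math.pi * k / n_days)
        for k in range(n_days)]

# Averaging the per-day visibilities before imaging suppresses the variable
# component and leaves an estimate of the average quiescent emission.
v_avg = sum(days) / n_days
print(abs(v_avg - v_quiescent))  # ~0: the variability has averaged out
```

In practice the cancellation is statistical rather than exact, which is why the procedure also scales and smooths the visibilities before imaging.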

  12. Simulation framework for intelligent transportation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewing, T.; Doss, E.; Hanebutte, U.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scalable to take advantage of emerging massively parallel processor (MPP) systems.

  13. A fast low-to-high confinement mode bifurcation dynamics in the boundary-plasma gyrokinetic code XGC1

    DOE PAGES

    Ku, S.; Chang, C. S.; Hager, R.; ...

    2018-04-18

    Here, a fast edge turbulence suppression event has been simulated in the electrostatic version of the gyrokinetic particle-in-cell code XGC1 in a realistic diverted tokamak edge geometry under neutral particle recycling. The results show that turbulent Reynolds stress followed by neoclassical ion orbit loss together conspire to form the sustained radial electric field shear and to quench turbulent transport just inside the last closed magnetic flux surface. As a result, the main suppression action is located in a thin radial layer around ψN ≃ 0.96–0.98, where ψN is the normalized poloidal flux, with a time scale of ~0.1 ms.

  14. An Evaluation of the NOAA Climate Forecast System Subseasonal Forecasts

    NASA Astrophysics Data System (ADS)

    Mass, C.; Weber, N.

    2016-12-01

    This talk will describe a multi-year evaluation of the 1-5 week forecasts of the NOAA Climate Forecast System (CFS) over the globe, North America, and the western U.S. Forecasts are evaluated both for specific times and for a variety of time-averaging periods. Initial results show a loss of predictability at approximately three weeks, with sea surface temperature retaining predictability longer than atmospheric variables. It is shown that a major CFS problem is an inability to realistically simulate propagating convection in the tropics, with substantial implications for midlatitude teleconnections and subseasonal predictability. The inability of CFS to deal with tropical convection will be discussed in connection with the prediction of extreme climatic events over the midlatitudes.

  15. The digital computer as a metaphor for the perfect laboratory experiment: Loophole-free Bell experiments

    NASA Astrophysics Data System (ADS)

    De Raedt, Hans; Michielsen, Kristel; Hess, Karl

    2016-12-01

    Using Einstein-Podolsky-Rosen-Bohm experiments as an example, we demonstrate that the combination of a digital computer and algorithms, as a metaphor for a perfect laboratory experiment, provides solutions to problems of the foundations of physics. Employing discrete-event simulation, we present a counterexample to John Bell's remarkable "proof" that any theory of physics, which is both Einstein-local and "realistic" (counterfactually definite), results in a strong upper bound to the correlations that are being measured in Einstein-Podolsky-Rosen-Bohm experiments. Our counterexample, which is free of the so-called detection-, coincidence-, memory-, and contextuality loophole, violates this upper bound and fully agrees with the predictions of quantum theory for Einstein-Podolsky-Rosen-Bohm experiments.

  16. Modeling the energy performance of event-driven wireless sensor network by using static sink and mobile sink.

    PubMed

    Chen, Jiehui; Salim, Mariam B; Matsumoto, Mitsuji

    2010-01-01

    Wireless Sensor Networks (WSNs) designed for mission-critical applications suffer from limited sensing capacities, particularly fast energy depletion. Regarding this, mobile sinks can be used to balance the energy consumption in WSNs, but the frequent location updates of the mobile sinks can lead to data collisions and rapid energy consumption for some specific sensors. This paper explores an optimal barrier-coverage-based sensor deployment for event-driven WSNs in which a dual-sink model was designed to evaluate the energy performance of not only the static sensors but also the Static Sink (SS) and Mobile Sinks (MSs) simultaneously, based on parameters such as the sensor transmission range r and the velocity v of the mobile sink. Moreover, an MS mobility model was developed to enable the SS and MSs to collaborate effectively while achieving spatiotemporal energy-performance efficiency, using the cumulative density function (cdf), the Poisson process and the M/G/1 queue. The simulation results clearly demonstrated the improved energy performance of the whole network and showed that our eDSA algorithm is more efficient than the static-sink model, reducing energy consumption by approximately half. Moreover, we demonstrate that our results are robust to realistic sensing models and validate their correctness through extensive simulations.
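
    The M/G/1 ingredient mentioned above has a standard closed form. As a hedged sketch (the arrival rate and service moments below are invented, not the paper's parameters), the Pollaczek-Khinchine formula gives the mean waiting time of events queued at a node:

```python
# Sketch of the queueing quantities an event-driven energy model relies on.
# All numbers are illustrative, not taken from the paper.
lam = 0.2          # event arrival rate (events/s), Poisson process
es = 1.0           # mean service (transmission) time E[S], seconds
es2 = 2.0          # second moment E[S^2] for a general service distribution
rho = lam * es     # server utilization of the M/G/1 queue

# Pollaczek-Khinchine formula: mean waiting time in queue for M/G/1.
wq = lam * es2 / (2.0 * (1.0 - rho))
print(rho, wq)     # utilization 0.2, mean wait 0.25 s
```

Longer waits mean radios stay active longer, which is how queueing delay feeds into the network's energy budget.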

  17. Determining Wheel-Soil Interaction Loads Using a Meshfree Finite Element Approach Assisting Future Missions with Rover Wheel Design

    NASA Technical Reports Server (NTRS)

    Contreras, Michael T.; Peng, Chia-Yen; Wang, Dongdong; Chen, Jiun-Shyan

    2012-01-01

    A wheel experiencing sinkage and slippage events poses a high risk to rover missions as evidenced by recent mobility challenges on the Mars Exploration Rover (MER) project. Because several factors contribute to wheel sinkage and slippage conditions such as soil composition, large deformation soil behavior, wheel geometry, nonlinear contact forces, terrain irregularity, etc., there are significant benefits to modeling these events to a sufficient degree of complexity. For the purposes of modeling wheel sinkage and slippage at an engineering scale, meshfree finite element approaches enable simulations that capture sufficient detail of wheel-soil interaction while remaining computationally feasible. This study demonstrates some of the large deformation modeling capability of meshfree methods and the realistic solutions obtained by accounting for the soil material properties. A benchmark wheel-soil interaction problem is developed and analyzed using a specific class of meshfree methods called Reproducing Kernel Particle Method (RKPM). The benchmark problem is also analyzed using a commercially available finite element approach with Lagrangian meshing for comparison. RKPM results are comparable to classical pressure-sinkage terramechanics relationships proposed by Bekker-Wong. Pending experimental calibration by future work, the meshfree modeling technique will be a viable simulation tool for trade studies assisting rover wheel design.
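
    The Bekker-Wong comparison baseline mentioned above is the classical pressure-sinkage relation p = (kc/b + kphi) * z^n, where b is the contact width and z the sinkage. A sketch with textbook-style dry-sand-like parameters (illustrative values, not the paper's calibration):

```python
def bekker_pressure(z, b, kc, kphi, n):
    """Bekker pressure-sinkage relation: p = (kc/b + kphi) * z**n.

    z: sinkage (m), b: smaller contact dimension (m),
    kc, kphi: cohesive and frictional moduli, n: sinkage exponent.
    """
    return (kc / b + kphi) * z ** n

# Illustrative dry-sand-like soil parameters (SI units, not the paper's values).
p = bekker_pressure(z=0.05, b=0.15, kc=0.99e3, kphi=1528.43e3, n=1.1)
print(p)  # contact pressure in Pa for 5 cm sinkage under a 15 cm wide wheel
```

A meshfree simulation like RKPM is judged against this curve by plotting the computed contact pressure versus sinkage and comparing to p(z).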

  18. Modeling the Energy Performance of Event-Driven Wireless Sensor Network by Using Static Sink and Mobile Sink

    PubMed Central

    Chen, Jiehui; Salim, Mariam B.; Matsumoto, Mitsuji

    2010-01-01

    Wireless Sensor Networks (WSNs) designed for mission-critical applications suffer from limited sensing capacities, particularly fast energy depletion. Regarding this, mobile sinks can be used to balance the energy consumption in WSNs, but the frequent location updates of the mobile sinks can lead to data collisions and rapid energy consumption for some specific sensors. This paper explores an optimal barrier-coverage-based sensor deployment for event-driven WSNs in which a dual-sink model was designed to evaluate the energy performance of not only the static sensors but also the Static Sink (SS) and Mobile Sinks (MSs) simultaneously, based on parameters such as the sensor transmission range r and the velocity v of the mobile sink. Moreover, an MS mobility model was developed to enable the SS and MSs to collaborate effectively while achieving spatiotemporal energy-performance efficiency, using the cumulative density function (cdf), the Poisson process and the M/G/1 queue. The simulation results clearly demonstrated the improved energy performance of the whole network and showed that our eDSA algorithm is more efficient than the static-sink model, reducing energy consumption by approximately half. Moreover, we demonstrate that our results are robust to realistic sensing models and validate their correctness through extensive simulations. PMID:22163503

  19. Stellar tracking attitude reference system

    NASA Technical Reports Server (NTRS)

    Klestadt, B.

    1974-01-01

    A satellite precision attitude control system was designed, based on the use of STARS as the principal sensing system. The entire system was analyzed and simulated in detail, considering the nonideal properties of the control and sensing components and realistic spacecraft mass properties. Experimental results were used to improve the star tracker noise model. The results of the simulation indicate that STARS performs in general as predicted in a realistic application and should be a strong contender in most precision earth pointing applications.

  20. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large scale, complex geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper will also discuss examples of the use of simulation tools at NASA to develop new damage characterization methods, and associated challenges of validating those methods.

  1. Mspire-Simulator: LC-MS shotgun proteomic simulator for creating realistic gold standard data.

    PubMed

    Noyce, Andrew B; Smith, Rob; Dalgleish, James; Taylor, Ryan M; Erb, K C; Okuda, Nozomu; Prince, John T

    2013-12-06

    The most important step in any quantitative proteomic pipeline is feature detection (aka peak picking). However, generating quality hand-annotated data sets to validate the algorithms, especially for lower abundance peaks, is nearly impossible. An alternative for creating gold standard data is to simulate it with features closely mimicking real data. We present Mspire-Simulator, a free, open-source shotgun proteomic simulator that goes beyond previous simulation attempts by generating LC-MS features with realistic m/z and intensity variance along with other noise components. It also includes machine-learned models for retention time and peak intensity prediction and a genetic algorithm to custom fit model parameters for experimental data sets. We show that these methods are applicable to data from three different mass spectrometers, including two fundamentally different types, and show visually and analytically that simulated peaks are nearly indistinguishable from actual data. Researchers can use simulated data to rigorously test quantitation software, and proteomic researchers may benefit from overlaying simulated data on actual data sets.
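    The abstract mentions a genetic algorithm for fitting model parameters to experimental data sets. A toy sketch of that general idea (truncation selection plus Gaussian mutation over a single parameter) is shown below; it is illustrative only and is not Mspire-Simulator's actual implementation, and the loss function and bounds are invented for the example.

```python
# Toy genetic algorithm for fitting one model parameter to data.
# Generic illustration of the idea described in the abstract, not
# code from Mspire-Simulator.

import random

def fit_parameter(loss, lo, hi, pop=30, gens=40, seed=0):
    """Minimize loss(x) for x in [lo, hi] with a tiny GA."""
    rng = random.Random(seed)
    population = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=loss)
        parents = population[: pop // 3]            # truncation selection
        children = [p + rng.gauss(0, (hi - lo) / 50)
                    for p in parents for _ in range(2)]  # Gaussian mutation
        population = parents + children
    return min(population, key=loss)

if __name__ == "__main__":
    # Pretend 2.5 is the parameter value that best explains observed peaks.
    best = fit_parameter(lambda x: (x - 2.5) ** 2, 0.0, 10.0)
    print(round(best, 2))
```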

  2. Simulating human behavior for national security human interactions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.

    2007-01-01

    This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the ''Simulating Human Behavior for National Security Human Interactions'' project was to demonstrate an initial simulated human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.

  3. Smooth particle hydrodynamic modeling and validation for impact bird substitution

    NASA Astrophysics Data System (ADS)

    Babu, Arun; Prasad, Ganesh

    2018-04-01

    Bird strike events occur incidentally and can at times be fatal for airframe structures. Federal Aviation Regulations (FAR) and similar standards mandate that aircraft be designed to withstand various levels of bird-hit damage. The subject matter of this paper is the numerical modeling of a soft-body geometry that realistically substitutes for an actual bird in simulations of bird strikes on target structures. Such a numerical model, capturing actual bird behavior through impact, is much desired for exploiting state-of-the-art computational facilities in simulating bird strike events. The validity of simulations depicting bird hits depends largely on the correctness of the bird model. In an impact, a set of complex and coupled dynamic interactions exists between the target and the impactor. To simplify the problem, the impactor response is decoupled from that of the target by assuming and modeling the target as noncompliant. The bird is assumed to behave as a fluid during impact, since the stresses generated in the bird body are significantly higher than its yield stress; hydrodynamic theory is therefore most suitable for describing the problem. The impactor literally flows steadily over the target for most of the event. The impact starts with an initial shock, passes through a radial release-shock regime, and subsequently establishes a steady flow in the bird body that continues until the whole length of the bird has turned around. The initial shock pressure and the steady-state pressure are ideal variables for comparing and validating the bird model. Spatial discretization of the bird is done using the Smooth Particle Hydrodynamics (SPH) approach; this discrete element model offers significant advantages over other contemporary approaches. Thermodynamic state-variable relations are established using a polynomial equation of state (EOS), and ANSYS AUTODYN is used to perform the explicit dynamic simulation of the impact event. Validation of the shock and steady pressure data for different trial geometries is done against experimental and other published theoretical results, yielding a geometry that best reproduces the load values of a real bird impact event.
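    The two validation variables named in this abstract, the initial shock pressure and the steady-state pressure, follow from standard hydrodynamic relations for a fluid impactor on a rigid target. The sketch below uses illustrative water-like density and linear-Hugoniot parameters, not values from the paper, and the impact speed is an assumed example.

```python
# Hedged sketch of the two bird-strike validation pressures:
# the initial Hugoniot shock pressure and the Bernoulli stagnation
# pressure of the steady-flow phase, for a water-like surrogate
# impacting a rigid (noncompliant) target.

def hugoniot_shock_pressure(v, rho0=950.0, c0=1483.0, s=1.98):
    """P_H = rho0 * U_s * u_p with a linear Hugoniot U_s = c0 + s * u_p.
    For a rigid target the particle velocity u_p equals the impact speed v.
    rho0, c0, s are illustrative water/gelatine-like values."""
    return rho0 * (c0 + s * v) * v  # Pa

def steady_stagnation_pressure(v, rho0=950.0):
    """Steady-flow (stagnation) pressure: 0.5 * rho0 * v^2."""
    return 0.5 * rho0 * v ** 2  # Pa

if __name__ == "__main__":
    v = 116.0  # m/s, an assumed certification-style impact speed
    print(f"shock  pressure ~ {hugoniot_shock_pressure(v) / 1e6:.1f} MPa")
    print(f"steady pressure ~ {steady_stagnation_pressure(v) / 1e6:.2f} MPa")
```

    The large gap between the two values is the expected signature: the initial shock pressure greatly exceeds the steady-flow pressure that follows the release regime.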

  4. An Integrated Approach for the Large-Scale Simulation of Sedimentary Basins to Study Seismic Wave Amplification

    NASA Astrophysics Data System (ADS)

    Poursartip, B.

    2015-12-01

    Seismic hazard assessment to predict the behavior of infrastructure subjected to earthquakes relies on numerical simulation of ground motion, because analytical solutions for seismic waves are limited to only a few simple geometries. Recent advances in numerical methods and computer architectures make it ever more practical to reliably and quickly obtain the near-surface response to seismic events. The key motivation stems from the need to assess the performance of sensitive components of the civil infrastructure (nuclear power plants, bridges, lifelines, etc.) when subjected to realistic scenarios of seismic events. We discuss an integrated approach that deploys best-practice tools for simulating seismic events in arbitrarily heterogeneous formations, while also accounting for topography. Specifically, we describe an explicit forward wave solver based on a hybrid formulation that couples a single-field formulation for the computational domain with an unsplit mixed-field formulation for the Perfectly-Matched-Layers (PMLs and/or M-PMLs) used to limit the computational domain. Because of the material heterogeneity and the contrasting discretization needs it imposes, an adaptive time solver is adopted: a Runge-Kutta-Fehlberg time-marching scheme that optimally adjusts the time step so that the local truncation error remains below a predefined tolerance. We use spectral elements for spatial discretization, and the Domain Reduction Method, together with a double-couple source representation, to allow for the efficient prescription of the input seismic motion. Of particular interest to this development is the study of the effects idealized topographic features have on the surface motion when compared against motion results that are based on a flat-surface assumption.
We discuss the components of the integrated approach we followed, and report the results of parametric studies in two and three dimensions, for various idealized topographic features, which show motion amplification that depends, as expected, on the relation between the topographic feature's characteristics and the dominant wavelength. Lastly, we report results involving three-dimensional simulations.

  5. Towards a more realistic population of bright spiral galaxies in cosmological simulations

    NASA Astrophysics Data System (ADS)

    Aumer, Michael; White, Simon D. M.; Naab, Thorsten; Scannapieco, Cecilia

    2013-10-01

    We present an update to the multiphase smoothed particle hydrodynamics galaxy formation code by Scannapieco et al. We include a more elaborate treatment of the production of metals, cooling rates based on individual element abundances and a scheme for the turbulent diffusion of metals. Our supernova feedback model now transfers energy to the interstellar medium (ISM) in kinetic and thermal form, and we include a prescription for the effects of radiation pressure from massive young stars on the ISM. We calibrate our new code on the well-studied Aquarius haloes and then use it to simulate a sample of 16 galaxies with halo masses between 1 × 10^11 and 3 × 10^12 M⊙. In general, the stellar masses of the sample agree well with the stellar mass to halo mass relation inferred from abundance matching techniques for redshifts z = 0-4. There is however a tendency to overproduce stars at z > 4 and to underproduce them at z < 0.5 in the least massive haloes. Overly high star formation rates (SFRs) at z < 1 for the most massive haloes are likely connected to the lack of active galactic nuclei feedback in our model. The simulated sample also shows reasonable agreement with observed SFRs, sizes, gas fractions and gas-phase metallicities at z = 0-3. Remaining discrepancies can be connected to deviations from predictions for star formation histories from abundance matching. At z = 0, the model galaxies show realistic morphologies, stellar surface density profiles, circular velocity curves and stellar metallicities, but overly flat metallicity gradients. 15 of our 16 galaxies contain disc components, with kinematic disc fractions ranging between 15 and 65 per cent. The disc fraction depends on the time of the last destructive merger or misaligned infall event. Considering the remaining shortcomings of our simulations, we conclude that even higher kinematic disc fractions may be possible for Λ cold dark matter haloes with quiet merger histories, such as the Aquarius haloes.

  6. Impact of an open-chest extracorporeal membrane oxygenation model for in situ simulated team training: a pilot study.

    PubMed

    Atamanyuk, Iryna; Ghez, Olivier; Saeed, Imran; Lane, Mary; Hall, Judith; Jackson, Tim; Desai, Ajay; Burmester, Margarita

    2014-01-01

    To develop an affordable realistic open-chest extracorporeal membrane oxygenation (ECMO) model for embedded in situ interprofessional crisis resource management training in emergency management of a post-cardiac surgery child. An innovative attachment to a high-fidelity mannequin (Laerdal Simbaby) was used to enable a cardiac tamponade/ECMO standstill scenario. Two saline bags with blood dye were placed over the mannequin's chest. A 'heart' bag with venous and arterial outlets was connected to the corresponding tubes of the ECMO circuit. The bag was divided into arterial and venous parts by loosely wrapping silicone tubing around its centre. A 'pericardial' bag was placed above it. Both were then covered by a chest skin that had a sutured silicone membrane window. False blood injected into the 'pericardial' bag caused expansion leading to (i) bulging of the silastic membrane, simulating tamponade, and (ii) compression of tubing around the 'heart' bag, creating negative venous pressures and cessation of ECMO flow. In situ Simulation Paediatric Resuscitation Team Training (SPRinT) was performed on a paediatric intensive care unit; the course included formal team training, a scenario of an open-chest ECMO child with acute cardiac tamponade due to blocked chest drains, and debriefing by trained facilitators. Cardiac tamponade was reproducible, and ECMO flow/circuit pressure changes were effective and appropriate. There were eight participants: one cardiac surgeon, two intensivists, one cardiologist, one perfusionist and three nurses. Five of eight reported the realism of the model, and six of eight the realism of the clinical scenario, as highly effective. Eight of eight reported a highly effective impact on (i) their practice and (ii) teamwork. Six of eight reported a highly effective impact on communication skills and increased confidence in attending future real events.
Innovative adaptation of a high-fidelity mannequin for open-chest ECMO simulation can achieve a realistic and reproducible training model. The impact on interprofessional team training is promising but needs to be validated further.

  7. Simulation of seasonal US precipitation and temperature by the nested CWRF-ECHAM system

    NASA Astrophysics Data System (ADS)

    Chen, Ligang; Liang, Xin-Zhong; DeWitt, David; Samel, Arthur N.; Wang, Julian X. L.

    2016-02-01

    This study investigates the refined simulation skill that results when the regional Climate extension of the Weather Research and Forecasting (CWRF) model is nested in the ECMWF Hamburg version 4.5 (ECHAM) atmospheric general circulation model over the United States during 1980-2009, where observed sea surface temperatures are used in both models. Over the contiguous US, for each of the four seasons from winter to fall, CWRF reduces the root mean square error of the ECHAM seasonal mean surface air temperature simulation by 0.19, 0.82, 2.02 and 1.85 °C, and increases the equitable threat score of seasonal mean precipitation by 0.18, 0.11, 0.09 and 0.12. CWRF also simulates daily precipitation frequency and heavy precipitation events much more realistically, typically over the Central Great Plains, Cascade Mountains and Gulf Coast States. These CWRF skill enhancements are attributed to the increased spatial resolution and to physics refinements in representing orographic, terrestrial-hydrology, convection, and cloud-aerosol-radiation effects and their interactions. Empirical orthogonal function analysis of seasonal mean precipitation and surface air temperature interannual variability shows that, in general, CWRF substantially improves the spatial distribution of both quantities, while the temporal evolution (i.e. interannual variability) of the first three primary patterns is highly correlated with that of the driving ECHAM (except for summer precipitation), and both have low temporal correlations against observations. During winter, when large-scale forcing dominates, both models also have similar responses to strong ENSO signals: they successfully capture observed precipitation composite anomalies but substantially fail to reproduce surface air temperature anomalies.
When driven by the ECMWF Reanalysis Interim, CWRF produces a very realistic interannual evolution of large-scale precipitation and surface air temperature patterns where the temporal correlations with observations are significant. These results indicate that CWRF can greatly improve mesoscale regional climate structures but it cannot change interannual variations of the large-scale patterns, which are determined by the driving lateral boundary conditions.
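    The equitable threat score quoted in this abstract is a standard contingency-table verification metric (also known as the Gilbert skill score). A minimal sketch of its definition follows; this is the textbook formula, not code from the study, and the counts in the example are hypothetical.

```python
# Equitable threat score (ETS / Gilbert skill score) from a 2x2
# forecast contingency table: hits, misses, false alarms, and
# correct negatives. Random-chance hits are subtracted so that a
# random forecast scores ~0 and a perfect forecast scores 1.

def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

if __name__ == "__main__":
    # Hypothetical seasonal counts of rain / no-rain forecast outcomes.
    print(round(equitable_threat_score(50, 20, 30, 900), 3))
```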

  8. Assessment of the impact of oxidation processes on indoor air pollution using the new time-resolved INCA-Indoor model

    NASA Astrophysics Data System (ADS)

    Mendez, Maxence; Blond, Nadège; Blondeau, Patrice; Schoemaecker, Coralie; Hauglustaine, Didier A.

    2015-12-01

    INCA-Indoor, a new indoor air quality (IAQ) model, has been developed to simulate the concentrations of volatile organic compounds (VOCs) and oxidants, considering indoor-air-specific processes such as emission, ventilation, and surface interactions (sorption, deposition, uptake). Based on the detailed version of the SAPRC-07 chemical mechanism, INCA-Indoor is able to analyze the contribution of the production and loss pathways of key chemical species (VOCs, oxidants, radical species). The potential of this model has been tested through three complementary analyses: a comparison with the most detailed IAQ model found in the literature, focusing on oxidant species; realistic scenarios covering a large range of conditions, involving variable OH sources like HONO; and the investigation of alkene ozonolysis under a large range of indoor conditions that can increase OH and HO2 concentrations. Simulations have been run changing nitrous acid (HONO) concentrations, NOx levels, photolysis rates and ventilation rates, showing that HONO can be the main source of indoor OH. Cleaning events using products containing D-limonene have been simulated at different periods of the day. These scenarios show that HOx concentrations can significantly increase in specific conditions. An assessment of the impact of indoor chemistry on the potential formation of secondary species such as formaldehyde (HCHO) and acetaldehyde (CH3CHO) has been carried out under various room configuration scenarios, and a study of the HOx budget for different realistic scenarios has been performed. It has been shown that, under the simulation conditions, formaldehyde can be affected by oxidant concentrations via chemical production, which can account for more than 10% of the total production, representing 6.5 ppb/h. Acetaldehyde production, on the other hand, is affected even more strongly by oxidation processes.
When the photolysis rates are high, chemical processes are responsible for about 50% of the total production of acetaldehyde (9 ppb/h).
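    The competition between emission, ventilation and chemical production that this abstract quantifies for formaldehyde can be illustrated with the textbook single-zone mass balance, dC/dt = E + P_chem - λC, whose steady state is C = (E + P_chem)/λ. This sketch is not the INCA-Indoor model itself, and the emission rate and air-change rate below are assumed for illustration; only the 6.5 ppb/h chemical-production figure comes from the abstract.

```python
# Single-zone (box-model) steady-state concentration sketch:
#   dC/dt = E + P_chem - lam * C   ->   C_ss = (E + P_chem) / lam
# where E is a direct emission rate, P_chem a chemical production
# rate (both in ppb/h), and lam the air-change rate (1/h).

def steady_state_ppb(emission_ppb_h, chem_ppb_h, ach_per_h):
    """Steady-state concentration of the well-mixed single-zone balance."""
    return (emission_ppb_h + chem_ppb_h) / ach_per_h

if __name__ == "__main__":
    # 6.5 ppb/h of chemically produced HCHO (figure quoted in the abstract)
    # on top of an assumed 50 ppb/h direct emission at 0.5 air changes/h.
    total = steady_state_ppb(50.0, 6.5, 0.5)
    no_chem = steady_state_ppb(50.0, 0.0, 0.5)
    print(f"steady state: {total:.0f} ppb ({total - no_chem:.0f} ppb from chemistry)")
```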

  9. Immersion team training in a realistic environment improves team performance in trauma resuscitation.

    PubMed

    Siriratsivawong, Kris; Kang, Jeff; Riffenburgh, Robert; Hoang, Tuan N

    2016-09-01

    In the US military, it is common for health care teams to be formed ad hoc and expected to function cohesively as a unit. Poor team dynamics decreases the effectiveness of trauma care delivery. The US Navy Fleet Surgical Team Three has developed a simulation-based trauma initiative-the Shipboard Surgical Trauma Training (S2T2) Course-that emphasizes team dynamics to improve the delivery of trauma care to the severely injured patient. The S2T2 Course combines classroom didactics with hands-on simulation over a period of 6 days, culminating in a daylong, mass casualty scenario. Each resuscitation team was initially evaluated with a simulated trauma resuscitation scenario then retested on the same scenario after completing the course. A written exam was also administered individually both before and after the course. A survey was administered to assess the participants' perceived effectiveness of the course on overall team training. From the evaluation of 20 resuscitation teams made up of 123 medical personnel, there was a decrease in the mean time needed to perform the simulated trauma resuscitation, from a mean of 24.4 minutes to 13.5 minutes (P < .01), a decrease in the mean number of critical events missed, from 5.15 to 1.00 (P < .01), and a mean improvement of 41% in written test scores. More than 90% of participants rated the course as highly effective for improving team dynamics. A team-based trauma course with immersion in a realistic environment is an effective tool for improving team performance in trauma training. This approach has high potential to improve trauma care and patient outcomes. The benefits of this team-based course can be adapted to the civilian rural sector, where gaps have been identified in trauma care. Published by Elsevier Inc.

  10. Virtual Cerebral Aneurysm Clipping with Real-Time Haptic Force Feedback in Neurosurgical Education.

    PubMed

    Gmeiner, Matthias; Dirnberger, Johannes; Fenz, Wolfgang; Gollwitzer, Maria; Wurm, Gabriele; Trenkler, Johannes; Gruber, Andreas

    2018-04-01

    Realistic, safe, and efficient modalities for simulation-based training are highly warranted to enhance the quality of surgical education, and they should be incorporated in resident training. The aim of this study was to develop a patient-specific virtual cerebral aneurysm-clipping simulator with haptic force feedback and real-time deformation of the aneurysm and vessels. A prototype simulator was developed from 2012 to 2016. Evaluation of virtual clipping by blood flow simulation was integrated in this software, and the prototype was evaluated by 18 neurosurgeons. In 4 patients with different middle cerebral artery aneurysms, virtual clipping was performed after real-life surgery, and surgical results were compared regarding clip application, surgical trajectory, and blood flow. After head positioning and craniotomy, bimanual virtual aneurysm clipping with an original forceps was performed. Blood flow simulation demonstrated residual aneurysm filling or branch stenosis. The simulator improved anatomic understanding for 89% of neurosurgeons. Simulation of head positioning and craniotomy was considered realistic by 89% and 94% of users, respectively. Most participants agreed that this simulator should be integrated into neurosurgical education (94%). Our illustrative cases demonstrated that virtual aneurysm surgery was possible using the same trajectory as in real-life cases. Both virtual clipping and blood flow simulation were realistic in broad-based but not calcified aneurysms. Virtual clipping of a calcified aneurysm could be performed using the same surgical trajectory, but not the same clip type. We have successfully developed a virtual aneurysm-clipping simulator. Next, we will prospectively evaluate this device for surgical procedure planning and education. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Virtual Physical Therapy Clinician: Development, Validation and Testing

    ERIC Educational Resources Information Center

    Huhn, Karen

    2011-01-01

    Introduction: Clinical reasoning skills develop through repeated practice in realistic patient scenarios. Time constraints, declining availability of clinical education sites and patient safety are some of the factors that limit physical therapy educators' ability to expose students to realistic patient scenarios. Computerized simulations may be…

  12. Simulations of X-ray diffraction of shock-compressed single-crystal tantalum with synchrotron undulator sources.

    PubMed

    Tang, M X; Zhang, Y Y; E, J C; Luo, S N

    2018-05-01

    Polychromatic synchrotron undulator X-ray sources are useful for ultrafast single-crystal diffraction under shock compression. Here, simulations of X-ray diffraction of shock-compressed single-crystal tantalum with realistic undulator sources are reported, based on large-scale molecular dynamics simulations. Purely elastic deformation, elastic-plastic two-wave structure, and severe plastic deformation under different impact velocities are explored, as well as an edge release case. Transmission-mode diffraction simulations consider crystallographic orientation, loading direction, incident beam direction, X-ray spectrum bandwidth and realistic detector size. Diffraction patterns and reciprocal space nodes are obtained from atomic configurations for different loading (elastic and plastic) and detection conditions, and interpretation of the diffraction patterns is discussed.
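    The kinematic Bragg condition underlying such single-crystal diffraction simulations can be sketched as follows. The bcc tantalum lattice constant and the X-ray wavelength used here are illustrative assumptions for an ambient crystal and a ~20 keV undulator harmonic, not values taken from the paper.

```python
# Bragg's law sketch for single-crystal diffraction:
#   lambda = 2 d sin(theta), with d = a / sqrt(h^2 + k^2 + l^2)
# for a cubic lattice. Under shock compression d shrinks, so the
# Bragg angle of each reflection shifts: the basis of interpreting
# such diffraction patterns.

import math

def d_cubic(a, h, k, l):
    """Interplanar spacing of a cubic lattice (same units as a)."""
    return a / math.sqrt(h * h + k * k + l * l)

def bragg_angle_deg(d_spacing, wavelength):
    """Bragg angle theta (degrees) from lambda = 2 d sin(theta)."""
    return math.degrees(math.asin(wavelength / (2.0 * d_spacing)))

if __name__ == "__main__":
    a_ta = 3.3058   # Angstrom, bcc tantalum at ambient conditions (assumed)
    lam = 0.6199    # Angstrom, roughly a 20 keV undulator harmonic (assumed)
    d110 = d_cubic(a_ta, 1, 1, 0)
    print(f"d(110) = {d110:.3f} A, theta_B = {bragg_angle_deg(d110, lam):.2f} deg")
```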

  13. Simulations of X-ray diffraction of shock-compressed single-crystal tantalum with synchrotron undulator sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, M. X.; Zhang, Y. Y.; E, J. C.

    Polychromatic synchrotron undulator X-ray sources are useful for ultrafast single-crystal diffraction under shock compression. Here, simulations of X-ray diffraction of shock-compressed single-crystal tantalum with realistic undulator sources are reported, based on large-scale molecular dynamics simulations. Purely elastic deformation, elastic-plastic two-wave structure, and severe plastic deformation under different impact velocities are explored, as well as an edge release case. Transmission-mode diffraction simulations consider crystallographic orientation, loading direction, incident beam direction, X-ray spectrum bandwidth and realistic detector size. Diffraction patterns and reciprocal space nodes are obtained from atomic configurations for different loading (elastic and plastic) and detection conditions, and interpretation of the diffraction patterns is discussed.

  14. Integrative computational models of cardiac arrhythmias -- simulating the structurally realistic heart

    PubMed Central

    Trayanova, Natalia A; Tice, Brock M

    2009-01-01

    Simulation of cardiac electrical function, and specifically, simulation aimed at understanding the mechanisms of cardiac rhythm disorders, represents an example of a successful integrative multiscale modeling approach, uncovering emergent behavior at the successive scales in the hierarchy of structural complexity. The goal of this article is to present a review of the integrative multiscale models of realistic ventricular structure used in the quest to understand and treat ventricular arrhythmias. It concludes with the new advances in image-based modeling of the heart and the promise it holds for the development of individualized models of ventricular function in health and disease. PMID:20628585

  15. Hydraulic risk assessment of bridges using UAV photogrammetry

    NASA Astrophysics Data System (ADS)

    Hackl, Jürgen; Adey, Bryan T.; Woźniak, Michał; Schümperlin, Oliver

    2017-04-01

    Road networks are essential for economic growth and development. Of the objects within a road network, bridges are of special interest, because their failure often results in relatively large interruptions to how the network is used, their replacement costs are generally large, and it usually takes a considerable amount of time to restore them once they have failed. Of the different types of bridges, bridges in mountainous regions are of special interest because their failure could cause severe societal consequences, for example, if it renders an area inaccessible. One of the main causes of the failure of bridges in mountainous regions is the occurrence of a hydraulic event, for example, flood waters above a certain level, scour below a certain depth or debris build-up beyond a certain level. An assessment of risk related to a bridge in a mountainous region is challenging. The probability of occurrence of these events, and the resulting consequences, depend greatly on the characteristics (e.g. slope, soil, vegetation, precipitation, …) of the specific regions where the bridges are located. An indication of the effect of these characteristics can be seen in the sediment deposition during floods in mountain catchments. Additionally, there is often no, or no recent, topographic information that can be used to develop terrain models for realistic water flow simulations in mountain regions, and most hydrology and hydraulic models have been developed for lower-gradient rivers and often cannot be directly used to model water flow in mountain rivers. In an effort to improve the assessment of risk related to bridges in mountainous regions, using the setting for risk assessments established by Hackl et al. (2015) and Adey et al. (2016), an investigation was undertaken to determine whether unmanned aerial vehicles (UAVs) and photogrammetry could be used to generate the topographic information required to run realistic water flow simulations.
    The process investigated includes: the use of geo-referenced images taken by a UAV, the export of these images into photogrammetric software, the creation of a 3D mesh of the terrain from these images, the conversion of the 3D mesh to a computational mesh, the use of the computational mesh to build a hydrodynamic model, and the use of the hydrodynamic model to run flow simulations. The process was used to estimate the complex water flow near a single-span concrete bridge in the Canton of Grisons, Switzerland. The hydraulic events (abutment scour and overflow) predicted by the developed model were compared with historical observations from a recent flood event in the region. The predicted hydraulic events correspond with the historical observations, indicating that the topographic information collected in this way is sufficiently accurate to be used to simulate complex flow situations, which can be used in bridge risk assessments.
    Hackl, J., Adey, B.T., Heitzler, M., and Iosifescu Enescu, I. (2015). "An Overarching Risk Assessment Process to Evaluate the Risks Associated with Infrastructure Networks due to Natural Hazards." International Journal of Performability Engineering, 11(2), 153-168.
    Adey, B.T., Hackl, J., Lam, J.C., van Gelder, P., Prak, P., van Erp, N., Heitzler, M., Iosifescu Enescu, I., and Hurni, L. (2016). "Ensuring acceptable levels of infrastructure related risks due to natural hazards with emphasis on conducting stress tests." 1st International Symposium on Infrastructure Asset Management (SIAM2016), K. Kobayashi, ed., Kyoto, Japan, 19-29 (Jan).

  16. A Variable-Resolution Stretched-Grid General Circulation Model and Data Assimilation System with Multiple Areas of Interest: Studying the Anomalous Regional Climate Events of 1998

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.; Takacs, Lawrence; Govindaraju, Ravi C.; Atlas, Robert (Technical Monitor)

    2002-01-01

    The new stretched-grid design with multiple (four) areas of interest, one in each global quadrant, is implemented into both a stretched-grid GCM (general circulation model) and a stretched-grid data assimilation system (DAS). The four areas of interest are: the U.S./Northern Mexico, the El Niño area/Central South America, India/China, and the Eastern Indian Ocean/Australia. Both the stretched-grid GCM and DAS annual (November 1997 through December 1998) integrations are performed with 50 km regional resolution. Efficient regional downscaling to mesoscales is obtained for each of the four areas of interest, while the consistent interactions between regional and global scales, and the high quality of the global circulation, are preserved; this is the advantage of the stretched-grid approach. The global variable-resolution DAS incorporating the stretched-grid GCM has been developed and tested as an efficient tool for producing regional analyses and diagnostics with enhanced mesoscale resolution. The anomalous regional climate events of 1998 that occurred over the U.S., Mexico, South America, China, India, the African Sahel, and Australia are investigated in both simulation and data assimilation modes. The assimilated products are also used, along with gauge precipitation data, for validating the simulation results. The results show that the stretched-grid GCM and DAS are capable of producing realistic, high-quality simulated and assimilated products at mesoscale resolution for regional climate studies and applications.

  17. Challenges and Solutions for GNSS Receivers onboard LEO Satellites Traveling through the Ionosphere during Space Weather Events

    NASA Astrophysics Data System (ADS)

    Morton, Y.; Xu, D.; Yang, R.; Jiao, Y.; Rino, C.; Carrano, C. S.

    2017-12-01

    This presentation discusses challenges imposed on GNSS receiver carrier-tracking loops for receivers onboard LEO satellites traveling through the ionosphere during space weather events, and techniques that mitigate these effects. Recent studies show that ESA's Swarm satellites experienced a total loss of GPS signals in areas known for frequent occurrence of ionospheric plasma irregularities. The same phenomena have been observed in other satellite missions. More robust GNSS receiver technologies are needed to improve the navigation capabilities of future LEO satellite missions. A major challenge in characterizing GNSS signals that traverse ionospheric plasma structures to reach a LEO satellite is the lack of data. To overcome this challenge, we utilized a physics-based GNSS scintillation signal simulator to generate simulated data for analysis and algorithm development. The simulator relies on real scintillation data collected by ground-based receivers as the initializer to generate a realization of the statistical distribution of ionospheric irregularity structures. A user specifies the desired satellite orbit, signal modulation scheme, receiver platform dynamics, and receiver front-end hardware design. These inputs are used to establish the signal propagation geometry to allow interception of the disturbed signal by a realistic GNSS receiver. The simulator results showed that plasma structures lead to strong disturbances on GNSS signals reaching a LEO platform. The disturbances are characterized by simultaneous deep amplitude fades and extremely rapid carrier phase fluctuations. The carrier phase rate is orders of magnitude higher than that experienced by receivers on the ground. Such high carrier dynamics far exceeds the range that can be tolerated by the bandwidth of a typical GNSS receiver, and the deep amplitude fades further exacerbate the problem. Based on the simulator outputs, we established models of the disturbed signal parameters.
These models are used in an adaptive carrier-tracking algorithm that demonstrated improved performances when applied to various simulated scenarios of plasma structures and receiver trajectories. The presentation will discuss the simulator, disturbed signal characterization, and the adaptive algorithm architecture and performances.
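    The disturbance described above (a deep amplitude fade coinciding with a rapid carrier phase swing) can be mimicked on a synthetic complex baseband carrier; the sketch below is illustrative only, with invented fade depth, phase-ramp rate, and sample rate, and is not the physics-based simulator of the abstract.

    ```python
    import numpy as np

    fs = 1000.0                      # sample rate in Hz (assumed)
    t = np.arange(0, 1.0, 1.0 / fs)  # one second of data

    clean = np.exp(1j * 2 * np.pi * 5.0 * t)  # 5 Hz residual carrier
    # Deep amplitude fade centered at t = 0.5 s (invented shape)
    fade = 1.0 - 0.95 * np.exp(-((t - 0.5) ** 2) / (2 * 0.01 ** 2))
    # Rapid phase ramp confined to a 0.1 s scintillation window (invented)
    scint_phase = 40.0 * np.pi * (t - 0.5) * (np.abs(t - 0.5) < 0.05)
    disturbed = fade * clean * np.exp(1j * scint_phase)

    # Instantaneous carrier phase rate (Hz) that a tracking loop must follow
    phase = np.unwrap(np.angle(disturbed))
    rate_hz = np.diff(phase) * fs / (2 * np.pi)
    print(float(np.max(np.abs(rate_hz))))
    ```

    Inside the scintillation window the phase rate jumps well above the quiescent 5 Hz, illustrating why a fixed-bandwidth loop loses lock and why the abstract's adaptive tracking algorithm is needed.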

  18. SIMPAVE: Evaluation of virtual environments for pavement construction simulations

    DOT National Transportation Integrated Search

    2007-05-01

    In the last couple of years, the authors have been developing virtual simulations for modeling the construction of asphalt pavements. The simulations are graphically rich, interactive, three-dimensional, with realistic physics, and allow multiple peo...

  19. Human swallowing simulation based on videofluorography images using Hamiltonian MPS method

    NASA Astrophysics Data System (ADS)

    Kikuchi, Takahiro; Michiwaki, Yukihiro; Kamiya, Tetsu; Toyama, Yoshio; Tamai, Tasuku; Koshizuka, Seiichi

    2015-09-01

    In developed nations, swallowing disorders and aspiration pneumonia have become serious problems. We developed a method to simulate the behavior of the organs involved in swallowing to clarify the mechanisms of swallowing and aspiration. The shape model is based on anatomically realistic geometry, and the motion model utilizes forced displacements based on realistic dynamic images to reflect the mechanisms of human swallowing. The soft tissue organs are modeled as nonlinear elastic material using the Hamiltonian MPS method. This method allows for stable simulation of the complex swallowing movement. A penalty method using metaballs is employed to simulate contact between organ walls and smooth sliding along the walls. We performed four numerical simulations under different analysis conditions to represent four cases of swallowing, including a healthy volunteer and a patient with a swallowing disorder. The simulation results were compared to examine the epiglottic downfolding mechanism, which strongly influences the risk of aspiration.

  20. "Physically-based" numerical experiment to determine the dominant hillslope processes during floods?

    NASA Astrophysics Data System (ADS)

    Gaume, Eric; Esclaffer, Thomas; Dangla, Patrick; Payrastre, Olivier

    2016-04-01

    To study the dynamics of hillslope responses during flood events, a fully coupled "physically-based" model for the combined numerical simulation of surface runoff and underground flows has been developed. Particular attention was given to the selection of appropriate numerical schemes for modelling both processes and their coupling. Surprisingly, the most difficult question to solve from a numerical point of view was not the coupling of two processes with contrasting kinetics, such as surface and underground flows, but the high-gradient infiltration fronts appearing in soils, a source of numerical diffusion, instabilities and sometimes divergence. Once elaborated, the model was successfully tested against the results of high-quality experiments conducted on a laboratory sandy slope in the early eighties, which is still considered a reference hillslope experimental setting (Abdul & Guilham). The model proved able to accurately simulate the pore pressure distributions observed in this 1.5 meter deep and wide laboratory hillslope, as well as the shapes of its outflow hydrographs and the measured respective contributions of direct runoff and groundwater to these hydrographs. Building on this success, the same model was used to simulate the response of a theoretical 100-meter wide, 10% sloped hillslope with a 2 meter deep pervious soil over impervious bedrock. Three rain events were tested: a 100 millimeter rainfall delivered over 10 days, over 1 day, or over one hour. The simulated responses are hydrologically unrealistic; in particular, the fast component of the response, which is generally observed in the real world and explains flood events, is almost absent from the simulated response. On reflection, the simulation results appear entirely logical given the proposed model. The simulated response, in fact a recession hydrograph, corresponds to piston flow through a relatively uniformly saturated hillslope, leading to a constant discharge over several days. Some ingredients are clearly missing from the proposed model to reproduce hydrologically sensible responses. Heterogeneities are necessary to generate a variety of residence times, and preferential flows in particular must be present to generate the fast component of hillslope responses. The importance of preferential flows in hillslope hydrology has been confirmed since this reported failure by several hillslope field experiments. We also let readers draw their own conclusions about the numerous numerical models that look very much like the one proposed here (even if generally much more simplified), yet represent watersheds as far too homogeneous, neglect heterogeneities and preferential flows, and claim to be "physically based"…

  1. Evaluation Of Risk And Possible Mitigation Schemes For Previously Unidentified Hazards

    NASA Technical Reports Server (NTRS)

    Linzey, William; McCutchan, Micah; Traskos, Michael; Gilbrech, Richard; Cherney, Robert; Slenski, George; Thomas, Walter, III

    2006-01-01

    This report presents the results of arc track testing conducted to determine whether such a transfer of power to un-energized wires is possible and/or likely during an arcing event, and to evaluate an array of protection schemes that may significantly reduce the possibility of such a transfer. The results of these experiments may be useful for determining the level of protection necessary to guard against spurious voltage and current being applied to safety-critical circuits. It was not the purpose of these experiments to determine the probability of the initiation of an arc track event, but only whether, if an initiation did occur, it could cause the undesired event: an inadvertent thruster firing. The primary wire insulation used in the Orbiter is aromatic polyimide, or Kapton, a construction known to arc track under certain conditions [3]. Previous Boeing testing has shown that arc tracks can initiate in aromatic polyimide insulated 28 volt direct current (VDC) power circuits using more realistic techniques, such as chafing with an aluminum blade (simulating the corner of an avionics box or the lip of a wire tray) or vibration of an aluminum plate against a wire bundle [4]. Therefore, an arc initiation technique was chosen that provided a reliable and consistent means of starting the arc rather than a realistic simulation of a scenario on the vehicle. Once an arc is initiated, the current, power, and propagation characteristics of the arc depend on the power source, wire gauge and insulation type, circuit protection, and series resistance rather than on the type of initiation. The initiation method employed for these tests was applying an oil and graphite mixture to the ends of a powered twisted-pair wire. The flight configuration of the heater circuits, the fuel/oxidizer (or ox) wire, and the RCS jet solenoid were modeled in the test configuration so that the behavior of these components during an arcing event could be studied. To determine whether coil activation would occur with various protection schemes, 145 tests were conducted using various fuel/ox wire alternatives (shielded and unshielded) and/or different combinations of polytetrafluoroethylene (PTFE), Mystik tape, and convoluted wraps to prevent unwanted coil activation. Test results were evaluated along with other pertinent data and information to develop a mitigation strategy for an inadvertent RCS firing. The SSP evaluated civilian aircraft wiring failures to search for aging trends in assessing the wire-short hazard. Appendix 2 applies Weibull statistical methods to the same data with a similar purpose.

  2. Abrupt cooling over the North Atlantic in modern climate models

    PubMed Central

    Sgubin, Giovanni; Swingedouw, Didier; Drijfhout, Sybren; Mary, Yannick; Bennabi, Amine

    2017-01-01

    Observations over the 20th century show no long-term warming in the subpolar North Atlantic (SPG). This region even experienced a rapid cooling around 1970, raising a debate over its potential reoccurrence. Here we assess the risk of future abrupt SPG cooling in 40 climate models from the fifth Coupled Model Intercomparison Project (CMIP5). Contrary to the long-term SPG warming trend evidenced by most of the models, 17.5% of the models (7/40) project a rapid SPG cooling, consistent with a collapse of the local deep-ocean convection. Uncertainty in the projections is associated with the models' varying capability in simulating the present-day SPG stratification, whose realistic reproduction appears to be a necessary condition for the onset of a convection collapse. This event occurs in 45.5% of the 11 models best able to simulate the observed SPG stratification. Thus, due to systematic model biases, the CMIP5 ensemble as a whole underestimates the chance of future abrupt SPG cooling, entailing crucial implications for observation and adaptation policy. PMID:28198383

  3. Modeling Geometry and Progressive Failure of Material Interfaces in Plain Weave Composites

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Cheng, Ron-Bin

    2010-01-01

    A procedure combining a geometrically nonlinear, explicit-dynamics contact analysis, computer aided design techniques, and elasticity-based mesh adjustment is proposed to efficiently generate realistic finite element models for meso-mechanical analysis of progressive failure in textile composites. In the procedure, the geometry of fiber tows is obtained by imposing a fictitious expansion on the tows. Meshes resulting from the procedure are conformal with the computed tow-tow and tow-matrix interfaces but are incongruent at the interfaces. The mesh interfaces are treated as cohesive contact surfaces not only to resolve the incongruence but also to simulate progressive failure. The method is employed to simulate debonding at the material interfaces in a ceramic-matrix plain weave composite with matrix porosity and in a polymeric matrix plain weave composite without matrix porosity, both subject to uniaxial cyclic loading. The numerical results indicate progression of the interfacial damage during every loading and reverse loading event in a constant strain amplitude cyclic process. However, the composites show different patterns of damage advancement.

  4. NOTE: Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool

    NASA Astrophysics Data System (ADS)

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  5. Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool.

    PubMed

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-07

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  6. Data-driven magnetohydrodynamic modelling of a flux-emerging active region leading to solar eruption

    PubMed Central

    Jiang, Chaowei; Wu, S. T.; Feng, Xuesheng; Hu, Qiang

    2016-01-01

    Solar eruptions are well recognized as major drivers of space weather, but what causes them remains an open question. Here we show how an eruption is initiated in a non-potential magnetic flux-emerging region using magnetohydrodynamic modelling driven directly by solar magnetograms. Our model simulates the coronal magnetic field through a long-duration quasi-static evolution to its fast eruption. The simulated field morphology resembles a set of extreme-ultraviolet images throughout the process. Study of the magnetic field suggests that in this event, the key transition from the pre-eruptive to the eruptive state is the establishment of a positive feedback between the upward expansion of newly emerged, internally stressed magnetic arcades and an external magnetic reconnection that triggers the eruption. Such a nearly realistic simulation of a solar eruption from origin to onset can provide important insight into its cause, and also has the potential to improve space weather modelling. PMID:27181846

  7. Colonoscopy procedure simulation: virtual reality training based on a real time computational approach.

    PubMed

    Wen, Tingxi; Medveczky, David; Wu, Jackie; Wu, Jianhuang

    2018-01-25

    Colonoscopy plays an important role in the clinical screening and management of colorectal cancer. The traditional 'see one, do one, teach one' training style for such an invasive procedure is resource-intensive and ineffective. Given that colonoscopy is difficult and time-consuming to master, the use of virtual reality simulators to train gastroenterologists in colonoscopy operations offers a promising alternative. In this paper, a realistic, real-time interactive simulator for training in the colonoscopy procedure is presented, which can even include polypectomy simulation. Our approach models the colonoscope as a thick flexible elastic rod with different resolutions that adapt dynamically to the curvature of the colon. Additional material characteristics of this deformable material are integrated into our discrete model to realistically simulate the behavior of the colonoscope. We present a simulator for training in the colonoscopy procedure and propose a set of key aspects of the simulator that give fast, high-fidelity feedback to trainees. We also conducted an initial validation of this colonoscopic simulator to determine its clinical utility and efficacy.

  8. The Thermal Regulation of Gravitational Instabilities in Protoplanetary Disks. III. Simulations with Radiative Cooling and Realistic Opacities

    NASA Astrophysics Data System (ADS)

    Boley, Aaron C.; Mejía, Annie C.; Durisen, Richard H.; Cai, Kai; Pickett, Megan K.; D'Alessio, Paola

    2006-11-01

    This paper presents a fully three-dimensional radiative hydrodynamics simulation with realistic opacities for a gravitationally unstable 0.07 Msolar disk around a 0.5 Msolar star. We address the following aspects of disk evolution: the strength of gravitational instabilities (GIs) under realistic cooling, the mass transport in the disk that arises from GIs, comparisons between the gravitational and Reynolds stresses measured in the disk and those expected in an α-disk, and comparisons between the SED derived for the disk and SEDs derived from observationally determined parameters. The mass transport in this disk is dominated by global modes, and the cooling times are too long to permit fragmentation at any radius. Moreover, our results suggest a plausible explanation for the FU Ori outburst phenomenon.

  9. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    NASA Astrophysics Data System (ADS)

    Scolnic, D.; Kessler, R.; Brout, D.; Cowperthwaite, P. S.; Soares-Santos, M.; Annis, J.; Herner, K.; Chen, H.-Y.; Sako, M.; Doctor, Z.; Butler, R. E.; Palmese, A.; Diehl, H. T.; Frieman, J.; Holz, D. E.; Berger, E.; Chornock, R.; Villar, V. A.; Nicholl, M.; Biswas, R.; Hounsell, R.; Foley, R. J.; Metzger, J.; Rest, A.; García-Bellido, J.; Möller, A.; Nugent, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Cunha, C. E.; D’Andrea, C. B.; da Costa, L. N.; Davis, C.; Doel, P.; Drlica-Wagner, A.; Eifler, T. F.; Flaugher, B.; Fosalba, P.; Gaztanaga, E.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; Hartley, W. G.; Honscheid, K.; James, D. J.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; March, M.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Neilsen, E.; Plazas, A. A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Smith, R. C.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, R. C.; Tucker, D. L.; Walker, A. R.; DES Collaboration

    2018-01-01

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. More broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.
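    The scaling behind such estimates (a volumetric rate times a surveyed comoving volume, duration, and detection efficiency) can be illustrated with a back-of-envelope sketch. The survey numbers below are invented placeholders, not the paper's light-curve-level simulation outputs; only the BNS rate is the value quoted in the abstract.

    ```python
    BNS_RATE = 1.0e3  # volumetric BNS rate in Gpc^-3 yr^-1, as quoted above

    def expected_kne(volume_gpc3, sky_fraction, years, efficiency):
        """Expected kilonova detections for a survey monitoring sky_fraction
        of the sky out to comoving volume volume_gpc3 for a given duration,
        with a net (hypothetical) detection efficiency."""
        return BNS_RATE * volume_gpc3 * sky_fraction * years * efficiency

    # Hypothetical survey: 1 Gpc^3 horizon, 10% of the sky, 1 yr, 1% efficiency
    print(expected_kne(1.0, 0.10, 1.0, 0.01))  # → 1.0
    ```

    The paper's actual numbers fold in survey cadence, depth, and light-curve selection cuts, which this one-line scaling deliberately omits.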

  10. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scolnic, D.; Kessler, R.; Brout, D.

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.

  11. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    DOE PAGES

    Scolnic, D.; Kessler, R.; Brout, D.; ...

    2017-12-22

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.

  12. Modeling the Magnetopause Shadowing and Drift Orbit Bifurcation Loss during the June 2015 Dropout Event

    NASA Astrophysics Data System (ADS)

    Tu, W.; Cunningham, G.

    2017-12-01

    The relativistic electron fluxes in Earth's radiation belt are observed to drop by orders of magnitude on timescales of a few hours. Where do the electrons go during these dropouts? This is one of the most important outstanding questions in radiation belt studies. Here we will study the 22 June 2015 dropout event, which occurred during one of the largest geomagnetic storms of the last decade. A sudden and nearly complete loss of all the outer-zone relativistic and ultra-relativistic electrons was observed after a strong interplanetary shock. The Last Closed Drift Shell (LCDS) calculated using the TS04 model reached as low as L*=3.7 during the shock and stayed below L*=4 for 1 hour. The unusually low LCDS values suggest that magnetopause shadowing and the associated outward radial diffusion can contribute significantly to the observed dropout. In addition, Drift Orbit Bifurcation (DOB) has been suggested as an important loss mechanism for radiation belt electrons, especially when the solar wind dynamic pressure is high, but its relative importance has not been quantified. Here, we will model the June 2015 dropout event using a radial diffusion model that includes physical and event-specific inputs. First, we will trace electron drift shells based on the TS04 model to identify the LCDS and bifurcation regions as a function of the 2nd adiabatic invariant (K) and time. To model magnetopause shadowing, electron lifetimes in our model will be set to electron drift periods at L*>LCDS. Electron lifetimes inside the bifurcation region have been estimated by Ukhorskiy et al. [JGR 2011, doi:10.1029/2011JA016623] as a function of L* and K, and these will also be implemented in the model. This will be the first effort to include DOB loss in a comprehensive radiation belt model. Furthermore, to realistically simulate outward radial diffusion, new radial diffusion coefficients calculated from the realistic TS04 model, which include a physical K dependence [Cunningham, JGR 2016, doi:10.1002/2015JA021981], will be adopted here. With these event-specific and physical model inputs, we will test how well the observed fast dropout during the June 2015 event can be reproduced by our model, and quantify the relative contributions of magnetopause shadowing, outward radial diffusion, and DOB to the fast electron depletion.
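    The modeling recipe above, radial diffusion plus a very short electron lifetime beyond the LCDS, can be sketched with a toy explicit solver. Every grid value, diffusion coefficient, and lifetime below is an invented placeholder, not the paper's TS04-based, event-specific input.

    ```python
    import numpy as np

    # Toy model of  dF/dt = L^2 d/dL( D_LL/L^2 * dF/dL ) - F/tau(L),
    # where tau is set very short beyond an (assumed, fixed) LCDS to mimic
    # rapid loss by magnetopause shadowing.
    L = np.linspace(3.0, 7.0, 41)          # L* grid (assumed)
    dL = L[1] - L[0]
    F = np.exp(-((L - 4.5) ** 2))          # initial phase-space density (arbitrary)
    D_LL = 1e-3 * (L / 4.0) ** 10          # assumed steep L-dependence of D_LL
    LCDS = 5.0                             # assumed constant LCDS location
    tau = np.where(L > LCDS, 0.01, 10.0)   # lifetimes in days: fast loss outside

    dt = 0.001                             # days; small enough for stability
    for _ in range(1000):                  # integrate one day, explicit Euler
        dFdL = np.gradient(F, dL)
        tendency = L ** 2 * np.gradient(D_LL / L ** 2 * dFdL, dL)
        F = F + dt * (tendency - F / tau)
        F[0], F[-1] = F[1], 0.0            # zero-gradient inner, absorbing outer

    # After one day the region beyond the LCDS is strongly depleted.
    print(F[L > LCDS].max() < F[L <= LCDS].max())
    ```

    The short lifetime outside the LCDS drains the outer zone, and the steep D_LL then diffuses inner-zone electrons outward into the loss region, which is the qualitative dropout mechanism the abstract describes.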

  13. [Investigation into the formation of proportions of "realistic thinking vs magical thinking" in paranoid schizophrenia].

    PubMed

    Jarosz, M; Pankiewicz, Z; Buczek, I; Poprawska, I; Rojek, J; Zaborowski, A

    1993-01-01

    Both magical thinking among healthy persons and magical and symbolic thinking in schizophrenia are discussed. The investigation covered 100 paranoid schizophrenics, who also underwent examination of the formation of the remaining three proportions. Scales for both realistic thinking and magical thinking were used. An ability to think realistically was preserved, to a varying degree, in all patients, with 50% of those examined showing an explicit or very explicit ability to follow realistic thinking. These findings deviate from a simplified cognitive model within the discussed range. It was further confirmed that realistic thinking may coexist with magical thinking and, in some cases, concerns the same events. This type of thought-content disorder is referred to as magical-realistic interpenetration. The results, and particularly the high negative correlation coefficient between the scales of the examined proportions, confirm the assumption that the investigated modes of thinking form an antithetic bipolarity of proportions, aggregating antithetic values and therefore also being complementary.

  14. Realistic Radio Communications in Pilot Simulator Training

    NASA Technical Reports Server (NTRS)

    Burki-Cohen, Judith; Kendra, Andrew J.; Kanki, Barbara G.; Lee, Alfred T.

    2000-01-01

    Simulators used for total training and evaluation of airline pilots must satisfy stringent criteria in order to assure their adequacy for training and checking maneuvers. Air traffic control and company radio communications simulation, however, may still be left to role-play by the already taxed instructor/evaluators in spite of their central importance in every aspect of the flight environment. The underlying premise of this research is that providing a realistic radio communications environment would increase safety by enhancing pilot training and evaluation. This report summarizes the first-year efforts of assessing the requirement and feasibility of simulating radio communications automatically. A review of the training and crew resource/task management literature showed both practical and theoretical support for the need for realistic radio communications simulation. A survey of 29 instructor/evaluators from 14 airlines revealed that radio communications are mainly role-played by the instructor/evaluators. This increases instructor/evaluators' own workload while unrealistically lowering pilot communications load compared to actual operations, with a concomitant loss in training/evaluation effectiveness. A technology review searching for an automated means of providing radio communications to and from aircraft with minimal human effort showed that while promising, the technology is still immature. Further research and the need for establishing a proof-of-concept are also discussed.

  15. NDE and SHM Simulation for CFRP Composites

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Parker, F. Raymond

    2014-01-01

    Ultrasound-based nondestructive evaluation (NDE) is a common technique for damage detection in composite materials. There is a need for advanced NDE that goes beyond damage detection to damage quantification and characterization in order to enable data-driven prognostics. The damage types that exist in carbon fiber-reinforced polymer (CFRP) composites include microcracking and delaminations, and can be initiated and grown via impact forces (due to ground vehicles, tool drops, bird strikes, etc.), fatigue, and extreme environmental changes. X-ray microfocus computed tomography data, among other methods, have shown that these damage types often result in voids/discontinuities of complex volumetric shape. The specific damage geometry and location within ply layers affect damage growth. Realistic three-dimensional NDE and structural health monitoring (SHM) simulations can aid in the development and optimization of damage quantification and characterization techniques. This paper is an overview of ongoing work towards realistic NDE and SHM simulation tools for composites, and also discusses NASA's need for such simulation tools in aeronautics and spaceflight. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with realistic three-dimensional damage in CFRP composites. The custom code uses the elastodynamic finite integration technique and is parallelized to run efficiently on computing clusters or multicore machines.

  16. Improved Land Use and Leaf Area Index Enhances WRF-3DVAR Satellite Radiance Assimilation: A Case Study Focusing on Rainfall Simulation in the Shule River Basin during July 2013

    NASA Astrophysics Data System (ADS)

    Yang, Junhua; Ji, Zhenming; Chen, Deliang; Kang, Shichang; Fu, Congshen; Duan, Keqin; Shen, Miaogen

    2018-06-01

    The application of satellite radiance assimilation can improve the simulation of precipitation by numerical weather prediction models. However, substantial quantities of satellite data, especially those derived from low-level (surface-sensitive) channels, are rejected because of the difficulty of realistically modeling land surface emissivity and energy budgets. Here, we used an improved land use and leaf area index (LAI) dataset in the WRF-3DVAR assimilation system to explore the benefit of improved land surface information for rainfall simulation, taking the Shule River Basin in the northeastern Tibetan Plateau as a case study. The results for July 2013 show that, for low-level channels (e.g., channel 3), the underestimation of brightness temperature in the original simulation was largely removed by more realistic land surface information. In addition, more satellite data could be utilized because the realistic land use and LAI data allowed more radiance observations to pass the deviation test and be used in the assimilation, which resulted in improved initial driving fields and better simulation of temperature, relative humidity, vertical convection, and cumulative precipitation.
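    The "deviation test" mentioned above is, in essence, an innovation check: a radiance observation is kept only when the observed-minus-background brightness temperature falls within a threshold. A minimal sketch follows; the function name and the 3 K threshold are assumptions for illustration, not values from the paper.

    ```python
    # Sketch of an innovation (observation-minus-background) screening step
    # like the "deviation test" above; the 3 K threshold is assumed.
    def passes_deviation_test(observed_bt_k, background_bt_k, max_dev_k=3.0):
        """Keep a radiance observation when |O - B| <= max_dev_k kelvin."""
        return abs(observed_bt_k - background_bt_k) <= max_dev_k

    # A smaller O-B bias (e.g. from better land surface information) lets
    # more observations pass the check.
    print(passes_deviation_test(250.0, 248.5), passes_deviation_test(250.0, 244.0))
    ```

    This is why reducing the brightness-temperature bias directly increases the volume of assimilated data: the background value moves closer to the observation, so fewer observations exceed the rejection threshold.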

  17. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
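    The core simulation step described above, constructing a pseudo-predator signature by bootstrap-resampling prey signatures and mixing them according to a known diet, can be sketched generically as below. This is not Bromaghin's algorithm for objectively choosing bootstrap sample sizes; the function name, toy data, and the fixed n_boot are all hypothetical stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

def pseudo_predator(prey_sigs, diet, n_boot):
    """Build one pseudo-predator fatty acid signature with a known diet.

    prey_sigs : dict mapping prey type -> (n_samples, n_fatty_acids) array
    diet      : dict mapping prey type -> diet proportion (sums to 1)
    n_boot    : bootstrap sample size per prey type (the quantity the
                paper's algorithm selects objectively; fixed here)
    """
    sig = 0.0
    for prey, p in diet.items():
        samples = prey_sigs[prey]
        idx = rng.integers(0, len(samples), size=n_boot)  # resample with replacement
        sig = sig + p * samples[idx].mean(axis=0)         # diet-weighted bootstrap mean
    return sig / sig.sum()                                # renormalize to proportions

# Toy data: two prey types, 30 samples each, 5 fatty acids
prey = {"seal": rng.dirichlet(np.ones(5), 30),
        "fish": rng.dirichlet(np.ones(5), 30)}
sig = pseudo_predator(prey, {"seal": 0.7, "fish": 0.3}, n_boot=20)
```

    The paper's contribution is precisely how to pick n_boot so that the resulting signatures have realistic variance, which this fixed value does not address.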

  18. Boulder Dislodgment Reloaded: New insights from boulder transport and dislodgement by tsunamis and storms from three-dimensional numerical simulations with GPUSPH

    NASA Astrophysics Data System (ADS)

    Weiss, R.; Zainali, A.

    2014-12-01

    Boulders can be found on many coastlines around the globe. They are generally thought to be moved either during coastal storms or tsunamis because they are too heavy to be moved by more common marine or coastal processes. To understand storm and tsunami risk at a given coastline, the event histories of the two processes need to be separated to produce robust event statistics for quantitative risk analyses. Because boulders are most likely moved only by coastal storms or tsunamis, they are well suited to provide the data basis for such event statistics. The boulder transport problem has classically been approached by comparing the driving and resisting forces acting on a boulder. However, we argue that this approach is not sufficient, because a comparison of resisting and driving forces establishes only the onset of boulder motion, not boulder dislodgment. Boulder motion means that the boulder starts to move out of its pocket; this motion does not guarantee that the boulder will reach the critical dislodgment position. Boulder dislodgment is the necessary condition for identifying whether or not a boulder has moved. For boulder dislodgment, an equation of motion is needed, and that equation is Newton's Second Law of Motion (NSL). We perform fully coupled three-dimensional numerical simulations of boulders moved by waves in which the boulders move according to NSL. Our numerical simulations are the first of their kind applied to tsunami and storm boulder motion. They show how storm and tsunami waves interact with boulders in a more realistic physical setting, and they highlight the importance of submergence. Based on our simulations, we perform a dimensional analysis that identifies the Froude number as an important parameter; it can be considered large only at the front of tsunami waves, but is small in the rest of the tsunami wave and generally small in storm waves.
From a general point of view, our results indicate that the boulder transport problem is more complex than previously considered, and more variables need to be considered when inverting wave characteristics from moved boulders. However, numerical simulation is an incredibly powerful and flexible tool with which more robust and more correct inversion techniques can be developed; our analyses of the Froude number and submergence are encouraging first indicators.
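    The Froude number singled out by the dimensional analysis is the ratio of flow speed to shallow-water wave speed, Fr = u / sqrt(g·h). A small sketch, with depths and speeds that are illustrative rather than taken from the paper:

```python
import math

def froude(u, h, g=9.81):
    """Froude number of a flow with depth-averaged speed u (m/s) and depth h (m)."""
    return u / math.sqrt(g * h)

# Illustrative values (not from the paper): a thin, fast tsunami front
# versus a deeper, slower flow in the interior of the wave.
fr_front    = froude(u=10.0, h=1.0)   # supercritical, Fr > 1
fr_interior = froude(u=5.0,  h=8.0)   # subcritical,  Fr < 1
```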

  19. Bayesian reconstruction of transmission within outbreaks using genomic variants.

    PubMed

    De Maio, Nicola; Worby, Colin J; Wilson, Daniel J; Stoesser, Nicole

    2018-04-01

    Pathogen genome sequencing can reveal details of transmission histories and is a powerful tool in the fight against infectious disease. In particular, within-host pathogen genomic variants identified through heterozygous nucleotide base calls are a potential source of information to identify linked cases and infer direction and time of transmission. However, using such data effectively to model disease transmission presents a number of challenges, including differentiating genuine variants from those observed due to sequencing error, as well as the specification of a realistic model for within-host pathogen population dynamics. Here we propose a new Bayesian approach to transmission inference, BadTrIP (BAyesian epiDemiological TRansmission Inference from Polymorphisms), that explicitly models evolution of pathogen populations in an outbreak, transmission (including transmission bottlenecks), and sequencing error. BadTrIP enables the inference of host-to-host transmission from pathogen sequencing data and epidemiological data. By assuming that genomic variants are unlinked, our method does not require the computationally intensive and unreliable reconstruction of individual haplotypes. Using simulations we show that BadTrIP is robust in most scenarios and can accurately infer transmission events by efficiently combining information from genetic and epidemiological sources; thanks to its realistic model of pathogen evolution and the inclusion of epidemiological data, BadTrIP is also more accurate than existing approaches. BadTrIP is distributed as an open source package (https://bitbucket.org/nicofmay/badtrip) for the phylogenetic software BEAST2. We apply our method to reconstruct transmission history at the early stages of the 2014 Ebola outbreak, showcasing the power of within-host genomic variants to reconstruct transmission events.

  20. Seasonal Synchronization of a Simple Stochastic Dynamical Model Capturing El Niño Diversity

    NASA Astrophysics Data System (ADS)

    Thual, S.; Majda, A.; Chen, N.

    2017-12-01

    The El Niño-Southern Oscillation (ENSO) has a significant impact on global climate and seasonal prediction. Recently, a simple ENSO model was developed that automatically captures the ENSO diversity and intermittency seen in nature, where state-dependent stochastic wind bursts and nonlinear advection of sea surface temperature (SST) are coupled to simple ocean-atmosphere processes that are otherwise deterministic, linear and stable. In the present article, it is further shown that the model can qualitatively reproduce the ENSO synchronization (or phase-locking) to the seasonal cycle observed in nature. This goal is achieved by incorporating a cloud radiative feedback that is derived naturally from the model's atmosphere dynamics with no ad hoc assumptions and that accounts in simple fashion for the marked seasonal variations of convective activity and cloud cover in the eastern Pacific. In particular, the weak convective response to SSTs in boreal fall favors the eastern Pacific warming that triggers El Niño events, while the increased convective activity and cloud cover during the following spring contribute to the shutdown of those events by blocking incoming shortwave solar radiation. In addition to simulating the ENSO diversity with realistic non-Gaussian statistics in the different Niño regions, the eastern Pacific moderate and super El Niño, the central Pacific El Niño, and La Niña all show a realistic chronology, with a tendency to peak in boreal winter as well as decreased predictability in spring, consistent with the persistence barrier in nature. The incorporation of other possible seasonal feedbacks in the model is also documented for completeness.

  1. Numerical study of the effect of earth tides on recurring short-term slow slip events

    NASA Astrophysics Data System (ADS)

    Matsuzawa, T.; Tanaka, Y.; Shibazaki, B.

    2017-12-01

    Short-term slow slip events (SSEs) in the Nankai region are affected by earth tides (e.g., Nakata et al., 2008; Ide and Tanaka, 2014; Yabe et al., 2015). The effect of tidal stress on SSEs has also been examined numerically (e.g., Hawthorne and Rubin, 2013). In our previous study (Matsuzawa et al., 2017, JpGU-AGU), we numerically simulated SSEs in the Shikoku region and reported that tidal stress reduces the variance of SSE recurrence intervals in relatively isolated SSE regions. However, the reason for such stable recurrence was not clear. In this study, we examine the tidal effect on short-term SSEs based on a flat-plate model and a realistic plate model (e.g., Matsuzawa et al., 2013, GRL). We adopt a rate- and state-dependent friction law (RS-law) with cutoff velocities, as in our previous studies (Matsuzawa et al., 2013). We assume that the (a-b) value in the RS-law is negative within the short-term SSE region and positive outside it. In the flat-plate model, the short-term SSE region is a circular patch with a radius of 6 km. In the realistic plate model, the short-term SSE region is based on the actual distribution of low-frequency tremor. Low effective normal stress is assumed at the depth of SSEs. Calculating the stress change due to earth tides as in Yabe et al. (2015), we examine the stress perturbation from two different tidal constituents, the semidiurnal (M2) and fortnightly (Mf) tides. In the flat-plate case, the amplitude of SSEs becomes smaller just after slip over the whole simulated area. Recurring SSEs become clear again within one year in the cases with tides (M2 or Mf), while the recurrence becomes clear only after seven years in the case without tides. Interestingly, the effect of the Mf tide is similar to that of the M2 tide, even though the amplitude of the Mf tide (0.01 kPa) is two orders of magnitude smaller than that of the M2 tide.
In the realistic plate model of Shikoku, clear recurrence of short-term SSEs is found earlier than in the case without tides, after the occurrence of long-term SSEs. These results suggest that stress perturbation by earth tides makes SSEs more episodic, even when loading from the surrounding area would otherwise tend to cause temporarily stable sliding.

  2. How Expert Pilots Think: Cognitive Processes in Expert Decision Making

    DTIC Science & Technology

    1993-02-01

    Keywords: Crew Resource Management (CRM); Advanced Qualification Program (AQP); Cognitive Task Analysis (CTA). This document is available to the public through the National Technical Information Service. Selecting realistic EDM scenarios with critical events and performing a cognitive task analysis of novice vs. expert decision making for these events is a basic requirement for

  3. Numerical investigations with WRF about atmospheric features leading to heavy precipitation and flood events over the Central Andes' complex topography

    NASA Astrophysics Data System (ADS)

    Zamuriano, Marcelo; Brönnimann, Stefan

    2017-04-01

    It is known that some extremes, such as heavy rainfall, flood events, heatwaves and droughts, depend largely on the atmospheric circulation and on local features. Bolivia is no exception: while the large-scale dynamics over the Amazon have been investigated extensively, the local features driven by the Andes Cordillera and the Altiplano are still poorly documented. New insights into the regional atmospheric dynamics preceding heavy precipitation and flood events over the complex topography of the Andes-Amazon interface are added through numerical investigations of several case events: flash flood episodes over La Paz city and the extreme 2014 flood in the south-western Amazon basin. Large-scale atmospheric water transport is dynamically downscaled in order to take into account the forcing of the complex topography and local features as modulators of these events. For this purpose, a series of high-resolution numerical experiments with the WRF-ARW model is conducted using various global datasets and parameterizations. While several mechanisms have been suggested to explain the dynamics of these episodes, they had not previously been tested through numerical modelling experiments. The simulations realistically capture the local water transport and the influence of terrain on atmospheric circulation, even though the precipitation intensity is in general unrealistic. Nevertheless, the results show that dynamical downscaling over the tropical Andes' complex terrain provides useful meteorological data for a variety of studies and contributes to a better understanding of the physical processes involved in the configuration of these events.

  4. Multi-ray medical ultrasound simulation without explicit speckle modelling.

    PubMed

    Tuzer, Mert; Yazıcı, Abdulkadir; Türkay, Rüştü; Boyman, Michael; Acar, Burak

    2018-05-04

    To develop a medical ultrasound (US) simulation method using T1-weighted magnetic resonance images (MRI) as the input that offers a compromise between low-cost ray-based and high-cost realistic wave-based simulations. The proposed method uses a novel multi-ray image formation approach with a virtual phased-array transducer probe. A domain model is built from the input MR images. Multiple virtual acoustic rays emanate from each element of the linear transducer array. Reflected and transmitted acoustic energy at discrete points along each ray is computed independently. Simulated US images are computed by fusing the reflected energy along multiple rays from multiple transducers, while phase delays due to differences in distances to the transducers are taken into account. A preliminary implementation using GPUs is presented. Preliminary results show that the multi-ray approach is capable of automatically generating viewpoint-dependent realistic US images with an inherent Rician-distributed speckle pattern. The proposed simulator can reproduce shadowing artefacts and demonstrates a frequency dependence apt for practical training purposes. We also present preliminary results towards the use of the method for real-time simulation. The proposed method offers a low-cost, near-real-time, wave-like simulation of realistic US images from input MR data. It can further be improved to cover pathological findings using an improved domain model, without any algorithmic updates. Such a domain model would require lesion segmentation or the manual embedding of virtual pathologies for training purposes.
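    At each discrete point along a ray, the reflected fraction of the wave amplitude at an interface follows from the acoustic impedance contrast; large contrasts (soft tissue against bone or air) are what produce the shadowing artefacts mentioned above. A minimal sketch with illustrative impedance values (the mapping from T1-weighted intensity to impedance is the paper's domain model and is not reproduced here):

```python
def reflection_coeff(z1, z2):
    """Amplitude reflection coefficient at a planar interface between
    media of acoustic impedance z1 and z2, at normal incidence."""
    return (z2 - z1) / (z2 + z1)

# Typical impedances in MRayl (illustrative textbook-order values,
# not taken from the paper)
z_liver, z_bone, z_air = 1.65, 7.8, 0.0004
r_soft_bone = reflection_coeff(z_liver, z_bone)  # strong echo
r_soft_air  = reflection_coeff(z_liver, z_air)   # near-total reflection -> shadowing
```

    The transmitted energy fraction at each interface is 1 - R², which is what a ray-based scheme propagates to the points deeper along the ray.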

  5. Low resolution brain electromagnetic tomography in a realistic geometry head model: a simulation study

    NASA Astrophysics Data System (ADS)

    Ding, Lei; Lai, Yuan; He, Bin

    2005-01-01

    It is important to localize neural sources from scalp-recorded EEG. Low-resolution brain electromagnetic tomography (LORETA) has received considerable attention for localizing brain electrical sources. However, most such efforts have used spherical head models to represent the head volume conductor. Investigating the performance of LORETA in a realistic-geometry head model, as compared with the spherical model, provides useful information for interpreting data obtained with the spherical head model. The performance of LORETA was evaluated by means of computer simulations. The boundary element method was used to solve the forward problem. A three-shell realistic-geometry (RG) head model was constructed from MRI scans of a human subject. Dipole source configurations of a single dipole located in different regions of the brain with varying depth were used to assess the performance of LORETA in different regions of the brain. A three-sphere head model was also used to approximate the RG head model; similar simulations were performed and the results compared with those of the RG-LORETA with reference to the locations of the simulated sources. Multi-source localizations are discussed, with examples given in the RG head model. Localization errors employing the spherical LORETA, with reference to the source locations within the realistic-geometry head, were about 20-30 mm for the four brain regions evaluated: frontal, parietal, temporal and occipital. Localization errors employing the RG head model were about 10 mm over the same four regions. The present simulation results suggest that the use of the RG head model reduces the localization error of LORETA, and that the RG-head-model-based LORETA is desirable if high localization accuracy is needed.

  6. Collision detection and modeling of rigid and deformable objects in laparoscopic simulator

    NASA Astrophysics Data System (ADS)

    Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru

    2015-03-01

    Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated into virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at a minimum of 30 fps. Our current laparoscopic simulator detects collisions between a point on the tool tip and the organ surfaces, with haptic devices attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and to speed up the organ deformation response. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
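    Multi-point collision detection along a rigid rod can be sketched by sampling points along the tool shaft and testing each against the organ geometry. In the sketch below a sphere stands in for the simulator's triangular-mesh organ surface, so the function and all values are illustrative only:

```python
import numpy as np

def rod_sphere_contacts(p0, p1, center, radius, n_points=16):
    """Test n_points sample points along a rigid rod (segment p0-p1)
    against a sphere; return indices of penetrating points and their
    penetration depths. A sphere stands in for the organ mesh here."""
    t = np.linspace(0.0, 1.0, n_points)[:, None]
    pts = (1 - t) * p0 + t * p1                   # sample points on the rod
    d = np.linalg.norm(pts - center, axis=1)      # distance to sphere center
    hit = d < radius
    return np.nonzero(hit)[0], radius - d[hit]    # indices, penetration depths

idx, depth = rod_sphere_contacts(np.array([0., 0., 2.]), np.array([0., 0., -2.]),
                                 center=np.array([0., 0., 0.]), radius=1.0)
```

    Because every sample point is tested in one vectorized pass, the same structure extends naturally to testing many points against many primitives per visual frame.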

  7. Realistic Analytical Polyhedral MRI Phantoms

    PubMed Central

    Ngo, Tri M.; Fung, George S. K.; Han, Shuo; Chen, Min; Prince, Jerry L.; Tsui, Benjamin M. W.; McVeigh, Elliot R.; Herzka, Daniel A.

    2015-01-01

    Purpose Analytical phantoms have closed form Fourier transform expressions and are used to simulate MRI acquisitions. Existing 3D analytical phantoms are unable to accurately model shapes of biomedical interest. It is demonstrated that polyhedral analytical phantoms have closed form Fourier transform expressions and can accurately represent 3D biomedical shapes. Theory The derivations of the Fourier transform of a polygon and polyhedron are presented. Methods The Fourier transform of a polyhedron was implemented and its accuracy in representing faceted and smooth surfaces was characterized. Realistic anthropomorphic polyhedral brain and torso phantoms were constructed and their use in simulated 3D/2D MRI acquisitions was described. Results Using polyhedra, the Fourier transform of faceted shapes can be computed to within machine precision. Smooth surfaces can be approximated with increasing accuracy by increasing the number of facets in the polyhedron; the additional accumulated numerical imprecision of the Fourier transform of polyhedra with many faces remained small. Simulations of 3D/2D brain and 2D torso cine acquisitions produced realistic reconstructions free of high frequency edge aliasing as compared to equivalent voxelized/rasterized phantoms. Conclusion Analytical polyhedral phantoms are easy to construct and can accurately simulate shapes of biomedical interest. PMID:26479724
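    The simplest case of a closed-form polyhedral Fourier transform is an axis-aligned box, whose transform separates into sinc factors. General polyhedra in the paper require a surface-integral formula, so the sketch below covers only this special case:

```python
import numpy as np

def box_ft(kx, ky, kz, L=(1.0, 1.0, 1.0)):
    """Closed-form 3-D Fourier transform of an origin-centered box with
    side lengths L -- the simplest polyhedral phantom. At k = 0 this
    returns the box volume."""
    val = np.prod(L)
    for k, l in zip((kx, ky, kz), L):
        val = val * np.sinc(k * l / (2 * np.pi))  # np.sinc(x) = sin(pi*x)/(pi*x)
    return val
```

    Sampling such an expression directly in k-space is what lets analytical phantoms simulate MRI acquisitions without the edge aliasing of voxelized phantoms.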

  8. Real-time simulation of a spiking neural network model of the basal ganglia circuitry using general purpose computing on graphics processing units.

    PubMed

    Igarashi, Jun; Shouno, Osamu; Fukai, Tomoki; Tsujino, Hiroshi

    2011-11-01

    Real-time simulation of a biologically realistic spiking neural network is necessary for evaluation of its capacity to interact with real environments. However, the real-time simulation of such a neural network is difficult due to its high computational costs that arise from two factors: (1) vast network size and (2) the complicated dynamics of biologically realistic neurons. In order to address these problems, mainly the latter, we chose to use general purpose computing on graphics processing units (GPGPUs) for simulation of such a neural network, taking advantage of the powerful computational capability of a graphics processing unit (GPU). As a target for real-time simulation, we used a model of the basal ganglia that has been developed according to electrophysiological and anatomical knowledge. The model consists of heterogeneous populations of 370 spiking model neurons, including computationally heavy conductance-based models, connected by 11,002 synapses. Simulation of the model has not yet been performed in real-time using a general computing server. By parallelization of the model on the NVIDIA Geforce GTX 280 GPU in data-parallel and task-parallel fashion, faster-than-real-time simulation was robustly realized with only one-third of the GPU's total computational resources. Furthermore, we used the GPU's full computational resources to perform faster-than-real-time simulation of three instances of the basal ganglia model; these instances consisted of 1100 neurons and 33,006 synapses and were synchronized at each calculation step. Finally, we developed software for simultaneous visualization of faster-than-real-time simulation output. These results suggest the potential power of GPGPU techniques in real-time simulation of realistic neural networks. Copyright © 2011 Elsevier Ltd. All rights reserved.
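    The data-parallel structure the authors exploit on the GPU, updating every neuron and every synapse with the same instructions, can be mimicked with vectorized array operations. The sketch below uses simple leaky integrate-and-fire neurons rather than the paper's conductance-based models, and all constants are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 370                                   # neuron count at the scale of the model
w = rng.normal(0.0, 0.3, (n, n)) * (rng.random((n, n)) < 0.08)  # sparse weights
v = np.full(n, -65.0)                     # membrane potentials (mV)
fired = np.zeros(n, dtype=bool)           # spikes from the previous step
dt, tau, v_rest, v_thr, v_reset = 0.1, 10.0, -65.0, -50.0, -65.0

spike_count = 0
for step in range(2000):
    i_syn = w @ fired                     # every synapse in one matrix product
    v += dt / tau * (v_rest - v) + i_syn + rng.normal(0.3, 0.4, n)
    fired = v >= v_thr                    # every neuron tested at once
    spike_count += int(fired.sum())
    v[fired] = v_reset                    # reset fired neurons
```

    On a GPU, each of these array operations maps onto one kernel launch over the neuron or synapse population, which is the data-parallel fashion the abstract refers to.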

  9. Global Aerodynamic Modeling for Stall/Upset Recovery Training Using Efficient Piloted Flight Test Techniques

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Cunningham, Kevin; Hill, Melissa A.

    2013-01-01

    Flight test and modeling techniques were developed for efficiently identifying global aerodynamic models that can be used to accurately simulate stall, upset, and recovery on large transport airplanes. The techniques were developed and validated in a high-fidelity fixed-base flight simulator using a wind-tunnel aerodynamic database, realistic sensor characteristics, and a realistic flight deck representative of a large transport aircraft. Results demonstrated that aerodynamic models for stall, upset, and recovery can be identified rapidly and accurately using relatively simple piloted flight test maneuvers. Stall maneuver predictions and comparisons of identified aerodynamic models with data from the underlying simulation aerodynamic database were used to validate the techniques.

  10. Simulation Evaluation of Controller-Managed Spacing Tools under Realistic Operational Conditions

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Hunt, Sarah M.; Prevot, Thomas

    2014-01-01

    Controller-Managed Spacing (CMS) tools have been developed to aid air traffic controllers in managing high volumes of arriving aircraft according to a schedule while enabling them to fly efficient descent profiles. The CMS tools are undergoing refinement in preparation for field demonstration as part of NASA's Air Traffic Management (ATM) Technology Demonstration-1 (ATD-1). System-level ATD-1 simulations have been conducted to quantify expected efficiency and capacity gains under realistic operational conditions. This paper presents simulation results with a focus on CMS-tool human factors. The results suggest experienced controllers new to the tools find them acceptable and can use them effectively in ATD-1 operations.

  11. Environments for online maritime simulators with cloud computing capabilities

    NASA Astrophysics Data System (ADS)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. We must deal with a multiprocessing situation, using advanced technologies and distributed applications with remote ship scenarios and the automation of ship operations.

  12. Realistic Affective Forecasting: The Role of Personality

    PubMed Central

    Hoerger, Michael; Chapman, Ben; Duberstein, Paul

    2016-01-01

    Affective forecasting often drives decision making. Although affective forecasting research has often focused on identifying sources of error at the event level, the present investigation draws upon the ‘realistic paradigm’ in seeking to identify factors that similarly influence predicted and actual emotions, explaining their concordance across individuals. We hypothesized that the personality traits neuroticism and extraversion would account for variation in both predicted and actual emotional reactions to a wide array of stimuli and events (football games, an election, Valentine’s Day, birthdays, happy/sad film clips, and an intrusive interview). As hypothesized, individuals who were more introverted and neurotic anticipated, correctly, that they would experience relatively more unpleasant emotional reactions, and those who were more extraverted and less neurotic anticipated, correctly, that they would experience relatively more pleasant emotional reactions. Personality explained 30% of the concordance between predicted and actual emotional reactions. Findings suggest three purported personality processes implicated in affective forecasting, highlight the importance of individual-differences research in this domain, and call for more research on realistic affective forecasts. PMID:26212463

  13. Realistic affective forecasting: The role of personality.

    PubMed

    Hoerger, Michael; Chapman, Ben; Duberstein, Paul

    2016-11-01

    Affective forecasting often drives decision-making. Although affective forecasting research has often focused on identifying sources of error at the event level, the present investigation draws upon the "realistic paradigm" in seeking to identify factors that similarly influence predicted and actual emotions, explaining their concordance across individuals. We hypothesised that the personality traits neuroticism and extraversion would account for variation in both predicted and actual emotional reactions to a wide array of stimuli and events (football games, an election, Valentine's Day, birthdays, happy/sad film clips, and an intrusive interview). As hypothesised, individuals who were more introverted and neurotic anticipated, correctly, that they would experience relatively more unpleasant emotional reactions, and those who were more extraverted and less neurotic anticipated, correctly, that they would experience relatively more pleasant emotional reactions. Personality explained 30% of the concordance between predicted and actual emotional reactions. Findings suggest three purported personality processes implicated in affective forecasting, highlight the importance of individual-differences research in this domain, and call for more research on realistic affective forecasts.

  14. Fun!

    ERIC Educational Resources Information Center

    Horne, Thomas

    1988-01-01

    Describes four IBM-compatible flight simulator software packages: (1) "Falcon," air-to-air combat in an F-16 fighter; (2) "Chuck Yeager's Advanced Flight Trainer," test-flying 14 different aircraft; (3) "Jet," air-to-air combat; and (4) "Flight Simulator," a realistic PC flight simulation program. (MVL)

  15. Effects of realistic topography on the ground motion of the Colombian Andes - A case study at the Aburrá Valley, Antioquia

    NASA Astrophysics Data System (ADS)

    Restrepo, Doriam; Bielak, Jacobo; Serrano, Ricardo; Gómez, Juan; Jaramillo, Juan

    2016-03-01

    This paper presents a set of deterministic 3-D ground motion simulations for the greater metropolitan area of Medellín in the Aburrá Valley, an earthquake-prone region of the Colombian Andes that exhibits moderate-to-strong topographic irregularities. We created the velocity model of the Aburrá Valley region (version 1) using the geological structures as a basis for determining the shear wave velocity. The irregular surficial topography is considered by means of a fictitious domain strategy. The simulations cover a 50 × 50 × 25 km3 volume, and four Mw = 5 rupture scenarios along a segment of the Romeral fault, a significant source of seismic activity in Colombia. In order to examine the sensitivity of ground motion to the irregular topography and the 3-D effects of the valley, each earthquake scenario was simulated with three different models: (i) realistic 3-D velocity structure plus realistic topography, (ii) realistic 3-D velocity structure without topography, and (iii) homogeneous half-space with realistic topography. Our results show how surface topography affects the ground response. In particular, our findings highlight the importance of the combined interaction between source-effects, source-directivity, focusing, soft-soil conditions, and 3-D topography. We provide quantitative evidence of this interaction and show that topographic amplification factors can be as high as 500 per cent at some locations. In other areas within the valley, the topographic effects result in relative reductions, but these lie in the 0-150 per cent range.

  16. Assessing the capability of numerical methods to predict earthquake ground motion: the Euroseistest verification and validation project

    NASA Astrophysics Data System (ADS)

    Chaljub, E. O.; Bard, P.; Tsuno, S.; Kristek, J.; Moczo, P.; Franek, P.; Hollender, F.; Manakou, M.; Raptakis, D.; Pitilakis, K.

    2009-12-01

    During the last decades, an important effort has been dedicated to develop accurate and computationally efficient numerical methods to predict earthquake ground motion in heterogeneous 3D media. The progress in methods and increasing capability of computers have made it technically feasible to calculate realistic seismograms for frequencies of interest in seismic design applications. In order to foster the use of numerical simulation in practical prediction, it is important to (1) evaluate the accuracy of current numerical methods when applied to realistic 3D applications where no reference solution exists (verification) and (2) quantify the agreement between recorded and numerically simulated earthquake ground motion (validation). Here we report the results of the Euroseistest verification and validation project - an ongoing international collaborative work organized jointly by the Aristotle University of Thessaloniki, Greece, the Cashima research project (supported by the French nuclear agency, CEA, and the Laue-Langevin institute, ILL, Grenoble), and the Joseph Fourier University, Grenoble, France. The project involves more than 10 international teams from Europe, Japan and USA. The teams employ the Finite Difference Method (FDM), the Finite Element Method (FEM), the Global Pseudospectral Method (GPSM), the Spectral Element Method (SEM) and the Discrete Element Method (DEM). The project makes use of a new detailed 3D model of the Mygdonian basin (about 5 km wide, 15 km long, sediments reach about 400 m depth, surface S-wave velocity is 200 m/s). The prime target is to simulate 8 local earthquakes with magnitude from 3 to 5. In the verification, numerical predictions for frequencies up to 4 Hz for a series of models with increasing structural and rheological complexity are analyzed and compared using quantitative time-frequency goodness-of-fit criteria. 
Predictions obtained by one FDM team and the SEM team are close to each other but differ from the other predictions (consistent with the ESG2006 exercise, which targeted the Grenoble Valley). Diffractions off the basin edges and induced surface-wave propagation mainly contribute to the differences between predictions. The differences are particularly large in the elastic models but remain important in models with attenuation. In the validation, predictions are compared with recordings from a local array of 19 surface and borehole accelerometers. The level of agreement is found to be event-dependent. For the largest-magnitude event the agreement is surprisingly good, even at high frequencies.
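One common ingredient of such time-frequency goodness-of-fit criteria is an envelope comparison between a reference and a candidate seismogram. The sketch below is a simplification of Anderson-style scoring (no time-frequency windowing, no phase-misfit component), mapping a normalized envelope misfit onto a 0-10 scale:

```python
import numpy as np

def envelope(x):
    """Amplitude envelope via the analytic signal (FFT-based Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

def envelope_gof(ref, syn):
    """Map normalized envelope misfit to a 0-10 goodness-of-fit score (10 = perfect)."""
    e_ref, e_syn = envelope(ref), envelope(syn)
    misfit = np.linalg.norm(e_syn - e_ref) / np.linalg.norm(e_ref)
    return 10.0 * np.exp(-misfit)

# Synthetic "recorded" and "simulated" traces: same decaying sine, slightly
# rescaled and phase-shifted, standing in for real seismograms
t = np.linspace(0.0, 10.0, 1001)
ref = np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)
syn = 0.9 * np.sin(2 * np.pi * 1.0 * t + 0.1) * np.exp(-0.3 * t)
score = envelope_gof(ref, syn)
```

A full criterion of this family would evaluate the misfit in each time-frequency cell and score envelope and phase separately; the scalar score above only illustrates the misfit-to-score mapping.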

  17. Impacts of drought on grape yields in Western Cape, South Africa

    NASA Astrophysics Data System (ADS)

    Araujo, Julio A.; Abiodun, Babatunde J.; Crespo, Olivier

    2016-01-01

Droughts remain a threat to grape yields in South Africa. Previous studies on the impacts of climate on grape yield in the country have focussed on the impacts of rainfall and temperature separately, whereas grape yields are affected by drought, which combines the influences of rainfall and temperature. The present study investigates the impacts of drought on grape yields in the Western Cape (South Africa) at district and farm scales. The study used a new drought index that is based on a simple water balance (Standardized Precipitation Evapotranspiration Index; hereafter, SPEI) to identify drought events and used a correlation analysis to identify the relationship between drought and grape yields. A crop simulation model (Agricultural Production Systems sIMulator, APSIM) was applied at the farm scale to investigate the role of irrigation in mitigating the impacts of drought on grape yield. The model gives a realistic simulation of grape yields. The Western Cape has experienced a series of severe droughts in the past few decades. The severe droughts occurred when a decrease in rainfall coincided with an increase in temperature. The El Niño Southern Oscillation (ENSO) appears to be an important driver of drought severity in the Western Cape, because most of the severe droughts occurred in El Niño years. At the district scale, the correlation between drought index and grape yield is weak (r ≈ -0.5), but at the farm scale, it is strong (r ≈ -0.9). This suggests that many farmers are able to mitigate the impacts of drought on grape yields through irrigation management. At the farm scale, where the impact of drought on grape yields is high, poor yield years coincide with moderate or severe drought periods. The APSIM simulation, which gives a realistic simulation of grape yields at the farm scale, suggests that grape yields become more sensitive to spring and summer droughts in the absence of irrigation.
Results of this study may guide decision-making on how to reduce the impacts of drought on food security in South Africa.
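The water-balance index and correlation analysis described above can be sketched in a few lines. Note the simplifications: the real SPEI fits a log-logistic distribution to the balance and aggregates over timescales, whereas this sketch uses a plain z-score, and all series here are synthetic stand-ins for the rainfall, evapotranspiration and yield records:

```python
import numpy as np

rng = np.random.default_rng(0)
rain = rng.gamma(4.0, 20.0, size=30)          # annual rainfall proxy (mm), synthetic
pet = 60.0 + rng.normal(0.0, 5.0, size=30)    # potential evapotranspiration proxy (mm)

# Climatic water balance D = P - PET; a z-score stands in for the SPEI's
# log-logistic standardization (negative index = drier than usual)
d = rain - pet
index = (d - d.mean()) / d.std()

# Synthetic yields that fall in dry years, to illustrate the correlation step
yields = 10.0 + 2.0 * index + rng.normal(0.0, 0.5, size=30)
r = np.corrcoef(index, yields)[0, 1]
```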

  18. Abaqus Simulations of Rock Response to Dynamic Loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steedman, David W.; Coblentz, David

The LANL Geodynamics Team has been applying Abaqus modeling to achieve increasingly complex simulations. Advancements in Abaqus model building and simulation tools allow this progress. We use Lab-developed constitutive models, the fully coupled CEL Abaqus and general contact to simulate the response of realistic sites to explosively driven shock.

  19. Reproducing the observed energy-dependent structure of Earth's electron radiation belts during storm recovery with an event-specific diffusion model

    DOE PAGES

    Ripoll, J. -F.; Reeves, Geoffrey D.; Cunningham, Gregory Scott; ...

    2016-06-11

Here, we present dynamic simulations of energy-dependent losses in the radiation belt “slot region” and the formation of the two-belt structure for the quiet days after the 1 March storm. The simulations combine radial diffusion with a realistic scattering model, based on data-driven, spatially and temporally resolved whistler-mode hiss wave observations from the Van Allen Probes satellites. The simulations reproduce Van Allen Probes observations for all energies and L shells (2–6), including (a) the strong energy dependence of the radiation belt dynamics, (b) an energy-dependent outer boundary to the inner zone that extends to higher L shells at lower energies, and (c) an “S-shaped” energy-dependent inner boundary to the outer zone that results from the competition between diffusive radial transport and losses. We find that the characteristic energy-dependent structure of the radiation belts and slot region is dynamic and can be formed gradually in ~15 days, although the “S shape” can also be reproduced by assuming equilibrium conditions. The highest-energy electrons (E > 300 keV) of the inner region of the outer belt (L ~ 4–5) also constantly decay, demonstrating that hiss wave scattering affects the outer belt during times of extended plasmasphere. Through these simulations, we explain the full structure in energy and L shell of the belts and the slot formation by hiss scattering during storm recovery. We show the power and complexity of looking dynamically at the effects over all energies and L shells and the need for using data-driven and event-specific conditions.
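The competition between radial transport and hiss-driven loss that carves the slot can be caricatured with the standard 1-D radial diffusion equation, ∂f/∂t = L² ∂/∂L(D_LL L⁻² ∂f/∂L) − f/τ. The coefficients below are illustrative placeholders (a steep power-law D_LL, a short loss time in the slot), not the event-specific, data-driven rates used in the study:

```python
import numpy as np

L = np.linspace(2.0, 6.0, 41)                       # L-shell grid
dL = L[1] - L[0]
f = np.exp(-((L - 4.5) / 1.5) ** 2)                 # broad initial phase-space density
D = 1e-3 * (L / 6.0) ** 10                          # illustrative D_LL (1/day), steep in L
tau = np.where((L > 2.5) & (L < 4.0), 2.0, 50.0)    # fast hiss-driven loss in the slot (days)

dt = 0.01                                           # days; small enough for explicit stability
for _ in range(1500):                               # ~15 days of evolution
    g = f / L**2
    flux = 0.5 * (D[1:] + D[:-1]) * (g[1:] - g[:-1]) / dL   # D_LL * d(f/L^2)/dL at interfaces
    div = np.zeros_like(f)
    div[1:-1] = L[1:-1] ** 2 * (flux[1:] - flux[:-1]) / dL  # L^2 d/dL of the flux
    f = f + dt * (div - f / tau)                    # diffusion plus lifetime loss
```

After ~15 days the slot (here L ≈ 2.5-4) is strongly depleted relative to the outer belt, qualitatively matching the gradual slot formation described in the abstract.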

  20. Modelling of NSTX hot vertical displacement events using M 3 D -C 1

    NASA Astrophysics Data System (ADS)

    Pfefferlé, D.; Ferraro, N.; Jardin, S. C.; Krebs, I.; Bhattacharjee, A.

    2018-05-01

The main results of an intense vertical displacement event (VDE) modelling activity using the implicit 3D extended MHD code M3D-C1 are presented. Two nonlinear 3D simulations are performed using realistic transport coefficients based on the reconstruction of a so-called NSTX frozen VDE, where the feedback control was purposely switched off to trigger a vertical instability. The vertical drift phase is solved assuming axisymmetry until the plasma contacts the first wall, at which point the intricate evolution of the plasma, decaying to a large extent in force balance with induced halo/wall currents, is carefully resolved via 3D nonlinear simulations. The faster 2D nonlinear runs make it possible to assess the sensitivity of the simulations to parameter changes. In the limit of a perfectly conducting wall, the expected linear relation between vertical growth rate and wall resistivity is recovered. For intermediate wall resistivities, the halo region contributes to slowing the plasma down, and the characteristic VDE time depends on the choice of halo temperature. The evolution of the current quench and the onset of 3D halo/eddy currents are diagnosed in detail. The 3D simulations highlight a rich structure of toroidal modes, penetrating inwards from edge to core and cascading from high-n to low-n mode numbers. The break-up of flux surfaces results in a progressive stochastisation of field lines, precipitating the thermalisation of the plasma with the wall. The plasma current then decays rapidly, inducing large currents in the halo region and the wall. Analysis of normal currents flowing in and out of the divertor plate reveals rich time-varying patterns.

  1. Mechanisms of Diurnal Precipitation over the United States Great Plains: A Cloud-Resolving Model Simulation

    NASA Technical Reports Server (NTRS)

    Lee, M.-I.; Choi, I.; Tao, W.-K.; Schubert, S. D.; Kang, I.-K.

    2010-01-01

The mechanisms of summertime diurnal precipitation in the US Great Plains were examined with the two-dimensional (2D) Goddard Cumulus Ensemble (GCE) cloud-resolving model (CRM). The model was constrained by the observed large-scale background state and surface flux derived from the Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Program's Intensive Observing Period (IOP) data at the Southern Great Plains (SGP). The model, when continuously forced by realistic surface flux and large-scale advection, simulates reasonably well the temporal evolution of the observed rainfall episodes, particularly for the strongly forced precipitation events. However, the model exhibits a deficiency for the weakly forced events driven by diurnal convection. Additional tests were run with the GCE model in order to discriminate between the mechanisms that determine daytime and nighttime convection. In these tests, the model was constrained with the same repeating diurnal variation in the large-scale advection and/or surface flux. The results indicate that it is primarily the surface heat and moisture flux that is responsible for the development of deep convection in the afternoon, whereas the large-scale upward motion and associated moisture advection play an important role in preconditioning nocturnal convection. In the nighttime, high clouds are continuously built up through their interaction and feedback with long-wave radiation, eventually initiating deep convection from the boundary layer. Without these upper-level destabilization processes, the model tends to produce only daytime convection in response to boundary layer heating.
This study suggests that the correct simulation of the diurnal variation in precipitation requires that the free-atmospheric destabilization mechanisms resolved in the CRM simulation must be adequately parameterized in current general circulation models (GCMs) many of which are overly sensitive to the parameterized boundary layer heating.

  2. Modelling the potential impacts of afforestation on extreme precipitation over West Africa

    NASA Astrophysics Data System (ADS)

    Odoulami, Romaric C.; Abiodun, Babatunde J.; Ajayi, Ayodele E.

    2018-05-01

    This study examines how afforestation in West Africa could influence extreme precipitation over the region, with a focus on widespread extreme rainfall events (WEREs) over the afforestation area. Two regional climate models (RegCM and WRF) were applied to simulate the present-day climate (1971-2000) and future climate (2031-2060, under IPCC RCP 4.5 emission scenario) with and without afforestation of the Savannah zone in West Africa. The models give a realistic simulation of precipitation indices and WEREs over the subcontinent. On average, the regional models projected future decreases in total annual wet day precipitation (PRCPTOT) and total annual daily precipitation greater than or equal to the 95th percentile of daily precipitation threshold (R95pTOT) and increases in maximum number of consecutive dry days (CDD) over Sahel. Over Savannah, the models projected decreases in PRCPTOT but increases in R95pTOT and CDD. Also, an increase in WEREs frequency is projected over west, central and east Savannah, except that RegCM simulated a decrease in WEREs over east Savannah. In general, afforestation increases PRCPTOT and R95pTOT but decreases CDD over the afforestation area. The forest-induced increases in PRCPTOT and decreases in CDD affect all ecological zones in West Africa. However, the simulations show that afforestation of Savannah also decreases R95pTOT over the Guinea Coast. It further increases WEREs over west and central Savannah and decreases them over east Savannah because of the local decrease in R95pTOT. Results of this study suggest that the future changes in characteristics of extreme precipitation events over West Africa are sensitive to the ongoing land modification.
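The precipitation indices named above (PRCPTOT, R95pTOT, CDD) can be computed from a daily series in a few lines. The sketch below uses synthetic daily rainfall and takes the 95th percentile from the same year's wet days, whereas the standard ETCCDI definition uses a fixed base-period percentile:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic daily precipitation (mm/day): ~30% of days wet, gamma-distributed amounts
daily = np.where(rng.random(365) < 0.3, rng.gamma(2.0, 8.0, 365), 0.0)

wet = daily >= 1.0                          # WMO wet-day threshold of 1 mm
prcptot = daily[wet].sum()                  # PRCPTOT: total wet-day precipitation

p95 = np.percentile(daily[wet], 95)         # same-year wet-day percentile (simplified)
r95ptot = daily[daily >= p95].sum()         # R95pTOT: precipitation from very wet days

# CDD: longest run of consecutive dry days
run, cdd = 0, 0
for w in wet:
    run = 0 if w else run + 1
    cdd = max(cdd, run)
```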

  3. Modeling earthquake magnitudes from injection-induced seismicity on rough faults

    NASA Astrophysics Data System (ADS)

    Maurer, J.; Dunham, E. M.; Segall, P.

    2017-12-01

It is an open question whether perturbations to the in-situ stress field due to fluid injection affect the magnitudes of induced earthquakes. It has been suggested that characteristics such as the total injected fluid volume control the size of induced events (e.g., Baisch et al., 2010; Shapiro et al., 2011). On the other hand, Van der Elst et al. (2016) argue that the size distribution of induced earthquakes follows Gutenberg-Richter, the same as tectonic events. Numerical simulations support the idea that ruptures nucleating inside regions with a high shear-to-effective normal stress ratio may not propagate into regions with lower stress (Dieterich et al., 2015; Schmitt et al., 2015); however, these calculations are done on geometrically smooth faults. Fang & Dunham (2013) show that rupture length on geometrically rough faults is variable, but strongly dependent on the background shear/effective normal stress. In this study, we use a 2-D elastodynamic rupture simulator that includes rough fault geometry and off-fault plasticity (Dunham et al., 2011) to simulate earthquake ruptures under realistic conditions. We consider aggregate results for faults with and without stress perturbations due to fluid injection. We model a uniform far-field background stress (with local perturbations around the fault due to geometry), superimpose a poroelastic stress field in the medium due to injection, and compute the effective stress on the fault as inputs to the rupture simulator. Preliminary results indicate that even minor stress perturbations on the fault due to injection can have a significant impact on the resulting distribution of rupture lengths, but individual results are highly dependent on the details of the local stress perturbations on the fault due to geometric roughness.
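The Gutenberg-Richter distribution invoked by Van der Elst et al. is exponential in magnitude above the completeness threshold, which makes both catalog sampling and b-value estimation straightforward. A minimal sketch with a synthetic catalog and the Aki maximum-likelihood estimator (illustrative, not this study's procedure):

```python
import numpy as np

rng = np.random.default_rng(42)
b_true, m_c = 1.0, 2.0                 # b-value and magnitude of completeness
beta = b_true * np.log(10.0)

# Gutenberg-Richter magnitudes: (M - m_c) is exponentially distributed
mags = m_c + rng.exponential(1.0 / beta, size=20000)

# Aki (1965) maximum-likelihood b-value estimate from the sample mean
b_hat = np.log10(np.e) / (mags.mean() - m_c)
```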

  4. Nonlinear Site Response Validation Studies Using KIK-net Strong Motion Data

    NASA Astrophysics Data System (ADS)

    Asimaki, D.; Shi, J.

    2014-12-01

Earthquake simulations are nowadays producing realistic ground motion time-series in the range of engineering design applications. Of particular significance to engineers are simulations of near-field motions and large-magnitude events, for which observations are scarce. With the engineering community slowly adopting the use of simulated ground motions, site response models need to be re-evaluated in terms of their capabilities and limitations to 'translate' the simulated time-series from rock surface output to structural analyses input. In this talk, we evaluate three one-dimensional site response models: linear viscoelastic, equivalent linear and nonlinear. We evaluate the performance of the models by comparing predictions to observations at 30 downhole stations of the Japanese network KiK-net that have recorded several strong events, including the 2011 Tohoku earthquake. Velocity profiles are used as the only input to all models, while additional parameters such as quality factor, density and nonlinear dynamic soil properties are estimated from empirical correlations. We quantify the differences between ground surface predictions and observations in terms of both seismological and engineering intensity measures, including bias ratios of peak ground response and visual comparisons of elastic spectra, and inelastic to elastic deformation ratio for multiple ductility ratios. We observe that PGV/Vs,30, as a measure of strain, is a better predictor of site nonlinearity than PGA, and that incremental nonlinear analyses are necessary to produce reliable estimates of high-frequency ground motion components at soft sites. We finally discuss the implications of our findings on the parameterization of nonlinear amplification factors in GMPEs, and on the extensive use of equivalent linear analyses in probabilistic seismic hazard procedures.
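The PGV/Vs,30 strain proxy lends itself to a one-line screening calculation. The regime thresholds below are order-of-magnitude assumptions chosen for illustration, not values reported in this study:

```python
def shear_strain_proxy(pgv_m_s, vs30_m_s):
    """First-order shear-strain proxy gamma ~ PGV / Vs30 (plane-wave assumption)."""
    return pgv_m_s / vs30_m_s

def response_regime(gamma):
    """Rough screening of site-response regime by strain level.

    Thresholds are illustrative order-of-magnitude assumptions: below ~1e-5
    soil is taken as linear; above ~1e-4 nonlinearity is expected.
    """
    if gamma < 1e-5:
        return "linear"
    if gamma < 1e-4:
        return "equivalent-linear"
    return "nonlinear"

gamma = shear_strain_proxy(0.30, 200.0)   # 30 cm/s PGV at a soft site (Vs30 = 200 m/s)
regime = response_regime(gamma)
```

For the soft-site example, gamma = 1.5e-3, well into the range where incremental nonlinear analysis would be indicated under these assumed thresholds.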

  5. Realistic simulations of a cyclotron spiral inflector within a particle-in-cell framework

    NASA Astrophysics Data System (ADS)

    Winklehner, Daniel; Adelmann, Andreas; Gsell, Achim; Kaman, Tulin; Campo, Daniela

    2017-12-01

We present an upgrade to the particle-in-cell ion beam simulation code opal that enables us to run highly realistic simulations of the spiral inflector system of a compact cyclotron. This upgrade includes a new geometry class and field solver that can handle the complicated boundary conditions posed by the electrode system in the central region of the cyclotron, both in terms of particle termination and calculation of self-fields. Results are benchmarked against the analytical solution of a coasting beam. As a practical example, the spiral inflector and the first revolution in a 1 MeV/amu test cyclotron, located at Best Cyclotron Systems, Inc., are modeled and compared to the simulation results. We find that opal can now handle arbitrary boundary geometries with relative ease. Simulated injection efficiencies and beam shape compare well with measured efficiencies and a preliminary measurement of the beam distribution after injection.

  6. Geometrical force constraint method for vessel and x-ray angiogram simulation.

    PubMed

    Song, Shuang; Yang, Jian; Fan, Jingfan; Cong, Weijian; Ai, Danni; Zhao, Yitian; Wang, Yongtian

    2016-01-01

This study proposes a novel geometrical force constraint method for 3-D vasculature modeling and angiographic image simulation. In this method, a space-filling force, a gravitational force, and a topology-preserving force are proposed and combined to optimize the topology of the vascular structure. A surface-covering force and a surface-adhesion force are constructed to drive the growth of the vasculature on any surface. Through the combined effects of the topological and surface-adhering forces, a realistic vasculature can be effectively simulated on any surface. The image projection of the generated 3-D vascular structures is simulated according to the perspective projection and energy attenuation principles of X-rays. Finally, the simulated projection vasculature is fused with a predefined angiographic mask image to generate a realistic angiogram. The proposed method is evaluated on a CT image and three generally utilized surfaces. The results fully demonstrate the effectiveness and robustness of the proposed method.

  7. Helioseismology of a Realistic Magnetoconvective Sunspot Simulation

    NASA Technical Reports Server (NTRS)

    Braun, D. C.; Birch, A. C.; Rempel, M.; Duvall, T. L., Jr.

    2012-01-01

    We compare helioseismic travel-time shifts measured from a realistic magnetoconvective sunspot simulation using both helioseismic holography and time-distance helioseismology, and measured from real sunspots observed with the Helioseismic and Magnetic Imager instrument on board the Solar Dynamics Observatory and the Michelson Doppler Imager instrument on board the Solar and Heliospheric Observatory. We find remarkable similarities in the travel-time shifts measured between the methodologies applied and between the simulated and real sunspots. Forward modeling of the travel-time shifts using either Born or ray approximation kernels and the sound-speed perturbations present in the simulation indicates major disagreements with the measured travel-time shifts. These findings do not substantially change with the application of a correction for the reduction of wave amplitudes in the simulated and real sunspots. Overall, our findings demonstrate the need for new methods for inferring the subsurface structure of sunspots through helioseismic inversions.

  8. Accelerating population balance-Monte Carlo simulation for coagulation dynamics from the Markov jump model, stochastic algorithm and GPU parallel computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zuwei; Zhao, Haibo, E-mail: klinsmannzhb@163.com; Zheng, Chuguang

    2015-01-15

This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining Markov jump model, weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of particle size distribution with low statistical noise over the full size range and as far as possible to reduce the number of time loopings. Here three coagulation rules are highlighted and it is found that constructing appropriate coagulation rule provides a route to attain the compromise between accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering the two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates being used for acceptance–rejection processes by single-looping over all particles, and meanwhile the mean time-step of coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is reduced greatly to be proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are parallel processed by multi-cores on a GPU that can implement the massively threaded data-parallel tasks to obtain remarkable speedup ratio (comparing with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). 
These accelerating approaches of PBMC are demonstrated in a physically realistic Brownian coagulation case. The computational accuracy is validated with benchmark solution of discrete-sectional method. The simulation results show that the comprehensive approach can attain very favorable improvement in cost without sacrificing computational accuracy.
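The single-loop acceptance-rejection idea behind a majorant kernel can be sketched for the simple sum kernel K(vi, vj) = vi + vj, for which 2·max(v) is a valid upper bound. This is a bare-bones caricature: the paper's differential weighting, time-step estimation and GPU parallelism are all omitted:

```python
import numpy as np

def coagulation_step(v, rng):
    """One acceptance-rejection coagulation event under a majorant kernel.

    Kernel K(vi, vj) = vi + vj (sum kernel); the majorant 2*max(v) bounds K,
    so candidate pairs can be accepted with probability K / K_maj without a
    double loop over all particle pairs.
    """
    k_maj = 2.0 * v.max()
    while True:
        i, j = rng.choice(len(v), size=2, replace=False)
        if rng.random() < (v[i] + v[j]) / k_maj:
            v[i] = v[i] + v[j]          # merge particle j into particle i
            return np.delete(v, j)

rng = np.random.default_rng(7)
v = rng.exponential(1.0, size=200)      # initial particle volumes
m0 = v.sum()                            # total volume, conserved by coagulation
for _ in range(50):
    v = coagulation_step(v, rng)
```

Each event removes one particle while conserving total volume, so after 50 events 150 particles remain with unchanged total mass.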

  9. Simulation of the Burridge-Knopoff model of earthquakes with variable range stress transfer.

    PubMed

    Xia, Junchao; Gould, Harvey; Klein, W; Rundle, J B

    2005-12-09

    Simple models of earthquake faults are important for understanding the mechanisms for their observed behavior, such as Gutenberg-Richter scaling and the relation between large and small events, which is the basis for various forecasting methods. Although cellular automaton models have been studied extensively in the long-range stress transfer limit, this limit has not been studied for the Burridge-Knopoff model, which includes more realistic friction forces and inertia. We find that the latter model with long-range stress transfer exhibits qualitatively different behavior than both the long-range cellular automaton models and the usual Burridge-Knopoff model with nearest-neighbor springs, depending on the nature of the velocity-weakening friction force. These results have important implications for our understanding of earthquakes and other driven dissipative systems.

  10. Applications of acoustic-gravity waves numerical modeling to tsunami signals observed by gravimetry satellites in very low orbit

    NASA Astrophysics Data System (ADS)

    Brissaud, Q.; Garcia, R.; Sladen, A.; Martin, R.; Komatitsch, D.

    2016-12-01

Acoustic and gravity waves propagating in planetary atmospheres have been studied intensively as markers of specific phenomena (tectonic events, explosions) or as contributors to atmosphere dynamics. To get a better understanding of the physics behind these dynamic processes, both acoustic and gravity wave propagation should be modeled in an attenuating and windy 3D atmosphere from the ground all the way to the upper thermosphere. Thus, in order to provide an efficient numerical tool at the regional or global scale, we introduce a high-order finite-difference time domain (FDTD) approach that relies on the linearized compressible Navier-Stokes equations with spatially non-constant physical parameters (density, viscosities and speed of sound) and background velocities (wind). We present applications of these simulations to the propagation of gravity waves generated by tsunamis, for realistic cases in which atmospheric models are extracted from empirical models including variations of atmospheric parameters with altitude, and the tsunami forcing at the ocean surface is extracted from shallow water simulations. We describe the specific difficulties induced by the size of the simulation, the boundary conditions and the spherical geometry, and compare the simulation outputs to data gathered by gravimetric satellites crossing gravity waves generated by tsunamis.
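The FDTD approach can be illustrated in one dimension with the linearized acoustic equations on a staggered grid (uniform medium, no wind, no attenuation, rigid boundaries; all parameters are illustrative, not the study's atmospheric model):

```python
import numpy as np

# 1-D linearized acoustics:  dp/dt = -rho*c^2 * dv/dx,  dv/dt = -(1/rho) * dp/dx
nx, dx, c, rho = 400, 10.0, 340.0, 1.2      # cells, spacing (m), sound speed, density
dt = 0.5 * dx / c                           # CFL-stable time step (Courant number 0.5)
x = np.arange(nx) * dx
p = np.exp(-((x - 2000.0) / 200.0) ** 2)    # Gaussian pressure pulse at domain center
v = np.zeros(nx + 1)                        # velocity on cell faces (staggered)
p_total = p.sum()                           # conserved with rigid-wall boundaries

for _ in range(200):
    v[1:-1] -= dt / (rho * dx) * (p[1:] - p[:-1])   # update interior face velocities
    p -= dt * rho * c**2 / dx * (v[1:] - v[:-1])    # then update cell pressures
```

The initial pulse splits into two counter-propagating halves of roughly half the original amplitude, the 1-D analogue of d'Alembert's solution.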

  11. A pervasive visual-haptic framework for virtual delivery training.

    PubMed

    Abate, Andrea F; Acampora, Giovanni; Loia, Vincenzo; Ricciardi, Stefano; Vasilakos, Athanasios V

    2010-03-01

Thanks to advances in virtual reality (VR) technologies and haptic systems, virtual simulators are increasingly becoming a viable alternative to physical simulators in medicine and surgery, though many challenges still remain. In this study, a pervasive visual-haptic framework aimed at training obstetricians and midwives in vaginal delivery is described. The haptic feedback is provided by means of two hand-based haptic devices able to reproduce force feedback on fingers and arms, thus enabling much more realistic manipulation than stylus-based solutions. The interactive simulation is not solely driven by an approximated model of complex forces and physical constraints but, instead, is approached by formal modeling of the whole labor and of the assistance/intervention procedures, performed by means of a timed automata network and applied to a parametric 3-D model of the anatomy able to mimic a wide range of configurations. This novel methodology is able to represent not only the sequence of the main events associated with either a spontaneous or an operative childbirth process, but also to help in validating the manual intervention, as the actions performed by the user during the simulation are evaluated according to established medical guidelines. A discussion of the first results, as well as of the challenges still unaddressed, is included.

  12. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  13. Simulation of Malaria Transmission among Households in a Thai Village using Remotely Sensed Parameters

    NASA Technical Reports Server (NTRS)

    Kiang, Richard K.; Adimi, Farida; Zollner, Gabriela E.; Coleman, Russell E.

    2007-01-01

    We have used discrete-event simulation to model the malaria transmission in a Thailand village with approximately 700 residents. Specifically, we model the detailed interactions among the vector life cycle, sporogonic cycle and human infection cycle under the explicit influences of selected extrinsic and intrinsic factors. Some of the meteorological and environmental parameters used in the simulation are derived from Tropical Rainfall Measuring Mission and the Ikonos satellite data. Parameters used in the simulations reflect the realistic condition of the village, including the locations and sizes of the households, ages and estimated immunity of the residents, presence of farm animals, and locations of larval habitats. Larval habitats include the actual locations where larvae were collected and the probable locations based on satellite data. The output of the simulation includes the individual infection status and the quantities normally observed in field studies, such as mosquito biting rates, sporozoite infection rates, gametocyte prevalence and incidence. Simulated transmission under homogeneous environmental condition was compared with that predicted by a SEIR model. Sensitivity of the output with respect to some extrinsic and intrinsic factors was investigated. Results were compared with mosquito vector and human malaria data acquired over 4.5 years (June 1999 - January 2004) in Kong Mong Tha, a remote village in Kanchanaburi Province, western Thailand. The simulation method is useful for testing transmission hypotheses, estimating the efficacy of insecticide applications, assessing the impacts of nonimmune immigrants, and predicting the effects of socioeconomic, environmental and climatic changes.
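For comparison, the SEIR compartmental model mentioned above can be integrated with a few lines of explicit time stepping. The parameters below are illustrative placeholders, not values calibrated to Kong Mong Tha:

```python
# Simple SEIR model with explicit Euler time stepping (illustrative rates)
beta, sigma, gamma = 0.3, 1 / 10.0, 1 / 20.0   # transmission, incubation, recovery (1/day)
s, e, i, r = 699 / 700, 1 / 700, 0.0, 0.0      # ~700 residents, one exposed case

dt, days = 0.1, 365
for _ in range(int(days / dt)):
    d_se = beta * s * i            # new exposures
    d_ei = sigma * e               # exposed becoming infectious
    d_ir = gamma * i               # recoveries
    s += -dt * d_se
    e += dt * (d_se - d_ei)
    i += dt * (d_ei - d_ir)
    r += dt * d_ir
```

Because the flow terms cancel pairwise, the compartments sum to 1 throughout, a useful sanity check for any compartmental integration; the discrete-event simulation in the abstract replaces these aggregate rates with explicit per-household and per-mosquito events.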

  14. Ambient-Light Simulator For Testing Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Batson, Vernon M.; Gupton, Lawrence E.

    1995-01-01

    Apparatus provides illumination from outside, through windows and into interior of simulated airplane cockpit. Simulates sunlight, darkness, or lightning on demand. Ambient-lighting simulator surrounds forward section of simulated airplane. Provides control over intensity, color, and diffuseness of solar illumination and of position of Sun relative to airplane. Used to evaluate aircraft-instrumentation display devices under realistic lighting conditions.

  15. Reducing the biases in simulated polar climate by incorporating realistic surface spectral emissivity into the global climate model

    NASA Astrophysics Data System (ADS)

    Huang, X.; Chen, X.; Flanner, M.; Yang, P.; Feldman, D.; Kuo, C.

    2017-12-01

Surface longwave emissivity can be less than unity and can vary significantly with frequency. The emissivities of water, ice, and bare land all exhibit different spectral dependence, in both the far-IR and mid-IR bands. However, most climate models still assume a blackbody surface in the longwave (LW) radiation scheme of their atmospheric modules. This study incorporates realistic surface spectral emissivity into RRTMG_LW, the LW radiation scheme in CAM, which is the atmospheric component of the NCAR Community Earth System Model (CESM) version 1.1.1. We then evaluate its impact on the simulated climatology, especially for the polar regions. By ensuring the consistency of the broadband longwave flux across different modules of the CESM, the TOA energy balance in the simulation can be attained without additional tuning of the model. While the impact on global mean surface temperature is small, the surface temperature differences in the polar regions are statistically significant. The mean surface temperature in the Arctic in the modified CESM is 1.5 K warmer than in the standard CESM, reducing the cold bias that the standard CESM has with respect to observations. Accordingly, the sea ice fraction in the modified CESM simulation is less than that in the standard CESM simulation by as much as 0.1, which significantly reduces the positive biases in the sea ice coverage simulated by the CESM. The largest sea-ice coverage difference occurs in August and September, when new sea ice starts to form. Similar changes can be seen in the simulated Antarctic surface climate as well. In a nutshell, incorporating realistic surface spectral emissivity helps improve the fidelity of the simulated surface energy budget in the polar regions, which leads to a better simulation of surface temperature and sea ice coverage.
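The sign and rough size of the bias from assuming a blackbody surface can be estimated from the Stefan-Boltzmann law alone. The emissivity below is an illustrative broadband value, not CESM's spectral treatment:

```python
SIGMA = 5.670374419e-8          # Stefan-Boltzmann constant, W m^-2 K^-4

eps_ice, t_surf = 0.97, 250.0   # illustrative broadband ice emissivity, surface temp (K)

# Upward longwave flux with and without the blackbody assumption
f_real = eps_ice * SIGMA * t_surf**4
f_blackbody = SIGMA * t_surf**4

# Temperature a blackbody surface would need to emit f_real: gives the sign
# and rough magnitude of the bias introduced by assuming unit emissivity
t_equiv = (f_real / SIGMA) ** 0.25
bias = t_surf - t_equiv
```

For these numbers the blackbody assumption overstates surface emission, equivalent to a roughly 2 K error, consistent in order of magnitude with the ~1.5 K Arctic difference reported above.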

  16. XCAT/DRASIM: a realistic CT/human-model simulation package

    NASA Astrophysics Data System (ADS)

    Fung, George S. K.; Stierstorfer, Karl; Segars, W. Paul; Taguchi, Katsuyuki; Flohr, Thomas G.; Tsui, Benjamin M. W.

    2011-03-01

    The aim of this research is to develop a complete CT/human-model simulation package by integrating the 4D eXtended CArdiac-Torso (XCAT) phantom, a computer-generated NURBS-surface-based phantom that provides a realistic model of human anatomy and respiratory and cardiac motions, and the DRASIM (Siemens Healthcare) CT-data simulation program. Unlike other CT simulation tools, which are based on simple mathematical primitives or voxelized phantoms, this new simulation package has the advantages of utilizing a realistic model of human anatomy and physiological motions without voxelization and with accurate modeling of the characteristics of clinical Siemens CT systems. First, we incorporated the 4D XCAT anatomy and motion models into DRASIM by implementing a new library which consists of functions to read in the NURBS surfaces of anatomical objects and their overlapping order and material properties in the XCAT phantom. Second, we incorporated an efficient ray-tracing algorithm for line integral calculation in DRASIM by computing the intersection points of the rays cast from the x-ray source to the detector elements through the NURBS surfaces of the multiple XCAT anatomical objects along the ray paths. Third, we evaluated the integrated simulation package by performing a number of sample simulations of multiple x-ray projections from different views followed by image reconstruction. The initial simulation results were found to be promising by qualitative evaluation. In conclusion, we have developed a unique CT/human-model simulation package which has great potential as a tool in the design and optimization of CT scanners, and in the development of scanning protocols and image reconstruction methods for improving CT image quality and reducing radiation dose.
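    For a monoenergetic ray, the line-integral step described above reduces to summing μ·length over the segments cut out by the surface intersection points and applying Beer-Lambert attenuation. A minimal sketch (the segment representation and the coefficient value are hypothetical, not the DRASIM interface):

```python
import math

def attenuation_line_integral(segments):
    """Sum of mu * path-length over the segments a ray traverses.

    segments: (t_enter, t_exit, mu) tuples from the ray/surface
    intersection tests, assumed sorted and non-overlapping.
    """
    total = 0.0
    for t_in, t_out, mu in segments:
        if t_out > t_in:
            total += mu * (t_out - t_in)
    return total

def detected_intensity(i0, segments):
    """Beer-Lambert attenuation of a monoenergetic ray (no scatter model)."""
    return i0 * math.exp(-attenuation_line_integral(segments))

# A ray crossing 10 cm of soft tissue at mu = 0.2 cm^-1 (illustrative value).
mu_integral = attenuation_line_integral([(0.0, 10.0, 0.2)])
```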

  17. Detection of different reconnection regions from kinetic simulations during island coalescence after asymmetric magnetic reconnection

    NASA Astrophysics Data System (ADS)

    Cazzola, Emanuele; Berchem, Jean; Innocenti, Maria Elena; Goldman, Martin V.; Newman, David L.; Zhou, Meng; Lapenta, Giovanni

    2017-04-01

    In this work we present new results from fully kinetic simulations of the dynamics of magnetic island coalescence after asymmetric magnetic reconnection. In a previous work, we showed that three different reconnection regions can be identified when a new frame of reference based on the local magnetic field is adopted. These regions were marked as X, D, or M according to whether they describe, respectively, a traditional X-line event, an event between two diverging islands, or an event between two merging islands [1, 2]. The results shown here extend the previous analysis to a more realistic regime, including a remarkable temperature transition across the current sheet. In particular, regions X, D, and M are also observed within this new regime, while featuring new interesting characteristics. Special attention is given to the particles' agyrotropic and anisotropic behavior as fundamental signatures for the detection of these regions with satellites. These results are timely for the ongoing MMS mission, whose data from magnetopause crossings are presently being analyzed. In fact, the data revealed that intense flux-rope activity takes place in this region of the magnetosphere, which makes the presence of this set of reconnection regions highly expected. [1] Cazzola, E., et al. "On the electron dynamics during island coalescence in asymmetric magnetic reconnection." Physics of Plasmas (1994-present) 22.9 (2015): 092901. [2] Cazzola, E., et al. "On the electron agyrotropy during rapid asymmetric magnetic island coalescence in presence of a guide field." Geophysical Research Letters 43.15 (2016): 7840-7849.

  18. Effects of sea water on elongated duration of ground motion as well as variation in its amplitude for offshore earthquakes

    NASA Astrophysics Data System (ADS)

    Todoriki, Masaru; Furumura, Takashi; Maeda, Takuto

    2017-01-01

    We investigated the effects of sea water on the propagation of seismic waves using a 3-D finite-difference-method simulation of seismic wave propagation following offshore earthquakes. When using a 1-D layered structure, the simulation results showed strong S- to P-wave conversion at the sea bottom; accordingly, S-wave energy was dramatically decreased by the sea water layer. This de-amplification effect of the sea water had a strong frequency dependence, resembling a low-pass filter whose cut-off frequency and damping coefficients are set by the thickness of the sea water layer. The sea water also acted to elongate the duration of the Rayleigh wave packet. The importance of the sea water layer in modelling offshore earthquakes was further demonstrated by a simulation using a realistic 3-D velocity structure model with and without sea water for a shallow (h = 14 km) outer-rise Nankai Trough event, the 2004 SE Off Kii Peninsula earthquake (Mw = 7.2). Synthetic seismograms generated by the model with sea water included agreed with the observed seismograms for the later, longer-period motions, particularly the Rayleigh waves.
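    The low-pass analogy can be sketched with a first-order recursive filter; the quarter-wavelength cutoff estimate f_c ≈ V_p/(4h) used below is an illustrative assumption, not a relation derived in the paper:

```python
import math

def lowpass(signal, dt, fc):
    """First-order recursive low-pass filter (exponential smoothing)."""
    alpha = dt / (dt + 1.0 / (2.0 * math.pi * fc))
    out, y = [], 0.0
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

def water_layer_cutoff(vp_water=1500.0, thickness_m=4000.0):
    """Illustrative quarter-wavelength cutoff estimate, f_c = Vp / (4 h)."""
    return vp_water / (4.0 * thickness_m)

def peak(signal):
    """Peak amplitude over the second half of a trace (after settling)."""
    return max(abs(v) for v in signal[len(signal) // 2:])

dt = 0.01
t = [i * dt for i in range(4000)]
low = [math.sin(2.0 * math.pi * 0.05 * ti) for ti in t]   # below the cutoff
high = [math.sin(2.0 * math.pi * 5.0 * ti) for ti in t]   # above the cutoff
fc = water_layer_cutoff()  # about 0.094 Hz for a 4 km water column
```

    Passing both traces through the filter shows the frequency dependence the abstract describes: the 5 Hz trace is strongly attenuated while the 0.05 Hz trace passes nearly unchanged.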

  19. Passive scalar transport to and from the surface of a Pocillopora coral colony

    NASA Astrophysics Data System (ADS)

    Hossain, Md Monir; Staples, Anne

    2016-11-01

    Three-dimensional simulations of flow through a single Pocillopora coral colony were performed to examine the interaction between the flow conditions and scalar transport near a coral colony. With corals currently undergoing a third global bleaching event, a fuller understanding of the transport of nutrients, weak temperature gradients, and other passive scalars to and from the coral polyp tissue is more important than ever. The complex geometry of a coral colony poses a significant challenge for numerical simulation. To simplify grid generation and minimize computational cost, the immersed boundary method was implemented. Large eddy simulation was chosen as the framework to capture the turbulent flow field in the range of realistic Reynolds numbers of 5,000 to 30,000 and turbulent Schmidt numbers of up to 1,000. Both uniform and oscillatory flows through the colony were investigated. Significant differences were found between the cases when the scalar originated at the edge of the flow domain and was transported into the colony, versus when the scalar originated on the surface of the colony and was transported away from the coral. The domain-to-colony transport rates were found to be orders of magnitude higher than the colony-to-domain rates.

  20. A Multi-Agent Approach to the Simulation of Robotized Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Foit, K.; Gwiazda, A.; Banaś, W.

    2016-08-01

    Recent years of eventful industrial development have brought many competing products addressed to the same market segment, and shortening the development cycle has become a necessity for any company that wants to remain competitive. With the switch to the Intelligent Manufacturing model, industry is searching for new scheduling algorithms, as the traditional ones no longer meet current requirements. The agent-based approach has been considered by many researchers an important direction in the evolution of modern manufacturing systems. Owing to the properties of multi-agent systems, this methodology is very helpful in creating models of production systems, allowing both the processing and the informational parts to be depicted. The complexity of such an approach makes analysis impossible without computer assistance. Computer simulation still uses a mathematical model to recreate a real situation, but nowadays 2D or 3D virtual environments, or even virtual reality, are used to illustrate the considered systems realistically. This paper focuses on robotized manufacturing systems and presents one possible approach to their simulation. The selection of the multi-agent approach is motivated by the flexibility of this solution, which offers modularity, robustness, and autonomy.
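    A contract-net style negotiation, in which machine agents bid their expected completion time for each announced job, is one common way such multi-agent scheduling is realized; the sketch below is a generic illustration, not the authors' system:

```python
class MachineAgent:
    """Toy machine agent that bids its expected job-completion time."""

    def __init__(self, name, speed):
        self.name = name
        self.speed = speed
        self.busy_until = 0.0

    def bid(self, work):
        return self.busy_until + work / self.speed

    def award(self, work):
        self.busy_until = self.bid(work)

def contract_net(jobs, agents):
    """Assign each announced job to the agent with the lowest bid."""
    schedule = []
    for job_id, work in jobs:
        winner = min(agents, key=lambda a: a.bid(work))
        schedule.append((job_id, winner.name, winner.bid(work)))
        winner.award(work)
    return schedule

agents = [MachineAgent("robot_A", 2.0), MachineAgent("robot_B", 1.0)]
plan = contract_net([("J1", 4.0), ("J2", 4.0), ("J3", 2.0)], agents)
```

    The autonomy the abstract mentions shows up here in that each agent prices its own workload; adding or removing a machine changes nothing in the negotiation protocol.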

  1. Simulation Test of a Head-Worn Display with Ambient Vision Display for Unusual Attitude Recovery

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis (Trey) J., III; Nicholas, Stephanie N.; Shelton, Kevin J.; Ballard, Kathryn; Prinzel, Lawrence J., III; Ellis, Kyle E.; Bailey, Randall E.; Williams, Steven P.

    2017-01-01

    Head-Worn Displays (HWDs) are envisioned as a possible equivalent to a Head-Up Display (HUD) in commercial and general aviation. A simulation experiment was conducted to evaluate whether the HWD can provide a level of performance equivalent to or better than a HUD in terms of unusual attitude recognition and recovery. A prototype HWD with an ambient vision capability was tested, with that capability varied (on/off) as an independent variable in the experiment testing for attitude awareness. The simulation experiment was conducted in two parts: 1) short unusual attitude recovery scenarios in which the aircraft was placed in an unusual attitude and a single-pilot crew recovered the aircraft; and 2) a two-pilot crew operating in a realistic flight environment with "off-nominal" events to induce unusual attitudes. The data showed few differences in unusual attitude recognition and recovery performance between the tested head-down, head-up, and head-worn display concepts. The effect of the presence or absence of ambient vision stimulation was inconclusive. The ergonomic influences of the head-worn display, necessary to implement the ambient vision experimentation, may have influenced the pilot ratings and acceptance of the concepts.

  2. How to test the threat-simulation theory.

    PubMed

    Revonsuo, Antti; Valli, Katja

    2008-12-01

    Malcolm-Smith, Solms, Turnbull and Tredoux [Malcolm-Smith, S., Solms, M.,Turnbull, O., & Tredoux, C. (2008). Threat in dreams: An adaptation? Consciousness and Cognition, 17, 1281-1291.] have made an attempt to test the Threat-Simulation Theory (TST), a theory offering an evolutionary psychological explanation for the function of dreaming [Revonsuo, A. (2000a). The reinterpretation of dreams: An evolutionary hypothesis of the function of dreaming. Behavioral and Brain Sciences, 23(6), 877-901]. Malcolm-Smith et al. argue that empirical evidence from their own study as well as from some other studies in the literature does not support the main predictions of the TST: that threatening events are frequent and overrepresented in dreams, that exposure to real threats activates the threat-simulation system, and that dream threats contain realistic rehearsals of threat avoidance responses. Other studies, including our own, have come up with results and conclusions that are in conflict with those of Malcolm-Smith et al. In this commentary, we provide an analysis of the sources of these disagreements, and their implications to the TST. Much of the disagreement seems to stem from differing interpretations of the theory and, consequently, of differing methods to test it.

  3. Simulations of Solar Jets

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2017-02-01

    Formation of a coronal jet from twisted field lines that have reconnected with the ambient field; the colors show the radial velocity of the plasma. [Adapted from Szente et al. 2017] How do jets emitted from the Sun's surface contribute to its corona and to the solar wind? In a recent study, a team of scientists performed complex three-dimensional simulations of coronal jets to answer these questions. Small Explosions: Coronal jets are relatively small eruptions from the Sun's surface, with heights of roughly 100 to 10,000 km, speeds of 10 to 1,000 km/s, and lifetimes of a few minutes to around ten hours. These jets are constantly present; they're emitted even from the quiet Sun, when activity is otherwise low, and we've observed them with a fleet of Sun-watching space telescopes spanning the visible, extreme ultraviolet (EUV), and X-ray wavelength bands. A comparison of simulated observations based on the authors' model (left panels) to actual EUV and X-ray observations of jets (right panels). [Szente et al. 2017] Due to their ubiquity, we can speculate that these jets might contribute to heating the global solar corona (which is significantly hotter than the surface below it, a puzzle known as the coronal heating problem). We can also wonder what role these jets might play in driving the overall solar wind. Launching a Jet: Led by Judit Szente (University of Michigan), a team of scientists has explored the impact of coronal jets on the global corona and solar wind with a series of numerical simulations. Szente and collaborators used three-dimensional, magnetohydrodynamic simulations that provide realistic treatment of the solar atmosphere, the solar wind acceleration, and the complexities of heat transfer throughout the corona. In the authors' simulations, a jet is initiated as a magnetic dipole rotates at the solar surface, winding up field lines.
    Magnetic reconnection between the twisted lines and the background field then launches the jet from the dense and hot solar chromosphere, and erupting plasma is released outward into the solar corona. A second comparison of simulated observations based on the authors' model (left panels) to actual EUV observations of jets (right panels). [Szente et al. 2017] Global Influences: After demonstrating that their models could successfully lead to jet production and propagation, Szente and collaborators compared their results to actual observations of solar jets. The authors constructed simulated EUV and X-ray observations of their modeled events, and they verified that the behavior and structures in these simulated observations were very similar to real observations of coronal jet events from telescopes like SDO/AIA and Hinode. With this confirmed, the authors then used their models to determine how the jets influence the global solar corona and the solar wind. They found that the large-scale corona is significantly affected by the plasma waves from the jet, which travel across 40° in latitude and out to 24 solar radii. In spite of this, the simulated jets contributed only a few percent of the steady-state solar-wind energy outflow. These simulations represent an important step in realistic modeling of the quiet Sun. Because the models make specific predictions about temperature and density gradients within the corona, we can look forward to testing them with upcoming missions like Solar Probe Plus, which should be able to explore the Sun all the way down to nine solar radii. Citation: J. Szente et al 2017 ApJ 834 123. doi:10.3847/1538-4357/834/2/123

  4. Insights from Synthetic Star-forming Regions. I. Reliable Mock Observations from SPH Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koepferl, Christine M.; Robitaille, Thomas P.; Biscani, Francesco

    Through synthetic observations of a hydrodynamical simulation of an evolving star-forming region, we assess how the choice of observational techniques affects the measurements of properties that trace star formation. Testing and calibrating observational measurements requires synthetic observations that are as realistic as possible. In this part of the series (Paper I), we explore different techniques for mapping the distributions of densities and temperatures from the particle-based simulations onto a Voronoi mesh suitable for radiative transfer and consequently explore their accuracy. We further test different ways to set up the radiative transfer in order to produce realistic synthetic observations. We give a more detailed description of all methods and ultimately recommend techniques. We have found that the flux around 20 μm is strongly overestimated when blindly coupling the dust radiative transfer temperature with the hydrodynamical gas temperature. We find that when instead assuming a constant background dust temperature in addition to the radiative transfer heating, the recovered flux is consistent with actual observations. We present around 5800 realistic synthetic observations for Spitzer and Herschel bands, at different evolutionary time-steps, distances, and orientations. In the upcoming papers of this series (Papers II, III, and IV), we will test and calibrate measurements of the star formation rate, gas mass, and the star formation efficiency using our realistic synthetic observations.
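    The particle-to-mesh mapping step can be illustrated with a nearest-site assignment, which is equivalent to binning particles by the Voronoi cell of each mesh site (a generic sketch, not the paper's specific mapping technique):

```python
def assign_to_sites(particles, sites):
    """Accumulate particle mass onto the nearest mesh site.

    Nearest-site assignment is exactly binning by the Voronoi cell of
    each site. particles: (x, y, z, mass) tuples; sites: (x, y, z).
    """
    mass = [0.0] * len(sites)
    for px, py, pz, m in particles:
        d2 = [(px - sx) ** 2 + (py - sy) ** 2 + (pz - sz) ** 2
              for sx, sy, sz in sites]
        mass[d2.index(min(d2))] += m
    return mass
```

    Dividing each accumulated mass by its cell volume would give the density field a radiative transfer code consumes; mass is conserved by construction.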

  5. A generic framework to simulate realistic lung, liver and renal pathologies in CT imaging

    NASA Astrophysics Data System (ADS)

    Solomon, Justin; Samei, Ehsan

    2014-11-01

    Realistic three-dimensional (3D) mathematical models of subtle lesions are essential for many computed tomography (CT) studies focused on performance evaluation and optimization. In this paper, we develop a generic mathematical framework that describes the 3D size, shape, contrast, and contrast-profile characteristics of a lesion, as well as a method to create lesion models based on CT data of real lesions. Further, we implemented a technique to insert the lesion models into CT images in order to create hybrid CT datasets. This framework was used to create a library of realistic lesion models and corresponding hybrid CT images. The goodness of fit of the models was assessed using the coefficient of determination (R2) and the visual appearance of the hybrid images was assessed with an observer study using images of both real and simulated lesions and receiver operator characteristic (ROC) analysis. The average R2 of the lesion models was 0.80, implying that the models provide a good fit to real lesion data. The area under the ROC curve was 0.55, implying that the observers could not readily distinguish between real and simulated lesions. Therefore, we conclude that the lesion-modeling framework presented in this paper can be used to create realistic lesion models and hybrid CT images. These models could be instrumental in performance evaluation and optimization of novel CT systems.
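    The two figures of merit used in this evaluation are standard and easy to reproduce: R² measures how well a lesion model fits the CT data of a real lesion, and the rank-sum estimate of the ROC area measures whether observers can separate real from simulated lesions (0.5 means indistinguishable). A minimal sketch with generic inputs:

```python
def r_squared(observed, predicted):
    """Coefficient of determination, R^2 = 1 - SS_res / SS_tot."""
    mean = sum(observed) / len(observed)
    ss_tot = sum((o - mean) ** 2 for o in observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    return 1.0 - ss_res / ss_tot

def roc_auc(real_scores, simulated_scores):
    """Rank-sum estimate of the ROC area: the probability that a rating
    given to a real lesion exceeds one given to a simulated lesion
    (ties count one half). 0.5 means the classes are indistinguishable."""
    wins = 0.0
    for r in real_scores:
        for s in simulated_scores:
            wins += 1.0 if r > s else 0.5 if r == s else 0.0
    return wins / (len(real_scores) * len(simulated_scores))
```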

  6. Identification of genomic indels and structural variations using split reads

    PubMed Central

    2011-01-01

    Background Recent studies have demonstrated the genetic significance of insertions, deletions, and other more complex structural variants (SVs) in the human population. With the development of the next-generation sequencing technologies, high-throughput surveys of SVs on the whole-genome level have become possible. Here we present split-read identification, calibrated (SRiC), a sequence-based method for SV detection. Results We start by mapping each read to the reference genome in standard fashion using gapped alignment. Then to identify SVs, we score each of the many initial mappings with an assessment strategy designed to take into account both sequencing and alignment errors (e.g., scoring events gapped in the center of a read more highly). All current SV calling methods have multilevel biases in their identifications due to both experimental and computational limitations (e.g., calling more deletions than insertions). A key aspect of our approach is that we calibrate all our calls against synthetic data sets generated from simulations of high-throughput sequencing (with realistic error models). This allows us to calculate sensitivity and the positive predictive value under different parameter-value scenarios and for different classes of events (e.g., long deletions vs. short insertions). We run our calculations on representative data from the 1000 Genomes Project. Coupling the observed numbers of events on chromosome 1 with the calibrations gleaned from the simulations (for different length events) allows us to construct a relatively unbiased estimate for the total number of SVs in the human genome across a wide range of length scales. We estimate in particular that an individual genome contains ~670,000 indels/SVs.
Conclusions Compared with the existing read-depth and read-pair approaches for SV identification, our method can pinpoint the exact breakpoints of SV events, reveal the actual sequence content of insertions, and cover the whole size spectrum for deletions. Moreover, with the advent of the third-generation sequencing technologies that produce longer reads, we expect our method to be even more useful. PMID:21787423
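    The calibration idea reduces to comparing calls against a simulated truth set per event class, then using the resulting sensitivity and positive predictive value to correct the raw call count. A schematic sketch (not the SRiC code):

```python
def calibrate(true_events, called_events):
    """Sensitivity and positive predictive value against a simulated truth set."""
    tp = len(true_events & called_events)
    sensitivity = tp / len(true_events)
    ppv = tp / len(called_events)
    return sensitivity, ppv

def corrected_total(raw_calls, sensitivity, ppv):
    """Bias-corrected event count: strip the expected false positives,
    then scale up for the events the caller is expected to miss."""
    return raw_calls * ppv / sensitivity
```

    Running the correction separately per length bin, as the abstract describes, removes the class-dependent biases (e.g. more deletions called than insertions).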

  7. Modeling of shallow and inefficient convection in the outer layers of the Sun using realistic physics

    NASA Technical Reports Server (NTRS)

    Kim, Yong-Cheol; Fox, Peter A.; Sofia, Sabatino; Demarque, Pierre

    1995-01-01

    In an attempt to understand the properties of convective energy transport in the solar convection zone, a numerical model has been constructed for turbulent flows in a compressible, radiation-coupled, nonmagnetic, gravitationally stratified medium using a realistic equation of state and realistic opacities. The time-dependent, three-dimensional hydrodynamic equations are solved with minimal simplifications. The statistical information obtained from the present simulation provides an improved understanding of solar photospheric convection. The characteristics of solar convection in shallow regions are parameterized and compared with the results of Chan & Sofia's (1989) simulations of deep and efficient convection. We assess the importance of the zones of partial ionization in the simulation and confirm that the radiative energy transfer is negligible throughout the region except in the uppermost scale heights of the convection zone, a region of very high superadiabaticity. When the effects of partial ionization are included, the dynamics of the flows are altered significantly. However, we confirm the Chan & Sofia result that the kinetic energy flux is nonnegligible and can have a negative value in the convection zone.

  8. Crystallographic Lattice Boltzmann Method

    PubMed Central

    Namburi, Manjusha; Krithivasan, Siddharth; Ansumali, Santosh

    2016-01-01

    Current approaches to Direct Numerical Simulation (DNS) are computationally quite expensive for most realistic scientific and engineering applications of fluid dynamics, such as automobiles or atmospheric flows. The Lattice Boltzmann Method (LBM), with its simplified kinetic description, has emerged as an important tool for simulating hydrodynamics. In a heterogeneous computing environment, it is often preferred due to its flexibility and better parallel scaling. However, direct simulation of realistic applications, without the use of turbulence models, remains a distant dream even with highly efficient methods such as LBM. In LBM, a fictitious lattice with suitable isotropy in velocity space is considered to recover Navier-Stokes hydrodynamics in the macroscopic limit. The same lattice is mapped onto a Cartesian grid for spatial discretization of the kinetic equation. In this paper, we present an inverted argument of the LBM, making spatial discretization the central theme. We argue that the optimal spatial discretization for LBM is a Body Centered Cubic (BCC) arrangement of grid points. We illustrate an order-of-magnitude gain in efficiency for LBM and thus significant progress towards the feasibility of DNS for realistic flows. PMID:27251098
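    A BCC point set is simply a cubic lattice plus a copy shifted by half the lattice constant in each direction, which is what gives it its tighter nearest-neighbor spacing (a·√3/2 instead of a); a minimal sketch:

```python
def bcc_points(n, a=1.0):
    """Grid points of a body-centered cubic lattice inside an n*a cube:
    a simple cubic lattice plus a copy shifted by (a/2, a/2, a/2)."""
    points = []
    for i in range(n):
        for j in range(n):
            for k in range(n):
                points.append((i * a, j * a, k * a))
                points.append(((i + 0.5) * a, (j + 0.5) * a, (k + 0.5) * a))
    return points
```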

  9. Primary combination of phase-field and discrete dislocation dynamics methods for investigating athermal plastic deformation in various realistic Ni-base single crystal superalloy microstructures

    NASA Astrophysics Data System (ADS)

    Gao, Siwen; Rajendran, Mohan Kumar; Fivel, Marc; Ma, Anxin; Shchyglo, Oleg; Hartmaier, Alexander; Steinbach, Ingo

    2015-10-01

    Three-dimensional discrete dislocation dynamics (DDD) simulations in combination with the phase-field method are performed to investigate the influence of different realistic Ni-base single crystal superalloy microstructures with the same volume fraction of γ′ precipitates on plastic deformation at room temperature. The phase-field method is used to generate realistic microstructures as the boundary conditions for DDD simulations in which a constant high uniaxial tensile load is applied along different crystallographic directions. In addition, the lattice mismatch between the γ and γ′ phases is taken into account as a source of internal stresses. Due to the high antiphase boundary energy and the rare formation of superdislocations, precipitate cutting is not observed in the present simulations. Therefore, the plastic deformation is mainly caused by dislocation motion in γ matrix channels. From a comparison of the macroscopic mechanical response and the dislocation evolution for different microstructures in each loading direction, we found that, for a given γ′ phase volume fraction, the optimal microstructure should possess narrow and homogeneous γ matrix channels.

  10. Radiation-Spray Coupling for Realistic Flow Configurations

    NASA Technical Reports Server (NTRS)

    El-Asrag, Hossam; Iannetti, Anthony C.

    2011-01-01

    Three Large Eddy Simulations (LES) of a lean-direct-injection (LDI) combustor are performed and compared. In addition to the cold flow simulation, the effect of radiation coupling with the multi-physics reactive flow is analyzed. The flamelet progress variable approach is used as a subgrid combustion model, combined with a stochastic subgrid model for spray atomization and an optically thin radiation model. For accurate chemistry modeling, a detailed Jet-A surrogate mechanism is utilized. To achieve realistic inflow, a simple recycling technique is performed at the inflow section upstream of the swirler. Good comparison is shown with the experimental mean and root-mean-square profiles. The effect of combustion is found to change the shape and size of the central recirculation zone. Radiation is found to change the spray dynamics and atomization by changing the heat release distribution and the local temperature values impacting the evaporation process. The simulation with radiation modeling shows a wider range of droplet size distribution by altering the evaporation rate. The current study demonstrates the importance of radiation modeling for accurate prediction in realistic spray combustion configurations, even for low-pressure systems.
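    An optically thin radiation model treats every fluid parcel as radiating freely to a background with no reabsorption, so the volumetric loss scales with the Planck-mean absorption coefficient and T⁴. A generic sketch (the coefficient values would come from the gas composition and are placeholders here):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def optically_thin_loss(temperature, kappa_planck, t_background=300.0):
    """Volumetric radiative loss, W m^-3, in the optically thin limit:
    each parcel radiates to the background with no reabsorption."""
    return 4.0 * SIGMA * kappa_planck * (temperature ** 4 - t_background ** 4)
```

    Because of the T⁴ dependence, the sink acts mainly in the flame zone, which is how radiation redistributes heat release and alters droplet evaporation in the simulations above.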

  11. Simulation of HLNC and NCC measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ming-Shih; Teichmann, T.; De Ridder, P.

    1994-03-01

    This report discusses an automatic method of simulating the results of High Level Neutron Coincidence Counting (HLNC) and Neutron Collar Coincidence Counting (NCC) measurements to facilitate safeguards inspectors' understanding and use of these instruments under realistic conditions. This would otherwise be expensive and time-consuming, except at sites designed to handle radioactive materials and having the necessary variety of fuel elements and other samples. The simulation must thus include the behavior of the instruments for variably constituted and composed fuel elements (including poison rods and Gd loading), and must display the changes in the count rates as a function of these characteristics, as well as of various instrumental parameters. Such a simulation is an efficient way of accomplishing the required familiarization and training of the inspectors by providing a realistic reproduction of the results of such measurements.

  12. Uterus models for use in virtual reality hysteroscopy simulators.

    PubMed

    Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias

    2009-05-01

    Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.
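    The statistical deformation approach mentioned above amounts to representing an organ shape as a mean geometry plus a weighted sum of precomputed deformation modes, which is cheap enough to evaluate in real time; a schematic sketch (not the authors' implementation):

```python
def deformed_shape(mean_shape, modes, coefficients):
    """Statistical deformation model: mean geometry plus a weighted sum
    of precomputed deformation modes (e.g. from a PCA of training
    deformations). Evaluating the sum is cheap enough for real time."""
    shape = list(mean_shape)
    for c, mode in zip(coefficients, modes):
        for i, displacement in enumerate(mode):
            shape[i] += c * displacement
    return shape
```

    The expensive finite-element solves happen offline to build the modes; at simulation time only the coefficients are updated, which is what makes the force-feedback loop feasible.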

  13. A Mirror for Managers: Using Simulation to Develop Management Teams. Technical Report 23.

    ERIC Educational Resources Information Center

    Kaplan, Robert E.; And Others

    Although simulation is among the least common of the many methods consultants employ to stimulate team development, realistic simulation can help in the diagnosis of management teams. Simulations fill a gap in the repertoire of data collection methods for organizational diagnosis and development by affording an opportunity for direct observation…

  14. African Easterly Waves in 30-day High-Resolution Global Simulations: A Case Study During the 2006 NAMMA Period

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Tao, Wei-Kuo; Wu, Man-Li C.

    2010-01-01

    In this study, extended-range (30-day) high-resolution simulations with the NASA global mesoscale model are conducted to simulate the initiation and propagation of six consecutive African easterly waves (AEWs) from late August to September 2006 and their association with hurricane formation. It is shown that the statistical characteristics of the individual AEWs are realistically simulated, with larger errors in the 5th and 6th AEWs. Remarkable simulations of the mean African easterly jet (AEJ) are also obtained. Nine additional 30-day experiments suggest that although land surface processes might contribute to the predictability of the AEJ and AEWs, the initiation and detailed evolution of AEWs still depend on the accurate representation of dynamic and land surface initial conditions and their time-varying nonlinear interactions. Of interest is the potential to extend the lead time for predicting hurricane formation (e.g., a lead time of up to 22 days), as the 4th AEW is realistically simulated.

  15. User modeling techniques for enhanced usability of OPSMODEL operations simulation software

    NASA Technical Reports Server (NTRS)

    Davis, William T.

    1991-01-01

    The PC-based OPSMODEL operations software for modeling and simulation of space station crew activities supports engineering and cost analyses and operations planning. Using top-down modeling, the level of detail required in the data base can be limited to what is commensurate with the results required of any particular analysis. To perform a simulation, a resource environment consisting of locations, crew definition, equipment, and consumables is first defined. Activities to be simulated are then defined as operations and scheduled as desired. These operations are defined within a 1000-level priority structure. A simulation in OPSMODEL, then, consists of the following: user-defined, user-scheduled operations executing within an environment of user-defined resource and priority constraints. Techniques for prioritizing operations to realistically model a representative daily scenario of on-orbit space station crew activities are discussed. The large number of priority levels allows priorities to be assigned commensurate with the detail necessary for a given simulation. Several techniques for realistic modeling of day-to-day work carryover are also addressed.
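    The priority structure described above can be sketched as a greedy scheduler that runs operations in priority order until a resource (here, crew hours) is exhausted and carries the remainder over to the next day; the operations and numbers below are invented for illustration:

```python
def schedule(operations, crew_hours):
    """Greedy scheduler: run operations in priority order (1 = highest of
    the 1000 levels) while crew time remains; the rest carry over."""
    scheduled, carryover = [], []
    remaining = crew_hours
    for op in sorted(operations, key=lambda o: o["priority"]):
        if op["hours"] <= remaining:
            scheduled.append(op["name"])
            remaining -= op["hours"]
        else:
            carryover.append(op["name"])
    return scheduled, carryover

ops = [
    {"name": "exercise", "priority": 10, "hours": 2.0},
    {"name": "payload_1", "priority": 200, "hours": 5.0},
    {"name": "stow_cargo", "priority": 700, "hours": 3.0},
]
```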

  16. Spatial Fluctuations of the Intergalactic Temperature-Density Relation After Hydrogen Reionization

    NASA Astrophysics Data System (ADS)

    Keating, Laura C.; Puchwein, Ewald; Haehnelt, Martin G.

    2018-04-01

    The thermal state of the post-reionization IGM is sensitive to the timing of reionization and the nature of the ionizing sources. We have modelled here the thermal state of the IGM in cosmological radiative transfer simulations of a realistic, extended, spatially inhomogeneous hydrogen reionization process, carefully calibrated with Lyα forest data. We compare these with cosmological simulations run using a spatially homogeneous ionizing background. The simulations with a realistic growth of ionized regions and a realistic spread in reionization redshifts show, as expected, significant spatial fluctuations in the temperature-density relation (TDR) of the post-reionization IGM. The most recently ionized regions are hottest and exhibit a flatter TDR. In simulations consistent with the average TDR inferred from Lyα forest data, these spatial fluctuations have a moderate but noticeable effect on the statistical properties of the Lyα opacity of the IGM at z ˜ 4-6. This should be taken into account in accurate measurements of the thermal properties of the IGM and the free-streaming of dark matter from Lyα forest data in this redshift range. The spatial variations of the TDR predicted by our simulations are, however, smaller by about a factor of two than would be necessary to explain the observed large spatial opacity fluctuations on large (≥ 50 h^-1 comoving Mpc) scales at z ≳ 5.5.

  17. Spatial fluctuations of the intergalactic temperature-density relation after hydrogen reionization

    NASA Astrophysics Data System (ADS)

    Keating, Laura C.; Puchwein, Ewald; Haehnelt, Martin G.

    2018-07-01

    The thermal state of the post-reionization IGM is sensitive to the timing of reionization and the nature of the ionizing sources. We have modelled here the thermal state of the IGM in cosmological radiative transfer simulations of a realistic, extended, spatially inhomogeneous hydrogen reionization process, carefully calibrated with Ly α forest data. We compare these with cosmological simulations run using a spatially homogeneous ionizing background. The simulations with a realistic growth of ionized regions and a realistic spread in reionization redshifts show, as expected, significant spatial fluctuations in the temperature-density relation (TDR) of the post-reionization IGM. The most recently ionized regions are hottest and exhibit a flatter TDR. In simulations consistent with the average TDR inferred from Ly α forest data, these spatial fluctuations have a moderate but noticeable effect on the statistical properties of the Ly α opacity of the IGM at z ˜ 4-6. This should be taken into account in accurate measurements of the thermal properties of the IGM and the free-streaming of dark matter from Ly α forest data in this redshift range. The spatial variations of the TDR predicted by our simulations are, however, smaller by about a factor of 2 than would be necessary to explain the observed large spatial opacity fluctuations on large (≥50 h^-1 comoving Mpc) scales at z ≳ 5.5.

  18. Impact of spectral nudging on the downscaling of tropical cyclones in regional climate simulations

    NASA Astrophysics Data System (ADS)

    Choi, Suk-Jin; Lee, Dong-Kyou

    2016-06-01

    This study investigated the simulation of three months of seasonal tropical cyclone (TC) activity over the western North Pacific using the Advanced Research WRF Model. In the control experiment (CTL), the TC frequency was considerably overestimated. Additionally, the tracks of some TCs tended to have larger radii of curvature and were shifted eastward. The large-scale environments of westerly monsoon flows and subtropical Pacific highs were poorly simulated. The overestimated frequency of TC formation was attributed to a strengthened westerly wind field in the southern quadrants of the TC center. Comparison with an experiment using the spectral nudging method showed that the strengthened wind speed was mainly modulated by large-scale flow at scales greater than approximately 1000 km in the model domain. The spurious formation and undesirable tracks of TCs in the CTL were considerably improved by reproducing realistic large-scale atmospheric monsoon circulation, with substantial adjustment between the large-scale flow in the model domain and the large-scale boundary forcing modified by the spectral nudging method. The realistic monsoon circulation played a vital role in simulating realistic TCs. This reveals that, in downscaling from large-scale fields for regional climate simulations, the scale interaction between model-generated regional features and the forcing large-scale fields should be considered, and that spectral nudging is a desirable technique for such downscaling.

  19. Bivalves: From individual to population modelling

    NASA Astrophysics Data System (ADS)

    Saraiva, S.; van der Meer, J.; Kooijman, S. A. L. M.; Ruardij, P.

    2014-11-01

    An individual-based population model for bivalves was designed, built and tested in a 0D approach to simulate the population dynamics of a mussel bed located in an intertidal area. The processes at the individual level were simulated following dynamic energy budget theory, whereas initial egg mortality, background mortality, food competition, and predation (including cannibalism) were additional population processes. Model properties were studied through the analysis of theoretical scenarios and by simulation of different mortality parameter combinations in a realistic setup, imposing environmental measurements. Realism criteria were applied to narrow down the possible combinations of parameter values. Field observations obtained in a long-term, multi-station monitoring program were compared with the model scenarios. The selected scenarios reasonably reproduced the timing of some peaks in individual abundance in the mussel bed and its size distribution, but the number of individuals was not well predicted. The results suggest that mortality in the early life stages (egg and larvae) plays an important role in population dynamics, whether through initial egg mortality, larval dispersion, settlement failure or shrimp predation. Future steps include coupling the population model with a hydrodynamic and biogeochemical model to improve the simulation of egg/larva dispersion, settlement probability and food transport, and to simulate the feedback of the organisms' activity on water column properties, which will improve the characterization of food quantity and quality.

  20. Electron Heating and Acceleration in a Reconnecting Magnetotail

    NASA Astrophysics Data System (ADS)

    El-Alaoui, M.; Zhou, M.; Lapenta, G.; Berchem, J.; Richard, R. L.; Schriver, D.; Walker, R. J.

    2017-12-01

    Electron heating and acceleration in the magnetotail have been investigated intensively. A major site for this process is the reconnection region. However, where and how electrons are accelerated in a realistic three-dimensional X-line geometry is not fully understood. In this study, we employed a three-dimensional implicit particle-in-cell (iPIC3D) simulation and a large-scale kinetic (LSK) simulation to address these problems. We modeled a magnetotail reconnection event observed by THEMIS in an iPIC3D simulation with initial and boundary conditions given by a global magnetohydrodynamic (MHD) simulation of Earth's magnetosphere. The iPIC3D simulation domain includes the region of fast outflow emanating from the reconnection site that drives dipolarization fronts. We found that current sheet electrons exhibit elongated (cigar-shaped) velocity distributions with a higher parallel temperature. Using LSK, we then followed millions of test electrons using the electromagnetic fields from iPIC3D. We found that magnetotail reconnection can generate power-law spectra around the near-Earth X-line. A significant number of electrons with energies higher than 50 keV are produced. We identified several acceleration mechanisms at different locations that were responsible for energizing these electrons: non-adiabatic cross-tail drift, betatron acceleration, and Fermi acceleration. The relative contributions of the different mechanisms to the energy gain of these high-energy electrons will be discussed.

  1. Monte Carlo simulation of ferroelectric domain growth

    NASA Astrophysics Data System (ADS)

    Li, B. L.; Liu, X. P.; Fang, F.; Zhu, J. L.; Liu, J.-M.

    2006-01-01

    The kinetics of two-dimensional isothermal domain growth in a quenched ferroelectric system is investigated using Monte Carlo simulation based on a realistic Ginzburg-Landau ferroelectric model with cubic-tetragonal (square-rectangle) phase transitions. The evolution of the domain pattern and domain size with annealing time is simulated, and the stability of trijunctions and tetrajunctions of domain walls is analyzed. It is found that in this more realistic model, with strong dipole alignment anisotropy and long-range Coulomb interaction, the power law for normal domain growth still applies. Towards the late stage of domain growth, both the average domain area and the reciprocal density of domain wall junctions increase linearly with time, and one-parameter dynamic scaling of the domain growth is demonstrated.
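    As a generic illustration of Metropolis Monte Carlo domain coarsening, the sketch below uses a two-variant lattice toy model whose energy simply counts unlike neighbour bonds. This is not the paper's Ginzburg-Landau model (no dipolar anisotropy or long-range Coulomb terms); the lattice size, temperature, and sweep count are invented.

```python
import math
import random

def unlike_bonds(grid):
    """Domain-wall length: number of unlike nearest-neighbour pairs (periodic)."""
    n = len(grid)
    walls = 0
    for i in range(n):
        for j in range(n):
            walls += grid[i][j] != grid[(i + 1) % n][j]
            walls += grid[i][j] != grid[i][(j + 1) % n]
    return walls

def metropolis_sweep(grid, beta, rng):
    """One Metropolis sweep: propose flipping a site to the other variant and
    accept with probability min(1, exp(-beta * dE)); at low temperature
    (large beta) this drives domain coarsening."""
    n = len(grid)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        old, new = grid[i][j], 1 - grid[i][j]
        dE = 0  # change in the count of unlike neighbour bonds
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = grid[(i + di) % n][(j + dj) % n]
            dE += (nb != new) - (nb != old)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            grid[i][j] = new

rng = random.Random(0)
n = 16
grid = [[rng.randrange(2) for _ in range(n)] for _ in range(n)]
before = unlike_bonds(grid)
for _ in range(20):
    metropolis_sweep(grid, beta=2.0, rng=rng)
after = unlike_bonds(grid)
print(before, after)  # total domain-wall length drops as domains grow
```

    Tracking average domain area against sweep number in such a model is how a power-law growth exponent is typically extracted.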

  2. Realistic Solar Surface Convection Simulations

    NASA Technical Reports Server (NTRS)

    Stein, Robert F.; Nordlund, Ake

    2000-01-01

    We perform essentially parameter-free simulations with realistic physics of convection near the solar surface. We summarize the physics that is included and compare the simulation results with observations. Excellent agreement is obtained for the depth of the convection zone, the p-mode frequencies, the p-mode excitation rate, the distribution of the emergent continuum intensity, and the profiles of weak photospheric lines. We describe how solar convection is nonlocal. It is driven from a thin surface thermal boundary layer where radiative cooling produces low-entropy gas which forms the cores of the downdrafts in which most of the buoyancy work occurs. We show that turbulence and vorticity are mostly confined to the intergranular lanes and underlying downdrafts. Finally, we illustrate our current work on magneto-convection.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ru-Sen; Fish, Vincent L.; Doeleman, Sheperd S.

    The black hole in the center of the Galaxy, associated with the compact source Sagittarius A* (Sgr A*), is predicted to cast a shadow upon the emission of the surrounding plasma flow, which encodes the influence of general relativity (GR) in the strong-field regime. The Event Horizon Telescope (EHT) is a Very Long Baseline Interferometry (VLBI) network with a goal of imaging nearby supermassive black holes (in particular Sgr A* and M87) with angular resolution sufficient to observe strong gravity effects near the event horizon. General relativistic magnetohydrodynamic (GRMHD) simulations show that radio emission from Sgr A* exhibits variability on timescales of minutes, much shorter than the duration of a typical VLBI imaging experiment, which usually takes several hours. A changing source structure during the observations, however, violates one of the basic assumptions needed for aperture synthesis in radio interferometry imaging to work. By simulating realistic EHT observations of a model movie of Sgr A*, we demonstrate that an image of the average quiescent emission, featuring the characteristic black hole shadow and photon ring predicted by GR, can nonetheless be obtained by observing over multiple days and subsequent processing of the visibilities (scaling, averaging, and smoothing) before imaging. Moreover, it is shown that this procedure can be combined with an existing method to mitigate the effects of interstellar scattering. Taken together, these techniques allow the black hole shadow in the Galactic center to be recovered on the reconstructed image.
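    The "average over multiple days, then image" step rests on a simple property of complex visibilities: the static (quiescent) component adds coherently while the variable component averages down. A minimal sketch, with the visibility values, the single (u,v) point, and the noise amplitude all invented for illustration:

```python
import random

def average_visibilities(days):
    """Average complex visibilities per (u,v) sample over several observing
    days; the static source component survives, variability washes out."""
    acc, counts = {}, {}
    for day in days:
        for uv, vis in day.items():
            acc[uv] = acc.get(uv, 0j) + vis
            counts[uv] = counts.get(uv, 0) + 1
    return {uv: acc[uv] / counts[uv] for uv in acc}

rng = random.Random(1)
static = 0.8 + 0.2j            # hypothetical quiescent visibility at one (u,v)
days = [{(0, 1): static + complex(rng.uniform(-0.5, 0.5),
                                  rng.uniform(-0.5, 0.5))}
        for _ in range(30)]    # 30 days with strong source variability
avg = average_visibilities(days)[(0, 1)]
print(abs(avg - static))       # small residual: variability has averaged down
```

    The actual EHT procedure also scales and smooths the visibilities before imaging, which this sketch omits.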

  4. Episodic thermal perturbations associated with groundwater flow: An example from Kilauea Volcano, Hawaii

    USGS Publications Warehouse

    Hurwitz, S.; Ingebritsen, S.E.; Sorey, M.L.

    2002-01-01

    Temperature measurements in deep drill holes on volcano summits or upper flanks allow a quantitative analysis of groundwater-induced heat transport within the edifice. We present a new temperature-depth profile from a deep well on the summit of Kilauea Volcano, Hawaii, and analyze it in conjunction with a temperature profile measured 26 years earlier. We propose two groundwater flow models to interpret the complex temperature profiles. The first is a modified confined lateral flow model (CLFM) with a continuous flux of hydrothermal fluid. In the second, transient flow model (TFM), slow conductive cooling follows a brief, advective heating event. We carry out numerical simulations to examine the timescales associated with each of the models. Results for both models are sensitive to the initial conditions, and with realistic initial conditions it takes between 750 and 1000 simulation years for either model to match the measured temperature profiles. With somewhat hotter initial conditions, results are consistent with onset of a hydrothermal plume ∼550 years ago, coincident with initiation of caldera subsidence. We show that the TFM is consistent with other data from hydrothermal systems and laboratory experiments and perhaps is more appropriate for this highly dynamic environment. The TFM implies that volcano-hydrothermal systems may be dominated by episodic events and that thermal perturbations may persist for several thousand years after hydrothermal flow has ceased.

  5. Preparing for InSight - using the continuous seismic data flow to investigate the deep interior of Mars

    NASA Astrophysics Data System (ADS)

    Hempel, S.; Garcia, R.; Weber, R. C.; Schmerr, N. C.; Panning, M. P.; Lognonne, P. H.; Banerdt, W. B.

    2016-12-01

    Complementary to investigating ray-theoretically predictable parameters to explore the deep interior of Mars (see AGU contribution by R. Weber et al.), this paper presents a waveform approach to illuminate the lowermost mantle and core-mantle boundary of Mars. In preparation for the NASA Discovery mission InSight, scheduled for launch in May 2018, we produce synthetic waveforms considering realistic combinations of sources and a single receiver, as well as noise models. Due to a lack of constraints on the scattering properties of the Martian crust and mantle, we assume Earth-like scattering as a minimum and Moon-like scattering as a maximum possibility. Various seismic attenuation models are also investigated. InSight is set up to deliver event data as well as a continuous data flow. While ray-theoretical approaches will investigate the event data, the continuous data flow may contain signals reflected multiple times off the same reflector, e.g. the underside of the lithosphere or the core-mantle boundary. It may also contain signals of individual events not detected, or interfering wavefields radiated by multiple undetected events creating 'seismic noise'. We will use AxiSEM to simulate a continuous data flow for these cases for various 1D and 2D Mars models, and explore the possibilities of seismic interferometry to use the seismic information hidden in the coda to investigate the deep interior of Mars.

  6. A method of emotion contagion for crowd evacuation

    NASA Astrophysics Data System (ADS)

    Cao, Mengxiao; Zhang, Guijuan; Wang, Mengsi; Lu, Dianjie; Liu, Hong

    2017-10-01

    Current evacuation models do not consider the impact of emotion and personality on crowd evacuation. Thus, there are large differences between evacuation results and the real-life behavior of crowds. In order to generate more realistic crowd evacuation results, we present a method of emotion contagion for crowd evacuation. First, we combine the OCEAN (Openness, Conscientiousness, Extraversion, Agreeableness, Neuroticism) personality model and the SIS (Susceptible-Infected-Susceptible) model to construct the P-SIS (Personalized SIS) emotional contagion model. The P-SIS model effectively captures the diversity of individuals in a crowd. Second, we couple the P-SIS model with the social force model to simulate the effect of emotional contagion on crowd evacuation. Finally, a photo-realistic rendering method is employed to obtain an animation of the crowd evacuation. Experimental results show that our method can simulate crowd evacuation realistically and has guiding significance for crowd evacuation in emergency circumstances.
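    A minimal SIS-style contagion step on a contact graph might look like the sketch below. This is a generic SIS model, not the authors' P-SIS (which additionally modulates susceptibility by OCEAN personality traits); the ring topology, rates, and seed agent are invented for illustration.

```python
import random

def sis_step(states, contacts, beta, gamma, rng):
    """Synchronous SIS update: a susceptible agent (0) becomes emotionally
    'infected' (1) with probability beta per infected contact; an infected
    agent calms down (recovers to susceptible) with probability gamma."""
    new = states[:]
    for i, s in enumerate(states):
        if s == 0:
            k = sum(states[j] for j in contacts[i])  # infected neighbours
            if rng.random() < 1 - (1 - beta) ** k:
                new[i] = 1
        elif rng.random() < gamma:
            new[i] = 0
    return new

rng = random.Random(7)
n = 20
contacts = [[(i - 1) % n, (i + 1) % n] for i in range(n)]  # agents in a ring
states = [0] * n
states[0] = 1                                              # one aroused agent
peak = 1
for _ in range(30):
    states = sis_step(states, contacts, beta=0.6, gamma=0.05, rng=rng)
    peak = max(peak, sum(states))
print(peak)  # emotion spreads well beyond the initial agent
```

    Coupling such a step to a social force model amounts to letting each agent's emotional state modulate its desired speed and direction at every time step.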

  7. A novel dynamic mechanical testing technique for reverse shoulder replacements.

    PubMed

    Dabirrahmani, Danè; Bokor, Desmond; Appleyard, Richard

    2014-04-01

    In vitro mechanical testing of orthopedic implants provides information regarding their mechanical performance under simulated biomechanical conditions. Current in vitro component stability testing methods for reverse shoulder implants are based on anatomical shoulder designs, which do not capture the dynamic nature of these loads. With glenoid component loosening as one of the most prevalent modes of failure in reverse shoulder replacements, it is important to establish a testing protocol with a more realistic loading regime. This paper introduces a novel method of mechanically testing reverse shoulder implants using more realistic load magnitudes and vectors than is currently practiced. Using a custom-made jig setup within an Instron mechanical testing system, it is possible to simulate the change in magnitude and direction of the joint load during arm abduction. This method is a step towards a more realistic testing protocol for measuring reverse shoulder implant stability.

  8. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

    ERIC Educational Resources Information Center

    Daley, Michael; Hillier, Douglas

    1981-01-01

    Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables…

  9. 75 FR 35689 - System Personnel Training Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-23

    ... using realistic simulations.\\14\\ \\13\\ Id. P 1331. \\14\\ Reliability Standard PER-002-0. 9. In Order No... development process to: (1) Include formal training requirements for reliability coordinators similar to those... simulation technology such as a simulator, virtual technology, or other technology in their emergency...

  10. Direct Simulation Monte Carlo Calculations in Support of the Columbia Shuttle Orbiter Accident Investigation

    NASA Technical Reports Server (NTRS)

    Gallis, Michael A.; LeBeau, Gerald J.; Boyles, Katie A.

    2003-01-01

    The Direct Simulation Monte Carlo method was used to provide 3-D simulations of the early entry phase of the Shuttle Orbiter. Undamaged and damaged scenarios were modeled to provide calibration points for engineering "bridging function" types of analysis. Currently the simulation technology (software and hardware) is mature enough to allow realistic simulations of three-dimensional vehicles.

  11. 3D Thermal and Mechanical Analysis of a Single Event Burnout

    NASA Astrophysics Data System (ADS)

    Peretti, Gabriela; Demarco, Gustavo; Romero, Eduardo; Tais, Carlos

    2015-08-01

    This paper presents a study of the thermal and mechanical behavior of power DMOS transistors during a Single Event Burnout (SEB) process. We use a cylindrical heat generation region to emulate the thermal and mechanical phenomena related to the SEB. In this way, the complexity of the mathematical treatment of the ion-device interaction is avoided. This work considers locating the heat generation region in positions that are more realistic than the ones used in previous work. For performing the study, we formulate and validate a new 3D model of the transistor that keeps the computational cost at a reasonable level. The resulting mathematical models are solved by means of the Finite Element Method. The simulation results show that the failure dynamics is dominated by the mechanical stress in the metal layer. Additionally, the time to failure depends on the heat source position for a given power and dimension of the generation region. The results suggest that 3D modeling should be considered for a detailed study of thermal and mechanical effects induced by SEBs.

  12. What We Don't Understand About Ion Acceleration in Flares

    NASA Technical Reports Server (NTRS)

    Reames, Donald V.; Ng, C. K.; Tylka, A. J.

    1999-01-01

    There are now strong associations between the (3)He-rich, Fe-rich ions in "impulsive" solar energetic particle (SEP) events and the similar abundances derived from gamma-ray lines from flares. Compact flares, where wave energy can predominate, are ideal sites for the study of wave-particle physics. Yet there are nagging questions about the magnetic geometry, the relation between ions that escape and those that interact, and the relative roles of cascading Alfven waves and the EMIC waves required to enhance He-3. There are also questions about the relative timing of ion and electron acceleration and of heating; these relate to the variation of ionization states before and during acceleration and during transport out of the corona. We can construct a model that addresses many of these issues, but problems do remain. Our greatest lack is realistic theoretical simulations of element abundances, spectra, and their variations. By contrast, we now have a much better idea of the acceleration at CME-driven shock waves in the rare but large "gradual" SEP events, largely because of their slow temporal evolution and great spatial extent.

  13. Future equivalent of 2010 Russian heatwave intensified by weakening soil moisture constraints

    NASA Astrophysics Data System (ADS)

    Rasmijn, L. M.; van der Schrier, G.; Bintanja, R.; Barkmeijer, J.; Sterl, A.; Hazeleger, W.

    2018-05-01

    The 2010 heatwave in eastern Europe and Russia ranks among the hottest events ever recorded in the region1,2. The excessive summer warmth was related to an anomalously widespread and intense quasi-stationary anticyclonic circulation anomaly over western Russia, reinforced by depletion of spring soil moisture1,3-5. At present, high soil moisture levels and strong surface evaporation generally tend to cap maximum summer temperatures6-8, but these constraints may weaken under future warming9,10. Here, we use a data assimilation technique in which future climate model simulations are nudged to realistically represent the persistence and strength of the 2010 blocked atmospheric flow. In the future, synoptically driven extreme warming under favourable large-scale atmospheric conditions will no longer be suppressed by abundant soil moisture, leading to a disproportional intensification of future heatwaves. This implies that future mid-latitude heatwaves analogous to the 2010 event will become even more extreme than previously thought, with temperature extremes increasing by 8.4 °C over western Russia. Thus, the socioeconomic impacts of future heatwaves will probably be amplified beyond current estimates.

  14. The quasi 2 day wave response in TIME-GCM nudged with NOGAPS-ALPHA

    NASA Astrophysics Data System (ADS)

    Wang, Jack C.; Chang, Loren C.; Yue, Jia; Wang, Wenbin; Siskind, D. E.

    2017-05-01

    The quasi 2 day wave (QTDW) is a traveling planetary wave that can be rapidly enhanced to large amplitudes in the mesosphere and lower thermosphere (MLT) region during the northern winter post-solstice period. In this study, we present five case studies of QTDW events during January and February of 2005, 2006 and 2008-2010 using the Thermosphere-Ionosphere-Mesosphere Electrodynamics General Circulation Model (TIME-GCM) nudged with the Navy Operational Global Atmospheric Prediction System-Advanced Level Physics High Altitude (NOGAPS-ALPHA) weather forecast model. With NOGAPS-ALPHA introducing more realistic lower atmospheric forcing, the QTDW events are successfully reproduced in TIME-GCM. The nudged TIME-GCM simulations show good agreement in the zonal mean state with the NOGAPS-ALPHA 6 h reanalysis data and the horizontal wind model below the mesopause, but show large discrepancies in the tropics above the mesopause. The zonal mean zonal wind in the mesosphere has sharp vertical gradients in the nudged TIME-GCM. The results suggest that the parameterized gravity wave forcing may need to be retuned in the assimilative TIME-GCM.

  15. Parametric Sensitivity Analysis for the Asian Summer Monsoon Precipitation Simulation in the Beijing Climate Center AGCM Version 2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Ben; Zhang, Yaocun; Qian, Yun

    In this study, we apply an efficient sampling approach and conduct a large number of simulations to explore the sensitivity of the simulated Asian summer monsoon (ASM) precipitation, including its climatological state and interannual variability, to eight parameters related to cloud and precipitation processes in the Beijing Climate Center AGCM version 2.1 (BCC_AGCM2.1). Our results show that BCC_AGCM2.1 has large biases in simulating ASM precipitation. The precipitation efficiency and the evaporation coefficient for deep convection are the most sensitive parameters for simulating ASM precipitation. With optimal parameter values, the simulated precipitation climatology could be remarkably improved, e.g. increased precipitation over the equatorial Indian Ocean, suppressed precipitation over the Philippine Sea, and a more realistic Meiyu distribution over eastern China. The interannual variability of ASM precipitation is further analyzed, with a focus on ENSO impacts. Simulations with a better ASM precipitation climatology also produce more realistic precipitation anomalies during El Niño decaying summers. In the low-skill experiments for precipitation climatology, the ENSO-induced precipitation anomalies are most significant over continents (vs. over the ocean in observations) in the South Asian monsoon region. More realistic results are derived from the higher-skill experiments, with stronger anomalies over the Indian Ocean and weaker anomalies over India and the western Pacific, favoring more evident easterly anomalies forced by the tropical Indian Ocean warming and a stronger Indian Ocean-western Pacific teleconnection, as observed. Our model results reveal a strong connection between the simulated ASM precipitation climatological state and its interannual variability in BCC_AGCM2.1 when key parameters are perturbed.
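    The abstract does not name the sampling scheme; Latin hypercube sampling is one common choice for perturbing a handful of model parameters with few runs, sketched below. The two parameter names and their bounds are invented placeholders, not the paper's actual ranges.

```python
import random

def latin_hypercube(n_samples, bounds, rng):
    """Latin hypercube design: each parameter range is cut into n_samples
    strata and every stratum is visited exactly once per parameter, so even
    a modest ensemble covers the full range of each parameter."""
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)                            # random stratum order
        for k in range(n_samples):
            u = (strata[k] + rng.random()) / n_samples  # point inside stratum
            samples[k][d] = lo + u * (hi - lo)
    return samples

rng = random.Random(42)
# Two hypothetical convection parameters, e.g. a precipitation efficiency
# in [0, 1] and an evaporation coefficient in [0.5, 2].
designs = latin_hypercube(8, [(0.0, 1.0), (0.5, 2.0)], rng)
```

    Each row of `designs` would then define the parameter values for one AGCM simulation in the perturbed-parameter ensemble.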

  16. Performance evaluation of GPU parallelization, space-time adaptive algorithms, and their combination for simulating cardiac electrophysiology.

    PubMed

    Sachetto Oliveira, Rafael; Martins Rocha, Bernardo; Burgarelli, Denise; Meira, Wagner; Constantinides, Christakis; Weber Dos Santos, Rodrigo

    2018-02-01

    The use of computer models as a tool for the study and understanding of the complex phenomena of cardiac electrophysiology has become increasingly important. At the same time, the increased complexity of the biophysical processes translates into complex computational and mathematical models. To speed up cardiac simulations and to allow more precise and realistic uses, 2 different techniques have been traditionally exploited: parallel computing and sophisticated numerical methods. In this work, we combine a modern parallel computing technique based on multicore and graphics processing units (GPUs) and a sophisticated numerical method based on a new space-time adaptive algorithm. We evaluate each technique alone and in different combinations: multicore and GPU; multicore, GPU, and space adaptivity; multicore, GPU, space adaptivity, and time adaptivity. All the techniques and combinations were evaluated under different scenarios: 3D simulations on slabs, 3D simulations on a ventricular mouse mesh, ie, complex geometry, sinus-rhythm, and arrhythmic conditions. Our results suggest that multicore and GPU accelerate the simulations by an approximate factor of 33×, whereas the speedups attained by the space-time adaptive algorithms were approximately 48×. Nevertheless, by combining all the techniques, we obtained speedups that ranged between 165× and 498×. The tested methods were able to reduce the execution time of a simulation by more than 498× for a complex cellular model in a slab geometry and by 165× in a realistic heart geometry simulating spiral waves. The proposed methods will allow faster and more realistic simulations in a feasible time with no significant loss of accuracy. Copyright © 2017 John Wiley & Sons, Ltd.
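    A quick sanity check on how the reported gains compose: if the GPU gain (≈33×) and the adaptivity gain (≈48×) were fully independent they would multiply to ≈1584×, so the observed 165-498× indicates the two techniques partly overlap in the work they eliminate. A small sketch of that naive multiplicative model:

```python
def naive_combined_speedup(factors):
    """Multiplicative composition of speedups: valid only if each technique
    accelerates exactly the runtime left untouched by the others."""
    product = 1.0
    for f in factors:
        product *= f
    return product

gpu, adaptive = 33.0, 48.0
naive = naive_combined_speedup([gpu, adaptive])
print(naive)            # 1584.0: upper bound if the gains composed perfectly
print(498.0 / naive)    # fraction of the naive bound actually realised
```

    The gap is expected: once the adaptive scheme removes grid points and time steps, there is less parallel work for the GPU to accelerate.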

  17. Realistic radio communications in pilot simulator training

    DOT National Transportation Integrated Search

    2000-12-01

    This report summarizes the first-year efforts of assessing the requirement and feasibility of simulating radio communication automatically. A review of the training and crew resource/task management literature showed both practical and theoretical su...

  18. Locomotive crashworthiness research : modeling, simulation, and validation

    DOT National Transportation Integrated Search

    2001-07-01

    A technique was developed to realistically simulate the dynamic, nonlinear structural behavior of moving rail vehicles and objects struck during a collision. A new approach considered the interdependence of the many vehicles connected in typical rail...

  19. MINEXP, A Computer-Simulated Mineral Exploration Program

    ERIC Educational Resources Information Center

    Smith, Michael J.; And Others

    1978-01-01

    This computer simulation is designed to put students into a realistic decision making situation in mineral exploration. This program can be used with different exploration situations such as ore deposits, petroleum, ground water, etc. (MR)

  20. CHARMM-GUI Membrane Builder toward realistic biological membrane simulations.

    PubMed

    Wu, Emilia L; Cheng, Xi; Jo, Sunhwan; Rui, Huan; Song, Kevin C; Dávila-Contreras, Eder M; Qi, Yifei; Lee, Jumin; Monje-Galvan, Viviana; Venable, Richard M; Klauda, Jeffery B; Im, Wonpil

    2014-10-15

    CHARMM-GUI Membrane Builder, http://www.charmm-gui.org/input/membrane, is a web-based user interface designed to interactively build all-atom protein/membrane or membrane-only systems for molecular dynamics simulations through an automated optimized process. In this work, we describe the new features and major improvements in Membrane Builder that allow users to robustly build realistic biological membrane systems, including (1) addition of new lipid types, such as phosphoinositides, cardiolipin (CL), sphingolipids, bacterial lipids, and ergosterol, yielding more than 180 lipid types, (2) enhanced building procedure for lipid packing around protein, (3) reliable algorithm to detect lipid tail penetration to ring structures and protein surface, (4) distance-based algorithm for faster initial ion displacement, (5) CHARMM inputs for P21 image transformation, and (6) NAMD equilibration and production inputs. The robustness of these new features is illustrated by building and simulating a membrane model of the polar and septal regions of E. coli membrane, which contains five lipid types: CL lipids with two types of acyl chains and phosphatidylethanolamine lipids with three types of acyl chains. It is our hope that CHARMM-GUI Membrane Builder becomes a useful tool for simulation studies to better understand the structure and dynamics of proteins and lipids in realistic biological membrane environments. Copyright © 2014 Wiley Periodicals, Inc.

  1. On coarse projective integration for atomic deposition in amorphous systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chuang, Claire Y.; Sinno, Talid; Han, Sang M.

    2015-10-07

    Direct molecular dynamics simulation of atomic deposition under realistic conditions is notoriously challenging because of the wide range of time scales that must be captured. Numerous simulation approaches have been proposed to address the problem, often requiring a compromise between model fidelity, algorithmic complexity, and computational efficiency. Coarse projective integration, an example application of the “equation-free” framework, offers an attractive balance between these constraints. Here, periodically applied, short atomistic simulations are employed to compute time derivatives of slowly evolving coarse variables that are then used to numerically integrate differential equations over relatively large time intervals. A key obstacle to the application of this technique in realistic settings is the “lifting” operation in which a valid atomistic configuration is recreated from knowledge of the coarse variables. Using Ge deposition on amorphous SiO2 substrates as an example application, we present a scheme for lifting realistic atomistic configurations comprised of collections of Ge islands on amorphous SiO2 using only a few measures of the island size distribution. The approach is shown to provide accurate initial configurations to restart molecular dynamics simulations at arbitrary points in time, enabling the application of coarse projective integration for this morphologically complex system.
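
    The burst-and-project loop described above can be sketched in a few lines. This is a toy scalar system (simple relaxation toward a fixed point), not the authors' Ge/SiO2 deposition model; `fine_step`, the step sizes, and the relaxation dynamics are all illustrative assumptions.

```python
# A toy sketch of coarse projective integration, assuming a scalar coarse
# variable relaxing toward a fixed point. fine_step() stands in for an
# expensive atomistic simulation; none of this is the authors' model.

def fine_step(x, dt):
    # stand-in for one expensive fine-scale step: relaxation toward 1.0
    return x + dt * (1.0 - x)

def coarse_projective_integration(x0, burst_steps=20, dt=0.01,
                                  big_dt=0.5, n_cycles=10):
    x = x0
    history = [x]
    for _ in range(n_cycles):
        # (1) run a short burst of fine-scale simulation
        burst = [x]
        for _ in range(burst_steps):
            burst.append(fine_step(burst[-1], dt))
        # (2) estimate the time derivative of the coarse variable
        dxdt = (burst[-1] - burst[0]) / (burst_steps * dt)
        # (3) take a large projective Euler step; "lifting" is trivial here
        # because the coarse variable IS the state, whereas the paper's
        # contribution is a non-trivial lifting for island morphologies
        x = burst[-1] + big_dt * dxdt
        history.append(x)
    return history
```

    The projective step covers 50 fine steps' worth of time per burst of 20, which is where the speedup of the method comes from.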

  2. On Coarse Projective Integration for Atomic Deposition in Amorphous Systems

    DOE PAGES

    Chuang, Claire Y.; Han, Sang M.; Zepeda-Ruiz, Luis A.; ...

    2015-10-02

    Direct molecular dynamics simulation of atomic deposition under realistic conditions is notoriously challenging because of the wide range of timescales that must be captured. Numerous simulation approaches have been proposed to address the problem, often requiring a compromise between model fidelity, algorithmic complexity and computational efficiency. Coarse projective integration, an example application of the ‘equation-free’ framework, offers an attractive balance between these constraints. Here, periodically applied, short atomistic simulations are employed to compute gradients of slowly evolving coarse variables that are then used to numerically integrate differential equations over relatively large time intervals. A key obstacle to the application of this technique in realistic settings is the ‘lifting’ operation in which a valid atomistic configuration is recreated from knowledge of the coarse variables. Using Ge deposition on amorphous SiO2 substrates as an example application, we present a scheme for lifting realistic atomistic configurations comprised of collections of Ge islands on amorphous SiO2 using only a few measures of the island size distribution. In conclusion, the approach is shown to provide accurate initial configurations to restart molecular dynamics simulations at arbitrary points in time, enabling the application of coarse projective integration for this morphologically complex system.

  3. Realistic mass ratio magnetic reconnection simulations with the Multi Level Multi Domain method

    NASA Astrophysics Data System (ADS)

    Innocenti, Maria Elena; Beck, Arnaud; Lapenta, Giovanni; Markidis, Stefano

    2014-05-01

    Space physics simulations with the ambition of realistically representing both ion and electron dynamics have to cope with the huge scale separation between electron and ion parameters while respecting the stability constraints of the numerical method of choice. Explicit Particle-In-Cell (PIC) simulations with realistic mass ratio are limited in the size of the problems they can tackle by the restrictive stability constraints of the explicit method (Birdsall and Langdon, 2004). Many alternatives are available to reduce such computational costs. Reduced mass ratios can be used, with the caveats highlighted in Bret and Dieckmann (2010). Fully implicit (Chen et al., 2011a; Markidis and Lapenta, 2011) or semi-implicit (Vu and Brackbill, 1992; Lapenta et al., 2006; Cohen et al., 1989) methods can bypass the strict stability constraints of explicit PIC codes. Adaptive Mesh Refinement (AMR) techniques (Vay et al., 2004; Fujimoto and Sydora, 2008) can be employed to change the simulation resolution locally. We focus here on the Multi Level Multi Domain (MLMD) method introduced in Innocenti et al. (2013) and Beck et al. (2013). The method combines the advantages of implicit algorithms and adaptivity. Two levels are fully simulated with fields and particles. The so-called "refined level" simulates a fraction of the "coarse level" with a resolution RF times higher than the coarse level resolution, where RF is the Refinement Factor between the levels. This method is particularly suitable for magnetic reconnection simulations (Biskamp, 2005), where the characteristic Ion and Electron Diffusion Regions (IDR and EDR) develop at the ion and electron scales, respectively (Daughton et al., 2006). In Innocenti et al. (2013) we showed that basic wave and instability processes are correctly reproduced by MLMD simulations. In Beck et al. (2013) we applied the technique to plasma expansion and magnetic reconnection problems. We showed that notable computational time savings can be achieved. More importantly, we were able to correctly reproduce EDR features, such as the inversion layer of the electric field observed in Chen et al. (2011b), with an MLMD simulation at a significantly lower cost. Here, we present recent results on EDR dynamics achieved with the MLMD method and a realistic mass ratio.
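
    The two-level layout described above can be illustrated with a minimal sketch. The domain size, grid counts, and refined span below are invented for illustration; only the idea of a refined region at RF times the coarse resolution comes from the abstract.

```python
# Minimal two-level grid sketch of the MLMD idea (illustrative values only):
# a refined level covers a fraction of the coarse domain at RF times the
# coarse resolution; both levels would carry their own fields and particles.
def build_levels(L=16.0, nx_coarse=32, refined_span=(4.0, 8.0), RF=4):
    dx_c = L / nx_coarse                 # coarse grid spacing
    dx_r = dx_c / RF                     # refined spacing: RF times finer
    x0, x1 = refined_span
    nx_r = int(round((x1 - x0) / dx_r))
    coarse = [i * dx_c for i in range(nx_coarse + 1)]
    refined = [x0 + i * dx_r for i in range(nx_r + 1)]
    return coarse, refined, dx_c, dx_r
```

    In the actual method the two levels exchange boundary information each step; here only the geometric relationship between the grids is shown.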

  4. Mid-Western US heavy summer-precipitation in regional and global climate models: the impact on model skill and consensus through an analogue lens

    NASA Astrophysics Data System (ADS)

    Gao, Xiang; Schlosser, C. Adam

    2018-04-01

    Regional climate models (RCMs) can simulate heavy precipitation more accurately than general circulation models (GCMs) through more realistic representation of topography and mesoscale processes. Analogue methods of downscaling, which identify the large-scale atmospheric conditions associated with heavy precipitation, can also produce more accurate and precise heavy precipitation frequency in GCMs than the simulated precipitation. In this study, we examine the performance of the analogue method versus direct simulation, when applied to RCM and GCM simulations, in detecting present-day and future changes in summer (JJA) heavy precipitation over the Midwestern United States. We find analogue methods are comparable to MERRA-2 and its bias-corrected precipitation in characterizing the occurrence and interannual variations of observed heavy precipitation events, all significantly improving upon MERRA precipitation. For the late twentieth-century heavy precipitation frequency, RCM precipitation improves upon the corresponding driving GCM with greater accuracy yet comparable inter-model discrepancies, while both RCM- and GCM-based analogue results outperform their model-simulated precipitation counterparts in terms of accuracy and model consensus. For the projected trends in heavy precipitation frequency through the mid twenty-first century, the analogue method again outperforms direct simulation, with reduced inter-model disparities, while the RCM-based analogue and simulated precipitation do not demonstrate a salient improvement (in model consensus) over the GCM-based assessment. However, a number of caveats preclude any overall judgement, and further work, over any region of interest, should include a larger sample of GCMs and RCMs as well as ensemble simulations to comprehensively account for internal variability.

  5. Simulation of the Onset of the Southeast Asian Monsoon During 1997 and 1998: The Impact of Surface Processes

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Lau, W.; Baker, R.

    2004-01-01

    The onset of the southeast Asian monsoon during 1997 and 1998 was simulated with a coupled mesoscale atmospheric model (MM5) and a detailed land surface model. The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The simulation with the land surface model captured basic signatures of the monsoon onset processes and associated rainfall statistics. The sensitivity tests indicated that land surface processes had a greater impact on the simulated rainfall results than a small sea surface temperature change during the onset period. In both the 1997 and 1998 cases, the simulations were significantly improved by including the land surface processes. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indo-China peninsula and the northern cold front intrusion from southern China. The surface sensible and latent heat exchange between the land and atmosphere modified the low-level temperature distribution and gradient, and therefore the low-level wind field. The more realistic forcing of the sensible and latent heat from the detailed land surface model improved the monsoon rainfall and associated wind simulation. The model results will be compared to the simulation of the 6-7 May 2000 Missouri flash flood event. In addition, the impact of model initialization and land surface treatment on timing, intensity, and location of extreme precipitation will be examined.

  6. Simulation of the Onset of the Southeast Asian Monsoon during 1997 and 1998: The Impact of Surface Processes

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Wang, Y.; Lau, W.; Baker, R. D.

    2004-01-01

    The onset of the southeast Asian monsoon during 1997 and 1998 was simulated with a coupled mesoscale atmospheric model (MM5) and a detailed land surface model. The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The simulation with the land surface model captured basic signatures of the monsoon onset processes and associated rainfall statistics. The sensitivity tests indicated that land surface processes had a greater impact on the simulated rainfall results than a small sea surface temperature change during the onset period. In both the 1997 and 1998 cases, the simulations were significantly improved by including the land surface processes. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indo-China peninsula and the northern cold front intrusion from southern China. The surface sensible and latent heat exchange between the land and atmosphere modified the low-level temperature distribution and gradient, and therefore the low-level wind field. The more realistic forcing of the sensible and latent heat from the detailed land surface model improved the monsoon rainfall and associated wind simulation. The model results will be compared to the simulation of the 6-7 May 2000 Missouri flash flood event. In addition, the impact of model initialization and land surface treatment on timing, intensity, and location of extreme precipitation will be examined.

  7. VERA Core Simulator Methodology for PWR Cycle Depletion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kochunas, Brendan; Collins, Benjamin S; Jabaay, Daniel

    2015-01-01

    This paper describes the methodology developed and implemented in MPACT for performing high-fidelity pressurized water reactor (PWR) multi-cycle core physics calculations. MPACT is being developed primarily for application within the Consortium for the Advanced Simulation of Light Water Reactors (CASL) as one of the main components of the VERA Core Simulator, the others being COBRA-TF and ORIGEN. The methods summarized in this paper include a methodology for performing resonance self-shielding and computing macroscopic cross sections, 2-D/1-D transport, nuclide depletion, thermal-hydraulic feedback, and other supporting methods. These methods represent a minimal set needed to simulate high-fidelity models of a realistic nuclear reactor. Results demonstrating this are presented from the simulation of a realistic model of the first cycle of Watts Bar Unit 1. The simulation, which approximates the cycle operation, is observed to be within 50 ppm boron (ppmB) reactivity for all simulated points in the cycle and approximately 15 ppmB for a consistent statepoint. The verification and validation of the PWR cycle depletion capability in MPACT is the focus of two companion papers.

  8. Virtual operating room for team training in surgery.

    PubMed

    Abelson, Jonathan S; Silverman, Elliott; Banfelder, Jason; Naides, Alexandra; Costa, Ricardo; Dakin, Gregory

    2015-09-01

    We proposed to develop a novel virtual reality (VR) team training system. The objective of this study was to determine the feasibility of creating a VR operating room to simulate a surgical crisis scenario and evaluate the simulator for construct and face validity. We modified ICE STORM (Integrated Clinical Environment; Systems, Training, Operations, Research, Methods), a VR-based system capable of modeling a variety of health care personnel and environments. ICE STORM was used to simulate a standardized surgical crisis scenario, whereby participants needed to correct 4 elements responsible for loss of laparoscopic visualization. The construct and face validity of the environment were measured. Thirty-three participants completed the VR simulation. Trainees took longer than attendings to complete the simulation (271 vs 201 seconds, P = .032). Participants felt the training environment was realistic and had a favorable impression of the simulation. All participants felt the workload of the simulation was low. Creation of a VR-based operating room for team training in surgery is feasible and can afford a realistic team training environment. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Using simulated fluorescence cell micrographs for the evaluation of cell image segmentation algorithms.

    PubMed

    Wiesmann, Veit; Bergler, Matthias; Palmisano, Ralf; Prinzen, Martin; Franz, Daniela; Wittenberg, Thomas

    2017-03-18

    Manual assessment and evaluation of fluorescent micrograph cell experiments is time-consuming and tedious. Automated segmentation pipelines can ensure efficient and reproducible evaluation and analysis with constant high quality for all images of an experiment. Such cell segmentation approaches are usually validated and rated in comparison to manually annotated micrographs. Nevertheless, manual annotations are prone to errors and display inter- and intra-observer variability, which influences the validation results of automated cell segmentation pipelines. We present a new approach to simulate fluorescent cell micrographs that provides an objective ground truth for the validation of cell segmentation methods. The cell simulation was evaluated twofold: (1) an expert observer study shows that the proposed approach generates realistic fluorescent cell micrograph simulations, and (2) an automated segmentation pipeline on the simulated fluorescent cell micrographs reproduces the segmentation performance of that pipeline on real fluorescent cell micrographs. The proposed simulation approach produces realistic fluorescent cell micrographs with corresponding ground truth. The simulated data are suited to evaluate image segmentation pipelines more efficiently and reproducibly than is possible with manually annotated real micrographs.
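
    As a minimal illustration of the idea, the sketch below generates a synthetic "micrograph" with exact ground truth and scores a trivial thresholding pipeline against it. The blob-based image model, noise level, and Dice metric are stand-ins of my choosing, not the simulation approach of the paper.

```python
import numpy as np

def simulate_micrograph(shape=(64, 64), n_cells=5, radius=6, noise=0.05, seed=0):
    # place round "cells" at random positions; return image plus the exact
    # ground-truth mask (the key advantage of simulated data)
    rng = np.random.default_rng(seed)
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    mask = np.zeros(shape, dtype=bool)
    for _ in range(n_cells):
        cy, cx = rng.integers(radius, shape[0] - radius, 2)
        mask |= (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
    img = mask * 1.0 + rng.normal(0.0, noise, shape)  # add camera noise
    return img, mask

def dice(pred, truth):
    # overlap score of a segmentation against the known ground truth
    inter = np.logical_and(pred, truth).sum()
    return 2 * inter / (pred.sum() + truth.sum())

img, truth = simulate_micrograph()
pred = img > 0.5            # trivial threshold "segmentation pipeline"
score = dice(pred, truth)   # objective, annotation-free validation
```

    Because the ground truth is generated rather than annotated, the score contains no observer variability.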

  10. Simulating Nonmodel-Fitting Responses in a CAT Environment. ACT Research Report Series 98-10.

    ERIC Educational Resources Information Center

    Yi, Qing; Nering, Michael L.

    This study developed a model to simulate nonmodel-fitting responses in a computerized adaptive testing (CAT) environment, and to examine the effectiveness of the model. The underlying idea was to simulate examinees' test behaviors realistically. This study simulated a situation in which examinees are exposed to or are coached on test items before…

  11. Sorting cancer karyotypes using double-cut-and-joins, duplications and deletions.

    PubMed

    Zeira, Ron; Shamir, Ron

    2018-05-03

    Problems of genome rearrangement are central in both evolution and cancer research. Most genome rearrangement models assume that the genome contains a single copy of each gene and the only changes in the genome are structural, i.e., reordering of segments. In contrast, tumor genomes also undergo numerical changes such as deletions and duplications, and thus the number of copies of genes varies. Dealing with unequal gene content is a very challenging task, addressed by few algorithms to date. More realistic models are needed to help trace genome evolution during tumorigenesis. Here we present a model for the evolution of genomes with multiple gene copies using the operation types double-cut-and-joins, duplications and deletions. The events supported by the model are reversals, translocations, tandem duplications, segmental deletions, and chromosomal amplifications and deletions, covering most types of structural and numerical changes observed in tumor samples. Our goal is to find a series of operations of minimum length that transform one karyotype into the other. We show that the problem is NP-hard and give an integer linear programming formulation that solves the problem exactly under some mild assumptions. We test our method on simulated genomes and on ovarian cancer genomes. Our study advances the state of the art in two ways: It allows a broader set of operations than extant models, thus being more realistic, and it is the first study attempting to reconstruct the full sequence of structural and numerical events during cancer evolution. Code and data are available in https://github.com/Shamir-Lab/Sorting-Cancer-Karyotypes. ronzeira@post.tau.ac.il, rshamir@tau.ac.il. Supplementary data are available at Bioinformatics online.

  12. Beyond Iconic Simulation

    ERIC Educational Resources Information Center

    Dormans, Joris

    2011-01-01

    Realism remains a prominent topic in game design and industry research; yet, a strong academic case can be made that games are anything but realistic. This article frames realism in games in semiotic terms as iconic simulation and argues that games can gain expressiveness when they move beyond the current focus on iconic simulation. In parallel…

  13. Aircraft Simulators and Pilot Training.

    ERIC Educational Resources Information Center

    Caro, Paul W.

    Flight simulators are built as realistically as possible, presumably to enhance their training value. Yet, their training value is determined by the way they are used. Traditionally, simulators have been less important for training than have aircraft, but they are currently emerging as primary pilot training vehicles. This new emphasis is an…

  14. Simulation as a Method of Teaching Communication for Multinational Corporations.

    ERIC Educational Resources Information Center

    Stull, James B.; Baird, John W.

    Interpersonal simulations may be used as a module in cultural awareness programs to provide realistic environments in which students, supervisors, and managers may practice communication skills that are effective in multicultural environments. To conduct and implement a cross-cultural simulation, facilitators should proceed through four stages:…

  15. Duration to Establish an Emergency Vascular Access and How to Accelerate It: A Simulation-Based Study Performed in Real-Life Neonatal Resuscitation Rooms.

    PubMed

    Schwindt, Eva M; Hoffmann, Florian; Deindl, Philipp; Waldhoer, Thomas J; Schwindt, Jens C

    2018-05-01

    To compare the duration to establish an umbilical venous catheter and an intraosseous access in real hospital delivery rooms and as a secondary aim to assess delaying factors during establishment and to provide recommendations to accelerate vascular access in neonatal resuscitation. Retrospective analysis of audio-video recorded neonatal simulation training. Simulation training events in exact replications of actual delivery/resuscitation rooms of 16 hospitals with different levels of care (Austria and Germany). Equipment was prepared the same way as for real clinical events. Medical teams of four to five persons with birth-related background (midwives, nurses, neonatologists, and anesthesiologists) in a realistic team composition. Audio-video recorded mannequin-based simulated resuscitation of an asphyxiated newborn including the establishment of either umbilical venous catheter or intraosseous access. The duration of access establishment (time from decision to first flush/aspiration), preparation (decision to start of procedure), and the procedure itself (start to first flush/aspiration) was significantly longer for umbilical venous catheter than for intraosseous access (overall duration 199 vs 86 s). Delaying factors for umbilical venous catheter establishment were mainly due to the complex approach itself, the multitude of equipment required, and uncertainties about necessary hygiene standards. Challenges in intraosseous access establishment were handling of the unfamiliar material and absence of an intraosseous access kit in the resuscitation room. There was no significant difference between the required duration for access establishment between large centers and small hospitals, but a trend was observed that duration for umbilical venous catheter was longer in small hospitals than in centers. Duration for intraosseous access was similar in both hospital types. Vascular access establishment in neonatal resuscitation could be accelerated by infrastructural improvements and specific training of medical teams. In simulated in situ neonatal resuscitation, intraosseous access is faster to establish than umbilical venous catheter. Future studies are required to assess efficacy and safety of both approaches in real resuscitation settings.

  16. Simulation of South-Asian Summer Monsoon in a GCM

    NASA Astrophysics Data System (ADS)

    Ajayamohan, R. S.

    2007-10-01

    Major characteristics of the Indian summer monsoon climate are analyzed using simulations from the upgraded version of the Florida State University Global Spectral Model (FSUGSM). The Indian monsoon has been studied in terms of mean precipitation and low-level and upper-level circulation patterns and compared with observations. In addition, the model's fidelity in simulating observed monsoon intraseasonal variability, interannual variability and teleconnection patterns is examined. The model is successful in simulating the major rainbelts over the Indian monsoon region. However, the model exhibits bias in simulating the precipitation bands over the South China Sea and the West Pacific region. Seasonal mean circulation patterns of low-level and upper-level winds are consistent with the model's precipitation pattern. Basic features like the onset and peak phase of the monsoon are realistically simulated. However, the model simulation indicates an early withdrawal of the monsoon. Northward propagation of rainbelts over the Indian continent is simulated fairly well, but the propagation is weak over the ocean. The model simulates the meridional dipole structure associated with monsoon intraseasonal variability realistically. The model is unable to capture the observed interannual variability of the monsoon and its teleconnection patterns. An estimate of the model's potential predictability reveals the dominant influence of internal variability over the Indian monsoon region.

  17. Building the Case for SNAP: Creation of Multi-Band, Simulated Images With Shapelets

    NASA Technical Reports Server (NTRS)

    Ferry, Matthew A.

    2005-01-01

    Dark energy has simultaneously been the most elusive and most important phenomenon in the shaping of the universe. A case for a proposed space telescope called SNAP (SuperNova Acceleration Probe) is being built, a crucial component of which is image simulations. One method for this is "Shapelets," developed at Caltech. Shapelets form an orthonormal basis and are uniquely able to represent realistic space images and create new images based on real ones. Previously, simulations were created using the Hubble Deep Field (HDF) as a basis set in one band. In this project, image simulations are created using the 4 bands of the Hubble Ultra Deep Field (UDF) as a basis set. This provides a better basis for simulations because (1) the survey is deeper, (2) the images have a higher resolution, and (3) this is a step closer to simulating the 9 bands of SNAP. Image simulations are achieved by detecting sources in the UDF, decomposing them into shapelets, tweaking their parameters in realistic ways, and recomposing them into new images. Morphological tests were also run to verify the realism of the simulations. The simulations have a wide variety of uses, including the ability to create weak gravitational lensing simulations.

  18. Hazard-Free Pyrotechnic Simulator

    NASA Technical Reports Server (NTRS)

    Mcalister, William B., Jr.

    1988-01-01

    Simulator evaluates performance of firing circuits for electroexplosive devices (EED's) safely and inexpensively. Tests circuits realistically when pyrotechnic squibs not connected and eliminates risks of explosions. Used to test such devices as batteries where test conditions might otherwise degrade them.

  19. Data assimilation of citizen collected information for real-time flood hazard mapping

    NASA Astrophysics Data System (ADS)

    Sayama, T.; Takara, K. T.

    2017-12-01

    Many studies of data assimilation in hydrology have focused on integrating satellite remote sensing and in-situ monitoring data into hydrologic or land surface models. For flood prediction, too, recent studies have demonstrated assimilating remotely sensed inundation information into flood inundation models. In actual flood disaster situations, citizen collected information, including local reports by residents and rescue teams and, more recently, tweets via social media, also contains valuable information. The main interest of this study is how to effectively use such citizen collected information for real-time flood hazard mapping. Here we propose a new data assimilation technique based on pre-conducted ensemble inundation simulations that updates inundation depth distributions sequentially as local data become available. The proposed method consists of two steps. The first step is a weighted average of the preliminary ensemble simulations, whose weights are updated by a Bayesian approach. The second step is an optimal interpolation, where the covariance matrix is calculated from the ensemble simulations. The proposed method was applied to case studies including an actual flood event. Two situations are considered: an idealized one, which assumes that continuous flood inundation depth information is available at multiple locations, and a more realistic one for a severe flood disaster, which assumes that only uncertain and non-continuous information is available to be assimilated. The results show that, in the first, idealized situation, the large-scale inundation during the flooding was estimated reasonably well, with an average RMSE < 0.4 m. In the second, more realistic situation, the error becomes larger (RMSE 0.5 m) and the impact of the optimal interpolation becomes comparatively less effective. Nevertheless, the applications of the proposed data assimilation method demonstrated its high potential for assimilating citizen collected information for real-time flood hazard mapping in the future.
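
    The first (Bayesian weighting) step can be sketched as follows for a single report location. The ensemble depths, the observation value, and the Gaussian observation-error model are illustrative assumptions; the second step (optimal interpolation with an ensemble-derived covariance) is not shown.

```python
import numpy as np

# Sketch of the Bayesian re-weighting step: pre-computed ensemble inundation
# depths at one report site are re-weighted each time a citizen report
# (a depth with observation error) arrives. All numbers are illustrative.
def update_weights(weights, ensemble_depths, obs, obs_sigma):
    # likelihood of the report under each ensemble member (Gaussian error)
    lik = np.exp(-0.5 * ((ensemble_depths - obs) / obs_sigma) ** 2)
    w = weights * lik
    return w / w.sum()            # renormalize to a probability distribution

ens = np.array([0.0, 0.5, 1.0, 1.5])   # member depths (m) at the report site
w = np.full(4, 0.25)                   # uniform prior over ensemble members
w = update_weights(w, ens, obs=1.1, obs_sigma=0.3)
best = ens[np.argmax(w)]               # member most consistent with the report
```

    Repeating the update for each incoming report concentrates weight on the ensemble members most consistent with what citizens observe.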

  20. A computational proof of concept of a machine-intelligent artificial pancreas using Lyapunov stability and differential game theory.

    PubMed

    Greenwood, Nigel J C; Gunton, Jenny E

    2014-07-01

    This study demonstrated the novel application of a "machine-intelligent" mathematical structure, combining differential game theory and Lyapunov-based control theory, to the artificial pancreas to handle dynamic uncertainties. Realistic type 1 diabetes (T1D) models from the literature were combined into a composite system. Using a mixture of "black box" simulations and actual data from diabetic medical histories, realistic sets of diabetic time series were constructed for blood glucose (BG), interstitial fluid glucose, infused insulin, meal estimates, and sometimes plasma insulin assays. The problem of underdetermined parameters was sidestepped by applying a variant of a genetic algorithm to partial information, whereby multiple candidate-personalized models were constructed and then rigorously tested using further data. These formed a "dynamic envelope" of trajectories in state space, where each trajectory was generated by a hypothesis on the hidden T1D system dynamics. This dynamic envelope was then culled to a reduced form to cover observed dynamic behavior. A machine-intelligent autonomous algorithm then implemented game theory to construct real-time insulin infusion strategies, based on the flow of these trajectories through state space and their interactions with hypoglycemic or near-hyperglycemic states. This technique was tested on 2 simulated participants over a total of fifty-five 24-hour days, with no hypoglycemic or hyperglycemic events, despite significant uncertainties from using actual diabetic meal histories with 10-minute warnings. In the main case studies, BG was steered within the desired target set for 99.8% of a 16-hour daily assessment period. Tests confirmed algorithm robustness for ±25% carbohydrate error. For over 99% of the overall 55-day simulation period, either formal controller stability was achieved to the desired target or else the trajectory was within the desired target. These results suggest that this is a stable, high-confidence way to generate closed-loop insulin infusion strategies. © 2014 Diabetes Technology Society.

  1. Modeling the Performance of Direct-Detection Doppler Lidar Systems in Real Atmospheres

    NASA Technical Reports Server (NTRS)

    McGill, Matthew J.; Hart, William D.; McKay, Jack A.; Spinhirne, James D.

    1999-01-01

    Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems has assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar systems: the double-edge and the multi-channel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only about 10-20% compared to nighttime performance, provided a proper solar filter is included in the instrument design.

  2. Molecular Simulation Results on Charged Carbon Nanotube Forest-Based Supercapacitors.

    PubMed

    Muralidharan, Ajay; Pratt, Lawrence R; Hoffman, Gary G; Chaudhari, Mangesh I; Rempe, Susan B

    2018-06-22

    Electrochemical double-layer capacitances of charged carbon nanotube (CNT) forests with tetraethylammonium tetrafluoroborate electrolyte in propylene carbonate are studied on the basis of molecular dynamics simulation. Direct molecular simulation of the filling of pore spaces of the forest is feasible even with realistic, small CNT spacings. The numerical solution of the Poisson equation based on the extracted average charge densities then yields a regular experimental dependence on the width of the pore spaces, in contrast to the anomalous pattern observed in experiments on other carbon materials and also in simulations on planar slot-like pores. The capacitances obtained have realistic magnitudes but are insensitive to electric potential differences between the electrodes in this model. This agrees with previous calculations on CNT forest supercapacitors, but not with experiments which have suggested electrochemical doping for these systems. Those phenomena remain for further theory/modeling work. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
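The post-processing step described here, solving the Poisson equation for an extracted charge density and reading off a capacitance, can be sketched in one dimension (hypothetical numbers throughout; the study's geometry and averaging are more involved):

```python
EPS0 = 8.854e-12
EPS_R = 64.0          # propylene carbonate, approximate relative permittivity

def potential_from_charge(rho, dx, eps):
    """Solve d2(phi)/dx2 = -rho/eps by double integration, with phi(0)=0, E(0)=0."""
    e_field, e = [], 0.0
    for r in rho:
        e += r * dx / eps          # dE/dx = rho/eps
        e_field.append(e)
    phi, p = [0.0], 0.0
    for e in e_field:
        p -= e * dx                # E = -d(phi)/dx
        phi.append(p)
    return phi

# Toy "pore": a sheet of + charge at one wall and - charge at the other.
n, dx = 100, 1e-10                 # 10 nm gap
sigma = 0.05                       # C m^-2 on each wall (hypothetical)
rho = [0.0] * n
rho[0], rho[-1] = sigma / dx, -sigma / dx
phi = potential_from_charge(rho, dx, EPS0 * EPS_R)
delta_v = abs(phi[-1] - phi[0])    # potential drop across the gap
c_per_area = sigma / delta_v       # F m^-2
print(f"{c_per_area * 100:.1f} uF/cm^2")
```

For this parallel-plate toy the numerical answer should reproduce the textbook eps/d result, which is the kind of consistency check such a solver admits.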

  3. Three-dimensional electron microscopy simulation with the CASINO Monte Carlo software.

    PubMed

    Demers, Hendrix; Poirier-Demers, Nicolas; Couture, Alexandre Réal; Joly, Dany; Guilmain, Marc; de Jonge, Niels; Drouin, Dominique

    2011-01-01

    Monte Carlo software is widely used to understand the capabilities of electron microscopes. To study more realistic applications with complex samples, 3D Monte Carlo software is needed. In this article, the development of the 3D version of CASINO is presented. The software features a graphical user interface, an efficient (in relation to simulation time and memory use) 3D simulation model, and accurate physics models for electron microscopy applications, and it is available freely to the scientific community at this website: www.gel.usherbrooke.ca/casino/index.html. It can be used to model backscattered, secondary, and transmitted electron signals as well as absorbed energy. Features such as scan points and shot noise allow the simulation and study of realistic experimental conditions. This software has an improved energy range for scanning electron microscopy and scanning transmission electron microscopy applications. Copyright © 2011 Wiley Periodicals, Inc.
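The trajectory loop at the heart of such Monte Carlo codes can be caricatured in a few lines (a toy random walk with an exponential free path and isotropic scattering; CASINO itself uses physical scattering cross sections, energy-loss models, and full 3D sample geometry):

```python
import random

def backscatter_fraction(n_electrons=2000, mfp=10.0, total_range=500.0, seed=1):
    """Crude trajectory loop: exponential free paths (nm), isotropic re-scatter,
    electron counted as backscattered if it re-crosses the surface (z < 0)
    before exhausting its total path length."""
    rng = random.Random(seed)
    backscattered = 0
    for _ in range(n_electrons):
        z, travelled, cos_t = 0.0, 0.0, 1.0   # beam enters normal to the surface
        while travelled < total_range:
            step = rng.expovariate(1.0 / mfp)
            z += step * cos_t
            travelled += step
            if z < 0.0:
                backscattered += 1
                break
            cos_t = rng.uniform(-1.0, 1.0)    # isotropic polar scattering angle
    return backscattered / n_electrons

print(backscatter_fraction())
```

Real codes replace the uniform scattering angle and fixed mean free path with energy-dependent cross sections, which is where the physics lives; the loop structure is the same.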

  4. Modeling the performance of direct-detection Doppler lidar systems including cloud and solar background variability.

    PubMed

    McGill, M J; Hart, W D; McKay, J A; Spinhirne, J D

    1999-10-20

    Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar system: the double-edge and the multichannel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only approximately 10-20% compared with nighttime performance, provided that a proper solar filter is included in the instrument design.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campione, Salvatore; Warne, Larry K.; Jorgenson, Roy E.

    Here, we investigate full-wave simulations of realistic implementations of multifunctional nanoantenna enabled detectors (NEDs). We focus on a 2x2 pixelated array structure that supports two wavelengths of operation. We design each resonating structure independently using full-wave simulations with periodic boundary conditions mimicking the whole infinite array. We then construct a supercell made of a 2x2 pixelated array with periodic boundary conditions mimicking the full NED; in this case, however, each pixel comprises 10-20 antennas per side. In this way, the cross-talk between contiguous pixels is accounted for in our simulations. We observe that, even though there are finite-extent effects, the pixels work as designed, each responding at the respective wavelength of operation. This allows us to stress that realistic simulations of multifunctional NEDs need to be performed to verify the design functionality by taking into account finite-extent and cross-talk effects.

  6. Three-Dimensional Electron Microscopy Simulation with the CASINO Monte Carlo Software

    PubMed Central

    Demers, Hendrix; Poirier-Demers, Nicolas; Couture, Alexandre Réal; Joly, Dany; Guilmain, Marc; de Jonge, Niels; Drouin, Dominique

    2011-01-01

    Monte Carlo software is widely used to understand the capabilities of electron microscopes. To study more realistic applications with complex samples, 3D Monte Carlo software is needed. In this paper, the development of the 3D version of CASINO is presented. The software features a graphical user interface, an efficient (in relation to simulation time and memory use) 3D simulation model, and accurate physics models for electron microscopy applications, and it is available freely to the scientific community at this website: www.gel.usherbrooke.ca/casino/index.html. It can be used to model backscattered, secondary, and transmitted electron signals as well as absorbed energy. Features such as scan points and shot noise allow the simulation and study of realistic experimental conditions. This software has an improved energy range for scanning electron microscopy and scanning transmission electron microscopy applications. PMID:21769885

  7. Realistic finite temperature simulations of magnetic systems using quantum statistics

    NASA Astrophysics Data System (ADS)

    Bergqvist, Lars; Bergman, Anders

    2018-01-01

    We have performed realistic atomistic simulations at finite temperatures using Monte Carlo and atomistic spin dynamics simulations incorporating quantum (Bose-Einstein) statistics. The description is much improved at low temperatures compared to the classical (Boltzmann) statistics normally used in these kinds of simulations, while at higher temperatures the classical statistics are recovered. This corrected low-temperature description is reflected in both the magnetization and the magnetic specific heat, the latter allowing for improved modeling of the magnetic contribution to free energies. A central property in the method is the magnon density of states at finite temperatures, and we have compared several different implementations for obtaining it. The method has no restrictions regarding chemical and magnetic order of the considered materials. This is demonstrated by applying the method to elemental ferromagnetic systems, including Fe and Ni, as well as Fe-Co random alloys and the ferrimagnetic system GdFe3.
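The statistical correction at the heart of the method can be illustrated by integrating a model magnon density of states with Bose-Einstein versus classical occupation (a schematic sketch assuming g(w) proportional to sqrt(w) and an arbitrary frequency cutoff, not the finite-temperature DOS the paper computes):

```python
import math

KB = 1.380649e-23     # J/K
HBAR = 1.0545718e-34  # J*s

def magnon_energy(temp, omega_max=1.0e14, n=2000, quantum=True):
    """E = integral of g(w)*e(w,T) dw with a ferromagnet-like DOS g(w) ~ sqrt(w).
    Quantum: e = hbar*w / (exp(hbar*w/(kB*T)) - 1); classical: e = kB*T."""
    dw = omega_max / n
    total = 0.0
    for i in range(1, n + 1):
        w = i * dw
        g = math.sqrt(w)
        if quantum:
            x = HBAR * w / (KB * temp)
            e = HBAR * w / math.expm1(x) if x < 700 else 0.0
        else:
            e = KB * temp
        total += g * e * dw
    return total

for t in (10.0, 300.0):
    ratio = magnon_energy(t) / magnon_energy(t, quantum=False)
    print(f"T={t:5.0f} K  quantum/classical energy ratio = {ratio:.3f}")
```

The ratio is strongly suppressed at low temperature and approaches one as the classical limit is recovered, which is exactly the qualitative behavior the abstract describes for the magnetization and specific heat.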

  8. New Methodologies Applied to Seismic Hazard Assessment in Southern Calabria (Italy)

    NASA Astrophysics Data System (ADS)

    Console, R.; Chiappini, M.; Speranza, F.; Carluccio, R.; Greco, M.

    2016-12-01

    Although it is generally recognized that the M7+ 1783 and 1908 Calabria earthquakes were caused by normal faults rupturing the upper crust of the southern Calabria-Peloritani area, no consensus exists on seismogenic source location and orientation. A recent high-resolution low-altitude aeromagnetic survey of southern Calabria and the Messina straits suggested that the sources of the 1783 and 1908 earthquakes are en echelon faults belonging to the same NW dipping normal fault system straddling the whole of southern Calabria. The application of a newly developed physics-based earthquake simulator to the active fault system modeled by the data obtained from the aeromagnetic survey and other recent geological studies has allowed the production of catalogs lasting 100,000 years and containing more than 25,000 events of magnitudes ≥ 4.0. The algorithm on which this simulator is based is constrained by several physical elements, such as: (a) an average slip rate due to tectonic loading for every single segment in the investigated fault system, (b) the process of rupture growth and termination, leading to a self-organized earthquake magnitude distribution, and (c) interaction between earthquake sources, including small magnitude events. Events nucleated in one segment are allowed to expand into neighboring segments if they are separated by less than a given maximum distance. The application of our simulation algorithm to the Calabria region provides typical features in the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
Lastly, as an example of a possible use of synthetic catalogs, an attenuation law has been applied to all the events reported in the synthetic catalog for the production of maps showing the exceedance probability of given values of peak ground acceleration (PGA) on the territory under investigation. These maps can be compared with the existing hazard maps that are presently used in the national seismic building regulations.
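The last step, pushing a synthetic catalog through an attenuation law to estimate exceedance probabilities, can be sketched as follows (the Gutenberg-Richter sampler is standard; the attenuation coefficients and site distances are invented for illustration only):

```python
import math
import random

def gr_magnitude(rng, b=1.0, m_min=4.0, m_max=7.5):
    """Inverse-CDF sample from a truncated Gutenberg-Richter distribution."""
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (m_max - m_min))
    return m_min - math.log(1.0 - rng.random() * c) / beta

def pga_g(magnitude, dist_km):
    """Toy attenuation law (hypothetical coefficients):
    ln PGA = -4.0 + 1.1*M - 1.5*ln(R + 10)."""
    return math.exp(-4.0 + 1.1 * magnitude - 1.5 * math.log(dist_km + 10.0))

rng = random.Random(42)
events = [(gr_magnitude(rng), rng.uniform(5.0, 80.0)) for _ in range(25000)]
threshold = 0.1  # g
p_exceed = sum(pga_g(m, r) > threshold for m, r in events) / len(events)
print(f"fraction of catalog events with PGA > {threshold} g at the site: {p_exceed:.4f}")
```

Repeating the count on a grid of sites, with distances computed from each event's epicenter, yields the exceedance maps the abstract mentions.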

  9. Measurements and Modeling of Turbulent Fluxes during Persistent Cold Air Pool Events in Salt Lake Valley, Utah

    NASA Astrophysics Data System (ADS)

    Ivey, C. E.; Sun, X.; Holmes, H.

    2017-12-01

    Land surface processes are important in meteorology and climate research since they control the partitioning of surface energy and water exchange at the earth's surface. The surface layer is coupled to the planetary boundary layer (PBL) by surface fluxes, which serve as sinks or sources of energy, moisture, momentum, and atmospheric pollutants. Quantifying the surface heat and momentum fluxes at the land-atmosphere interface, especially for different surface land cover types, is important because they can further influence the atmospheric dynamics, vertical mixing, and transport processes that impact local, regional, and global climate. A cold air pool (CAP) forms when a topographic depression (i.e., valley) fills with cold air, where the air in the stagnant layer is colder than the air aloft. Insufficient surface heating, which is not able to sufficiently erode the temperature inversion that forms during the nighttime stable boundary layer, can lead to the formation of persistent CAPs during wintertime. These persistent CAPs can last for days, or even weeks, and are associated with increased air pollution concentrations. Thus, realistic simulations of the land-atmosphere exchange are meaningful to achieve improved predictions of the accumulation, transport, and dispersion of air pollution concentrations. The focus of this presentation is on observations and modeling results using turbulence data collected in Salt Lake Valley, Utah during the 2010-2011 wintertime Persistent Cold Air Pool Study (PCAPS). Turbulent fluxes and the surface energy balance over seven land use types are quantified. The urban site has an energy balance ratio (EBR) larger than one (1.276). A negative Bowen ratio (-0.070) is found at the cropland site. In addition to turbulence observations, half-hourly WRF-simulated net radiation, latent heat, sensible heat, and ground heat fluxes during one persistent CAP event are evaluated using the PCAPS observations.
The results show that sensible and latent heat fluxes during the CAP event are overestimated. The sensitivity of WRF results to large-scale forcing datasets, PBL schemes, and land surface models (LSMs) is also investigated. The optimal WRF configuration for simulating surface turbulent fluxes and atmospheric mixing during CAP events is determined.
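The two diagnostics quoted for the tower sites are simple flux ratios; a minimal sketch (the flux values below are hypothetical, chosen only to mirror the signs and magnitudes reported):

```python
def energy_balance_ratio(h, le, rn, g):
    """EBR = (H + LE) / (Rn - G): turbulent fluxes over available energy."""
    return (h + le) / (rn - g)

def bowen_ratio(h, le):
    """Bowen ratio = sensible heat flux / latent heat flux."""
    return h / le

# Hypothetical half-hourly fluxes (W m^-2), micrometeorological sign convention.
print(energy_balance_ratio(h=120.0, le=80.0, rn=180.0, g=20.0))  # -> 1.25
print(bowen_ratio(h=-7.0, le=100.0))  # -> -0.07: downward sensible heat over cropland
```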

  10. Investigating the impact of land-use land-cover change on Indian summer monsoon daily rainfall and temperature during 1951–2005 using a regional climate model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halder, Subhadeep; Saha, Subodh K.; Dirmeyer, Paul A.

    Daily moderate rainfall events, which constitute a major portion of seasonal summer monsoon rainfall over central India, have decreased significantly during the period 1951 through 2005. On the other hand, mean and extreme near-surface daily temperature during the monsoon season have increased by a maximum of 1–1.5 °C. Using simulations made with a high-resolution regional climate model (RegCM4) and prescribed land cover of years 1950 and 2005, it is demonstrated that part of the changes in moderate rainfall events and temperature have been caused by land-use/land-cover change (LULCC), which is mostly anthropogenic. Model simulations show that the increase in seasonal mean and extreme temperature over central India coincides with the region of decrease in forest and increase in crop cover. Our results also show that LULCC alone causes warming in the extremes of daily mean and maximum temperatures by a maximum of 1–1.2 °C, which is comparable with the observed increasing trend in the extremes. Decrease in forest cover and simultaneous increase in crops not only reduces the evapotranspiration over land and large-scale convective instability, but also contributes toward decrease in moisture convergence through reduced surface roughness. These factors act together in reducing significantly the moderate rainfall events and the amount of rainfall in that category over central India. Additionally, the model simulations are repeated by removing the warming trend in sea surface temperatures over the Indian Ocean. As a result, enhanced warming at the surface and greater decrease in moderate rainfall events over central India compared to the earlier set of simulations are noticed. Results from these additional experiments corroborate our initial findings and confirm the contribution of LULCC in the decrease in moderate rainfall events and increase in daily mean and extreme temperature over India.
Therefore, this study demonstrates the important implications of LULCC over India during the monsoon season. Although the regional climate model helps in better resolving land–atmosphere feedbacks over the Indian region, the inferences do depend on the fidelity of the model in capturing the features of the Indian monsoon realistically. Lastly, it is proposed that similar studies using a suite of climate models will further enrich our understanding of the role of LULCC in the Indian monsoon climate.

  11. Investigating the impact of land-use land-cover change on Indian summer monsoon daily rainfall and temperature during 1951–2005 using a regional climate model

    DOE PAGES

    Halder, Subhadeep; Saha, Subodh K.; Dirmeyer, Paul A.; ...

    2016-05-10

    Daily moderate rainfall events, which constitute a major portion of seasonal summer monsoon rainfall over central India, have decreased significantly during the period 1951 through 2005. On the other hand, mean and extreme near-surface daily temperature during the monsoon season have increased by a maximum of 1–1.5 °C. Using simulations made with a high-resolution regional climate model (RegCM4) and prescribed land cover of years 1950 and 2005, it is demonstrated that part of the changes in moderate rainfall events and temperature have been caused by land-use/land-cover change (LULCC), which is mostly anthropogenic. Model simulations show that the increase in seasonal mean and extreme temperature over central India coincides with the region of decrease in forest and increase in crop cover. Our results also show that LULCC alone causes warming in the extremes of daily mean and maximum temperatures by a maximum of 1–1.2 °C, which is comparable with the observed increasing trend in the extremes. Decrease in forest cover and simultaneous increase in crops not only reduces the evapotranspiration over land and large-scale convective instability, but also contributes toward decrease in moisture convergence through reduced surface roughness. These factors act together in reducing significantly the moderate rainfall events and the amount of rainfall in that category over central India. Additionally, the model simulations are repeated by removing the warming trend in sea surface temperatures over the Indian Ocean. As a result, enhanced warming at the surface and greater decrease in moderate rainfall events over central India compared to the earlier set of simulations are noticed. Results from these additional experiments corroborate our initial findings and confirm the contribution of LULCC in the decrease in moderate rainfall events and increase in daily mean and extreme temperature over India.
Therefore, this study demonstrates the important implications of LULCC over India during the monsoon season. Although the regional climate model helps in better resolving land–atmosphere feedbacks over the Indian region, the inferences do depend on the fidelity of the model in capturing the features of the Indian monsoon realistically. Lastly, it is proposed that similar studies using a suite of climate models will further enrich our understanding of the role of LULCC in the Indian monsoon climate.

  12. Radiofrequency ablation of hepatic tumors: simulation, planning, and contribution of virtual reality and haptics.

    PubMed

    Villard, Caroline; Soler, Luc; Gangi, Afshin

    2005-08-01

    For radiofrequency ablation (RFA) of liver tumors, evaluation of vascular architecture, post-RFA necrosis prediction, and the choice of a suitable needle placement strategy using conventional radiological techniques remain difficult. In an attempt to enhance the safety of RFA, a 3D simulation, treatment planning, and training tool has been developed that simulates needle insertion and necrosis of the treated area and proposes an optimal needle placement. The 3D scenes are automatically reconstructed from enhanced spiral CT scans. The simulator takes into account the cooling effect of local vessels greater than 3 mm in diameter, making necrosis shapes more realistic. Optimal needle positioning can be automatically generated by the software to produce complete destruction of the tumor, with maximum respect of the healthy liver and of all major structures to avoid. We also studied how virtual reality and haptic devices can make simulation and training realistic and effective.
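An automatic placement search of this kind can be caricatured as scoring candidate electrode-tip positions on a voxel grid (a purely geometric toy with invented coordinates; the simulator's real planner models necrosis shape, vessel cooling, and insertion-path constraints):

```python
import itertools
import math

# Hypothetical voxelised scene (mm): a spherical tumour and one vessel to avoid.
TUMOUR_C, TUMOUR_R = (0.0, 0.0, 0.0), 12.0
VESSEL_XY, VESSEL_R = (22.0, 0.0), 4.0       # vessel axis parallel to z
NECROSIS_R = 15.0                             # predicted necrosis radius at the tip

def score_tip(tip, step=4.0):
    """Fraction of tumour voxels covered by the predicted necrosis sphere;
    None if the necrosis would reach the vessel (placement rejected)."""
    if math.dist(tip[:2], VESSEL_XY) < NECROSIS_R + VESSEL_R:
        return None
    grid = [k * step for k in range(-4, 5)]
    covered = total = 0
    for voxel in itertools.product(grid, grid, grid):
        if math.dist(voxel, TUMOUR_C) <= TUMOUR_R:
            total += 1
            covered += math.dist(voxel, tip) <= NECROSIS_R
    return covered / total

candidates = [(-4.0, 0.0, 0.0), (0.0, 0.0, 0.0), (6.0, 0.0, 0.0)]
valid = [(score_tip(t), t) for t in candidates if score_tip(t) is not None]
best_score, best_tip = max(valid)
print(best_score, best_tip)
```

The real objective also penalises ablated healthy tissue and infeasible insertion trajectories; the point here is only the structure of candidate generation, constraint rejection, and scoring.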

  13. A 3D virtual reality simulator for training of minimally invasive surgery.

    PubMed

    Mi, Shao-Hua; Hou, Zeng-Guang; Yang, Fan; Xie, Xiao-Liang; Bian, Gui-Bin

    2014-01-01

    For the last decade, remarkable progress has been made in the field of cardiovascular disease treatment. However, these complex medical procedures require a combination of rich experience and technical skills. In this paper, a 3D virtual reality simulator for core skills training in minimally invasive surgery is presented. The system can generate realistic 3D vascular models segmented from patient datasets, including a beating heart, and provides real-time force computation and a force-feedback module for surgical simulation. Instruments, such as a catheter or guide wire, are represented by a multi-body mass-spring model. In addition, a realistic user interface with multiple windows and real-time 3D views is developed. Moreover, the simulator is provided with a human-machine interaction module that gives doctors the sense of touch during surgery training and enables them to control the motion of a virtual catheter/guide wire inside a complex vascular model. Experimental results show that the simulator is suitable for minimally invasive surgery training.
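The multi-body mass-spring representation mentioned above reduces, at its core, to integrating Hooke forces along a chain of point masses. A minimal 2D sketch (a bare stretch-spring chain with invented constants, omitting the bending stiffness and collision handling a real guide-wire model needs):

```python
import math

def step(pos, vel, dt=1e-3, k=500.0, rest=1.0, damp=2.0, mass=0.01):
    """One semi-implicit Euler step for a chain of masses joined by springs."""
    n = len(pos)
    force = [[0.0, 0.0] for _ in range(n)]
    for i in range(n - 1):
        dx = pos[i + 1][0] - pos[i][0]
        dy = pos[i + 1][1] - pos[i][1]
        length = math.hypot(dx, dy)
        f = k * (length - rest)                  # Hooke force along the segment
        fx, fy = f * dx / length, f * dy / length
        force[i][0] += fx; force[i][1] += fy
        force[i + 1][0] -= fx; force[i + 1][1] -= fy
    for i in range(1, n):                        # node 0 is clamped (insertion point)
        vel[i][0] = (vel[i][0] + dt * force[i][0] / mass) * (1.0 - damp * dt)
        vel[i][1] = (vel[i][1] + dt * force[i][1] / mass) * (1.0 - damp * dt)
        pos[i][0] += dt * vel[i][0]
        pos[i][1] += dt * vel[i][1]
    return pos, vel

pos = [[float(i), 0.0] for i in range(5)]        # straight 5-node wire
pos[-1][1] = 0.5                                 # perturb the tip sideways
vel = [[0.0, 0.0] for _ in pos]
for _ in range(2000):
    pos, vel = step(pos, vel)
print(pos[-1])                                   # tip after the damped relaxation
```

Haptic loops run exactly this kind of integration at high rates, feeding the computed forces back to the device.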

  14. Flying to Neverland: How readers tacitly judge norms during comprehension.

    PubMed

    Foy, Jeffrey E; Gerrig, Richard J

    2014-11-01

    As readers gain experience with specific narrative worlds, they accumulate information that allows them to experience events as normal or unusual within those worlds. In this article, we contrast two accounts of how readers access information about specific narrative worlds to make tacit judgments of normalcy. We conducted two experiments. In Experiment 1, participants read stories about an ordinary character (e.g., a police officer in Boston) or a familiar fantastic character (e.g., Superman). Each story described a realistic event (e.g., the character being killed by bullets) or a fantastic event (e.g., bullets bouncing off the character's chest). Participants were faster to read events that were consistent with their prior knowledge about the story world. In Experiments 2a and 2b, participants read stories about familiar fantastic characters, unfamiliar fantastic characters (e.g., a Kryptonian named Dev-em), and unfamiliar ordinary characters. In Experiment 2a, participants were equally fast to read about the familiar and unfamiliar fantastic characters experiencing fantastic events, both of which were read faster than sentences about the unfamiliar ordinary characters. In Experiment 2b, participants were fastest to read about unfamiliar ordinary characters experiencing realistic events and were equally slow for familiar and unfamiliar fantastic characters. Our experiments provide evidence that readers routinely use inductive reasoning to go beyond their prior knowledge when reading fictional narratives, affecting whether they experience events as normal or unusual.

  15. Simulation at the point of care: reduced-cost, in situ training via a mobile cart.

    PubMed

    Weinstock, Peter H; Kappus, Liana J; Garden, Alexander; Burns, Jeffrey P

    2009-03-01

    The rapid growth of simulation in health care has challenged traditional paradigms of hospital-based education and training. Simulation addresses patient safety through deliberative practice of high-risk low-frequency events within a safe, structured environment. Despite its inherent appeal, widespread adoption of simulation is prohibited by high cost, limited space, interruptions to clinical duties, and the inability to replicate important nuances of clinical environments. We therefore sought to develop a reduced-cost low-space mobile cart to provide realistic simulation experiences to a range of providers within the clinical environment and to serve as a model for transportable, cost-effective, widespread simulation-based training of bona fide workplace teams. Descriptive study. A tertiary care pediatric teaching hospital. A self-contained mobile simulation cart was constructed at a cost of $8054 (mannequin not included). The cart is compatible with any mannequin and contains all equipment needed to produce a high-quality simulation experience equivalent to that of our on-site center, including didactics and debriefing with videotaped recordings complete with vital-sign overlay. Over a 3-year period the cart delivered 57 courses to 425 participants from five pediatric departments. All individuals were trained among their native teams and within their own clinical environment. By bringing all pedagogical elements to the actual clinical environment, a mobile cart can provide simulation to hospital teams that might not otherwise benefit from the educational tool. By reducing the setup cost and the need for dedicated space, the mobile approach provides a mechanism to increase the number of institutions capable of harnessing the power of simulation-based education internationally.

  16. Parallel computing in enterprise modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  17. Investigation of Periodic-Disturbance Identification and Rejection in Spacecraft

    DTIC Science & Technology

    2006-08-01

    Naval Postgraduate School, Monterey, California 93943. Spacecraft periodic-disturbance rejection using a realistic spacecraft hardware simulator and its associated models is investigated. The dipole-type disturbance rejection filter is derived from linear theory; it is therefore of interest to examine its efficacy, and its robustness, on the current realistic nonlinear rigid-body spacecraft model.

  18. Comparison of texture synthesis methods for content generation in ultrasound simulation for training

    NASA Astrophysics Data System (ADS)

    Mattausch, Oliver; Ren, Elizabeth; Bajka, Michael; Vanhoey, Kenneth; Goksel, Orcun

    2017-03-01

    Navigation and interpretation of ultrasound (US) images require substantial expertise, the training of which can be aided by virtual-reality simulators. However, a major challenge in creating plausible simulated US images is the generation of realistic ultrasound speckle. Since typical ultrasound speckle exhibits many properties of Markov Random Fields, it is conceivable to use texture synthesis for generating plausible US appearance. In this work, we investigate popular classes of texture synthesis methods for generating realistic US content. In a user study, we evaluate their performance for reproducing homogeneous tissue regions in B-mode US images from small image samples of similar tissue and report the best-performing synthesis methods. We further show that regression trees can be used on speckle texture features to learn a predictor for US realism.

  19. Simulation of radiofrequency ablation in real human anatomy.

    PubMed

    Zorbas, George; Samaras, Theodoros

    2014-12-01

    The objective of the current work was to simulate radiofrequency ablation treatment in computational models with realistic human anatomy, in order to investigate the effect of realistic geometry in the treatment outcome. The body sites considered in the study were liver, lung and kidney. One numerical model for each body site was obtained from Duke, member of the IT'IS Virtual Family. A spherical tumour was embedded in each model and a single electrode was inserted into the tumour. The same excitation voltage was used in all cases to underline the differences in the resulting temperature rise, due to different anatomy at each body site investigated. The same numerical calculations were performed for a two-compartment model of the tissue geometry, as well as with the use of an analytical approximation for a single tissue compartment. Radiofrequency ablation (RFA) therapy appears efficient for tumours in liver and lung, but less efficient in kidney. Moreover, the time evolution of temperature for a realistic geometry differs from that for a two-compartment model, but even more for an infinite homogeneous tissue model. However, it appears that the most critical parameters of computational models for RFA treatment planning are tissue properties rather than tissue geometry. Computational simulations of realistic anatomy models show that the conventional technique of a single electrode inside the tumour volume requires a careful choice of both the excitation voltage and treatment time in order to achieve effective treatment, since the ablation zone differs considerably for various body sites.
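The thermal side of such RFA simulations is commonly a Pennes bioheat equation; a 1-D explicit finite-difference sketch (tissue properties, perfusion rate, and source strength are rough hypothetical values, and the real problem is of course 3-D with anatomical tissue maps):

```python
RHO_C = 3.6e6      # volumetric heat capacity of tissue (J m^-3 K^-1), hypothetical
K = 0.5            # thermal conductivity (W m^-1 K^-1)
W_B = 0.004        # blood perfusion rate (s^-1), hypothetical
RHO_C_B = 3.8e6    # blood volumetric heat capacity (J m^-3 K^-1)
T_BODY = 37.0

def simulate(q_source, n=50, dx=1e-3, dt=0.05, t_end=60.0):
    """Explicit scheme for rho*c*T_t = k*T_xx - w_b*rho_c_b*(T - T_body) + q."""
    temp = [T_BODY] * n                      # boundaries held at body temperature
    for _ in range(int(t_end / dt)):
        new = temp[:]
        for i in range(1, n - 1):
            lap = (temp[i - 1] - 2 * temp[i] + temp[i + 1]) / dx ** 2
            perf = W_B * RHO_C_B * (temp[i] - T_BODY)   # perfusion heat sink
            new[i] = temp[i] + dt * (K * lap - perf + q_source[i]) / RHO_C
        temp = new
    return temp

# Hypothetical RF heat deposition concentrated near the electrode (node 5).
q = [0.0] * 50
q[5] = 5e6  # W m^-3
temp = simulate(q)
print(f"peak {max(temp):.1f} C at node {temp.index(max(temp))}")
```

The perfusion term is what the abstract's "cooling effect" of nearby vessels generalises: large vessels act as strong local heat sinks that this volumetric term cannot capture, which is one reason realistic geometry matters.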

  20. Smsynth: AN Imagery Synthesis System for Soil Moisture Retrieval

    NASA Astrophysics Data System (ADS)

    Cao, Y.; Xu, L.; Peng, J.

    2018-04-01

    Soil moisture (SM) is an important variable in various research areas, such as weather and climate forecasting, agriculture, drought and flood monitoring and prediction, and human health. An ongoing challenge in estimating SM via synthetic aperture radar (SAR) is the development of SM retrieval methods; in particular, empirical models need large numbers of measurements of SM and soil roughness parameters as training samples, and these are very difficult to acquire. It is therefore difficult to develop empirical models using real SAR imagery, and methods to synthesize SAR imagery are needed. To tackle this issue, a SAR imagery synthesis system based on SM, named SMSynth, is presented, which can simulate radar signals that are as realistic as possible with respect to real SAR imagery. In SMSynth, SAR backscatter coefficients for each soil type are simulated via the Oh model under a Bayesian framework, where spatial correlation is modeled by a Markov random field (MRF) model. The backscattering coefficients, simulated from the designed soil and sensor parameters, enter the Bayesian framework through the data likelihood; the soil and sensor parameters are set as close as possible to conditions on the ground and lie within the validity range of the Oh model. In this way, a complete and coherent Bayesian probabilistic framework is established. Experimental results show that SMSynth is capable of generating realistic SAR images that meet the need for large numbers of training samples for empirical models.
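The forward model SMSynth builds on, mapping soil permittivity and roughness to backscatter, can be sketched with the Oh et al. (1992) semi-empirical relations as they are usually quoted (the permittivity, roughness ks, and incidence angle below are made-up example inputs, and none of SMSynth's Bayesian/MRF machinery is reproduced here):

```python
import cmath
import math

def oh_model(eps_r, ks, theta_deg):
    """Oh et al. (1992) semi-empirical bare-soil backscatter model.
    eps_r: complex soil permittivity; ks: normalised rms roughness;
    theta_deg: incidence angle. Returns sigma0 (linear power) for vv, hh, hv."""
    th = math.radians(theta_deg)
    cos_t, sin2 = math.cos(th), math.sin(th) ** 2
    root = cmath.sqrt(eps_r - sin2)
    gam_h = abs((cos_t - root) / (cos_t + root)) ** 2          # Fresnel, H-pol
    gam_v = abs((eps_r * cos_t - root) / (eps_r * cos_t + root)) ** 2
    gam_0 = abs((1 - cmath.sqrt(eps_r)) / (1 + cmath.sqrt(eps_r))) ** 2
    g = 0.7 * (1 - math.exp(-0.65 * ks ** 1.8))
    p = (1 - (2 * th / math.pi) ** (1 / (3 * gam_0)) * math.exp(-ks)) ** 2
    q = 0.23 * math.sqrt(gam_0) * (1 - math.exp(-ks))
    svv = g * cos_t ** 3 / math.sqrt(p) * (gam_v + gam_h)
    return svv, p * svv, q * svv  # vv, hh, hv

svv, shh, shv = oh_model(eps_r=15 - 3j, ks=0.4, theta_deg=40.0)
for name, s in (("vv", svv), ("hh", shh), ("hv", shv)):
    print(f"sigma0_{name} = {10 * math.log10(s):6.2f} dB")
```

In SMSynth this mapping supplies the mean backscatter entering the data likelihood; the MRF prior then imposes the spatial correlation.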

  1. Dynamics and transport in the stratosphere : Simulations with a general circulation model

    NASA Astrophysics Data System (ADS)

    van Aalst, Maarten Krispijn

    2005-01-01

    The middle atmosphere is strongly affected by two of the world's most important environmental problems: global climate change and stratospheric ozone depletion, caused by anthropogenic emissions of greenhouse gases and chlorofluorocarbons (CFCs), respectively. General circulation models with coupled chemistry are a key tool to advance our understanding of the complex interplay between dynamics, chemistry and radiation in the middle atmosphere. A key problem of such models is that they generate their own meteorology, and thus cannot be used for comparisons with instantaneous measurements. This thesis presents the first application of a simple data assimilation method, Newtonian relaxation, to reproduce realistic synoptic conditions in a state-of-the-art middle atmosphere general circulation model, MA-ECHAM. By nudging the model's meteorology slightly towards analyzed observations from a weather forecasting system (ECMWF), we have simulated specific atmospheric processes during particular meteorological episodes, such as the 1999/2000 Arctic winter. The nudging technique is intended to interfere as little as possible with the model's own dynamics. In fact, we found that we could even limit the nudging to the troposphere, leaving the middle atmosphere entirely free. In that setup, the model realistically reproduced many aspects of the instantaneous meteorology of the middle atmosphere, such as the unusually early major warming and breakup of the 2002 Antarctic vortex. However, we found that this required careful interpolation of the nudging data, and a correct choice of nudging parameters. We obtained the best results when we first projected the nudging data onto the model's normal modes so that we could filter out the (spurious) fast components.
In a four-year simulation, for which we also introduced an additional nudging of the stratospheric quasi-biennial oscillation, we found that the model reproduced much of the interannual variability throughout the stratosphere, including the Antarctic temperature minima crucial for polar ozone chemistry, but failed to capture the precise timing and evolution of Arctic stratospheric warmings. We also identified an important model deficiency regarding tracer transport in the lower polar stratosphere. The success of the runs with tropospheric nudging in simulating the right stratospheric conditions, including the model capability to forecast major stratospheric warming events, bodes well for the model's representation of the dynamic coupling between the troposphere and the stratosphere, an important element of realistic simulation of the future climate of the middle atmosphere (which will partly depend on a changing wave forcing from the troposphere). However, for some aspects of stratospheric dynamics, such as the quasi-biennial oscillation, a higher vertical resolution is required, which might also help to reduce some of the transport problems identified in the lower polar vortex. The nudging technique applied and developed in this thesis offers excellent prospects for applications in coupled-chemistry simulations of the middle atmosphere, including for the interpretation of instantaneous measurements. In particular, it can be used to test and improve the new MA-ECHAM5/MESSy/MECCA coupled chemistry climate model system, in preparation for more reliable simulations of past and future climates.

  2. The Impact of Simulated Mesoscale Convective Systems on Global Precipitation: A Multiscale Modeling Study

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, Jiun-Dar

    2017-01-01

    The importance of precipitating mesoscale convective systems (MCSs) has been quantified from TRMM precipitation radar and microwave imager retrievals. MCSs generate more than 50% of the rainfall in most tropical regions. MCSs usually have horizontal scales of a few hundred kilometers (km); therefore, a large domain of several hundred km is required for realistic simulations of MCSs in cloud-resolving models (CRMs). Almost all traditional global and climate models do not have adequate parameterizations to represent MCSs. Typical multi-scale modeling frameworks (MMFs) may also lack the resolution (4 km grid spacing) and domain size (128 km) to realistically simulate MCSs. In this study, the impact of MCSs on precipitation is examined by conducting model simulations using the Goddard Cumulus Ensemble (GCE) model and Goddard MMF (GMMF). The results indicate that both models can realistically simulate MCSs with more grid points (i.e., 128 and 256) and higher resolutions (1 or 2 km) compared to those simulations with fewer grid points (i.e., 32 and 64) and lower resolution (4 km). The modeling results also show that the strengths of the Hadley circulations, mean zonal and regional vertical velocities, surface evaporation, and amount of surface rainfall are weaker or reduced in the GMMF when using more CRM grid points and higher CRM resolution. In addition, the results indicate that large-scale surface evaporation and wind feedback are key processes for determining the surface rainfall amount in the GMMF. A sensitivity test with reduced sea surface temperatures shows both reduced surface rainfall and evaporation.

  3. The impact of simulated mesoscale convective systems on global precipitation: A multiscale modeling study

    NASA Astrophysics Data System (ADS)

    Tao, Wei-Kuo; Chern, Jiun-Dar

    2017-06-01

    The importance of precipitating mesoscale convective systems (MCSs) has been quantified from TRMM precipitation radar and microwave imager retrievals. MCSs generate more than 50% of the rainfall in most tropical regions. MCSs usually have horizontal scales of a few hundred kilometers (km); therefore, a large domain of several hundred km is required for realistic simulations of MCSs in cloud-resolving models (CRMs). Almost all traditional global and climate models do not have adequate parameterizations to represent MCSs. Typical multiscale modeling frameworks (MMFs) may also lack the resolution (4 km grid spacing) and domain size (128 km) to realistically simulate MCSs. The impact of MCSs on precipitation is examined by conducting model simulations using the Goddard Cumulus Ensemble (GCE, a CRM) model and the Goddard MMF, which uses the GCEs as its embedded CRMs. Both models can realistically simulate MCSs with more grid points (i.e., 128 and 256) and higher resolutions (1 or 2 km) compared to those simulations with fewer grid points (i.e., 32 and 64) and lower resolution (4 km). The modeling results also show that the strengths of the Hadley circulations, mean zonal and regional vertical velocities, surface evaporation, and amount of surface rainfall are weaker or reduced in the Goddard MMF when using more CRM grid points and higher CRM resolution. In addition, the results indicate that large-scale surface evaporation and wind feedback are key processes for determining the surface rainfall amount in the Goddard MMF. A sensitivity test with reduced sea surface temperatures shows both reduced surface rainfall and evaporation.

  4. Continuous Sub-daily Rainfall Simulation for Regional Flood Risk Assessment - Modelling of Spatio-temporal Correlation Structure of Extreme Precipitation in the Austrian Alps

    NASA Astrophysics Data System (ADS)

    Salinas, J. L.; Nester, T.; Komma, J.; Bloeschl, G.

    2017-12-01

    Generation of realistic synthetic spatial rainfall is of pivotal importance for assessing regional hydroclimatic hazard, as the input for long-term rainfall-runoff simulations. Correct reproduction of observed rainfall characteristics, such as regional intensity-duration-frequency (IDF) curves and spatial and temporal correlations, is necessary to adequately model the magnitude and frequency of flood peaks, by reproducing antecedent soil moisture conditions before extreme rainfall events and the joint probability of flood waves at confluences. In this work, we present a modification of the model of Bardossy and Plate (1992), in which precipitation is first modeled on a station basis as a multivariate autoregressive (mAr) process in a Normal space. The spatial and temporal correlation structures are imposed in the Normal space, allowing a different temporal autocorrelation parameter for each station while simultaneously ensuring the positive-definiteness of the correlation matrix of the mAr errors. The Normal rainfall is then transformed to a Gamma-distributed space, with parameters varying monthly according to a sinusoidal function, in order to capture the observed rainfall seasonality. One of the main differences from the original model is the simulation time step, reduced from 24 h to 6 h. Because daily rainfall data are more widely available than sub-daily (e.g. hourly) data, the parameters of the Gamma distributions are calibrated to reproduce simultaneously a series of daily rainfall characteristics (mean daily rainfall, standard deviation of daily rainfall, and 24 h IDF curves), as well as other aggregated rainfall measures (mean annual rainfall and monthly rainfall). The spatial and temporal correlation parameters are calibrated so that the catchment-averaged IDF curves aggregated at different temporal scales fit the measured ones.
The rainfall model is used to generate 10,000 years of synthetic precipitation, which is fed into a rainfall-runoff model to derive flood frequency in the Tirolean Alps in Austria. Given the number of generated events, the simulation framework is able to produce a large variety of rainfall patterns and to reproduce the variograms of relevant extreme rainfall events in the region of interest.
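    The generation core described above, a station-wise multivariate AR(1) process in Normal space transformed to Gamma marginals, can be sketched as follows; the correlation matrix, autoregression coefficients and Gamma parameters are illustrative stand-ins, not the calibrated Alpine values, and the sinusoidal seasonal variation is omitted:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_stations, n_steps = 3, 2000          # 6-hourly steps, illustrative
S = np.array([[1.0, 0.6, 0.4],         # target spatial correlation (Normal space)
              [0.6, 1.0, 0.5],
              [0.4, 0.5, 1.0]])
phi = np.array([0.55, 0.60, 0.50])     # per-station temporal autocorrelation
Phi = np.diag(phi)

# Innovation covariance chosen so the stationary covariance of z equals S:
#   S = Phi S Phi^T + Sigma_e   (Sigma_e must stay positive definite,
# which is the constraint the abstract highlights)
Sigma_e = S - Phi @ S @ Phi.T
L = np.linalg.cholesky(Sigma_e)        # fails if Sigma_e is not positive definite

z = np.zeros((n_steps, n_stations))
for t in range(1, n_steps):
    z[t] = Phi @ z[t - 1] + L @ rng.standard_normal(n_stations)

# Map Normal marginals to Gamma-distributed 6 h rainfall depths
u = stats.norm.cdf(z)
rain = stats.gamma.ppf(u, a=0.7, scale=3.0)   # shape/scale are illustrative
```

    The Cholesky factorization fails exactly when the implied innovation covariance is not positive definite, which is why the positive-definiteness condition mentioned in the abstract has to be enforced by construction.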

  5. A location-based multiple point statistics method: modelling the reservoir with non-stationary characteristics

    NASA Astrophysics Data System (ADS)

    Yin, Yanshu; Feng, Wenjie

    2017-12-01

    In this paper, a location-based multiple point statistics method is developed to model a non-stationary reservoir. The proposed method characterizes the relationship between the sedimentary pattern and the deposit location using the relative central position distance function, which alleviates the requirement that the training image and the simulated grids have the same dimension. The weights in every direction of the distance function can be changed to characterize the reservoir heterogeneity in various directions. The local integral replacements of data events, structured random path, distance tolerance and multi-grid strategy are applied to reproduce the sedimentary patterns and obtain a more realistic result. This method is compared with the traditional Snesim method using a synthesized 3-D training image of Poyang Lake and a reservoir model of Shengli Oilfield in China. The results indicate that the new method can reproduce the non-stationary characteristics better than the traditional method and is more suitable for simulation of delta-front deposits. These results show that the new method is a powerful tool for modelling a reservoir with non-stationary characteristics.

  6. Combining Computational Fluid Dynamics and Agent-Based Modeling: A New Approach to Evacuation Planning

    PubMed Central

    Epstein, Joshua M.; Pankajakshan, Ramesh; Hammond, Ross A.

    2011-01-01

    We introduce a novel hybrid of two fields—Computational Fluid Dynamics (CFD) and Agent-Based Modeling (ABM)—as a powerful new technique for urban evacuation planning. CFD is a predominant technique for modeling airborne transport of contaminants, while ABM is a powerful approach for modeling social dynamics in populations of adaptive individuals. The hybrid CFD-ABM method is capable of simulating how large, spatially-distributed populations might respond to a physically realistic contaminant plume. We demonstrate the overall feasibility of CFD-ABM evacuation design, using the case of a hypothetical aerosol release in Los Angeles to explore potential effectiveness of various policy regimes. We conclude by arguing that this new approach can be powerfully applied to arbitrary population centers, offering an unprecedented preparedness and catastrophic event response tool. PMID:21687788

  7. Non-standard interactions and neutrinos from dark matter annihilation in the Sun

    NASA Astrophysics Data System (ADS)

    Demidov, S. V.

    2018-02-01

    We analyze the influence of non-standard neutrino interactions (NSI) on the neutrino signal from dark matter annihilations in the Sun. Taking experimentally allowed benchmark values for the matter NSI parameters, we show that the evolution of such neutrinos with energies at the GeV scale can be considerably modified. We simulate the propagation of neutrinos from the Sun to the Earth for realistic dark matter annihilation channels and find that matter NSI can result in at most a 30% correction to the signal rate of muon track events at neutrino telescopes. Still, present experimental bounds on dark matter from these searches are robust in the presence of NSI within a considerable part of their allowed parameter space. At the same time, the electron neutrino flux from dark matter annihilation in the Sun can change by a factor of a few.

  8. Effects of elevated CO2, warming and summer drought on the carbon balance in a Danish heathland after seven treatment years - results from the CLIMAITE project

    NASA Astrophysics Data System (ADS)

    Steenberg Larsen, Klaus; Ambus, Per; Beier, Claus; Ibrom, Andreas; Ransijn, Johannes; Kappel Schmidt, Inger; Wu, Jian

    2013-04-01

    In a Danish heathland co-dominated by heather (Calluna vulgaris) and grasses (Deschampsia flexuosa), we simulated realistic future climate scenarios in a full-factorial design of elevated atmospheric CO2 (510 ppm), increased temperatures (0.5-1.5 °C) and intensified summer drought events (4-6 weeks per year). Treatments were initiated in 2005. Using manual chamber techniques, we measured soil respiration (SR), ecosystem respiration (ER) and net ecosystem exchange of CO2 (NEE), and determined gross ecosystem photosynthesis (GEP) as NEE - ER. We also monitored carbon losses in the form of dissolved organic carbon (DOC) in leached soil water. The results indicate that across all combinations of treatments with elevated CO2, SR rates increased by 20-30%, whereas GEP rates increased by

  9. Macroscopic response to microscopic intrinsic noise in three-dimensional Fisher fronts.

    PubMed

    Nesic, S; Cuerno, R; Moro, E

    2014-10-31

    We study the dynamics of three-dimensional Fisher fronts in the presence of density fluctuations. To this end we simulate the Fisher equation subject to stochastic internal noise, and study how the front moves and roughens as a function of the number of particles in the system, N. Our results suggest that the macroscopic behavior of the system is driven by the microscopic dynamics at its leading edge, where number fluctuations are dominated by rare events. Contrary to naive expectations, the strength of front fluctuations decays extremely slowly, as 1/log N, inducing large-scale fluctuations which we find belong to the one-dimensional Kardar-Parisi-Zhang universality class of kinetically rough interfaces. Hence, we find that there is no weak-noise regime for Fisher fronts, even for realistic numbers of particles in macroscopic systems.
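    A minimal 1D Euler-Maruyama sketch of a Fisher front with internal noise of strength sqrt(u(1-u)/N) illustrates the setup; this is a reduced-dimension toy (the paper studies 3D fronts), and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D stochastic Fisher-KPP front with internal (number) fluctuations:
#   du = [D u_xx + u(1-u)] dt + sqrt(u(1-u)/N) dW
D, N = 1.0, 1e4            # diffusion constant, particles per site (assumed)
dx, dt = 0.5, 0.01
n_sites, steps = 400, 4000

u = np.zeros(n_sites)
u[:20] = 1.0               # step initial condition: invaded region on the left

def front_position(u, level=0.5):
    return dx * np.argmax(u < level)   # first site below the threshold

x0 = front_position(u)
for _ in range(steps):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2
    lap[0] = lap[-1] = 0.0             # crude boundary handling at the ends
    growth = u * (1.0 - u)
    noise = np.sqrt(np.maximum(growth, 0.0) / N * dt) * rng.standard_normal(n_sites)
    u = np.clip(u + dt * (D * lap + growth) + noise, 0.0, 1.0)
x1 = front_position(u)
```

    For large N the front advances at roughly the deterministic Fisher speed 2*sqrt(D); the paper's point is how slowly the noise-induced corrections and front roughening decay as N grows.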

  10. Development of an OSSE Framework for a Global Atmospheric Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald; Errico, Ronald M.; Prive, N.

    2012-01-01

    Observing system simulation experiments (OSSEs) are powerful tools for estimating the usefulness of various configurations of envisioned observing systems and data assimilation techniques. Their utility stems from their being conducted in an entirely simulated context, utilizing simulated observations having simulated errors and drawn from a simulation of the earth's environment. Observations are generated by applying physically based algorithms to the simulated state, such as performed during data assimilation, or using other appropriate algorithms. Adding realistic instrument plus representativeness errors, including their biases and correlations, can be critical for obtaining realistic assessments of the impact of a proposed observing system or analysis technique. If estimates of the expected accuracy of proposed observations are realistic, then the OSSE can also be used to learn how best to utilize the new information, accelerating its transition to operations once the real data are available. As with any inferences from simulations, however, it is first imperative that some baseline OSSEs are performed and well validated against corresponding results obtained with a real observing system. This talk provides an overview of, and highlights critical issues related to, the development of an OSSE framework for the tropospheric weather prediction component of the NASA GEOS-5 global atmospheric data assimilation system. The framework includes all existing observations having significant impact on short-term forecast skill. Its validity has been carefully assessed using a range of metrics that can be evaluated in both the OSSE and real contexts, including adjoint-based estimates of observation impact. A preliminary application to the Aeolus Doppler wind lidar mission, scheduled for launch by the European Space Agency in 2014, has also been investigated.

  11. The first super geomagnetic storm of solar cycle 24: "The St. Patrick's Day (17 March 2015)" event

    NASA Astrophysics Data System (ADS)

    Wu, C. C.; Liou, K.; Socker, D. G.; Howard, R.; Jackson, B. V.; Yu, H. S.; Hutting, L.; Plunkett, S. P.

    2015-12-01

    The first super geomagnetic storm of solar cycle 24 occurred on St. Patrick's Day (17 March 2015). Notably, it was a two-step storm. The source of the storm can be traced back to the solar event of 15 March 2015. At ~2:10 UT on that day, SOHO/LASCO C3 recorded a partial halo coronal mass ejection (CME) associated with a C9.1/1F flare (S22W25) and a series of type II/IV radio bursts. The propagation speed of this CME is estimated to be ~668 km/s during 02:10 - 06:20 UT (Figure 1). An interplanetary (IP) shock, likely driven by the CME, arrived at the Wind spacecraft at 03:59 UT on 17 March (Figure 2). The arrival of the IP shock at the Earth may have caused a storm sudden commencement (SSC) at 04:45 UT on 17 March. The storm intensified (Dst dropped to -80 nT at ~10:00 UT) during the crossing of the CME sheath. Later, the storm recovered slightly (Dst ~ -50 nT) after the IMF turned northward. At 11:01 UT, the IMF started turning southward again due to the field of the large magnetic cloud (MC) itself and caused the second storm intensification, reaching Dst = -228 nT on 18 March. We conclude that the St. Patrick's Day event is a two-step storm: the first step is associated with the sheath, whereas the second step is associated with the MC. Here, we employ a numerical simulation using the global, three-dimensional (3D), time-dependent, magnetohydrodynamic (MHD) model H3DMHD (Wu et al. 2007) to study the CME propagation from the Sun to the Earth. The H3DMHD model has been modified so that it can be driven by solar wind data at the inner boundary of the computational domain. In this study, we use time-varying 3D solar wind velocity and density reconstructed from STELab (Japan) interplanetary scintillation (IPS) data by the University of California, San Diego, and magnetic field at the IPS inner boundary provided by CSSS model closed-loop propagation (Jackson et al., 2015).
The simulation result matches the in situ solar wind plasma and field data at Wind well, in terms of the peak values of the IP shock and its arrival time (Figure 3). The simulation not only helps us identify the driver of the IP shock, but also demonstrates that the modified H3DMHD model is capable of realistic simulations of large solar events. In this presentation, we will discuss the CME/storm event with detailed data from observations (Wind and SOHO) and our numerical simulation.

  12. Impact of the spatial distribution of the atmospheric forcing on water mass formation in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Béranger, Karine; Drillet, Yann; Houssais, Marie-Noëlle; Testor, Pierre; Bourdallé-Badie, Romain; Alhammoud, Bahjat; Bozec, Alexandra; Mortier, Laurent; Bouruet-Aubertot, Pascale; Crépon, Michel

    2010-12-01

    The impact of the atmospheric forcing on the winter ocean convection in the Mediterranean Sea was studied with a high-resolution ocean general circulation model. The major areas of focus are the Levantine basin, the Aegean-Cretan Sea, the Adriatic Sea, and the Gulf of Lion. Two companion simulations differing by the horizontal resolution of the atmospheric forcing were compared. The first simulation (MED16-ERA40) was forced by air-sea fields from ERA40, which is the ECMWF reanalysis. The second simulation (MED16-ECMWF) was forced by the ECMWF-analyzed surface fields that have a horizontal resolution twice as high as those of ERA40. The analysis of the standard deviations of the atmospheric fields shows that increasing the resolution of the atmospheric forcing leads in all regions to a better channeling of the winds by mountains and to the generation of atmospheric mesoscale patterns. Comparing the companion ocean simulation results with available observations in the Adriatic Sea and in the Gulf of Lion shows that MED16-ECMWF is more realistic than MED16-ERA40. In the eastern Mediterranean, although deep water formation occurs in the two experiments, the depth reached by the convection is deeper in MED16-ECMWF. In the Gulf of Lion, deep water formation occurs only in MED16-ECMWF. This larger sensitivity of the western Mediterranean convection to the forcing resolution is investigated by running a set of sensitivity experiments to analyze the impact of different time-space resolutions of the forcing on the intense winter convection event in winter 1998-1999. The sensitivity to the forcing appears to be mainly related to the effect of wind channeling by the land orography, which can only be reproduced in atmospheric models of sufficient resolution. 
Thus, well-positioned patterns of enhanced wind stress and ocean surface heat loss are able to maintain a vigorous gyre circulation favoring efficient preconditioning of the area at the beginning of winter and to drive realistic buoyancy loss and mixing responsible for strong convection at the end of winter.

  13. Traffic flow simulation for an urban freeway corridor

    DOT National Transportation Integrated Search

    1998-01-01

    The objective of this paper is to develop a realistic and operational macroscopic traffic flow simulation model that requires relatively little data collection effort. Such a model should be capable of delineating the dynamics of traffic flow created...

  14. Astronauts Grissom and Young in Gemini Mission Simulator

    NASA Image and Video Library

    1964-05-22

    S64-25295 (March 1964) --- Astronauts Virgil I. (Gus) Grissom (right) and John W. Young, prime crew for the first manned Gemini mission (GT-3), are shown inside a Gemini mission simulator at McDonnell Aircraft Corp., St. Louis, MO. The simulator will provide Gemini astronauts and ground crews with realistic mission simulation during intensive training prior to actual launch.

  15. Realistic Simulations of Coronagraphic Observations with WFIRST

    NASA Astrophysics Data System (ADS)

    Rizzo, Maxime; Zimmerman, Neil; Roberge, Aki; Lincowski, Andrew; Arney, Giada; Stark, Chris; Jansen, Tiffany; Turnbull, Margaret; WFIRST Science Investigation Team (Turnbull)

    2018-01-01

    We present a framework to simulate observing scenarios with the WFIRST Coronagraphic Instrument (CGI). The Coronagraph and Rapid Imaging Spectrograph in Python (crispy) is an open-source package that can be used to create CGI data products for analysis and development of post-processing routines. The software convolves time-varying coronagraphic PSFs with realistic astrophysical scenes which contain a planetary architecture, a consistent dust structure, and a background field composed of stars and galaxies. The focal plane can be read out by a WFIRST electron-multiplying CCD model directly, or passed through a WFIRST integral field spectrograph model first. Several elementary post-processing routines are provided as part of the package.

  16. Realistic page-turning of electronic books

    NASA Astrophysics Data System (ADS)

    Fan, Chaoran; Li, Haisheng; Bai, Yannan

    2014-01-01

    Electronic books (e-books), booming as an extension of the paper book, are popular with readers. Recently, many efforts have been put into realistic page-turning simulation for e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach that employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between the time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto cylinders with different radii and positions. Compared to previous approaches, our method imitates various effects efficiently and obtains a more natural animation of the turning page.
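    The elementary operation behind such approaches, wrapping the moving part of a flat page onto a cylinder while preserving arc length so that the paper bends without stretching, can be sketched as follows; the fold position x0 and radius r are illustrative, and the paper's method additionally makes the cylinders piecewise, time-dependent and smoothly transitioning:

```python
import numpy as np

def wrap_page(points, x0=4.0, r=2.0):
    """Wrap the part of a flat page with x > x0 onto a cylinder of radius r.

    points: (n, 2) array of flat page coordinates (x, y), spine along x = 0.
    Returns an (n, 3) array (X, Y, Z). Arc length along the sheet is preserved,
    so the wrap angle is the flat distance past the fold divided by r.
    """
    pts = np.asarray(points, dtype=float)
    X, Y = pts[:, 0].copy(), pts[:, 1]
    Z = np.zeros_like(X)
    curl = X > x0
    a = (X[curl] - x0) / r             # wrap angle from preserved arc length
    X[curl] = x0 + r * np.sin(a)       # horizontal position on the cylinder
    Z[curl] = r * (1.0 - np.cos(a))    # lift of the curled part off the page plane
    return np.column_stack([X, Y, Z])
```

    Animating the turn then amounts to varying x0 and r over time, which is where the time-dependent cylinders and smooth transitions of the proposed method come in.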

  17. Virtual reality welder training

    NASA Astrophysics Data System (ADS)

    White, Steven A.; Reiners, Dirk; Prachyabrued, Mores; Borst, Christoph W.; Chambers, Terrence L.

    2010-01-01

    This document describes the Virtual Reality Simulated MIG Lab (sMIG), a system for virtual reality welder training. It is designed to reproduce the experience of metal inert gas (MIG) welding faithfully enough to be used as a teaching tool for beginning welding students. To make the experience as realistic as possible, it employs physically accurate, tracked input devices, a real-time welding simulation, real-time sound generation and a 3D display for output. Because it is a fully digital system, it can go beyond providing a realistic welding experience by giving the student interactive and immediate feedback, helping to avoid learning wrong movements from day one.

  18. Robust mode space approach for atomistic modeling of realistically large nanowire transistors

    NASA Astrophysics Data System (ADS)

    Huang, Jun Z.; Ilatikhameneh, Hesameddin; Povolotskyi, Michael; Klimeck, Gerhard

    2018-01-01

    Nanoelectronic transistors have reached 3D length scales in which the number of atoms is countable. Truly atomistic device representations are needed to capture the essential functionalities of the devices. Atomistic quantum transport simulations of realistically extended devices are, however, computationally very demanding. The widely used mode space (MS) approach can significantly reduce the numerical cost, but a good MS basis is usually very hard to obtain for atomistic full-band models. In this work, a robust and parallel algorithm is developed to optimize the MS basis for atomistic nanowires. This enables engineering-level, reliable tight binding non-equilibrium Green's function simulation of nanowire metal-oxide-semiconductor field-effect transistor (MOSFET) with a realistic cross section of 10 nm × 10 nm using a small computer cluster. This approach is applied to compare the performance of InGaAs and Si nanowire n-type MOSFETs (nMOSFETs) with various channel lengths and cross sections. Simulation results with full-band accuracy indicate that InGaAs nanowire nMOSFETs have no drive current advantage over their Si counterparts for cross sections up to about 10 nm × 10 nm.

  19. Simulations of Madden-Julian Oscillation in High Resolution Atmospheric General Circulation Model

    NASA Astrophysics Data System (ADS)

    Deng, Liping; Stenchikov, Georgiy; McCabe, Matthew; Bangalath, HamzaKunhu; Raj, Jerry; Osipov, Sergey

    2014-05-01

    The simulation of tropical signals, especially the Madden-Julian Oscillation (MJO), is one of the major deficiencies in current numerical models. The unrealistic features in MJO simulations include weak amplitude, excess power at higher frequencies, displacement of the temporal and spatial distributions, eastward propagation that is too fast, and a lack of coherent structure in the eastward propagation from the Indian Ocean to the Pacific (e.g., Slingo et al. 1996). While some improvement in simulating MJO variance and coherent eastward propagation has been attributed to model physics, the model mean background state and air-sea interaction, studies have shown that model resolution, especially higher horizontal resolution, may play an important role in producing a more realistic simulation of the MJO (e.g., Sperber et al. 2005). In this study, we employ unique high-resolution (25-km) simulations conducted using the Geophysical Fluid Dynamics Laboratory global High Resolution Atmospheric Model (HIRAM) to evaluate the MJO simulation against the European Centre for Medium-Range Weather Forecasts (ECMWF) Interim re-analysis (ERAI) dataset. We specifically focus on the ability of the model to represent the MJO-related amplitude, spatial distribution, eastward propagation, and horizontal and vertical structures. Additionally, as the HIRAM output covers not only a historic period (1979-2012) but also a future period (2012-2050), the impact of future climate change on the MJO is illustrated. The possible changes in intensity and frequency of extreme weather and climate events (e.g., strong wind and heavy rainfall) in the western Pacific, the Indian Ocean and the Middle East North Africa (MENA) region are highlighted.

  20. QERx- A Faster than Real-Time Emulator for Space Processors

    NASA Astrophysics Data System (ADS)

    Carvalho, B.; Pidgeon, A.; Robinson, P.

    2012-08-01

    Developing software for space systems is challenging, especially because, in order to be sure it can cope with the harshness of the environment and the imperative requirements and constraints imposed by the platform where it will run, it needs to be tested exhaustively. Software Validation Facilities (SVFs) are well known to industry and developers, and provide the means to run the On-Board Software (OBSW) in a realistic environment, allowing the development team to debug and test the software. But the challenge is to keep up with the performance of the new processors (LEON2 and LEON3), which need to be emulated within the SVF. Such processor emulators are also used in operational simulators, which support mission preparation and train mission operators. These simulators mimic the satellite and its behaviour as realistically as possible. For test/operational efficiency reasons, and because they will need to interact with external systems, both of these use cases require the processor emulators to provide real-time, or faster, performance. It is known to the industry that the performance of previously available emulators cannot cope with the performance of the new processors available on the market. SciSys approached this problem with dynamic translation technology, trying to keep costs down by avoiding a hardware solution and keeping the integration flexibility of full software emulation. SciSys presented "QERx: A High Performance Emulator for Software Validation and Simulations" [1] at a previous DASIA event. Since then the idea has evolved and QERx has been successfully validated. SciSys is now presenting QERx as a product that can be tailored to fit different emulation needs. This paper will present QERx's latest developments and current status.

  1. The effect of regularization in motion compensated PET image reconstruction: a realistic numerical 4D simulation study.

    PubMed

    Tsoumpas, C; Polycarpou, I; Thielemans, K; Buerger, C; King, A P; Schaeffter, T; Marsden, P K

    2013-03-21

    Following continuous improvement in PET spatial resolution, respiratory motion correction has become an important task. Two of the most common approaches that utilize all detected PET events to motion-correct PET data are the reconstruct-transform-average method (RTA) and motion-compensated image reconstruction (MCIR). In RTA, separate images are reconstructed for each respiratory frame, subsequently transformed to one reference frame and finally averaged to produce a motion-corrected image. In MCIR, the projection data from all frames are reconstructed by including motion information in the system matrix, so that a motion-corrected image is reconstructed directly. Previous theoretical analyses have explained why MCIR is expected to outperform RTA: it has been suggested that MCIR creates less noise than RTA because the images for each separate respiratory frame will be severely affected by noise. However, recent investigations have shown that in the unregularized case RTA images can have fewer noise artefacts, while MCIR images are more quantitatively accurate but exhibit the characteristic salt-and-pepper noise. In this paper, we perform a realistic numerical 4D simulation study to compare the advantages gained by including regularization within reconstruction for RTA and MCIR, in particular using the median root prior incorporated in the ordered-subsets maximum a posteriori one-step-late algorithm. We demonstrate that MCIR with proper regularization parameters reconstructs lesions with lower bias and root mean square error, and with CNR and standard deviation similar to regularized RTA. This finding is reproducible for a variety of noise levels (25, 50, 100 million counts), lesion sizes (8 mm, 14 mm diameter) and iterations. Nevertheless, regularized RTA can also be a practical solution for motion compensation, as a proper level of regularization reduces both bias and mean square error.
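    The structural difference between the two pipelines can be illustrated with a toy linear model in which the motion operators are known shifts; this sketches only the RTA-versus-MCIR mechanics (no PET physics, no regularization, and plain least squares instead of the OS-MAP-OSL algorithm used in the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_frames = 32, 4

x_true = np.zeros(n); x_true[10:14] = 1.0           # toy 1D "lesion" image
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # toy linear system model
shifts = [0, 2, 4, 6]                               # assumed known respiratory motion
M = [np.roll(np.eye(n), s, axis=0) for s in shifts] # motion warp operators

y = [A @ Mk @ x_true for Mk in M]                   # one data set per gate (noise-free)

# RTA: reconstruct each respiratory frame, warp back to reference, average
rta = np.mean([np.linalg.inv(Mk) @ np.linalg.solve(A, yk)
               for Mk, yk in zip(M, y)], axis=0)

# MCIR: motion operators folded into the system matrix, one joint reconstruction
A_stack = np.vstack([A @ Mk for Mk in M])
y_stack = np.concatenate(y)
mcir, *_ = np.linalg.lstsq(A_stack, y_stack, rcond=None)
```

    In the noise-free case both pipelines recover the image; the paper's question is how their noise and bias behave once Poisson counting noise and regularization enter, where each RTA frame sees only a fraction of the events while MCIR fits all events at once.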

  2. Learning English with "The Sims": Exploiting Authentic Computer Simulation Games for L2 Learning

    ERIC Educational Resources Information Center

    Ranalli, Jim

    2008-01-01

    With their realistic animation, complex scenarios and impressive interactivity, computer simulation games might be able to provide context-rich, cognitively engaging virtual environments for language learning. However, simulation games designed for L2 learners are in short supply. As an alternative, could games designed for the mass-market be…

  3. Nucleic acids: theory and computer simulation, Y2K.

    PubMed

    Beveridge, D L; McConnell, K J

    2000-04-01

    Molecular dynamics simulations on DNA and RNA that include solvent are now being performed under realistic environmental conditions of water activity and salt. Improvements to force-fields and treatments of long-range interactions have significantly increased the reliability of simulations. New studies of sequence effects, axis bending, solvation and conformational transitions have appeared.

  4. Simulating tracer transport in variably saturated soils and shallow groundwater

    USDA-ARS?s Scientific Manuscript database

    The objective of this study was to develop a realistic model to simulate the complex processes of flow and tracer transport in variably saturated soils and to compare simulation results with the detailed monitoring observations. The USDA-ARS OPE3 field site was selected for the case study due to ava...

  5. Performance Evaluation Gravity Probe B Design

    NASA Technical Reports Server (NTRS)

    Francis, Ronnie; Wells, Eugene M.

    1996-01-01

    This final report documents the work done to develop a 6 degree-of-freedom simulation of the Lockheed Martin Gravity Probe B (GPB) Spacecraft. This simulation includes the effects of vehicle flexibility and propellant slosh. The simulation was used to investigate the control performance of the spacecraft when subjected to realistic on orbit disturbances.

  6. Realistic simulated MRI and SPECT databases. Application to SPECT/MRI registration evaluation.

    PubMed

    Aubert-Broche, Berengere; Grova, Christophe; Reilhac, Anthonin; Evans, Alan C; Collins, D Louis

    2006-01-01

This paper describes the construction of simulated SPECT and MRI databases that account for realistic anatomical and functional variability. The data are used as a gold standard to evaluate four SPECT/MRI similarity-based registration methods. Simulation realism was ensured by using accurate physical models of data generation and acquisition. MRI and SPECT simulations were generated from three subjects to take into account inter-subject anatomical variability. Functional SPECT data were computed from six functional models of brain perfusion. Previous models of normal perfusion and of ictal perfusion observed in Mesial Temporal Lobe Epilepsy (MTLE) were considered to generate functional variability. We studied the impact that noise and intensity non-uniformity in the MRI simulations, and scatter correction in SPECT, may have on registration accuracy. We quantified the amount of registration error caused by anatomical and functional variability. Registration involving ictal data was less accurate than registration involving normal data. MR intensity non-uniformity was the main factor decreasing registration accuracy. The proposed simulated database is a promising tool for evaluating many functional neuroimaging methods involving MRI and SPECT data.

  7. An Effective Construction Method of Modular Manipulator 3D Virtual Simulation Platform

    NASA Astrophysics Data System (ADS)

    Li, Xianhua; Lv, Lei; Sheng, Rui; Sun, Qing; Zhang, Leigang

    2018-06-01

This work discusses a fast and efficient method for constructing an open 3D manipulator virtual simulation platform that makes it easier for teachers and students to learn the forward and inverse kinematics of a robot manipulator. The method was implemented in MATLAB, in which the Robotics Toolbox, the MATLAB GUI and 3D animation, together with modelling in SolidWorks, were combined to produce a good visualization of the system. The advantages of this rapid-construction approach are its powerful input and output facilities and its ability to simulate a 3D manipulator realistically. In this article, a Schunk six-DOF modular manipulator assembled by the authors' research group is used as an example. The implementation steps of the method are described in detail, and a high-level, open and realistic 3D virtual simulation platform for the manipulator was thereby achieved. The graphs obtained from simulation show that the manipulator 3D virtual simulation platform can be constructed quickly, offers good usability and high maneuverability, and can meet the needs of scientific research and teaching.

  8. Pile-Up Discrimination Algorithms for the HOLMES Experiment

    NASA Astrophysics Data System (ADS)

    Ferri, E.; Alpert, B.; Bennett, D.; Faverzani, M.; Fowler, J.; Giachero, A.; Hays-Wehle, J.; Maino, M.; Nucciotti, A.; Puiu, A.; Ullom, J.

    2016-07-01

The HOLMES experiment is a new large-scale experiment for electron neutrino mass determination by means of the electron capture decay of ^{163}Ho. In such an experiment, random coincidence events are one of the main sources of background which impair the ability to identify the effect of a non-vanishing neutrino mass. In order to resolve these spurious events, detectors characterized by a fast response are needed, as well as pile-up recognition algorithms. For that reason, we have developed a code for testing the discrimination efficiency of various algorithms in recognizing pile-up events as a function of the time separation between two pulses. The tests are performed on simulated, realistic TES signals and noise. The pulse profile is obtained by solving the two coupled differential equations which describe the response of the TES according to the Irwin-Hilton model. To these pulses, a noise waveform which takes into account all the noise sources regularly present in a real TES is added. The amplitude of the generated pulses is distributed as the ^{163}Ho calorimetric spectrum. Furthermore, the rise time of these pulses has been chosen taking into account the constraints given by both the bandwidth of the microwave multiplexing readout with flux ramp demodulation and the bandwidth of the ADC boards currently available for ROACH2. Among the different rejection techniques evaluated, the Wiener filter technique, a digital filter to gain time resolution, has shown an excellent pile-up rejection efficiency. The obtained time resolution closely matches the baseline specifications of the HOLMES experiment. We report here a description of our simulation code and a comparison of the different rejection techniques.
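A frequency-domain Wiener-style filter of the kind evaluated for pile-up discrimination can be sketched as follows. The TES-like pulse shape, the flat noise spectrum and the pulse separation are hypothetical stand-ins, not the HOLMES detector model:

```python
import numpy as np

def wiener_template(pulse_fft, noise_psd):
    """Frequency-domain Wiener-style filter built from the expected pulse
    spectrum and the noise power spectral density."""
    return np.conj(pulse_fft) / (np.abs(pulse_fft) ** 2 + noise_psd)

# Hypothetical TES-like pulse: fast rise, slower exponential decay.
t = np.arange(1024)
pulse = (1.0 - np.exp(-t / 5.0)) * np.exp(-t / 100.0)
H = wiener_template(np.fft.rfft(pulse), 0.01)   # flat (white) noise PSD

# A piled-up trace: two identical pulses only 30 samples apart. Filtering
# sharpens the arrival times so the two events can be told apart.
trace = pulse + np.roll(pulse, 30)
filtered = np.fft.irfft(np.fft.rfft(trace) * H, n=1024)
print(int(np.argmax(filtered)))  # peaks at the arrival times (0 or 30)
```

The filter output concentrates each pulse's energy near its arrival time, which is what gives the improved time resolution for resolving close pile-up.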

  9. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    PubMed Central

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
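The intersection matrix at the heart of ASSET, whose entry (i, j) counts the neurons active in both time bins i and j, reduces to a single matrix product on binned spike data. The toy spike trains below are invented for illustration:

```python
import numpy as np

def intersection_matrix(binned):
    """binned: (n_neurons, n_bins) boolean array of spiking activity.
    Entry (i, j) counts neurons active in both bin i and bin j."""
    b = binned.astype(int)
    return b.T @ b

# Hypothetical toy data: a 3-neuron pattern appearing in bins 0-1 and 3-4.
spikes = np.array([[1, 0, 0, 1, 0],
                   [0, 1, 0, 0, 1],
                   [1, 1, 0, 1, 1]], dtype=bool)
M = intersection_matrix(spikes)
print(M[0, 3], M[1, 4])  # → 2 2
```

A repeated SSE shows up as a diagonal stripe of high values off the main diagonal (here the entries (0, 3) and (1, 4)); ASSET's contribution is the statistical machinery that finds such stripes automatically.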

  10. Accuracy assessment of high-rate GPS measurements for seismology

    NASA Astrophysics Data System (ADS)

    Elosegui, P.; Davis, J. L.; Ekström, G.

    2007-12-01

Analysis of GPS measurements with a controlled laboratory system, built to simulate the ground motions caused by tectonic earthquakes and other transient geophysical signals such as glacial earthquakes, enables us to assess the technique of high-rate GPS. The root-mean-square (rms) position error of this system when undergoing realistic simulated seismic motions is 0.05 mm, with maximum position errors of 0.1 mm, thus providing "ground truth" GPS displacements. We have acquired an extensive set of high-rate GPS measurements while inducing seismic motions on a GPS antenna mounted on this system with a temporal spectrum similar to real seismic events. We found that, for a particular 15-min-long test event, the rms error of the 1-Hz GPS position estimates was 2.5 mm, with maximum position errors of 10 mm, and the error spectrum of the GPS estimates was approximately flicker noise. These results may however represent a best-case scenario since they were obtained over a short (~10 m) baseline, thereby greatly mitigating baseline-dependent errors, and when the number and distribution of satellites in the sky was good. For example, we have determined that the rms error can increase by a factor of 2-3 as the GPS constellation changes throughout the day, with an average value of 3.5 mm for eight identical, hourly-spaced, consecutive test events. The rms error also increases with increasing baseline, as one would expect, with an average rms error for a ~1400 km baseline of 9 mm. We will present an assessment of the accuracy of high-rate GPS based on these measurements, discuss the implications of this study for seismology, and describe new applications in glaciology.

  11. Simulation of Martian surface-atmosphere interaction in a space-simulator: Technical considerations and feasibility

    NASA Technical Reports Server (NTRS)

    Moehlmann, D.; Kochan, H.

    1992-01-01

The Space Simulator of the German Aerospace Research Establishment at Cologne, formerly used for testing satellites, has been, since 1987, the central unit within the research sub-program 'Comet-Simulation' (KOSI). The KOSI team has investigated physical processes relevant to comets and their surfaces. As a byproduct we gained experience in sample-handling under simulated space conditions. In broadening the scope of the research activities of the DLR Institute of Space Simulation, an extension to 'Laboratory-Planetology' is planned. Following the KOSI experiments, a Mars surface simulation with realistic minerals and surface soil in a suitable environment (temperature, pressure, and CO2 atmosphere) is foreseen as the next step. Here, our main interest is centered on the thermophysical properties of the Martian surface and on energy transport (and related gas transport) through the surface. These laboratory simulation activities can be related to space missions as typical pre-mission and during-the-mission support of experiment design and operations (simulation in parallel). Post-mission experiments for confirmation and interpretation of results are of great value. The physical dimensions of the Space Simulator (a cylinder of about 2.5 m diameter and 5 m length) allow for the testing and qualification of experimental hardware under realistic Martian conditions.

  12. Man-in-the-control-loop simulation of manipulators

    NASA Technical Reports Server (NTRS)

    Chang, J. L.; Lin, Tsung-Chieh; Yae, K. Harold

    1989-01-01

A method to achieve man-in-the-control-loop simulation is presented. Emerging real-time dynamics simulation suggests a potential for creating an interactive design workstation with a human operator in the control loop. The recursive formulation for multibody dynamics simulation is studied to determine the requirements for man-in-the-control-loop simulation. High-speed computer graphics techniques provide realistic visual cues for the simulator. Backhoe and robot arm simulations are implemented to demonstrate the capability of man-in-the-control-loop simulation.

  13. Exploitation of realistic computational anthropomorphic phantoms for the optimization of nuclear imaging acquisition and processing protocols.

    PubMed

    Loudos, George K; Papadimitroulas, Panagiotis G; Kagadis, George C

    2014-01-01

Monte Carlo (MC) simulations play a crucial role in nuclear medical imaging since they can provide the ground truth for clinical acquisitions by integrating and quantifying all physical parameters that affect image quality. Over the last decade, a number of realistic computational anthropomorphic models have been developed to serve imaging as well as other biomedical engineering applications. The combination of MC techniques with realistic computational phantoms can provide a powerful tool for pre- and post-processing in imaging, data analysis and dosimetry. This work aims to create a global database of simulated Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) exams; the methodology as well as its first elements are presented. Simulations are performed using the well-validated open-source GATE toolkit, standard anthropomorphic phantoms and activity distributions of various radiopharmaceuticals derived from the literature. The resulting images, projections and sinograms of each study are provided in the database and can be further exploited to evaluate processing and reconstruction algorithms. Patient studies with different characteristics are included in the database, and different computational phantoms were tested for the same acquisitions. These include the XCAT, Zubal and Virtual Family phantoms, some of which are used for the first time in nuclear imaging. The created database will be freely available, and our current work is towards its extension by simulating additional clinical pathologies.

  14. Computationally-efficient stochastic cluster dynamics method for modeling damage accumulation in irradiated materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Hoang, Tuan L.; Marian, Jaime

    2015-11-01

An improved version of a recently developed stochastic cluster dynamics (SCD) method (Marian and Bulatov, 2012) [6] is introduced as an alternative to rate theory (RT) methods for solving coupled ordinary differential equation (ODE) systems for irradiation damage simulations. SCD circumvents by design the curse of dimensionality of the variable space that renders traditional ODE-based RT approaches inefficient when handling complex defect population comprised of multiple (more than two) defect species. Several improvements introduced here enable efficient and accurate simulations of irradiated materials up to realistic (high) damage doses characteristic of next-generation nuclear systems. The first improvement is a procedure for efficiently updating the defect reaction-network and event selection in the context of a dynamically expanding reaction-network. Next is a novel implementation of the τ-leaping method that speeds up SCD simulations by advancing the state of the reaction network in large time increments when appropriate. Lastly, a volume rescaling procedure is introduced to control the computational complexity of the expanding reaction-network through occasional reductions of the defect population while maintaining accurate statistics. The enhanced SCD method is then applied to model defect cluster accumulation in iron thin films subjected to triple ion-beam (Fe3+, He+ and H+) irradiations, for which standard RT or spatially-resolved kinetic Monte Carlo simulations are prohibitively expensive.

  15. Realistic computer network simulation for network intrusion detection dataset generation

    NASA Astrophysics Data System (ADS)

    Payer, Garrett

    2015-05-01

    The KDD-99 Cup dataset is dead. While it can continue to be used as a toy example, the age of this dataset makes it all but useless for intrusion detection research and data mining. Many of the attacks used within the dataset are obsolete and do not reflect the features important for intrusion detection in today's networks. Creating a new dataset encompassing a large cross section of the attacks found on the Internet today could be useful, but would eventually fall to the same problem as the KDD-99 Cup; its usefulness would diminish after a period of time. To continue research into intrusion detection, the generation of new datasets needs to be as dynamic and as quick as the attacker. Simply examining existing network traffic and using domain experts such as intrusion analysts to label traffic is inefficient, expensive, and not scalable. The only viable methodology is simulation using technologies including virtualization, attack-toolsets such as Metasploit and Armitage, and sophisticated emulation of threat and user behavior. Simulating actual user behavior and network intrusion events dynamically not only allows researchers to vary scenarios quickly, but enables online testing of intrusion detection mechanisms by interacting with data as it is generated. As new threat behaviors are identified, they can be added to the simulation to make quicker determinations as to the effectiveness of existing and ongoing network intrusion technology, methodology and models.

  16. Computationally-efficient stochastic cluster dynamics method for modeling damage accumulation in irradiated materials

    NASA Astrophysics Data System (ADS)

    Hoang, Tuan L.; Marian, Jaime; Bulatov, Vasily V.; Hosemann, Peter

    2015-11-01

    An improved version of a recently developed stochastic cluster dynamics (SCD) method (Marian and Bulatov, 2012) [6] is introduced as an alternative to rate theory (RT) methods for solving coupled ordinary differential equation (ODE) systems for irradiation damage simulations. SCD circumvents by design the curse of dimensionality of the variable space that renders traditional ODE-based RT approaches inefficient when handling complex defect population comprised of multiple (more than two) defect species. Several improvements introduced here enable efficient and accurate simulations of irradiated materials up to realistic (high) damage doses characteristic of next-generation nuclear systems. The first improvement is a procedure for efficiently updating the defect reaction-network and event selection in the context of a dynamically expanding reaction-network. Next is a novel implementation of the τ-leaping method that speeds up SCD simulations by advancing the state of the reaction network in large time increments when appropriate. Lastly, a volume rescaling procedure is introduced to control the computational complexity of the expanding reaction-network through occasional reductions of the defect population while maintaining accurate statistics. The enhanced SCD method is then applied to model defect cluster accumulation in iron thin films subjected to triple ion-beam (Fe3+, He+ and H+) irradiations, for which standard RT or spatially-resolved kinetic Monte Carlo simulations are prohibitively expensive.
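The τ-leaping idea described above, advancing the whole reaction network by a large time increment and firing each channel a Poisson-distributed number of times, can be sketched on a toy two-species network. The reactions and rate constants below are hypothetical, not an irradiation-damage model:

```python
import numpy as np

rng = np.random.default_rng(0)

def tau_leap_step(counts, rates, stoich, tau):
    """One tau-leaping step: each reaction channel fires a Poisson number
    of times over the interval tau, instead of simulating events one by
    one. tau must be small enough that populations stay non-negative."""
    n_fires = rng.poisson(rates(counts) * tau)
    return counts + stoich.T @ n_fires

# Hypothetical toy network (not an irradiation-damage model):
#   A + A -> B   (clustering,    propensity k1*A^2)
#   B -> A + A   (dissociation,  propensity k2*B)
k1, k2 = 1e-4, 0.01
stoich = np.array([[-2, +1],    # A + A -> B
                   [+2, -1]])   # B -> 2A
rates = lambda c: np.array([k1 * c[0] * c[0], k2 * c[1]])

state = np.array([10000, 0])
for _ in range(100):
    state = tau_leap_step(state, rates, stoich, tau=0.1)
print(state)  # monomer/cluster mix; the mass A + 2B stays exactly 10000
```

The speed-up over one-event-at-a-time kinetic Monte Carlo comes from replacing thousands of individual event selections per interval with a single Poisson draw per channel.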

  17. Design of a digital phantom population for myocardial perfusion SPECT imaging research.

    PubMed

    Ghaly, Michael; Du, Yong; Fung, George S K; Tsui, Benjamin M W; Links, Jonathan M; Frey, Eric

    2014-06-21

    Digital phantoms and Monte Carlo (MC) simulations have become important tools for optimizing and evaluating instrumentation, acquisition and processing methods for myocardial perfusion SPECT (MPS). In this work, we designed a new adult digital phantom population and generated corresponding Tc-99m and Tl-201 projections for use in MPS research. The population is based on the three-dimensional XCAT phantom with organ parameters sampled from the Emory PET Torso Model Database. Phantoms included three variations each in body size, heart size, and subcutaneous adipose tissue level, for a total of 27 phantoms of each gender. The SimSET MC code and angular response functions were used to model interactions in the body and the collimator-detector system, respectively. We divided each phantom into seven organs, each simulated separately, allowing use of post-simulation summing to efficiently model uptake variations. Also, we adapted and used a criterion based on the relative Poisson effective count level to determine the required number of simulated photons for each simulated organ. This technique provided a quantitative estimate of the true noise in the simulated projection data, including residual MC simulation noise. Projections were generated in 1 keV wide energy windows from 48-184 keV assuming perfect energy resolution to permit study of the effects of window width, energy resolution, and crosstalk in the context of dual isotope MPS. We have developed a comprehensive method for efficiently simulating realistic projections for a realistic population of phantoms in the context of MPS imaging. 
The new phantom population and realistic database of simulated projections will be useful in performing mathematical and human observer studies to evaluate various acquisition and processing methods such as optimizing the energy window width, investigating the effect of energy resolution on image quality and evaluating compensation methods for degrading factors such as crosstalk in the context of single and dual isotope MPS.
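The post-simulation summing described above (simulate each organ once, then scale by uptake and sum) amounts to a weighted sum of per-organ projections. The organ names, uptake values and array sizes here are illustrative only:

```python
import numpy as np

def combine_organ_projections(organ_projs, uptakes):
    """Post-simulation summing: noise-free projections simulated once per
    organ are scaled by organ uptake and summed, so many activity
    distributions can be modeled without re-running the Monte Carlo."""
    stacked = np.stack([organ_projs[k] for k in organ_projs])
    weights = np.array([uptakes[k] for k in organ_projs])
    return np.tensordot(weights, stacked, axes=1)

# Hypothetical two-organ, 4x4-pixel projections with made-up uptakes.
projs = {"myocardium": np.full((4, 4), 2.0), "liver": np.full((4, 4), 1.0)}
img = combine_organ_projections(projs, {"myocardium": 3.0, "liver": 0.5})
print(img[0, 0])  # → 6.5
```

Poisson noise would be added to the summed projection afterwards, which is why controlling the residual Monte Carlo noise per organ matters.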

  18. Design of a digital phantom population for myocardial perfusion SPECT imaging research

    NASA Astrophysics Data System (ADS)

    Ghaly, Michael; Du, Yong; Fung, George S. K.; Tsui, Benjamin M. W.; Links, Jonathan M.; Frey, Eric

    2014-06-01

    Digital phantoms and Monte Carlo (MC) simulations have become important tools for optimizing and evaluating instrumentation, acquisition and processing methods for myocardial perfusion SPECT (MPS). In this work, we designed a new adult digital phantom population and generated corresponding Tc-99m and Tl-201 projections for use in MPS research. The population is based on the three-dimensional XCAT phantom with organ parameters sampled from the Emory PET Torso Model Database. Phantoms included three variations each in body size, heart size, and subcutaneous adipose tissue level, for a total of 27 phantoms of each gender. The SimSET MC code and angular response functions were used to model interactions in the body and the collimator-detector system, respectively. We divided each phantom into seven organs, each simulated separately, allowing use of post-simulation summing to efficiently model uptake variations. Also, we adapted and used a criterion based on the relative Poisson effective count level to determine the required number of simulated photons for each simulated organ. This technique provided a quantitative estimate of the true noise in the simulated projection data, including residual MC simulation noise. Projections were generated in 1 keV wide energy windows from 48-184 keV assuming perfect energy resolution to permit study of the effects of window width, energy resolution, and crosstalk in the context of dual isotope MPS. We have developed a comprehensive method for efficiently simulating realistic projections for a realistic population of phantoms in the context of MPS imaging. 
The new phantom population and realistic database of simulated projections will be useful in performing mathematical and human observer studies to evaluate various acquisition and processing methods such as optimizing the energy window width, investigating the effect of energy resolution on image quality and evaluating compensation methods for degrading factors such as crosstalk in the context of single and dual isotope MPS.

  19. Training in Methods in Computational Neuroscience

    DTIC Science & Technology

    1992-08-29

in Tritonia. Roger Traub: models with realistic neurons, with an emphasis on large-scale modeling of epileptic phenomena in hippocampus. Rodolpho...Cell Model Plan: 1) Convert some of my simulations from NEURON to GENESIS (and thus learn GENESIS). 2) Develop a realistic inhibitory model. 3) Further...General Hospital, MA. Course Project: Membrane Properties of a Neostriatal Neuron and Dopamine Modulation. The purpose of my project was to model the

  20. Idealized Simulations of a Squall Line from the MC3E Field Campaign Applying Three Bin Microphysics Schemes: Dynamic and Thermodynamic Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xue, Lulin; Fan, Jiwen; Lebo, Zachary J.

The squall line event on May 20, 2011, during the Midlatitude Continental Convective Clouds (MC3E) field campaign has been simulated by three bin (spectral) microphysics schemes coupled into the Weather Research and Forecasting (WRF) model. Semi-idealized three-dimensional simulations, driven by temperature and moisture profiles acquired by a radiosonde released in the pre-convection environment at 1200 UTC in Morris, Oklahoma, show that each scheme produced a squall line with features broadly consistent with the observed storm characteristics. However, substantial differences in the details of the simulated dynamic and thermodynamic structure are evident. These differences are attributed to different algorithms and numerical representations of microphysical processes, assumptions about hydrometeor processes and properties, especially the ice particle mass, density, and terminal velocity relationships with size, and the resulting interactions between the microphysics, cold pool, and dynamics. This study shows that different bin microphysics schemes, designed to be conceptually more realistic and thus arguably more accurate than bulk microphysics schemes, still simulate a wide spread of microphysical, thermodynamic, and dynamic characteristics of a squall line, qualitatively similar to the spread of squall line characteristics obtained using various bulk schemes. Future work may focus on improving the representation of ice particle properties in bin schemes to reduce this uncertainty, and on using similar assumptions across all schemes to isolate the impact of physics from numerics.

  1. Initial assessment of image quality for low-dose PET: evaluation of lesion detectability

    NASA Astrophysics Data System (ADS)

    Schaefferkoetter, Joshua D.; Yan, Jianhua; Townsend, David W.; Conti, Maurizio

    2015-07-01

In the context of investigating the potential of low-dose PET imaging for screening applications, we developed methods to assess small lesion detectability as a function of the number of counts in the scan. We present here our methods and preliminary validation using tuberculosis cases. FDG-PET data from seventeen patients presenting diffuse hyper-metabolic lung lesions were selected for the study, to include a wide range of lesion sizes and contrasts. Reduced doses were simulated by randomly discarding events in the PET list mode, and ten realizations at each simulated dose were generated and reconstructed. The data were grouped into 9 categories determined by the number of included true events, from >40 M to <250 k counts. The images reconstructed from the original full statistical set were used to identify lung lesions, and each was, at every simulated dose, quantified by 6 parameters: lesion metabolic volume, lesion-to-background contrast, mean lesion tracer uptake, standard deviation of activity measurements (across realizations), lesion signal-to-noise ratio (SNR), and Hotelling observer SNR. Additionally, a lesion-detection task including 550 images was presented to several experienced image readers for qualitative assessment. Human observer performances were ranked using receiver operating characteristic analysis. The observer results were correlated with the lesion image measurements and used to train mathematical observer models. Absolute sensitivities and specificities of the human observers, as well as the area under the ROC curve, showed clustering and performance similarities among images produced from 5 million or greater counts. The results presented here are from a clinically realistic but highly constrained experiment, and more work is needed to validate these findings with a larger patient population.
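Simulating a reduced dose by randomly discarding list-mode events, as described above, is simple binomial thinning. The event count and keep fraction below are illustrative, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)

def thin_listmode(event_times, fraction):
    """Simulate a lower injected dose: keep each detected event
    independently with probability `fraction` (binomial thinning)."""
    keep = rng.random(len(event_times)) < fraction
    return event_times[keep]

# Hypothetical full-statistics acquisition: 1,000,000 event timestamps.
events = rng.uniform(0.0, 900.0, size=1_000_000)   # seconds
low_dose = thin_listmode(events, fraction=0.1)
print(len(low_dose))  # close to 100,000 on average
```

Because each event is kept independently, the thinned stream preserves the Poisson statistics of a genuinely lower-count acquisition, which is what makes the simulated low-dose reconstructions realistic.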

  2. Initial assessment of image quality for low-dose PET: evaluation of lesion detectability.

    PubMed

    Schaefferkoetter, Joshua D; Yan, Jianhua; Townsend, David W; Conti, Maurizio

    2015-07-21

In the context of investigating the potential of low-dose PET imaging for screening applications, we developed methods to assess small lesion detectability as a function of the number of counts in the scan. We present here our methods and preliminary validation using tuberculosis cases. FDG-PET data from seventeen patients presenting diffuse hyper-metabolic lung lesions were selected for the study, to include a wide range of lesion sizes and contrasts. Reduced doses were simulated by randomly discarding events in the PET list mode, and ten realizations at each simulated dose were generated and reconstructed. The data were grouped into 9 categories determined by the number of included true events, from >40 M to <250 k counts. The images reconstructed from the original full statistical set were used to identify lung lesions, and each was, at every simulated dose, quantified by 6 parameters: lesion metabolic volume, lesion-to-background contrast, mean lesion tracer uptake, standard deviation of activity measurements (across realizations), lesion signal-to-noise ratio (SNR), and Hotelling observer SNR. Additionally, a lesion-detection task including 550 images was presented to several experienced image readers for qualitative assessment. Human observer performances were ranked using receiver operating characteristic analysis. The observer results were correlated with the lesion image measurements and used to train mathematical observer models. Absolute sensitivities and specificities of the human observers, as well as the area under the ROC curve, showed clustering and performance similarities among images produced from 5 million or greater counts. The results presented here are from a clinically realistic but highly constrained experiment, and more work is needed to validate these findings with a larger patient population.

  3. Mechanisms of diurnal precipitation over the US Great Plains: a cloud resolving model perspective

    NASA Astrophysics Data System (ADS)

    Lee, Myong-In; Choi, Ildae; Tao, Wei-Kuo; Schubert, Siegfried D.; Kang, In-Sik

    2010-02-01

    The mechanisms of summertime diurnal precipitation in the US Great Plains were examined with the two-dimensional (2D) Goddard Cumulus Ensemble (GCE) cloud-resolving model (CRM). The model was constrained by the observed large-scale background state and surface flux derived from the Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Program’s Intensive Observing Period (IOP) data at the Southern Great Plains (SGP). The model, when continuously-forced by realistic surface flux and large-scale advection, simulates reasonably well the temporal evolution of the observed rainfall episodes, particularly for the strongly forced precipitation events. However, the model exhibits a deficiency for the weakly forced events driven by diurnal convection. Additional tests were run with the GCE model in order to discriminate between the mechanisms that determine daytime and nighttime convection. In these tests, the model was constrained with the same repeating diurnal variation in the large-scale advection and/or surface flux. The results indicate that it is primarily the surface heat and moisture flux that is responsible for the development of deep convection in the afternoon, whereas the large-scale upward motion and associated moisture advection play an important role in preconditioning nocturnal convection. In the nighttime, high clouds are continuously built up through their interaction and feedback with long-wave radiation, eventually initiating deep convection from the boundary layer. Without these upper-level destabilization processes, the model tends to produce only daytime convection in response to boundary layer heating. 
This study suggests that the correct simulation of the diurnal variation in precipitation requires that the free-atmospheric destabilization mechanisms resolved in the CRM simulation be adequately parameterized in current general circulation models (GCMs), many of which are overly sensitive to the parameterized boundary layer heating.

  4. Estimating the human influence on Hurricanes Harvey, Irma and Maria

    NASA Astrophysics Data System (ADS)

    Wehner, M. F.; Patricola, C. M.; Risser, M. D.

    2017-12-01

    Attribution of the human-induced climate change influence on the physical characteristics of individual extreme weather events has become an advanced science over the past decade. However, it is only recently that such quantification of anthropogenic influences on event magnitudes and probability of occurrence could be applied to very extreme storms such as hurricanes. We present results from two different classes of attribution studies for the impactful Atlantic hurricanes of 2017. The first is an analysis of the record rainfall amounts during Hurricane Harvey in the Houston, Texas area. We analyzed observed precipitation from the Global Historical Climatology Network with a covariate-based extreme value statistical analysis, accounting for both the external influence of global warming and the internal influence of ENSO. We found that human-induced climate change likely increased Hurricane Harvey's total rainfall by at least 19%, and likely increased the chances of the observed rainfall by a factor of at least 3.5. This suggests that changes exceeded Clausius-Clapeyron scaling, motivating attribution studies using dynamical climate models. The second analysis consists of two sets of hindcast simulations of Hurricanes Harvey, Irma, and Maria using the Weather Research and Forecasting model (WRF) at 4.5 km resolution. The first uses realistic boundary and initial conditions and present-day greenhouse gas forcings, while the second uses perturbed conditions and pre-industrial greenhouse gas forcings to simulate counterfactual storms without anthropogenic influences. These simulations quantify the fraction of Harvey's precipitation attributable to human activities and test the super Clausius-Clapeyron scaling suggested by the observational analysis. We will further quantify the human influence on intensity for Harvey, Irma, and Maria.
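
    The probability statements above correspond to two standard event-attribution metrics: the probability (risk) ratio and the fraction of attributable risk. A sketch with illustrative probabilities chosen only to reproduce the quoted factor of 3.5; these are not values from the study:

```python
def risk_ratio(p_factual, p_counterfactual):
    # How many times more probable the observed rainfall became
    # in the factual (warmed) climate than in the counterfactual one.
    return p_factual / p_counterfactual

def fraction_attributable_risk(p_factual, p_counterfactual):
    # FAR = 1 - p0/p1: the fraction of the event's probability
    # attributable to the anthropogenic forcing.
    return 1.0 - p_counterfactual / p_factual

# Hypothetical per-year exceedance probabilities (illustrative only):
rr = risk_ratio(0.035, 0.010)                      # "factor of at least 3.5"
far = fraction_attributable_risk(0.035, 0.010)
```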

  5. A new spatial snow distribution in hydrological models parameterized from observed spatial variability of precipitation.

    NASA Astrophysics Data System (ADS)

    Skaugen, Thomas; Weltzien, Ingunn

    2016-04-01

    The traditional catchment hydrological model, with its many free calibration parameters, is not a well-suited tool for prediction under conditions for which it has not been calibrated. Important tasks for hydrological modelling, such as prediction in ungauged basins and assessing hydrological effects of climate change, are hence not solved satisfactorily. In order to reduce the number of calibration parameters in hydrological models, we have introduced a new model which uses a dynamic gamma distribution as the spatial frequency distribution of snow water equivalent (SWE). The parameters are estimated from observed spatial variability of precipitation and the magnitude of accumulation and melting events and are hence not subject to calibration. The relationship between spatial mean and variance of precipitation is found to follow a pattern where decreasing temporal correlation with increasing accumulation or duration of the event leads to a levelling off, or even a decrease, of the spatial variance. The new model for snow distribution is implemented in the already parameter-parsimonious DDD (Distance Distribution Dynamics) hydrological model and was tested for 71 Norwegian catchments. We compared the new snow distribution model with the current operational snow distribution model, in which a fixed, calibrated coefficient of variation parameterizes a log-normal model for snow distribution. Results show that the precision of runoff simulations is equal, but that the new snow distribution model better simulates snow-covered area (SCA) when compared with MODIS satellite-derived snow cover. In addition, SWE is simulated more realistically: seasonal snow melts out, the build-up of "snow towers" is prevented, and spurious trends in SWE are avoided.
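
    A gamma distribution driven by an observed spatial mean and variance can be illustrated by moment matching, a standard way to obtain the shape and scale parameters. This is a sketch under assumed values; `gamma_swe_sampler` is a hypothetical helper, not the DDD code:

```python
import random

def gamma_swe_sampler(mean_swe, var_swe, seed=0):
    # Moment matching: shape = mean^2 / var, scale = var / mean, so the
    # gamma distribution reproduces the given spatial mean and variance.
    shape = mean_swe ** 2 / var_swe
    scale = var_swe / mean_swe
    rng = random.Random(seed)
    return lambda: rng.gammavariate(shape, scale)

# Assumed spatial statistics: mean SWE 120 mm, spatial variance 900 mm^2.
sample = gamma_swe_sampler(mean_swe=120.0, var_swe=900.0)
draws = [sample() for _ in range(50_000)]
```

    In the model described above, the mean and variance would evolve with each accumulation and melting event, so the distribution is dynamic rather than fixed by calibration.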

  6. Tsunami detection by high-frequency radar in British Columbia: performance assessment of the time-correlation algorithm for synthetic and real events

    NASA Astrophysics Data System (ADS)

    Guérin, Charles-Antoine; Grilli, Stéphan T.; Moran, Patrick; Grilli, Annette R.; Insua, Tania L.

    2018-05-01

    The authors recently proposed a new method for detecting tsunamis using high-frequency (HF) radar observations, referred to as "time-correlation algorithm" (TCA; Grilli et al. Pure Appl Geophys 173(12):3895-3934, 2016a, 174(1): 3003-3028, 2017). Unlike standard algorithms that detect surface current patterns, the TCA is based on analyzing space-time correlations of radar signal time series in pairs of radar cells, which does not require inverting radial surface currents. This was done by calculating a contrast function, which quantifies the change in pattern of the mean correlation between pairs of neighboring cells upon tsunami arrival, with respect to a reference correlation computed in the recent past. In earlier work, the TCA was successfully validated based on realistic numerical simulations of both the radar signal and tsunami wave trains. Here, this algorithm is adapted to apply to actual data from a HF radar installed in Tofino, BC, for three test cases: (1) a simulated far-field tsunami generated in the Semidi Subduction Zone in the Aleutian Arc; (2) a simulated near-field tsunami from a submarine mass failure on the continental slope off of Tofino; and (3) an event believed to be a meteotsunami, which occurred on October 14th, 2016, off of the Pacific West Coast and was measured by the radar. In the first two cases, the synthetic tsunami signal is superimposed onto the radar signal by way of a current memory term; in the third case, the tsunami signature is present within the radar data. In light of these test cases, we develop a detection methodology based on the TCA, using a correlation contrast function, and show that in all three cases the algorithm is able to trigger a timely early warning.
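
    The idea of a correlation contrast between pairs of radar cells can be sketched as follows. This is an illustrative toy, not the published TCA; both function names are hypothetical:

```python
import math

def correlation(x, y):
    # Pearson correlation of two equal-length time series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def contrast(pairs_now, pairs_ref):
    # Mean absolute change of pairwise correlations relative to a
    # reference window from the recent past; a large value flags a
    # coherent disturbance sweeping across the radar cells.
    c_now = [correlation(x, y) for x, y in pairs_now]
    c_ref = [correlation(x, y) for x, y in pairs_ref]
    return sum(abs(a - b) for a, b in zip(c_now, c_ref)) / len(c_now)
```

    In this toy form, a warning would be triggered when the contrast exceeds a threshold tuned on tsunami-free data, without ever inverting radial surface currents.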

  7. An investigation of the role of current and future remote sensing data systems in numerical meteorology

    NASA Technical Reports Server (NTRS)

    Diak, George R.; Smith, William L.

    1993-01-01

    The goals of this research endeavor have been to develop a flexible and relatively complete framework for the investigation of current and future satellite data sources in numerical meteorology. In order to realistically model how satellite information might be used for these purposes, it is necessary that Observing System Simulation Experiments (OSSEs) be as complete as possible. It is therefore desirable that these experiments simulate in entirety the sequence of steps involved in bringing satellite information from the radiance level through product retrieval to a realistic analysis and forecast sequence. In this project we have worked to make this sequence realistic by synthesizing raw satellite data from surrogate atmospheres, deriving satellite products from these data and subsequently producing analyses and forecasts using the retrieved products. The accomplishments made in 1991 are presented. The emphasis was on examining atmospheric soundings and microphysical products which we expect to produce with the launch of the Advanced Microwave Sounding Unit (AMSU), slated for flight in mid 1994.

  8. [Variation in closeness to reality of standardized resuscitation scenarios : Effects on the success of cognitive learning of medical students].

    PubMed

    Schaumberg, A

    2015-04-01

    Simulation often relies on a case-based learning approach and is used as a teaching tool for a variety of audiences. The knowledge transfer goes beyond the mere exchange of soft skills and practical abilities and also includes practical knowledge and decision-making behavior; however, verification of knowledge or practical skills seldom unfolds during simulations. Simulation-based learning seems to affect many learning domains and can, therefore, be considered to be multifactorial in nature. At present, studies examining the effects of learning environments with varying levels of reality on the cognitive long-term retention of students are lacking. The present study focused on the question whether case scenarios with varying levels of reality produce differences in the cognitive long-term retention of students, in particular with regard to the learning dimensions knowledge, understanding and transfer. The study was conducted on 153 students in the first clinical semester at the Justus-Liebig University of Giessen. Students were randomly selected and subsequently assigned, also in a random fashion, to two practice groups, i.e. realistic and unrealistic. In both groups the students were presented with standardized case scenarios consisting of three case studies, which were accurately defined with a case report containing a detailed description of each scenario and all relevant values so as to ensure identical conditions for both groups. The unrealistic group sat in an unfurnished practice room as a learning environment. The realistic group sat in a furnished learning environment with various background pictures and ambient noise. Students received examination questions before, immediately following and 14 days after the practice. Examination questions were identical at each of the three time points, classified into three learning dimensions following Bloom's taxonomy and evaluated. 
Furthermore, examination questions were supplemented by a questionnaire concerning the students' perception of realism and of their own learning success, to be filled in immediately after the practice. Examination questions and questionnaires were anonymous but associated with each other. Even with less experienced participants, realistic simulation design led to a significant increase of knowledge immediately after the end of the simulation. This effect, however, did not impact the cognitive long-term retention of students. While the realistic group showed a higher initial knowledge after the simulation, this "knowledge delta" was forgotten within 14 days, putting them back on par with the unrealistic comparison group. Two weeks after the practice, comprehension questions were answered significantly better than questions on pure knowledge. Therefore, it can be concluded that even vaguely realistic simulation scenarios affect the learning dimension of understanding. For simulation-based learning the outcome depends not only on knowledge, practical skills and motivational variables but also on the onset of negative emotions, the perception of one's own ability and the personality profile. Simulation training alone does not appear to guarantee learning success; rather, it seems necessary to establish a simulation setting suitable for the education level, needs and personality characteristics of the students.

  9. Virtual gaming simulation of a mental health assessment: A usability study.

    PubMed

    Verkuyl, Margaret; Romaniuk, Daria; Mastrilli, Paula

    2018-05-18

    Providing safe and realistic virtual simulations could be an effective way to facilitate the transition from the classroom to clinical practice. As nursing programs begin to include virtual simulations as a learning strategy, it is critical to first assess the technology for ease of use and usefulness. A virtual gaming simulation was developed, and a usability study was conducted to assess its ease of use and usefulness for students and faculty. The Technology Acceptance Model provided the framework for the study, which included expert review and testing by nursing faculty and nursing students. This study highlighted the importance of assessing ease of use and usefulness in a virtual gaming simulation and provided feedback for the development of an effective virtual gaming simulation. The study participants said the virtual gaming simulation was engaging, realistic and similar to a clinical experience. Participants found the game easy to use and useful. Testing provided the development team with ideas to improve the user interface. The usability methodology provided is a replicable approach to testing virtual experiences before a research study or before implementing virtual experiences into a curriculum.

  10. A methodological, task-based approach to Procedure-Specific Simulations training.

    PubMed

    Setty, Yaki; Salzman, Oren

    2016-12-01

    Procedure-Specific Simulations (PSS) are realistic 3D simulations that provide a platform to practice complete surgical procedures in a virtual-reality environment. While PSS have the potential to improve surgeons' proficiency, there are no existing standards or guidelines for structured PSS development. We employ a unique platform inspired by game design to develop three-dimensional virtual-reality simulations of urethrovesical anastomosis during radical prostatectomy. 3D visualization is supported by stereo vision, providing a fully realistic view of the simulation. The software can be executed on any robotic surgery platform; specifically, we tested the simulation under a Windows environment on the RobotiX Mentor. Using the urethrovesical anastomosis simulation as a representative example, we present a task-based methodological approach to PSS training. The methodology provides tasks in increasing levels of difficulty, from a novice level of basic anatomy identification to an expert level that permits testing new surgical approaches. The modular methodology presented here can be easily extended to support more complex tasks. We foresee this methodology as a tool used to integrate PSS as a complementary training process for surgical procedures.

  11. Alborz-I array: A simulation on performance and properties of the array around the knee of the cosmic ray spectrum

    NASA Astrophysics Data System (ADS)

    Abdollahi, Soheila; Bahmanabadi, Mahmud; Pezeshkian, Yousef; Mortazavi Moghaddam, Saba

    2016-03-01

    The first phase of the Alborz Observatory Array (Alborz-I) consists of 20 plastic scintillation detectors, each with a surface area of 0.25 m², spread over an area of 40 × 40 m², built to study extensive air showers around the knee at the Sharif University of Technology campus. The first stage of the project, including construction and operation of a prototype system, has now been completed, and the electronics that will be used in the array instrument have been tested under field conditions. In order to achieve a realistic estimate of the array performance, a large number of simulated CORSIKA showers have been used. In the present work, theoretical results obtained in the study of different array layouts and trigger conditions are described. Using Monte Carlo simulations of showers, the rate of detected events per day and the trigger probability functions, i.e., the probability for an extensive air shower to trigger a ground-based array as a function of the shower core distance to the center of the array, are presented for energies above 1 TeV and zenith angles up to 60°. Moreover, the angular resolution of the Alborz-I array is obtained.
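
    A trigger probability function of this kind can be estimated from simulated showers by simple binning in core distance. This is a toy estimator, not the CORSIKA analysis chain, and the names are hypothetical:

```python
from collections import defaultdict

def trigger_probability(core_distances, triggered, bin_width=10.0):
    # Bin simulated showers by core distance (in meters) and estimate the
    # trigger probability in each bin as triggered showers / all showers.
    counts = defaultdict(lambda: [0, 0])
    for r, hit in zip(core_distances, triggered):
        b = int(r // bin_width)
        counts[b][0] += 1
        counts[b][1] += int(hit)
    return {b * bin_width: hits / n for b, (n, hits) in sorted(counts.items())}

# Four hypothetical showers: two near the array center, two farther out.
profile = trigger_probability([5.0, 5.0, 15.0, 15.0], [True, False, True, True])
```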

  12. A Bone Marrow Aspirate and Trephine Simulator.

    PubMed

    Yap, Eng Soo; Koh, Pei Lin; Ng, Chin Hin; de Mel, Sanjay; Chee, Yen Lin

    2015-08-01

    Bone marrow aspirate and trephine (BMAT) biopsy is a commonly performed procedure in hematology-oncology practice. Although complications are uncommon, they can cause significant morbidity and mortality. Simulation models are an excellent tool to teach novice doctors basic procedural skills before performing the actual procedure on patients to improve patient safety and well-being. There are no commercial BMAT simulators, and this technical report describes the rationale, technical specifications, and construction of a low-cost, easily constructed, reusable BMAT simulator that reproduced the tactile properties of tissue layers for use as a teaching tool in our resident BMAT simulation course. Preliminary data of learner responses to the simulator were also collected. From April 2013 to November 2013, 32 internal medicine residents underwent the BMAT simulation course. Eighteen (56%) completed the online survey, 11 residents with previous experience doing BMAT and 7 without experience. Despite the difference in operative experience, both experienced and novice residents all agreed or strongly agreed that the model aided their understanding of the BMAT procedure. All agreed or strongly agreed that this enhanced their knowledge of anatomy and 16 residents (89%) agreed or strongly agreed that this model was a realistic simulator. We present a novel, low-cost, easily constructed, realistic BMAT simulator for training novice doctors to perform BMAT.

  13. The differential effect of realistic and unrealistic counterfactual thinking on regret.

    PubMed

    Sevdalis, Nick; Kokkinaki, Flora

    2006-06-01

    Research has established that realistic counterfactual thinking can determine the intensity and the content of people's affective reactions to decision outcomes and events. Not much is known, however, about the affective consequences of counterfactual thinking that is unrealistic (i.e., that does not correspond to the main causes of a negative outcome). In three experiments, we investigate the influence of realistic and unrealistic counterfactuals on experienced regret after negative outcomes. In Experiment 1, we found that participants who thought unrealistically about a poor outcome reported less regret than those who thought realistically about it. In Experiments 2a and 2b, we replicated this finding and we showed that the decrease in regret was associated with a shift in the causal attributions of the poor outcome. Participants who thought unrealistically attributed it more to external circumstances and less to their own behaviours than those who thought realistically about it. We discuss the implications of these findings for the role of counterfactuals as self-serving biases and the functionality of regret as a counterfactual emotion.

  14. Stimulation from Simulation? A Teaching Model of Hillslope Hydrology for Use on Microcomputers.

    ERIC Educational Resources Information Center

    Burt, Tim; Butcher, Dave

    1986-01-01

    The design and use of a simple computer model which simulates hillslope hydrology is described in a teaching context. The model shows that a relatively complex environmental system can be constructed on the basis of a simple but realistic theory, thus allowing students to simulate the hydrological response of real hillslopes. (Author/TRS)

  15. Combining Simulated Patients and Simulators: Pilot Study of Hybrid Simulation in Teaching Cardiac Auscultation

    ERIC Educational Resources Information Center

    Friederichs, Hendrik; Weissenstein, Anne; Ligges, Sandra; Möller, David; Becker, Jan C.; Marschall, Bernhard

    2014-01-01

    Auscultation torsos are widely used to teach position-dependent heart sounds and murmurs. To provide a more realistic teaching experience, both whole body auscultation mannequins and torsos have been used in clinical examination skills training at the Medical Faculty of the University of Muenster since the winter term of 2008-2009. This training…

  16. Impact of Using a Robot Patient for Nursing Skill Training in Patient Transfer

    ERIC Educational Resources Information Center

    Huang, Zhifeng; Lin, Chingszu; Kanai-Pak, Masako; Maeda, Jukai; Kitajima, Yasuko; Nakamura, Mitsuhiro; Kuwahara, Noriaki; Ogata, Taiki; Ota, Jun

    2017-01-01

    In the past few decades, simulation training has been used to help nurses improve their patient-transfer skills. However, the effectiveness of such training remains limited because it lacks effective ways of simulating patients' actions realistically. It is difficult for nurses to use the skills learned from simulation training to transfer an…

  17. Evaluation of sprayable fixatives on a sandy soil for potential use in a dirty bomb response.

    PubMed

    Fritz, Brad G; Whitaker, John D

    2008-06-01

    After the events of 11 September 2001, the possibility of a dirty bomb being detonated within the United States seems more realistic. Development of tools for use in response to a dirty bomb detonation has become a topic of both discussion and research. While it has been reported that the health risk to the public from such an event would likely be small, it is thought that the psychological impact could be considerable. One response option that has been considered is adapting sprayable solutions for the purpose of fixing contamination in place, thereby limiting the spread of contamination by wind and rain and facilitating subsequent cleanup. This work evaluated two commercially available particle fixatives (IsoFIX-HT and IsoFIX-RC) for their effectiveness in preventing dispersal of simulated contamination. Nonradioactive cesium chloride and cobalt oxide particles were selected as the simulated contamination and applied to the surface of three outdoor test plots. Two test plots were treated with fixatives; the third plot provided a control. Samples were collected over 95 days to observe changes in tracer concentration on the surface of the test plots. One fixative (IsoFIX-RC) effectively held the tracer in place with no net loss of tracer, while the other fixative (IsoFIX-HT) had no impact on the loss of tracer relative to the control. Under the conditions tested, IsoFIX-RC appears capable of fixing surface contamination in place for at least several months.

  18. A Multiplicative Cascade Model for High-Resolution Space-Time Downscaling of Rainfall

    NASA Astrophysics Data System (ADS)

    Raut, Bhupendra A.; Seed, Alan W.; Reeder, Michael J.; Jakob, Christian

    2018-02-01

    Distributions of rainfall with time and space resolutions of minutes and kilometers, respectively, are often needed to drive the hydrological models used in a range of engineering, environmental, and urban design applications. The work described here is the first step in constructing a model capable of downscaling rainfall to scales of minutes and kilometers from time and space resolutions of several hours and a hundred kilometers. A multiplicative random cascade model known as the Short-Term Ensemble Prediction System is run with parameters from the radar observations at Melbourne (Australia). Orographic effects are added through a multiplicative correction factor after the model is run. In the first set of model calculations, 112 significant rain events over Melbourne are simulated 100 times. Because of the stochastic nature of the cascade model, the simulations represent 100 possible realizations of the same rain event. The cascade model produces realistic spatial and temporal patterns of rainfall at 6 min and 1 km resolution (the resolution of the radar data), the statistical properties of which are in close agreement with observation. In the second set of calculations, the cascade model is run continuously for all days from January 2008 to August 2015 and the rainfall accumulations are compared at 12 locations in the greater Melbourne area. The statistical properties of the observations lie within the envelope of the 100 ensemble members. The model successfully reproduces the frequency distribution of the 6 min rainfall intensities, storm durations, interarrival times, and the autocorrelation function.
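
    A multiplicative random cascade of this general kind can be sketched in a few lines: starting from a coarse total, each cell's rainfall is repeatedly split between two children with a random weight so that mass is conserved at every level. This is a toy analogue of the approach, not the STEPS implementation, and all names and parameter values are hypothetical:

```python
import random

def cascade(total, levels, rng):
    # At each level, split every cell's rainfall between two children
    # with a random weight w in [0, 1]; w and (1 - w) sum to one, so
    # the coarse-scale total is conserved exactly.
    cells = [total]
    for _ in range(levels):
        nxt = []
        for amount in cells:
            w = min(max(rng.gauss(0.5, 0.15), 0.0), 1.0)
            nxt.extend([amount * w, amount * (1.0 - w)])
        cells = nxt
    return cells

rng = random.Random(7)
field = cascade(total=100.0, levels=8, rng=rng)   # 2**8 = 256 fine cells
```

    Repeating the disaggregation with different seeds yields an ensemble of equally plausible fine-scale realizations of the same coarse rainfall, which is how the 100-member ensembles above are generated in spirit.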

  19. QBO Influence on Polar Stratospheric Variability in the GEOS Chemistry-Climate Model

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Oman, L. D.; Li, F.; Slong, I.-S.; Newman, P. A.; Nielsen, J. E.

    2010-01-01

    The quasi-biennial oscillation (QBO) modulates the strength of both the Arctic and Antarctic stratospheric vortices. Model and observational studies have found that the phase and characteristics of the QBO contribute to the high degree of variability in the Arctic stratosphere in winter. While the Antarctic stratosphere is less variable, recent work has shown that Southern Hemisphere planetary wave driving increases in response to "warm pool" El Niño events that are coincident with the easterly phase of the QBO. These events hasten the breakup of the Antarctic polar vortex. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is now capable of generating a realistic QBO, due to a new parameterization of gravity wave drag. In this presentation, we will use this new model capability to assess the influence of the QBO on polar stratospheric variability. Using simulations of the recent past, we will compare the modeled relationship between QBO phase and mid-winter vortex strength with the observed Holton-Tan relation, in both hemispheres. We will use simulations of the 21st century to estimate future trends in the relationship between QBO phase and vortex strength. In addition, we will evaluate the combined influence of the QBO and El Niño/Southern Oscillation (ENSO) on the timing of the breakup of the polar stratospheric vortices in the GEOS CCM. We will compare the influence of these two natural phenomena with trends in the vortex breakup associated with ozone recovery and increasing greenhouse gas concentrations.

  20. Tsunamigenic Gravity Waves in the Thermosphere-Ionosphere System: Challenges and Opportunities (Invited)

    NASA Astrophysics Data System (ADS)

    Hickey, M. P.

    2010-12-01

    There has been a recent resurgence of interest in the association between tsunamis and traveling ionospheric disturbances (TIDs), fueled in part by the use of GPS satellite technologies to remotely monitor the ionosphere. The TID observations have also triggered a renewed interest in the modeling of such events. To date, model simulations have incorporated various simplifications, some of which are briefly described. A future challenge is to bring together suites of models, each of which realistically describes one of the subsystems. In this talk I will describe the results of using a linear spectral full-wave model to simulate the propagation of a gravity wave disturbance from the sea surface to the thermosphere. In the model this disturbance is driven by a lower boundary perturbation that mimics a tsunami. A linear model describing the response of the ionosphere to neutral atmosphere perturbations, and airglow perturbations driven by ionosphere and neutral atmosphere fluctuations, is also described. Additionally, the gravity wave disturbance carries momentum, which is deposited in the thermosphere as wave energy is viscously dissipated, accelerating the mean state. In spite of the simplicity of these models, much can be learned from them. It is suggested that these rare events offer a fairly unique opportunity to test models describing such processes. Model predictions of total electron content (TEC) fluctuations are also briefly compared with TEC measurements obtained following some recent major tsunamis.
