Science.gov

Sample records for event system simulation

  1. Synchronous parallel system for emulation and discrete event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1992-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to state variables of the simulation object attributable to the event object, and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring the events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
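
    The event-horizon bookkeeping described above reduces, at its core, to a two-level minimum. A minimal sketch (hypothetical names, not the patented implementation): each node takes the earliest time stamp among its newly generated events as its local event horizon, the global event horizon is the minimum over nodes, and, in the usual reading of the technique, events at or before that time can be committed safely while later ones remain provisional.

        def local_event_horizon(new_events):
            # Earliest time stamp among a node's newly generated event objects.
            return min((ts for ts, _ in new_events), default=float("inf"))

        def global_event_horizon(nodes):
            # Earliest local event horizon across all parallel nodes.
            return min(local_event_horizon(n) for n in nodes)

        # Three nodes, each holding (time_stamp, payload) events.
        nodes = [[(12.0, "a"), (15.5, "b")], [(9.3, "c")], [(20.1, "d"), (9.9, "e")]]
        geh = global_event_horizon(nodes)                  # 9.3
        committed = [e for n in nodes for e in n if e[0] <= geh]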

  2. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in a method of performing object-oriented simulation and a system having interconnected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.
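
    A minimal sketch of the passive/active split described in the abstract (hypothetical names, not the patented code): the passive simulation object only holds state, while the active event object carries a time stamp, applies changes while saving only the old values, and could produce follow-on messages; the saved values allow later restoral if the event is superseded.

        class SimulationObject:
            """Passive and self-contained: holds state, takes no actions itself."""
            def __init__(self, state):
                self.state = dict(state)

        class EventObject:
            """Active: time-stamped, queries and changes the passive object."""
            def __init__(self, time_stamp, target, updates):
                self.time_stamp = time_stamp
                self.target = target
                self.updates = updates
                self.saved = {}                    # prior values, for restoral

            def process(self):
                for key, value in self.updates.items():
                    self.saved[key] = self.target.state.get(key)
                    self.target.state[key] = value
                # ...produce messages specifying resulting events here...

            def rollback(self):
                self.target.state.update(self.saved)

        obj = SimulationObject({"altitude": 100.0})
        ev = EventObject(3.5, obj, {"altitude": 120.0})
        ev.process()    # altitude becomes 120.0; 100.0 saved inside the event
        ev.rollback()   # altitude restored to 100.0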

  3. Rare event simulation of the chaotic Lorenz 96 dynamical system

    NASA Astrophysics Data System (ADS)

    Wouters, Jeroen; Bouchet, Freddy

    2015-04-01

    The simulation of rare events is becoming increasingly important in the climate sciences. Several sessions are devoted to rare and extreme events at this meeting, and the IPCC has devoted a special report to risk management of extreme events (SREX). Brute force simulation of rare events can however be very costly. To obtain satisfactory statistics on a 1/1000 yr event, one needs to perform simulations over several thousand years. Recently, a class of rare event simulation algorithms has been introduced that could yield significant increases in performance with respect to brute force simulations (see e.g. [1]). In these algorithms an ensemble of simulations is evolved in parallel, while at certain interaction times ensemble members are killed and cloned so as to have better statistics in the region of phase space that is relevant to the rare event of interest. We will discuss the implementation issues and performance gains for these algorithms. We also present results on a first application of a rare event simulation algorithm to a toy model for chaos in the atmosphere, the Lorenz 96 model. We demonstrate that for the estimation of the histogram tail of the energy observable, the algorithm gives a significant error reduction. We will furthermore discuss first results and an outlook on the application of rare event simulation algorithms to the study of blocking atmospheric circulation and heat wave events in the PlaSim climate model [2]. [1] Del Moral, P. & Garnier, J. Genealogical particle analysis of rare events. The Annals of Applied Probability 15, 2496-2534 (2005). [2] http://www.mi.uni-hamburg.de/Planet-Simul.216.0.html
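
    The kill-and-clone interaction step has a compact generic form. A hedged sketch in the spirit of the genealogical particle analysis of [1] (the weighting function and parameter k are illustrative assumptions, not the authors' implementation):

        import math
        import random

        def clone_kill(ensemble, score, k=0.5):
            """One interaction step: favour high-scoring trajectories.

            Members are resampled with probability proportional to
            exp(k * score); the returned log-normalisation lets unbiased
            tail statistics be reconstructed afterwards.
            """
            weights = [math.exp(k * score(x)) for x in ensemble]
            new_ensemble = random.choices(ensemble, weights=weights,
                                          k=len(ensemble))
            log_norm = math.log(sum(weights) / len(ensemble))
            return new_ensemble, log_norm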

  4. Enhancing Complex System Performance Using Discrete-Event Simulation

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E

    2010-01-01

    In this paper, we utilize discrete-event simulation (DES) merged with human factors analysis to provide the venue within which the separation and deconfliction of the system/human operating principles can occur. A concrete example is presented to illustrate the performance enhancement gains for an aviation cargo flow and security inspection system achieved through the development and use of a process DES. The overall performance of the system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered, such as system capacity, residual capacity, and the total number of pallets waiting for inspection in the queue. These metrics are performance indicators of the system's ability to service current needs and respond to additional requests. We studied and analyzed different scenarios by changing various model parameters such as the pieces-per-pallet ratio, the number of inspectors and cargo handling personnel, the number of forklifts, the number and types of detection systems, the inspection modality distribution, the alarm rate, and the cargo closeout time. The increased physical understanding resulting from execution of the queuing model with these vetted performance measures identified effective ways to meet inspection requirements while maintaining or reducing overall operational cost and eliminating any shipping delays associated with proposed changes in inspection requirements. With this understanding, effective operational strategies can be developed to optimally use personnel while still maintaining plant efficiency, reducing process interruptions, and holding or reducing costs.

  5. On constructing optimistic simulation algorithms for the discrete event system specification

    SciTech Connect

    Nutaro, James J

    2008-01-01

    This article describes a Time Warp simulation algorithm for discrete event models that are described in terms of the Discrete Event System Specification (DEVS). The article shows how the total state transition and total output function of a DEVS atomic model can be transformed into an event processing procedure for a logical process. A specific Time Warp algorithm is constructed around this logical process, and it is shown that the algorithm correctly simulates a DEVS coupled model that consists entirely of interacting atomic models. The simulation algorithm is presented abstractly; it is intended to provide a basis for implementing efficient and scalable parallel algorithms that correctly simulate DEVS models.
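
    For readers unfamiliar with DEVS, the atomic-model interface around which such a logical process is built looks roughly as follows (an illustrative skeleton, not the article's code): internal and external transition functions, an output function, and a time advance.

        class AtomicDEVS:
            """Skeleton of a DEVS atomic model (illustrative only)."""
            def __init__(self, state):
                self.state = state

            def time_advance(self):      # ta(s): delay until next internal event
                raise NotImplementedError

            def delta_int(self):         # internal transition delta_int(s)
                raise NotImplementedError

            def delta_ext(self, e, x):   # external transition delta_ext(s, e, x)
                raise NotImplementedError

            def output(self):            # output function lambda(s)
                raise NotImplementedError

    A Time Warp wrapper processes such events optimistically, saving state so it can roll back when a message with an earlier time stamp arrives.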

  6. An Early Warning System for Loan Risk Assessment Based on Rare Event Simulation

    NASA Astrophysics Data System (ADS)

    Zhou, Hong; Qiu, Yue; Wu, Yueqin

    System simulation is an important tool for risk assessment. In this paper, a new method is presented to deal with credit risk assessment problems for commercial banks based on rare event simulation. The failure probability of a listed company in repaying its loans is taken as the criterion to measure the level of credit risk. The rare-event concept is adopted to construct the model of credit risk identification in commercial banks, and a cross-entropy scheme is designed to implement the rare event simulation, based on which the loss probability can be assessed. Numerical experiments have shown that the method has a strong capability to identify the credit risk for commercial banks and offers a good tool for early warning.
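
    As a concrete illustration of a cross-entropy scheme for rare-event probability estimation, here is a hedged sketch on a stand-in problem (a Gaussian tail instead of the authors' credit model; all names and parameters are hypothetical): the sampling distribution is tilted toward the rare region in stages, and the final estimate is corrected with likelihood-ratio weights.

        import math
        import random
        import statistics

        def cross_entropy_tail(threshold, n=10_000, rho=0.1, iters=10):
            """Estimate P(X >= threshold) for X ~ N(0,1) by cross-entropy."""
            mu = 0.0
            for _ in range(iters):
                xs = sorted(random.gauss(mu, 1.0) for _ in range(n))
                elite = xs[int((1 - rho) * n):]        # best rho-fraction
                level = min(elite[0], threshold)
                mu = statistics.fmean(x for x in elite if x >= level)
                if level >= threshold:                 # rare region reached
                    break
            xs = [random.gauss(mu, 1.0) for _ in range(n)]
            # Likelihood ratio of N(0,1) against the tilted N(mu,1).
            w = [math.exp(-mu * x + mu * mu / 2) for x in xs]
            return sum(wi for wi, x in zip(w, xs) if x >= threshold) / n

        print(cross_entropy_tail(4.0))   # ~3.2e-5, hard for plain Monte Carlo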

  7. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  8. Self-Adaptive Event-Driven Simulation of Multi-Scale Plasma Systems

    NASA Astrophysics Data System (ADS)

    Omelchenko, Yuri; Karimabadi, Homayoun

    2005-10-01

    Multi-scale plasmas pose a formidable computational challenge. Explicit time-stepping models suffer from the global CFL restriction. Efficient application of adaptive mesh refinement (AMR) to systems with irregular dynamics (e.g. turbulence, diffusion-convection-reaction, particle acceleration, etc.) may be problematic. To address these issues, we developed an alternative approach to time stepping: self-adaptive discrete-event simulation (DES). DES has its origins in operations research, war games and telecommunications. We combine finite-difference and particle-in-cell techniques with this methodology by adopting two assumptions: (1) a local time increment dt for a discrete quantity f can be expressed in terms of a physically meaningful quantum value df; (2) f is considered to be modified only when its change exceeds df. Event-driven time integration is self-adaptive as it makes use of causality rules rather than parametric time dependencies. This technique enables asynchronous, flux-conservative updates of the solution in accordance with local temporal scales, removes the curse of the global CFL condition, eliminates unnecessary computation in inactive spatial regions and results in robust, fast, parallelizable codes. It can be naturally combined with various mesh refinement techniques. We discuss applications of this novel technology to diffusion-convection-reaction systems and hybrid simulations of magnetosonic shocks.
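
    The quantum rule in (1)-(2) can be made concrete for a single decaying quantity (a toy sketch under stated assumptions, not the authors' plasma code): the next event for f is scheduled a local increment dt = df/|df/dt| ahead, and f changes only in steps of the quantum df.

        import heapq

        def event_driven_decay(f0=1.0, rate=0.3, df=0.05, t_end=10.0):
            """Integrate df/dt = -rate*f by events rather than global steps."""
            t, f = 0.0, f0
            queue = [df / (rate * f0)]        # time of the first quantum change
            history = [(t, f)]
            while queue:
                t = heapq.heappop(queue)
                if t > t_end or f <= df:
                    break
                f -= df                       # quantised change of size df
                history.append((t, f))
                deriv = rate * f
                heapq.heappush(queue, t + df / deriv)   # local dt = df/|f'|
            return history

    With many cells, each would carry its own next-event time in the queue, so active regions update often while quiescent ones cost nothing, which is the point made above.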

  9. Simulation of a 36 h solar particle event at LLUMC using a proton beam scanning system

    NASA Astrophysics Data System (ADS)

    Coutrakon, G. B.; Benton, E. R.; Gridley, D. S.; Hickey, T.; Hubbard, J.; Koss, P.; Moyers, M. F.; Nelson, G. A.; Pecaut, M. J.; Sanders, E.; Shahnazi, K.

    2007-08-01

    A radiation biology experiment was performed in the research room of the proton therapy facility at Loma Linda University Medical Center to simulate the proton exposure produced by a solar particle event. The experiment used two scanning magnets for X and Y deflection of the proton beam and covered a usable target area of nearly 1 m². The magnet scanning control system consisted of LabVIEW 6.0 software running on a PC. The goal of this experiment was to study the immune system response of 48 mice simultaneously exposed to 2 Gy of protons that simulated the dose rate and energy spectrum of the September 1989 solar particle event. The 2 Gy dose was delivered to the entrance of the mice cages over 36 h. Both ion chamber and TLD measurements indicated that the dose delivered was within 9% of the intended value. A spot scanning technique using one spot per accelerator cycle (2.2 s) was used to deliver doses as low as 1 μGy per beam spot. Rapid beam termination (less than 5 ms) on each spot was obtained by energizing a quadrupole in the proton synchrotron once the dose limit was reached for each spot. A parallel plate ion chamber placed adjacent to the mice cages provided fluence (or dose) measurements for each beam energy during each hour of the experiment. An intensity modulated spot scanning technique can be used in a variety of ways for radiation biology, and a second experiment is being designed with this proton beam scanning system to simultaneously irradiate four groups of mice with different dose rates within the 1 m² area. Also, large electronic devices being tested for radiation damage have been exposed in this beam without the use of patch fields. The same scanning system has potential application for intensity modulated proton therapy (IMPT) as well. This paper discusses the beam delivery system and dosimetry of the irradiation.

  10. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.

  11. The IDES framework: A case study in development of a parallel discrete-event simulation system

    SciTech Connect

    Nicol, D.M.; Johnson, M.M.; Yoshimura, A.S.

    1997-12-31

    This tutorial describes considerations in the design and development of the IDES parallel simulation system. IDES is a Java-based parallel/distributed simulation system designed to support the study of complex large-scale enterprise systems. Using the IDES system as an example, the authors discuss how anticipated model and system constraints molded the design decisions with respect to modeling, synchronization, and communication strategies.

  12. Spontaneous onset of a Madden-Julian oscillation event in a cloud-system-resolving simulation

    NASA Astrophysics Data System (ADS)

    Miura, Hiroaki; Satoh, Masaki; Katsumata, Masaki

    2009-07-01

    Spontaneous onset of a Madden-Julian Oscillation (MJO) event in November 2006 was reproduced at the proper location and time by a global cloud-resolving model (CRM) used with a relatively coarse horizontal grid. Preconditioning of moisture was simulated about 4 days prior to the onset in the Indian Ocean, in agreement with in-situ observational data. To investigate the influence of the zonal Sea Surface Temperature (SST) gradient in the Indian Ocean, we conducted a sensitivity study comparing composites made from five ensemble simulations. It was found that the eastward-moving signal of this MJO event could be obscured if SST were zonally uniform in the western Indian Ocean. The zonal SST gradient has not been considered important in previous studies of the MJO onset, but an SST distribution with cooler SST on the western side possibly helps enhance convection in the slow eastward-moving envelopes of the MJO.

  13. Dynamic simulation recalls condensate piping event

    SciTech Connect

    Farrell, R.J.; Reneberg, K.O.; Moy, H.C.

    1994-05-01

    This article describes how experience gained from simulating and reconstructing a condensate piping event will be used by Consolidated Edison to analyze control system problems. A cooperative effort by Con Edison and the Chemical Engineering Department at Polytechnic University used the Modular Modeling System (MMS) to investigate the probable cause of a Con Edison condensate piping event. Con Edison commissioned the work to serve as a case study for the more general problem of control systems analysis using dynamic simulation and MMS.

  14. Simulation of Heinrich events and their climate impact with an Earth system model

    NASA Astrophysics Data System (ADS)

    Ganopolski, A.; Calov, R.

    2003-04-01

    Heinrich events, related to large-scale surges of the Laurentide Ice Sheet (LIS) into the Atlantic Ocean, represent one of the most dramatic types of abrupt climate change occurring during the glacial age. MacAyeal proposed a "binge/purge" free oscillatory mechanism, explaining Heinrich events (HEs) as transitions between two modes of operation of ice sheets: slow movement of ice over a frozen base, and a fast sliding mode when the ice bed is at the melting point. This type of self-sustained multi-millennial oscillation has been simulated in simplified 2-D ice sheet models, but in realistic 3-D models such a large-scale instability of the LIS has not so far been reproduced. Here, using a coupled atmosphere-ocean-vegetation-ice sheet model, we simulate quasi-periodic large-scale rapid surges from the Laurentide Ice Sheet under typical glacial climate conditions. The average time between simulated events is about 7,000 yrs, while the surging phase of each event lasts only several hundred years, with a total ice volume discharge corresponding to 5-10 m of sea level rise. The crucial factor needed for the existence of mega-surges in our model is the employment of different sliding laws over hard bed (rocks) and over soft water-saturated sediments, like those in Hudson Bay and Hudson Strait. The area of deformable sediments serves as a geological template for the mega-surges. During each HE, the elevation drops by more than one km over Hudson Bay, and the Laurentide ice sheet changes from a one-dome to a two-dome structure, with one dome located over the southeast of Alberta and another over the southwest of Quebec. In our model the ice surges represent internal oscillations of the ice sheet related to rapid transitions between two metastable modes of ice sheet dynamics over the area covered by deformable sediments. At the same time, we demonstrate the possibility of both internal and external synchronization between instabilities of different ice sheets, as indicated in palaeoclimate records.

  15. StochKit2: software for discrete stochastic simulation of biochemical systems with events

    PubMed Central

    Sanft, Kevin R.; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R.

    2011-01-01

    Summary: StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), and tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. Availability: StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/. Contact: petzold@engineering.ucsb.edu PMID:21727139
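
    For orientation, the direct-method SSA that StochKit2 implements (among other variants) fits in a few lines; this is a hedged generic sketch, not StochKit2's own code or API.

        import math
        import random

        def ssa_direct(x, stoich, propensities, t_end):
            """Gillespie's direct method.

            x            : dict of species counts (mutated in place)
            stoich       : per-reaction dict of species changes
            propensities : per-reaction functions a_j(x)
            """
            t, path = 0.0, [(0.0, dict(x))]
            while t < t_end:
                a = [p(x) for p in propensities]
                a0 = sum(a)
                if a0 == 0.0:
                    break
                t += -math.log(1.0 - random.random()) / a0   # waiting time
                u, j = random.random() * a0, 0
                while u > a[j]:                              # pick reaction j
                    u -= a[j]
                    j += 1
                for species, change in stoich[j].items():    # fire it
                    x[species] += change
                path.append((t, dict(x)))
            return path

        # Isomerisation A -> B with rate constant 0.5.
        path = ssa_direct({"A": 100, "B": 0},
                          [{"A": -1, "B": +1}],
                          [lambda x: 0.5 * x["A"]],
                          t_end=20.0)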

  16. Anticipating the Chaotic Behaviour of Industrial Systems Based on Stochastic, Event-Driven Simulations

    NASA Astrophysics Data System (ADS)

    Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra

    2004-08-01

    In logistics and industrial production, managers must deal with the impact of stochastic events to improve performance and reduce costs. In fact, production and logistics systems are generally designed considering some parameters as deterministically distributed. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, especially for estimated parameters (i.e. market demand). The proposed methodology can determine the impact of stochastic events in the system by evaluating the chaotic threshold level. Such an approach, based on the application of a new and innovative methodology, can be implemented to find the conditions under which chaos makes the system become uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and contingency plan estimation. In this paper the authors illustrate the methodology with respect to a real industrial case: a production problem related to the logistics of distributed chemical processing.

  17. Algorithm and simulation development in support of response strategies for contamination events in air and water systems.

    SciTech Connect

    Waanders, Bart Van Bloemen

    2006-01-01

    Chemical/Biological/Radiological (CBR) contamination events pose a considerable threat to our nation's infrastructure, especially in large internal facilities, external flows, and water distribution systems. Because physical security can only be enforced to a limited degree, deployment of early warning systems is being considered. However, to achieve reliable and efficient functionality, several complex questions must be answered: (1) where should sensors be placed, (2) how can sparse sensor information be efficiently used to determine the location of the original intrusion, (3) what are the model and data uncertainties, (4) how should these uncertainties be handled, and (5) how can our algorithms and forward simulations be sufficiently improved to achieve real-time performance? This report presents the results of a three-year algorithm and application development effort to support the identification, mitigation, and risk assessment of CBR contamination events. The main thrust of this investigation was to develop (1) computationally efficient algorithms for strategically placing sensors, (2) processes for identifying contamination events from sparse observations, (3) characterization of uncertainty through accurate demand forecasts and investigation of uncertain simulation model parameters, (4) risk assessment capabilities, and (5) reduced order modeling methods. The development effort was focused on water distribution systems, large internal facilities, and outdoor areas.

  18. A discrete event method for wave simulation

    SciTech Connect

    Nutaro, James J

    2006-01-01

    This article describes a discrete event interpretation of the finite difference time domain (FDTD) and digital waveguide network (DWN) wave simulation schemes. The discrete event method is formalized using the discrete event system specification (DEVS). The scheme is shown to have errors that are proportional to the resolution of the spatial grid. A numerical example demonstrates the relative efficiency of the scheme with respect to FDTD and DWN schemes. The potential for the discrete event scheme to reduce numerical dispersion and attenuation errors is discussed.

  19. The global event system

    SciTech Connect

    Winans, J.

    1994-03-02

    The support for the global event system has been designed to allow an application developer to control the APS event generator and receiver boards. This is done by the use of four new record types. These records are customized and are only supported by the device support modules for the APS event generator and receiver boards. The use of the global event system and its associated records should not be confused with the vanilla EPICS events and the associated event records. They are very different.

  20. Weighted next reaction method and parameter selection for efficient simulation of rare events in biochemical reaction systems

    PubMed Central

    2011-01-01

    The weighted stochastic simulation algorithm (wSSA) recently developed by Kuwahara and Mura and the refined wSSA proposed by Gillespie et al. based on the importance sampling technique open the door for efficient estimation of the probability of rare events in biochemical reaction systems. In this paper, we first apply the importance sampling technique to the next reaction method (NRM) of the stochastic simulation algorithm and develop a weighted NRM (wNRM). We then develop a systematic method for selecting the values of importance sampling parameters, which can be applied to both the wSSA and the wNRM. Numerical results demonstrate that our parameter selection method can substantially improve the performance of the wSSA and the wNRM in terms of simulation efficiency and accuracy. PMID:21910924
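
    The common core of the wSSA and wNRM is easy to state in code. A hedged sketch of one importance-sampled step (illustrative names; the bias parameters gamma_j are the kind of quantity the paper's selection method tunes): reactions are picked from biased propensities while the waiting time still uses the true total, and each step contributes a likelihood-ratio factor to the trajectory weight.

        import math
        import random

        def weighted_ssa_step(t, a, gamma):
            """One importance-sampled SSA step.

            a     : true propensities a_j at the current state
            gamma : multiplicative biases gamma_j > 0 favouring the rare event
            Returns (reaction index j, new time, weight factor for this step).
            """
            b = [aj * gj for aj, gj in zip(a, gamma)]   # biased propensities
            a0, b0 = sum(a), sum(b)
            t += -math.log(1.0 - random.random()) / a0  # time from true a0
            u, j = random.random() * b0, 0
            while u > b[j]:                             # biased selection
                u -= b[j]
                j += 1
            w = (a[j] / b[j]) * (b0 / a0)               # likelihood ratio
            return j, t, w

    The product of the per-step factors w along a trajectory is the weight that keeps the rare-event probability estimate unbiased.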

  1. Workflow in Clinical Trial Sites & Its Association with Near Miss Events for Data Quality: Ethnographic, Workflow & Systems Simulation

    PubMed Central

    Araujo de Carvalho, Elias Cesar; Batilana, Adelia Portero; Claudino, Wederson; Lima Reis, Luiz Fernando; Schmerling, Rafael A.; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    Background With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify and address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Methodology/Principal Findings Two sites in São Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from the observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, a visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. Conclusions/Significance Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage

  2. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    NASA Technical Reports Server (NTRS)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.

  3. Numerical Simulations of Two Wildfire Events Using a Combined Modeling System (HIGRAD/BEHAVE)

    SciTech Connect

    Reisner, J.; Bossert, J.; Winterkamp, J.

    1997-12-31

    The ability to accurately forecast the spread of a wildfire would significantly reduce human suffering and loss of life, the destruction of property, and expenditures for assessment and recovery. To help achieve this goal we have developed a model which accurately simulates the interactions between winds and the heat source associated with a wildfire. We have termed our new model HIGRAD, or HIgh resolution model for strong GRADient applications. HIGRAD employs a sophisticated numerical technique to prevent numerical oscillations from occurring in the vicinity of the fire. Of importance for fire modeling, HIGRAD uses a numerical technique which allows for the use of a compressible equation set, but without the time-step restrictions associated with the propagation of sound waves.

  4. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict the future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
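
    Stripped of the causal model, the engine underneath any such simulator is a loop over a time-ordered queue in which consuming one event may create others. A minimal generic sketch (hypothetical names, not the EDSE API):

        import heapq
        import itertools

        def run(initial_events, t_end):
            """initial_events: (time, handler) pairs; handlers may return more."""
            order = itertools.count()          # tie-breaker for equal times
            queue = [(t, next(order), h) for t, h in initial_events]
            heapq.heapify(queue)
            while queue:
                t, _, handler = heapq.heappop(queue)      # event consumption
                if t > t_end:
                    break
                for t2, h2 in handler(t) or []:           # event creation
                    heapq.heappush(queue, (t2, next(order), h2))

        def tick(t):
            print(f"tick at {t:.1f}")
            return [(t + 2.5, tick)]          # re-schedule itself

        run([(0.0, tick)], t_end=10.0)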

  5. A Simbol-X Event Simulator

    SciTech Connect

    Puccetti, S.; Giommi, P.; Fiore, F.

    2009-05-11

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in simulating data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to ray-tracing simulators. These properties make our simulator advantageous for supporting the user in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  6. Running Parallel Discrete Event Simulators on Sierra

    SciTech Connect

    Barnes, P. D.; Jefferson, D. R.

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  7. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to insure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.

  8. Simulating Heinrich event 1 with interactive icebergs

    NASA Astrophysics Data System (ADS)

    Jongma, J. I.; Renssen, H.; Roche, D. M.

    2013-03-01

    During the last glacial, major abrupt climate events known as Heinrich events left distinct fingerprints of ice-rafted detritus and are thus associated with iceberg armadas: the release of many icebergs into the North Atlantic Ocean. We simulated the impact of a large armada of icebergs on glacial climate in a coupled atmosphere-ocean model. In our model, dynamic-thermodynamic icebergs influence the climate through two direct effects. First, melting of the icebergs causes freshening of the upper ocean, and second, the latent heat used in the phase transition of ice to water results in cooling of the iceberg surroundings. This cooling effect of icebergs is generally neglected in models. We investigated the role of the latent heat by performing a sensitivity experiment in which the cooling effect is switched off. At the peak of the simulated Heinrich event, icebergs lacking the latent heat flux are much less efficient in shutting down the meridional overturning circulation than icebergs that include both the freshening and the cooling effects. The cause of this intriguing result must be sought in the involvement of a secondary mechanism: facilitation of sea-ice formation, which can disturb deep water production at key convection sites, with consequences for the thermohaline circulation. We performed additional sensitivity experiments, designed to explore the effect of the more plausible distribution of the dynamic icebergs' melting fluxes compared to a classic hosing approach with homogeneous spreading of the melt fluxes over a section in the mid-latitude North Atlantic (NA) Ocean. The early response of the climate system is much stronger in the iceberg experiments than in the hosing experiments, which must be a distribution effect: the dynamically distributed icebergs quickly affect western NADW formation, which synergizes with direct sea-ice facilitation, causing an earlier sea-ice expansion and climatic response. Furthermore, compared to dynamic

  9. A Discrete Event Simulation Model for Evaluating the Performances of an M/G/C/C State Dependent Queuing System

    PubMed Central

    Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli

    2013-01-01

    M/G/C/C state dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern Discrete Simulation System (DES) software. We designed an approach to address this limitation and used it to construct the M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time, and the expected number of entities in a complex network topology. Results indicated that there is a range of arrival rates for each network where the simulation results fluctuate drastically across replications, causing the simulation results and analytical results to exhibit discrepancies. Detailed results showing how closely the simulation results tally with the analytical results, in both abstract and graphical forms, together with scientific justifications for these, have been documented and discussed. PMID:23560037
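
    The defining ingredient, a service rate that falls as the section fills, can be illustrated with a toy event-driven loop (a hedged sketch with exponential walk times standing in for the general 'G' distribution, and a linear congestion law of the kind used in pedestrian M/G/C/C models; all parameters hypothetical):

        import math
        import random

        def mgcc_sketch(lam, c, t_end, v1=1.5, length=10.0):
            """Arrivals Poisson(lam); n occupants walk at speed v1*(c+1-n)/c;
            arrivals finding n == c are blocked (the C-capacity rule)."""
            t, n = 0.0, 0
            served = blocked = 0
            while t < t_end:
                depart = n * v1 * (c + 1 - n) / (c * length) if n else 0.0
                total = lam + depart
                t += -math.log(1.0 - random.random()) / total
                if random.random() < lam / total:   # arrival event
                    if n < c:
                        n += 1
                    else:
                        blocked += 1
                else:                               # departure event
                    n -= 1
                    served += 1
            return served, blocked

        served, blocked = mgcc_sketch(lam=2.0, c=30, t_end=10_000.0)
        print("blocking probability ~", blocked / (served + blocked))  # rough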

  10. Seismic event classification system

    DOEpatents

    Dowla, F.U.; Jarpe, S.P.; Maurer, W.

    1994-12-13

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.
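
    The pre-processing chain in the abstract (time-frequency distribution, binary map, magnitude of the 2-D FFT) is straightforward to express; a hedged sketch assuming a spectrogram as the time-frequency distribution (not necessarily the patent's exact choice):

        import numpy as np
        from scipy.signal import spectrogram

        def shift_invariant_features(trace, fs, threshold=None):
            """Binary time-frequency map -> |2-D FFT| feature image.

            The FFT magnitude is invariant to shifts of the pattern in time
            and frequency, so similarly shaped events give similar inputs
            to the self-organizing neural network.
            """
            f, t, sxx = spectrogram(trace, fs=fs)
            if threshold is None:
                threshold = sxx.mean()
            binary = (sxx > threshold).astype(float)   # binary representation
            return np.abs(np.fft.fft2(binary))         # shift-invariant map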

  11. Seismic event classification system

    DOEpatents

    Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William

    1994-01-01

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.

  12. Discrete-Event Simulation in Chemical Engineering.

    ERIC Educational Resources Information Center

    Schultheisz, Daniel; Sommerfeld, Jude T.

    1988-01-01

    Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)

  13. An Overview of Importance Splitting for Rare Event Simulation

    ERIC Educational Resources Information Center

    Morio, Jerome; Pastel, Rudy; Le Gland, Francois

    2010-01-01

    Monte Carlo simulations are a classical tool to analyse physical systems. When unlikely events are to be simulated, the importance sampling technique is often used instead of Monte Carlo. Importance sampling has some drawbacks when the problem dimensionality is high or when the optimal importance sampling density is complex to obtain. In this…

  14. Scanning picosecond tunable laser system for simulating MeV heavy ion-induced charge collection events as a function of temperature

    NASA Astrophysics Data System (ADS)

    Laird, Jamie Stuart; Chen, Yuan; Scheick, Leif; Vo, Tuan; Johnston, Allan

    2008-08-01

    A new methodology for using scanning picosecond laser microscopy to simulate cosmic ray induced radiation effects as a function of temperature is described in detail. The built system is centered on diffraction-limited focusing of the output from a broadband (690-960 nm) ultrafast Ti:sapphire Tsunami laser pumped by a 532 nm Millennia laser. An acousto-optic modulator is used to provide pulse picking down to event rates necessary for the technologies and effects under study. The temperature dependence of the charge generation process for ions and photons is briefly reviewed and the need for wavelength tunability is discussed. An appropriate wavelength selection is critical for proper emulation of ion events over a wide temperature range. The system developed is detailed and illustrated by way of example on a deep-submicron complementary metal-oxide semiconductor test structure.

  15. MHD simulation of the Bastille day event

    NASA Astrophysics Data System (ADS)

    Linker, Jon; Torok, Tibor; Downs, Cooper; Lionello, Roberto; Titov, Viacheslav; Caplan, Ronald M.; Mikić, Zoran; Riley, Pete

    2016-03-01

    We describe a time-dependent, thermodynamic, three-dimensional MHD simulation of the July 14, 2000 coronal mass ejection (CME) and flare. The simulation starts with a background corona developed using an MDI-derived magnetic map for the boundary condition. Flux ropes using the modified Titov-Demoulin (TDm) model are used to energize the pre-event active region, which is then destabilized by photospheric flows that cancel flux near the polarity inversion line. More than 10³³ ergs are impulsively released in the simulated eruption, driving a CME at 1500 km/s, close to the observed speed of 1700 km/s. The post-flare emission in the simulation is morphologically similar to the observed post-flare loops. The resulting flux rope that propagates to 1 AU is similar in character to the flux rope observed at 1 AU, but the simulated ICME center passes 15° north of Earth.

  16. Event-by-Event Simulation of Induced Fission

    SciTech Connect

    Vogt, R; Randrup, J

    2007-12-13

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels, each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments, which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  17. Complete event simulations of nuclear fission

    NASA Astrophysics Data System (ADS)

    Vogt, Ramona

    2015-10-01

    For many years, the state of the art for treating fission in radiation transport codes has involved sampling from average distributions. In these average fission models energy is not explicitly conserved and everything is uncorrelated because all particles are emitted independently. However, in a true fission event, the energies, momenta and multiplicities of the emitted particles are correlated. Such correlations are interesting for many modern applications. Event-by-event generation of complete fission events makes it possible to retain the kinematic information for all particles emitted: the fission products as well as prompt neutrons and photons. It is therefore possible to extract any desired correlation observables. Complete event simulations can be included in general Monte Carlo transport codes. We describe the general functionality of currently available fission event generators and compare results for several important observables. This work was performed under the auspices of the US DOE by LLNL, Contract DE-AC52-07NA27344. We acknowledge support of the Office of Defense Nuclear Nonproliferation Research and Development in DOE/NNSA.

  18. Optical eye simulator for laser dazzle events.

    PubMed

    Coelho, João M P; Freitas, José; Williamson, Craig A

    2016-03-20

    An optical simulator of the human eye and its application to laser dazzle events are presented. The simulator combines optical design software (ZEMAX) with a scientific programming language (MATLAB) and allows the user to implement and analyze a dazzle scenario using practical, real-world parameters. Contrary to conventional analytical glare analysis, this work uses ray tracing and the scattering model and parameters for each optical element of the eye. The theoretical background of each such element is presented in relation to the model. The overall simulator's calibration, validation, and performance analysis are achieved by comparison with a simpler model based upon CIE disability glare data. Results demonstrate that this kind of advanced optical eye simulation can be used to represent laser dazzle and has the potential to extend the range of applicability of analytical models. PMID:27140558

  19. Distributed discrete event simulation. Final report

    SciTech Connect

    De Vries, R.C.

    1988-02-01

    The presentation given here is restricted to discrete event simulation. The complexity of and time required for many present and potential discrete simulations exceeds the reasonable capacity of most present serial computers. The desire, then, is to implement the simulations on a parallel machine. However, certain problems arise in an effort to program the simulation on a parallel machine. In one category of methods, deadlock can arise, and some method is required either to detect deadlock and recover from it or to avoid deadlock through information passing. In the second category of methods, potentially incorrect simulations are allowed to proceed. If the situation is later determined to be incorrect, recovery from the error must be initiated. In either case, computation and information passing are required which would not be required in a serial implementation. The net effect is that the parallel simulation may not be much better than a serial simulation. In an effort to determine alternate approaches, important papers in the area were reviewed. As a part of that review process, each of the papers was summarized. The summary of each paper is presented in this report in the hope that those doing future work in the area will be able to gain insight that might not otherwise be available, and to aid in deciding which papers would be most beneficial to pursue in more detail. The papers are broken down into categories and then by author. Conclusions reached after examining the papers and other material, such as direct talks with an author, are presented in the last section. Also presented there are some ideas that surfaced late in the research effort. These promise to be of some benefit in limiting information which must be passed between processes and in better understanding the structure of a distributed simulation. Pursuit of these ideas seems appropriate.

  20. Parallel discrete-event simulation of FCFS stochastic queueing networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments) which has proven itself to be effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. It is shown how lookahead can be computed for FCFS queueing network simulations; performance data are given that demonstrate the method's effectiveness under moderate to heavy loads, and performance tradeoffs between the quality of lookahead and the cost of computing lookahead are discussed.
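
    For a FCFS station the essential lookahead computation is short. A hedged sketch (illustrative, not Nicol's code): because service is first-come-first-served and service demands can be presampled, the station can promise an "appointment", a lower bound on its next departure, even when it is idle.

        def fcfs_lookahead(now, in_service_remaining, next_service_time):
            """Lower bound on this station's next possible departure time.

            If a job is in service, FCFS guarantees the next departure is its
            completion; if the station is idle, any future arrival must still
            receive its presampled service demand first.
            """
            if in_service_remaining > 0.0:
                return now + in_service_remaining
            return now + next_service_time

    Downstream neighbours may then safely simulate up to the minimum appointment over their incoming links.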

  1. Precision Event Simulation for Hadron Colliders

    NASA Astrophysics Data System (ADS)

    Hoeche, Stefan

    2016-03-01

    Hadron colliders are workhorses of particle physics, enabling scientific breakthroughs such as the discovery of the Higgs boson. Hadron beams reach the highest energies, but they also produce very complex collisions. Studying the underlying dynamics requires involved multi-particle calculations. Over the past decades Monte-Carlo simulation programs were developed to tackle this task. They have by now evolved into precision tools for theorists and experimenters alike. This talk will give an introduction to event generators and discuss the current status of development.

  2. Rare event simulation in radiation transport

    SciTech Connect

    Kollman, C.

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries, so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.

  3. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete event model as a real-time system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple enough to run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved with Petri nets, which are a pragmatic tool for modelling industrial discrete event systems. To highlight timing, the timed Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. The proposed simulation of the robotic system using timed Petri nets offers the opportunity to observe the timing of robotic operations: from transport and transmission times measured on the spot, graphics are obtained showing the average time of the transport activity for individual parameter sets of finished products.
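
    The Petri net semantics underlying such models is compact enough to state directly (a generic sketch, not the Visual Object Net++ interface): a transition is enabled when every input place holds enough tokens, and firing it moves tokens from input to output places.

        def enabled(marking, pre):
            """True if every input place holds at least the required tokens."""
            return all(marking.get(p, 0) >= w for p, w in pre.items())

        def fire(marking, pre, post):
            """Fire one transition: consume input tokens, produce output ones."""
            assert enabled(marking, pre)
            m = dict(marking)
            for p, w in pre.items():
                m[p] -= w
            for p, w in post.items():
                m[p] = m.get(p, 0) + w
            return m

        # Robot cell: a waiting part and a free robot yield a busy robot.
        m0 = {"part_waiting": 3, "robot_free": 1}
        m1 = fire(m0, {"part_waiting": 1, "robot_free": 1}, {"robot_busy": 1})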

  4. Event Index — an LHCb Event Search System

    NASA Astrophysics Data System (ADS)

    Ustyuzhanin, A.; Artemov, A.; Kazeev, N.; Redkin, A.

    2015-12-01

    During LHC Run 1, the LHCb experiment recorded around 10¹¹ collision events. This paper describes Event Index — an event search system. Its primary function is to quickly select subsets of events from a combination of conditions, such as the estimated decay channel or number of hits in a subdetector. Event Index is essentially Apache Lucene [1] optimized for read-only indexes distributed over independent shards on independent nodes.

  5. The LCLS Timing Event System

    SciTech Connect

    Dusatko, John; Allison, S.; Browne, M.; Krejcik, P.

    2012-07-23

    The Linac Coherent Light Source requires precision timing trigger signals for various accelerator diagnostics and controls at SLAC-NAL. A new timing system has been developed that meets these requirements. This system is based on COTS hardware with a mixture of custom-designed units. An added challenge has been the requirement that the LCLS Timing System must co-exist and 'know' about the existing SLC Timing System. This paper describes the architecture, construction and performance of the LCLS timing event system.

  6. Empirical study of simulated two-planet microlensing events

    SciTech Connect

    Zhu, Wei; Gould, Andrew; Penny, Matthew; Mao, Shude; Gendron, Rieul

    2014-10-10

    We undertake the first study of two-planet microlensing models recovered from simulations of microlensing events generated by realistic multiplanet systems in which 292 planetary events, including 16 two-planet events, were detected from 6690 simulated light curves. We find that when two planets are recovered, their parameters are usually close to those of the two planets in the system most responsible for the perturbations. However, in 1 of the 16 examples, the apparent mass of both detected planets was more than doubled by the unmodeled influence of a third, massive planet. This fraction is larger than but statistically consistent with the roughly 1.5% rate of serious mass errors due to unmodeled planetary companions for the 274 cases from the same simulation in which a single planet is recovered. We conjecture that an analogous effect due to unmodeled stellar companions may occur more frequently. For 7 out of 23 cases in which two planets in the system would have been detected separately, only one planet was recovered because the perturbations due to the two planets had similar forms. This is a small fraction (7/274) of all recovered single-planet models, but almost a third of all events that might plausibly have led to two-planet models. Still, in these cases, the recovered planet tends to have parameters similar to one of the two real planets most responsible for the anomaly.

  7. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remains underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. PMID:26922594
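
    For readers who want to see the bare mechanics behind such tools, this is a minimal hand-rolled discrete-event loop for a single-scanner queue. The arrival and scan rates are invented, and a real study would use a dedicated DES package as the article recommends.

```python
import heapq, random

random.seed(1)
t_end = 480.0                              # one 8-hour shift, in minutes
rate_in, rate_scan = 1 / 12.0, 1 / 10.0    # hypothetical arrival/service rates
fel = [(random.expovariate(rate_in), "arrival")]   # future event list
queue, busy, done = 0, False, 0

while fel:
    t, kind = heapq.heappop(fel)           # always process the earliest event
    if t > t_end:
        break
    if kind == "arrival":
        queue += 1
        heapq.heappush(fel, (t + random.expovariate(rate_in), "arrival"))
    else:                                  # "finish": scanner frees up
        busy = False
        done += 1
    if queue and not busy:                 # start the next scan if idle
        queue -= 1
        busy = True
        heapq.heappush(fel, (t + random.expovariate(rate_scan), "finish"))

print(f"patients scanned in one shift: {done}")
```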

  8. Event-based Simulation Model for Quantum Optics Experiments

    SciTech Connect

    De Raedt, H.; Michielsen, K.

    2011-03-28

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified description of multiple-beam fringes of a plane parallel plate and single-photon Mach-Zehnder interferometer, Wheeler's delayed choice, photon tunneling, quantum eraser, two-beam interference, double-slit, Einstein-Podolsky-Rosen-Bohm and Hanbury Brown-Twiss experiments. We also discuss the possibility to refute our corpuscular model.

  9. Terminal Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Meyers, Ronald

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man-made' system designed to serve specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.
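
    The flavor of such non-Lipschitz 'terminal' dynamics can be seen in a standard textbook example (not taken from the paper itself): an equilibrium that is reached in finite time.

```latex
% Terminal attractor example: the right-hand side is non-Lipschitz at x = 0.
\dot{x} = -x^{1/3}, \qquad x(0) = x_0 > 0 .
% Separating variables and integrating gives
x(t) = \left( x_0^{2/3} - \tfrac{2}{3}\, t \right)^{3/2},
% so the equilibrium x = 0 is reached at the finite time
t_\ast = \tfrac{3}{2}\, x_0^{2/3},
% whereas a Lipschitz right-hand side only approaches equilibrium asymptotically.
```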

  10. Data Systems Dynamic Simulator

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Clark, Melana; Davenport, Bill; Message, Philip

    1993-01-01

    The Data System Dynamic Simulator (DSDS) is a discrete event simulation tool. It was developed for NASA for the specific purpose of evaluating candidate architectures for data systems of the Space Station era. DSDS provides three methods for meeting this requirement. First, the user has access to a library of standard pre-programmed elements. These elements represent tailorable components of NASA data systems and can be connected in any logical manner. Secondly, DSDS supports the development of additional elements. This allows the more sophisticated DSDS user the option of extending the standard element set. Thirdly, DSDS supports the use of data streams simulation. Data streams is the name given to a technique that ignores packet boundaries, but is sensitive to rate changes. Because rate changes are rare compared to packet arrivals in a typical NASA data system, data stream simulations require a fraction of the CPU run time. Additionally, the data stream technique is considerably more accurate than another commonly-used optimization technique.

  11. Numerical Simulations of Hot Vertical Displacement Events

    NASA Astrophysics Data System (ADS)

    Bunkers, K. J.; Sovinec, C. R.

    2015-11-01

    Loss of vertical positioning control in tokamaks leads to instability where hot confined plasma rests against the chamber wall. Resistive-MHD modeling with the NIMROD code is applied to model these events. After divertor-coil current is perturbed, resistive diffusion through the non-ideal wall sets the timescale as the simulated tokamak evolves from a diverted equilibrium to a limited configuration. Results show that plasma outflow along opening magnetic surfaces, just outside the confinement zone, approaches the local ion-acoustic speed. The projection of the plasma flow velocity into the surface-normal direction (n . V) near the surface exceeds the local E × B drift speed; near surfaces n × E is approximately the same as n ×Ewall in the nearly steady conditions. The safety factor of flux surfaces that remain intact is approximately constant over the evolution time, which is much shorter than the plasma resistive diffusion time. Assessment of external-kink stability and initial findings from 3D nonlinear computations are presented. This effort is supported by the U.S. Dept. of Energy, award numbers DE-FG02-06ER54850 and DE-FC02-08ER54975.

  12. Calculation of fission observables through event-by-event simulation

    NASA Astrophysics Data System (ADS)

    Randrup, Jørgen; Vogt, Ramona

    2009-08-01

    The increased interest in more exclusive fission observables has demanded more detailed models. We present here a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including arbitrary correlations. The various model assumptions are described and the potential utility of the model is illustrated by means of several novel correlation observables.
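
    A cartoon of event-by-event generation is sketched below: each event carries a sampled neutron multiplicity and energies drawn against a shared energy budget, so correlations between observables come out automatically. The distributions and numbers are purely illustrative and are not FREYA's physics.

```python
import random

random.seed(0)

def sample_event(e_total=20.0, temperature=1.3):
    """One toy 'fission event': neutrons share a fixed energy budget (MeV)."""
    nu = random.choices([1, 2, 3, 4], weights=[0.2, 0.4, 0.3, 0.1])[0]
    neutrons, remaining = [], e_total
    for _ in range(nu):
        e = min(random.expovariate(1.0 / temperature), remaining)  # crude spectrum
        neutrons.append(e)
        remaining -= e                     # bookkeeping enforced per event
    return {"nu": nu, "neutron_energies": neutrons, "leftover_mev": remaining}

events = [sample_event() for _ in range(10000)]

# Because each event is complete, any correlation can be extracted afterwards,
# e.g. mean total neutron energy as a function of multiplicity:
for n in range(1, 5):
    totals = [sum(ev["neutron_energies"]) for ev in events if ev["nu"] == n]
    if totals:
        print(n, sum(totals) / len(totals))
```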

  13. Calculation of Fission Observables Through Event-by-Event Simulation

    SciTech Connect

    Randrup, J; Vogt, R

    2009-06-04

    The increased interest in more exclusive fission observables has demanded more detailed models. We present here a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including arbitrary correlations. The various model assumptions are described and the potential utility of the model is illustrated by means of several novel correlation observables.

  14. Single event effects and laser simulation studies

    NASA Technical Reports Server (NTRS)

    Kim, Q.; Schwartz, H.; Mccarty, K.; Coss, J.; Barnes, C.

    1993-01-01

    The single event upset (SEU) linear energy transfer threshold (LETTH) of radiation hardened 64K Static Random Access Memories (SRAM's) was measured with a picosecond pulsed dye laser system. These results were compared with standard heavy ion accelerator (Brookhaven National Laboratory (BNL)) measurements of the same SRAM's. With heavy ions, the LETTH of the Honeywell HC6364 was 27 MeV-sq cm/mg at 125 C compared with a value of 24 MeV-sq cm/mg obtained with the laser. In the case of the second type of 64K SRAM, the IBM6401CRH, no upsets were observed at 125 C with the highest LET ions used at BNL. In contrast, the pulsed dye laser tests indicated a value of 90 MeV-sq cm/mg at room temperature for the SEU-hardened IBM SRAM. No latchups or multiple SEU's were observed on any of the SRAM's even under worst case conditions. The results of this study suggest that the laser can be used as an inexpensive laboratory SEU prescreen tool in certain cases.

  15. New strategies for the simulation of rare events

    NASA Astrophysics Data System (ADS)

    Rahman, Jay Abid

    2002-01-01

    Rare events play an important role in numerous physical, chemical, and biological processes, including protein folding, relaxation in glasses, nucleation, isomerization, and diffusion. Understanding the dynamic and equilibrium characteristics of such processes has important implications for drug design, materials development, and catalysis. Simulations are used to obtain properties of these systems that are inaccessible through current experimental techniques. Unfortunately, simulations on these systems have been historically difficult not only due to the large system size, but also the rugged, multidimensional nature of the potential energy landscapes. Standard simulation methods fail for these systems as they generally become trapped in a local minimum, resulting in incomplete and inaccurate mapping of the potential energy surface. While there has been significant work in this area over the past several decades, the problem is still largely unresolved. In this work, we introduce a number of approaches for solving this problem. The first method, puddle skimming, adds a "puddle" potential to the surface, effectively raising the bottoms of the potential wells as though the surface were being filled with water. This reduces the amount of energy required to escape the well, decreasing the likelihood of trapping. This method is found to be best suited to smaller dimensional systems where the barrier heights are similar to each other. The second method, puddle jumping, adds additional "puddles" to the system and allows transitions between puddles. This method allows more complicated systems to be studied, both in terms of number of degrees of freedom and variety of barrier heights. We also apply the puddle strategy to transition path sampling, greatly extending the range of systems for which this method can be used to compute reaction rate constants with minimum additional work. Finally, we combine the puddle jumping method with parallel tempering, a state-of-the-art rare
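
    The core 'puddle' trick can be shown on a one-dimensional double well: sample with Metropolis Monte Carlo on the filled potential max(V, V_pud), which lowers the effective barrier, then reweight back to the true ensemble. All parameters here are invented for illustration and do not come from the thesis.

```python
import math, random

random.seed(2)
beta, v_pud = 3.0, 0.5

def v(x):
    return (x * x - 1.0) ** 2          # wells at x = +/-1, barrier V(0) = 1

def v_eff(x):
    return max(v(x), v_pud)            # 'fill the wells with water' up to v_pud

x, samples = -1.0, []
for _ in range(200000):
    x_new = x + random.uniform(-0.3, 0.3)
    if random.random() < math.exp(-beta * (v_eff(x_new) - v_eff(x))):
        x = x_new                      # Metropolis acceptance on the puddle surface
    samples.append(x)

# Reweight to recover canonical averages over the true potential V:
weights = [math.exp(-beta * (v(s) - v_eff(s))) for s in samples]
mean_x2 = sum(w * s * s for w, s in zip(weights, samples)) / sum(weights)
print(f"<x^2> with puddle sampling ~ {mean_x2:.3f}")
```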

  16. An adaptive synchronization protocol for parallel discrete event simulation

    SciTech Connect

    Bisset, K.R.

    1998-12-01

    Simulation, especially discrete event simulation (DES), is used in a variety of disciplines where numerical methods are difficult or impossible to apply. One problem with this method is that a sufficiently detailed simulation may take hours or days to execute, and multiple runs may be needed in order to generate the desired results. Parallel discrete event simulation (PDES) has been explored for many years as a method to decrease the time taken to execute a simulation. Many protocols have been developed which work well for particular types of simulations, but perform poorly when used for other types of simulations. Often it is difficult to know a priori whether a particular protocol is appropriate for a given problem. In this work, an adaptive synchronization method (ASM) is developed which works well on an entire spectrum of problems. The ASM determines, using an artificial neural network (ANN), the likelihood that a particular event is safe to process.

  17. The Advanced Photon Source event system

    SciTech Connect

    Lenkszus, F.R.; Laird, R.

    1995-12-31

    The Advanced Photon Source, like many other facilities, requires a means of transmitting timing information to distributed control system I/O controllers. The APS event system provides the means of distributing medium resolution/accuracy timing events throughout the facility. It consists of VME event generators and event receivers which are interconnected with 100 Mbit/sec fiber optic links at distances of up to 650m in either a star or a daisy chain configuration. The system's event throughput rate is 10 Mevents/sec with a peak-to-peak timing jitter down to 100 ns depending on the source of the event. It is integrated into the EPICS-based APS control system through record and device support. Event generators broadcast timing events over fiber optic links to event receivers which are programmed to decode specific events. Event generators generate events in response to external inputs, from internal programmable event sequence RAMs, and from VME bus writes. The event receivers can be programmed to generate both pulse and set/reset level outputs to synchronize hardware, and to generate interrupts to initiate EPICS record processing. In addition, each event receiver contains a time stamp counter which is used to provide synchronized time stamps to EPICS records.

  18. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-10-17

    This quarter, we have focused on several tasks: (1) Building a high-quality catalog of earthquake source parameters for the Middle East and East Asia. In East Asia, we computed source parameters using the CAP method for a set of events studied by Herrman et al. (MRR, 2006) using a complete waveform technique. Results indicated excellent agreement with the moment magnitudes in the range 3.5-5.5. Below magnitude 3.5 the scatter increases. For events with more than 2-3 observations at different azimuths, we found good agreement of focal mechanisms. Depths were generally consistent, although differences of up to 10 km were found. These results suggest that CAP modeling provides estimates of source parameters at least as reliable as complete waveform modeling techniques. However, East Asia and the Yellow Sea Korean Paraplatform (YSKP) region studied are relatively laterally homogeneous and may not benefit from the CAP method's flexibility to shift waveform segments to account for path-dependent model errors. A more challenging region to study is the Middle East where strong variations in sedimentary basin, crustal thickness and crustal and mantle seismic velocities greatly impact regional wave propagation. We applied the CAP method to a set of events in and around Iran and found good agreement between estimated focal mechanisms and those reported by the Global Centroid Moment Tensor (CMT) catalog. We found a possible bias in the moment magnitudes that may be due to the thick low-velocity crust in the Iranian Plateau. (2) Testing Methods on a Lifetime Regional Data Set. In particular, the recent 2/21/08 Nevada Event and Aftershock Sequence occurred in the middle of USArray, producing over a thousand records per event. The tectonic setting is quite similar to Central Iran and thus provides an excellent testbed for CAP+ at ranges out to 10°, including extensive observations of crustal thinning and thickening and various Pnl complexities. Broadband modeling in 1D, 2D

  19. Laser simulation of single event upsets

    SciTech Connect

    Buchner, S.P.; Wilson, D.; Kang, K.; Gill, D.; Mazer, J.A.; Raburn, W.D.; Campbell, A.B.; Knudson, A.R.

    1987-12-01

    A pulsed picosecond laser was used to produce upsets in both a commercial bipolar logic circuit and a specially designed CMOS SRAM test structure. Comparing the laser energy necessary for producing upsets in transistors that have different upset sensitivities with the single event upset (SEU) level predicted from circuit analysis showed that a picosecond laser could measure circuit sensitivity to SEUs. The technique makes it possible not only to test circuits rapidly for upset sensitivity but also, because the beam can be focussed down to a small spot size, to identify sensitive transistors.

  20. Computer simulation of underwater nuclear events

    SciTech Connect

    Kamegai, M.

    1986-09-01

    This report describes the computer simulation of two underwater nuclear explosions, Operation Wigwam and a modern hypothetical explosion of greater yield. The computer simulations were done in spherical geometry with the LASNEX computer code. Comparison of the LASNEX calculation with Snay's analytical results and the Wigwam measurements shows that agreement in the shock pressure versus range in water is better than 5%. The results of the calculations are also consistent with the cube root scaling law for an underwater blast wave. The time constant of the wave front was determined from the wave profiles taken at several points. The LASNEX time-constant calculation and Snay's theoretical results agree to within 20%. A time-constant-versus-range relation empirically fitted by Snay is valid only within a limited range at low pressures, whereas a time-constant formula based on Sedov's similarity solution holds at very high pressures. This leaves the intermediate pressure range with neither an empirical nor a theoretical formula for the time constant. These one-dimensional simulations demonstrate applicability of the computer code to investigations of this nature, and justify the use of this technique for more complex two-dimensional problems, namely, surface effects on underwater nuclear explosions. 16 refs., 8 figs., 2 tabs.
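
    The cube-root scaling law referenced above takes the standard similitude form, written here from general blast-scaling conventions rather than from the report's own fits: the peak pressure depends on yield and range only through the scaled range.

```latex
% Cube-root (similitude) scaling for an underwater blast wave:
P_{\max} = K \left( \frac{W^{1/3}}{R} \right)^{\alpha},
% where W is the yield, R the range, and K, \alpha are fitted constants.
% A fixed-pressure contour therefore scales as R \propto W^{1/3}, so doubling
% the yield moves it outward by a factor of 2^{1/3} \approx 1.26.
```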

  1. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, D; Tromp, J; Rodgers, A

    2007-07-16

    Comprehensive test ban monitoring in terms of location and discrimination has progressed significantly in recent years. However, the characterization of sources and the estimation of low yields remains a particular challenge. As the recent Korean shot demonstrated, we can probably expect to have a small set of teleseismic, far-regional and high-frequency regional data to analyze in estimating the yield of an event. Since stacking helps to bring signals out of the noise, it becomes useful to conduct comparable analyses on neighboring events, earthquakes in this case. If these auxiliary events have accurate moments and source descriptions, we have a means of directly comparing effective source strengths. Although we will rely on modeling codes, 1D, 2D, and 3D, we will also apply a broadband calibration procedure to use longer-period (P>5s) waveform data to calibrate short-period (P between 0.5 to 2 Hz) and high-frequency (P between 2 to 10 Hz) as path-specific station corrections from well-known regional sources. We have expanded our basic Cut-and-Paste (CAP) methodology to include not only timing shifts but also amplitude (f) corrections at recording sites. The name of this method was derived from source inversions that allow timing shifts between 'waveform segments' (or cutting the seismogram up and re-assembling) to correct for crustal variation. For convenience, we will refer to these f-dependent refinements as CAP+ for (SP) and CAP++ for still higher frequency. These methods allow the retrieval of source parameters using only P-waveforms where radiation patterns are obvious as demonstrated in this report and are well suited for explosion P-wave data. The method is easily extended to all distances because it uses Green's functions, although there may be some changes required in t* to adjust for offsets between local vs. teleseismic distances. In short, we use a mixture of model-dependent and empirical corrections to tackle the path effects. Although we rely on the

  2. Distribution system simulator

    NASA Technical Reports Server (NTRS)

    Bahrami, K. A.; Kirkham, H.; Rahman, S.

    1986-01-01

    In a series of tests performed under the Department of Energy auspices, power line carrier propagation was observed to be anomalous under certain circumstances. To investigate the cause, a distribution system simulator was constructed. The simulator was a physical simulator that accurately represented the distribution system from below power frequency to above 50 kHz. Effects such as phase-to-phase coupling and skin effect were modeled. Construction details of the simulator, and experimental results from its use are presented.

  3. An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation

    DOE PAGESBeta

    Nutaro, James

    2014-11-03

    In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system built within the adevs simulation package supports models with state-events and time-events that comprise differential-algebraic systems with high index. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.
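
    The state-event handling such a run-time must perform can be illustrated with a bouncing ball: integrate, watch for a zero crossing, bisect to the event time, then apply the event action. This is a generic sketch, not the adevs or OpenModelica implementation.

```python
def step(state, dt, g=9.81):
    """Advance (height, velocity) by dt under gravity (exact for constant g)."""
    h, vel = state
    return (h + vel * dt - 0.5 * g * dt * dt, vel - g * dt)

def crossing(state):                # state event: height passes through zero
    return state[0]

t, state, dt = 0.0, (10.0, 0.0), 0.01
while t < 5.0:
    nxt = step(state, dt)
    if crossing(nxt) < 0.0:         # event inside (t, t + dt): bisect for it
        lo, hi = 0.0, dt
        for _ in range(40):
            mid = 0.5 * (lo + hi)
            if crossing(step(state, mid)) < 0.0:
                hi = mid
            else:
                lo = mid
        state = step(state, lo)
        t += lo
        state = (0.0, -0.8 * state[1])   # event action: lossy bounce
    else:
        state, t = nxt, t + dt

print(f"height ~ {state[0]:.3f} m at t = {t:.2f} s")
```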

  4. Threat radar system simulations

    NASA Astrophysics Data System (ADS)

    Miller, L.

    The capabilities, requirements, and goals of radar emitter simulators are discussed. Simulators are used to evaluate competing receiver designs, to quantify the performance envelope of a radar system, and to model the characteristics of a transmitted signal waveform. A database of candidate threat systems is developed and, in concert with intelligence data on a given weapons system, permits upgrading simulators to new projected threat capabilities. Four currently available simulation techniques are summarized, noting the usefulness of developing modular software for fast controlled-cost upgrades of simulation capabilities.

  5. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-04-15

    The recent Nevada Earthquake (M=6) produced an extraordinary set of crustal guided waves. In this study, we examine the three-component data at all the USArray stations in terms of how well existing models perform in predicting the various phases, Rayleigh waves, Love waves, and Pnl waves. To establish the source parameters, we applied the Cut and Paste Code up to distance of 5° for an average local crustal model which produced a normal mechanism (strike=35°,dip=41°,rake=-85°) at a depth of 9 km and Mw=5.9. Assuming this mechanism, we generated synthetics at all distances for a number of 1D and 3D models. The Pnl observations fit the synthetics for the simple models well both in timing (VPn=7.9km/s) and waveform fits out to a distance of about 5°. Beyond this distance a great deal of complexity can be seen to the northwest apparently caused by shallow subducted slab material. These paths require considerable crustal thinning and higher P-velocities. Small delays and advances outline the various tectonic province to the south, Colorado Plateau, etc. with velocities compatible with that reported on by Song et al.(1996). Five-second Rayleigh waves (Airy Phase) can be observed throughout the whole array and show a great deal of variation ( up to 30s). In general, the Love waves are better behaved than the Rayleigh waves. We are presently adding higher frequency to the source description by including source complexity. Preliminary inversions suggest rupture to northeast with a shallow asperity. We are, also, inverting the aftershocks to extend the frequencies to 2 Hz and beyond following the calibration method outlined in Tan and Helmberger (2007). This will allow accurate directivity measurements for events with magnitude larger than 3.5. Thus, we will address the energy decay with distance as s function of frequency band for the various source types.

  6. Event simulation for colliders — A basic overview

    NASA Astrophysics Data System (ADS)

    Reuschle, Christian

    2015-05-01

    In this article we will discuss the basic calculational concepts to simulate particle physics events at high energy colliders. We will mainly focus on the physics in hadron colliders and particularly on the simulation of the perturbative parts, where we will in turn focus on the next-to-leading order QCD corrections.

  7. Event-by-event simulation of a quantum delayed-choice experiment

    NASA Astrophysics Data System (ADS)

    Donker, Hylke C.; De Raedt, Hans; Michielsen, Kristel

    2014-12-01

    The quantum delayed-choice experiment of Tang et al. (2012) is simulated on the level of individual events without making reference to concepts of quantum theory or without solving a wave equation. The simulation results are in excellent agreement with the quantum theoretical predictions of this experiment. The implication of the work presented in the present paper is that the experiment of Tang et al. can be explained in terms of cause-and-effect processes in an event-by-event manner.

  8. Optimization of Operations Resources via Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
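
    A stripped-down version of this approach might look like the following: integer decision variables, a noisy stand-in for the simulation as the objective, replicated evaluations to tame the noise, and standard selection, crossover and mutation. All names, costs and thresholds are hypothetical.

```python
import random

random.seed(3)

def simulate(crew, bays):
    """Stand-in for a DES run: resource cost plus a penalty for slow turnaround."""
    turnaround = 100.0 / (crew * bays) + random.gauss(0, 0.5)   # noisy output
    return 5 * crew + 8 * bays + (50 if turnaround > 4.0 else 0)

def fitness(design, reps=5):
    return sum(simulate(*design) for _ in range(reps)) / reps   # average replications

def ga(pop_size=20, gens=30):
    pop = [(random.randint(1, 10), random.randint(1, 10)) for _ in range(pop_size)]
    for _ in range(gens):
        parents = sorted(pop, key=fitness)[: pop_size // 2]     # selection
        pop = parents[:]                                        # elitism
        while len(pop) < pop_size:
            a, b = random.sample(parents, 2)                    # crossover
            child = (random.choice((a[0], b[0])), random.choice((a[1], b[1])))
            if random.random() < 0.2:                           # mutation
                child = (max(1, child[0] + random.choice((-1, 1))), child[1])
            pop.append(child)
    return min(pop, key=fitness)

print("best (crew, bays):", ga())
```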

  9. Designing Simulation Systems

    ERIC Educational Resources Information Center

    Twelker, Paul A.

    1969-01-01

    "The purpose of this paper is to outline the approach to designing instructional simulation systems developed at Teaching Research. The 13 phases of simulation design will be summarized, and an effort will be made to expose the vital decision points that confront the designer as he develops simulation experiences. (Author)

  10. Simulating Single-Event Upsets in Bipolar RAM's

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.

    1986-01-01

    Simulation technique saves testing. Uses interactive version of SPICE (Simulation Program with Integrated Circuit Emphasis). Device and subcircuit models available in software used to construct macromodel for an integrated bipolar transistor. Time-dependent current generators placed inside transistor macromodel to simulate charge collection from ion track. Significant finding of experiments is standard design practice of reducing power in unaddressed bipolar RAM cell increases sensitivity of cell to single-event upsets.

  11. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  12. Instrumented Architectural Simulation System

    NASA Technical Reports Server (NTRS)

    Delagi, B. A.; Saraiya, N.; Nishimura, S.; Byrd, G.

    1987-01-01

    Simulation of systems at an architectural level can offer an effective way to study critical design choices if (1) the performance of the simulator is adequate to examine designs executing significant code bodies, not just toy problems or small application fragments, (2) the details of the simulation include the critical details of the design, (3) the view of the design presented by the simulator instrumentation leads to useful insights on the problems with the design, and (4) there is enough flexibility in the simulation system so that the asking of unplanned questions is not suppressed by the weight of the mechanics involved in making changes either in the design or its measurement. A simulation system with these goals is described together with the approach to its implementation. Its application to the study of a particular class of multiprocessor hardware system architectures is illustrated.

  13. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    SciTech Connect

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight in device performance and aids in topology and system optimization.

  14. Event-by-event fission simulation code, generates complete fission events

    Energy Science and Technology Software Center (ESTSC)

    2013-04-01

    FREYA is a computer code that generates complete fission events. The output includes the energy and momentum of these final state particles: fission products, prompt neutrons and prompt photons. The version of FREYA that is to be released is a module for MCNP6.

  15. Discretely Integrated Condition Event (DICE) Simulation for Pharmacoeconomics.

    PubMed

    Caro, J Jaime

    2016-07-01

    Several decision-analytic modeling techniques are in use for pharmacoeconomic analyses. Discretely integrated condition event (DICE) simulation is proposed as a unifying approach that has been deliberately designed to meet the modeling requirements in a straightforward transparent way, without forcing assumptions (e.g., only one transition per cycle) or unnecessary complexity. At the core of DICE are conditions that represent aspects that persist over time. They have levels that can change and many may coexist. Events reflect instantaneous occurrences that may modify some conditions or the timing of other events. The conditions are discretely integrated with events by updating their levels at those times. Profiles of determinant values allow for differences among patients in the predictors of the disease course. Any number of valuations (e.g., utility, cost, willingness-to-pay) of conditions and events can be applied concurrently in a single run. A DICE model is conveniently specified in a series of tables that follow a consistent format and the simulation can be implemented fully in MS Excel, facilitating review and validation. DICE incorporates both state-transition (Markov) models and non-resource-constrained discrete event simulation in a single formulation; it can be executed as a cohort or a microsimulation; and deterministically or stochastically. PMID:26961779
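
    A minimal sketch of the discretely-integrated pattern follows, with invented conditions, event times, costs and utilities; the published method is specified in tables and typically run in MS Excel rather than in code like this.

```python
import heapq

conditions = {"alive": 1, "disease_level": 0, "cost": 0.0, "qaly": 0.0}
agenda = [(0.5, "progress"), (2.0, "progress"), (3.1, "death")]  # times in years
heapq.heapify(agenda)
now = 0.0

while agenda and conditions["alive"]:
    t, event = heapq.heappop(agenda)
    dt = t - now
    # Discretely integrate: value the persisting conditions over the interval.
    conditions["cost"] += dt * (1000.0 + 4000.0 * conditions["disease_level"])
    conditions["qaly"] += dt * (0.85 - 0.20 * conditions["disease_level"])
    now = t
    if event == "progress":       # events change condition levels...
        conditions["disease_level"] += 1
    elif event == "death":        # ...or end the profile
        conditions["alive"] = 0

print(f"cost = {conditions['cost']:.0f}, QALYs = {conditions['qaly']:.2f}")
```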

  16. Simulation and study of small numbers of random events

    NASA Technical Reports Server (NTRS)

    Shelton, R. D.

    1986-01-01

    Random events were simulated by computer and subjected to various statistical methods to extract important parameters. Various forms of curve fitting were explored, such as least squares, least distance from a line, and maximum likelihood. Problems considered were dead time, exponential decay, and spectrum extraction from cosmic ray data using binned data and data from individual events. Computer programs, mostly of an iterative nature, were developed to do these simulations and extractions and are partially listed as appendices. The mathematical basis for the computer programs is given.
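
    As one concrete instance of the individual-event approach described above, the maximum-likelihood estimate of an exponential decay constant has a closed form; the data below are synthetic and the sample size deliberately small.

```python
import math, random

random.seed(4)
tau_true = 2.0
events = [random.expovariate(1.0 / tau_true) for _ in range(25)]   # 25 decay times

# For the density (1/tau) exp(-t/tau), the log-likelihood
#   L(tau) = -n ln(tau) - sum(t_i) / tau
# is maximized in closed form by the sample mean:
tau_hat = sum(events) / len(events)

# With few events the estimate scatters; its relative error is ~ 1/sqrt(n).
print(f"tau_hat = {tau_hat:.2f}  (true {tau_true}, "
      f"expected scatter ~ {tau_true / math.sqrt(len(events)):.2f})")
```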

  17. Event-by-event simulation of experiments to create entanglement and violate Bell inequalities

    NASA Astrophysics Data System (ADS)

    Michielsen, K.; De Raedt, H.

    2013-10-01

    We discuss a discrete-event, particle-based simulation approach which reproduces the statistical distributions of Maxwell's theory and quantum theory by generating detection events one-by-one. This event-based approach gives a unified causeand- effect description of quantum optics experiments such as single-photon Mach-Zehnder interferometer, Wheeler's delayed choice, quantum eraser, double-slit, Einstein-Podolsky-Rosen-Bohm and Hanbury Brown-Twiss experiments, and various neutron interferometry experiments. We illustrate the approach by application to single-photon Einstein-Podolsky- Rosen-Bohm experiments and single-neutron interferometry experiments that violate a Bell inequality.

  18. Route to extreme events in excitable systems

    NASA Astrophysics Data System (ADS)

    Karnatak, Rajat; Ansmann, Gerrit; Feudel, Ulrike; Lehnertz, Klaus

    2014-08-01

    Systems of FitzHugh-Nagumo units with different coupling topologies are capable of self-generating and -terminating strong deviations from their regular dynamics that can be regarded as extreme events due to their rareness and recurrent occurrence. Here we demonstrate the crucial role of an interior crisis in the emergence of extreme events. In parameter space we identify this interior crisis as the organizing center of the dynamics by employing concepts of mixed-mode oscillations and of leaking chaotic systems. We find that extreme events occur in certain regions in parameter space, and we show the robustness of this phenomenon with respect to the system size.

  19. Simulations and Characteristics of Large Solar Events Propagating Throughout the Heliosphere and Beyond (Invited)

    NASA Astrophysics Data System (ADS)

    Intriligator, D. S.; Sun, W.; Detman, T. R.; Dryer, M.; Intriligator, J.; Deehr, C. S.; Webber, W. R.; Gloeckler, G.; Miller, W. D.

    2015-12-01

    Large solar events can have severe adverse global impacts at Earth. These solar events also can propagate throughout the heliosphere and into the interstellar medium. We focus on the July 2012 and Halloween 2003 solar events. We simulate these events starting from the vicinity of the Sun at 2.5 Rs. We compare our three dimensional (3D) time-dependent simulations to available spacecraft (s/c) observations at 1 AU and beyond. Based on the comparisons of the predictions from our simulations with in-situ measurements we find that the effects of these large solar events can be observed in the outer heliosphere, the heliosheath, and even into the interstellar medium. We use two simulation models. The HAFSS (HAF Source Surface) model is a kinematic model. HHMS-PI (Hybrid Heliospheric Modeling System with Pickup protons) is a numerical magnetohydrodynamic solar wind (SW) simulation model. Both HHMS-PI and HAFSS are ideally suited for these analyses since starting at 2.5 Rs from the Sun they model the slowly evolving background SW and the impulsive, time-dependent events associated with solar activity. Our models naturally reproduce dynamic 3D spatially asymmetric effects observed throughout the heliosphere. Pre-existing SW background conditions have a strong influence on the propagation of shock waves from solar events. Time-dependence is a crucial aspect of interpreting s/c data. We show comparisons of our simulation results with STEREO A, ACE, Ulysses, and Voyager s/c observations.

  20. Variability of simulants used in recreating stab events.

    PubMed

    Carr, D J; Wainwright, A

    2011-07-15

    Forensic investigators commonly use simulants/backing materials to mount fabrics and/or garments on when recreating damage due to stab events. Such work may be conducted in support of an investigation to connect a particular knife to a stabbing event by comparing the severance morphology obtained in the laboratory to that observed in the incident. There does not appear to have been a comparison of the effect of simulant type on the morphology of severances in fabrics and simulants, nor on the variability of simulants. This work investigates three simulants (pork, gelatine, expanded polystyrene), two knife blades (carving, bread), and how severances in the simulants and an apparel fabric typically used to manufacture T-shirts (single jersey) were affected by (i) simulant type and (ii) blade type. Severances were formed using a laboratory impact apparatus to ensure a consistent impact velocity and hence impact energy independently of the other variables. The impact velocity was chosen so that the force measured was similar to that measured in human performance trials. Force-time and energy-time curves were analysed and severance morphology (y, z directions) investigated. Simulant type and knife type significantly affected the critical forensic measurements of severance length (y direction) in the fabric and 'skin' (Tuftane). The use of EPS resulted in the lowest variability in data; further, the severances recorded in both the fabric and Tuftane more accurately reflected the dimensions of the impacting knives. PMID:21371835

  1. Observing System Simulation Experiments

    NASA Technical Reports Server (NTRS)

    Prive, Nikki

    2015-01-01

    This presentation gives an overview of Observing System Simulation Experiments (OSSEs). The components of an OSSE are described, along with discussion of the process for validating, calibrating, and performing experiments.

  2. The cost of conservative synchronization in parallel discrete event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approach the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.

  3. Fission Reaction Event Yield Algorithm, FREYA - For event-by-event simulation of fission

    NASA Astrophysics Data System (ADS)

    Verbeke, J. M.; Randrup, J.; Vogt, R.

    2015-06-01

    From nuclear materials accountability to detection of special nuclear material, SNM, the need for better modeling of fission has grown over the past decades. Current radiation transport codes compute average quantities with great accuracy and performance, but performance and averaging come at the price of limited interaction-by-interaction modeling. For fission applications, these codes often lack the capability of modeling interactions exactly: energy is not conserved, energies of emitted particles are uncorrelated, prompt fission neutron and photon multiplicities are uncorrelated. Many modern applications require more exclusive quantities than averages, such as the fluctuations in certain observables (e.g. the neutron multiplicity) and correlations between neutrons and photons. The new computational model, FREYA (Fission Reaction Event Yield Algorithm), aims to meet this need by modeling complete fission events. Thus it automatically includes fluctuations as well as correlations resulting from conservation of energy and momentum. FREYA has been integrated into the LLNL Fission Library, and will soon be part of MCNPX2.7.0, MCNP6, TRIPOLI-4.9, and Geant4.10.

  4. A wireless time synchronized event control system

    NASA Astrophysics Data System (ADS)

    Klug, Robert; Williams, Jonathan; Scheffel, Peter

    2014-05-01

    McQ has developed a wireless, time-synchronized, event control system to control, monitor, and record events with precise timing over large test sites for applications such as high speed rocket sled payload testing. Events of interest may include firing rocket motors and launch sleds, initiating flares, ejecting bombs, ejecting seats, triggering high speed cameras, measuring sled velocity, and triggering events based on a velocity window or other criteria. The system consists of Event Controllers, a Launch Controller, and a wireless network. The Event Controllers can be easily deployed at areas of interest within the test site and maintain sub-microsecond timing accuracy for monitoring sensors, electronically triggering other equipment and events, and providing timing signals to other test equipment. Recorded data and status information is reported over the wireless network to a server and user interface. Over the wireless network, the user interface configures the system based on a user specified mission plan and provides real time command, control, and monitoring of the devices and data. An overview of the system, its features, performance, and potential uses is presented.

  5. Advanced Simulation of Coupled Earthquake and Tsunami Events (ASCETE) - Simulation Techniques for Realistic Tsunami Process Studies

    NASA Astrophysics Data System (ADS)

    Behrens, Joern; Bader, Michael; Breuer, Alexander N.; van Dinther, Ylona; Gabriel, Alice-A.; Galvez Barron, Percy E.; Rahnema, Kaveh; Vater, Stefan; Wollherr, Stephanie

    2015-04-01

    At the end of phase 1 of the ASCETE project, a simulation framework for coupled physics-based rupture generation with tsunami propagation and inundation is available. Adaptive mesh tsunami propagation and inundation by discontinuous Galerkin Runge-Kutta methods allow for accurate and conservative inundation schemes. Combined with a tree-based refinement strategy to highly optimize the code for high-performance computing architectures, a modeling tool for high fidelity tsunami simulations has been constructed. Validation results demonstrate the capacity of the software. Rupture simulation is performed by an unstructured tetrahedral discontinuous Galerkin ADER discretization, which allows for accurate representation of complex geometries. The implemented code was selected as a finalist for the Gordon Bell award in high-performance computing. Highly realistic rupture events can be simulated with this modeling tool. The coupling of rupture-induced wave activity and displacement with hydrodynamic equations still poses a major problem due to diverging time and spatial scales. Some insight from the ASCETE set-up could be gained and the presentation will focus on the coupled behavior of the simulation system. Finally, an outlook to phase 2 of the ASCETE project will be given, in which further development of detailed physical processes as well as near-realistic scenario computations are planned. ASCETE is funded by the Volkswagen Foundation.

  6. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.

  7. Non-Lipschitz Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, M.; Meyers, R.

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man- made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.

  8. "Orpheus" cardiopulmonary bypass simulation system.

    PubMed

    Morris, Richard W; Pybus, David A

    2007-12-01

    In this paper we describe a high-fidelity perfusion simulation system intended for use in the training and continuing education of perfusionists. The system comprises a hydraulic simulator, an electronic interface unit and a controlling computer with associated real-time computer models. It is designed for use within an actual operating theatre, or within a specialized simulation facility. The hydraulic simulator can be positioned on an operating table and physically connected to the circuit of the institutional heart-lung machine. The institutional monitoring system is used to display the arterial and central venous pressures, the ECG and the nasopharyngeal temperature using appropriate connections. The simulator is able to reproduce the full spectrum of normal and abnormal events that may present during the course of cardiopulmonary bypass. The system incorporates a sophisticated blood gas model that accurately predicts the behavior of a modern, hollow-fiber oxygenator. Output from this model is displayed in the manner of an in-line blood gas electrode and is updated every 500 msecs. The perfusionist is able to administer a wide variety of drugs during a simulation session including: vasoconstrictors (metaraminol, epinephrine and phenylephrine), a vasodilator (sodium nitroprusside), chronotropes (epinephrine and atropine), an inotrope (epinephrine) and modifiers of coagulation (heparin and protamine). Each drug has a pharmacokinetic profile based on a three-compartment model plus an effect compartment. The simulation system has potential roles in the skill training of perfusionists, the development of crisis management protocols, the certification and accreditation of perfusionists and the evaluation of new perfusion equipment and/or techniques. PMID:18293807

  9. Simulating an Extreme Wind Event in a Topographically Complex Region

    NASA Astrophysics Data System (ADS)

    Lennard, Christopher

    2014-07-01

    Complex topography modifies local weather characteristics such as air temperature, rainfall and airflow within a larger regional extent. The Cape Peninsula around Cape Town, South Africa, is a complex topographical feature responsible for the modification of rainfall and wind fields largely downstream of the Peninsula. During the passage of a cold front on 2 October 2002, an extreme wind event associated with tornado-like damage occurred in the suburb of Manenberg, however synoptic conditions did not indicate convective activity typically associated with a tornado. A numerical regional climate model was operated at very high horizontal resolution (500 m) to investigate the dynamics of the event. The model simulated an interaction between the topography of the peninsula and an airflow direction change associated with the passage of the cold front. A small region of cyclonic circulation was simulated over Manenberg that was embedded in an area of negative vorticity and a leeward gravity wave. The feature lasted 14 min and moved in a north to south direction. Vertically, it was not evident above 220 m. The model assessment describes this event as a shallow but intense cyclonic vortex generated in the lee of the peninsula through an interaction between the peninsula and a change in wind direction as the cold front made landfall. The model did not simulate wind speeds associated with the observed damage suggesting that the horizontal grid resolution ought to be at the scale of the event to more completely understand such microscale airflow phenomena.

  10. SPICE: Simulation Package for Including Flavor in Collider Events

    NASA Astrophysics Data System (ADS)

    Engelhard, Guy; Feng, Jonathan L.; Galon, Iftah; Sanford, David; Yu, Felix

    2010-01-01

    We describe SPICE: Simulation Package for Including Flavor in Collider Events. SPICE takes as input two ingredients: a standard flavor-conserving supersymmetric spectrum and a set of flavor-violating slepton mass parameters, both of which are specified at some high "mediation" scale. SPICE then combines these two ingredients to form a flavor-violating model, determines the resulting low-energy spectrum and branching ratios, and outputs HERWIG and SUSY Les Houches files, which may be used to generate collider events. The flavor-conserving model may be any of the standard supersymmetric models, including minimal supergravity, minimal gauge-mediated supersymmetry breaking, and anomaly-mediated supersymmetry breaking supplemented by a universal scalar mass. The flavor-violating contributions may be specified in a number of ways, from specifying charges of fields under horizontal symmetries to completely specifying all flavor-violating parameters. SPICE is fully documented and publicly available, and is intended to be a user-friendly aid in the study of flavor at the Large Hadron Collider and other future colliders. Program summaryProgram title: SPICE Catalogue identifier: AEFL_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEFL_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 8153 No. of bytes in distributed program, including test data, etc.: 67 291 Distribution format: tar.gz Programming language: C++ Computer: Personal computer Operating system: Tested on Scientific Linux 4.x Classification: 11.1 External routines: SOFTSUSY [1,2] and SUSYHIT [3] Nature of problem: Simulation programs are required to compare theoretical models in particle physics with present and future data at particle colliders. SPICE determines the masses and decay branching ratios of

  11. Top Event Matrix Analysis Code System.

    Energy Science and Technology Software Center (ESTSC)

    2000-06-19

    Version 00 TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates.
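
    For intuition, the rare-event approximation to a top-event probability from minimal cut sets takes only a few lines; the basic events and probabilities below are hypothetical, and this is not TEMAC's input format.

```python
# Basic-event probabilities (hypothetical fault-tree inputs).
basic = {"pump_fails": 1e-3, "valve_stuck": 5e-4, "power_loss": 2e-4}
cut_sets = [("pump_fails", "valve_stuck"), ("power_loss",)]   # OR of ANDs

def top_event_probability(cut_sets, p):
    total = 0.0
    for cs in cut_sets:
        prod = 1.0
        for ev in cs:
            prod *= p[ev]      # AND within a cut set (independence assumed)
        total += prod          # OR across cut sets, rare-event approximation
    return total

print(top_event_probability(cut_sets, basic))   # ~ 2.005e-04
```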

  12. Towards Flexible Exascale Stream Processing System Simulation

    SciTech Connect

    Li, Cheng-Hong; Nair, Ravi; Ohba, Noboyuki; Shvadron, Uzi; Zaks, Ayal; Schenfeld, Eugen

    2012-01-01

    Stream processing is an important emerging computational model for performing complex operations on and across multi-source, high-volume, unpredictable dataflows. We present Flow, a platform for parallel and distributed stream processing system simulation that provides a flexible modeling environment for analyzing stream processing applications. The Flow stream processing system simulator is a high-performance, scalable simulator that automatically parallelizes chunks of the model space and incurs near-zero synchronization overhead for acyclic stream application graphs. We show promising parallel and distributed event rates exceeding 149 million events per second on a cluster with 512 processor cores.

  13. Anomalous event diagnosis for environmental satellite systems

    NASA Technical Reports Server (NTRS)

    Ramsay, Bruce H.

    1993-01-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS) is responsible for the operation of the NOAA geostationary and polar orbiting satellites. NESDIS provides a wide array of operational meteorological and oceanographic products and services and operates various computer and communication systems on a 24-hour, seven days per week schedule. The Anomaly Reporting System contains a database of anomalous events regarding the operations of the Geostationary Operational Environmental Satellite (GOES), communication, or computer systems that have degraded or caused the loss of GOES imagery. Data is currently entered manually via an automated query user interface. There are 21 possible symptoms (e.g., No Data), and 73 possible causes (e.g., Sectorizer - World Weather Building) of an anomalous event. The determination of an event's cause(s) is made by the on-duty computer operator, who enters the event in a paper based daily log, and by the analyst entering the data into the reporting system. The determination of the event's cause(s) impacts both the operational status of these systems, and the performance evaluation of the on-site computer and communication operations contractor.

  14. Knowledge-based simulation for aerospace systems

    NASA Technical Reports Server (NTRS)

    Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.

    1988-01-01

    Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic processing/numeric processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally-symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated, and capable of ensuring that all possible behaviors are considered.

  15. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  16. Device simulation of charge collection and single-event upset

    SciTech Connect

    Dodd, P.E.

    1996-04-01

    In this paper the author reviews the current status of device simulation of ionizing-radiation-induced charge collection and single-event upset (SEU), with an emphasis on significant results of recent years. The author presents an overview of device-modeling techniques applicable to the SEU problem and the unique challenges this task presents to the device modeler. He examines unloaded simulations of radiation-induced charge collection in simple p/n diodes, SEU in dynamic random access memories (DRAMs), and SEU in static random access memories (SRAMs). The author concludes with a few thoughts on future issues likely to confront the SEU device modeler.

  17. Extreme events evaluation over African cities with regional climate simulations

    NASA Astrophysics Data System (ADS)

    Bucchignani, Edoardo; Mercogliano, Paola; Simonis, Ingo; Engelbrecht, Francois

    2013-04-01

    The warming of the climate system in recent decades is evident from observations and is mainly related to the increase of anthropogenic greenhouse gas concentrations (IPCC, 2012). Given the expected climate change conditions on the African continent, as underlined in several publications, and their associated socio-economic impacts, an evaluation of the specific medium- and long-term effects on strategic African cities is of crucial importance for the development of adaptation strategies. Assessments usually focus on average climate properties rather than on variability or extremes, although the latter often have greater societal impacts than the averages. Global Coupled Models (GCM) are generally used to simulate future climate scenarios as they guarantee physical consistency between variables; however, due to their coarse spatial resolution, their output cannot be used for impact studies on local scales, which makes the generation of higher-resolution climate change data necessary. Regional Climate Models (RCM) better describe phenomena forced by orography or coastlines, or related to convection. They can therefore provide more detailed information on climate extremes, which are hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws. The inherent bias of an RCM in representing the local climatology is reduced using statistical techniques based on the comparison of simulated results with long observational time series. In the framework of the EU-FP7 CLUVA (Climate Change and Urban Vulnerability in Africa) project, regional projections of climate change at high resolution (about 8 km) have been performed for selected areas surrounding five African cities. At CMCC, the non-hydrostatic regional climate model COSMO-CLM has been employed. For each domain, two simulations have been performed, considering the RCP4.5 and RCP8.5 emission scenarios.
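
    A minimal sketch of one common bias-reduction technique of the kind described above, empirical quantile mapping (not necessarily the method used in CLUVA; all data here are synthetic), in Python:

        import numpy as np

        def quantile_map(model_hist, obs_hist, model_fut):
            """Map model values so the model's historical distribution matches
            the observed one, then apply that same map to the future run."""
            q = np.linspace(0.0, 1.0, 101)
            m_q = np.quantile(model_hist, q)       # model climatology quantiles
            o_q = np.quantile(obs_hist, q)         # observed climatology quantiles
            ranks = np.interp(model_fut, m_q, q)   # quantile of each future value
            return np.interp(ranks, q, o_q)        # observed value at that quantile

        rng = np.random.default_rng(0)
        obs = rng.normal(25.0, 2.0, 5000)          # "observed" daily Tmax (deg C)
        mod_hist = rng.normal(27.0, 3.0, 5000)     # model is too warm, too variable
        mod_fut = rng.normal(28.5, 3.0, 5000)      # same model, future scenario
        corrected = quantile_map(mod_hist, obs, mod_fut)
        print(f"raw future mean {mod_fut.mean():.2f}, corrected {corrected.mean():.2f}")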

  18. Flash heat simulation events in the north Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Mazon, Jordi; Pino, David

    2013-04-01

    According to the definition of a flash heat event proposed by Mazon et al. at the European Meteorology Meeting (2011 and 2012), based on the case that occurred in the northeast of the Iberian Peninsula on 27th August 2012, several other flash heat events have been detected by automatic weather stations around the Mediterranean basin (southern Italy, the island of Crete, southern Greece and the northeast of the Iberian Peninsula). A flash heat event covers those events in which a large temperature increase lasts for a spatial and temporal scale between a heat wave (defined by the WMO as a phenomenon in which the daily maximum temperature of more than five consecutive days exceeds the average maximum temperature by 5°C with respect to the 1961-1990 period) and a heat burst (defined by the AMS as a rare atmospheric event characterized by gusty winds and a rapid increase in temperature and decrease in humidity that can last some minutes). Thus a flash heat event may be considered a rapid modification of the temperature lasting several hours, less than 48 hours and usually less than 24 hours. Two different flash heat events in the Mediterranean basin have been simulated with the WRF mesoscale model. The results show that two different mechanisms are the main causes of these events. The first occurred on 23rd March 2008 on the island of Crete due to a strong Foehn effect caused by strong south and southeast winds, during which the maximum temperature rose for some hours during the night to 32°C. The second occurred on 1st August 2012 in the northeast of the Iberian Peninsula, caused by the rapid displacement of a warm ridge from North Africa that lasted around 24 hours.

  19. Analyses Of Transient Events In Complex Valve and Feed Systems

    NASA Technical Reports Server (NTRS)

    Ahuja, Vineet; Hosangadi, Ashvin; Cavallo, Peter; Daines, Russell

    2005-01-01

    Valve systems in rocket propulsion systems and testing facilities are constantly subject to dynamic events resulting from the timing of valve motion leading to unsteady fluctuations in pressure and mass flow. Such events can also be accompanied by cavitation, resonance, and system vibration, potentially leading to catastrophic failure. High-fidelity dynamic computational simulations of valve operation can yield important information about valve response to varying flow conditions. Prediction of transient behavior related to valve motion can serve as a guideline for valve scheduling, which is of crucial importance in engine operation and testing. In this paper, we present simulations of the diverse unsteady phenomena related to valve and feed systems, including valve stall and valve timing studies as well as cavitation instabilities in components utilized in the test loop.

  20. Attribution of extreme weather and climate events overestimated by unreliable climate simulations

    NASA Astrophysics Data System (ADS)

    Bellprat, Omar; Doblas-Reyes, Francisco

    2016-03-01

    Event attribution aims to estimate the role of an external driver after the occurrence of an extreme weather and climate event by comparing the probability that the event occurs in the factual world with its probability in a counterfactual world without that driver. These probabilities are typically computed using ensembles of climate simulations whose simulated probabilities are known to be imperfect. The implications of using imperfect models in this context are largely unknown, since the number of observed extreme events in the past is too small to conduct a robust evaluation. Using an idealized framework, this model limitation is studied by generating a large number of simulations with variable reliability in the simulated probability. The framework illustrates that unreliable climate simulations are prone to overestimate the risk attributable to climate change. Climate model ensembles tend to be overconfident in their representation of climate variability, which leads to a systematic increase in the risk attributed to an extreme event. Our results suggest that event attribution approaches built on a single climate model would benefit from ensemble calibration to account for model inadequacies, much as operational forecasting systems are calibrated.
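
    The attribution quantity at stake is usually the fraction of attributable risk, FAR = 1 - p0/p1, where p0 and p1 are the event probabilities in the counterfactual and factual ensembles. A minimal sketch with synthetic Gaussian ensembles (illustrative only, not the authors' framework), showing how an overconfident ensemble with too little spread inflates the estimate:

        import numpy as np

        rng = np.random.default_rng(1)
        threshold = 30.0                           # event definition (e.g. seasonal Tmax)

        def far(nat, fact):
            """Fraction of attributable risk from two ensembles."""
            p0 = (nat > threshold).mean()          # counterfactual event probability
            p1 = (fact > threshold).mean()         # factual event probability
            return 1.0 - p0 / p1

        # A "reliable" ensemble pair and an overconfident one (spread too small).
        reliable = far(rng.normal(27.0, 1.5, 4000), rng.normal(28.0, 1.5, 4000))
        overconf = far(rng.normal(27.0, 0.8, 4000), rng.normal(28.0, 0.8, 4000))
        print(f"FAR reliable: {reliable:.2f}   FAR overconfident: {overconf:.2f}")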

  1. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  2. Interoperability Standards for Medical Simulation Systems

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Diallo, Saikou Y.; Padilla, Jose J.

    2012-01-01

    The Modeling and Simulation Community successfully developed and applied interoperability standards like the Distributed Interactive Simulation (DIS) protocol (IEEE 1278) and the High Level Architecture (HLA) (IEEE 1516). These standards were applied for world-wide distributed simulation events for several years. However, this paper shows that some of the assumptions and constraints underlying the philosophy of these current standards are not valid for Medical Simulation Systems. This paper describes the standards, the philosophy and the limits for medical applications and recommends necessary extensions of the standards to support medical simulation.

  3. Simulation Of Combat With An Expert System

    NASA Technical Reports Server (NTRS)

    Provenzano, J. P.

    1989-01-01

    Proposed expert system predicts outcomes of combat situations. Called "COBRA" (combat outcome based on rules for attrition), system selects rules for mathematical modeling of losses and discrete events in combat according to previous experiences. Used with another software module known as the "Game". Game/COBRA software system, consisting of Game and COBRA modules, provides for both quantitative aspects and qualitative aspects in simulations of battles. Although COBRA intended for simulation of large-scale military exercises, concepts embodied in it have much broader applicability. In industrial research, knowledge-based system enables qualitative as well as quantitative simulations.

  4. Earthquake Simulations and Historical Patterns of Events: Forecasting the Next Great Earthquake in California

    NASA Astrophysics Data System (ADS)

    Sachs, M. K.; Rundle, J. B.; Heien, E. M.; Turcotte, D. L.; Yikilmaz, M.; Kellogg, L. H.

    2013-12-01

    The fault system in California combined with some of the United States' most densely populated regions is a recipe for devastation. It has been estimated that a repeat of the 1906 m=7.8 San Francisco earthquake could cause as much as $84 billion in damage. Earthquake forecasting can help alleviate the effects of these events by targeting disaster relief and preparedness in regions that will need it the most. However, accurate earthquake forecasting has proven difficult. We present a forecasting technique that uses simulated earthquake catalogs generated by Virtual California and patterns of historical events. As background, we also describe internal details of the Virtual California earthquake simulator.

  5. Discrete-Event Simulation Models of Plasmodium falciparum Malaria

    PubMed Central

    McKenzie, F. Ellis; Wong, Roger C.; Bossert, William H.

    2008-01-01

    We develop discrete-event simulation models using a single “timeline” variable to represent the Plasmodium falciparum lifecycle in individual hosts and vectors within interacting host and vector populations. Where they are comparable, our conclusions regarding the relative importance of vector mortality and the durations of host immunity and parasite development are congruent with those of classic differential-equation models of malaria epidemiology. However, our results also imply that in regions with intense perennial transmission, the influence of mosquito mortality on malaria prevalence in humans may be rivaled by that of the duration of host infectivity. PMID:18668185

  6. Parallel system simulation

    SciTech Connect

    Tai, H.M.; Saeks, R.

    1984-03-01

    A relaxation algorithm for solving large-scale system simulation problems in parallel is proposed. The algorithm, which is composed of both a time-step parallel algorithm and a component-wise parallel algorithm, is described. The interconnected nature of the system, which is characterized by the component connection model, is fully exploited by this approach. A technique for finding an optimal number of time steps is also described. Finally, this algorithm is illustrated via several examples in which the possible trade-offs between the speed-up ratio, efficiency, and waiting time are analyzed.
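
    A minimal sketch of the component-wise relaxation idea on two coupled first-order components, iterating Jacobi-style on whole waveforms so that each component could be integrated on its own processor (an illustration of waveform relaxation in general, not the authors' algorithm; the system and parameters are hypothetical):

        import numpy as np

        # Coupled components: x' = -x + 0.5*y, y' = -y + 0.5*x, x(0) = y(0) = 1.
        # Each sweep integrates every component over the whole time window
        # against the *previous* iterate of the other component's trajectory.
        T, n = 5.0, 500
        dt = T / (n - 1)

        def integrate(other):                      # forward Euler given the other waveform
            w = np.empty(n)
            w[0] = 1.0
            for k in range(n - 1):
                w[k + 1] = w[k] + dt * (-w[k] + 0.5 * other[k])
            return w

        x, y = np.zeros(n), np.zeros(n)            # initial guess waveforms
        for sweep in range(1, 30):
            x_new, y_new = integrate(y), integrate(x)   # one Jacobi sweep
            change = max(np.abs(x_new - x).max(), np.abs(y_new - y).max())
            x, y = x_new, y_new
            if change < 1e-12:
                break
        print(f"converged after {sweep} sweeps; x(T) = {x[-1]:.5f}")  # analytic: exp(-2.5)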

  7. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
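
    A minimal discrete event simulation of this kind of demand-versus-resource model, with Poisson service requests contending for a fixed pool of servers (all parameters hypothetical, not the authors' model):

        import heapq, random

        random.seed(42)
        SERVERS = 4                 # resource constraint, e.g. a fixed VM pool
        ARRIVAL_RATE = 3.0          # service requests per unit time
        SERVICE_RATE = 1.0          # completions per server per unit time
        N_REQUESTS = 10_000

        events, t = [], 0.0         # (time, kind, request id) min-heap
        for i in range(N_REQUESTS):                 # pre-schedule Poisson arrivals
            t += random.expovariate(ARRIVAL_RATE)
            heapq.heappush(events, (t, "arrival", i))

        busy, backlog, waits, arrived = 0, [], [], {}
        while events:
            now, kind, rid = heapq.heappop(events)
            if kind == "arrival":
                arrived[rid] = now
                if busy < SERVERS:                  # a server is free: start at once
                    busy += 1
                    waits.append(0.0)
                    heapq.heappush(events, (now + random.expovariate(SERVICE_RATE), "done", rid))
                else:
                    backlog.append(rid)             # all servers busy: queue up
            else:                                   # a request finished; server frees up
                busy -= 1
                if backlog:
                    nxt = backlog.pop(0)
                    waits.append(now - arrived[nxt])
                    busy += 1
                    heapq.heappush(events, (now + random.expovariate(SERVICE_RATE), "done", nxt))
        print(f"mean wait: {sum(waits) / len(waits):.3f} time units")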

  8. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    SciTech Connect

    Wilke, Jeremiah J; Kenny, Joseph P.

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e., to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.

  9. 3D Simulation of External Flooding Events for the RISMC Pathway

    SciTech Connect

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad; Smith, Curtis; Lin, Linyu

    2015-09-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margin Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated simulated-physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis provides a spatial/visual aspect to the design, improves the realism of results, and can provide visual understanding to validate the analysis of flooding.
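
    The core of Smoothed Particle Hydrodynamics is a kernel-weighted summation over neighboring particles; a minimal density-summation sketch with the standard 2D cubic-spline kernel (illustrative only, unrelated to the RISMC toolkit code; all values are synthetic):

        import numpy as np

        def w_cubic(r, h):
            """Standard 2D cubic-spline smoothing kernel (support radius 2h)."""
            q = r / h
            sigma = 10.0 / (7.0 * np.pi * h ** 2)          # 2D normalization
            return sigma * np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                            np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))

        rng = np.random.default_rng(3)
        n, h, mass = 400, 0.08, 1.0
        pos = rng.uniform(0.0, 1.0, size=(n, 2))           # particles in a unit box

        # SPH density estimate at each particle: rho_i = sum_j m_j W(|r_i - r_j|, h)
        dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=2)
        rho = (mass * w_cubic(dist, h)).sum(axis=1)
        print(f"density range: {rho.min():.1f} to {rho.max():.1f} (interior mean ~ {n})")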

  10. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning. The large commercial shopping area is a typical service system, and its emergency evacuation is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed and examined in the context of a case study involving the evacuation of a commercial shopping mall. Pedestrian walking is based on the Cellular Automata and event-driven models. In this paper, the event-driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For the simulation of pedestrian movement routes, the model takes into account the purchase intentions of customers and the density of pedestrians. The combined model of Cellular Automata with a Dynamic Floor Field and event-driven scheduling reflects the behavioral characteristics of customers and clerks under both normal conditions and emergency evacuation. The distribution of individual evacuation times as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that this combined evacuation model can be used to simulate pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
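
    A minimal sketch of the static-floor-field idea underlying such CA evacuation models: the field stores the walking distance to the exit, and each pedestrian steps to the unoccupied neighbor with the lowest field value (a simplification of the paper's dynamic-floor-field, multi-layer model; the geometry here is hypothetical):

        import random
        from collections import deque

        random.seed(7)
        W, H = 12, 8
        walls = {(5, y) for y in range(1, 7)}              # an internal obstacle
        exit_cell = (11, 4)

        # Static floor field: BFS walking distance to the exit around obstacles.
        field = {exit_cell: 0}
        dq = deque([exit_cell])
        while dq:
            x, y = dq.popleft()
            for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if (0 <= nb[0] < W and 0 <= nb[1] < H
                        and nb not in walls and nb not in field):
                    field[nb] = field[(x, y)] + 1
                    dq.append(nb)

        peds = random.sample([c for c in field if field[c] > 0], 15)
        step = 0
        while peds and step < 500:
            step += 1
            occupied = set(peds)
            still_inside = []
            for (x, y) in peds:
                options = [nb for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1))
                           if nb in field and nb not in occupied]
                options.append((x, y))                     # staying put is allowed
                best = min(options, key=lambda c: field[c])
                occupied.discard((x, y))
                if best == exit_cell:
                    continue                               # pedestrian has left the mall
                occupied.add(best)
                still_inside.append(best)
            peds = still_inside
        print(f"evacuated in {step} steps" if not peds else f"{len(peds)} still inside")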

  11. Cardiovascular Events in Systemic Lupus Erythematosus

    PubMed Central

    Fernández-Nebro, Antonio; Rúa-Figueroa, Íñigo; López-Longo, Francisco J.; Galindo-Izquierdo, María; Calvo-Alén, Jaime; Olivé-Marqués, Alejandro; Ordóñez-Cañizares, Carmen; Martín-Martínez, María A.; Blanco, Ricardo; Melero-González, Rafael; Ibáñez-Rúan, Jesús; Bernal-Vidal, José Antonio; Tomero-Muriel, Eva; Uriarte-Isacelaya, Esther; Horcada-Rubio, Loreto; Freire-González, Mercedes; Narváez, Javier; Boteanu, Alina L.; Santos-Soler, Gregorio; Andreu, José L.; Pego-Reigosa, José M.

    2015-01-01

    Abstract This article estimates the frequency of cardiovascular (CV) events that occurred after diagnosis in a large Spanish cohort of patients with systemic lupus erythematosus (SLE) and investigates the main risk factors for atherosclerosis. RELESSER is a nationwide multicenter, hospital-based registry of SLE patients. This is a cross-sectional study. Demographic and clinical variables, the presence of traditional risk factors, and CV events were collected. A CV event was defined as a myocardial infarction, angina, stroke, and/or peripheral artery disease. Multiple logistic regression analysis was performed to investigate the possible risk factors for atherosclerosis. From 2011 to 2012, 3658 SLE patients were enrolled. Of these, 374 (10.9%) patients suffered at least one CV event. In 269 (7.4%) patients, the CV events occurred after SLE diagnosis (86.2% women, median [interquartile range] age 54.9 years [43.2–66.1], and SLE duration of 212.0 months [120.8–289.0]). Strokes (5.7%) were the most frequent CV event, followed by ischemic heart disease (3.8%) and peripheral artery disease (2.2%). Multivariate analysis identified age (odds ratio [95% confidence interval], 1.03 [1.02–1.04]), hypertension (1.71 [1.20–2.44]), smoking (1.48 [1.06–2.07]), diabetes (2.2 [1.32–3.74]), dyslipidemia (2.18 [1.54–3.09]), neurolupus (2.42 [1.56–3.75]), valvulopathy (2.44 [1.34–4.26]), serositis (1.54 [1.09–2.18]), antiphospholipid antibodies (1.57 [1.13–2.17]), low complement (1.81 [1.12–2.93]), and azathioprine (1.47 [1.04–2.07]) as risk factors for CV events. We have confirmed that SLE patients suffer a high prevalence of premature CV disease. Both traditional and nontraditional risk factors contribute to this higher prevalence. Although it needs to be verified with future studies, our study also shows, for the first time, an association between diabetes and CV events in SLE patients. PMID:26200625

  12. The Flexible Rare Event Sampling Harness System (FRESHS)

    NASA Astrophysics Data System (ADS)

    Kratzer, Kai; Berryman, Joshua T.; Taudt, Aaron; Zeman, Johannes; Arnold, Axel

    2014-07-01

    We present the software package FRESHS (http://www.freshs.org) for parallel simulation of rare events using sampling techniques from the ‘splitting’ family of methods. Initially, Forward Flux Sampling (FFS) and Stochastic Process Rare Event Sampling (SPRES) have been implemented. These two methods together make rare event sampling available for both quasi-static and full non-equilibrium regimes. Our framework provides a plugin system for software implementing the underlying physics of the system of interest. At present, example plugins exist for our framework to steer the popular MD packages GROMACS, LAMMPS and ESPResSo, but due to the simple interface of our plugin system, it is also easy to attach other simulation software or self-written code. Use of our framework does not require recompilation of the simulation program. The modular structure allows the flexible implementation of further sampling methods or physics engines and creates a basis for objective comparison of different sampling algorithms. Our code is designed to make optimal use of available compute resources. System states are managed using standard database technology so as to allow checkpointing, scaling and flexible analysis. The communication within the framework uses plain TCP/IP networking and is therefore suited to high-performance parallel hardware as well as to distributed or even heterogeneous networks of inexpensive machines. For FFS we implemented an automatic interface placement that ensures optimal, nearly constant flux through the interfaces. We introduce ‘ghost’ (or ‘look-ahead’) runs that remedy the bottleneck which occurs when progressing to the next interface. FRESHS is open-source, providing a publicly available parallelized rare event sampling system.
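
    A minimal Forward Flux Sampling sketch for a particle in a double well, the kind of method FRESHS harnesses (a self-contained toy, not FRESHS code; the interface positions and parameters are arbitrary):

        import random, math

        random.seed(11)
        dt, beta = 1e-3, 4.0                    # time step, inverse temperature

        def step(x):                            # overdamped dynamics in V(x) = (x^2 - 1)^2
            force = -4.0 * x * (x * x - 1.0)
            return x + force * dt + math.sqrt(2.0 * dt / beta) * random.gauss(0.0, 1.0)

        A = -1.0                                # basin A minimum; order parameter is x itself
        lam = [-0.8, -0.4, 0.0, 0.4, 0.8]       # interfaces between basin A and basin B

        # Stage 0: flux of upward crossings of the first interface out of basin A.
        x, t, configs = A, 0.0, []
        while len(configs) < 200:
            xp = step(x); t += dt
            if x < lam[0] <= xp:
                configs.append(xp)              # store a configuration at each crossing
            x = A if xp >= lam[1] else xp       # crude reset if the walker strays too far
        flux = len(configs) / t

        # Stages 1..n: conditional probabilities P(reach lam[i+1] | started at lam[i]).
        prob = 1.0
        for i in range(len(lam) - 1):
            successes = []
            for _ in range(300):
                x = random.choice(configs)
                while -1.1 < x < lam[i + 1]:    # run until back near A (-1.1) or next interface
                    x = step(x)
                if x >= lam[i + 1]:
                    successes.append(x)
            prob *= len(successes) / 300.0
            if not successes:
                break
            configs = successes
        print(f"transition rate estimate: flux * P = {flux * prob:.3e}")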

  13. LAN attack detection using Discrete Event Systems.

    PubMed

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol and any IP-MAC pairing sent by a host is accepted without verification. This weakness in the ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make the IP-MAC pairing static, modify the existing ARP, or require patching the operating systems of all the hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks which does not require any extra constraint such as static IP-MAC pairings or changes to the ARP. A DES model is built for the LAN under both a normal and a compromised (i.e., spoofed request/response) situation based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, thereby rendering the same DES models for both cases. To create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic in the LAN. Following that, a DES detector is built to determine from observed ARP-related events whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine/spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM), denial of service, etc. The scheme is successfully validated in a test bed. PMID:20804980

  14. Simulating and Forecasting Flooding Events in the City of Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Ghostine, Rabih; Viswanadhapalli, Yesubabu; Hoteit, Ibrahim

    2014-05-01

    Metropolitan cities in the Kingdom of Saudi Arabia, such as Jeddah and Riyadh, are more frequently experiencing flooding events caused by strong convective storms that produce intense precipitation over a short span of time. The flooding in the city of Jeddah in November 2009 was described by civil defense officials as the worst in 27 years. As of January 2010, 150 people were reported killed and more than 350 were missing. Another flooding event, less damaging but comparably spectacular, occurred one year later (January 2011) in Jeddah. Anticipating floods before they occur could minimize human and economic losses through the implementation of appropriate protection, provision and rescue plans. We have developed a coupled hydro-meteorological model for simulating and predicting flooding events in the city of Jeddah. We use the Weather Research and Forecasting (WRF) model, assimilating all available data in the Jeddah region, to simulate the storm events. The resulting rain is then used, at 10-minute intervals, to drive an advanced numerical shallow-water model discretized on an unstructured grid using different numerical schemes based on finite element or finite volume techniques. The model was integrated on a high-resolution grid with cell sizes varying between 0.5 m within the streets of Jeddah and 500 m outside the city. This contribution will present the flooding simulation system and the simulation results, focusing on the effect of the different numerical schemes on system performance in terms of accuracy and computational efficiency.

  15. Constraints on Cumulus Parameterization from Simulations of Observed MJO Events

    NASA Technical Reports Server (NTRS)

    Del Genio, Anthony; Wu, Jingbo; Wolf, Audrey B.; Chen, Yonghua; Yao, Mao-Sung; Kim, Daehyun

    2015-01-01

    Two recent activities offer an opportunity to test general circulation model (GCM) convection and its interaction with large-scale dynamics for observed Madden-Julian oscillation (MJO) events. This study evaluates the sensitivity of the Goddard Institute for Space Studies (GISS) GCM to entrainment, rain evaporation, downdrafts, and cold pools. Single Column Model versions that restrict weakly entraining convection produce the most realistic dependence of convection depth on column water vapor (CWV) during the Atmospheric Radiation Measurement MJO Investigation Experiment at Gan Island. Differences among models are primarily at intermediate CWV where the transition from shallow to deeper convection occurs. GCM 20-day hindcasts during the Year of Tropical Convection that best capture the shallow–deep transition also produce strong MJOs, with significant predictability compared to Tropical Rainfall Measuring Mission data. The dry anomaly east of the disturbance on hindcast day 1 is a good predictor of MJO onset and evolution. Initial CWV there is near the shallow–deep transition point, implicating premature onset of deep convection as a predictor of a poor MJO simulation. Convection weakly moistens the dry region in good MJO simulations in the first week; weakening of large-scale subsidence over this time may also affect MJO onset. Longwave radiation anomalies are weakest in the worst model version, consistent with previous analyses of cloud/moisture greenhouse enhancement as the primary MJO energy source. The authors’ results suggest that both cloud-/moisture-radiative interactions and convection–moisture sensitivity are required to produce a successful MJO simulation.

  16. Simulated cold events in the northern North Atlantic during the last millennium

    NASA Astrophysics Data System (ADS)

    Moreno-Chamarro, Eduardo; Zanchettin, Davide; Lohmann, Katja; Jungclaus, Johann

    2014-05-01

    Paleoceanographic data show large inter-decadal cold excursions in sea-surface temperatures (SSTs) in the western subpolar gyre region and north of Iceland throughout the last millennium. A series of such events could have contributed to the demise of the Norse settlements in Greenland during the 13th to the 15th century due to associated deteriorating environmental conditions in the region. However, the spatial extent, attribution and mechanism(s) of these cold events are not known. In this contribution, we use climate model simulations to clarify the role of the ocean and of coupled ocean-atmosphere dynamics in triggering these cold events, and to assess whether they can be explained by internal climate variability alone. Specifically, we investigate the North Atlantic-Arctic climate variability in a 1000-year control run describing an unperturbed pre-industrial climate, and in a 3-member ensemble of full-forcing transient simulations of the last millennium. Simulations are performed with the Max Planck Institute Earth System Model for paleo-applications. In the control and transient simulations, we identified cold events of similar amplitude and duration to the reconstructed data. Spatial patterns and temporal evolutions of simulated cold events are similar in both simulation types. In the transient runs, furthermore, they do not robustly coincide with periods of strong external forcing (e.g., major volcanic eruptions). We therefore conclude that such events can emerge because of internally-generated regional climate variability alone. Local ocean-atmosphere coupled processes in the North Atlantic subpolar gyre region appear to be a key part of the mechanism of the simulated cold events. In particular, they are typically associated with the onset of prolonged positive sea-level pressure anomalies over the North Atlantic and an associated weaker, south-eastward-displaced subpolar gyre. The salt transport reduction by the Irminger Current together with an intensification of the

  17. Coupling expert systems and simulation

    NASA Technical Reports Server (NTRS)

    Kawamura, K.; Beale, G.; Padalkar, S.; Rodriguez-Moscoso, J.; Hsieh, B. J.; Vinz, F.; Fernandez, K. R.

    1988-01-01

    A prototype coupled system called NESS (NASA Expert Simulation System) is described. NESS assists the user in running digital simulations of dynamic systems, interprets the output data against performance specifications, and recommends a suitable series compensator to be added to the simulation model.

  18. Simulation of a Storm Surge Event at the North Sea (Germany) Using a Fully Coupled Approach

    NASA Astrophysics Data System (ADS)

    Yang, J.; Graf, T.

    2012-04-01

    Tidal fluctuation and storm surge events lead to saltwater intrusion into a coastal aquifer. Tidal fluctuation causes dynamic boundary conditions at the seaside boundary, where submerged zones are of Dirichlet type and aerial zones are of Neumann type. In a storm surge event, saltwater flows over the land surface towards the inland and covers parts of the land surface. Saltwater will eventually infiltrate the unsaturated soil and percolate downwards towards the groundwater table. To simulate this dynamic coastal flow system, a fully integrated approach based on the numerical "HydroGeoSphere" model is being developed, in which the coastal zone is treated as a hydraulically coupled surface-subsurface system. The new approach allows simulation of: (i) surface flow, (ii) variably saturated, density-dependent groundwater flow, (iii) salt transport in the surface and in the subsurface, and (iv) water and salt interaction between surface and subsurface. In the new approach, tide and storm surge events induce a time-variant head that is applied to nodes of the surface domain; the tidal or storm surge forcing is thus applied to the system through the surface domain. The hydraulic interaction between the surface domain and the subsurface domain simplifies the flow and transport boundary conditions caused by tidal fluctuation and storm surge events. This newly proposed approach is the first conceptual model of a fully coupled surface-subsurface coastal flow domain. It allows simulation of tidal activity and storm surges at a heretofore impossible complexity.

  19. Transportation Analysis Simulation System

    SciTech Connect

    2004-08-23

    TRANSIMS version 3.1 is an integrated set of analytical and simulation models and supporting databases. The system is designed to create a virtual metropolitan region with representation of each of the region’s individuals, their activities and the transportation infrastructure they use. TRANSIMS puts into practice a new, disaggregate approach to travel demand modeling using agent-based micro-simulation technology. TRANSIMS methodology creates a virtual metropolitan region with representation of the transportation infrastructure and the population, at the level of households and individual travelers. Trips are planned to satisfy the population’s activity patterns at the individual traveler level. TRANSIMS then simulates the movement of travelers and vehicles across the transportation network using multiple modes, including car, transit, bike and walk, on a second-by-second basis. Metropolitan planners must plan growth of their cities according to the stringent transportation system planning requirements of the Intermodal Surface Transportation Efficiency Act of 1991, the Clean Air Act Amendments of 1990 and other similar laws and regulations. These require each state and its metropolitan regions to work together to develop short- and long-term transportation improvement plans. The plans must (1) estimate the future transportation needs for travelers and goods movements, (2) evaluate ways to manage and reduce congestion, (3) examine the effectiveness of building new roads and transit systems, and (4) limit the environmental impact of the various strategies. The needed consistent and accurate transportation improvement plans require an analytical capability that properly accounts for travel demand, human behavior, traffic and transit operations, major investments, and environmental effects. Other existing planning tools use aggregated information and representative behavior to predict average response and average use of transportation facilities. They do not account

  1. Simulation of rare events in quantum error correction

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey; Vargo, Alexander

    2013-12-01

    We consider the problem of calculating the logical error probability for a stabilizer quantum code subject to random Pauli errors. To access the regime of large code distances where logical errors are extremely unlikely we adopt the splitting method widely used in Monte Carlo simulations of rare events and Bennett's acceptance ratio method for estimating the free energy difference between two canonical ensembles. To illustrate the power of these methods in the context of error correction, we calculate the logical error probability PL for the two-dimensional surface code on a square lattice with a pair of holes for all code distances d≤20 and all error rates p below the fault-tolerance threshold. Our numerical results confirm the expected exponential decay PL ~ exp[-α(p)d] and provide a simple fitting formula for the decay rate α(p). Both noiseless and noisy syndrome readout circuits are considered.
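
    A generic fixed-effort splitting sketch, illustrating how such methods estimate a very small probability as a product of moderate conditional probabilities across nested levels (a toy random-walk stand-in, not the paper's stabilizer-code machinery):

        import random

        random.seed(5)
        levels = [2.0, 4.0, 6.0, 8.0]     # nested levels up to the rare threshold b = 8
        TRIALS = 2000

        def advance(x, level):
            """Run a mean-reverting walk until it drops below 0 (failure)
            or first reaches `level` (success); return the hit point or None."""
            while True:
                x += random.gauss(-0.1 * x, 1.0)
                if x >= level:
                    return x
                if x < 0.0:
                    return None

        prob, starts = 1.0, [0.0]
        for level in levels:
            hits = [y for _ in range(TRIALS)
                    if (y := advance(random.choice(starts), level)) is not None]
            if not hits:
                prob = 0.0
                break
            prob *= len(hits) / TRIALS    # conditional probability of this level
            starts = hits                 # restart points for the next level
        print(f"splitting estimate of P(reach 8 before dropping below 0): {prob:.2e}")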

  2. WCEDS: A waveform correlation event detection system

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.; Astiz, L.; Shearer, P.M.

    1995-08-01

    We have developed a working prototype of a grid-based global event detection system based on waveform correlation. The algorithm comes from a long-period detector but we have recast it in a full matrix formulation which can reduce the number of multiplications needed by better than two orders of magnitude for realistic monitoring scenarios. The reduction is made possible by eliminating redundant multiplications in the original formulation. All unique correlations for a given origin time are stored in a correlation matrix (C) which is formed by a full matrix product of a Master Image matrix (M) and a data matrix (D). The detector value at each grid point is calculated by following a different summation path through the correlation matrix. Master Images can be derived either empirically or synthetically. Our testing has used synthetic Master Images because their influence on the detector is easier to understand. We tested the system using the matrix formulation with continuous data from the IRIS (Incorporated Research Institutions for Seismology) broadband global network to monitor a 2-degree evenly spaced surface grid with a time discretization of 1 sps; we successfully detected the largest event in a two-hour segment from October 1993. The output at the correct gridpoint was at least 33% larger than at adjacent grid points, and the output at the correct gridpoint at the correct origin time was more than 500% larger than the output at the same gridpoint immediately before or after. Analysis of the C matrix for the origin time of the event demonstrates that there are many significant "false" correlations of observed phases with incorrect predicted phases. These false correlations dull the sensitivity of the detector and so must be dealt with if our system is to attain detection thresholds consistent with a Comprehensive Test Ban Treaty (CTBT).
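
    The heart of the formulation is that all unique template-versus-data correlations can be produced by one matrix product C = M·D. A minimal sketch with synthetic templates (shapes and values hypothetical; the real system then sums C along phase-dependent paths per grid point):

        import numpy as np

        rng = np.random.default_rng(9)
        n_samp = 200
        # Hypothetical Master Image matrix M: one normalized waveform template
        # per grid point (3 grid points here).
        M = rng.standard_normal((3, n_samp))
        M /= np.linalg.norm(M, axis=1, keepdims=True)

        # Data matrix D: observation windows for 50 candidate origin times;
        # grid point 1's template is buried in the window at index 20.
        D = rng.standard_normal((n_samp, 50))
        D[:, 20] += 5.0 * M[1]

        C = M @ D                    # every template-window correlation in one product
        gp, t0 = np.unravel_index(np.argmax(C), C.shape)
        print(f"detection: grid point {gp} at origin-time index {t0}")   # -> 1, 20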

  3. Rare-event simulation methods for equilibrium and non-equilibrium events

    NASA Astrophysics Data System (ADS)

    Ziff, Robert

    2014-03-01

    Rare events are those that occur with a very low probability in experiment, or are common but difficult to sample using standard computer simulation techniques. Such processes require advanced methods in order to obtain useful results in reasonable amounts of computer time. We discuss some of those techniques here, including the "barrier" method, splitting methods, and a Forward-Flux Sampling in Time (FFST) algorithm, and apply them to measure the nucleation times of the first-order transition in the Ziff-Gulari-Barshad model of surface catalysis, including nucleation in finite equilibrium states, which are measured to occur with probabilities as low as 10^(-40). We also study the transitions in the Maier-Stein model of chemical kinetics, and use the methods to find the harmonic measure in percolation and Diffusion-Limited Aggregation (DLA) clusters. Co-authors: David Adams, Google, and Leonard Sander, University of Michigan.

  4. Simulating neural systems with Xyce.

    SciTech Connect

    Schiek, Richard Louis; Thornquist, Heidi K.; Mei, Ting; Warrender, Christina E.; Aimone, James Bradley; Teeter, Corinne; Duda, Alex M.

    2012-12-01

    Sandia's parallel circuit simulator, Xyce, can address large-scale neuron simulations in a new way, extending the range within which one can perform high-fidelity, multi-compartment neuron simulations. This report documents the implementation of neuron devices in Xyce and their use in the simulation and analysis of neuron systems.

  5. Production of Nitrogen Oxides by Laboratory Simulated Transient Luminous Events

    NASA Astrophysics Data System (ADS)

    Peterson, H.; Bailey, M.; Hallett, J.; Beasley, W.

    2007-12-01

    Restoration of the polar stratospheric ozone layer has occurred at rates below those originally expected following reductions in chlorofluorocarbon (CFC) usage. Additional reactions affecting ozone depletion now must also be considered. This research examines nitrogen oxides (NOx) produced in the middle atmosphere by transient luminous events (TLEs), with NOx production in this layer contributing to the loss of stratospheric ozone. In particular, NOx produced by sprites in the mesosphere would be transported to the polar stratosphere via the global meridional circulation and downward diffusion. A pressure-controlled vacuum chamber was used to simulate middle atmosphere pressures, while a power supply and in-chamber electrodes were used to simulate TLEs in the pressure-controlled environment. Chemiluminescence NOx analyzers were used to sample NOx produced by the chamber discharges: originally a Monitor Labs Model 8440E, later a Thermo Environment Model 42. Total NOx production for each discharge, as well as NOx per ampere of current and NOx per Joule of discharge energy, were plotted. Absolute NOx production was greatest for discharge environments with upper tropospheric pressures (100-380 torr), while NOx/J was greatest for discharge environments with stratospheric pressures (around 10 torr). The different production efficiencies in NOx/J as a function of pressure pointed to three different production regimes, each with its own reaction mechanisms: one for tropospheric pressures, one for stratospheric pressures, and one for upper stratospheric to mesospheric pressures (no greater than 1 torr).

  6. Cellular Dynamic Simulator: An Event Driven Molecular Simulation Environment for Cellular Physiology

    PubMed Central

    Byrne, Michael J.; Waxham, M. Neal; Kubota, Yoshihisa

    2010-01-01

    In this paper, we present the Cellular Dynamic Simulator (CDS) for simulating diffusion and chemical reactions within crowded molecular environments. CDS is based on a novel event-driven algorithm specifically designed for precise calculation of the timing of collisions, reactions and other events for each individual molecule in the environment. Generic mesh-based compartments allow the creation/importation of very simple or detailed cellular structures that exist in a 3D environment. Multiple levels of compartments and static obstacles can be used to create a dense environment to mimic cellular boundaries and the intracellular space. The CDS algorithm takes into account volume exclusion and molecular crowding that may impact signaling cascades in small sub-cellular compartments such as dendritic spines. With the CDS, we can simulate simple enzyme reactions, aggregation and channel transport, as well as highly complicated chemical reaction networks of both freely diffusing and membrane-bound multi-protein complexes. Components of the CDS are generally defined such that the simulator can be applied to a wide range of environments in terms of scale and level of detail. Through an initialization GUI, a simple simulation environment can be created and populated within minutes, yet is powerful enough to design complex 3D cellular architecture. The initialization tool allows visual confirmation of the environment construction prior to execution by the simulator. This paper describes the CDS algorithm and design implementation, provides an overview of the types of features available, and highlights the utility of those features in demonstrations. PMID:20361275
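
    Event-driven molecular simulation rests on computing exact event times analytically rather than stepping a clock; a minimal sketch of the classic pairwise collision-time calculation for two moving spheres (illustrative of the general technique, not CDS source code):

        import math

        def collision_time(r1, v1, r2, v2, radius_sum):
            """Earliest t >= 0 at which |(r1 + v1 t) - (r2 + v2 t)| = radius_sum,
            or None if the spheres never touch."""
            dr = [a - b for a, b in zip(r1, r2)]
            dv = [a - b for a, b in zip(v1, v2)]
            a = sum(c * c for c in dv)
            b = 2.0 * sum(x * v for x, v in zip(dr, dv))
            c = sum(x * x for x in dr) - radius_sum ** 2
            if b >= 0.0 or a == 0.0:        # moving apart (or in parallel): no hit
                return None
            disc = b * b - 4.0 * a * c
            if disc < 0.0:                  # closest approach still misses
                return None
            return (-b - math.sqrt(disc)) / (2.0 * a)

        # Two spheres of radius 0.5 approaching head-on along x:
        t = collision_time((0, 0, 0), (1, 0, 0), (4, 0, 0), (-1, 0, 0), 1.0)
        print(t)   # 1.5: a 3-unit gap closed at relative speed 2, minus contact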

  7. Numerical simulation of the October 2002 dust event in Australia

    NASA Astrophysics Data System (ADS)

    Shao, Yaping; Leys, John F.; McTainsh, Grant H.; Tews, Kenn

    2007-04-01

    In comparison to the major dust sources in the Northern Hemisphere, Australia is a relatively minor contributor to the global dust budget. However, severe dust storms do occur in Australia, especially in drought years. In this study, we use an integrated dust model to simulate the 22-23 October 2002 dust storm, probably the most severe dust storm in Australia in at least the past 40 years. The model results are compared with synoptic visibility data and satellite images and, for several stations, with high-volume sampler measurements. The model simulations are then used to estimate dust load, emission, and deposition, both over the continent and over the ocean. The main dust sources and sinks are identified. Dust sources include the desert areas in northern South Australia, the grazing lands in western New South Wales (NSW), and the farm lands in NSW, Victoria, and Western Australia, as well as areas in Queensland and the Northern Territory. The desert areas appear to be the strongest source. The maximum dust emission is around 2000 μg m-2 s-1, and the maximum net dust emission is around 500 μg m-2 s-1. The total amount of dust eroded from the Australian continent during this dust event is around 95.8 Mt, of which 93.67 Mt is deposited on the continent and 2.13 Mt in the ocean. The maximum total dust load over the simulation domain is around 5 Mt. The magnitude of this Australian dust storm corresponds to a northeast Asian dust storm of moderate size.

  8. Features, Events, and Processes: System Level

    SciTech Connect

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  9. Aging and brain rejuvenation as systemic events

    PubMed Central

    Bouchard, Jill; Villeda, Saul A

    2015-01-01

    The effects of aging were traditionally thought to be immutable, particularly evident in the loss of plasticity and cognitive abilities occurring in the aged central nervous system (CNS). However, it is becoming increasingly apparent that extrinsic systemic manipulations such as exercise, caloric restriction, and changing blood composition by heterochronic parabiosis or young plasma administration can partially counteract this age-related loss of plasticity in the aged brain. In this review, we discuss the process of aging and rejuvenation as systemic events. We summarize genetic studies that demonstrate a surprising level of malleability in organismal lifespan, and highlight the potential for systemic manipulations to functionally reverse the effects of aging in the CNS. Based on mounting evidence, we propose that rejuvenating effects of systemic manipulations are mediated, in part, by blood-borne ‘pro-youthful’ factors. Thus, systemic manipulations promoting a younger blood composition provide effective strategies to rejuvenate the aged brain. As a consequence, we can now consider reactivating latent plasticity dormant in the aged CNS as a means to rejuvenate regenerative, synaptic, and cognitive functions late in life, with potential implications even for extending lifespan. PMID:25327899

  10. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    SciTech Connect

    Jaros, W.

    2005-08-30

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS, focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, "Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation" (BSC 2005 [DIRS 174995]), "Clad Degradation--FEPs Screening Arguments" (BSC 2004 [DIRS 170019]), and "Waste-Form Features, Events, and Processes" (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes have resulted due to a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project (YMP) administrative procedures as they

  11. Bioaccumulation and Aquatic System Simulator

    EPA Science Inventory

    BASS (Bioaccumulation and Aquatic System Simulator) is a Fortran 95 simulation program that predicts the population and bioaccumulation dynamics of age-structured fish assemblages that are exposed to hydrophobic organic pollutants and class B and bord...

  12. Hydrogen Event Containment Response Code System.

    Energy Science and Technology Software Center (ESTSC)

    1999-11-23

    Version: 00 Distribution is restricted to the United States Only. HECTR1.5 (Hydrogen Event-Containment Transient Response) is a lumped-volume containment analysis program that is most useful for performing parametric studies. Its main purpose is to analyze nuclear reactor accidents involving the transport and combustion of hydrogen, but HECTR can also function as an experiment analysis tool and can solve a limited set of other containment problems. Six gases (steam, nitrogen, oxygen, hydrogen, carbon monoxide, and carbon dioxide) are modeled, along with sumps containing liquid water. HECTR can model virtually all the containment systems of importance in ice condenser, large dry and Mark III containments. A postprocessor, ACHILES1.5, is included. It processes the time-dependent variable output (compartment pressures, flow junction velocities, surface temperatures, etc.) produced by HECTR. ACHILES can produce tables and graphs of these data.

  13. Theorising interventions as events in systems.

    PubMed

    Hawe, Penelope; Shiell, Alan; Riley, Therese

    2009-06-01

    Conventional thinking about preventive interventions focuses oversimplistically on the "package" of activities and/or their educational messages. An alternative is to focus on the dynamic properties of the context into which the intervention is introduced. Schools, communities and worksites can be thought of as complex ecological systems. They can be theorised on three dimensions: (1) their constituent activity settings (e.g., clubs, festivals, assemblies, classrooms); (2) the social networks that connect the people and the settings; and (3) time. An intervention may then be seen as a critical event in the history of a system, leading to the evolution of new structures of interaction and new shared meanings. Interventions impact on evolving networks of person-time-place interaction, changing relationships, displacing existing activities and redistributing and transforming resources. This alternative view has significant implications for how interventions should be evaluated and how they could be made more effective. We explore this idea, drawing on social network analysis and complex systems theory. PMID:19390961

  14. Discrete Event Supervisory Control Applied to Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Shah, Neerav

    2005-01-01

    The theory of discrete event supervisory (DES) control was applied to the optimal control of a twin-engine aircraft propulsion system and demonstrated in a simulation. The supervisory control, which is implemented as a finite-state automaton, oversees the behavior of a system and manages it in such a way that it maximizes a performance criterion, similar to a traditional optimal control problem. DES controllers can be nested such that a high-level controller supervises multiple lower level controllers. This structure can be expanded to control huge, complex systems, providing optimal performance and increasing autonomy with each additional level. The DES control strategy for propulsion systems was validated using a distributed testbed consisting of multiple computers--each representing a module of the overall propulsion system--to simulate real-time hardware-in-the-loop testing. In the first experiment, DES control was applied to the operation of a nonlinear simulation of a turbofan engine (running in closed loop using its own feedback controller) to minimize engine structural damage caused by a combination of thermal and structural loads. This enables increased on-wing time for the engine through better management of the engine-component life usage. Thus, the engine-level DES acts as a life-extending controller through its interaction with and manipulation of the engine's operation.
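
    A minimal sketch of the supervisory idea: a finite-state automaton tracks the plant state and disables controllable events that would violate a specification, while uncontrollable events (alarms) must always be accepted (the states and events here are hypothetical, not the testbed's):

        # Transition structure of a toy two-mode engine plant.
        TRANSITIONS = {
            ("normal", "raise_power"): "high_power",
            ("normal", "damage_alarm"): "degraded",
            ("high_power", "lower_power"): "normal",
            ("high_power", "damage_alarm"): "degraded",
            ("degraded", "repair_done"): "normal",
        }
        CONTROLLABLE = {"raise_power", "lower_power"}   # alarms are uncontrollable

        def enabled(state):
            """Events the supervisor allows from this state: all defined
            transitions, minus controllable events barred by the spec
            (no power commands while the damage flag is active)."""
            ok = []
            for (s, e), _nxt in TRANSITIONS.items():
                if s != state:
                    continue
                if e in CONTROLLABLE and state == "degraded":
                    continue
                ok.append(e)
            return ok

        state = "normal"
        for event in ["raise_power", "damage_alarm", "repair_done", "raise_power"]:
            # Uncontrollable events always pass; controllable ones need permission.
            if event in enabled(state) or event not in CONTROLLABLE:
                state = TRANSITIONS.get((state, event), state)
        print(state)   # high_power: the supervisor re-enables power after repair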

  15. Sensitivity of a Simulated Derecho Event to Model Initial Conditions

    NASA Astrophysics Data System (ADS)

    Wang, Wei

    2014-05-01

    Since 2003, the MMM division at NCAR has been experimenting with cloud-permitting-scale weather forecasting using the Weather Research and Forecasting (WRF) model. Over the years, we've tested different model physics, and tried different initial and boundary conditions. Not surprisingly, we found that the model's forecasts are more sensitive to the initial conditions than to model physics. In the 2012 real-time experiment, WRF-DART (Data Assimilation Research Testbed) at 15 km was employed to produce initial conditions for twice-a-day forecasts at 3 km. On June 29, this forecast system captured one of the most destructive derecho events on record. In this presentation, we will examine forecast sensitivity to different model initial conditions, and try to understand the important features that may contribute to the success of the forecast.

  16. Small-World Synchronized Computing Networks for Scalable Parallel Discrete-Event Simulations

    NASA Astrophysics Data System (ADS)

    Guclu, Hasan; Korniss, Gyorgy; Toroczkai, Zoltan; Novotny, Mark A.

    We study the scalability of parallel discrete-event simulations for arbitrary short-range interacting systems with asynchronous dynamics. When the synchronization topology mimics that of the short-range interacting underlying system, the virtual time horizon (corresponding to the progress of the processing elements) exhibits Kardar-Parisi-Zhang-like kinetic roughening. Although the virtual times, on average, progress at a nonzero rate, their statistical spread diverges with the number of processing elements, hindering efficient data collection. We show that when the synchronization topology is extended to include quenched random communication links between the processing elements, they make a close-to-uniform progress with a nonzero rate, without global synchronization. We discuss in detail a coarse-grained description for the small-world synchronized virtual time horizon and compare the findings to those obtained by simulating the simulations based on the exact algorithmic rules.
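
    The conservative update rule behind this picture fits in a few lines of Python. In the sketch below (all parameters illustrative), each processing element on a ring advances its local virtual time by a random increment only when it does not lead its neighbors; the quenched random link per element is the small-world extension discussed above. In the paper the shortcut neighbor is consulted only at random synchronization times, a refinement omitted here.

        # Toy model of the virtual time horizon in conservative PDES.
        import random

        N, STEPS = 1000, 200_000
        tau = [0.0] * N                        # local virtual times
        shortcut = list(range(N))
        random.shuffle(shortcut)               # quenched random links

        for _ in range(STEPS):
            i = random.randrange(N)
            others = (tau[(i - 1) % N], tau[(i + 1) % N], tau[shortcut[i]])
            if tau[i] <= min(others):          # conservative rule: never lead
                tau[i] += random.expovariate(1.0)

        mean = sum(tau) / N
        width = (sum((t - mean) ** 2 for t in tau) / N) ** 0.5
        print("mean progress:", mean, "horizon width:", width)

    Dropping the tau[shortcut[i]] term from the update recovers the short-range-only horizon whose statistical spread grows with the number of processing elements, as described in the abstract.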

  17. Numerical simulation diagnostics of a flash flood event in Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Samman, Ahmad

    On 26 January 2011, a severe storm hit the city of Jeddah, the second largest city in the Kingdom of Saudi Arabia. The storm resulted in heavy rainfall, which produced a flash flood in a short period of time. This event caused at least eleven fatalities and more than 114 injuries. Unfortunately, the observed rainfall data are limited to the weather station at King Abdul Aziz International airport, which is north of the city, while the most extreme precipitation occurred over the southern part of the city. This observation was useful for comparison with the simulation results even though it does not reflect the severity of the event. The Regional Atmospheric Modeling System (RAMS) developed at Colorado State University was used to study this storm event. RAMS simulations indicated that a quasi-stationary mesoscale convective system developed over the city of Jeddah and lasted for several hours; it was the source of the heavy rainfall. The model computed a total rainfall of more than 110 mm in the southern part of the city, where the flash flood occurred. This precipitation estimate was confirmed by weather radar observations. While the annual rainfall in Jeddah during the winter varies from 50 to 100 mm, the amount of rainfall resulting from this storm event exceeded the climatological total annual rainfall. The simulation of this event showed that warm sea surface temperature, combined with high humidity in the lower atmosphere and a large amount of convective available potential energy (CAPE), provided a favorable environment for convection. It also showed the presence of a cyclonic system over the north and eastern parts of the Mediterranean Sea, and a subtropical anti-cyclone over Northeastern Africa that contributed to cold air advection bringing cold air to the Jeddah area. In addition, an anti-cyclone (blocking) centered over east and southeastern parts of the Arabian Peninsula and the Arabian Sea produced a low level jet over the southern

  18. Application of Parallel Discrete Event Simulation to the Space Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jefferson, D.; Leek, J.

    2010-09-01

    In this paper we describe how and why we chose parallel discrete event simulation (PDES) as the paradigm for modeling the Space Surveillance Network (SSN) in our modeling framework, TESSA (Testbed Environment for Space Situational Awareness). DES is a simulation paradigm appropriate for systems dominated by discontinuous state changes at times that must be calculated dynamically. It is used primarily for complex man-made systems like telecommunications, vehicular traffic, computer networks, economic models, etc., although it is also useful for natural systems that are not described by equations, such as particle systems, population dynamics, epidemics, and combat models. It is much less well known than simple time-stepped simulation methods, but has the great advantage of being time scale independent, so that one can freely mix processes that operate at time scales over many orders of magnitude with no runtime performance penalty. In simulating the SSN we model in some detail: (a) the orbital dynamics of up to 10^5 objects, (b) their reflective properties, (c) the ground- and space-based sensor systems in the SSN, (d) the recognition of orbiting objects and determination of their orbits, (e) the cueing and scheduling of sensor observations, (f) the 3-d structure of satellites, and (g) the generation of collision debris. TESSA is thus a mixed continuous-discrete model. But because many different types of discrete objects are involved with such a wide variation in time scale (milliseconds for collisions, hours for orbital periods) it is suitably described using discrete events. The PDES paradigm is surprising and unusual. In any instantaneous runtime snapshot some parts may be far ahead in simulation time while others lag behind, yet the required causal relationships are always maintained and synchronized correctly, exactly as if the simulation were executed sequentially. The TESSA simulator is custom-built, conservatively synchronized, and designed to scale to

  19. Block Oriented Simulation System (BOSS)

    NASA Technical Reports Server (NTRS)

    Ratcliffe, Jaimie

    1988-01-01

    Computer simulation is assuming greater importance as a flexible and expedient approach to modeling system and subsystem behavior. Simulation has played a key role in the growth of complex, multiple access space communications such as those used by the space shuttle and the TRW-built Tracking and Data Relay Satellites (TDRS). A powerful new simulator for use in designing and modeling the communication system of NASA's planned Space Station is being developed. Progress to date on the Block (Diagram) Oriented Simulation System (BOSS) is described.

  20. Examining Passenger Flow Choke Points at Airports Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Brown, Jeremy R.; Madhavan, Poornima

    2011-01-01

    The movement of passengers through an airport quickly, safely, and efficiently is the main function of the various checkpoints (check-in, security, etc.) found in airports. Human error combined with other breakdowns in the complex system of the airport can disrupt passenger flow through the airport, leading to lengthy waiting times, missing luggage and missed flights. In this paper we present a model of passenger flow through an airport using discrete event simulation that will provide a closer look into the possible reasons for breakdowns and their implications for passenger flow. The simulation is based on data collected at Norfolk International Airport (ORF). The primary goal of this simulation is to present ways to optimize the work force to keep passenger flow smooth even during peak travel times and for emergency preparedness at ORF in case of adverse events. In this simulation we ran three different scenarios: real world, increased check-in stations, and multiple waiting lines. Increased check-in stations increased waiting time and instantaneous utilization, while the multiple waiting lines decreased both the waiting time and instantaneous utilization. This simulation was able to show how different changes affected the passenger flow through the airport.
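
    As a flavor of the underlying mechanics, here is a self-contained Python sketch of one checkpoint modeled as an event-driven queue. The arrival and service rates and the station count are invented placeholders, not the ORF data used in the paper.

        # Toy discrete event simulation of one airport checkpoint.
        # Events are (time, kind) pairs kept in a heap; rates are invented.
        import heapq, random

        def simulate(stations=3, arrival_rate=1.0, service_rate=0.4, horizon=480.0):
            events = [(random.expovariate(arrival_rate), "arrival")]
            queue, busy, waits = [], 0, []
            while events:
                t, kind = heapq.heappop(events)
                if t > horizon:
                    break
                if kind == "arrival":
                    heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
                    if busy < stations:              # a station is free
                        busy += 1
                        waits.append(0.0)
                        heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
                    else:
                        queue.append(t)              # join the waiting line
                else:                                # departure frees a station
                    if queue:
                        waits.append(t - queue.pop(0))
                        heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
                    else:
                        busy -= 1
            return sum(waits) / len(waits) if waits else 0.0

        print("mean passenger wait (minutes): %.2f" % simulate())

    Re-running simulate() with a different number of stations or a second queue discipline is the sketch-scale analogue of the three scenarios compared in the paper.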

  1. System time-domain simulation

    NASA Technical Reports Server (NTRS)

    Dawson, C. T.; Eggleston, T. W.; Goris, A. C.; Fashano, M.; Paynter, D.; Tranter, W. H.

    1980-01-01

    Complex systems are simulated by engineers without extensive computer experience. Analyst uses free-form engineering-oriented language to input "black box" description. System Time Domain (SYSTID) Simulation Program generates appropriate algorithms and proceeds with simulation. Program is easily linked to postprocessing routines. SYSTID program is written in FORTRAN IV for batch execution and has been implemented on UNIVAC 1110 under control of EXEC 8, Level 31.

  2. Using WIRED to study Simulated Linear Collider Detector Events

    SciTech Connect

    George, A

    2004-02-05

    The purpose of this project is to enhance the properties of the LCD WIRED Event Display. By extending the functionality of the display, physicists will be able to view events with more detail and interpret data faster. Poor characteristics associated with WIRED can severely affect the way we understand events, but by bringing attention to specific attributes we open doors to new ideas. Events displayed inside of the LCD have many different properties; this is why scientists need to be able to distinguish data using a plethora of symbols and other graphics. This paper will explain how we can view events differently using clustering and displaying results with track finding. Different source codes extracted from HEP libraries will be analyzed and tested to see which codes display the information needed. It is clear that, through these changes, certain aspects of WIRED will be recognized more readily, allowing better event displays that lead to better physics results.

  3. Simulating The SSF Information System

    NASA Technical Reports Server (NTRS)

    Deshpande, Govind K.; Kleine, Henry; Younger, Joseph C.; Sanders, Felicia A.; Smith, Jeffrey L.; Aster, Robert W.; Olivieri, Jerry M.; Paul, Lori L.

    1993-01-01

    Freedom Operations Simulation Test (FROST) computer program simulates operation of SSF information system, tracking every packet of data from generation to destination, for both uplinks and downlinks. Collects various statistics concerning operation of system and provides reports of statistics at intervals specified by user. FROST also incorporates graphical-display capability to enhance interpretation of these statistics. Written in SIMSCRIPT II.5.

  4. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Licensee event report system. 50.73 Section 50.73 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF PRODUCTION AND UTILIZATION FACILITIES Inspections, Records, Reports, Notifications § 50.73 Licensee event report system. (a) Reportable events.(1) The holder of an operating license under this...

  5. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    SciTech Connect

    na

    2005-05-30

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objective of this analysis was to develop the BDCFs for the

  6. Regional Climate Simulation of the Anomalous Events of 1998 using a Stretched-Grid GCM with Multiple Areas of Interest

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, M. S.; Takacs, L. L.; Govindaraju, R. C.; Atlas, Robert (Technical Monitor)

    2002-01-01

    The GEOS (Goddard Earth Observing System) stretched-grid (SG) GCM, developed and thoroughly tested over the last few years, is used for simulating the major anomalous regional climate events of 1998. The anomalous regional climate events are simulated simultaneously during the 13-month-long (November 1997 - December 1998) SG-GCM simulation owing to the new SG design with multiple (four) areas of interest. The following areas/regions of interest (one in each global quadrant) are implemented: U.S./Northern Mexico, the El-Nino/Brazil area, India-China, and Eastern Indian Ocean/Australia.

  7. Decision support system for managing oil spill events.

    PubMed

    Keramitsoglou, Iphigenia; Cartalis, Constantinos; Kassomenos, Pavlos

    2003-08-01

    The Mediterranean environment is exposed to various hazards, including oil spills, forest fires, and floods, making the development of a decision support system (DSS) for emergency management an objective of utmost importance. The present work presents a complete DSS for managing marine pollution events caused by oil spills. The system provides all the necessary tools for early detection of oil spills from satellite images, monitoring of their evolution, estimation of the accident consequences and provision of support to responsible Public Authorities during clean-up operations. The heart of the system is an image-processing geographic information system, assisted by individual software tools that perform oil spill evolution simulation and all other necessary numerical calculations, as well as cartographic and reporting tasks related to the management of a specific oil spill event. The cartographic information is derived from extant general maps and represents detailed information concerning regional environmental and land-cover characteristics as well as economic activities of the application area. Early notification of the authorities with up-to-date accurate information on the position and evolution of the oil spill, combined with the detailed coastal maps, is of paramount importance for emergency assessment and effective clean-up operations that would prevent environmental damage. An application was developed for the Region of Crete, an area particularly vulnerable to oil spills due to its location, ecological characteristics, and local economic activities. PMID:14753653

  8. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".

  9. Simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Tentner, A.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed are based on human factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scaleable to take advantage of emerging massively parallel processor (MPP) systems.

  10. Event Plane Resolution Simulations for The Fast Interaction Trigger Detector of ALICE at the LHC

    NASA Astrophysics Data System (ADS)

    Sulaimon, Isiaka; Harton, Austin; Garcia, Edmundo; Alice-Fit Collaboration

    2016-03-01

    CERN (European Center for Nuclear Research) is a global laboratory that studies proton and heavy ion collisions at the Large Hadron Collider (LHC). ALICE (A Large Ion Collider Experiment) is one of four large experiments of the LHC. ALICE is dedicated to the study of the transition of matter to Quark Gluon Plasma in heavy ion collisions. In the present ALICE detector there are two sub-detectors (the T0 and V0) that provide minimum bias trigger, multiplicity trigger, beam-gas event rejection, collision time for other sub-detectors, online multiplicity and event plane determination. In order to adapt these functionalities to the collision rates expected for the LHC upgrade after 2020, it is planned to replace these systems by a single detector system, called the Fast Interaction Trigger (FIT). In this presentation we describe the performance parameters of the FIT upgrade, show the proposed characteristics of the T0-Plus, and present the simulations that support the conceptual design of this detector. In particular we describe the performance simulations of the event plane resolution. This material is based upon work supported by the National Science Foundation under Grants NSF-PHY-0968903 and NSF-PHY-1305280.

  11. Parallelized event chain algorithm for dense hard sphere and polymer systems

    SciTech Connect

    Kampmann, Tobias A.; Boltz, Horst-Holger; Kierfeld, Jan

    2015-01-15

    We combine parallelization and cluster Monte Carlo for hard sphere systems and present a parallelized event chain algorithm for the hard disk system in two dimensions. For parallelization we use a spatial partitioning approach into simulation cells. We find that it is crucial for correctness to ensure detailed balance on the level of Monte Carlo sweeps by drawing the starting sphere of event chains within each simulation cell with replacement. We analyze the performance gains for the parallelized event chain and find a criterion for an optimal degree of parallelization. Because of the cluster nature of event chain moves, massive parallelization will not be optimal. Finally, we discuss first applications of the event chain algorithm to dense polymer systems, i.e., bundle-forming solutions of attractive semiflexible polymers.
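
    For readers unfamiliar with the method, the following Python sketch implements a bare-bones serial event chain move for hard disks in a periodic box. The paper's actual contributions (spatial partitioning into cells and the with-replacement resampling needed for detailed balance) are omitted, and the geometry and parameters are illustrative.

        # Serial event chain move for 2D hard disks, periodic boundaries.
        import math, random

        L, R, N = 10.0, 0.5, 40                      # box size, disk radius, count

        def free_flight(xi, yi, xj, yj):
            """How far disk i can move in +x before touching disk j (inf if never)."""
            dx = (xj - xi) % L                       # periodic separation along +x
            dy = (yj - yi + L / 2) % L - L / 2       # minimum-image transverse offset
            if abs(dy) >= 2 * R:
                return math.inf                      # disk j is never in the way
            s = dx - math.sqrt((2 * R) ** 2 - dy ** 2)
            return s if s > 0 else math.inf

        def event_chain(pos, chain_length=2.0):
            i = random.randrange(len(pos))           # starting disk of the chain
            budget = chain_length                    # total displacement to spend
            while budget > 0:
                xi, yi = pos[i]
                s_min, j_hit = budget, None
                for j, (xj, yj) in enumerate(pos):
                    if j != i:
                        s = free_flight(xi, yi, xj, yj)
                        if s < s_min:
                            s_min, j_hit = s, j
                pos[i] = ((xi + s_min) % L, yi)      # slide until collision or budget
                budget -= s_min
                if j_hit is None:
                    break                            # budget exhausted: chain ends
                i = j_hit                            # "lifting": hit disk continues

        # dilute, overlap-free start on a grid (8 columns x 5 rows)
        pos = [((k % 8) * 1.25 + 0.6, (k // 8) * 1.25 + 0.6) for k in range(N)]
        for _ in range(1000):
            event_chain(pos)
        print("one configuration sample:", pos[0])

    A full implementation would also alternate chain directions (+x and +y) for ergodicity; the paper's parallel version additionally restricts chains to independent simulation cells.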

  12. Towards High Performance Discrete-Event Simulations of Smart Electric Grids

    SciTech Connect

    Perumalla, Kalyan S; Nutaro, James J; Yoginath, Srikanth B

    2011-01-01

    Future electric grid technology is envisioned on the notion of a smart grid in which responsive end-user devices play an integral part of the transmission and distribution control systems. Detailed simulation is often the primary choice in analyzing small network designs, and the only choice in analyzing large-scale electric network designs. Here, we identify and articulate the high-performance computing needs underlying high-resolution discrete event simulation of smart electric grid operation in large network scenarios such as the entire Eastern Interconnect. We focus on the simulator's most computationally intensive operation, namely, the dynamic numerical solution for the electric grid state, for both time integration and event detection. We explore solution approaches using general-purpose dense and sparse solvers, and propose a scalable solver specialized for the sparse structures of actual electric networks. Based on experiments with an implementation in the THYME simulator, we identify performance issues and possible solution approaches for smart grid experimentation in the large.
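
    The solver trade-off described above can be illustrated with a generic sparse direct solve in Python. The tridiagonal matrix below merely stands in for the sparse structure of a real electric network, and this is of course not the specialized THYME solver itself.

        # Sketch: factor a sparse "network" matrix once, reuse per event.
        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import splu

        n = 5000                               # illustrative network size
        A = diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
        b = np.ones(n)

        lu = splu(A)                           # sparse LU: factor once...
        x = lu.solve(b)                        # ...solve cheaply at each event
        print("residual norm:", np.linalg.norm(A @ x - b))

    For matrices with this kind of sparsity, the factor-once/solve-often pattern is what makes repeated state solves affordable inside a discrete event loop, which is the performance issue the paper examines at scale.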

  13. Connecting Macroscopic Observables and Microscopic Assembly Events in Amyloid Formation Using Coarse Grained Simulations

    PubMed Central

    Bieler, Noah S.; Knowles, Tuomas P. J.; Frenkel, Daan; Vácha, Robert

    2012-01-01

    The pre-fibrillar stages of amyloid formation have been implicated in cellular toxicity, but have proved to be challenging to study directly in experiments and simulations. Rational strategies to suppress the formation of toxic amyloid oligomers require a better understanding of the mechanisms by which they are generated. We report Dynamical Monte Carlo simulations that allow us to study the early stages of amyloid formation. We use a generic, coarse-grained model of an amyloidogenic peptide that has two internal states: the first one representing the soluble random coil structure and the second one the β-sheet conformation. We find that this system exhibits a propensity towards fibrillar self-assembly following the formation of a critical nucleus. Our calculations establish connections between the early nucleation events and the kinetic information available in the later stages of the aggregation process that are commonly probed in experiments. We analyze the kinetic behaviour in our simulations within the framework of the theory of classical nucleated polymerisation, and are able to connect the structural events at the early stages in amyloid growth with the resulting macroscopic observables such as the effective nucleus size. Furthermore, the free-energy landscapes that emerge from these simulations allow us to identify pertinent properties of the monomeric state that could be targeted to suppress oligomer formation. PMID:23071427

  14. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism

    PubMed Central

    van Beek, Johannes H. G. M.; Supandi, Farahaniza; Gavai, Anand K.; de Graaf, Albert A.; Binsl, Thomas W.; Hettling, Hannes

    2011-01-01

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work, the rest is converted to heat. We calculated heat transfer by conduction and blood flow inside the body, and heat transfer from the skin by radiation, convection and sweat evaporation, resulting in temperature changes in 25 body compartments. We simulated a mountain time trial to Alpe d'Huez during the Tour de France. To approach the time realized by Lance Armstrong in 2004, very high oxygen uptake must be sustained by the simulated cyclist. Temperature was predicted to reach 39°C in the brain, and 39.7°C in leg muscle. In addition to the macroscopic simulation, we analysed the buffering of bursts of high adenosine triphosphate hydrolysis by creatine kinase during cyclical muscle activity at the biochemical pathway level. To investigate the low oxygen to carbohydrate ratio for the brain, which takes up lactate during exercise, we calculated the flux distribution in cerebral energy metabolism. Computational modelling of the human body, describing heat exchange and energy metabolism, makes simulation of endurance sports events feasible. PMID:21969677

  15. Healthcare system simulation using Witness

    NASA Astrophysics Data System (ADS)

    Khakdaman, Masoud; Zeinahvazi, Milad; Zohoori, Bahareh; Nasiri, Fardokht; Yew Wong, Kuan

    2013-02-01

    Simulation techniques have a proven track record in the manufacturing industry as well as in other areas such as healthcare system improvement. In this study, a simulation model of a health center in Malaysia is developed through the application of the WITNESS simulation software, which has shown its flexibility and capability in the manufacturing industry. The modelling procedure starts with process mapping and data collection and continues with model development, verification, validation and experimentation. At the end, final results and possible future improvements are demonstrated.

  16. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  17. Tutorial in medical decision modeling incorporating waiting lines and queues using discrete event simulation.

    PubMed

    Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter

    2010-01-01

    In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques are used to evaluate systems that include queuing or waiting, for example, discrete event simulation. To include queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers get an understanding of queue characteristics, modeling features, and their strengths. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities, to incorporate waiting lines and queues in the decision-analytic modeling example. PMID:20345550
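
    A minimal worked example of the cross-check this kind of tutorial encourages: simulate an M/M/1 queue (a single treatment resource) with the Lindley recursion and compare the mean waiting time against the closed-form result Wq = rho/(mu - lambda). The rates below are illustrative, not taken from the stent placement example.

        # M/M/1 queue: simulated vs. analytic mean waiting time.
        import random

        lam, mu, n_patients = 0.8, 1.0, 200_000    # illustrative rates

        t, server_free_at, total_wait = 0.0, 0.0, 0.0
        for _ in range(n_patients):
            t += random.expovariate(lam)           # next patient arrival
            start = max(t, server_free_at)         # wait if resource is busy
            total_wait += start - t
            server_free_at = start + random.expovariate(mu)

        rho = lam / mu
        print("simulated mean wait:", total_wait / n_patients)
        print("analytic Wq = rho/(mu-lam):", rho / (mu - lam))   # 4.0 here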

  18. A multiprocessor operating system simulator

    SciTech Connect

    Johnston, G.M.; Campbell, R.H. (Dept. of Computer Science)

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the Choices family of operating systems for loosely and tightly coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  19. A Multiprocessor Operating System Simulator

    NASA Technical Reports Server (NTRS)

    Johnston, Gary M.; Campbell, Roy H.

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall semester of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the 'Choices' family of operating systems for loosely- and tightly-coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  20. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other less-detailed models. The DES team continues to innovate and expand
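
    One way to picture the probabilistic extension: Delphi-style three-point duration estimates become triangular random variables, and Monte Carlo replication of the campaign timeline yields a duration distribution rather than a single number. The task names and figures below are hypothetical, not taken from the KSC models.

        # Monte Carlo roll-up of three-point (min, most likely, max) estimates.
        import random

        # (min, mode, max) durations in days, from hypothetical elicitation
        tasks = {"stack vehicle":   (8, 10, 15),
                 "integrated test": (4, 5, 9),
                 "pad operations":  (6, 7, 12)}

        def campaign_duration():
            # random.triangular takes (low, high, mode): mode is the 3rd argument
            return sum(random.triangular(lo, hi, mode)
                       for lo, mode, hi in tasks.values())

        runs = sorted(campaign_duration() for _ in range(10_000))
        print("median: %.1f days, 90th percentile: %.1f days"
              % (runs[len(runs) // 2], runs[int(0.9 * len(runs))]))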

  1. Discrete-event simulation of nuclear-waste transport in geologic sites subject to disruptive events. Final report

    SciTech Connect

    Aggarwal, S.; Ryland, S.; Peck, R.

    1980-06-19

    This report outlines a methodology to study the effects of disruptive events on nuclear waste material in stable geologic sites. The methodology is based upon developing a discrete-event model that can be simulated on the computer. This methodology allows a natural development of simulation models that use computer resources in an efficient manner. Accurate modeling in this area depends in large part upon accurate modeling of ion transport behavior in the storage media. Unfortunately, developments in this area are not at a stage where there is any consensus on proper models for such transport. Consequently, our work is directed primarily towards showing how disruptive events can be properly incorporated in such a model, rather than towards providing a predictive tool at this stage. When and if proper geologic parameters can be determined, then it would be possible to use this as a predictive model. Assumptions and their bases are discussed, and the mathematical and computer model are described.
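
    A sketch of how disruptive events can enter such a model: event epochs drawn from a Poisson process punctuate the simulated history, and each event perturbs a placeholder containment state. All rates and probabilities below are invented for illustration, not geologic parameters.

        # Poisson disruptive events over simulated geologic time.
        import random

        HORIZON = 1.0e6                     # years of simulated time
        RATE = 2.0e-6                       # disruptive events per year (invented)
        P_BREACH = 0.1                      # chance an event breaches a barrier

        def one_history():
            t, breaches = 0.0, 0
            while True:
                t += random.expovariate(RATE)      # exponential inter-event times
                if t > HORIZON:
                    return breaches
                if random.random() < P_BREACH:
                    breaches += 1

        histories = [one_history() for _ in range(10_000)]
        print("mean barrier breaches per history:",
              sum(histories) / len(histories))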

  2. Impulsive events in the evolution of a forced nonlinear system

    SciTech Connect

    Longcope, D.W.; Sudan, R.N.

    1992-03-16

    Long-time numerical solutions of a low-dimensional model of the reduced MHD equations show that, when this system is driven quasistatically, the response is punctuated by impulsive events. The statistics of these events indicate a Poisson process; the frequency of these events scales as ΔE_M^{-1}, where ΔE_M is the energy released in one event.

  3. An integrated system for hydrological analysis of flood events

    NASA Astrophysics Data System (ADS)

    Katsafados, Petros; Chalkias, Christos; Karymbalis, Efthymios; Gaki-Papanastassiou, Kalliopi; Mavromatidis, Elias; Papadopoulos, Anastasios

    2010-05-01

    The significant increase of extreme flood events during recent decades has led to an urgent social and economic demand for improved prediction and sustainable prevention. Remedial actions require accurate estimation of the spatiotemporal variability of runoff volume and local peaks, which can be analyzed through integrated simulation tools. Despite the fact that such advanced modeling systems allow the investigation of the dynamics controlling the behavior of those complex processes, they can also be used as early warning systems. Moreover, simulation is assumed to be the appropriate method for deriving quantitative estimates of various atmospheric and hydrologic parameters, especially in the absence of reliable and accurate measurements of precipitation and flow rates. Such sophisticated techniques enable flood risk assessment and improve decision-making support on protection actions. This study presents an integrated system for the simulation of the essential atmospheric and soil parameters in the context of hydrological flood modeling. The system consists of two main cores: a numerical weather prediction model coupled with a geographical information system for the accurate simulation of groundwater advection and rainfall runoff estimation. Synoptic and mesoscale atmospheric motions are simulated with a non-hydrostatic limited area model on a very high resolution domain of integration. The model includes advanced schemes for the microphysics and the surface layer physics description as well as the longwave and shortwave radiation budget estimation. It is also fully coupled with a land-surface model in order to resolve the surface heat fluxes and the simulation of the air-land energy exchange processes. Detailed atmospheric and soil parameters derived from the atmospheric model are used as input data for the GIS-based runoff modeling. Geographical information system (GIS) technology is used for further hydrological analysis and estimation of direct

  4. AP1000 Design Basis Event Simulation at the APEX-1000 Test Facility

    SciTech Connect

    Wright, Richard F.; Groome, John

    2004-07-01

    The AP1000 is a 1000 MWe advanced nuclear power plant that uses passive safety features to enhance plant safety and to provide significant and measurable improvements in plant simplification, reliability, investment protection and plant costs. The AP1000 relies heavily on the 600 MWe AP600 which received design certification in 1999. A critical part of the AP600 design certification process involved the testing of the passive safety systems. A one-fourth height, one-fourth pressure test facility, APEX-600, was constructed at Oregon State University to study design basis events, and to provide a body of data to be used to validate the computer models used to analyze the AP600. This facility was extensively modified to reflect the design changes for AP1000 including higher power in the electrically heated rods representing the reactor core, and changes in the size of the pressurizer, core makeup tanks and automatic depressurization system. Several design basis events are being simulated at APEX-1000 including a double-ended direct vessel injection (DEDVI) line break and a 2-inch cold leg break. These tests show that the core remains covered with ample margin until gravity injection is established regardless of the initiating event. The tests also show that liquid entrainment from the upper plenum which is proportional to the reactor power does not impact the ability of the passive core cooling system to keep the core covered. (authors)

  5. A Performance Study of Event Processing Systems

    NASA Astrophysics Data System (ADS)

    Mendes, Marcelo R. N.; Bizarro, Pedro; Marques, Paulo

    Event processing engines are used in diverse mission-critical scenarios such as fraud detection, traffic monitoring, or intensive care units. However, these scenarios have very different operational requirements in terms of, e.g., types of events, queries/patterns complexity, throughput, latency and number of sources and sinks. What are the performance bottlenecks? Will performance degrade gracefully with increasing loads? In this paper we make a first attempt to answer these questions by running several micro-benchmarks on three different engines, while we vary query parameters like window size, window expiration type, predicate selectivity, and data values. We also perform some experiments to assess engines scalability with respect to number of queries and propose ways for evaluating their ability in adapting to changes in load conditions. Lastly, we show that similar queries have widely different performances on the same or different engines and that no engine dominates the other two in all scenarios.
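
    The window-based queries being benchmarked can be pictured with a toy time-based sliding window aggregate. Real engines express this declaratively in a query language; the hand-rolled Python below, with an invented window size and event stream, only illustrates the mechanics.

        # Time-based sliding window average over an event stream.
        from collections import deque

        WINDOW = 10.0                       # window length in seconds (invented)

        def windowed_averages(events):
            """events: iterable of (timestamp, value), timestamps nondecreasing."""
            buf, total = deque(), 0.0
            for ts, val in events:
                buf.append((ts, val))
                total += val
                while buf[0][0] <= ts - WINDOW:    # expire events left behind
                    total -= buf.popleft()[1]
                yield ts, total / len(buf)

        stream = [(0.0, 2.0), (4.0, 4.0), (9.0, 6.0), (14.0, 8.0), (25.0, 1.0)]
        for ts, avg in windowed_averages(stream):
            print("t=%5.1f  sliding average=%.2f" % (ts, avg))

    Varying the window size, expiration policy, or predicate selectivity in such a kernel is the toy-scale analogue of the micro-benchmark dimensions explored in the paper.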

  6. Simulation of heavy rainfall events over Indian region: a benchmark skill with a GCM

    NASA Astrophysics Data System (ADS)

    Goswami, Prashant; Kantha Rao, B.

    2015-10-01

    Extreme rainfall events (ERE) contribute a significant component of the Indian summer monsoon rainfall. Thus an important requirement for regional climate simulations is to attain desirable quality and reliability in simulating extreme rainfall events. While global circulation models (GCMs) with coarse resolution are not preferred for the simulation of extreme events, it is expected that the global domain in a GCM would allow better representation of scale interactions, resulting in adequate skill in simulating localized events in spite of lower resolution. At the same time, a GCM with skill in the simulation of extreme events will provide a more reliable tool for seamless prediction. The present work provides an assessment of a GCM for simulating 40 EREs that occurred over India during 1998-2013. It is found that, expectedly, the GCM forecasts underestimate the observed (TRMM) rainfall in most cases, but not always. Somewhat surprisingly, the forecasts of location are quite accurate in spite of low resolution (~50 km). An interesting result is that the highest skill of the forecasts is realized at 48 h lead rather than at 24 or 96 h lead. Diagnostics of dynamical fields like convergence show that the forecasts can capture contrasting features on pre-event, event and post-event days. The forecast configuration used is similar to one that has been used for long-range monsoon forecasting and tropical cyclones in earlier studies; the present results on ERE forecasting, therefore, provide an indication of the potential application of the model for seamless prediction.

  7. Stochastic Event Counter for Discrete-Event Systems Under Unreliable Observations

    SciTech Connect

    Tae-Sic Yoo; Humberto E. Garcia

    2008-06-01

    This paper addresses the issues of counting the occurrence of special events in the framework of partially observed discrete-event dynamical systems (DEDS). First, we develop a novel recursive procedure that updates the active counter information state sequentially with available observations. In general, the cardinality of the active counter information state is unbounded, which makes the exact recursion infeasible computationally. To overcome this difficulty, we develop an approximated recursive procedure that regulates and bounds the size of the active counter information state. Using the approximated active counting information state, we give an approximated minimum mean square error (MMSE) counter. The developed algorithms are then applied to count special routing events in a material flow system.

  8. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model.

    PubMed

    Sanzari, Jenine K; Diffenderfer, Eric S; Hagan, Sarah; Billings, Paul C; Gridley, Daila S; Seykora, John T; Kennedy, Ann R; Cengel, Keith A

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September, 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury where infection may occur, if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. PMID:26256624

  9. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    NASA Astrophysics Data System (ADS)

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September, 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury where infection may occur, if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed.

  10. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    PubMed Central

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-01-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September, 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury where infection may occur, if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. PMID:26256624

  11. The influence of spectral nudging in simulating Vb-events with COSMO-CLM

    NASA Astrophysics Data System (ADS)

    Paumann, Manuela; Anders, Ivonne; Hofstätter, Michael; Chimani, Barbara

    2015-04-01

    In previous studies certain European cyclones have been investigated in terms of related extreme precipitation events in Austria. Those systems passing the Mediterranean are of special interest as the atmospheric moisture content is increased. It has been shown in recent investigations that state-of-the-art RCMs can approximately reproduce observed heavy precipitation characteristics. This provides a basic confidence in the model's ability to capture future changes of such events under increased greenhouse gas conditions as well. In this contribution we focus on high spatial and temporal scales and assess the currently achievable accuracy in the simulation of Vb-events. The state-of-the-art regional climate model CCLM is applied in a hindcast mode to the case of individual Vb-events in August 2002 and May/June 2013. Besides the conventional forcing of the regional climate model at its lateral boundaries, a spectral nudging technique is applied. This means that inside the model area the regional model is forced to accept the analysis for large scales, whereas it has no effect on the small scales. The simulations for the Vb-events mentioned above, covering the European domain, have been varied systematically by changing the nudging factor, the number of nudged waves, the nudged variables, and other parameters. The resulting precipitation amounts have been compared to the E-OBS gridded European precipitation data set and a recent high spatially resolved precipitation data set for Austria (GPARD-6). Varying the spectral nudging setup in the short-term Vb-cases helps us, on the one hand, to learn something about 3D processes during Vb-events (e.g., vorticity and formation) and, on the other hand, to identify model deficiencies. The results show that increasing the number of nudged waves from 1 to 7 as well as the choice of the variables used in the nudging process have a large influence on the development of the low pressure system and the related precipitation patterns. On the contrary, the nudging

  12. The influence of spectral nudging in simulating individual Vb-events with COSMO-CLM

    NASA Astrophysics Data System (ADS)

    Paumann, Manuela; Anders, Ivonne; Hofstätter, Michael; Chimani, Barbara

    2014-05-01

    In previous studies certain European cyclones have been investigated in terms of related extreme precipitation events in Austria. Those systems passing the Mediterranean are of special interest as the atmospheric moisture content is increased. It has been shown in recent investigations that state-of-the-art RCMs can approximately reproduce observed heavy precipitation characteristics. This provides a basic confidence in the model's ability to capture future changes of such events under increased greenhouse gas conditions as well. In this contribution we focus on high spatial and temporal scales and assess the currently achievable accuracy in the simulation of Vb-events. The state-of-the-art regional climate model CCLM is applied in a hindcast mode to the case of individual Vb-events in August 2002 and May/June 2013. Besides the conventional forcing of the regional climate model at its lateral boundaries, a spectral nudging technique is applied. This means that inside the model area the regional model is forced to accept the analysis for large scales, whereas it has no effect on the small scales. The simulations for the Vb-events mentioned above, covering the European domain, have been varied systematically by changing the nudging factor, the number of nudged waves, the nudged variables, and other parameters. The resulting precipitation amounts have been compared to the E-OBS gridded European precipitation data set and a recent high spatially resolved precipitation data set for Austria (GPARD-6). Varying the spectral nudging setup in the short-term Vb-cases helps us, on the one hand, to learn something about 3D processes during Vb-events (e.g., vorticity and formation) and, on the other hand, to identify model deficiencies. The results show that increasing the number of nudged waves from 1 to 7 as well as the choice of the variables used in the nudging process have a large influence on the development of the low pressure system and the related precipitation patterns. On the contrary, the nudging

  13. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    PubMed Central

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, usually only individual sensors are located at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, so it is desirable to reduce the number of commutations of the control signals from both security and economic points of view. Therefore, and in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system allows saving costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597
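
    The send-on-delta flavor of event-based control described here is easy to sketch: the controller acts only when the measurement leaves a dead band around the setpoint, so relay commutations occur only at events. The toy thermal model and all numbers below are invented placeholders, not the paper's greenhouse model.

        # Event-based (dead band) relay control of a toy thermal plant.
        import random

        SETPOINT, DELTA = 22.0, 0.5          # degrees C, event threshold
        temp, heater_on, commutations = 18.0, False, 0

        for step in range(600):              # one sample per simulated minute
            # crude plant: heater warms, ambient losses cool, plus noise
            temp += (0.5 if heater_on else 0.0) - 0.05 * (temp - 15.0) + random.gauss(0.0, 0.02)
            error = SETPOINT - temp
            if abs(error) > DELTA:           # event: measurement left the dead band
                want_on = error > 0
                if want_on != heater_on:
                    heater_on = want_on
                    commutations += 1        # each switch wears the relay
            # inside the dead band nothing is transmitted or switched

        print("final temp: %.2f C, relay switchings: %d" % (temp, commutations))

    Widening DELTA in this sketch trades control tightness for fewer commutations, which is the wear-versus-performance trade-off the paper evaluates in simulation.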

  14. MCNP6. Simulating Correlated Data in Fission Events

    SciTech Connect

    Rising, Michael Evan; Sood, Avneet

    2015-12-03

    This report is a series of slides discussing the MCNP6 code and its status in simulating fission. Applications of interest include global security and nuclear nonproliferation, detection of special nuclear material (SNM), passive and active interrogation techniques, and coincident neutron and photon leakage.

  15. A System for Interactive Behaviour Simulation.

    ERIC Educational Resources Information Center

    Bierschenk, Bernhard

    A psycho-ecological model is used as the basis for a simulation of interactive behavior strategies. The basic unit is an event, and each event has been recorded on closed circuit television videotape. The three basic paradigms of behavioral science--association, structure, and process--are used to anchor the simulations. The empirical foundation…

  16. Repetition-Related Reductions in Neural Activity during Emotional Simulations of Future Events

    PubMed Central

    2015-01-01

    Simulations of future experiences are often emotionally arousing, and the tendency to repeatedly simulate negative future outcomes has been identified as a predictor of the onset of symptoms of anxiety. Nonetheless, next to nothing is known about how the healthy human brain processes repeated simulations of emotional future events. In this study, we present a paradigm that can be used to study repeated simulations of the emotional future in a manner that overcomes phenomenological confounds between positive and negative events. The results show that pulvinar nucleus and orbitofrontal cortex respectively demonstrate selective reductions in neural activity in response to frequently as compared to infrequently repeated simulations of negative and positive future events. Implications for research on repeated simulations of the emotional future in both non-clinical and clinical populations are discussed. PMID:26390294

  17. Modeling and simulation of single-event effect in CMOS circuit

    NASA Astrophysics Data System (ADS)

    Suge, Yue; Xiaolin, Zhang; Yuanfu, Zhao; Lin, Liu; Hanning, Wang

    2015-11-01

    This paper reviews the status of research in modeling and simulation of single-event effects (SEE) in digital devices and integrated circuits. After a brief historical overview of SEE simulation, simulation approaches at different levels are detailed. At the material level, physical simulation of the two primary mechanisms by which ionizing radiation releases charge in a semiconductor device (direct ionization and indirect ionization) is introduced. At the device level, the focus is on the main emerging physical phenomena affecting nanometer devices (the bipolar transistor effect and the charge sharing effect) and the methods envisaged for taking them into account. At the circuit level, methods for predicting the single-event response, covering the production and propagation of single-event transients (SETs) in sequential and combinational logic, are detailed, and soft error rate trends with scaling are particularly addressed.
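
    A standard ingredient of circuit-level SEE simulation of the kind surveyed here is the double-exponential current pulse used to inject a single-event transient at a struck node; the sketch below implements that textbook model. The deposited charge and time constants are illustrative values, not figures from the review.

        # Double-exponential current pulse widely used for SET injection in
        # circuit-level SEE simulation. Parameters are illustrative only.
        import math

        def set_current(t, q=50e-15, tau_f=200e-12, tau_r=50e-12):
            """Current (A) at time t (s); q is the deposited charge (C)."""
            if t < 0:
                return 0.0
            return q / (tau_f - tau_r) * (math.exp(-t / tau_f) - math.exp(-t / tau_r))

        # The pulse integrates back to (approximately) the deposited charge q:
        dt = 1e-12
        print(sum(set_current(i * dt) for i in range(5000)) * dt)  # ~5e-14 C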

  18. Mesoscale Simulations of a Wind Ramping Event for Wind Energy Prediction

    SciTech Connect

    Rhodes, M; Lundquist, J K

    2011-09-21

    Ramping events, or rapid changes of wind speed and wind direction over a short period of time, present challenges to power grid operators in regions with significant penetrations of wind energy in the power grid portfolio. Improved predictions of wind power availability require adequate predictions of the timing of ramping events. For the ramping event investigated here, the Weather Research and Forecasting (WRF) model was run at three horizontal resolutions in 'mesoscale' mode: 8100 m, 2700 m, and 900 m. Two Planetary Boundary Layer (PBL) schemes, the Yonsei University (YSU) and Mellor-Yamada-Janjic (MYJ) schemes, were run at each resolution as well. Simulations were not 'tuned' with nuanced choices of vertical resolution or tuning parameters, so that these simulations may be considered 'out-of-the-box' tests of a numerical weather prediction code. Simulations are compared with sodar observations during a wind ramping event at a 'West Coast North America' wind farm. Despite differences in the boundary-layer schemes, no significant differences were observed in the abilities of the schemes to capture the timing of the ramping event. As collaborators have identified, the boundary conditions of these simulations probably dominate the physics of the simulations. They suggest that future investigations into the characterization of ramping events employ ensembles of simulations, and that the ensembles include variations of boundary conditions. Furthermore, the failure of these simulations to capture not only the timing of the ramping event but also the shape of the wind profile during the ramping event (regardless of its timing) indicates that the set-up and execution of such simulations for wind power forecasting requires skill and tuning of the simulations for a specific site.

  19. Systems Engineering Simulator (SES) Simulator Planning Guide

    NASA Technical Reports Server (NTRS)

    McFarlane, Michael

    2011-01-01

    The simulation process, milestones and inputs are unknown to first-time users of the SES. The Simulator Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  20. Argonne simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Canfield, T.; Brown-VanHoozer, A.; Tentner, A.

    1996-04-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that include human-factors studies to support safety and operational research. Realistic modeling of variations of the posted driving speed is based on human factor studies that take into consideration weather, road conditions, the driver's personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
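
    The message-exchange design described here can be sketched in a few lines: each vehicle is an autonomous agent that reacts to traffic events and reports back to the TMC. The following toy example is only a structural illustration; the class, the message fields and the naive replanning rule are hypothetical, not part of the Argonne framework.

        # Toy sketch of vehicles as message-driven agents. Names are invented.
        from collections import deque

        class Vehicle:
            def __init__(self, vid, route):
                self.vid, self.route = vid, list(route)

            def on_message(self, msg, outbox):
                if msg["type"] == "link_closed" and msg["link"] in self.route:
                    self.route.remove(msg["link"])          # naive replanning
                    outbox.append({"type": "probe_report", "vid": self.vid,
                                   "route": self.route})    # report to the TMC

        vehicles = [Vehicle(1, ["A", "B", "C"]), Vehicle(2, ["B", "D"])]
        inbox = deque([{"type": "link_closed", "link": "B"}])
        outbox = []
        while inbox:
            msg = inbox.popleft()
            for v in vehicles:
                v.on_message(msg, outbox)
        print(outbox)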

  1. Efficient event-driven simulations shed new light on microtubule organization in the plant cortical array

    NASA Astrophysics Data System (ADS)

    Tindemans, Simon H.; Deinum, Eva E.; Lindeboom, Jelmer J.; Mulder, Bela M.

    2014-04-01

    The dynamics of the plant microtubule cytoskeleton is a paradigmatic example of the complex spatiotemporal processes characterising life at the cellular scale. This system is composed of large numbers of spatially extended particles, each endowed with its own intrinsic stochastic dynamics, and is capable of non-equilibrium self-organisation through collisional interactions of these particles. To elucidate the behaviour of such a complex system requires not only conceptual advances, but also the development of appropriate computational tools to simulate it. As the number of parameters involved is large and the behaviour is stochastic, it is essential that these simulations be fast enough to allow for an exploration of the phase space and the gathering of sufficient statistics to accurately pin down the average behaviour as well as the magnitude of fluctuations around it. Here we describe a simulation approach that meets this requirement by adopting an event-driven methodology that encompasses both the spontaneous stochastic changes in microtubule state as well as the deterministic collisions. In contrast with finite time step simulations, this technique is intrinsically exact, as well as several orders of magnitude faster, which enables ordinary PC hardware to simulate systems of ~10^3 microtubules on a time scale ~10^3 times faster than real time. In addition we present new tools for the analysis of microtubule trajectories on curved surfaces. We illustrate the use of these methods by addressing a number of outstanding issues regarding the importance of various parameters on the transition from an isotropic to an aligned and oriented state.
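
    The essence of the event-driven methodology described above is a priority queue of pending events keyed by firing time, so that simulated time jumps exactly from one event to the next instead of advancing in fixed steps. A minimal kernel of this kind is sketched below; the single "catastrophe" event type and its rate are placeholders, not the paper's microtubule model.

        # Minimal event-driven kernel: pop the earliest pending event, handle
        # it, schedule its stochastic successor. Rates are illustrative.
        import heapq, random

        def simulate(t_end=10.0, rate=1.0):
            t, queue, n_events = 0.0, [], 0
            heapq.heappush(queue, (random.expovariate(rate), "catastrophe"))
            while queue:
                t_next, kind = heapq.heappop(queue)
                if t_next > t_end:
                    break
                t, n_events = t_next, n_events + 1
                # handle the event here, then schedule the next one of its kind
                heapq.heappush(queue, (t + random.expovariate(rate), kind))
            return n_events

        print(simulate())  # ~10 events expected at rate 1 per unit time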

  2. Simulation of January 1-7, 1978 events

    NASA Technical Reports Server (NTRS)

    Chao, J. K.; Moldwin, M. B.; Akasofu, S.-I.

    1987-01-01

    The solar wind disturbances of January 1 to 7, 1978 are reconstructed by a modeling method. First, the interplanetary magnetic field (IMF) background pattern, including a corotating shock, is reproduced using the Stanford source surface map. Then, two solar flares with their onset times on January 1, 0717 UT at S17 deg E10 deg and 2147 UT S17 deg E32 deg, respectively, are selected to generate two interplanetary transient shocks. It is shown that these two shocks interacted with the corotating shock, resulting in a series of interplanetary events observed by four spacecraft, Helios 1 and 2, IMP-8 (Interplanetary Monitoring Platform 8), and Voyager 2. Results show that these three shock waves interact and coalesce in interplanetary space such that Helios 2 and Voyager 2 observed only one shock and Helios 1 and IMP-8 observed two shocks. All shocks observed by the four spacecraft, except the corotating shock at Helios 1, are either a transient shock or a shock which is formed from coalescing of the transient shocks with the corotating shock. The method is useful in reconstructing a very complicated chain of interplanetary events observed by a number of spacecraft.

  3. Simulation of LHC events on a million threads

    NASA Astrophysics Data System (ADS)

    Childers, J. T.; Uram, T. D.; LeCompte, T. J.; Papka, M. E.; Benjamin, D. P.

    2015-12-01

    Demand for Grid resources is expected to double during LHC Run II as compared to Run I; the capacity of the Grid, however, will not double. The HEP community must consider how to bridge this computing gap by targeting larger compute resources and using the available compute resources as efficiently as possible. Argonne's Mira, the fifth fastest supercomputer in the world, can run roughly five times the number of parallel processes that the ATLAS experiment typically uses on the Grid. We ported Alpgen, a serial x86 code, to run as a parallel application under MPI on the Blue Gene/Q architecture. By analysis of the Alpgen code, we reduced the memory footprint to allow running 64 threads per node, utilizing the four hardware threads available per core on the PowerPC A2 processor. Event generation and unweighting, typically run as independent serial phases, are coupled together in a single job in this scenario, reducing intermediate writes to the filesystem. By these optimizations, we have successfully run LHC proton-proton physics event generation at the scale of a million threads, filling two-thirds of Mira.

  4. Analysis of Extreme Events in Regional Climate Model Simulations for the Pacific Northwest using weatherathome

    NASA Astrophysics Data System (ADS)

    Mera, R. J.; Mote, P.; Weber, J.

    2011-12-01

    One of the most prominent impacts of climate change over the Pacific Northwest is the potential for an elevated number of extreme precipitation events over the region. Planning for natural hazards, such as an increasing number of floods related to high-precipitation events, has in general focused on avoiding development in floodplains and conditioning development to withstand inundation with a minimum of losses. Nationwide, the Federal Emergency Management Agency (FEMA) estimates that about one quarter of its payments cover damage that has occurred outside mapped floodplains. It is clear that traditional flood-based planning will not be sufficient to predict and avoid future losses resulting from climate-related hazards such as high-precipitation events. In order to address this problem, the present study employs regional climate model output for future climate change scenarios to aid with the development of a map-based inventory of future hazard risks that can contribute to the development of a "planning-scale" decision support system for the Oregon Department of Land Conservation and Development (DLCD). Climate model output is derived from the climateprediction.net (CPDN) weatherathome project, an innovative climate science experiment that utilizes volunteer computers from users worldwide to produce superensembles of hundreds of thousands of regional climate simulations of the Western United States climate from 1950 to 2050. The spatial and temporal distributions of extreme weather events are analyzed for the Pacific Northwest to diagnose the model's capabilities as an input for map products such as impacts on hydrology. Special attention is given to the intensity and frequency of Atmospheric River events in historical and future climate contexts.

  5. Constructive episodic simulation: temporal distance and detail of past and future events modulate hippocampal engagement.

    PubMed

    Addis, Donna Rose; Schacter, Daniel L

    2008-01-01

    Behavioral, lesion and neuroimaging evidence show striking commonalities between remembering past events and imagining future events. In a recent event-related fMRI study, we instructed participants to construct a past or future event in response to a cue. Once an event was in mind, participants made a button press, then generated details (elaboration) and rated them. The elaboration of past and future events recruited a common neural network. However, regions within this network may respond differentially to event characteristics, such as the amount of detail generated and temporal distance, depending on whether the event is in the past or future. To investigate this further, we conducted parametric modulation analyses, with temporal distance and detail as covariates, and focused on the medial temporal lobes and frontopolar cortex. The analysis of detail (independent of temporal distance) showed that the left posterior hippocampus was responsive to the amount of detail comprising both past and future events. In contrast, the left anterior hippocampus responded differentially to the amount of detail comprising future events, possibly reflecting the recombination of details into a novel future event. The analysis of temporal distance revealed that the increasing recency of past events correlated with activity in the right parahippocampal gyrus (Brodmann area (BA) 35/36), while activity in the bilateral hippocampus was significantly correlated with the increasing remoteness of future events. We propose that the hippocampal response to the distance of future events reflects the increasing disparateness of details likely included in remote future events, and the intensive relational processing required for integrating such details into a coherent episodic simulation of the future. These findings provide further support for the constructive episodic simulation hypothesis (Schacter and Addis (2007) Philos Trans R Soc Lond B Biol Sci 362:773-786) and highlight the

  6. How well do CORDEX models simulate extreme rainfall events over the East Coast of South Africa?

    NASA Astrophysics Data System (ADS)

    Abba Omar, Sabina; Abiodun, Babatunde J.

    2016-01-01

    This study assesses the capability of regional climate models (RCMs) in simulating the characteristics of widespread extreme rainfall events over the East Coast of South Africa. Simulations of nine RCMs from the Coordinated Regional Downscaling Experiment (CORDEX) were analyzed for the study. All the simulations cover 12 years (1996-2008). Using the 95th percentile of daily rainfall as the threshold of extreme events, and defining the simultaneous occurrence of extreme events over 50% of the East Coast as widespread extreme rainfall events (WEREs), we compared the characteristics of simulated WEREs with observations (GPCP and TRMM) and with the reanalysis (ERAINT) that forced the simulations. Most RCMs perform well in simulating the seasonal variation of WEREs over the East Coast but perform poorly in simulating the interannual variability. Based on their rainfall synoptic patterns over Southern Africa, the WEREs in the East Coast can be generally classified into four groups. The first group connects the WEREs with tropical rainfall activities over the subcontinent. The second group links WEREs with frontal rainfall south of the subcontinent. The third group links the WEREs with both tropical and temperate rainfall activities, while the fourth group represents isolated WEREs. The RCMs show different capabilities in simulating the frequency of WEREs in each group; some perform better than ERAINT while some perform worse. Results of this study could provide information on the usability of RCMs in downscaling the impact of climate change on widespread extreme rainfall events over South Africa.
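
    The two-step definition used above (a per-cell 95th-percentile daily-rainfall threshold, then a day counting as widespread when at least 50% of cells exceed their threshold) is straightforward to make concrete. The sketch below applies it to synthetic, spatially uncorrelated data, so few or no days qualify; real rainfall fields are spatially correlated, which is what produces WEREs. The array shapes and gamma-distributed rainfall are assumptions for illustration only.

        # Illustrative WERE detection on synthetic daily rainfall.
        import numpy as np

        rng = np.random.default_rng(0)
        rain = rng.gamma(shape=0.5, scale=4.0, size=(4383, 200))  # (days, cells)

        thresholds = np.percentile(rain, 95, axis=0)   # per-cell 95th percentile
        extreme = rain > thresholds                    # boolean (days, cells)
        widespread = extreme.mean(axis=1) >= 0.5       # fraction of cells per day
        print("WERE days:", int(widespread.sum()))     # ~0 for uncorrelated cells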

  7. Statistics of Record-Breaking Events in the Self-Organized Critical Systems

    NASA Astrophysics Data System (ADS)

    Shcherbakov, R.; Newman, W. I.; Turcotte, D. L.; Davidsen, J.; Tiampo, K.; Rundle, J. B.

    2010-12-01

    Record-breaking events generated by the dynamics of driven nonlinear threshold systems are extracted and analyzed. They are compared to the record-breaking events extracted from sequences of independent identically distributed (i.i.d.) random variables drawn from the Weibull distribution. Several statistical measures of record-breaking events are derived analytically and confirmed through numerical simulations for Weibull and power-law distributed random variables. Driven nonlinear threshold systems usually exhibit avalanche-type behavior, where a slow buildup of energy is punctuated by an abrupt release of energy through avalanche events, which usually follow scale-invariant statistics. From the simulations of these systems it is possible to extract a sequence of record-breaking avalanches, where each subsequent record-breaking event is larger in magnitude than the previous one and all events in between are smaller than the current record-breaking event and the previous one. In the present work, several cellular automata are analyzed, among them the sandpile model, the Manna model, the Olami-Feder-Christensen (OFC) model, and the forest-fire model, to investigate the record-breaking statistics of model avalanches exhibiting temporal and spatial correlations. It is found that the statistics of record-breaking events for the above cellular automata exhibit behavior different from that observed for i.i.d. random variables, which signifies their complex spatio-temporal dynamics. The most pronounced deviations are observed in the case of the OFC model, with a strong dependence on the conservation parameter of the model.
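
    Extracting the record-breaking subsequence from a simulated avalanche series is a one-pass scan: an event is a record if it exceeds every event before it. A minimal sketch follows; the sample magnitudes are arbitrary.

        # Extract the record-breaking subsequence of an event series.
        def records(events):
            out, best = [], float("-inf")
            for i, e in enumerate(events):
                if e > best:
                    out.append((i, e))
                    best = e
            return out

        # For i.i.d. variables the expected record count after n events is the
        # harmonic number H_n = 1 + 1/2 + ... + 1/n; correlated avalanche
        # series deviate from this baseline.
        print(records([3.1, 0.5, 4.2, 4.0, 7.7, 1.2]))
        # -> [(0, 3.1), (2, 4.2), (4, 7.7)]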

  8. Using Wavelet Analysis To Assist in Identification of Significant Events in Molecular Dynamics Simulations.

    PubMed

    Heidari, Zahra; Roe, Daniel R; Galindo-Murillo, Rodrigo; Ghasemi, Jahan B; Cheatham, Thomas E

    2016-07-25

    Long time scale molecular dynamics (MD) simulations of biological systems are becoming increasingly commonplace due to the availability of both large-scale computational resources and significant advances in the underlying simulation methodologies. Therefore, it is useful to investigate and develop data mining and analysis techniques to quickly and efficiently extract the biologically relevant information from the incredible amount of generated data. Wavelet analysis (WA) is a technique that can quickly reveal significant motions during an MD simulation. Here, the application of WA to well-converged long time scale (tens of μs) simulations of a DNA helix is described. We show how WA combined with a simple clustering method can be used to identify both the physical and temporal locations of events with significant motion in MD trajectories. We also show that WA can not only distinguish and quantify the locations and time scales of significant motions, but that by changing the maximum time scale of WA a more complete characterization of these motions can be obtained. This allows motions of different time scales to be identified or ignored as desired. PMID:27286268
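
    As a toy version of this idea, a continuous wavelet transform of a per-frame series (for instance, an RMSD trace) localizes where in time and at what scale the largest motion occurs. The sketch below assumes the PyWavelets package and uses a synthetic step-like signal in place of trajectory data; it is not the paper's WA-plus-clustering pipeline.

        # Locate the time and scale of the largest motion in a synthetic
        # trajectory-derived series via a continuous wavelet transform.
        import numpy as np
        import pywt

        signal = np.concatenate([np.zeros(500), np.ones(200), np.zeros(300)])
        signal += 0.1 * np.random.default_rng(1).standard_normal(1000)

        scales = np.arange(1, 65)
        coefs, freqs = pywt.cwt(signal, scales, "morl")  # (n_scales, n_frames)
        power = coefs ** 2
        s, f = np.unravel_index(np.argmax(power), power.shape)
        print(f"largest motion near frame {f} at scale {scales[s]}")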

  9. Simulations of Diffusion in Solar Energetic Particle Events

    NASA Astrophysics Data System (ADS)

    Pei, C.; Jokipii, J.; Giacalone, J.

    2007-12-01

    New observations by high-sensitivity instruments onboard the ACE spacecraft show that Fe and O may share similar injection profiles close to the solar surface, and that diffusion dominates the transport of these particles (Mason et al. 2006). Multi-spacecraft observations by Helios and IMP-8 also confirm that spatial diffusion is important (Wibberenz & Cane 2006). The "reservoir" phenomenon or "spatial invariance" states that during the decay phase of individual gradual solar energetic particle events, the intensities measured by different spacecraft are nearly equal, even if these spacecraft are separated by several AU in radius and by 70 degrees in latitude. Results from our multidimensional numerical model, based on Parker's transport equation, with reasonable values of κ⊥ and κ∥, are compared with observations from Ulysses, IMP-8, and ACE. We demonstrate that most of the features of the "reservoir" phenomenon can be reproduced by a transport model which includes drift, energy loss, and spatial diffusion.

  10. Real time digital propulsion system simulation for manned flight simulators

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.; Hart, C. E.

    1978-01-01

    A real time digital simulation of a STOL propulsion system was developed which generates significant dynamics and internal variables needed to evaluate system performance and aircraft interactions using manned flight simulators. The simulation ran at a real-to-execution time ratio of 8.8. The model was used in a piloted NASA flight simulator program to evaluate the simulation technique and the propulsion system digital control. The simulation is described and results are shown. Limited results of the flight simulation program are also presented.

  11. An event generator for simulations of complex β-decay experiments

    NASA Astrophysics Data System (ADS)

    Jordan, D.; Algora, A.; Tain, J. L.

    2016-08-01

    This article describes a Monte Carlo event generator for the design, optimization and performance characterization of beta decay spectroscopy experimental set-ups. The event generator has been developed within the Geant4 simulation architecture and provides new features and greater flexibility in comparison with the currently available decay generator.

  12. Systems simulations supporting NASA telerobotics

    NASA Technical Reports Server (NTRS)

    Harrison, F. W., Jr.; Pennington, J. E.

    1987-01-01

    Two simulation and analysis environments have been developed to support telerobotics research at the Langley Research Center. One is a high-fidelity, nonreal-time, interactive model called ROBSIM, which combines user-generated models of workspace environment, robots, and loads into a working system and simulates the interaction among the system components. Models include user-specified actuator, sensor, and control parameters, as well as kinematic and dynamic characteristics. Kinematic, dynamic, and response analyses can be selected, with system configuration, task trajectories, and arm states displayed using computer graphics. The second environment is a real-time, manned Telerobotic Systems Simulation (TRSS) which uses the facilities of the Intelligent Systems Research Laboratory (ISRL). It utilizes a hierarchical structure of functionally distributed computers communicating over both parallel and high-speed serial data paths to enable studies of advanced telerobotic systems. Multiple processes perform motion planning, operator communications, forward and inverse kinematics, control/sensor fusion, and I/O processing while communicating via common memory. Both ROBSIM and TRSS are discussed, including their capabilities, status, and future plans. Also described are the architecture of ISRL and recent telerobotic system studies in ISRL.

  13. The ISOPHOT Mapping Simulation System

    NASA Astrophysics Data System (ADS)

    Gabriel, C.; Hur, M.

    2002-12-01

    From version 9.0 onwards, the ISOPHOT Interactive Analysis (PIA) package offers its users an integrated mapping simulation system, capable of generating sky images including several point / extended sources on a flat / gradient background, simulating what ISOPHOT would have recorded under certain instrument and spacecraft raster configurations. While the benefits of performing simulations for assessing the efficiency, accuracy, confusion level, etc., of different mapping algorithms and deconvolution techniques in and outside PIA are mostly of interest to calibrators and instrument specialists, it is also very important for a general observer because this highly user-friendly system provides the possibility of simulating his/her observation by matching the selected observing mode.
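
    The core of such a mapping simulation, leaving the instrument and raster model aside entirely, is the synthesis of a source-plus-background image. The few lines below place Gaussian point sources on a gradient background; all shapes, positions and fluxes are invented illustration values, not PIA's model.

        # Tiny analogue of a sky-image simulation: Gaussian point sources on
        # a gradient background. Purely illustrative.
        import numpy as np

        ny, nx = 64, 64
        y, x = np.mgrid[0:ny, 0:nx]
        sky = 0.02 * x                                  # gradient background
        for (cy, cx, flux) in [(20, 30, 50.0), (45, 10, 30.0)]:  # sources
            sky += flux * np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * 2.0 ** 2))
        print(sky.max(), sky.sum())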

  14. Calculation of 239Pu fission observables in an event-by-event simulation

    SciTech Connect

    Vogt, R; Randrup, J; Pruet, J; Younes, W

    2010-03-31

    The increased interest in more exclusive fission observables has demanded more detailed models. We describe a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including any interesting correlations. The various model assumptions are described and the potential utility of the model is illustrated. As a concrete example, we use formal statistical methods, experimental data on neutron production in neutron-induced fission of 239Pu, along with FREYA, to develop quantitative insights into the relation between reaction observables and detailed microscopic aspects of fission. Current measurements of the mean number of prompt neutrons emitted in fission taken together with less accurate current measurements for the prompt post-fission neutron energy spectrum, up to the threshold for multi-chance fission, place remarkably fine constraints on microscopic theories.

  15. Assessment of extreme precipitation events over Amazon simulated by global climate models from HIGEM family

    NASA Astrophysics Data System (ADS)

    Custodio, M. D. S.; Ambrizzi, T.; Da Rocha, R.

    2015-12-01

    Increasing the horizontal resolution of climate models aims to improve simulation accuracy and the understanding of the non-linear processes involved in interactions between different spatial scales within the climate system. Until now, these interactions have not been well represented in low-horizontal-resolution GCMs. Variations of extreme climatic events have been described and analyzed in the scientific literature. In a global warming scenario, it is necessary to understand and explain extreme events and to know whether global models can represent them. The purpose of this study was to understand the impact of horizontal resolution in the high-resolution coupled and atmospheric global models of the HiGEM project in simulating atmospheric patterns and processes of interaction between spatial scales, and moreover to evaluate the performance of coupled and uncoupled versions of the High-Resolution Global Environmental Model in capturing the signal of interannual and intraseasonal variability of precipitation over the Amazon region. The results indicated that the grid refinement and ocean-atmosphere coupling contribute to a better representation of seasonal patterns, of both precipitation and temperature, over the Amazon region. Moreover, the climatic models analyzed represent the climatic characteristics of this region better than other models (regional and global). This indicates a breakthrough in the development of high resolution climate models. Both coupled and uncoupled models capture the observed signal of the ENSO and MJO oscillations, although with reversed phase in some cases. The interannual variability analysis showed that coupled simulations intensify the impact of ENSO in the Amazon. On the intraseasonal scale, although the simulations intensify this signal, the coupled models present larger similarities with observations than the atmospheric models for the extremes of precipitation. The simulation of ENSO in GCMs can be attributed to their high

  16. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.

  17. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  18. StratBAM: A Discrete-Event Simulation Model to Support Strategic Hospital Bed Capacity Decisions.

    PubMed

    Devapriya, Priyantha; Strömblad, Christopher T B; Bailey, Matthew D; Frazier, Seth; Bulger, John; Kemberling, Sharon T; Wood, Kenneth E

    2015-10-01

    The ability to accurately measure and assess current and potential health care system capacities is an issue of local and national significance. Recent joint statements by the Institute of Medicine and the Agency for Healthcare Research and Quality have emphasized the need to apply industrial and systems engineering principles to improving health care quality and patient safety outcomes. To address this need, a decision support tool was developed for planning and budgeting of current and future bed capacity, and evaluating potential process improvement efforts. The Strategic Bed Analysis Model (StratBAM) is a discrete-event simulation model created after a thorough analysis of patient flow and data from Geisinger Health System's (GHS) electronic health records. Key inputs include: timing, quantity and category of patient arrivals and discharges; unit-level length of care; patient paths; and projected patient volume and length of stay. Key outputs include: admission wait time by arrival source and receiving unit, and occupancy rates. Electronic health records were used to estimate parameters for probability distributions and to build empirical distributions for unit-level length of care and for patient paths. Validation of the simulation model against GHS operational data confirmed its ability to model real-world data consistently and accurately. StratBAM was successfully used to evaluate the system impact of forecasted patient volumes and length of stay in terms of patient wait times, occupancy rates, and cost. The model is generalizable and can be appropriately scaled for larger and smaller health care settings. PMID:26310949
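
    The core mechanics of a bed-capacity DES like the one described can be reduced to an event queue of arrivals and discharges, with patients waiting whenever all beds are occupied. The sketch below is a deliberately minimal single-unit analogue, not StratBAM itself; the arrival rate, mean length of stay and bed count are invented parameters rather than Geisinger data.

        # Minimal single-unit bed-capacity DES: exponential arrivals and
        # lengths of stay, patients queue when all beds are occupied.
        import heapq, random

        def simulate(n_beds=20, arrival_rate=0.8, mean_los=24.0, horizon=24 * 365):
            random.seed(0)
            t, free, waiting, wait_times = 0.0, n_beds, [], []
            events = [(random.expovariate(arrival_rate), "arrival")]
            while events:
                t, kind = heapq.heappop(events)
                if t > horizon:
                    break
                if kind == "arrival":
                    waiting.append(t)
                    heapq.heappush(events,
                                   (t + random.expovariate(arrival_rate), "arrival"))
                else:                        # a discharge frees a bed
                    free += 1
                while free and waiting:      # admit from the queue
                    free -= 1
                    wait_times.append(t - waiting.pop(0))
                    heapq.heappush(events,
                                   (t + random.expovariate(1 / mean_los), "discharge"))
            return sum(wait_times) / len(wait_times)

        print(f"mean admission wait: {simulate():.1f} h")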

  19. Development of a robust and automated infrasound event catalogue using the International Monitoring System

    NASA Astrophysics Data System (ADS)

    Arrowsmith, Stephen; Euler, Garrett; Marcillo, Omar; Blom, Philip; Whitaker, Rod; Randall, George

    2015-03-01

    Methods for detecting, associating and locating infrasound events recorded on the global International Monitoring System (IMS) infrasound network are presented. By using likelihood arguments, and reducing the use of empirically determined parameters, our techniques enable us to formally quantify the false alarm rate at both station and network levels, and to calculate confidence areas for event localization. We outline a new association technique that uses graph theory for associating arrivals at multiple spatially separated stations, and perform Monte Carlo simulations to quantify the performance of the scheme under different scenarios. The detection, association and location techniques are applied to 10 large events in the Reviewed Event Bulletin of the Comprehensive Nuclear-Test-Ban Treaty Organization. Of the 10 events, seven were automatically detected and associated. By analysing the three missed events, we identify improvements that might be made to the algorithms.

  20. Experience producing simulated events for the DZero experiment on the SAM-Grid

    SciTech Connect

    Garzoglio, G.; Terekhov, I.; Snow, J.; Jain, S.; Nishandar, A. (Texas U., Arlington)

    2004-12-01

    Most of the simulated events for the DZero experiment at Fermilab have been historically produced by the "remote" collaborating institutions. One of the principal challenges reported concerns the maintenance of the local software infrastructure, which is generally different from site to site. As the understanding of the distributed computing community over distributively owned and shared resources progresses, the adoption of grid technologies to address the production of Monte Carlo events for high energy physics experiments becomes increasingly interesting. SAM-Grid is a software system developed at Fermilab, which integrates standard grid technologies for job and information management with SAM, the data handling system of the DZero and CDF experiments. During the past few months, this grid system has been tailored for the Monte Carlo production of DZero. Since the initial phase of deployment, this experience has exposed an interesting series of requirements to the SAM-Grid services, the standard middleware, the resources and their management and to the analysis framework of the experiment. As of today, the inefficiency due to the grid infrastructure has been reduced to as little as 1%. In this paper, we present our statistics and the "lessons learned" in running large high energy physics applications on a grid infrastructure.

  1. System for detection of hazardous events

    DOEpatents

    Kulesz, James J.; Worley, Brian A.

    2006-05-23

    A system for detecting the occurrence of anomalies, includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  2. System For Detection Of Hazardous Events

    DOEpatents

    Kulesz, James J. [Oak Ridge, TN]; Worley, Brian A. [Knoxville, TN]

    2005-08-16

    A system for detecting the occurrence of anomalies, includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  3. Probabilities for large events in driven threshold systems

    NASA Astrophysics Data System (ADS)

    Rundle, John B.; Holliday, James R.; Graves, William R.; Turcotte, Donald L.; Tiampo, Kristy F.; Klein, William

    2012-08-01

    Many driven threshold systems display a spectrum of avalanche event sizes, often characterized by power-law scaling. An important problem is to compute probabilities of the largest events ("Black Swans"). We develop a data-driven approach to the problem by transforming to the event index frame, and relating this to Shannon information. For earthquakes, we find that the 12-month probability for magnitude m > 6 earthquakes in California increases from about 30% just after the last event to 40%-50% prior to the next one.

  4. Aided targeting system simulation evaluation

    NASA Technical Reports Server (NTRS)

    Demaio, Joe; Becker, Curtis

    1994-01-01

    Simulation research was conducted at the Crew Station Research and Development Facility on the effectiveness and ease of use of three targeting systems. A manual system required the aviator to scan a target array area with a simulated second generation forward looking infrared (FLIR) sensor, locate and categorize targets, and construct a target hand-off list. The interface between the aviator and the system was like that of an advanced scout helicopter (manual mode). Two aided systems detected and categorized targets automatically. One system used only the FLIR sensor and the second used FLIR fused with Longbow radar. The interface for both was like that of an advanced scout helicopter aided mode. Exposure time while performing the task was reduced substantially with the aided systems, with no loss of target hand-off list accuracy. The fused sensor system showed lower time to construct the target hand-off list and a slightly lower false alarm rate than the other systems. A number of issues regarding system sensitivity and criterion, and operator interface design are discussed.

  5. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  6. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    NASA Astrophysics Data System (ADS)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in event-based systems can lower system efficiency both in terms of data exchange rate and performance. In this paper, an integral-based event-triggering control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise, effectively reduces the flow of communication between plant and controller, and also improves output performance. Using a Lyapunov approach, stability in the mean square sense is proved. A simulated example illustrates the properties of our approach.
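
    The intuition behind integral-based triggering is that noise makes an instantaneous error threshold fire constantly, whereas integrating the error since the last event averages the noise out. The sketch below illustrates that idea only; the squared-error criterion, the budget and the noise level are hypothetical choices, not the triggering law from the paper.

        # Fire an event only when the integral of the squared error since the
        # last event exceeds a budget; robust to zero-mean measurement noise.
        import random

        def triggered_samples(errors, dt=0.01, budget=0.05):
            acc, events = 0.0, []
            for k, e in enumerate(errors):
                acc += e * e * dt          # accumulate since the last event
                if acc > budget:           # integral triggering condition
                    events.append(k)
                    acc = 0.0
            return events

        random.seed(0)
        noise = [random.gauss(0.0, 0.3) for _ in range(1000)]
        print(len(triggered_samples(noise)), "events out of 1000 samples")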

  7. An intelligent simulation training system

    NASA Technical Reports Server (NTRS)

    Biegel, John E.

    1990-01-01

    The Department of Industrial Engineering at the University of Central Florida, Embry-Riddle Aeronautical University and General Electric (SCSD) have been funded by the State of Florida to build an Intelligent Simulation Training System. The objective was and is to make the system generic except for the domain expertise. Researchers accomplished this objective in their prototype. The system is modularized and therefore it is easy to make any corrections, expansions or adaptations. The funding by the state of Florida has exceeded $3 million over the past three years and through the 1990 fiscal year. UCF has expended in excess of 15 work years on the project. The project effort has been broken into three major tasks. General Electric provides the simulation. Embry-Riddle Aeronautical University provides the domain expertise. The University of Central Florida has constructed the generic part of the system which is comprised of several modules that perform the tutoring, evaluation, communication, status, etc. The generic parts of the Intelligent Simulation Training Systems (ISTS) are described.

  8. Simulator verification techniques study. Integrated simulator self test system concepts

    NASA Technical Reports Server (NTRS)

    Montoya, G.; Wenglinski, T. H.

    1974-01-01

    Software and hardware requirements for implementing hardware self tests are presented in support of the development of training and procedures development simulators for the space shuttle program. Self test techniques for simulation hardware and the validation of simulation performance are stipulated. The requirements of an integrated simulator self-test system are analyzed. Readiness tests, fault isolation tests, and incipient fault detection tests are covered.

  9. Cascading events in linked ecological and socioeconomic systems

    USGS Publications Warehouse

    Peters, D.P.C.; Sala, O.E.; Allen, C.D.; Covich, A.; Brunson, M.

    2007-01-01

    Cascading events that start at small spatial scales and propagate non-linearly through time to influence larger areas often have major impacts on ecosystem goods and services. Events such as wildfires and hurricanes are increasing in frequency and magnitude as systems become more connected through globalization processes. We need to improve our understanding of these events in order to predict their occurrence, minimize potential impacts, and allow for strategic recovery. Here, we synthesize information about cascading events in systems located throughout the Americas. We discuss a variety of examples of cascading events that share a common feature: they are often driven by linked ecological and human processes across scales. In this era of globalization, we recommend studies that explicitly examine connections across scales and examine the role of connectivity among non-contiguous as well as contiguous areas. © The Ecological Society of America.

  10. Active magnetic bearing-supported rotor with misaligned cageless backup bearings: A dropdown event simulation model

    NASA Astrophysics Data System (ADS)

    Halminen, Oskari; Kärkkäinen, Antti; Sopanen, Jussi; Mikkola, Aki

    2015-01-01

    Active magnetic bearings (AMB) offer considerable benefits compared to regular mechanical bearings. On the other hand, they require backup bearings to avoid damage resulting from a failure in the component itself, or in the power or control system. During a rotor-bearing contact event - when the magnetic field has disappeared and the rotor drops onto the backup bearings - the structure of the backup bearings has an impact on the dynamic behavior of the rotor. In this paper, the dynamics of an active magnetic bearing-supported rotor during contact with backup bearings is studied with a simulation model. Modeling of the backup bearings is done using a comprehensive cageless ball bearing model. The elasticity of the rotor is described using the finite element method (FEM) and the degrees of freedom (DOF) of the system are reduced using component mode synthesis. Verification of the misaligned cageless backup bearings model is done by comparing the simulation results against the measurement results. The verified model with misaligned cageless backup bearings is found to correspond to the features of the real system.

  11. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    NASA Technical Reports Server (NTRS)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

    On February 1, 2010, U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes to the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved to be controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty regarding the Constellation program presented a major challenge to the DES team: continue the development of this program-of-record simulation while at the same time remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make it flexible enough to support possible future vehicles while at the same time being specific enough to support the program-of-record. This challenge was compounded by the fact that the model was being developed through the traditional DES process-orientation, which lacked the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure by identifying what was generic, finding natural logic break points, and standardizing the interlogic numbering system. The outcome of this work resulted in a model that not only was ready to be easily modified to support any future rocket programs, but also a model that was extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides builders of traditional process-oriented discrete

  12. Stochastic simulation in systems biology

    PubMed Central

    Székely, Tamás; Burrage, Kevin

    2014-01-01

    Natural systems are, almost by definition, heterogeneous: this can be either a boon or an obstacle to be overcome, depending on the situation. Traditionally, when constructing mathematical models of these systems, heterogeneity has typically been ignored, despite its critical role. However, in recent years, stochastic computational methods have become commonplace in science. They are able to appropriately account for heterogeneity; indeed, they are based around the premise that systems inherently contain at least one source of heterogeneity (namely, intrinsic heterogeneity). In this mini-review, we give a brief introduction to theoretical modelling and simulation in systems biology and discuss the three different sources of heterogeneity in natural systems. Our main topic is an overview of stochastic simulation methods in systems biology. There are many different types of stochastic methods. We focus on one group that has become especially popular in systems biology, biochemistry, chemistry and physics. These discrete-state stochastic methods do not follow individuals over time; rather they track only total populations. They also assume that the volume of interest is spatially homogeneous. We give an overview of these methods, with a discussion of the advantages and disadvantages of each, and suggest when each is more appropriate to use. We also include references to software implementations of them, so that beginners can quickly start using stochastic methods for practical problems of interest. PMID:25505503
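
    The best-known member of the discrete-state family surveyed here is Gillespie's direct method (the stochastic simulation algorithm), which tracks total population counts and jumps from one reaction event to the next with exponentially distributed waiting times. Below is a minimal sketch for a single degradation reaction; the rate constant and initial count are arbitrary illustration values.

        # Gillespie's direct method for the degradation reaction X -> 0
        # with rate constant c; tracks the total population count only.
        import random

        def ssa_decay(x0=100, c=0.5, t_end=10.0, seed=1):
            random.seed(seed)
            t, x, trajectory = 0.0, x0, [(0.0, x0)]
            while x > 0 and t < t_end:
                a = c * x                      # total propensity
                t += random.expovariate(a)     # exponential waiting time
                x -= 1                         # fire the reaction
                trajectory.append((t, x))
            return trajectory

        traj = ssa_decay()
        print(traj[:3], "...", traj[-1])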

  13. An analysis of strong wind events simulated in a GCM near Casey in the Antarctic

    SciTech Connect

    Murphy, B.F.; Simmonds, I.

    1993-02-01

    Strong wind events occurring near Casey (Antarctica) in a long July GCM simulation have been studied to determine the relative roles played by the synoptic situation and the katabatic flow in producing these episodes. It has been found that the events are associated with strong katabatic and strong gradient flow operating together. Both components are found to increase threefold on average for these strong winds, and although the geostrophic flow is the stronger, it rarely produces strong winds without katabatic flow becoming stronger than it is in the mean. The two wind components do not flow in the same direction; indeed, there is some cancellation between them, since katabatic flow acts in a predominant downslope direction, while the geostrophic wind acts across slope. The stronger geostrophic flow is associated with higher-than-average pressures over the continent and the approach of a strong cyclonic system toward the coast and a blocking system downstream. The anomalous synoptic patterns leading up to the occasions display a strong wavenumber 4 structure. The very strong katabatic flow appears to be related to the production of a supply of cold air inland from Casey by the stronger-than-average surface temperature inversions inland a few days before the strong winds occur. The acceleration of this negatively buoyant air mass down the steep, ice-sheet escarpment results in strong katabatic flow near the coast. 24 refs., 11 figs.

  14. The Impact of Land Cover Change on a Simulated Storm Event in the Sydney Basin

    NASA Astrophysics Data System (ADS)

    Gero, A. F.; Pitman, A. J.

    2006-02-01

    The Regional Atmospheric Modeling System (RAMS) was run at a 1-km grid spacing over the Sydney basin in Australia to assess the impact of land cover change on a simulated storm event. The simulated storm used NCEP NCAR reanalysis data, first with natural (i.e., pre-European settlement in 1788) land cover and then with satellite-derived land cover representing Sydney's current land use pattern. An intense convective storm develops in the model in close proximity to Sydney's dense urban central business district under current land cover. The storm is absent under natural land cover conditions. A detailed investigation of why the change in land cover generates a storm was performed using factorial analysis, which revealed the storm to be sensitive to the presence of agricultural land in the southwest of the domain. This area interacts with the sea breeze and affects the horizontal divergence and moisture convergence—the triggering mechanisms of the storm. The existence of the storm over the dense urban area of Sydney is therefore coincidental. The results herein support efforts to develop parameterization of urban surfaces in high-resolution simulations of Sydney's meteorological environment but also highlight the need to improve the parameterization of other types of land cover change at the periphery of the urban area, given that these types dominate the explanation of the results.

  15. Solar system events at high spatial resolution

    SciTech Connect

    Baines, K H; Gavel, D T; Getz, A M; Gibbartd, S G; MacIntosh, B; Max, C E; McKay, C P; Young, E F; de Pater, I

    1999-02-19

    Until relatively recent advances in technology, astronomical observations from the ground were limited in image resolution by the blurring effects of earth's atmosphere. The blur extent, ranging typically from 0.5 to 2 seconds of arc at the best astronomical sites, precluded ground-based observations of the details of the solar system's moons, asteroids, and outermost planets. With the maturing of a high resolution image processing technique called speckle imaging, the resolution limitation of the atmosphere can now be largely overcome. Over the past three years we have used speckle imaging to observe Titan, a moon of Saturn with an atmospheric density comparable to Earth's; Io, the volcanically active innermost moon of Jupiter; and Neptune, a gas giant outer planet which has continually changing planet-encircling storms. These observations were made at the world's largest telescope, the Keck telescope in Hawaii, and represent the highest resolution infrared images of these objects ever taken.

  16. Performance and efficiency of geotextile-supported erosion control measures during simulated rainfall events

    NASA Astrophysics Data System (ADS)

    Obriejetan, Michael; Rauch, Hans Peter; Florineth, Florin

    2013-04-01

    Erosion control systems consisting of technical and biological components are widely accepted and proven to work well if installed properly with regard to site-specific parameters. A wide range of implementation measures for this specific protection purpose exists, and new, in particular technical, solutions are constantly introduced into the market. Nevertheless, the vegetation aspects of erosion control measures are frequently disregarded and should be given more consideration against the backdrop of developing and implementing adaptation strategies in an environment altered by climate change. Technical auxiliaries such as the geotextiles typically used for slope protection (nettings, blankets, turf reinforcement mats, etc.) address specific features, and given their structural and material diversity, differing effects on sediment yield, surface runoff and vegetation development seem evident. Nevertheless, there is a knowledge gap concerning the mutual interaction between technical and biological components, and specific comparable data on the erosion-reducing effects of technical-biological erosion protection systems are insufficient. In this context, an experimental arrangement was set up to study the correlated influences of geotextiles and vegetation and to determine their (combined) effects on surface runoff and soil loss during simulated heavy rainfall events. Sowing vessels filled with topsoil serve as testing facilities; various organic and synthetic geotextiles were applied, and a reliable drought-resistant seed mixture was used. Regular vegetation monitoring as well as two rainfall simulation runs with four repetitions of each variant were conducted. A portable rainfall simulator with a standardized rainfall intensity of 240 mm h-1 and a rainfall duration of three minutes was used to stress these systems at different stages of plant development at an inclination of 30 degrees. First results show

  17. Improving outpatient phlebotomy service efficiency and patient experience using discrete-event simulation.

    PubMed

    Yip, Kenneth; Pang, Suk-King; Chan, Kui-Tim; Chan, Chi-Kuen; Lee, Tsz-Leung

    2016-08-01

    Purpose - The purpose of this paper is to present a simulation modeling application to reconfigure the outpatient phlebotomy service of an acute regional and teaching hospital in Hong Kong, with an aim to improve service efficiency, shorten patient queuing time and enhance workforce utilization. Design/methodology/approach - The system was modeled as an inhomogeneous Poisson process and a discrete-event simulation model was developed to simulate the current setting, and to evaluate how various performance metrics would change if switched from a decentralized to a centralized model. Variations were then made to the model to test different workforce arrangements for the centralized service, so that managers could decide on the service's final configuration via an evidence-based and data-driven approach. Findings - This paper provides empirical insights about the relationship between staffing arrangement and system performance via a detailed scenario analysis. One particular staffing scenario was chosen by managers as it was considered to strike the best balance between performance and workforce schedule. The resulting centralized phlebotomy service was successfully commissioned. Practical implications - This paper demonstrates how analytics could be used for operational planning at the hospital level. The authors show that a transparent and evidence-based scenario analysis, made available through analytics and simulation, greatly facilitates management and clinical stakeholders in arriving at the ideal service configuration. Originality/value - The authors provide a robust method for evaluating the relationship between workforce investment, queuing reduction and workforce utilization, which is crucial for managers when deciding the delivery model for any outpatient-related service. PMID:27477930
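
    As a concrete illustration of the modelling approach described in this record, the sketch below builds a minimal discrete-event queue with inhomogeneous Poisson arrivals (generated by thinning). All parameters, the arrival-rate profile, the mean draw time and the station count, are invented stand-ins rather than the hospital's data.

      import heapq, random

      def lam(t):  # assumed time-varying arrival rate (patients/min), peaking mid-morning
          return 2.0 if 60 <= t < 180 else 0.8

      def simulate(n_stations=6, horizon=480, mean_draw=4.0, seed=1):
          rng, lam_max = random.Random(seed), 2.0
          # Inhomogeneous Poisson arrivals via thinning against the bound lam_max.
          arrivals, t = [], 0.0
          while True:
              t += rng.expovariate(lam_max)
              if t >= horizon:
                  break
              if rng.random() < lam(t) / lam_max:
                  arrivals.append(t)
          events = [(a, "arrive") for a in arrivals]
          heapq.heapify(events)
          free, queue, waits = n_stations, [], []
          while events:
              t, kind = heapq.heappop(events)
              if kind == "arrive":
                  queue.append(t)
              else:  # a phlebotomy station becomes free
                  free += 1
              while free and queue:  # start service for waiting patients
                  waits.append(t - queue.pop(0))
                  free -= 1
                  heapq.heappush(events, (t + rng.expovariate(1 / mean_draw), "depart"))
          return sum(waits) / len(waits)

      print(f"mean wait: {simulate():.1f} min")

    Scenario analysis in the spirit of the paper then amounts to sweeping n_stations (and the rate profile) and comparing the resulting queue metrics.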

  18. Optimized Hypervisor Scheduler for Parallel Discrete Event Simulations on Virtual Machine Platforms

    SciTech Connect

    Yoginath, Srikanth B; Perumalla, Kalyan S

    2013-01-01

    With the advent of virtual machine (VM)-based platforms for parallel computing, it is now possible to execute parallel discrete event simulations (PDES) over multiple virtual machines, in contrast to executing in native mode directly over hardware as has traditionally been done over the past decades. While mature VM-based parallel systems now offer new, compelling benefits such as serviceability, dynamic reconfigurability and overall cost effectiveness, the runtime performance of parallel applications can be significantly affected. In particular, most VM-based platforms are optimized for general workloads, but PDES execution exhibits unique dynamics significantly different from other workloads. Here we first present results from experiments that highlight the gross deterioration of the runtime performance of VM-based PDES simulations when executed using traditional VM schedulers, quantitatively showing the poor scaling properties of the scheduler as the number of VMs is increased. The mismatch is fundamental in nature in the sense that any fairness-based VM scheduler implementation would exhibit this mismatch with PDES runs. We also present a new scheduler optimized specifically for PDES applications, and describe its design and implementation. Experimental results obtained from running PDES benchmarks (PHOLD and vehicular traffic simulations) over VMs show over an order of magnitude improvement in the run time of the PDES-optimized scheduler relative to the regular VM scheduler, with over a 20x reduction in run time of simulations using up to 64 VMs. The observations and results are timely in the context of emerging systems such as cloud platforms and VM-based high performance computing installations, highlighting to the community the need for PDES-specific support, and the feasibility of significantly reducing the runtime overhead for scalable PDES on VM platforms.
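
    PHOLD, the benchmark named above, is the standard synthetic PDES workload: a fixed population of events circulates among logical processes (LPs), and every handled event schedules a successor at a random LP with a random timestamp increment. A minimal sequential sketch (LP count, population and mean increment are invented) conveys the event-processing pattern that a VM scheduler must keep synchronized across VMs:

      import heapq, random

      def phold(n_lps=8, population=64, end_time=1000.0, seed=7):
          rng = random.Random(seed)
          # Initial event population, spread over random LPs.
          evq = [(rng.expovariate(1.0), rng.randrange(n_lps)) for _ in range(population)]
          heapq.heapify(evq)
          processed = 0
          while evq:
              t, lp = heapq.heappop(evq)
              if t >= end_time:
                  break
              processed += 1
              # Each handled event schedules one successor at a random LP
              # with an exponentially distributed timestamp increment.
              heapq.heappush(evq, (t + rng.expovariate(1.0), rng.randrange(n_lps)))
          return processed

      print(phold(), "events processed")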

  19. Event-triggered consensus tracking of multi-agent systems with Lur'e nonlinear dynamics

    NASA Astrophysics Data System (ADS)

    Huang, Na; Duan, Zhisheng; Wen, Guanghui; Zhao, Yu

    2016-05-01

    In this paper, the distributed consensus tracking problem for networked Lur'e systems is investigated based on event-triggered information interactions. An event-triggered control algorithm is designed with the advantages of reducing controller update frequency and sensor energy consumption. Using the S-procedure and the Lyapunov functional method, sufficient conditions are derived to guarantee that consensus tracking is achieved under a directed communication topology. Meanwhile, it is shown that Zeno behaviour of the triggering time sequences is excluded under the proposed event-triggered rule. Finally, numerical simulations on coupled Chua's circuits are performed to illustrate the effectiveness of the theoretical algorithms.
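
    The flavour of such a triggering rule can be shown on a deliberately simplified system: single-integrator agents on a path graph rather than Lur'e dynamics, with an assumed threshold that decays to a positive floor (the floor is what rules out Zeno behaviour in this toy version). Controllers use only the last broadcast states:

      import numpy as np

      n, dt, T = 5, 1e-3, 8.0
      L = np.array([[ 1, -1,  0,  0,  0],
                    [-1,  2, -1,  0,  0],
                    [ 0, -1,  2, -1,  0],
                    [ 0,  0, -1,  2, -1],
                    [ 0,  0,  0, -1,  1]], float)   # Laplacian of a path graph
      x = np.array([4.0, -1.0, 2.5, 0.0, -3.0])     # initial states
      xhat = x.copy()                               # last broadcast states
      c0, c1, events = 0.05, 0.5, 0
      for k in range(int(T / dt)):
          t = k * dt
          x += dt * (-L @ xhat)                     # control uses broadcast values only
          err = np.abs(x - xhat)                    # deviation since last broadcast
          thr = c0 + c1 * np.exp(-0.8 * t)          # assumed threshold with positive floor c0
          trig = err > thr
          xhat[trig] = x[trig]                      # triggered agents rebroadcast
          events += int(trig.sum())
      print(f"spread {x.max() - x.min():.3f} after {events} events "
            f"vs {n * int(T / dt)} periodic updates")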

  20. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering at the Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. The propulsion system design environment is created with commercially available software called iSIGHT, which is a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and to facilitate information transfer among the various codes. The resulting modeling and simulation (M&S) environment, in conjunction with the response surface method, provides the designer/decision-maker an analytical means to examine the entire design space from a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.
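
    The response-surface step mentioned in this abstract can be pictured as fitting a cheap quadratic surrogate to a handful of expensive cycle-analysis runs and then querying the surrogate across the design space. Everything below (the two design variables, the objective, the sample count) is a synthetic stand-in, not the iSIGHT setup:

      import numpy as np

      rng = np.random.default_rng(0)
      # 30 "cycle analysis" runs over two design variables (e.g. OPR, fan pressure ratio).
      X = rng.uniform([20.0, 1.2], [40.0, 1.8], size=(30, 2))
      y = 0.8 * X[:, 0] - 5.0 * (X[:, 1] - 1.5) ** 2 + rng.normal(0, 0.1, 30)  # stand-in metric

      def design(X):  # quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2
          x1, x2 = X[:, 0], X[:, 1]
          return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

      beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)
      print("surrogate prediction:", design(np.array([[32.0, 1.5]])) @ beta)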

  1. MERTIS: system theory and simulation

    NASA Astrophysics Data System (ADS)

    Paproth, Carsten; Säuberlich, Thomas; Jahn, Herbert; Helbert, Jörn

    2010-09-01

    The deep-space ESA mission BepiColombo to planet Mercury will carry the advanced infrared remote sensing instrument MERTIS (MErcury Radiometer and Thermal infrared Imaging Spectrometer). The mission has the goal to explore the planet's interior and surface structure and its environment. With MERTIS, investigations of Mercury's surface layer within a spectral range of 7-14 μm shall be conducted to specify and map Mercury's mineralogical composition with a spatial resolution of 500 m. Due to the limited mass and power budget, the micro-bolometer detector array will only be temperature-stabilized and will not be cooled. A theoretical description of the instrument is necessary to estimate its performance, especially the signal-to-noise ratio. For that purpose, theoretical models are derived from system theory. For a better evaluation and understanding of the instrument performance, simulations are performed to compute the passage of the radiation of a hypothetical mineralogical surface composition through the optical system, the influence of the inner instrument radiation, and the conversion of the overall radiation into a detector voltage and digital output signal. The results of the simulation can support the optimization of the instrument parameters and could also assist the analysis of gathered scientific data. The simulation tool can be used as well for performance estimations of MERTIS-like systems for future projects.

  2. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2010-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring. WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed WP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce a realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  3. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2011-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring, WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed NP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce e realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  4. A Nonlinear Propulsion System Simulation Technique for Piloted Simulators

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.

    1981-01-01

    In the past, propulsion system simulations used in flight simulators have been extremely simple. This resulted in a loss of simulation realism, since significant engine and aircraft interactions were neglected and important internal engine parameters were not computed. More detailed propulsion system simulations are needed to permit evaluations of modern aircraft propulsion systems in a simulated flight environment. A real-time digital simulation technique has been developed which provides the capabilities needed to evaluate propulsion system performance and aircraft system interaction on manned flight simulators. A parameter correlation technique is used with real and pseudo dynamics in a stable integration convergence loop. The technique has been applied to a multivariable propulsion system for use in a piloted NASA flight simulator program. Cycle time is 2.0 ms on a Univac 1110 computer and 5.7 ms on the simulator computer, a Xerox Sigma 8. The model is stable and accurate with time steps up to 50 ms. The program evaluated the simulation technique and the propulsion system digital control. The simulation technique and model used in that program are described and results from the simulation are presented.

  5. Network-based event-triggered filtering for Markovian jump systems

    NASA Astrophysics Data System (ADS)

    Wang, Huijiao; Shi, Peng; Agarwal, Ramesh K.

    2016-06-01

    The problem of event-triggered H∞ filtering for networked Markovian jump systems is studied in this paper. A dynamic discrete event-triggered scheme is designed to choose the transmitted data for different Markovian jumping modes. A time-delay modelling method is employed to cast the event-triggered scheme and network-induced behaviour, such as transmission delay, data packet dropout and disorder, into a networked Markovian time-delay jump system. Furthermore, a sufficient condition is derived to guarantee that the resulting filtering error system is stochastically stable with a prescribed performance index. A co-design method for the H∞ filter and the event-triggered scheme is then proposed. The effectiveness and potential of the theoretical results are illustrated by a simulation example.

  6. Hierarchical Discrete Event Supervisory Control of Aircraft Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Yasar, Murat; Tolani, Devendra; Ray, Asok; Shah, Neerav; Litt, Jonathan S.

    2004-01-01

    This paper presents a hierarchical application of Discrete Event Supervisory (DES) control theory for intelligent decision and control of a twin-engine aircraft propulsion system. A dual layer hierarchical DES controller is designed to supervise and coordinate the operation of two engines of the propulsion system. The two engines are individually controlled to achieve enhanced performance and reliability, necessary for fulfilling the mission objectives. Each engine is operated under a continuously varying control system that maintains the specified performance and a local discrete-event supervisor for condition monitoring and life extending control. A global upper level DES controller is designed for load balancing and overall health management of the propulsion system.

  7. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near optimal control of uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor are tuned aperiodically at the triggered instants. An adaptive event-trigger condition to decide the trigger instants is derived. Thus, a suitable number of events are generated to ensure a desired accuracy of approximation. A near optimal performance is achieved without using value and/or policy iterations. A detailed analysis of nontrivial inter-event times, with an explicit formula showing the reduction in computation, is also derived. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. Simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP. PMID:26285220

  8. Simulation of debris flow events in Sicily by cellular automata model SCIDDICA_SS3

    NASA Astrophysics Data System (ADS)

    Cancelliere, A.; Lupiano, V.; Peres, D. J.; Stancanelli, L.; Avolio, M.; Foti, E.; Di Gregorio, S.

    2013-12-01

    Debris flow models are widely used for hazard mapping and for evaluating the effectiveness of risk mitigation measures. Several models analyze the dynamics of debris flow runout by solving partial differential equations. When using such models, difficulties arise in estimating kinematic geotechnical soil parameters for real phenomena. In order to overcome such difficulties, alternative semi-empirical approaches can be employed, such as macroscopic Cellular Automata (CA). In particular, for CA simulation purposes, the runout of debris flows emerges from local interactions in a dynamical system subdivided into elementary parts, whose state evolves within a spatial and temporal discretum. The attributes of each cell (substates) describe its physical characteristics. For computational reasons, the natural phenomenon is split into a number of elementary processes, whose proper composition makes up the CA transition function. By simultaneously applying this function to all the cells, the evolution of the phenomenon can be simulated in terms of modifications of the substates. In this study, we present an application of the macroscopic semi-empirical CA model SCIDDICA_SS3 to the Peloritani Mountains area in Sicily, Italy. The model was applied using detailed data from the 1 October 2009 debris flow event, which was triggered by a rainfall event of about 250 mm falling in 9 hours and caused the death of 37 persons. This region is characterized by river valleys with large hillslope angles (30°-60°), catchment basins of small extent (0.5-12 km2) and soil composed of metamorphic material, which is easily eroded. CA usage implies a calibration phase, which identifies an optimal set of parameters capable of adequately reproducing the considered case, and a validation phase, which tests the model on a sufficient (and different) number of cases similar in terms of physical and geomorphological properties. The performance of the model can be measured in terms of a fitness
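
    A skeletal transition function in the macroscopic-CA spirit described here might carry two substates, bedrock altitude and debris thickness, and move the mobile fraction of the debris toward lower neighbouring cells. The grid, the adherence threshold and the crude outflow rule below are invented (with periodic edges for brevity) and are not SCIDDICA_SS3's calibrated minimisation algorithm:

      import numpy as np

      alt = np.linspace(60, 0, 20)[:, None] * np.ones((20, 20))  # sloping terrain (substate 1)
      deb = np.zeros((20, 20)); deb[0, 10] = 5.0                 # debris thickness (substate 2)
      adh = 0.1                                                  # debris that cannot move

      def step(alt, deb):
          h = alt + deb
          mobile = np.maximum(deb - adh, 0.0)
          inflow = np.zeros_like(deb)
          for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:      # von Neumann neighbours
              nb = np.roll(h, (dx, dy), (0, 1))
              drop = np.maximum(h - nb, 0.0)
              flow = np.minimum(mobile / 4, drop / 2)            # crude outflow rule
              inflow += np.roll(flow, (-dx, -dy), (0, 1))
              deb = deb - flow
          return deb + inflow

      for _ in range(50):
          deb = step(alt, deb)
      print("lowest row reached:", np.where(deb.sum(axis=1) > 0.01)[0].max())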

  9. Characteristics and dependencies of error in satellite-based flood event simulations

    NASA Astrophysics Data System (ADS)

    Mei, Yiwen; Nikolopoulos, Efthymios I.; Anagnostou, Emmanouil N.; Zoccatelli, Davide; Borga, Marco

    2016-04-01

    The error in satellite-precipitation-driven flood simulations over complex terrain is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated with systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e. rainfall and runoff cumulative depth and time series shape). Overall, the error characteristics exhibit a dependency on flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both the systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and for the rain flood events with high runoff coefficients. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
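
    One common way to split matched event pairs into systematic and random components, in the spirit of the error metrics above, is to work with log ratios of satellite-driven to reference depths; the geometric-mean bias captures the systematic part and the log-scatter the random part. The decomposition and the event depths below are illustrative assumptions, not the paper's exact metrics:

      import numpy as np

      ref = np.array([42.0, 88.0, 15.0, 60.0, 33.0])  # reference event depths (mm)
      sat = np.array([30.0, 70.0, 14.0, 41.0, 29.0])  # satellite-driven estimates (mm)
      log_ratio = np.log(sat / ref)
      systematic = np.exp(log_ratio.mean())           # multiplicative bias
      random_err = np.exp(log_ratio.std())            # spread around the bias
      print(f"systematic multiplier {systematic:.2f}, random spread x/{random_err:.2f}")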

  10. Simulation System for Training in Laparoscopic Surgery

    NASA Technical Reports Server (NTRS)

    Basdogan, Cagatay; Ho, Chih-Hao

    2003-01-01

    A computer-based simulation system creates a visual and haptic virtual environment for training a medical practitioner in laparoscopic surgery. Heretofore, it has been common practice to perform training in partial laparoscopic surgical procedures by use of a laparoscopic training box that encloses a pair of laparoscopic tools, objects to be manipulated by the tools, and an endoscopic video camera. However, the surgical procedures simulated by use of a training box are usually poor imitations of the actual ones. The present computer-based system improves training by presenting a more realistic simulated environment to the trainee. The system includes a computer monitor that displays a real-time image of the affected interior region of the patient, showing laparoscopic instruments interacting with organs and tissues, as would be viewed by use of an endoscopic video camera and displayed to a surgeon during a laparoscopic operation. The system also includes laparoscopic tools that the trainee manipulates while observing the image on the computer monitor (see figure). The instrumentation on the tools consists of (1) position and orientation sensors that provide input data for the simulation and (2) actuators that provide force feedback to simulate the contact forces between the tools and tissues. The simulation software includes components that model the geometries of surgical tools, components that model the geometries and physical behaviors of soft tissues, and components that detect collisions between them. Using the measured positions and orientations of the tools, the software detects whether they are in contact with tissues. In the event of contact, the deformations of the tissues and contact forces are computed by use of the geometric and physical models. The image on the computer screen shows tissues deformed accordingly, while the actuators apply the corresponding forces to the distal ends of the tools. For the purpose of demonstration, the system has been set

  11. A systems neurophysiology approach to voluntary event coding.

    PubMed

    Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian

    2016-07-15

    Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding (TEC)" being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding predominantly included stimulus categorization, feature unbinding and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). At the systems neurophysiology level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but not the other processes. The perceptual categorization stage appears central for voluntary event-file coding. PMID:27153981

  12. Simulation of moisture in alpine rock walls during freeze-thaw events

    NASA Astrophysics Data System (ADS)

    Schnepfleitner, Harald; Rode, Matthias; Sass, Oliver

    2014-05-01

    Rock moisture distribution during freeze-thaw events is the key to understanding frost weathering and subsequent rockfall. Data on moisture levels of natural rock walls are scarce and difficult to measure. An innovative and cheap way to avoid these problems is the use of simulation calculations; although they are an abstraction of the real system, they are widely used in natural science. A novel way to simulate moisture in natural rock walls is the use of the software WUFI, which was developed to understand the moisture behavior of building materials. However, the enormous know-how behind these commercial applications has not been exploited for geomorphological research to date. The necessary input data for the simulation are climate data at hourly resolution (temperature, rainfall, wind, irradiation) and material properties (porosity, sorption and diffusivity parameters) of the prevailing rock. Two different regions were analysed, the Gesäuse (Johnsbachtal: 700 m, limestone and dolomite) and the Sonnblick (3000 m, gneiss and granite). We aimed to compare the two regions in terms of general susceptibility to frost weathering, as well as the influence of aspect, inclination and rock parameters and the possible impact of climate change. The calculated 1D moisture profiles and the temporal progress of rock moisture, in combination with temperature data, allow the detection of possible periods of active weathering and resulting rockfalls. These results were analyzed on the basis of two different frost weathering theories: the "classical" frost shattering theory (requiring a high number of freeze-thaw cycles and a pore saturation of 90%) and the segregation ice theory (requiring a long freezing period and a pore saturation threshold of approx. 60%). An additional critical factor for both theories was the frost depth, namely the duration of the "frost cracking window" (between -3 and -10°C) at each site. The results show that in both areas, north-facing rocks are
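
    The post-processing implied here, checking simulated temperature and moisture series against a weathering criterion, reduces to a simple filter over hourly values. The series below are synthetic, and the thresholds are the frost-cracking window and the approximate 60% saturation limit quoted in the abstract:

      import numpy as np

      hours = np.arange(24 * 30)                        # one synthetic month
      T_rock = -6 + 8 * np.sin(2 * np.pi * hours / 24)  # assumed hourly rock temperature (degC)
      sat = np.full(hours.shape, 0.65)                  # assumed pore saturation

      in_window = (T_rock <= -3) & (T_rock >= -10)      # "frost cracking window"
      active = in_window & (sat >= 0.60)                # segregation-ice criterion
      print(f"{active.sum()} potentially active hours out of {len(hours)}")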

  13. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291
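
    The decoupling idea, network-level event propagation separated from neuron-internal dynamics, can be caricatured with a priority queue of spike events driving per-neuron state updates. The three-neuron chain, delta synapses without leak, and all constants below are invented and far simpler than NEVESIM's model classes:

      import heapq

      W = {0: [(1, 1.2, 1.0)], 1: [(2, 1.2, 2.0)], 2: []}  # src -> (dst, weight, delay)
      v = {0: 0.0, 1: 0.0, 2: 0.0}                         # membrane states
      THRESH = 1.0
      events = [(0.0, -1, 0, 1.5)]                         # (time, src, dst, weight): external kick
      spikes = []
      while events:
          t, src, dst, w = heapq.heappop(events)
          v[dst] += w                                      # neuron-internal update
          if v[dst] >= THRESH:                             # spike becomes a network-level event
              spikes.append((t, dst))
              v[dst] = 0.0
              for dst2, w2, delay in W[dst]:
                  heapq.heappush(events, (t + delay, dst, dst2, w2))
      print(spikes)                                        # [(0.0, 0), (1.0, 1), (3.0, 2)]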

  14. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  15. Discrete-event simulation of a wide-area health care network.

    PubMed Central

    McDaniel, J G

    1995-01-01

    OBJECTIVE: Predict the behavior and estimate the telecommunication cost of a wide-area message store-and-forward network for health care providers that uses the telephone system. DESIGN: A tool with which to perform large-scale discrete-event simulations was developed. Network models for star and mesh topologies were constructed to analyze the differences in performances and telecommunication costs. The distribution of nodes in the network models approximates the distribution of physicians, hospitals, medical labs, and insurers in the Province of Saskatchewan, Canada. Modeling parameters were based on measurements taken from a prototype telephone network and a survey conducted at two medical clinics. Simulation studies were conducted for both topologies. RESULTS: For either topology, the telecommunication cost of a network in Saskatchewan is projected to be less than $100 (Canadian) per month per node. The estimated telecommunication cost of the star topology is approximately half that of the mesh. Simulations predict that a mean end-to-end message delivery time of two hours or less is achievable at this cost. A doubling of the data volume results in an increase of less than 50% in the mean end-to-end message transfer time. CONCLUSION: The simulation models provided an estimate of network performance and telecommunication cost in a specific Canadian province. At the expected operating point, network performance appeared to be relatively insensitive to increases in data volume. Similar results might be anticipated in other rural states and provinces in North America where a telephone-based network is desired. PMID:7583646

  16. Estimating Flood Quantiles on the Basis of Multi-Event Rainfall Simulation - Case Study

    NASA Astrophysics Data System (ADS)

    Jarosińska, Elżbieta; Pierzga, Katarzyna

    2015-12-01

    This paper presents an approach to estimating the probability distribution of annual discharges Q based on rainfall-runoff modelling using multiple rainfall events. The approach rests on prior knowledge of the probability distribution of annual maximum daily rainfall totals P in a natural catchment, random disaggregation of the totals into hourly values, and rainfall-runoff modelling. The presented Multi-Event Simulation of Extreme Flood (MESEF) method combines the design-event method, based on single-rainfall-event modelling, with the continuous simulation method used for estimating maximum discharges of a given exceedance probability from rainfall-runoff models. In the paper, the flood quantiles were estimated using the MESEF method and then compared to the flood quantiles estimated using a classical statistical method based on observed data.
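
    A minimal sketch of the MESEF-style chain, sample an annual-maximum daily total, randomly disaggregate it into hourly depths, and push each realisation through a rainfall-runoff model, could look as follows. The Gumbel parameters, the Dirichlet disaggregation weights, and the stub runoff model are all assumed:

      import numpy as np

      rng = np.random.default_rng(42)

      def disaggregate(daily_total_mm):
          w = rng.dirichlet(np.full(24, 0.3))  # spiky hourly weight pattern summing to 1
          return daily_total_mm * w            # 24 hourly depths

      def runoff_peak(hourly):                 # stand-in for the rainfall-runoff model
          return 0.6 * hourly.max() + 0.1 * hourly.sum()

      peaks = [runoff_peak(disaggregate(rng.gumbel(60, 15))) for _ in range(2000)]
      print("simulated 1% exceedance peak:", np.quantile(peaks, 0.99))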

  17. Simulation of linear mechanical systems

    NASA Technical Reports Server (NTRS)

    Sirlin, S. W.

    1993-01-01

    A dynamics and controls analyst is typically presented with a structural dynamics model and must perform various input/output tests and design control laws. The required time/frequency simulations need to be done many times as models change and control designs evolve. This paper examines some simple ways that open and closed loop frequency and time domain simulations can be done using the special structure of the system equations usually available. Routines were developed to run under Pro-Matlab in a mixture of the Pro-Matlab interpreter and FORTRAN (using the .mex facility). These routines are often orders of magnitude faster than trying the typical 'brute force' approach of using built-in Pro-Matlab routines such as bode. This makes the analyst's job easier since not only does an individual run take less time, but much larger models can be attacked, often allowing the whole model reduction step to be eliminated.
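
    The speed-up described here comes from structure: in modal coordinates the transfer function is a sum of decoupled second-order terms, so each frequency point costs a few elementwise operations instead of the dense solve a generic routine performs. The modal data below are invented:

      import numpy as np

      om_n = np.array([2.0, 9.0, 31.0])    # modal natural frequencies (rad/s)
      zeta = np.array([0.02, 0.01, 0.03])  # modal damping ratios
      b = np.array([1.0, 0.4, 0.1])        # modal input participation
      c = np.array([0.7, 0.5, 0.2])        # modal output participation

      w = np.logspace(-1, 2, 400)          # frequency grid
      # H(jw) = sum_k c_k b_k / (om_k^2 - w^2 + 2j zeta_k om_k w), vectorised over w
      den = om_n**2 - w[:, None]**2 + 2j * zeta * om_n * w[:, None]
      H = ((c * b) / den).sum(axis=1)
      print("peak |H| =", abs(H).max())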

  18. A View on Future Building System Modeling and Simulation

    SciTech Connect

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  19. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays the actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
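
    A local model in exactly the sense listed above, a state set, an event alphabet, an initial state, a partial transition function and event durations, fits in a few lines; the pick-and-place submachine here is invented for illustration:

      states = {"idle", "grasp", "move", "release"}
      delta = {  # (state, event) -> next state (partial function)
          ("idle", "pick"):     "grasp",
          ("grasp", "lift"):    "move",
          ("move", "place"):    "release",
          ("release", "reset"): "idle",
      }
      duration = {"pick": 2.0, "lift": 1.0, "place": 3.0, "reset": 0.5}  # seconds per event

      def run(word, state="idle", t=0.0):
          for ev in word:
              if (state, ev) not in delta:
                  raise ValueError(f"event {ev!r} undefined in state {state!r}")
              state, t = delta[(state, ev)], t + duration[ev]
          return state, t

      print(run(["pick", "lift", "place", "reset"]))  # -> ('idle', 6.5)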

  20. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...(a) and 1 CFR part 51. (3) A notice of any changes made to the material incorporated by reference... injection systems and the low pressure injection function of residual (decay) heat removal systems. (4) ECCS... radioactive material; or (D) Mitigate the consequences of an accident. (vi) Events covered in paragraph...

  1. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...(a) and 1 CFR part 51. (3) A notice of any changes made to the material incorporated by reference... injection systems and the low pressure injection function of residual (decay) heat removal systems. (4) ECCS... radioactive material; or (D) Mitigate the consequences of an accident. (vi) Events covered in paragraph...

  2. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...(a) and 1 CFR part 51. (3) A notice of any changes made to the material incorporated by reference... injection systems and the low pressure injection function of residual (decay) heat removal systems. (4) ECCS... radioactive material; or (D) Mitigate the consequences of an accident. (vi) Events covered in paragraph...

  3. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...(a) and 1 CFR part 51. (3) A notice of any changes made to the material incorporated by reference... injection systems and the low pressure injection function of residual (decay) heat removal systems. (4) ECCS... radioactive material; or (D) Mitigate the consequences of an accident. (vi) Events covered in paragraph...

  4. An abrupt climate event in a coupled ocean-atmosphere simulation without external forcing.

    PubMed

    Hall, A; Stouffer, R J

    2001-01-11

    Temperature reconstructions from the North Atlantic region indicate frequent abrupt and severe climate fluctuations during the last glacial and Holocene periods. The driving forces for these events are unclear and coupled atmosphere-ocean models of global circulation have only simulated such events by inserting large amounts of fresh water into the northern North Atlantic Ocean. Here we report a drastic cooling event in a 15,000-yr simulation of global circulation with present-day climate conditions without the use of such external forcing. In our simulation, the annual average surface temperature near southern Greenland spontaneously fell 6-10 standard deviations below its mean value for a period of 30-40 yr. The event was triggered by a persistent northwesterly wind that transported large amounts of buoyant cold and fresh water into the northern North Atlantic Ocean. Oceanic convection shut down in response to this flow, concentrating the entire cooling of the northern North Atlantic by the colder atmosphere in the uppermost ocean layer. Given the similarity between our simulation and observed records of rapid cooling events, our results indicate that internal atmospheric variability alone could have generated the extreme climate disruptions in this region. PMID:11196636

  5. BEEC: An event generator for simulating the Bc meson production at an e+e- collider

    NASA Astrophysics Data System (ADS)

    Yang, Zhi; Wu, Xing-Gang; Wang, Xian-You

    2013-12-01

    The Bc meson is a doubly heavy quark-antiquark bound state that carries flavor explicitly, which provides a fruitful laboratory for testing potential models and understanding the weak decay mechanisms of heavy flavors. In view of the prospects for Bc physics at hadronic colliders such as the Tevatron and the LHC, Bc physics is attracting more and more attention. It has been shown that a high-luminosity e+e- collider running around the Z0 peak is also helpful for studying the properties of the Bc meson and has its own advantages. For this purpose, we present an event generator for simulating Bc meson production through e+e- annihilation according to relevant publications. We name it BEEC; it can generate the color-singlet S-wave and P-wave (cb¯)-quarkonium states together with the color-octet S-wave (cb¯)-quarkonium states. BEEC can also be adopted to generate the similar charmonium and bottomonium states via the semi-exclusive channels e++e-→|(QQ¯)[n]>+Q+Q¯ with Q=b and c, respectively. To increase the simulation efficiency, we simplify the amplitude to be as compact as possible by using the improved trace technology. BEEC is a Fortran program written in a PYTHIA-compatible format with a modular structure; one may apply it to various situations or experimental environments conveniently by using GNU make. A method to improve the efficiency of generating unweighted events within the PYTHIA environment is proposed. Moreover, BEEC will generate a standard Les Houches Event data file that contains useful information on the meson and its accompanying partons, which can be conveniently imported into PYTHIA for further hadronization and decay simulation. Catalogue identifier: AEQC_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEQC_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in

  6. Coupled atmosphere-ocean-wave simulations of a storm event over the Gulf of Lion and Balearic Sea

    USGS Publications Warehouse

    Renault, Lionel; Chiggiato, Jacopo; Warner, John C.; Gomez, Marta; Vizoso, Guillermo; Tintore, Joaquin

    2012-01-01

    The coastal areas of the North-Western Mediterranean Sea are among the most challenging places for ocean forecasting. This region is exposed to severe storm events of short duration. During these events, significant air-sea interactions, strong winds and a large sea state can have catastrophic consequences in coastal areas. To investigate these air-sea interactions and the oceanic response to such events, we implemented the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System to simulate a severe storm in the Mediterranean Sea that occurred in May 2010. During this event, wind speed reached up to 25 m s-1, inducing significant sea surface cooling (up to 2°C) over the Gulf of Lion (GoL) and along the storm track, and generating surface waves with a significant height of 6 m. It is shown that the event, associated with a cyclogenesis between the Balearic Islands and the GoL, is relatively well reproduced by the coupled system. A surface heat budget analysis showed that ocean vertical mixing was a major contributor to the cooling tendency along the storm track and in the GoL, where turbulent heat fluxes also played an important role. Sensitivity experiments on the ocean-atmosphere coupling suggested that the coupled system is sensitive to the momentum flux parameterization as well as to air-sea and air-wave coupling. Comparisons with available atmospheric and oceanic observations showed that the fully coupled system provides the most skillful simulation, illustrating the benefit of using a fully coupled ocean-atmosphere-wave model for the assessment of such storm events.

  7. Low-dose photons modify liver response to simulated solar particle event protons.

    PubMed

    Gridley, Daila S; Coutrakon, George B; Rizvi, Asma; Bayeta, Erben J M; Luo-Owen, Xian; Makinde, Adeola Y; Baqai, Farnaz; Koss, Peter; Slater, James M; Pecaut, Michael J

    2008-03-01

    The health consequences of exposure to low-dose radiation combined with a solar particle event during space travel remain unresolved. The goal of this study was to determine whether protracted radiation exposure alters gene expression and oxidative burst capacity in the liver, an organ vital in many biological processes. C57BL/6 mice were whole-body irradiated with 2 Gy simulated solar particle event (SPE) protons over 36 h, both with and without pre-exposure to low-dose/low-dose-rate photons ((57)Co, 0.049 Gy total at 0.024 cGy/h). Livers were excised immediately after irradiation (day 0) or on day 21 thereafter for analysis of 84 oxidative stress-related genes using RT-PCR; genes up or down-regulated by more than twofold were noted. On day 0, genes with increased expression were: photons, none; simulated SPE, Id1; photons + simulated SPE, Bax, Id1, Snrp70. Down-regulated genes at this same time were: photons, Igfbp1; simulated SPE, Arnt2, Igfbp1, Il6, Lct, Mybl2, Ptx3. By day 21, a much greater effect was noted than on day 0. Exposure to photons + simulated SPE up-regulated completely different genes than those up-regulated after either photons or the simulated SPE alone (photons, Cstb; simulated SPE, Dctn2, Khsrp, Man2b1, Snrp70; photons + simulated SPE, Casp1, Col1a1, Hspcb, Il6st, Rpl28, Spnb2). There were many down-regulated genes in all irradiated groups on day 21 (photons, 13; simulated SPE, 16; photons + simulated SPE, 16), with very little overlap among groups. Oxygen radical production by liver phagocytes was significantly enhanced by photons on day 21. The results demonstrate that whole-body irradiation with low-dose-rate photons, as well as time after exposure, had a great impact on liver response to a simulated solar particle event. PMID:18302490

  8. On computer-intensive simulation and estimation methods for rare-event analysis in epidemic models.

    PubMed

    Clémençon, Stéphan; Cousien, Anthony; Felipe, Miraine Dávila; Tran, Viet Chi

    2015-12-10

    This article focuses, in the context of epidemic models, on rare events that may possibly correspond to crisis situations from the perspective of public health. In general, no closed analytic form for their occurrence probabilities is available, and crude Monte Carlo procedures fail. We show how recent intensive computer simulation techniques, such as interacting branching particle methods, can be used for estimation purposes, as well as for generating model paths that correspond to realizations of such events. Applications of these simulation-based methods to several epidemic models fitted from real datasets are also considered and discussed thoroughly. PMID:26242476
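
    The interacting-particle idea can be caricatured with fixed-level splitting on a small stochastic SIR model: particles that reach each intermediate outbreak size survive and are resampled, and the rare-event probability is the product of the conditional survival fractions. The rates, levels and multinomial resampling below are simplified stand-ins for the cited methods:

      import random

      def sir_until(level, state, rng, beta=0.18, gamma=0.15, n=100):
          s, i, cum = state                      # susceptible, infected, cumulative infections
          while i > 0 and cum < level:           # embedded jump chain of the SIR process
              rate_inf, rate_rec = beta * s * i / n, gamma * i
              if rng.random() < rate_inf / (rate_inf + rate_rec):
                  s, i, cum = s - 1, i + 1, cum + 1
              else:
                  i -= 1
          return (s, i, cum), cum >= level

      def splitting(levels=(20, 40, 60, 80), n_particles=500, seed=3):
          rng = random.Random(seed)
          particles, prob = [(99, 1, 1)] * n_particles, 1.0
          for lev in levels:
              survivors = []
              for p in particles:
                  p2, ok = sir_until(lev, p, rng)
                  if ok:
                      survivors.append(p2)
              if not survivors:
                  return 0.0
              prob *= len(survivors) / len(particles)
              particles = [rng.choice(survivors) for _ in range(n_particles)]  # resample
          return prob

      print("estimated P(outbreak size >= 80):", splitting())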

  9. Simulated seismic event release fraction data: Progress report, April 1986-April 1987

    SciTech Connect

    Langer, G.; Deitesfeld, C.A.

    1987-11-15

    The objective of this project is to obtain experimental data on the release of airborne particles during seismic events involving plutonium handling facilities. In particular, cans containing plutonium oxide powder may be involved and some of the powder may become airborne. No release fraction data for such scenarios were available, and risk assessment calculations for such events lacked specificity in describing the physical processes involved. This study has provided initial data based on wind tunnel tests simulating the impact of debris on simulated cans of plutonium oxide powder. The release fractions are orders of magnitude smaller than previously available estimates. 8 refs., 3 figs., 2 tabs.

  10. Decentralised consensus for multiple Lagrangian systems based on event-triggered strategy

    NASA Astrophysics Data System (ADS)

    Liu, Xiangdong; Du, Changkun; Lu, Pingli; Yang, Dapeng

    2016-06-01

    This paper considers the decentralised event-triggered consensus problem for multi-agent systems with Lagrangian dynamics under undirected graphs. First, a distributed, leaderless, event-triggered consensus control algorithm is presented based on the definition of generalised positions and velocities for all agents. There is only one triggering function for both the generalised positions and velocities, and no Zeno behaviour is exhibited under the proposed consensus strategy. Second, an adaptive event-triggered consensus control algorithm is proposed for such multi-agent systems with unknown constant parameters. Third, based on a sliding-mode method, an event-triggered consensus control algorithm is considered for the case with external disturbance. Finally, simulation results are given to illustrate the theoretical results.