Science.gov

Sample records for event system simulation

  1. Synchronous parallel system for emulation and discrete event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1992-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to state variables of the simulation object attributable to the event object, and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring the events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
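
    The event-horizon bookkeeping described above can be sketched in a few lines. This is a minimal illustration, not the patented system: the names `Event`, `Node`, `global_event_horizon`, and `committable` are all hypothetical, and the sketch covers only the horizon computation (each node's earliest new time stamp, minimized across nodes) and the rule that events at or before the global horizon are safe to commit.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    timestamp: float
    payload: str = ""

@dataclass
class Node:
    new_events: list = field(default_factory=list)

    def local_event_horizon(self) -> float:
        # Earliest time stamp among the event objects generated this cycle.
        return min(e.timestamp for e in self.new_events) if self.new_events else float("inf")

def global_event_horizon(nodes):
    # The earliest local event horizon across all nodes.
    return min(n.local_event_horizon() for n in nodes)

def committable(nodes):
    # Events at or before the global horizon cannot be superseded by any
    # message still in flight, so their state changes are safe to commit.
    geh = global_event_horizon(nodes)
    return [e for n in nodes for e in n.new_events if e.timestamp <= geh]
```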

  2. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

The present invention is embodied in a method of performing object-oriented simulation and a system having inter-connected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.

  3. Rare event simulation of the chaotic Lorenz 96 dynamical system

    NASA Astrophysics Data System (ADS)

    Wouters, Jeroen; Bouchet, Freddy

    2015-04-01

The simulation of rare events is becoming increasingly important in the climate sciences. Several sessions are devoted to rare and extreme events at this meeting and the IPCC has devoted a special report to risk management of extreme events (SREX). Brute force simulation of rare events can however be very costly. To obtain satisfactory statistics on a 1-in-1000-year event, one needs to perform simulations over several thousands of years. Recently, a class of rare event simulation algorithms has been introduced that could yield significant increases in performance with respect to brute force simulations (see e.g. [1]). In these algorithms an ensemble of simulations is evolved in parallel, while at certain interaction times ensemble members are killed and cloned so as to have better statistics in the region of phase space that is relevant to the rare event of interest. We will discuss the implementation issues and performance gains for these algorithms. We also present results on a first application of a rare event simulation algorithm to a toy model for chaos in the atmosphere, the Lorenz 96 model. We demonstrate that for the estimation of the histogram tail of the energy observable, the algorithm gives a significant error reduction. We will furthermore discuss first results and an outlook on the application of rare event simulation algorithms to study blocking atmospheric circulation and heat wave events in the PlaSim climate model [2]. [1] Del Moral, P. & Garnier, J. Genealogical particle analysis of rare events. The Annals of Applied Probability 15, 2496-2534 (2005). [2] http://www.mi.uni-hamburg.de/Planet-Simul.216.0.html
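
    The kill-and-clone step at interaction times can be sketched as a toy splitting scheme. Everything below (the function names, the exponential score weighting, the parameter values) is an illustrative assumption, not the exact genealogical particle algorithm of [1]: an ensemble is advanced in parallel, members are weighted by a bias on a score function, and resampling clones high-weight members while killing low-weight ones.

```python
import math
import random

def rare_event_splitting(step, score, n_ensemble=100, n_stages=10, bias=1.0, seed=0):
    """Toy cloning/killing scheme: advance an ensemble, then resample it
    at each interaction time with weights exp(bias * score(x))."""
    rng = random.Random(seed)
    members = [0.0] * n_ensemble      # trajectory states, started at 0
    log_norm = 0.0                    # accumulates the per-stage normalisations
    for _ in range(n_stages):
        members = [step(x, rng) for x in members]
        weights = [math.exp(bias * score(x)) for x in members]
        mean_w = sum(weights) / n_ensemble
        log_norm += math.log(mean_w)
        # Resample: clone high-weight members, kill low-weight ones,
        # keeping the ensemble size constant.
        members = rng.choices(members, weights=weights, k=n_ensemble)
    return members, log_norm
```

    The returned `log_norm` is what allows unbiased probability estimates to be reconstructed from the biased ensemble.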

  4. Enhancing Complex System Performance Using Discrete-Event Simulation

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E

    2010-01-01

In this paper, we utilize discrete-event simulation (DES) merged with human factors analysis to provide the venue within which the separation and deconfliction of the system/human operating principles can occur. A concrete example is presented to illustrate the performance enhancement gains for an aviation cargo flow and security inspection system achieved through the development and use of a process DES. The overall performance of the system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered such as system capacity, residual capacity, and total number of pallets waiting for inspection in the queue. These metrics are performance indicators of the system's ability to service current needs and respond to additional requests. We studied and analyzed different scenarios by changing various model parameters such as the pieces-per-pallet ratio, number of inspectors and cargo handling personnel, number of forklifts, number and types of detection systems, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures identified effective ways to meet inspection requirements while maintaining or reducing overall operational cost and eliminating any shipping delays associated with any proposed changes in inspection requirements. With this understanding, effective operational strategies can be developed to optimally use personnel while still maintaining plant efficiency, reducing process interruptions, and holding or reducing costs.
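
    The kind of queueing DES described above can be sketched with a plain event list. This is a minimal illustration under stated assumptions (exponential arrival and inspection times, a single shared queue; the function and parameter names are hypothetical, not from the paper) that tracks one of the metrics mentioned, the number of pallets waiting for inspection.

```python
import heapq
import random

def simulate_inspection(n_pallets=200, n_inspectors=2,
                        mean_arrival=1.0, mean_inspect=1.5, seed=42):
    """Minimal discrete-event model of a cargo inspection queue.
    Returns the maximum number of pallets that waited for inspection."""
    rng = random.Random(seed)
    events = []                       # priority queue of (time, kind)
    t = 0.0
    for _ in range(n_pallets):        # pre-generate the pallet arrival stream
        t += rng.expovariate(1.0 / mean_arrival)
        heapq.heappush(events, (t, "arrival"))
    waiting, busy, max_wait = 0, 0, 0
    while events:
        now, kind = heapq.heappop(events)
        if kind == "arrival":
            waiting += 1
        else:                         # an inspector finished a pallet
            busy -= 1
        # Start inspections while inspectors are free and pallets wait.
        while busy < n_inspectors and waiting > 0:
            waiting -= 1
            busy += 1
            heapq.heappush(events, (now + rng.expovariate(1.0 / mean_inspect), "done"))
        max_wait = max(max_wait, waiting)
    return max_wait
```

    Sweeping `n_inspectors` or the service-time parameters in such a model is how the capacity and residual-capacity questions in the abstract are typically explored.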

  5. On constructing optimistic simulation algorithms for the discrete event system specification

    SciTech Connect

    Nutaro, James J

    2008-01-01

    This article describes a Time Warp simulation algorithm for discrete event models that are described in terms of the Discrete Event System Specification (DEVS). The article shows how the total state transition and total output function of a DEVS atomic model can be transformed into an event processing procedure for a logical process. A specific Time Warp algorithm is constructed around this logical process, and it is shown that the algorithm correctly simulates a DEVS coupled model that consists entirely of interacting atomic models. The simulation algorithm is presented abstractly; it is intended to provide a basis for implementing efficient and scalable parallel algorithms that correctly simulate DEVS models.
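
    The DEVS atomic-model interface the article builds on has a standard shape, sketched below. This is a generic rendering of the formalism (the class names and the toy `Counter` model are illustrative assumptions, not Nutaro's Time Warp construction): an atomic model exposes a time-advance function, internal and external transition functions, and an output function, which a simulator drives.

```python
class AtomicDEVS:
    """Skeleton of a DEVS atomic model."""
    def time_advance(self):          # ta(s): time until the next internal event
        raise NotImplementedError
    def delta_int(self):             # internal state transition
        raise NotImplementedError
    def delta_ext(self, e, inputs):  # external transition after elapsed time e
        raise NotImplementedError
    def output(self):                # output, emitted just before delta_int
        raise NotImplementedError

class Counter(AtomicDEVS):
    """Toy model: emits its count every `period` time units."""
    def __init__(self, period=1.0):
        self.count, self.period = 0, period
    def time_advance(self):
        return self.period
    def delta_int(self):
        self.count += 1
    def delta_ext(self, e, inputs):
        self.count = 0               # any external input resets the counter
    def output(self):
        return self.count

def run(model, until=5.0):
    # Minimal sequential driver: internal events only, no coupling.
    t, outputs = 0.0, []
    while t + model.time_advance() <= until:
        t += model.time_advance()
        outputs.append(model.output())
        model.delta_int()
    return outputs
```

    The article's contribution is precisely in replacing the sequential driver above with a rollback-capable Time Warp logical process built from the same transition and output functions.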

  6. An Early Warning System for Loan Risk Assessment Based on Rare Event Simulation

    NASA Astrophysics Data System (ADS)

    Zhou, Hong; Qiu, Yue; Wu, Yueqin

System simulation is an important tool for risk assessment. In this paper, a new method is presented to deal with credit risk assessment problems for commercial banks based on rare event simulation. The failure probability of repaying loans of a listed company is taken as the criterion to measure the level of credit risk. The rare-event concept is adopted to construct the model of credit risk identification in commercial banks, and a cross-entropy scheme is designed to implement the rare event simulation, based on which the loss probability can be assessed. Numerical experiments have shown that the method has a strong capability to identify credit risk for commercial banks and offers a good tool for early warning.

  7. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  8. Self-Adaptive Event-Driven Simulation of Multi-Scale Plasma Systems

    NASA Astrophysics Data System (ADS)

    Omelchenko, Yuri; Karimabadi, Homayoun

    2005-10-01

    Multi-scale plasmas pose a formidable computational challenge. The explicit time-stepping models suffer from the global CFL restriction. Efficient application of adaptive mesh refinement (AMR) to systems with irregular dynamics (e.g. turbulence, diffusion-convection-reaction, particle acceleration etc.) may be problematic. To address these issues, we developed an alternative approach to time stepping: self-adaptive discrete-event simulation (DES). DES has origin in operations research, war games and telecommunications. We combine finite-difference and particle-in-cell techniques with this methodology by assuming two caveats: (1) a local time increment, dt for a discrete quantity f can be expressed in terms of a physically meaningful quantum value, df; (2) f is considered to be modified only when its change exceeds df. Event-driven time integration is self-adaptive as it makes use of causality rules rather than parametric time dependencies. This technique enables asynchronous flux-conservative update of solution in accordance with local temporal scales, removes the curse of the global CFL condition, eliminates unnecessary computation in inactive spatial regions and results in robust and fast parallelizable codes. It can be naturally combined with various mesh refinement techniques. We discuss applications of this novel technology to diffusion-convection-reaction systems and hybrid simulations of magnetosonic shocks.
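
    The two caveats above (a quantum value df, and updates only when the change exceeds df) can be sketched with a quantized first-order integrator. This is an illustrative sketch under stated assumptions (a scalar decay equation df/dt = -k*f and hypothetical names), not the authors' DES framework: instead of stepping time by a global dt, the next event is scheduled at the moment the state will have changed by exactly one quantum.

```python
def quantized_decay(f0=1.0, k=1.0, df=0.05, t_end=2.0):
    """Event-driven (quantised) integration of df/dt = -k*f: the state
    is only updated when it would change by the quantum df."""
    t, f, events = 0.0, f0, []
    while t < t_end:
        rate = -k * f
        if abs(rate) * (t_end - t) < df:
            break                      # change can no longer exceed one quantum
        dt = df / abs(rate)            # time for f to change by one quantum
        t += dt
        f += df if rate > 0 else -df   # state moves in whole quanta
        events.append((t, f))
    return events
```

    Note the self-adaptive behaviour: as `f` decays and the rate shrinks, the inter-event interval `dt` grows automatically, which is the asynchronous, locally-paced update the abstract describes.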

  9. Simulation of a 36 h solar particle event at LLUMC using a proton beam scanning system

    NASA Astrophysics Data System (ADS)

    Coutrakon, G. B.; Benton, E. R.; Gridley, D. S.; Hickey, T.; Hubbard, J.; Koss, P.; Moyers, M. F.; Nelson, G. A.; Pecaut, M. J.; Sanders, E.; Shahnazi, K.

    2007-08-01

A radiation biology experiment was performed in the research room of the proton therapy facility at Loma Linda University Medical Center to simulate the proton exposure produced by a solar particle event. The experiment used two scanning magnets for X and Y deflection of the proton beam and covered a usable target area of nearly 1 m2. The magnet scanning control system consisted of LabVIEW 6.0 software running on a PC. The goal of this experiment was to study the immune system response of 48 mice simultaneously exposed to 2 Gy of protons that simulated the dose rate and energy spectrum of the September 1989 solar particle event. The 2 Gy dose was delivered to the entrance of the mice cages over 36 h. Both ion chamber and TLD measurements indicated that the dose delivered was within 9% of the intended value. A spot scanning technique using one spot per accelerator cycle (2.2 s) was used to deliver doses as low as 1 μGy per beam spot. Rapid beam termination (less than 5 ms) on each spot was obtained by energizing a quadrupole in the proton synchrotron once the dose limit was reached for each spot. A parallel plate ion chamber placed adjacent to the mice cages provided fluence (or dose) measurements for each beam energy during each hour of the experiment. An intensity modulated spot scanning technique can be used in a variety of ways for radiation biology, and a second experiment is being designed with this proton beam scanning system to simultaneously irradiate four groups of mice with different dose rates within the 1 m2 area. Also, large electronic devices being tested for radiation damage have been exposed in this beam without the use of patch fields. The same scanning system has potential application for intensity modulated proton therapy (IMPT) as well. This paper discusses the beam delivery system and dosimetry of the irradiation.

  10. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.

  11. The IDES framework: A case study in development of a parallel discrete-event simulation system

    SciTech Connect

    Nicol, D.M.; Johnson, M.M.; Yoshimura, A.S.

    1997-12-31

    This tutorial describes considerations in the design and development of the IDES parallel simulation system. IDES is a Java-based parallel/distributed simulation system designed to support the study of complex large-scale enterprise systems. Using the IDES system as an example, the authors discuss how anticipated model and system constraints molded the design decisions with respect to modeling, synchronization, and communication strategies.

  12. Spontaneous onset of a Madden-Julian oscillation event in a cloud-system-resolving simulation

    NASA Astrophysics Data System (ADS)

    Miura, Hiroaki; Satoh, Masaki; Katsumata, Masaki

    2009-07-01

Spontaneous onset of a Madden-Julian Oscillation (MJO) event in November 2006 was reproduced at the proper location and time by a global cloud-resolving model (CRM) used with a relatively coarse horizontal grid. Preconditioning of moisture was simulated about 4 days prior to the onset in the Indian Ocean, in agreement with data obtained from in-situ observation. To investigate the influence of the zonal Sea Surface Temperature (SST) gradient in the Indian Ocean, we conducted a sensitivity study comparing composites made from five ensemble simulations. It was found that the eastward-moving signal of this MJO event could be obscured if SST were zonally uniform in the western Indian Ocean. The zonal SST gradient has not been considered important in previous studies of MJO onset, but an SST distribution with cooler SST on the western side may help enhance convection in the slow eastward-moving envelopes of the MJO.

  13. Dynamic simulation recalls condensate piping event

    SciTech Connect

Farrell, R.J.; Reneberg, K.O.; Moy, H.C.

    1994-05-01

    This article describes how experience gained from simulating and reconstructing a condensate piping event will be used by Consolidated Edison to analyze control system problems. A cooperative effort by Con Edison and the Chemical Engineering Department at Polytechnic University used modular modeling system to investigate the probable cause of a Con Edison condensate piping event. Con Edison commissioned the work to serve as a case study for the more general problem of control systems analysis using dynamic simulation and MMS.

  14. Simulation of Heinrich events and their climate impact with an Earth system model

    NASA Astrophysics Data System (ADS)

    Ganopolski, A.; Calov, R.

    2003-04-01

Heinrich events related to large-scale surges of the Laurentide Ice Sheet into the Atlantic Ocean represent one of the most dramatic types of abrupt climate change occurring during the glacial age. MacAyeal proposed a "binge/purge" free oscillatory mechanism, explaining Heinrich events (HEs) as transitions between two modes of operation of ice sheets: slow movement of ice over a frozen base and a fast sliding mode when the ice bed is at the melting point. This type of self-sustained multi-millennial oscillation has been simulated in simplified 2-D ice sheet models, but in realistic 3-D models such a large-scale instability of the LIS has not so far been reproduced. Here, using a coupled atmosphere-ocean-vegetation-ice sheet model, we simulate quasi-periodic large-scale rapid surges from the Laurentide Ice Sheet under typical glacial climate conditions. The average time between simulated events is about 7,000 yrs, while the surging phase of each event lasts only several hundred years, with a total ice volume discharge corresponding to 5--10 m of sea level rise. The crucial factor needed for the existence of mega-surges in our model is the employment of different sliding laws over hard bed (rocks) and over soft water-saturated sediments, like those in the Hudson Bay and Hudson Strait. The area of deformable sediments served as a geological template for the mega-surges. During each HE, the elevation drops by more than one km over the Hudson Bay, and the Laurentide ice sheet changes from a one-dome to a two-dome structure -- one dome being located over the southeast of Alberta and another over the southwest of Quebec. In our model the ice surges represent internal oscillations of the ice sheet related to rapid transitions between two metastable modes of ice sheet dynamics over the area covered by deformable sediments. At the same time, we demonstrate the possibility of both internal and external synchronization between instabilities of different ice sheets, as indicated in palaeoclimate records.

  15. StochKit2: software for discrete stochastic simulation of biochemical systems with events

    PubMed Central

    Sanft, Kevin R.; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R.

    2011-01-01

    Summary: StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), and tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. Availability: StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/. Contact: petzold@engineering.ucsb.edu PMID:21727139
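
    Gillespie's direct-method SSA, which StochKit2 implements in highly optimized form, can be sketched in a few lines. This generic version (the function name and the dictionary-based reaction encoding are assumptions, not StochKit2's API) shows the two random draws per step: an exponentially distributed time to the next reaction, and a propensity-weighted choice of which reaction fires.

```python
import random

def ssa_direct(x, reactions, t_end, seed=0):
    """Gillespie's direct-method SSA. `reactions` is a list of
    (propensity_fn, state_change_dict) pairs; returns (time, state)."""
    rng = random.Random(seed)
    t = 0.0
    x = dict(x)
    while True:
        props = [a(x) for a, _ in reactions]
        a0 = sum(props)
        if a0 == 0.0:
            return t, x                    # no reaction can fire any more
        t += rng.expovariate(a0)           # time to the next reaction
        if t > t_end:
            return t_end, x
        # Choose which reaction fires, proportional to its propensity.
        r, acc = rng.random() * a0, 0.0
        for (_, change), a in zip(reactions, props):
            acc += a
            if r < acc:
                for species, dv in change.items():
                    x[species] += dv
                break
```

    Tau-leaping, the other family StochKit2 provides, replaces the one-reaction-per-step loop above with Poisson-distributed batches of firings over a chosen step size.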

  16. Anticipating the Chaotic Behaviour of Industrial Systems Based on Stochastic, Event-Driven Simulations

    NASA Astrophysics Data System (ADS)

    Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra

    2004-08-01

In logistics and industrial production, managers must deal with the impact of stochastic events to improve performance and reduce costs. In fact, production and logistics systems are generally designed treating some parameters as deterministic. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, especially for estimated parameters (i.e. market request). The proposed methodology can determine the impact of stochastic events in the system by evaluating the chaotic threshold level. Such an approach, based on the application of a new and innovative methodology, can be implemented to find the conditions under which chaos makes the system become uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and contingency plan estimation. In this paper the authors illustrate the methodology with respect to a real industrial case: a production problem related to the logistics of distributed chemical processing.

  17. Algorithm and simulation development in support of response strategies for contamination events in air and water systems.

    SciTech Connect

    Waanders, Bart Van Bloemen

    2006-01-01

Chemical/Biological/Radiological (CBR) contamination events pose a considerable threat to our nation's infrastructure, especially in large internal facilities, external flows, and water distribution systems. Because physical security can only be enforced to a limited degree, deployment of early warning systems is being considered. However, to achieve reliable and efficient functionality, several complex questions must be answered: (1) where should sensors be placed, (2) how can sparse sensor information be efficiently used to determine the location of the original intrusion, (3) what are the model and data uncertainties, (4) how should these uncertainties be handled, and (5) how can our algorithms and forward simulations be sufficiently improved to achieve real time performance? This report presents the results of a three-year algorithmic and application development effort to support the identification, mitigation, and risk assessment of CBR contamination events. The main thrust of this investigation was to develop (1) computationally efficient algorithms for strategically placing sensors, (2) identification of contamination events by using sparse observations, (3) characterization of uncertainty through developing accurate demand forecasts and through investigating uncertain simulation model parameters, (4) risk assessment capabilities, and (5) reduced order modeling methods. The development effort was focused on water distribution systems, large internal facilities, and outdoor areas.

  18. The global event system

    SciTech Connect

    Winans, J.

    1994-03-02

    The support for the global event system has been designed to allow an application developer to control the APS event generator and receiver boards. This is done by the use of four new record types. These records are customized and are only supported by the device support modules for the APS event generator and receiver boards. The use of the global event system and its associated records should not be confused with the vanilla EPICS events and the associated event records. They are very different.

  19. A discrete event method for wave simulation

    SciTech Connect

    Nutaro, James J

    2006-01-01

    This article describes a discrete event interpretation of the finite difference time domain (FDTD) and digital wave guide network (DWN) wave simulation schemes. The discrete event method is formalized using the discrete event system specification (DEVS). The scheme is shown to have errors that are proportional to the resolution of the spatial grid. A numerical example demonstrates the relative efficiency of the scheme with respect to FDTD and DWN schemes. The potential for the discrete event scheme to reduce numerical dispersion and attenuation errors is discussed.

  20. Weighted next reaction method and parameter selection for efficient simulation of rare events in biochemical reaction systems

    PubMed Central

    2011-01-01

    The weighted stochastic simulation algorithm (wSSA) recently developed by Kuwahara and Mura and the refined wSSA proposed by Gillespie et al. based on the importance sampling technique open the door for efficient estimation of the probability of rare events in biochemical reaction systems. In this paper, we first apply the importance sampling technique to the next reaction method (NRM) of the stochastic simulation algorithm and develop a weighted NRM (wNRM). We then develop a systematic method for selecting the values of importance sampling parameters, which can be applied to both the wSSA and the wNRM. Numerical results demonstrate that our parameter selection method can substantially improve the performance of the wSSA and the wNRM in terms of simulation efficiency and accuracy. PMID:21910924
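
    The importance-sampling idea behind the wSSA can be sketched as follows. This is a simplified illustration, not the wNRM of this paper or the exact wSSA of Kuwahara and Mura: only the choice of which reaction fires is biased (firing times still use the true propensities), and each run carries a likelihood-ratio weight (a_j/a0)/(b_j/b0) so the rare-event probability estimate stays unbiased. All names and parameters below are assumptions.

```python
import random

def wssa_estimate(x0, reactions, biased, target, t_end, n_runs=2000, seed=1):
    """Toy weighted SSA: reactions are chosen from biased propensities
    `biased`, and the accumulated weight corrects for the bias."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        x, t, w = dict(x0), 0.0, 1.0
        while t < t_end and not target(x):
            a = [f(x) for f, _ in reactions]   # true propensities
            b = [f(x) for f, _ in biased]      # biased propensities
            a0, b0 = sum(a), sum(b)
            if a0 == 0.0:
                break
            t += rng.expovariate(a0)           # firing time uses true a0
            if t >= t_end:
                break
            r, acc, j = rng.random() * b0, 0.0, 0
            for j, bj in enumerate(b):         # reaction chosen from b
                acc += bj
                if r < acc:
                    break
            w *= (a[j] / a0) / (b[j] / b0)     # likelihood-ratio correction
            for s, dv in reactions[j][1].items():
                x[s] += dv
        if target(x):
            total += w                         # weighted indicator of the rare event
    return total / n_runs
```

    Choosing the biased propensities well is exactly the parameter-selection problem this paper addresses; with `biased` equal to `reactions` the weights are all 1 and the estimator reduces to plain Monte Carlo.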

  1. Workflow in Clinical Trial Sites & Its Association with Near Miss Events for Data Quality: Ethnographic, Workflow & Systems Simulation

    PubMed Central

    Araujo de Carvalho, Elias Cesar; Batilana, Adelia Portero; Claudino, Wederson; Lima Reis, Luiz Fernando; Schmerling, Rafael A.; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

Background With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify and address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Methodology/Principal Findings Two sites in São Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. Conclusions/Significance Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage

  2. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    NASA Technical Reports Server (NTRS)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.

  3. Numerical Simulations of Two Wildfire Events Using a Combined Modeling System (HIGRAD/BEHAVE)

    SciTech Connect

    Reisner, J.; Bossert, J.; Winterkamp, J.

    1997-12-31

The ability to accurately forecast the spread of a wildfire would significantly reduce human suffering and loss of life, the destruction of property, and expenditures for assessment and recovery. To help achieve this goal we have developed a model which accurately simulates the interactions between winds and the heat source associated with a wildfire. We have termed our new model HIGRAD, or HIgh resolution model for strong GRADient applications. HIGRAD employs a sophisticated numerical technique to prevent numerical oscillations from occurring in the vicinity of the fire. Of importance for fire modeling, HIGRAD uses a numerical technique which allows for the use of a compressible equation set, but without the time-step restrictions associated with the propagation of sound waves.

  4. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.

  5. A Simbol-X Event Simulator

    SciTech Connect

    Puccetti, S.; Giommi, P.; Fiore, F.

    2009-05-11

The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in simulating data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to ray-tracing simulators. These properties make it well suited to supporting users in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  6. Running Parallel Discrete Event Simulators on Sierra

    SciTech Connect

    Barnes, P. D.; Jefferson, D. R.

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  7. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
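
    The core safety rule of Chandy-Misra conservative synchronization can be sketched as follows: a logical process may consume its smallest-timestamp message only when every input channel is non-empty, and timestamp-only "null" messages keep channels non-empty to avoid deadlock. All names below are illustrative assumptions; this is not the shared-memory experimental code from the paper.

```python
from collections import deque

class LP:
    """Logical process with one FIFO input channel per predecessor.
    Conservative rule: consume the smallest-timestamp message only when
    every channel is non-empty, so causality can never be violated."""
    def __init__(self, name, inputs):
        self.name = name
        self.channels = {src: deque() for src in inputs}
        self.processed = []

    def deliver(self, src, t, payload):
        self.channels[src].append((t, payload))

    def step(self):
        # unsafe if any channel hides its next timestamp: must block
        if any(not ch for ch in self.channels.values()):
            return False
        src = min(self.channels, key=lambda s: self.channels[s][0][0])
        t, payload = self.channels[src].popleft()
        if payload is not None:        # null messages only advance time
            self.processed.append((t, payload))
        return True

merge = LP("merge", inputs=["a", "b"])
merge.deliver("a", 1.0, "job1")
blocked = merge.step()                 # channel "b" is empty: must block
merge.deliver("b", 2.0, None)          # null message: nothing before t=2.0
ok = merge.step()                      # now safe to consume (1.0, "job1")
```

    The blocking in `step` is exactly where deadlock can arise in a cyclic network, which is why null messages (or deadlock detection and recovery) are needed.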

  8. Simulating Heinrich event 1 with interactive icebergs

    NASA Astrophysics Data System (ADS)

    Jongma, J. I.; Renssen, H.; Roche, D. M.

    2013-03-01

    During the last glacial, major abrupt climate events known as Heinrich events left distinct fingerprints of ice-rafted detritus and are thus associated with iceberg armadas: the release of many icebergs into the North Atlantic Ocean. We simulated the impact of a large armada of icebergs on glacial climate in a coupled atmosphere-ocean model. In our model, dynamic-thermodynamic icebergs influence the climate through two direct effects. First, melting of the icebergs causes freshening of the upper ocean, and second, the latent heat used in the phase transition of ice to water results in cooling of the iceberg surroundings. This cooling effect of icebergs is generally neglected in models. We investigated the role of the latent heat by performing a sensitivity experiment in which the cooling effect is switched off. At the peak of the simulated Heinrich event, icebergs lacking the latent heat flux are much less efficient in shutting down the meridional overturning circulation than icebergs that include both the freshening and the cooling effects. The cause of this intriguing result must be sought in the involvement of a secondary mechanism: facilitation of sea-ice formation, which can disturb deep water production at key convection sites, with consequences for the thermohaline circulation. We performed additional sensitivity experiments, designed to explore the effect of the more plausible distribution of the dynamic icebergs' melting fluxes compared to a classic hosing approach with homogeneous spreading of the melt fluxes over a section in the mid-latitude North Atlantic (NA) Ocean. The early response of the climate system is much stronger in the iceberg experiments than in the hosing experiments, which must be a distribution effect: the dynamically distributed icebergs quickly affect western NADW formation, which synergizes with direct sea-ice facilitation, causing an earlier sea-ice expansion and climatic response. Furthermore, compared to dynamic

  9. A Discrete Event Simulation Model for Evaluating the Performances of an M/G/C/C State Dependent Queuing System

    PubMed Central

    Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli

    2013-01-01

    M/G/C/C state dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern discrete event simulation (DES) software. We designed an approach to work around this limitation and used it to construct an M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impact of various arrival rates on the throughput, the blocking probability, the expected service time and the expected number of entities in a complex network topology. Results indicated that for each network there is a range of arrival rates where the simulation results fluctuate drastically across replications, causing the simulation results and analytical results to exhibit discrepancies. Detailed results showing how closely the simulation results tally with the analytical results, in both tabular and graphical forms, and scientific justifications for the discrepancies have been documented and discussed. PMID:23560037
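
    The state-dependent idea can be illustrated with the Markovian special case, where the steady-state occupancy distribution of a birth-death chain with occupancy-dependent service rate has a simple product form. The exponential crowding function and all constants below are assumptions for illustration, not the Arena model from the paper.

```python
import math

def steady_state(lam, mu, capacity):
    """Steady-state occupancy distribution of a birth-death chain with
    state-dependent service rate mu(n); arrivals are blocked when full.
    Unnormalized weights follow the product form prod_i lam/mu(i)."""
    w = [1.0]
    for n in range(1, capacity + 1):
        w.append(w[-1] * lam / mu(n))
    z = sum(w)
    return [x / z for x in w]

# Crowding slows everyone down: per-entity speed decays with occupancy n.
# The exponential form and its constants are illustrative assumptions.
mu = lambda n: n * 1.0 * math.exp(-(((n - 1) / 4.0) ** 2))

p = steady_state(lam=0.8, mu=mu, capacity=6)
blocking = p[-1]   # probability an arriving entity is turned away
```

    In a DES package the same `mu(n)` would be re-evaluated whenever the occupancy changes, which is precisely the capability the paper had to engineer around.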

  10. Seismic event classification system

    DOEpatents

    Dowla, F.U.; Jarpe, S.P.; Maurer, W.

    1994-12-13

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.
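
    The shift-invariant preprocessing step can be sketched directly: the magnitude of the 2-D Fourier transform is unchanged by circular shifts of the binary time-frequency distribution, so events that differ only in onset time or frequency offset map to the same representation. The toy grid below, and the naive O(N^4) DFT standing in for an FFT, are purely illustrative.

```python
import cmath

def dft2_mag(grid):
    """Magnitude of the 2-D DFT of a small binary grid (naive O(N^4));
    a stand-in for the 2-D FFT step of the described pipeline."""
    R, C = len(grid), len(grid[0])
    out = []
    for u in range(R):
        row = []
        for v in range(C):
            s = sum(grid[r][c] * cmath.exp(-2j * cmath.pi * (u * r / R + v * c / C))
                    for r in range(R) for c in range(C))
            row.append(abs(s))
        out.append(row)
    return out

def shift(grid, dr, dc):
    """Circular shift, modeling an event arriving later / offset in frequency."""
    R, C = len(grid), len(grid[0])
    return [[grid[(r - dr) % R][(c - dc) % C] for c in range(C)] for r in range(R)]

tf = [[0, 1, 0, 0],
      [0, 1, 1, 0],
      [0, 0, 1, 0],
      [0, 0, 0, 0]]              # toy binary time-frequency distribution
a = dft2_mag(tf)
b = dft2_mag(shift(tf, 1, 2))    # same event, shifted in time and frequency
```

    Because `a` and `b` agree, the SONN sees identical inputs for the shifted and unshifted event, which is the point of the preprocessing.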

  11. Seismic event classification system

    DOEpatents

    Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William

    1994-01-01

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.

  12. Discrete-Event Simulation in Chemical Engineering.

    ERIC Educational Resources Information Center

    Schultheisz, Daniel; Sommerfeld, Jude T.

    1988-01-01

    Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)

  13. An Overview of Importance Splitting for Rare Event Simulation

    ERIC Educational Resources Information Center

    Morio, Jerome; Pastel, Rudy; Le Gland, Francois

    2010-01-01

    Monte Carlo simulations are a classical tool to analyse physical systems. When unlikely events are to be simulated, the importance sampling technique is often used instead of Monte Carlo. Importance sampling has some drawbacks when the problem dimensionality is high or when the optimal importance sampling density is complex to obtain. In this…

  14. Scanning picosecond tunable laser system for simulating MeV heavy ion-induced charge collection events as a function of temperature

    NASA Astrophysics Data System (ADS)

    Laird, Jamie Stuart; Chen, Yuan; Scheick, Leif; Vo, Tuan; Johnston, Allan

    2008-08-01

    A new methodology for using scanning picosecond laser microscopy to simulate cosmic ray induced radiation effects as a function of temperature is described in detail. The built system is centered on diffraction-limited focusing of the output from a broadband (690-960 nm) ultrafast Ti:sapphire Tsunami laser pumped by a 532 nm Millennia laser. An acousto-optic modulator is used to provide pulse picking down to event rates necessary for the technologies and effects under study. The temperature dependence of the charge generation process for ions and photons is briefly reviewed and the need for wavelength tunability is discussed. An appropriate wavelength selection is critical for proper emulation of ion events over a wide temperature range. The system developed is detailed and illustrated by way of example on a deep-submicron complementary metal-oxide semiconductor test structure.

  15. MHD simulation of the Bastille day event

    NASA Astrophysics Data System (ADS)

    Linker, Jon; Torok, Tibor; Downs, Cooper; Lionello, Roberto; Titov, Viacheslav; Caplan, Ronald M.; Mikić, Zoran; Riley, Pete

    2016-03-01

    We describe a time-dependent, thermodynamic, three-dimensional MHD simulation of the July 14, 2000 coronal mass ejection (CME) and flare. The simulation starts with a background corona developed using an MDI-derived magnetic map for the boundary condition. Flux ropes using the modified Titov-Demoulin (TDm) model are used to energize the pre-event active region, which is then destabilized by photospheric flows that cancel flux near the polarity inversion line. More than 10^33 ergs are impulsively released in the simulated eruption, driving a CME at 1500 km/s, close to the observed speed of 1700 km/s. The post-flare emission in the simulation is morphologically similar to the observed post-flare loops. The resulting flux rope that propagates to 1 AU is similar in character to the flux rope observed at 1 AU, but the simulated ICME center passes 15° north of Earth.

  16. Event-by-Event Simulation of Induced Fission

    SciTech Connect

    Vogt, R; Randrup, J

    2007-12-13

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels, each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments, which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  17. Complete event simulations of nuclear fission

    NASA Astrophysics Data System (ADS)

    Vogt, Ramona

    2015-10-01

    For many years, the state of the art for treating fission in radiation transport codes has involved sampling from average distributions. In these average fission models energy is not explicitly conserved and everything is uncorrelated because all particles are emitted independently. However, in a true fission event, the energies, momenta and multiplicities of the emitted particles are correlated. Such correlations are interesting for many modern applications. Event-by-event generation of complete fission events makes it possible to retain the kinematic information for all particles emitted: the fission products as well as prompt neutrons and photons. It is therefore possible to extract any desired correlation observables. Complete event simulations can be included in general Monte Carlo transport codes. We describe the general functionality of currently available fission event generators and compare results for several important observables. This work was performed under the auspices of the US DOE by LLNL, Contract DE-AC52-07NA27344. We acknowledge support of the Office of Defense Nuclear Nonproliferation Research and Development in DOE/NNSA.

  18. Optical eye simulator for laser dazzle events.

    PubMed

    Coelho, João M P; Freitas, José; Williamson, Craig A

    2016-03-20

    An optical simulator of the human eye and its application to laser dazzle events are presented. The simulator combines optical design software (ZEMAX) with a scientific programming language (MATLAB) and allows the user to implement and analyze a dazzle scenario using practical, real-world parameters. Contrary to conventional analytical glare analysis, this work uses ray tracing and the scattering model and parameters for each optical element of the eye. The theoretical background of each such element is presented in relation to the model. The overall simulator's calibration, validation, and performance analysis are achieved by comparison with a simpler model based upon CIE disability glare data. Results demonstrate that this kind of advanced optical eye simulation can be used to represent laser dazzle and has the potential to extend the range of applicability of analytical models. PMID:27140558

  19. Distributed discrete event simulation. Final report

    SciTech Connect

    De Vries, R.C.

    1988-02-01

    The presentation given here is restricted to discrete event simulation. The complexity of, and time required for, many present and potential discrete simulations exceeds the reasonable capacity of most present serial computers. The desire, then, is to implement the simulations on a parallel machine. However, certain problems arise in an effort to program the simulation on a parallel machine. In one category of methods, deadlocks can arise, and some mechanism is required either to detect deadlock and recover from it or to avoid deadlock through information passing. In the second category of methods, potentially incorrect simulations are allowed to proceed. If the situation is later determined to be incorrect, recovery from the error must be initiated. In either case, computation and information passing are required which would not be required in a serial implementation. The net effect is that the parallel simulation may not be much better than a serial simulation. In an effort to determine alternate approaches, important papers in the area were reviewed. As part of that review process, each of the papers was summarized. The summary of each paper is presented in this report in the hope that those doing future work in the area will be able to gain insight that might not otherwise be available, and to aid in deciding which papers would be most beneficial to pursue in more detail. The papers are broken down into categories and then by author. Conclusions reached after examining the papers and other material, such as direct talks with an author, are presented in the last section. Also presented there are some ideas that surfaced late in the research effort. These promise to be of some benefit in limiting information which must be passed between processes and in better understanding the structure of a distributed simulation. Pursuit of these ideas seems appropriate.

  20. Parallel discrete-event simulation of FCFS stochastic queueing networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments) which has proven effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. It is shown how lookahead can be computed for FCFS queueing network simulations; performance data are given that demonstrate the method's effectiveness under moderate to heavy loads, and performance tradeoffs between the quality of lookahead and the cost of computing it are discussed.
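
    For an FCFS server, lookahead is available because service order is fixed: a job's departure time can be computed the moment it arrives, so the processor can send downstream an "appointment", a promise that no future message will carry an earlier timestamp. A minimal sketch (all names assumed, not taken from the paper):

```python
def fcfs_departures(arrivals, services):
    """Departure times of an FCFS single server; because the service order
    is fixed, each departure is computable the moment the job arrives."""
    free, out = 0.0, []
    for a, s in zip(arrivals, services):
        free = max(a, free) + s      # start when both job and server ready
        out.append(free)
    return out

deps = fcfs_departures([0.0, 0.5, 4.0], [2.0, 1.0, 0.5])
# deps[i] is known at arrival i, so the server can send an "appointment":
# a promise that its next output message will not precede deps[-1].
```

    The further ahead such departure times can be precomputed, the better the lookahead, at the cost of the extra computation the paper's tradeoff analysis addresses.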

  1. Precision Event Simulation for Hadron Colliders

    NASA Astrophysics Data System (ADS)

    Hoeche, Stefan

    2016-03-01

    Hadron colliders are workhorses of particle physics, enabling scientific breakthroughs such as the discovery of the Higgs boson. Hadron beams reach the highest energies, but they also produce very complex collisions. Studying the underlying dynamics requires involved multi-particle calculations. Over the past decades Monte-Carlo simulation programs were developed to tackle this task. They have by now evolved into precision tools for theorists and experimenters alike. This talk will give an introduction to event generators and discuss the current status of development.

  2. Rare event simulation in radiation transport

    SciTech Connect

    Kollman, C.

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is, with overwhelming probability, equal to zero. These problems often have high-dimensional state spaces and irregular geometries, so analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero-variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
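
    The likelihood-ratio construction can be sketched on a textbook rare event: estimating a Gaussian tail probability by sampling from a shifted proposal and weighting each hit by the ratio of true to proposal densities. This standard-normal example is illustrative only and is unrelated to the neutron-transport models of the dissertation.

```python
import math
import random

def is_estimate(a=4.0, n=200_000, seed=1):
    """Importance-sampling estimate of p = P(Z > a), Z ~ N(0,1), using a
    proposal shifted to N(a,1); each sample is weighted by the likelihood
    ratio phi(x)/phi(x-a) to keep the estimator unbiased."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(a, 1.0)                    # draw from the proposal
        if x > a:                                # the rare event occurred
            # likelihood ratio of true density to proposal density
            total += math.exp(-x * x / 2) / math.exp(-(x - a) ** 2 / 2)
    return total / n

p_hat = is_estimate()   # true value is about 3.17e-5
```

    Naive Monte Carlo with the same sample size would see only a handful of hits; here roughly half the samples land in the tail and the weights do the bookkeeping, which is why the variance depends so strongly on the proposal chosen.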

  3. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete event model as a real-time system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of a subsystem of the global robotic system. Since Petri models are simple to apply on general computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete event systems are a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the robot's timing. From transport and transmission times obtained by spot measurement, graphics are produced showing the average time for the transport activity for individual sets of finished products.
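
    The Petri-net semantics underlying such models reduces to a simple firing rule: a transition is enabled when each input place holds enough tokens, and firing moves tokens from input to output places. A minimal sketch follows; the robot-cell place names are invented for illustration and are not from the paper.

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= k for p, k in pre.items())

def fire(marking, pre, post):
    """Fire an enabled transition: consume input tokens, produce outputs."""
    assert enabled(marking, pre), "transition not enabled"
    m = dict(marking)
    for p, k in pre.items():
        m[p] -= k
    for p, k in post.items():
        m[p] = m.get(p, 0) + k
    return m

# hypothetical robot cell: part waiting + robot free -> part in process
m0 = {"waiting": 2, "robot_free": 1, "in_process": 0}
pre  = {"waiting": 1, "robot_free": 1}
post = {"in_process": 1}
m1 = fire(m0, pre, post)
```

    Timed and extended Petri nets build on this same rule by attaching delays and guards to transitions, which is what allows the transport times discussed above to be read off the model.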

  4. Event Index — an LHCb Event Search System

    NASA Astrophysics Data System (ADS)

    Ustyuzhanin, A.; Artemov, A.; Kazeev, N.; Redkin, A.

    2015-12-01

    During LHC Run 1, the LHCb experiment recorded around 10^11 collision events. This paper describes Event Index — an event search system. Its primary function is to quickly select subsets of events from a combination of conditions, such as the estimated decay channel or number of hits in a subdetector. Event Index is essentially Apache Lucene [1] optimized for read-only indexes distributed over independent shards on independent nodes.

  5. The LCLS Timing Event System

    SciTech Connect

    Dusatko, John; Allison, S.; Browne, M.; Krejcik, P.; /SLAC

    2012-07-23

    The Linac Coherent Light Source requires precision timing trigger signals for various accelerator diagnostics and controls at the SLAC National Accelerator Laboratory. A new timing system has been developed that meets these requirements. This system is based on COTS hardware with a mixture of custom-designed units. An added challenge has been the requirement that the LCLS timing system must co-exist with, and 'know' about, the existing SLC timing system. This paper describes the architecture, construction and performance of the LCLS timing event system.

  6. Empirical study of simulated two-planet microlensing events

    SciTech Connect

    Zhu, Wei; Gould, Andrew; Penny, Matthew; Mao, Shude; Gendron, Rieul

    2014-10-10

    We undertake the first study of two-planet microlensing models recovered from simulations of microlensing events generated by realistic multiplanet systems in which 292 planetary events, including 16 two-planet events, were detected from 6690 simulated light curves. We find that when two planets are recovered, their parameters are usually close to those of the two planets in the system most responsible for the perturbations. However, in 1 of the 16 examples, the apparent mass of both detected planets was more than doubled by the unmodeled influence of a third, massive planet. This fraction is larger than but statistically consistent with the roughly 1.5% rate of serious mass errors due to unmodeled planetary companions for the 274 cases from the same simulation in which a single planet is recovered. We conjecture that an analogous effect due to unmodeled stellar companions may occur more frequently. For 7 out of 23 cases in which two planets in the system would have been detected separately, only one planet was recovered because the perturbations due to the two planets had similar forms. This is a small fraction (7/274) of all recovered single-planet models, but almost a third of all events that might plausibly have led to two-planet models. Still, in these cases, the recovered planet tends to have parameters similar to one of the two real planets most responsible for the anomaly.

  7. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remains underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. PMID:26922594

  8. Event-based Simulation Model for Quantum Optics Experiments

    SciTech Connect

    De Raedt, H.; Michielsen, K.

    2011-03-28

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified description of multiple-beam fringes of a plane parallel plate and single-photon Mach-Zehnder interferometer, Wheeler's delayed choice, photon tunneling, quantum eraser, two-beam interference, double-slit, Einstein-Podolsky-Rosen-Bohm and Hanbury Brown-Twiss experiments. We also discuss the possibility to refute our corpuscular model.

  9. Terminal Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Meyers, Ronald

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED), a special type of 'man-made' system serving specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.

  10. Data Systems Dynamic Simulator

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Clark, Melana; Davenport, Bill; Message, Philip

    1993-01-01

    The Data System Dynamic Simulator (DSDS) is a discrete event simulation tool. It was developed for NASA for the specific purpose of evaluating candidate architectures for data systems of the Space Station era. DSDS provides three methods for meeting this requirement. First, the user has access to a library of standard pre-programmed elements. These elements represent tailorable components of NASA data systems and can be connected in any logical manner. Secondly, DSDS supports the development of additional elements. This allows the more sophisticated DSDS user the option of extending the standard element set. Thirdly, DSDS supports the use of data streams simulation. Data streams is the name given to a technique that ignores packet boundaries, but is sensitive to rate changes. Because rate changes are rare compared to packet arrivals in a typical NASA data system, data stream simulations require a fraction of the CPU run time. Additionally, the data stream technique is considerably more accurate than another commonly-used optimization technique.
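
    The data streams technique described above can be sketched as follows: rather than one event per packet, the simulator keeps one event per rate change and integrates rate over time, so total volume is exact while the event count scales with the (rare) rate changes only. The numbers below are illustrative assumptions.

```python
def stream_volume(rate_events, t_end):
    """Total data volume from a piecewise-constant rate profile.
    rate_events: sorted (time, new_rate_bits_per_s) pairs; one simulation
    event per *rate change* rather than per packet."""
    total, (t_prev, rate) = 0.0, rate_events[0]
    for t, new_rate in rate_events[1:]:
        total += rate * (t - t_prev)   # integrate rate over the interval
        t_prev, rate = t, new_rate
    return total + rate * (t_end - t_prev)

events = [(0.0, 1000.0), (2.0, 4000.0), (5.0, 0.0)]  # three rate changes
bits = stream_volume(events, t_end=10.0)             # vs. thousands of packets
```

    If the same traffic were simulated packet by packet, every arrival would be a scheduled event; here three events suffice, which is the source of the CPU-time savings claimed for DSDS.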

  11. Numerical Simulations of Hot Vertical Displacement Events

    NASA Astrophysics Data System (ADS)

    Bunkers, K. J.; Sovinec, C. R.

    2015-11-01

    Loss of vertical positioning control in tokamaks leads to instability where hot confined plasma rests against the chamber wall. Resistive-MHD modeling with the NIMROD code is applied to model these events. After the divertor-coil current is perturbed, resistive diffusion through the non-ideal wall sets the timescale as the simulated tokamak evolves from a diverted equilibrium to a limited configuration. Results show that plasma outflow along opening magnetic surfaces, just outside the confinement zone, approaches the local ion-acoustic speed. The projection of the plasma flow velocity onto the surface-normal direction (n · V) near the surface exceeds the local E × B drift speed; near surfaces, n × E is approximately the same as n × E_wall in the nearly steady conditions. The safety factor of flux surfaces that remain intact is approximately constant over the evolution time, which is much shorter than the plasma resistive diffusion time. Assessment of external-kink stability and initial findings from 3D nonlinear computations are presented. This effort is supported by the U.S. Dept. of Energy, award numbers DE-FG02-06ER54850 and DE-FC02-08ER54975.

  12. Calculation of fission observables through event-by-event simulation

    NASA Astrophysics Data System (ADS)

    Randrup, Jørgen; Vogt, Ramona

    2009-08-01

    The increased interest in more exclusive fission observables has demanded more detailed models. We present here a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including arbitrary correlations. The various model assumptions are described and the potential utility of the model is illustrated by means of several novel correlation observables.
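
    The event-by-event idea can be illustrated with a toy neutron-evaporation chain that conserves energy within each event, so per-event multiplicities and kinetic energies remain correlated rather than being sampled independently. This sketch is not FREYA; the separation energy and spectrum below are invented for illustration.

```python
import random

def evaporate(excitation, sep_energy=6.0, temp=1.0, seed=0):
    """Toy event-by-event neutron evaporation (illustrative only, not FREYA):
    a fragment emits neutrons, each removing the separation energy plus a
    sampled kinetic energy, until further emission is energetically
    forbidden. Energy is conserved exactly within the event."""
    rng = random.Random(seed)
    neutrons = []
    e = excitation
    while e > sep_energy:
        kinetic = rng.expovariate(1.0 / temp)      # thermal-like spectrum
        if e - sep_energy - kinetic < 0:
            break                                  # emission forbidden
        e -= sep_energy + kinetic
        neutrons.append(kinetic)
    return neutrons, e   # per-event multiplicity and residual energy

ns, residual = evaporate(20.0)
```

    Because every event carries its own bookkeeping, correlation observables such as multiplicity versus total neutron energy can be histogrammed directly from a generated sample, which average-distribution models cannot provide.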

  13. Calculation of Fission Observables Through Event-by-Event Simulation

    SciTech Connect

    Randrup, J; Vogt, R

    2009-06-04

    The increased interest in more exclusive fission observables has demanded more detailed models. We present here a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including arbitrary correlations. The various model assumptions are described and the potential utility of the model is illustrated by means of several novel correlation observables.

  14. Single event effects and laser simulation studies

    NASA Technical Reports Server (NTRS)

    Kim, Q.; Schwartz, H.; Mccarty, K.; Coss, J.; Barnes, C.

    1993-01-01

    The single event upset (SEU) linear energy transfer threshold (LETTH) of radiation hardened 64K Static Random Access Memories (SRAMs) was measured with a picosecond pulsed dye laser system. These results were compared with standard heavy ion accelerator (Brookhaven National Laboratory (BNL)) measurements of the same SRAMs. With heavy ions, the LETTH of the Honeywell HC6364 was 27 MeV-sq cm/mg at 125 C compared with a value of 24 MeV-sq cm/mg obtained with the laser. In the case of the second type of 64K SRAM, the IBM 6401CRH, no upsets were observed at 125 C with the highest-LET ions used at BNL. In contrast, the pulsed dye laser tests indicated a value of 90 MeV-sq cm/mg at room temperature for the SEU-hardened IBM SRAM. No latchups or multiple SEUs were observed on any of the SRAMs even under worst case conditions. The results of this study suggest that the laser can be used as an inexpensive laboratory SEU prescreen tool in certain cases.

  15. New strategies for the simulation of rare events

    NASA Astrophysics Data System (ADS)

    Rahman, Jay Abid

    2002-01-01

    Rare events play an important role in numerous physical, chemical, and biological processes, including protein folding, relaxation in glasses, nucleation, isomerization, and diffusion. Understanding the dynamic and equilibrium characteristics of such processes has important implications for drug design, materials development, and catalysis. Simulations are used to obtain properties of these systems that are inaccessible through current experimental techniques. Unfortunately, simulations on these systems have historically been difficult, not only because of the large system sizes but also because of the rugged, multidimensional nature of the potential energy landscapes. Standard simulation methods fail for these systems as they generally become trapped in a local minimum, resulting in incomplete and inaccurate mapping of the potential energy surface. While there has been significant work in this area over the past several decades, the problem is still largely unresolved. In this work, we introduce a number of approaches for solving this problem. The first method, puddle skimming, adds a "puddle" potential to the surface, effectively raising the bottoms of the potential wells as though the surface were being filled with water. This reduces the amount of energy required to escape the well, decreasing the likelihood of trapping. This method is found to be best suited to lower-dimensional systems where the barrier heights are similar to each other. The second method, puddle jumping, adds additional "puddles" to the system and allows transitions between puddles. This method allows more complicated systems to be studied, both in the number of degrees of freedom and in the variety of barrier heights. We also apply the puddle strategy to transition path sampling, greatly extending the range of systems for which this method can be used to compute reaction rate constants with minimal additional work. Finally, we combine the puddle jumping method with parallel tempering, a state-of-the-art rare
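    The puddle idea lends itself to a small illustration: a Metropolis random walk on a one-dimensional double well, where clamping the potential from below at a "fill" level lowers the effective barrier and increases well-to-well crossings. The potential, fill level, and walk parameters below are invented for illustration and are not taken from the thesis:

    ```python
    import math
    import random

    random.seed(7)

    def V(x):                      # toy double-well potential, barrier at x = 0
        return (x * x - 1.0) ** 2 * 8.0

    def V_puddle(x, fill=4.0):     # "puddle skimming": raise the well bottoms,
        return max(V(x), fill)     # as though the surface were filled with water

    def metropolis(potential, steps=20000, beta=1.0, step=0.5):
        """Count well-to-well crossings of a Metropolis walk on `potential`."""
        x, crossings, prev_side = -1.0, 0, -1
        for _ in range(steps):
            y = x + random.uniform(-step, step)
            dV = potential(y) - potential(x)
            if dV <= 0 or random.random() < math.exp(-beta * dV):
                x = y
            side = 1 if x > 0 else -1
            if side != prev_side:     # walker switched wells
                crossings += 1
                prev_side = side
        return crossings

    plain   = metropolis(V)
    puddled = metropolis(V_puddle)   # lower effective barrier: more crossings
    ```

    On the plain surface the walker rarely escapes a well; with the puddle in place the barrier seen by the walker shrinks from the full well depth to the gap between the fill level and the barrier top.
    
    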

  16. An adaptive synchronization protocol for parallel discrete event simulation

    SciTech Connect

    Bisset, K.R.

    1998-12-01

    Simulation, especially discrete event simulation (DES), is used in a variety of disciplines where numerical methods are difficult or impossible to apply. One problem with this method is that a sufficiently detailed simulation may take hours or days to execute, and multiple runs may be needed in order to generate the desired results. Parallel discrete event simulation (PDES) has been explored for many years as a method to decrease the time taken to execute a simulation. Many protocols have been developed which work well for particular types of simulations, but perform poorly when used for other types of simulations. Often it is difficult to know a priori whether a particular protocol is appropriate for a given problem. In this work, an adaptive synchronization method (ASM) is developed which works well on an entire spectrum of problems. The ASM determines, using an artificial neural network (ANN), the likelihood that a particular event is safe to process.
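    Whatever synchronization protocol is layered on top, the sequential core of a DES is a time-ordered event list from which the earliest event is repeatedly removed and processed. A minimal sketch (this is the generic event-list mechanics, not the ASM itself; handler names are hypothetical):

    ```python
    import heapq

    class Simulator:
        """Minimal sequential discrete event simulation core."""
        def __init__(self):
            self.clock = 0.0
            self._queue = []   # min-heap of (timestamp, seq, handler, payload)
            self._seq = 0      # tie-breaker for events at the same time

        def schedule(self, delay, handler, payload=None):
            heapq.heappush(self._queue,
                           (self.clock + delay, self._seq, handler, payload))
            self._seq += 1

        def run(self, until=float("inf")):
            while self._queue and self._queue[0][0] <= until:
                self.clock, _, handler, payload = heapq.heappop(self._queue)
                handler(self, payload)   # handlers may schedule further events

    # Example: a source that emits three jobs 1.0 time units apart.
    log = []
    def emit(sim, n):
        log.append((sim.clock, n))
        if n > 1:
            sim.schedule(1.0, emit, n - 1)

    sim = Simulator()
    sim.schedule(0.0, emit, 3)
    sim.run()   # log now holds (time, job) pairs in timestamp order
    ```

    A parallel protocol, conservative or adaptive, is essentially a rule for deciding when the head of such a queue is safe to process.
    
    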

  17. The Advanced Photon Source event system

    SciTech Connect

    Lenkszus, F.R.; Laird, R.

    1995-12-31

    The Advanced Photon Source, like many other facilities, requires a means of transmitting timing information to distributed control system I/O controllers. The APS event system provides the means of distributing medium resolution/accuracy timing events throughout the facility. It consists of VME event generators and event receivers which are interconnected with 100 Mbit/sec fiber optic links at distances of up to 650 m in either a star or a daisy chain configuration. The system's event throughput rate is 10 Mevents/sec with a peak-to-peak timing jitter down to 100 ns depending on the source of the event. It is integrated into the EPICS-based APS control system through record and device support. Event generators broadcast timing events over fiber optic links to event receivers which are programmed to decode specific events. Event generators generate events in response to external inputs, from internal programmable event sequence RAMs, and from VME bus writes. The event receivers can be programmed to generate both pulse and set/reset level outputs to synchronize hardware, and to generate interrupts to initiate EPICS record processing. In addition, each event receiver contains a time stamp counter which is used to provide synchronized time stamps to EPICS records.
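    The generator/receiver pattern can be caricatured in software as a dispatch table keyed by event code: a receiver is programmed to decode certain codes and ignore the rest. The codes and actions below are hypothetical, not actual APS event assignments:

    ```python
    class EventReceiver:
        """Sketch of an event receiver: programmed event codes trigger
        actions; unprogrammed codes are decoded but ignored."""
        def __init__(self):
            self.actions = {}    # event code -> list of callbacks
            self.timestamp = 0   # stands in for the time stamp counter

        def program(self, code, action):
            self.actions.setdefault(code, []).append(action)

        def on_event(self, code):
            self.timestamp += 1  # counter advances on every received event
            for action in self.actions.get(code, []):
                action(self.timestamp)

    triggers = []
    rx = EventReceiver()
    rx.program(0x20, lambda ts: triggers.append(("injection_pulse", ts)))
    rx.on_event(0x20)   # programmed: fires the pulse action
    rx.on_event(0x7F)   # not programmed: ignored, but the counter still ticks
    ```
    
    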

  18. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-10-17

    This quarter, we have focused on several tasks: (1) Building a high-quality catalog of earthquake source parameters for the Middle East and East Asia. In East Asia, we computed source parameters using the CAP method for a set of events studied by Herrman et al. (MRR, 2006) using a complete waveform technique. Results indicated excellent agreement with the moment magnitudes in the range 3.5-5.5. Below magnitude 3.5 the scatter increases. For events with more than 2-3 observations at different azimuths, we found good agreement of focal mechanisms. Depths were generally consistent, although differences of up to 10 km were found. These results suggest that CAP modeling provides estimates of source parameters at least as reliable as complete waveform modeling techniques. However, East Asia and the Yellow Sea Korean Paraplatform (YSKP) region studied are relatively laterally homogeneous and may not benefit from the CAP method’s flexibility to shift waveform segments to account for path-dependent model errors. A more challenging region to study is the Middle East, where strong variations in sedimentary basins, crustal thickness, and crustal and mantle seismic velocities greatly impact regional wave propagation. We applied the CAP method to a set of events in and around Iran and found good agreement between estimated focal mechanisms and those reported by the Global Centroid Moment Tensor (CMT) catalog. We found a possible bias in the moment magnitudes that may be due to the thick low-velocity crust in the Iranian Plateau. (2) Testing Methods on a Lifetime Regional Data Set. In particular, the recent 2/21/08 Nevada Event and Aftershock Sequence occurred in the middle of USArray, producing over a thousand records per event. The tectonic setting is quite similar to Central Iran and thus provides an excellent testbed for CAP+ at ranges out to 10°, including extensive observations of crustal thinning and thickening and various Pnl complexities. Broadband modeling in 1D, 2D

  19. Laser simulation of single event upsets

    SciTech Connect

    Buchner, S.P.; Wilson, D.; Kang, K.; Gill, D.; Mazer, J.A.; Raburn, W.D.; Campbell, A.B.; Knudson, A.R.

    1987-12-01

    A pulsed picosecond laser was used to produce upsets in both a commercial bipolar logic circuit and a specially designed CMOS SRAM test structure. Comparing the laser energy necessary for producing upsets in transistors that have different upset sensitivities with the single event upset (SEU) level predicted from circuit analysis showed that a picosecond laser could measure circuit sensitivity to SEUs. The technique makes it possible not only to test circuits rapidly for upset sensitivity but also, because the beam can be focussed down to a small spot size, to identify sensitive transistors.

  20. Computer simulation of underwater nuclear events

    SciTech Connect

    Kamegai, M.

    1986-09-01

    This report describes the computer simulation of two underwater nuclear explosions, Operation Wigwam and a modern hypothetical explosion of greater yield. The computer simulations were done in spherical geometry with the LASNEX computer code. Comparison of the LASNEX calculation with Snay's analytical results and the Wigwam measurements shows that agreement in the shock pressure versus range in water is better than 5%. The results of the calculations are also consistent with the cube root scaling law for an underwater blast wave. The time constant of the wave front was determined from the wave profiles taken at several points. The LASNEX time-constant calculation and Snay's theoretical results agree to within 20%. A time-constant-versus-range relation empirically fitted by Snay is valid only within a limited range at low pressures, whereas a time-constant formula based on Sedov's similarity solution holds at very high pressures. This leaves the intermediate pressure range with neither an empirical nor a theoretical formula for the time constant. These one-dimensional simulations demonstrate applicability of the computer code to investigations of this nature, and justify the use of this technique for more complex two-dimensional problems, namely, surface effects on underwater nuclear explosions. 16 refs., 8 figs., 2 tabs.

  1. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, D; Tromp, J; Rodgers, A

    2007-07-16

    Comprehensive test ban monitoring in terms of location and discrimination has progressed significantly in recent years. However, the characterization of sources and the estimation of low yields remains a particular challenge. As the recent Korean shot demonstrated, we can probably expect to have a small set of teleseismic, far-regional and high-frequency regional data to analyze in estimating the yield of an event. Since stacking helps to bring signals out of the noise, it becomes useful to conduct comparable analyses on neighboring events, earthquakes in this case. If these auxiliary events have accurate moments and source descriptions, we have a means of directly comparing effective source strengths. Although we will rely on modeling codes, 1D, 2D, and 3D, we will also apply a broadband calibration procedure to use longer-period (P>5s) waveform data to calibrate short-period (P between 0.5 and 2 Hz) and high-frequency (P between 2 and 10 Hz) as path-specific station corrections from well-known regional sources. We have expanded our basic Cut-and-Paste (CAP) methodology to include not only timing shifts but also amplitude (f) corrections at recording sites. The name of this method was derived from source inversions that allow timing shifts between 'waveform segments' (or cutting the seismogram up and re-assembling) to correct for crustal variation. For convenience, we will refer to these f-dependent refinements as CAP+ for (SP) and CAP++ for still higher frequency. These methods allow the retrieval of source parameters using only P-waveforms where radiation patterns are obvious as demonstrated in this report and are well suited for explosion P-wave data. The method is easily extended to all distances because it uses Green's functions, although there may be some changes required in t* to adjust for offsets between local vs. teleseismic distances. In short, we use a mixture of model-dependent and empirical corrections to tackle the path effects. 
Although we rely on the

  2. An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation

    DOE PAGESBeta

    Nutaro, James

    2014-11-03

    In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system that has been built within the adevs simulation package supports models with state-events and time-events that comprise differential-algebraic systems with high index. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.

  3. Distribution system simulator

    NASA Technical Reports Server (NTRS)

    Bahrami, K. A.; Kirkham, H.; Rahman, S.

    1986-01-01

    In a series of tests performed under Department of Energy auspices, power line carrier propagation was observed to be anomalous under certain circumstances. To investigate the cause, a distribution system simulator was constructed. The simulator was a physical simulator that accurately represented the distribution system from below power frequency to above 50 kHz. Effects such as phase-to-phase coupling and skin effect were modeled. Construction details of the simulator and experimental results from its use are presented.

  4. Threat radar system simulations

    NASA Astrophysics Data System (ADS)

    Miller, L.

    The capabilities, requirements, and goals of radar emitter simulators are discussed. Simulators are used to evaluate competing receiver designs, to quantify the performance envelope of a radar system, and to model the characteristics of a transmitted signal waveform. A database of candidate threat systems is developed and, in concert with intelligence data on a given weapons system, permits upgrading simulators to new projected threat capabilities. Four currently available simulation techniques are summarized, noting the usefulness of developing modular software for fast controlled-cost upgrades of simulation capabilities.

  5. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-04-15

    The recent Nevada Earthquake (M=6) produced an extraordinary set of crustal guided waves. In this study, we examine the three-component data at all the USArray stations in terms of how well existing models perform in predicting the various phases, Rayleigh waves, Love waves, and Pnl waves. To establish the source parameters, we applied the Cut and Paste Code up to distances of 5° for an average local crustal model, which produced a normal mechanism (strike=35°, dip=41°, rake=-85°) at a depth of 9 km and Mw=5.9. Assuming this mechanism, we generated synthetics at all distances for a number of 1D and 3D models. The Pnl observations fit the synthetics for the simple models well, both in timing (VPn=7.9 km/s) and waveform fits, out to a distance of about 5°. Beyond this distance a great deal of complexity can be seen to the northwest, apparently caused by shallow subducted slab material. These paths require considerable crustal thinning and higher P-velocities. Small delays and advances outline the various tectonic provinces to the south, Colorado Plateau, etc., with velocities compatible with those reported by Song et al. (1996). Five-second Rayleigh waves (Airy phase) can be observed throughout the whole array and show a great deal of variation (up to 30 s). In general, the Love waves are better behaved than the Rayleigh waves. We are presently adding higher frequencies to the source description by including source complexity. Preliminary inversions suggest rupture to the northeast with a shallow asperity. We are also inverting the aftershocks to extend the frequencies to 2 Hz and beyond, following the calibration method outlined in Tan and Helmberger (2007). This will allow accurate directivity measurements for events with magnitudes larger than 3.5. Thus, we will address the energy decay with distance as a function of frequency band for the various source types.

  6. Event simulation for colliders — A basic overview

    NASA Astrophysics Data System (ADS)

    Reuschle, Christian

    2015-05-01

    In this article we will discuss the basic calculational concepts to simulate particle physics events at high energy colliders. We will mainly focus on the physics in hadron colliders and particularly on the simulation of the perturbative parts, where we will in turn focus on the next-to-leading order QCD corrections.

  7. Event-by-event simulation of a quantum delayed-choice experiment

    NASA Astrophysics Data System (ADS)

    Donker, Hylke C.; De Raedt, Hans; Michielsen, Kristel

    2014-12-01

    The quantum delayed-choice experiment of Tang et al. (2012) is simulated on the level of individual events without making reference to concepts of quantum theory or solving a wave equation. The simulation results are in excellent agreement with the quantum theoretical predictions of this experiment. The implication of the work presented here is that the experiment of Tang et al. can be explained in terms of cause-and-effect processes in an event-by-event manner.

  8. Optimization of Operations Resources via Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
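    The approach can be sketched by wrapping a stochastic objective (standing in here for a discrete event simulation run) in a simple integer-coded genetic algorithm with truncation selection, one-point crossover, and integer mutation. The objective, bounds, and GA parameters below are invented for illustration; they are not the paper's launch-vehicle model:

    ```python
    import random

    random.seed(1)

    def simulate_cost(levels):
        """Stand-in for a DES run: returns a noisy cost for integer
        resource levels (the real objective would be the simulation)."""
        ideal = (3, 5, 2)
        penalty = sum((x - t) ** 2 for x, t in zip(levels, ideal))
        return penalty + random.gauss(0, 0.1)   # stochastic measure

    def genetic_minimize(bounds, pop_size=20, generations=40):
        pop = [tuple(random.randint(lo, hi) for lo, hi in bounds)
               for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(pop, key=simulate_cost)
            parents = scored[: pop_size // 2]           # truncation selection
            children = []
            while len(children) < pop_size:
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, len(bounds))  # one-point crossover
                child = list(a[:cut] + b[cut:])
                if random.random() < 0.2:               # integer mutation
                    i = random.randrange(len(bounds))
                    child[i] = random.randint(*bounds[i])
                children.append(tuple(child))
            pop = children
        return min(pop, key=simulate_cost)

    best = genetic_minimize([(0, 10)] * 3)   # best resource levels found
    ```

    Because the GA only needs cost evaluations, not gradients, it tolerates the stochastic, non-differentiable search space that defeats pattern search.
    
    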

  9. Simulating Single-Event Upsets in Bipolar RAM's

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.

    1986-01-01

    Simulation technique saves testing. Uses interactive version of SPICE (Simulation Program with Integrated Circuit Emphasis). Device and subcircuit models available in software used to construct macromodel for an integrated bipolar transistor. Time-dependent current generators placed inside transistor macromodel to simulate charge collection from ion track. Significant finding of experiments is that the standard design practice of reducing power in an unaddressed bipolar RAM cell increases the sensitivity of the cell to single-event upsets.

  10. Designing Simulation Systems

    ERIC Educational Resources Information Center

    Twelker, Paul A.

    1969-01-01

    "The purpose of this paper is to outline the approach to designing instructional simulation systems developed at Teaching Research. The 13 phases of simulation design will be summarized, and an effort will be made to expose the vital decision points that confront the designer as he develops simulation experiences." (Author)

  11. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  12. Instrumented Architectural Simulation System

    NASA Technical Reports Server (NTRS)

    Delagi, B. A.; Saraiya, N.; Nishimura, S.; Byrd, G.

    1987-01-01

    Simulation of systems at an architectural level can offer an effective way to study critical design choices if (1) the performance of the simulator is adequate to examine designs executing significant code bodies, not just toy problems or small application fragments, (2) the details of the simulation include the critical details of the design, (3) the view of the design presented by the simulator instrumentation leads to useful insights on the problems with the design, and (4) there is enough flexibility in the simulation system so that the asking of unplanned questions is not suppressed by the weight of the mechanics involved in making changes either in the design or its measurement. A simulation system with these goals is described together with the approach to its implementation. Its application to the study of a particular class of multiprocessor hardware system architectures is illustrated.

  13. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    SciTech Connect

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight in device performance and aids in topology and system optimization.

  14. Event-by-event fission simulation code, generates complete fission events

    Energy Science and Technology Software Center (ESTSC)

    2013-04-01

    FREYA is a computer code that generates complete fission events. The output includes the energy and momentum of the final-state particles: fission products, prompt neutrons, and prompt photons. The version of FREYA that is to be released is a module for MCNP6.

  15. Discretely Integrated Condition Event (DICE) Simulation for Pharmacoeconomics.

    PubMed

    Caro, J Jaime

    2016-07-01

    Several decision-analytic modeling techniques are in use for pharmacoeconomic analyses. Discretely integrated condition event (DICE) simulation is proposed as a unifying approach that has been deliberately designed to meet the modeling requirements in a straightforward, transparent way, without forcing assumptions (e.g., only one transition per cycle) or unnecessary complexity. At the core of DICE are conditions that represent aspects that persist over time. They have levels that can change and many may coexist. Events reflect instantaneous occurrences that may modify some conditions or the timing of other events. The conditions are discretely integrated with events by updating their levels at those times. Profiles of determinant values allow for differences among patients in the predictors of the disease course. Any number of valuations (e.g., utility, cost, willingness-to-pay) of conditions and events can be applied concurrently in a single run. A DICE model is conveniently specified in a series of tables that follow a consistent format and the simulation can be implemented fully in MS Excel, facilitating review and validation. DICE incorporates both state-transition (Markov) models and non-resource-constrained discrete event simulation in a single formulation; it can be executed as a cohort simulation or a microsimulation, and either deterministically or stochastically. PMID:26961779
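    One illustrative reading of that description: conditions hold levels that accrue value continuously between events and are updated discretely when an event fires. The model below is a toy sketch of that idea, not Caro's actual table-driven specification; the condition name, rates, and event times are invented:

    ```python
    import heapq

    class DiceModel:
        """Toy discretely-integrated-condition-event sketch: levels are
        integrated analytically between events, updated only at events."""
        def __init__(self):
            self.time = 0.0
            self.conditions = {}   # name -> current level
            self.accrued = {}      # name -> time-integrated level (e.g. cost)
            self._events = []

        def add_condition(self, name, level):
            self.conditions[name] = level
            self.accrued[name] = 0.0

        def schedule(self, when, change):   # change: dict of level deltas
            heapq.heappush(self._events, (when, id(change), change))

        def run(self):
            while self._events:
                when, _, change = heapq.heappop(self._events)
                dt = when - self.time
                for name, level in self.conditions.items():
                    self.accrued[name] += level * dt  # integrate between events
                self.time = when
                for name, delta in change.items():    # discrete update at event
                    self.conditions[name] += delta

    m = DiceModel()
    m.add_condition("cost_rate", 100.0)    # e.g. monthly treatment cost
    m.schedule(6.0, {"cost_rate": -40.0})  # event: switch to cheaper therapy
    m.schedule(12.0, {"cost_rate": 0.0})   # end of horizon (no level change)
    m.run()   # m.accrued["cost_rate"] holds total cost over the horizon
    ```

    The same two tables, conditions and scheduled events, are all the model needs, which is why a spreadsheet implementation is feasible.
    
    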

  16. Simulation and study of small numbers of random events

    NASA Technical Reports Server (NTRS)

    Shelton, R. D.

    1986-01-01

    Random events were simulated by computer and subjected to various statistical methods to extract important parameters. Various forms of curve fitting were explored, such as least squares, least distance from a line, and maximum likelihood. Problems considered were dead time, exponential decay, and spectrum extraction from cosmic ray data using binned data and data from individual events. Computer programs, mostly of an iterative nature, were developed to do these simulations and extractions and are partially listed as appendices. The mathematical basis for the computer programs is given.
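    The exponential-decay case has a convenient closed-form maximum-likelihood estimator, which makes a compact example of extracting a parameter from simulated random events. The decay constant and sample size below are illustrative, not values from the report:

    ```python
    import math
    import random

    random.seed(42)

    # Simulate decay times for a species with true decay constant lam = 2.0,
    # then recover lam by maximum likelihood. For exponentially distributed
    # data the MLE has the closed form lam_hat = n / sum(t_i).
    lam_true = 2.0
    events = [random.expovariate(lam_true) for _ in range(10000)]

    lam_hat = len(events) / sum(events)         # maximum-likelihood estimate
    stderr = lam_hat / math.sqrt(len(events))   # asymptotic standard error
    ```

    With binned data or dead-time corrections no closed form exists, which is why the report's programs are mostly iterative.
    
    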

  17. Event-by-event simulation of experiments to create entanglement and violate Bell inequalities

    NASA Astrophysics Data System (ADS)

    Michielsen, K.; De Raedt, H.

    2013-10-01

    We discuss a discrete-event, particle-based simulation approach which reproduces the statistical distributions of Maxwell's theory and quantum theory by generating detection events one-by-one. This event-based approach gives a unified cause-and-effect description of quantum optics experiments such as single-photon Mach-Zehnder interferometer, Wheeler's delayed choice, quantum eraser, double-slit, Einstein-Podolsky-Rosen-Bohm and Hanbury Brown-Twiss experiments, and various neutron interferometry experiments. We illustrate the approach by application to single-photon Einstein-Podolsky-Rosen-Bohm experiments and single-neutron interferometry experiments that violate a Bell inequality.

  18. Route to extreme events in excitable systems

    NASA Astrophysics Data System (ADS)

    Karnatak, Rajat; Ansmann, Gerrit; Feudel, Ulrike; Lehnertz, Klaus

    2014-08-01

    Systems of FitzHugh-Nagumo units with different coupling topologies are capable of self-generating and -terminating strong deviations from their regular dynamics that can be regarded as extreme events due to their rareness and recurrent occurrence. Here we demonstrate the crucial role of an interior crisis in the emergence of extreme events. In parameter space we identify this interior crisis as the organizing center of the dynamics by employing concepts of mixed-mode oscillations and of leaking chaotic systems. We find that extreme events occur in certain regions in parameter space, and we show the robustness of this phenomenon with respect to the system size.

  19. Variability of simulants used in recreating stab events.

    PubMed

    Carr, D J; Wainwright, A

    2011-07-15

    Forensic investigators commonly use simulants/backing materials to mount fabrics and/or garments on when recreating damage due to stab events. Such work may be conducted in support of an investigation to connect a particular knife to a stabbing event by comparing the severance morphology obtained in the laboratory to that observed in the incident. There does not appear to have been a comparison of the effect of simulant type on the morphology of severances in fabrics and simulants, nor of the variability of simulants. This work investigates three simulants (pork, gelatine, expanded polystyrene), two knife blades (carving, bread), and how severances in the simulants and an apparel fabric typically used to manufacture T-shirts (single jersey) were affected by (i) simulant type and (ii) blade type. Severances were formed using a laboratory impact apparatus to ensure a consistent impact velocity, and hence impact energy, independently of the other variables. The impact velocity was chosen so that the force measured was similar to that measured in human performance trials. Force-time and energy-time curves were analysed and severance morphology (y, z directions) investigated. Simulant type and knife type significantly affected the critical forensic measurements of severance length (y direction) in the fabric and 'skin' (Tuftane). The use of EPS resulted in the lowest variability in the data; further, the severances recorded in both the fabric and Tuftane more accurately reflected the dimensions of the impacting knives. PMID:21371835

  20. Simulations and Characteristics of Large Solar Events Propagating Throughout the Heliosphere and Beyond (Invited)

    NASA Astrophysics Data System (ADS)

    Intriligator, D. S.; Sun, W.; Detman, T. R.; Dryer, M.; Intriligator, J.; Deehr, C. S.; Webber, W. R.; Gloeckler, G.; Miller, W. D.

    2015-12-01

    Large solar events can have severe adverse global impacts at Earth. These solar events also can propagate throughout the heliosphere and into the interstellar medium. We focus on the July 2012 and Halloween 2003 solar events. We simulate these events starting from the vicinity of the Sun at 2.5 Rs. We compare our three-dimensional (3D) time-dependent simulations to available spacecraft (s/c) observations at 1 AU and beyond. Based on the comparisons of the predictions from our simulations with in-situ measurements, we find that the effects of these large solar events can be observed in the outer heliosphere, the heliosheath, and even into the interstellar medium. We use two simulation models. The HAFSS (HAF Source Surface) model is a kinematic model. HHMS-PI (Hybrid Heliospheric Modeling System with Pickup protons) is a numerical magnetohydrodynamic solar wind (SW) simulation model. Both HHMS-PI and HAFSS are ideally suited for these analyses since, starting at 2.5 Rs from the Sun, they model the slowly evolving background SW and the impulsive, time-dependent events associated with solar activity. Our models naturally reproduce dynamic 3D spatially asymmetric effects observed throughout the heliosphere. Pre-existing SW background conditions have a strong influence on the propagation of shock waves from solar events. Time-dependence is a crucial aspect of interpreting s/c data. We show comparisons of our simulation results with STEREO A, ACE, Ulysses, and Voyager s/c observations.

  1. The cost of conservative synchronization in parallel discrete event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approaches the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two-node Intel iPSC/2 distributed memory multiprocessor.
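    Conservative synchronization rests on a lookahead bound: a logical process may safely execute only events with timestamps below the minimum, over the processes that can send to it, of sender clock plus sender lookahead. A minimal sketch of that bound, with hypothetical LP names, clocks, and delays (a generic conservative rule, simplified from the specific protocol analyzed above):

    ```python
    def safe_time(lp, clocks, lookahead, neighbors):
        """Timestamp below which `lp` may safely process events: the
        earliest time any incoming neighbor could still send to it."""
        return min(clocks[n] + lookahead[n] for n in neighbors[lp])

    clocks    = {"A": 10.0, "B": 12.0, "C": 9.0}   # current LP clocks
    lookahead = {"A": 3.0, "B": 1.0, "C": 5.0}     # min delay each LP adds
    neighbors = {"A": ["B", "C"]}                  # LPs that can send to A

    bound = safe_time("A", clocks, lookahead, neighbors)  # min(12+1, 9+5)
    ```

    The larger the lookahead relative to event density, the more events fall inside the safe window per synchronization, which is the regime in which the per-event overhead approaches that of a serial simulation.
    
    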

  2. Observing System Simulation Experiments

    NASA Technical Reports Server (NTRS)

    Prive, Nikki

    2015-01-01

    This presentation gives an overview of Observing System Simulation Experiments (OSSEs). The components of an OSSE are described, along with discussion of the process for validating, calibrating, and performing experiments.

  3. Fission Reaction Event Yield Algorithm, FREYA - For event-by-event simulation of fission

    NASA Astrophysics Data System (ADS)

    Verbeke, J. M.; Randrup, J.; Vogt, R.

    2015-06-01

    From nuclear materials accountability to detection of special nuclear material, SNM, the need for better modeling of fission has grown over the past decades. Current radiation transport codes compute average quantities with great accuracy and performance, but performance and averaging come at the price of limited interaction-by-interaction modeling. For fission applications, these codes often lack the capability of modeling interactions exactly: energy is not conserved, energies of emitted particles are uncorrelated, prompt fission neutron and photon multiplicities are uncorrelated. Many modern applications require more exclusive quantities than averages, such as the fluctuations in certain observables (e.g. the neutron multiplicity) and correlations between neutrons and photons. The new computational model, FREYA (Fission Reaction Event Yield Algorithm), aims to meet this need by modeling complete fission events. Thus it automatically includes fluctuations as well as correlations resulting from conservation of energy and momentum. FREYA has been integrated into the LLNL Fission Library, and will soon be part of MCNPX2.7.0, MCNP6, TRIPOLI-4.9, and Geant4.10.
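The event-by-event idea can be illustrated with a toy sampler. This is emphatically not FREYA's physics: the Gaussian multiplicity draw, the flat energy partition, and the fixed available energy are stand-in assumptions chosen only to show how sampling complete events makes energy conservation, fluctuations, and correlations automatic rather than bolted on.

```python
import random

def sample_fission_event(q_value=20.0, mean_mult=2.4, rng=None):
    """Toy event-by-event sampler (illustrative only, not FREYA's model):
    draw a neutron multiplicity, then partition the available energy
    among the neutrons so every event conserves energy exactly."""
    rng = rng or random.Random()
    n = max(1, int(rng.gauss(mean_mult, 1.0) + 0.5))  # crude multiplicity draw
    cuts = sorted(rng.random() for _ in range(n - 1))
    fractions = [b - a for a, b in zip([0.0] + cuts, cuts + [1.0])]
    return [f * q_value for f in fractions]           # MeV per neutron (toy)

event = sample_fission_event(rng=random.Random(1))
assert abs(sum(event) - 20.0) < 1e-9  # conserved event-by-event, not on average
```

Averaged-transport codes produce correct mean multiplicities and spectra but lose exactly the per-event correlations this construction preserves.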

  4. A wireless time synchronized event control system

    NASA Astrophysics Data System (ADS)

    Klug, Robert; Williams, Jonathan; Scheffel, Peter

    2014-05-01

    McQ has developed a wireless, time-synchronized, event control system to control, monitor, and record events with precise timing over large test sites for applications such as high speed rocket sled payload testing. Events of interest may include firing rocket motors and launch sleds, initiating flares, ejecting bombs, ejecting seats, triggering high speed cameras, measuring sled velocity, and triggering events based on a velocity window or other criteria. The system consists of Event Controllers, a Launch Controller, and a wireless network. The Event Controllers can be easily deployed at areas of interest within the test site and maintain sub-microsecond timing accuracy for monitoring sensors, electronically triggering other equipment and events, and providing timing signals to other test equipment. Recorded data and status information is reported over the wireless network to a server and user interface. Over the wireless network, the user interface configures the system based on a user specified mission plan and provides real time command, control, and monitoring of the devices and data. An overview of the system, its features, performance, and potential uses is presented.

  5. Advanced Simulation of Coupled Earthquake and Tsunami Events (ASCETE) - Simulation Techniques for Realistic Tsunami Process Studies

    NASA Astrophysics Data System (ADS)

    Behrens, Joern; Bader, Michael; Breuer, Alexander N.; van Dinther, Ylona; Gabriel, Alice-A.; Galvez Barron, Percy E.; Rahnema, Kaveh; Vater, Stefan; Wollherr, Stephanie

    2015-04-01

    At the end of phase 1 of the ASCETE project, a simulation framework for coupled physics-based rupture generation with tsunami propagation and inundation is available. Adaptive-mesh tsunami propagation and inundation by discontinuous Galerkin Runge-Kutta methods allows for accurate and conservative inundation schemes. Combined with a tree-based refinement strategy to highly optimize the code for high-performance computing architectures, a modeling tool for high-fidelity tsunami simulations has been constructed. Validation results demonstrate the capacity of the software. Rupture simulation is performed by an unstructured tetrahedral discontinuous Galerkin ADER discretization, which allows for accurate representation of complex geometries. The implemented code was nominated for and selected as a finalist for the Gordon Bell award in high-performance computing. Highly realistic rupture events can be simulated with this modeling tool. The coupling of rupture-induced wave activity and displacement with hydrodynamic equations still poses a major problem due to diverging time and spatial scales. Some insight from the ASCETE set-up could be gained, and the presentation will focus on the coupled behavior of the simulation system. Finally, an outlook to phase 2 of the ASCETE project will be given, in which further development of detailed physical processes as well as near-realistic scenario computations are planned. ASCETE is funded by the Volkswagen Foundation.

  6. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model is constructed of it. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue therefore that industry workers with the same technical skill set as students having completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.

  7. Non-Lipschitz Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, M.; Meyers, R.

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man- made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.
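The key mathematical point, relaxing the Lipschitz condition at discrete points, can be seen in the standard terminal-dynamics example (a textbook illustration of the idea, not an equation taken from the paper):

```latex
% The right-hand side is non-Lipschitz at x = 0:
\dot{x} = -x^{1/3}, \qquad
\frac{d}{dx}\left(-x^{1/3}\right) = -\tfrac{1}{3}\,x^{-2/3} \to -\infty
\quad \text{as } x \to 0 .
% Solving with x(0) = x_0 > 0 gives
x(t) = \left( x_0^{2/3} - \tfrac{2}{3}\, t \right)^{3/2},
% so the equilibrium x = 0 is reached in the finite time
t^{*} = \tfrac{3}{2}\, x_0^{2/3},
% whereas a Lipschitz system only approaches equilibrium asymptotically.
```

Finite-time arrival at an equilibrium is what lets a continuous "terminal" dynamical model emulate the abrupt state transitions of a discrete event system.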

  8. Simulating an Extreme Wind Event in a Topographically Complex Region

    NASA Astrophysics Data System (ADS)

    Lennard, Christopher

    2014-07-01

    Complex topography modifies local weather characteristics such as air temperature, rainfall, and airflow within a larger regional extent. The Cape Peninsula around Cape Town, South Africa, is a complex topographical feature responsible for the modification of rainfall and wind fields largely downstream of the peninsula. During the passage of a cold front on 2 October 2002, an extreme wind event associated with tornado-like damage occurred in the suburb of Manenberg; however, synoptic conditions did not indicate the convective activity typically associated with a tornado. A numerical regional climate model was operated at very high horizontal resolution (500 m) to investigate the dynamics of the event. The model simulated an interaction between the topography of the peninsula and an airflow direction change associated with the passage of the cold front. A small region of cyclonic circulation was simulated over Manenberg, embedded in an area of negative vorticity and a leeward gravity wave. The feature lasted 14 min and moved in a north-to-south direction. Vertically, it was not evident above 220 m. The model assessment describes this event as a shallow but intense cyclonic vortex generated in the lee of the peninsula through an interaction between the peninsula and a change in wind direction as the cold front made landfall. The model did not simulate wind speeds associated with the observed damage, suggesting that the horizontal grid resolution ought to be at the scale of the event to more completely understand such microscale airflow phenomena.

  9. "Orpheus" cardiopulmonary bypass simulation system.

    PubMed

    Morris, Richard W; Pybus, David A

    2007-12-01

    In this paper we describe a high-fidelity perfusion simulation system intended for use in the training and continuing education of perfusionists. The system comprises a hydraulic simulator, an electronic interface unit and a controlling computer with associated real-time computer models. It is designed for use within an actual operating theatre, or within a specialized simulation facility. The hydraulic simulator can be positioned on an operating table and physically connected to the circuit of the institutional heart-lung machine. The institutional monitoring system is used to display the arterial and central venous pressures, the ECG and the nasopharyngeal temperature using appropriate connections. The simulator is able to reproduce the full spectrum of normal and abnormal events that may present during the course of cardiopulmonary bypass. The system incorporates a sophisticated blood gas model that accurately predicts the behavior of a modern, hollow-fiber oxygenator. Output from this model is displayed in the manner of an in-line blood gas electrode and is updated every 500 msecs. The perfusionist is able to administer a wide variety of drugs during a simulation session including: vasoconstrictors (metaraminol, epinephrine and phenylephrine), a vasodilator (sodium nitroprusside), chronotropes (epinephrine and atropine), an inotrope (epinephrine) and modifiers of coagulation (heparin and protamine). Each drug has a pharmacokinetic profile based on a three-compartment model plus an effect compartment. The simulation system has potential roles in the skill training of perfusionists, the development of crisis management protocols, the certification and accreditation of perfusionists and the evaluation of new perfusion equipment and/or techniques. PMID:18293807
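The three-compartment-plus-effect-compartment drug model described above can be sketched with forward-Euler integration. The rate constants, the instantaneous bolus dose, and the function name below are illustrative assumptions for the sketch, not Orpheus parameters:

```python
def simulate_pk(dose, rates, dt=0.001, t_end=10.0):
    """Forward-Euler sketch of a three-compartment pharmacokinetic model
    plus an effect compartment. `rates` holds illustrative first-order
    rate constants (per unit time)."""
    k12, k21, k13, k31, k10, ke0 = rates
    c1, c2, c3, ce = dose, 0.0, 0.0, 0.0   # central, two peripheral, effect
    t = 0.0
    while t < t_end:
        dc1 = -(k10 + k12 + k13) * c1 + k21 * c2 + k31 * c3  # central
        dc2 = k12 * c1 - k21 * c2                            # peripheral 1
        dc3 = k13 * c1 - k31 * c3                            # peripheral 2
        dce = ke0 * (c1 - ce)            # effect site lags central conc.
        c1 += dc1 * dt; c2 += dc2 * dt; c3 += dc3 * dt; ce += dce * dt
        t += dt
    return c1, c2, c3, ce
```

Re-running such a model every simulation tick is what allows a simulator to update displayed drug effect (pressure, heart rate) continuously after each administration.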

  10. Top Event Matrix Analysis Code System.

    Energy Science and Technology Software Center (ESTSC)

    2000-06-19

    Version 00 TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates.

  11. SPICE: Simulation Package for Including Flavor in Collider Events

    NASA Astrophysics Data System (ADS)

    Engelhard, Guy; Feng, Jonathan L.; Galon, Iftah; Sanford, David; Yu, Felix

    2010-01-01

    We describe SPICE: Simulation Package for Including Flavor in Collider Events. SPICE takes as input two ingredients: a standard flavor-conserving supersymmetric spectrum and a set of flavor-violating slepton mass parameters, both of which are specified at some high "mediation" scale. SPICE then combines these two ingredients to form a flavor-violating model, determines the resulting low-energy spectrum and branching ratios, and outputs HERWIG and SUSY Les Houches files, which may be used to generate collider events. The flavor-conserving model may be any of the standard supersymmetric models, including minimal supergravity, minimal gauge-mediated supersymmetry breaking, and anomaly-mediated supersymmetry breaking supplemented by a universal scalar mass. The flavor-violating contributions may be specified in a number of ways, from specifying charges of fields under horizontal symmetries to completely specifying all flavor-violating parameters. SPICE is fully documented and publicly available, and is intended to be a user-friendly aid in the study of flavor at the Large Hadron Collider and other future colliders.

    Program summary
    Program title: SPICE
    Catalogue identifier: AEFL_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFL_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 8153
    No. of bytes in distributed program, including test data, etc.: 67 291
    Distribution format: tar.gz
    Programming language: C++
    Computer: Personal computer
    Operating system: Tested on Scientific Linux 4.x
    Classification: 11.1
    External routines: SOFTSUSY [1,2] and SUSYHIT [3]
    Nature of problem: Simulation programs are required to compare theoretical models in particle physics with present and future data at particle colliders. SPICE determines the masses and decay branching ratios of

  12. Towards Flexible Exascale Stream Processing System Simulation

    SciTech Connect

    Li, Cheng-Hong; Nair, Ravi; Ohba, Noboyuki; Shvadron, Uzi; Zaks, Ayal; Schenfeld, Eugen

    2012-01-01

    Stream processing is an important emerging computational model for performing complex operations on and across multi-source, high-volume, unpredictable dataflows. We present Flow, a platform for parallel and distributed stream processing system simulation that provides a flexible modeling environment for analyzing stream processing applications. The Flow stream processing system simulator is a high-performance, scalable simulator that automatically parallelizes chunks of the model space and incurs near-zero synchronization overhead for acyclic stream application graphs. We show promising parallel and distributed event rates exceeding 149 million events per second on a cluster with 512 processor cores.

  13. Anomalous event diagnosis for environmental satellite systems

    NASA Technical Reports Server (NTRS)

    Ramsay, Bruce H.

    1993-01-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS) is responsible for the operation of the NOAA geostationary and polar orbiting satellites. NESDIS provides a wide array of operational meteorological and oceanographic products and services and operates various computer and communication systems on a 24-hour, seven days per week schedule. The Anomaly Reporting System contains a database of anomalous events regarding the operations of the Geostationary Operational Environmental Satellite (GOES), communication, or computer systems that have degraded or caused the loss of GOES imagery. Data is currently entered manually via an automated query user interface. There are 21 possible symptoms (e.g., No Data), and 73 possible causes (e.g., Sectorizer - World Weather Building) of an anomalous event. The determination of an event's cause(s) is made by the on-duty computer operator, who enters the event in a paper based daily log, and by the analyst entering the data into the reporting system. The determination of the event's cause(s) impacts both the operational status of these systems, and the performance evaluation of the on-site computer and communication operations contractor.

  14. Device simulation of charge collection and single-event upset

    SciTech Connect

    Dodd, P.E.

    1996-04-01

    In this paper the author reviews the current status of device simulation of ionizing-radiation-induced charge collection and single-event upset (SEU), with an emphasis on significant results of recent years. The author presents an overview of device-modeling techniques applicable to the SEU problem and the unique challenges this task presents to the device modeler. He examines unloaded simulations of radiation-induced charge collection in simple p/n diodes, SEU in dynamic random access memories (DRAMs), and SEU in static random access memories (SRAMs). The author concludes with a few thoughts on future issues likely to confront the SEU device modeler.

  15. Extreme events evaluation over African cities with regional climate simulations

    NASA Astrophysics Data System (ADS)

    Bucchignani, Edoardo; Mercogliano, Paola; Simonis, Ingo; Engelbrecht, Francois

    2013-04-01

    The warming of the climate system in recent decades is evident from observations and is mainly related to the increase of anthropogenic greenhouse gas concentrations (IPCC, 2012). Given the expected climate change conditions on the African continent, as underlined in different publications, and their associated socio-economic impacts, an evaluation of the specific medium- and long-term effects on some strategic African cities is of crucial importance with regard to the development of adaptation strategies. Assessments usually focus on average climate properties rather than on variability or extremes, although the latter often have greater impacts on society than average values. Global Coupled Models (GCMs) are generally used to simulate future climate scenarios, as they guarantee physical consistency between variables; however, due to their coarse spatial resolution, their output cannot be used for impact studies on local scales, which makes the generation of higher-resolution climate change data necessary. Regional Climate Models (RCMs) better describe phenomena forced by orography or coastal lines, or related to convection. They can therefore provide more detailed information on climate extremes, which are hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws. The normal bias of an RCM in representing the local climatology is reduced using adequate statistical techniques based on comparison of the simulated results with long observational time series. In the framework of the EU-FP7 CLUVA (Climate Change and Urban Vulnerability in Africa) project, regional projections of climate change at high resolution (about 8 km) have been performed for selected areas surrounding five African cities. At CMCC, the regional climate model COSMO-CLM, a non-hydrostatic model, has been employed. For each domain, two simulations have been performed, considering the RCP4.5 and RCP8.5 emission scenarios

  16. Flash heat simulation events in the north Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Mazon, Jordi; Pino, David

    2013-04-01

    According to the definition of a flash heat event proposed by Mazon et al. at the European Meteorology Meetings (2011 and 2012), based on the case that occurred in the northeast of the Iberian Peninsula on 27th August 20120, several other flash heat events have been detected by automatic weather stations around the Mediterranean basin (southern Italy, the island of Crete, southern Greece, and the northeast of the Iberian Peninsula). A flash heat event is a large increase of temperature on a spatial and temporal scale between that of a heat wave (defined by the WMO as a phenomenon in which the daily maximum temperature of more than five consecutive days exceeds the average maximum temperature by 5°C, with respect to the 1961-1990 period) and that of a heat burst (defined by the AMS as a rare atmospheric event characterized by gusty winds and a rapid increase in temperature and decrease in humidity that can last some minutes). A flash heat event may thus be considered a rapid modification of the temperature lasting several hours: less than 48 hours and usually less than 24 hours. Two different flash heat events have been simulated with the WRF mesoscale model in the Mediterranean basin. The results show that two different mechanisms are the main causes of these events. The first, on 23rd March 2008 on the island of Crete, was due to a strong Foehn effect caused by strong south and southeast winds, in which the maximum temperature increased for some hours during the night, reaching 32°C. The second, on 1st August 2012 in the northeast of the Iberian Peninsula, was caused by the rapid displacement of a warm ridge from North Africa that lasted around 24 hours.

  17. Knowledge-based simulation for aerospace systems

    NASA Technical Reports Server (NTRS)

    Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.

    1988-01-01

    Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic-processing/numeric-processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated, and capable of ensuring that all possible behaviors are considered.

  18. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  19. Analyses Of Transient Events In Complex Valve and Feed Systems

    NASA Technical Reports Server (NTRS)

    Ahuja, Vineet; Hosangadi, Ashvin; Cavallo, Peter; Daines, Russell

    2005-01-01

    Valve systems in rocket propulsion systems and testing facilities are constantly subject to dynamic events resulting from the timing of valve motion, leading to unsteady fluctuations in pressure and mass flow. Such events can also be accompanied by cavitation, resonance, and system vibration, potentially leading to catastrophic failure. High-fidelity dynamic computational simulations of valve operation can yield important information about valve response to varying flow conditions. Prediction of transient behavior related to valve motion can serve as a guideline for valve scheduling, which is of crucial importance in engine operation and testing. In this paper, we present simulations of the diverse unsteady phenomena related to valve and feed systems, including valve stall, valve timing studies, and cavitation instabilities in components utilized in the test loop.

  20. Attribution of extreme weather and climate events overestimated by unreliable climate simulations

    NASA Astrophysics Data System (ADS)

    Bellprat, Omar; Doblas-Reyes, Francisco

    2016-03-01

    Event attribution aims to estimate the role of an external driver after the occurrence of an extreme weather and climate event by comparing the probability that the event occurs in factual and counterfactual worlds. These probabilities are typically computed using ensembles of climate simulations whose simulated probabilities are known to be imperfect. The implications of using imperfect models in this context are largely unknown, limited by the number of observed extreme events available to conduct a robust evaluation. Using an idealized framework, this model limitation is studied by generating a large number of simulations with variable reliability in the simulated probability. The framework illustrates that unreliable climate simulations are prone to overestimate the risk attributable to climate change. Climate model ensembles tend to be overconfident in their representation of climate variability, which leads to a systematic increase in the risk attributable to an extreme event. Our results suggest that event attribution approaches comprising a single climate model would benefit from ensemble calibration in order to account for model inadequacies, as operational forecasting systems do.
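The attribution quantity at stake is commonly expressed as the fraction of attributable risk, FAR = 1 - p0/p1, where p1 is the exceedance probability in the factual (forced) ensemble and p0 in the counterfactual (natural) ensemble. A minimal sketch using empirical exceedance frequencies (our own illustration; real studies use calibrated probabilistic ensembles, which is the paper's point):

```python
def attributable_risk(factual, counterfactual, threshold):
    """Fraction of attributable risk FAR = 1 - p0/p1 from empirical
    exceedance frequencies in two ensembles of a climate index."""
    p1 = sum(x > threshold for x in factual) / len(factual)
    p0 = sum(x > threshold for x in counterfactual) / len(counterfactual)
    return 1.0 - p0 / p1 if p1 > 0 else 0.0
```

An overconfident (too-narrow) factual ensemble inflates p1 for observed extremes, which is exactly the mechanism by which unreliable simulations overestimate FAR.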

  1. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  2. Interoperability Standards for Medical Simulation Systems

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Diallo, Saikou Y.; Padilla, Jose J.

    2012-01-01

    The Modeling and Simulation Community successfully developed and applied interoperability standards like the Distributed Interactive Simulation (DIS) protocol (IEEE 1278) and the High Level Architecture (HLA) (IEEE 1516). These standards were applied for world-wide distributed simulation events for several years. However, this paper shows that some of the assumptions and constraints underlying the philosophy of these current standards are not valid for Medical Simulation Systems. This paper describes the standards, the philosophy and the limits for medical applications and recommends necessary extensions of the standards to support medical simulation.

  3. Simulation Of Combat With An Expert System

    NASA Technical Reports Server (NTRS)

    Provenzano, J. P.

    1989-01-01

    Proposed expert system predicts outcomes of combat situations. Called COBRA (Combat Outcome Based on Rules for Attrition), the system selects rules for mathematical modeling of losses and discrete events in combat according to previous experiences. It is used with another software module known as the "Game". The Game/COBRA software system, consisting of the Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. Although COBRA is intended for simulation of large-scale military exercises, the concepts embodied in it have much broader applicability. In industrial research, a knowledge-based system enables qualitative as well as quantitative simulations.

  4. Earthquake Simulations and Historical Patterns of Events: Forecasting the Next Great Earthquake in California

    NASA Astrophysics Data System (ADS)

    Sachs, M. K.; Rundle, J. B.; Heien, E. M.; Turcotte, D. L.; Yikilmaz, M.; Kellogg, L. H.

    2013-12-01

    The fault system in California, combined with some of the United States' most densely populated regions, is a recipe for devastation. It has been estimated that a repeat of the 1906 m=7.8 San Francisco earthquake could cause as much as $84 billion in damage. Earthquake forecasting can help alleviate the effects of these events by targeting disaster relief and preparedness in regions that will need it the most. However, accurate earthquake forecasting has proven difficult. We present a forecasting technique that uses simulated earthquake catalogs generated by Virtual California and patterns of historical events. As background, we also describe internal details of the Virtual California earthquake simulator.

  5. Discrete-Event Simulation Models of Plasmodium falciparum Malaria

    PubMed Central

    McKenzie, F. Ellis; Wong, Roger C.; Bossert, William H.

    2008-01-01

    We develop discrete-event simulation models using a single “timeline” variable to represent the Plasmodium falciparum lifecycle in individual hosts and vectors within interacting host and vector populations. Where they are comparable, our conclusions regarding the relative importance of vector mortality and the durations of host immunity and parasite development are congruent with those of classic differential-equation models of malaria epidemiology. However, our results also imply that in regions with intense perennial transmission, the influence of mosquito mortality on malaria prevalence in humans may be rivaled by that of the duration of host infectivity. PMID:18668185
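A single-timeline discrete-event loop of the kind described can be sketched with a priority queue. The stage names, delays, and handler structure below are illustrative placeholders, not the paper's P. falciparum lifecycle model: each entity carries one timeline, and processing an event schedules the entity's next state transition.

```python
import heapq

def run_timeline(initial_events, handlers, t_end):
    """Minimal discrete-event loop with one timeline per entity:
    pop the earliest event, let its state's handler schedule follow-ups."""
    queue = list(initial_events)    # (time, entity, state) tuples
    heapq.heapify(queue)
    log = []
    while queue and queue[0][0] <= t_end:
        time, entity, state = heapq.heappop(queue)
        log.append((time, entity, state))
        for next_time, next_state in handlers[state](time):
            heapq.heappush(queue, (next_time, entity, next_state))
    return log

# Hypothetical three-stage infection lifecycle for one host:
handlers = {
    "infected":   lambda t: [(t + 2, "infectious")],
    "infectious": lambda t: [(t + 5, "recovered")],
    "recovered":  lambda t: [],
}
log = run_timeline([(0, "host1", "infected")], handlers, 20)
```

The appeal over differential equations is that per-individual delays (e.g. parasite development time) are represented exactly rather than as exponentially distributed rates.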

  6. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.

  7. Parallel system simulation

    SciTech Connect

    Tai, H.M.; Saeks, R.

    1984-03-01

    A relaxation algorithm for solving large-scale system simulation problems in parallel is proposed. The algorithm, which is composed of both a time-step parallel algorithm and a component-wise parallel algorithm, is described. The interconnected nature of the system, which is characterized by the component connection model, is fully exploited by this approach. A technique for finding an optimal number of the time steps is also described. Finally, this algorithm is illustrated via several examples in which the possible trade-offs between the speed-up ratio, efficiency, and waiting time are analyzed.

  8. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    SciTech Connect

    Wilke, Jeremiah J; Kenny, Joseph P.

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, designs can first iterate through a simulator. This is particularly useful when test beds cannot be used, e.g., to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the Structural Simulation Toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.

  9. 3D Simulation of External Flooding Events for the RISMC Pathway

    SciTech Connect

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad; Smith, Curtis; Lin, Linyu

    2015-09-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have recently become more important; however, they can be analyzed with existing, validated physics simulation toolkits. In this report, we describe approaches specific to flooding-based analysis using a method called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and provides visual understanding that helps validate the flooding analysis.
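
    As an illustration of the Smoothed Particle Hydrodynamics building block mentioned above, here is a minimal summation-density estimate with the standard 3D cubic spline kernel; the particle data are hypothetical and the sketch omits everything (pressure forces, time stepping, boundaries) a real flooding solver needs.

```python
import math

def cubic_spline_kernel(r, h):
    """Standard 3D cubic spline smoothing kernel W(r, h)."""
    q = r / h
    sigma = 1.0 / (math.pi * h ** 3)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def sph_density(positions, masses, h):
    """Summation density: rho_i = sum_j m_j * W(|r_i - r_j|, h)."""
    out = []
    for xi in positions:
        rho = 0.0
        for xj, mj in zip(positions, masses):
            rho += mj * cubic_spline_kernel(math.dist(xi, xj), h)
        out.append(rho)
    return out
```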

  10. Cardiovascular Events in Systemic Lupus Erythematosus

    PubMed Central

    Fernández-Nebro, Antonio; Rúa-Figueroa, Íñigo; López-Longo, Francisco J.; Galindo-Izquierdo, María; Calvo-Alén, Jaime; Olivé-Marqués, Alejandro; Ordóñez-Cañizares, Carmen; Martín-Martínez, María A.; Blanco, Ricardo; Melero-González, Rafael; Ibáñez-Rúan, Jesús; Bernal-Vidal, José Antonio; Tomero-Muriel, Eva; Uriarte-Isacelaya, Esther; Horcada-Rubio, Loreto; Freire-González, Mercedes; Narváez, Javier; Boteanu, Alina L.; Santos-Soler, Gregorio; Andreu, José L.; Pego-Reigosa, José M.

    2015-01-01

    Abstract This article estimates the frequency of cardiovascular (CV) events that occurred after diagnosis in a large Spanish cohort of patients with systemic lupus erythematosus (SLE) and investigates the main risk factors for atherosclerosis. RELESSER is a nationwide multicenter, hospital-based registry of SLE patients. This is a cross-sectional study. Demographic and clinical variables, the presence of traditional risk factors, and CV events were collected. A CV event was defined as a myocardial infarction, angina, stroke, and/or peripheral artery disease. Multiple logistic regression analysis was performed to investigate the possible risk factors for atherosclerosis. From 2011 to 2012, 3658 SLE patients were enrolled. Of these, 374 (10.9%) patients suffered at least one CV event. In 269 (7.4%) patients, the CV events occurred after SLE diagnosis (86.2% women, median [interquartile range] age 54.9 years [43.2–66.1], and SLE duration of 212.0 months [120.8–289.0]). Strokes (5.7%) were the most frequent CV event, followed by ischemic heart disease (3.8%) and peripheral artery disease (2.2%). Multivariate analysis identified age (odds ratio [95% confidence interval], 1.03 [1.02–1.04]), hypertension (1.71 [1.20–2.44]), smoking (1.48 [1.06–2.07]), diabetes (2.2 [1.32–3.74]), dyslipidemia (2.18 [1.54–3.09]), neurolupus (2.42 [1.56–3.75]), valvulopathy (2.44 [1.34–4.26]), serositis (1.54 [1.09–2.18]), antiphospholipid antibodies (1.57 [1.13–2.17]), low complement (1.81 [1.12–2.93]), and azathioprine (1.47 [1.04–2.07]) as risk factors for CV events. We have confirmed that SLE patients have a high prevalence of premature CV disease. Both traditional and nontraditional risk factors contribute to this higher prevalence. Although it needs to be verified with future studies, our study also shows, for the first time, an association between diabetes and CV events in SLE patients. PMID:26200625

  11. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response plans. The emergency evacuation of large commercial shopping areas, as typical service systems, is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation from a commercial shopping mall. In this paper, the event driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For the simulation of the movement routes of pedestrians, the model takes into account the purchase intentions of customers and the density of pedestrians. Based on the evacuation model of Cellular Automata with a Dynamic Floor Field and the event driven model, we can reflect the behavior characteristics of customers and clerks in both normal and emergency evacuation situations. The distribution of individual evacuation times as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model combining Cellular Automata with a Dynamic Floor Field and event driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
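
    A minimal sketch of the floor-field cellular automaton update described above, using Manhattan distance to the exit as a static floor field and random conflict resolution; the grid, rules and names are illustrative simplifications assumed for this example, not the paper's full model.

```python
import random

MOVES = ((1, 0), (-1, 0), (0, 1), (0, -1))

def ca_step(width, height, peds, exit_cell, rng):
    """One parallel update of a minimal floor-field cellular automaton.

    The static floor field is the Manhattan distance to the exit; every
    pedestrian proposes the reachable cell with the lowest field value,
    and conflicts for a target cell are resolved by drawing one random
    winner (the losers stay put).
    """
    def field(cell):
        return abs(cell[0] - exit_cell[0]) + abs(cell[1] - exit_cell[1])

    occupied = set(peds)
    proposals = {}
    for p in peds:
        x, y = p
        options = [p]  # staying put is always allowed
        for dx, dy in MOVES:
            c = (x + dx, y + dy)
            if 0 <= c[0] < width and 0 <= c[1] < height and c not in occupied:
                options.append(c)
        proposals.setdefault(min(options, key=field), []).append(p)

    new_peds = []
    for target, movers in proposals.items():
        winner = rng.randrange(len(movers))
        for idx, p in enumerate(movers):
            new_peds.append(target if idx == winner else p)
    # pedestrians that reach the exit cell leave the system
    return [p for p in new_peds if p != exit_cell]
```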

  12. The Flexible Rare Event Sampling Harness System (FRESHS)

    NASA Astrophysics Data System (ADS)

    Kratzer, Kai; Berryman, Joshua T.; Taudt, Aaron; Zeman, Johannes; Arnold, Axel

    2014-07-01

    We present the software package FRESHS (http://www.freshs.org) for parallel simulation of rare events using sampling techniques from the ‘splitting’ family of methods. Initially, Forward Flux Sampling (FFS) and Stochastic Process Rare Event Sampling (SPRES) have been implemented. These two methods together make rare event sampling available for both quasi-static and full non-equilibrium regimes. Our framework provides a plugin system for software implementing the underlying physics of the system of interest. At present, example plugins exist for our framework to steer the popular MD packages GROMACS, LAMMPS and ESPResSo, but due to the simple interface of our plugin system, it is also easy to attach other simulation software or self-written code. Use of our framework does not require recompilation of the simulation program. The modular structure allows the flexible implementation of further sampling methods or physics engines and creates a basis for objective comparison of different sampling algorithms. Our code is designed to make optimal use of available compute resources. System states are managed using standard database technology so as to allow checkpointing, scaling and flexible analysis. The communication within the framework uses plain TCP/IP networking and is therefore suited to high-performance parallel hardware as well as to distributed or even heterogeneous networks of inexpensive machines. For FFS we implemented an automatic interface placement that ensures optimal, nearly constant flux through the interfaces. We introduce ‘ghost’ (or ‘look-ahead’) runs that remedy the bottleneck which occurs when progressing to the next interface. FRESHS is open-source, providing a publicly available parallelized rare event sampling system.
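
    The Forward Flux Sampling scheme implemented in FRESHS can be sketched for a toy 1D overdamped particle: interfaces partition the path from the basin to the rare state, and the rare-event probability is the product of per-stage success fractions. The dynamics and parameters below are hypothetical illustrations, not FRESHS code.

```python
import random

def ffs_probability(interfaces, n_trials, rng,
                    drift=-1.0, noise=0.8, dt=0.01, basin=0.0):
    """Toy forward flux sampling (FFS) for a 1D overdamped particle.

    From configurations at interface i, trial runs either reach interface
    i+1 (success) or fall back into the basin (failure); the rare-event
    probability is the product of per-stage success fractions.
    """
    def propagate(x, lo, hi):
        # Euler-Maruyama steps until the trajectory leaves (lo, hi).
        while lo < x < hi:
            x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        return x

    configs = [interfaces[0]] * n_trials  # assumed crossings of interface 0
    prob = 1.0
    for i in range(len(interfaces) - 1):
        hits = 0
        for x0 in configs:
            if propagate(x0, basin, interfaces[i + 1]) >= interfaces[i + 1]:
                hits += 1
        if hits == 0:
            return 0.0
        prob *= hits / n_trials
        configs = [interfaces[i + 1]] * n_trials  # restart at new interface
    return prob
```

    In a real FFS run the stored configurations would be full system states sampled at each crossing, not a single scalar; here the 1D toy makes the staging logic visible.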

  13. LAN attack detection using Discrete Event Systems.

    PubMed

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairings static, modify the existing ARP, or require patching the operating systems of all the hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks which does not require any extra constraints such as static IP-MAC pairings or changes to ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, rendering the same DES models for both cases. To create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic to the LAN. A DES detector is then built to determine, from observed ARP-related events, whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine or spoofed by the detector. In addition, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM) and denial of service attacks. The scheme is successfully validated in a test bed. PMID:20804980
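
    The detector idea (classifying sequences of ARP-related events after active probing) can be caricatured as a tiny state machine. The event encoding below is an illustrative assumption, not the paper's DES formalism: after an ordinary ARP packet claims an IP-MAC binding, a probe is issued (implicit here) and the probe replies decide the verdict.

```python
def classify_arp_trace(events):
    """Tiny discrete event detector for ARP spoofing (illustrative only).

    `events` is a sequence of (kind, ip, mac) tuples. A single PROBE_REPLY
    whose MAC matches the binding claimed in earlier ARP traffic is
    consistent with a genuine host; a mismatching reply, or any second
    reply, indicates spoofing.
    """
    claimed = {}   # ip -> mac observed in ordinary ARP traffic
    verdict = {}
    for kind, ip, mac in events:
        if kind == "ARP":
            claimed[ip] = mac
            verdict.setdefault(ip, "pending")
        elif kind == "PROBE_REPLY":
            if ip not in claimed:
                continue
            if mac == claimed[ip] and verdict.get(ip) == "pending":
                verdict[ip] = "genuine"
            else:
                verdict[ip] = "spoofed"  # mismatch, or conflicting reply
    return verdict
```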

  14. Simulating and Forecasting Flooding Events in the City of Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Ghostine, Rabih; Viswanadhapalli, Yesubabu; Hoteit, Ibrahim

    2014-05-01

    Metropolitan cities in the Kingdom of Saudi Arabia, such as Jeddah and Riyadh, are increasingly experiencing flooding events caused by strong convective storms that produce intense precipitation over a short span of time. The flooding in the city of Jeddah in November 2009 was described by civil defense officials as the worst in 27 years. As of January 2010, 150 people were reported killed and more than 350 were missing. Another flooding event, less damaging but comparably spectacular, occurred one year later (January 2011) in Jeddah. Anticipating floods before they occur could minimize human and economic losses through the implementation of appropriate protection, provision and rescue plans. We have developed a coupled hydro-meteorological model for simulating and predicting flooding events in the city of Jeddah. We use the Weather Research and Forecasting (WRF) model, assimilating all available data in the Jeddah region, to simulate the storm events. The resulting rainfall is then used at 10-minute intervals to force an advanced numerical shallow water model that has been discretized on an unstructured grid using different numerical schemes based on finite element or finite volume techniques. The model was integrated on a high-resolution grid with cell sizes varying between 0.5 m within the streets of Jeddah and 500 m outside the city. This contribution will present the flooding simulation system and the simulation results, focusing on how the different numerical schemes affect system performance in terms of accuracy and computational efficiency.
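
    The shallow water component can be sketched with a first-order finite volume scheme. The Lax-Friedrichs flux below is a generic textbook choice taken for illustration, not necessarily one of the schemes compared in the study, and the 1D dam-break setup is hypothetical.

```python
def shallow_water_step(h, hu, dx, dt, g=9.81):
    """One Lax-Friedrichs finite volume step for the 1D shallow water
    equations in conserved variables (h, hu), with crude outflow
    boundaries. Physical flux: F(U) = (hu, hu^2/h + g h^2 / 2).
    """
    n = len(h)

    def flux(hi, hui):
        u = hui / hi if hi > 1e-12 else 0.0
        return (hui, hui * u + 0.5 * g * hi * hi)

    new_h, new_hu = h[:], hu[:]
    lam = dt / (2.0 * dx)
    for i in range(1, n - 1):
        fl = flux(h[i - 1], hu[i - 1])
        fr = flux(h[i + 1], hu[i + 1])
        new_h[i] = 0.5 * (h[i - 1] + h[i + 1]) - lam * (fr[0] - fl[0])
        new_hu[i] = 0.5 * (hu[i - 1] + hu[i + 1]) - lam * (fr[1] - fl[1])
    # crude outflow boundaries: copy the neighboring interior cell
    new_h[0], new_hu[0] = new_h[1], new_hu[1]
    new_h[-1], new_hu[-1] = new_h[-2], new_hu[-2]
    return new_h, new_hu
```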

  15. Constraints on Cumulus Parameterization from Simulations of Observed MJO Events

    NASA Technical Reports Server (NTRS)

    Del Genio, Anthony; Wu, Jingbo; Wolf, Audrey B.; Chen, Yonghua; Yao, Mao-Sung; Kim, Daehyun

    2015-01-01

    Two recent activities offer an opportunity to test general circulation model (GCM) convection and its interaction with large-scale dynamics for observed Madden-Julian oscillation (MJO) events. This study evaluates the sensitivity of the Goddard Institute for Space Studies (GISS) GCM to entrainment, rain evaporation, downdrafts, and cold pools. Single Column Model versions that restrict weakly entraining convection produce the most realistic dependence of convection depth on column water vapor (CWV) during the Atmospheric Radiation Measurement MJO Investigation Experiment at Gan Island. Differences among models are primarily at intermediate CWV where the transition from shallow to deeper convection occurs. GCM 20-day hindcasts during the Year of Tropical Convection that best capture the shallow–deep transition also produce strong MJOs, with significant predictability compared to Tropical Rainfall Measuring Mission data. The dry anomaly east of the disturbance on hindcast day 1 is a good predictor of MJO onset and evolution. Initial CWV there is near the shallow–deep transition point, implicating premature onset of deep convection as a predictor of a poor MJO simulation. Convection weakly moistens the dry region in good MJO simulations in the first week; weakening of large-scale subsidence over this time may also affect MJO onset. Longwave radiation anomalies are weakest in the worst model version, consistent with previous analyses of cloud/moisture greenhouse enhancement as the primary MJO energy source. The authors’ results suggest that both cloud-/moisture-radiative interactions and convection–moisture sensitivity are required to produce a successful MJO simulation.

  16. Simulated cold events in the northern North Atlantic during the last millennium

    NASA Astrophysics Data System (ADS)

    Moreno-Chamarro, Eduardo; Zanchettin, Davide; Lohmann, Katja; Jungclaus, Johann

    2014-05-01

    Paleoceanographic data show large inter-decadal cold excursions in sea-surface temperatures (SSTs) in the western subpolar gyre region and north of Iceland throughout the last millennium. A series of such events could have contributed to the demise of the Norse settlements on Greenland during the 13th to the 15th century, owing to the associated deterioration of environmental conditions in the region. However, the spatial extent, attribution and mechanism(s) of these cold events are not known. In this contribution, we use climate model simulations to clarify the role of the ocean and of coupled ocean-atmosphere dynamics in triggering these cold events, and to assess whether they can be explained by internal climate variability alone. Specifically, we investigate North Atlantic-Arctic climate variability in a 1000-year control run describing an unperturbed pre-industrial climate, and in a 3-member ensemble of full-forcing transient simulations of the last millennium. Simulations are performed with the Max Planck Institute Earth System Model for paleo-applications. In the control and transient simulations, we identified cold events of similar amplitude and duration to the reconstructed data. Spatial patterns and temporal evolutions of simulated cold events are similar in both simulation types. In the transient runs, furthermore, they do not robustly coincide with periods of strong external forcing (e.g., major volcanic eruptions). We therefore conclude that such events can emerge from internally generated regional climate variability alone. Local ocean-atmosphere coupled processes in the North Atlantic subpolar gyre region appear to be a key part of the mechanism of simulated cold events. In particular, they are typically associated with the onset of prolonged positive sea-level pressure anomalies over the North Atlantic and an associated weaker and south-eastward displaced subpolar gyre.
The salt transport reduction by the Irminger Current together with an intensification of the

  17. Simulation of a Storm Surge Event at the North Sea (Germany) Using a Fully Coupled Approach

    NASA Astrophysics Data System (ADS)

    Yang, J.; Graf, T.

    2012-04-01

    Tidal fluctuation and storm surge events lead to saltwater intrusion into coastal aquifers. Tidal fluctuation causes dynamic conditions along the seaside boundary, where submerged zones are of Dirichlet type and aerial zones are of Neumann type. In a storm surge event, saltwater flows over the land surface towards the inland and covers parts of the land surface. The saltwater eventually infiltrates the unsaturated soil and percolates downwards towards the groundwater table. To simulate this dynamic coastal flow system, a fully integrated approach based on the numerical model "HydroGeoSphere" is being developed, in which the coastal zone is treated as a hydraulically coupled surface-subsurface system. The new approach allows simulation of: (i) surface flow, (ii) variably saturated, density-dependent groundwater flow, (iii) salt transport in the surface and the subsurface, and (iv) water and salt exchange between surface and subsurface. In the new approach, tide and storm surge events induce a time-variant head that is applied to nodes of the surface domain, so tidal and storm surge forcing enters the system through the surface domain. The hydraulic interaction between the surface and subsurface domains simplifies the flow and transport boundary conditions caused by tidal fluctuation and storm surge events. This newly proposed approach is the first conceptual model of a fully coupled surface-subsurface coastal flow domain. It allows simulation of tidal activity and storm surges at a level of complexity not previously possible.

  18. Coupling expert systems and simulation

    NASA Technical Reports Server (NTRS)

    Kawamura, K.; Beale, G.; Padalkar, S.; Rodriguez-Moscoso, J.; Hsieh, B. J.; Vinz, F.; Fernandez, K. R.

    1988-01-01

    A prototype coupled system called NESS (NASA Expert Simulation System) is described. NESS assists the user in running digital simulations of dynamic systems, interprets the output data to performance specifications, and recommends a suitable series compensator to be added to the simulation model.

  19. Simulation of rare events in quantum error correction

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey; Vargo, Alexander

    2013-12-01

    We consider the problem of calculating the logical error probability for a stabilizer quantum code subject to random Pauli errors. To access the regime of large code distances where logical errors are extremely unlikely we adopt the splitting method widely used in Monte Carlo simulations of rare events and Bennett's acceptance ratio method for estimating the free energy difference between two canonical ensembles. To illustrate the power of these methods in the context of error correction, we calculate the logical error probability P_L for the two-dimensional surface code on a square lattice with a pair of holes for all code distances d≤20 and all error rates p below the fault-tolerance threshold. Our numerical results confirm the expected exponential decay P_L ∼ exp[-α(p)d] and provide a simple fitting formula for the decay rate α(p). Both noiseless and noisy syndrome readout circuits are considered.
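
    The expected exponential decay of the logical error probability with code distance can be illustrated with a plain Monte Carlo estimate for a distance-d repetition code under i.i.d. bit flips, standing in here for the far more involved surface code, followed by a log-linear fit of the decay rate:

```python
import math
import random

def repetition_logical_error(d, p, trials, rng):
    """Monte Carlo logical error rate of a distance-d repetition code
    under i.i.d. bit flips with majority-vote decoding."""
    fails = 0
    for _ in range(trials):
        flips = sum(1 for _ in range(d) if rng.random() < p)
        if flips > d // 2:
            fails += 1
    return fails / trials

def decay_rate(p, distances, trials, seed=1):
    """Fit P_L(d) ~ exp(-alpha * d) by least squares on log P_L."""
    rng = random.Random(seed)
    xs, ys = [], []
    for d in distances:
        pl = repetition_logical_error(d, p, trials, rng)
        if pl > 0.0:                      # skip distances with no failures
            xs.append(d)
            ys.append(math.log(pl))
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope
```

    For error rates well below threshold (here 1/2 for the repetition code) the fitted rate is positive, reproducing the qualitative exponential suppression; direct sampling becomes infeasible at large d, which is exactly where the splitting method of the record takes over.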

  20. Transportation Analysis Simulation System

    SciTech Connect

    2004-08-23

    TRANSIMS version 3.1 is an integrated set of analytical and simulation models and supporting databases. The system is designed to create a virtual metropolitan region with representation of each of the region’s individuals, their activities and the transportation infrastructure they use. TRANSIMS puts into practice a new, disaggregate approach to travel demand modeling using agent-based micro-simulation technology. The TRANSIMS methodology creates a virtual metropolitan region with representation of the transportation infrastructure and the population, at the level of households and individual travelers. Trips are planned to satisfy the population’s activity patterns at the individual traveler level. TRANSIMS then simulates the movement of travelers and vehicles across the transportation network using multiple modes, including car, transit, bike and walk, on a second-by-second basis. Metropolitan planners must plan growth of their cities according to the stringent transportation system planning requirements of the Intermodal Surface Transportation Efficiency Act of 1991, the Clean Air Act Amendments of 1990 and other similar laws and regulations. These require each state and its metropolitan regions to work together to develop short- and long-term transportation improvement plans. The plans must (1) estimate the future transportation needs for travelers and goods movements, (2) evaluate ways to manage and reduce congestion, (3) examine the effectiveness of building new roads and transit systems, and (4) limit the environmental impact of the various strategies. Such consistent and accurate transportation improvement plans require an analytical capability that properly accounts for travel demand, human behavior, traffic and transit operations, major investments, and environmental effects. Other existing planning tools use aggregated information and representative behavior to predict average response and average use of transportation facilities. They do not account

  2. WCEDS: A waveform correlation event detection system

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.; Astiz, L.; Shearer, P.M.

    1995-08-01

    We have developed a working prototype of a grid-based global event detection system based on waveform correlation. The algorithm comes from a long-period detector, but we have recast it in a full matrix formulation which can reduce the number of multiplications needed by better than two orders of magnitude for realistic monitoring scenarios. The reduction is made possible by eliminating redundant multiplications in the original formulation. All unique correlations for a given origin time are stored in a correlation matrix (C) which is formed by a full matrix product of a Master Image matrix (M) and a data matrix (D). The detector value at each grid point is calculated by following a different summation path through the correlation matrix. Master Images can be derived either empirically or synthetically. Our testing has used synthetic Master Images because their influence on the detector is easier to understand. We tested the system using the matrix formulation with continuous data from the IRIS (Incorporated Research Institutions for Seismology) broadband global network to monitor a 2-degree evenly spaced surface grid with a time discretization of 1 sps; we successfully detected the largest event in a two-hour segment from October 1993. The output at the correct grid point was at least 33% larger than at adjacent grid points, and the output at the correct grid point at the correct origin time was more than 500% larger than the output at the same grid point immediately before or after. Analysis of the C matrix for the origin time of the event demonstrates that there are many significant "false" correlations of observed phases with incorrect predicted phases. These false correlations dull the sensitivity of the detector and so must be dealt with if our system is to attain detection thresholds consistent with a Comprehensive Test Ban Treaty (CTBT).
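
    The matrix formulation can be sketched as follows: all unique phase/lag correlations are computed once as C = M·D, and each candidate grid point is then scored by summing along its own path through C (the lags at which its phases would arrive). The tiny synthetic templates and paths below are illustrative, not the prototype's actual Master Images.

```python
def matmul(a, b):
    """Plain matrix product C = A · B on nested lists."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def detect(master, data, paths):
    """Toy grid-based waveform correlation detector.

    `master` holds one template per predicted phase (rows), `data` is
    arranged so that each column is a candidate lag window; C = M · D then
    contains every unique phase/lag correlation exactly once. Each grid
    point is scored by summing C along its own (phase, lag) path.
    """
    c = matmul(master, data)
    return [sum(c[phase][lag] for phase, lag in path) for path in paths]
```

    Because C is shared by all grid points, each correlation is computed once rather than once per grid point, which is the source of the multiplication savings described above.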

  3. Rare-event simulation methods for equilibrium and non-equilibrium events

    NASA Astrophysics Data System (ADS)

    Ziff, Robert

    2014-03-01

    Rare events are those that occur with a very low probability in experiment, or are common but difficult to sample using standard computer simulation techniques. Such processes require advanced methods in order to obtain useful results in reasonable amounts of computer time. We discuss some of those techniques here, including the "barrier" method, splitting methods, and a Forward-Flux Sampling in Time (FFST) algorithm, and apply them to measure the nucleation times of the first-order transition in the Ziff-Gulari-Barshad model of surface catalysis, including nucleation in finite equilibrium states, which are measured to occur with probabilities as low as 10^(-40). We also study the transitions in the Maier-Stein model of chemical kinetics, and use the methods to find the harmonic measure in percolation and Diffusion-Limited Aggregation (DLA) clusters. Co-authors: David Adams (Google) and Leonard Sander (University of Michigan).

  4. Production of Nitrogen Oxides by Laboratory Simulated Transient Luminous Events

    NASA Astrophysics Data System (ADS)

    Peterson, H.; Bailey, M.; Hallett, J.; Beasley, W.

    2007-12-01

    Restoration of the polar stratospheric ozone layer has occurred at rates below those originally expected following reductions in chlorofluorocarbon (CFC) usage. Additional reactions affecting ozone depletion must now also be considered. This research examines nitrogen oxides (NOx) produced in the middle atmosphere by transient luminous events (TLEs), with NOx production in this layer contributing to the loss of stratospheric ozone. In particular, NOx produced by sprites in the mesosphere would be transported to the polar stratosphere via the global meridional circulation and downward diffusion. A pressure-controlled vacuum chamber was used to simulate middle atmosphere pressures, while a power supply and in-chamber electrodes were used to simulate TLEs in the pressure-controlled environment. Chemiluminescence NOx analyzers were used to sample NOx produced by the chamber discharges: originally a Monitor Labs Model 8440E, and later a Thermo Environment Model 42. Total NOx production for each discharge, as well as NOx per ampere of current and NOx per Joule of discharge energy, were plotted. Absolute NOx production was greatest for discharge environments with upper tropospheric pressures (100-380 torr), while NOx/J was greatest for discharge environments with stratospheric pressures (around 10 torr). The different production efficiencies in NOx/J as a function of pressure pointed to three different production regimes, each with its own reaction mechanisms: one for tropospheric pressures, one for stratospheric pressures, and one for upper stratospheric to mesospheric pressures (no greater than 1 torr).

  5. Simulating neural systems with Xyce.

    SciTech Connect

    Schiek, Richard Louis; Thornquist, Heidi K.; Mei, Ting; Warrender, Christina E.; Aimone, James Bradley; Teeter, Corinne; Duda, Alex M.

    2012-12-01

    Sandia's parallel circuit simulator, Xyce, can address large-scale neuron simulations in a new way, extending the range within which one can perform high-fidelity, multi-compartment neuron simulations. This report documents the implementation of neuron devices in Xyce, and their use in the simulation and analysis of neuron systems.

  6. Cellular Dynamic Simulator: An Event Driven Molecular Simulation Environment for Cellular Physiology

    PubMed Central

    Byrne, Michael J.; Waxham, M. Neal; Kubota, Yoshihisa

    2010-01-01

    In this paper, we present the Cellular Dynamic Simulator (CDS) for simulating diffusion and chemical reactions within crowded molecular environments. CDS is based on a novel event driven algorithm specifically designed for precise calculation of the timing of collisions, reactions and other events for each individual molecule in the environment. Generic mesh-based compartments allow the creation/importation of very simple or detailed cellular structures that exist in a 3D environment. Multiple levels of compartments and static obstacles can be used to create a dense environment to mimic cellular boundaries and the intracellular space. The CDS algorithm takes into account volume exclusion and molecular crowding that may impact signaling cascades in small sub-cellular compartments such as dendritic spines. With the CDS, we can simulate simple enzyme reactions, aggregation, and channel transport, as well as highly complicated chemical reaction networks of both freely diffusing and membrane-bound multi-protein complexes. Components of the CDS are generally defined such that the simulator can be applied to a wide range of environments in terms of scale and level of detail. Through an initialization GUI, a simple simulation environment can be created and populated within minutes, yet is powerful enough to design complex 3D cellular architecture. The initialization tool allows visual confirmation of the environment construction prior to execution by the simulator. This paper describes the CDS algorithm and design implementation, and provides an overview of the types of features available; the utility of those features is highlighted in demonstrations. PMID:20361275
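
    Central to such event-driven molecular simulators is computing the exact time of the next collision rather than stepping on a fixed time grid. A self-contained sketch for two ballistically moving spheres follows; the function is a generic illustration of the idea (solving a quadratic in time), not CDS's API.

```python
import math

def collision_time(p1, v1, p2, v2, radius_sum):
    """Earliest time t >= 0 at which two ballistically moving spheres touch.

    Solves |(p1 - p2) + t (v1 - v2)|^2 = radius_sum^2 for the smallest
    non-negative root; returns None if the spheres never collide.
    """
    dp = [a - b for a, b in zip(p1, p2)]
    dv = [a - b for a, b in zip(v1, v2)]
    a = sum(x * x for x in dv)
    b = 2.0 * sum(x * y for x, y in zip(dp, dv))
    c = sum(x * x for x in dp) - radius_sum ** 2
    if a == 0.0:
        return None  # no relative motion
    disc = b * b - 4.0 * a * c
    if disc < 0.0 or b >= 0.0:
        return None  # trajectories miss, or spheres are moving apart
    return (-b - math.sqrt(disc)) / (2.0 * a)
```

    An event-driven scheduler would insert this time into a priority queue and jump straight to it, which is what makes precise collision timing affordable in crowded environments.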

  7. Numerical simulation of the October 2002 dust event in Australia

    NASA Astrophysics Data System (ADS)

    Shao, Yaping; Leys, John F.; McTainsh, Grant H.; Tews, Kenn

    2007-04-01

    In comparison to the major dust sources in the Northern Hemisphere, Australia is a relatively minor contributor to the global dust budget. However, severe dust storms do occur in Australia, especially in drought years. In this study, we simulate the 22-23 October 2002 dust storm, probably the most severe in Australia in at least the past 40 years, using an integrated dust model. The model results are compared with synoptic visibility data and satellite images and, for several stations, with high-volume sampler measurements. The model simulations are then used to estimate dust load, emission, and deposition, both over the continent and over the ocean. The main dust sources and sinks are identified. Dust sources include the desert areas in northern South Australia, the grazing lands in western New South Wales (NSW), and the farm lands in NSW, Victoria, and Western Australia, as well as areas in Queensland and the Northern Territory. The desert areas appear to be the strongest source. The maximum dust emission is around 2000 μg m-2 s-1, and the maximum net dust emission is around 500 μg m-2 s-1. The total amount of dust eroded from the Australian continent during this dust event is around 95.8 Mt, of which 93.67 Mt is deposited on the continent and 2.13 Mt in the ocean. The maximum total dust load over the simulation domain is around 5 Mt. The magnitude of this Australian dust storm corresponds to a northeast Asian dust storm of moderate size.

  8. Features, Events, and Processes: System Level

    SciTech Connect

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  9. Aging and brain rejuvenation as systemic events

    PubMed Central

    Bouchard, Jill; Villeda, Saul A

    2015-01-01

    The effects of aging were traditionally thought to be immutable, particularly evident in the loss of plasticity and cognitive abilities occurring in the aged central nervous system (CNS). However, it is becoming increasingly apparent that extrinsic systemic manipulations such as exercise, caloric restriction, and changing blood composition by heterochronic parabiosis or young plasma administration can partially counteract this age-related loss of plasticity in the aged brain. In this review, we discuss the process of aging and rejuvenation as systemic events. We summarize genetic studies that demonstrate a surprising level of malleability in organismal lifespan, and highlight the potential for systemic manipulations to functionally reverse the effects of aging in the CNS. Based on mounting evidence, we propose that rejuvenating effects of systemic manipulations are mediated, in part, by blood-borne ‘pro-youthful’ factors. Thus, systemic manipulations promoting a younger blood composition provide effective strategies to rejuvenate the aged brain. As a consequence, we can now consider reactivating latent plasticity dormant in the aged CNS as a means to rejuvenate regenerative, synaptic, and cognitive functions late in life, with potential implications even for extending lifespan. PMID:25327899

  10. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    SciTech Connect

    Jaros, W.

    2005-08-30

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, ''Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation'' (BSC 2005 [DIRS 174995]), ''Clad Degradation--FEPs Screening Arguments (BSC 2004 [DIRS 170019]), and Waste-Form Features, Events, and Processes'' (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes have resulted due to a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project (YMP) administrative procedures as they

  11. Bioaccumulation and Aquatic System Simulator

    EPA Science Inventory

    BASS (Bioaccumulation and Aquatic System Simulator) is a Fortran 95 simulation program that predicts the population and bioaccumulation dynamics of age-structured fish assemblages that are exposed to hydrophobic organic pollutants and class B and bord...

  12. Hydrogen Event Containment Response Code System.

    Energy Science and Technology Software Center (ESTSC)

    1999-11-23

    Version: 00 Distribution is restricted to the United States Only. HECTR1.5 (Hydrogen Event-Containment Transient Response) is a lumped-volume containment analysis program that is most useful for performing parametric studies. Its main purpose is to analyze nuclear reactor accidents involving the transport and combustion of hydrogen, but HECTR can also function as an experiment analysis tool and can solve a limited set of other containment problems. Six gases (steam, nitrogen, oxygen, hydrogen, carbon monoxide, and carbon dioxide) are modeled, along with sumps containing liquid water. HECTR can model virtually all the containment systems of importance in ice condenser, large dry, and Mark III containments. A postprocessor, ACHILES1.5, is included. It processes the time-dependent variable output (compartment pressures, flow junction velocities, surface temperatures, etc.) produced by HECTR. ACHILES can produce tables and graphs of these data.

  13. Theorising interventions as events in systems.

    PubMed

    Hawe, Penelope; Shiell, Alan; Riley, Therese

    2009-06-01

    Conventional thinking about preventive interventions focuses over-simplistically on the "package" of activities and/or their educational messages. An alternative is to focus on the dynamic properties of the context into which the intervention is introduced. Schools, communities and worksites can be thought of as complex ecological systems. They can be theorised on three dimensions: (1) their constituent activity settings (e.g., clubs, festivals, assemblies, classrooms); (2) the social networks that connect the people and the settings; and (3) time. An intervention may then be seen as a critical event in the history of a system, leading to the evolution of new structures of interaction and new shared meanings. Interventions impact on evolving networks of person-time-place interaction, changing relationships, displacing existing activities and redistributing and transforming resources. This alternative view has significant implications for how interventions should be evaluated and how they could be made more effective. We explore this idea, drawing on social network analysis and complex systems theory. PMID:19390961

  14. Discrete Event Supervisory Control Applied to Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Shah, Neerav

    2005-01-01

    The theory of discrete event supervisory (DES) control was applied to the optimal control of a twin-engine aircraft propulsion system and demonstrated in a simulation. The supervisory control, which is implemented as a finite-state automaton, oversees the behavior of a system and manages it in such a way that it maximizes a performance criterion, similar to a traditional optimal control problem. DES controllers can be nested such that a high-level controller supervises multiple lower-level controllers. This structure can be expanded to control huge, complex systems, providing optimal performance and increasing autonomy with each additional level. The DES control strategy for propulsion systems was validated using a distributed testbed consisting of multiple computers--each representing a module of the overall propulsion system--to simulate real-time hardware-in-the-loop testing. In the first experiment, DES control was applied to the operation of a nonlinear simulation of a turbofan engine (running in closed loop using its own feedback controller) to minimize engine structural damage caused by a combination of thermal and structural loads. This enables increased on-wing time for the engine through better management of the engine-component life usage. Thus, the engine-level DES acts as a life-extending controller through its interaction with and manipulation of the engine's operation.
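
    The supervisory structure described above, a finite-state automaton that observes plant events and disables controllable events to enforce a policy, can be sketched minimally as follows. The plant model, event names, and the toy "repair before restart" policy are invented for illustration and are not taken from the NASA testbed.

```python
# Toy plant automaton: state -> {event: next_state}. The model and the
# policy below are hypothetical illustrations, not the turbofan model.
PLANT = {
    "idle":     {"start": "running"},
    "running":  {"stop": "idle", "fault": "degraded"},
    "degraded": {"start": "running", "repair": "idle"},
}
CONTROLLABLE = {"start", "stop", "repair"}   # "fault" can never be disabled

def supervisor(state):
    """Enabled events in `state`. A supervisor may disable only
    controllable events. Policy: after a fault, require repair first."""
    disabled = {"start"} if state == "degraded" else set()
    assert disabled <= CONTROLLABLE          # never disable uncontrollables
    return {e for e in PLANT.get(state, {}) if e not in disabled}

def run(events, state="idle"):
    """Drive the plant; events not enabled by the supervisor are ignored."""
    trace = []
    for e in events:
        if e in supervisor(state):
            state = PLANT[state][e]
            trace.append(e)
    return state, trace
```

    For example, `run(["start", "fault", "start", "repair", "start"])` suppresses the premature restart after the fault and ends in `"running"` only after the repair.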

  15. Sensitivity of a Simulated Derecho Event to Model Initial Conditions

    NASA Astrophysics Data System (ADS)

    Wang, Wei

    2014-05-01

    Since 2003, the MMM division at NCAR has been experimenting with cloud-permitting-scale weather forecasting using the Weather Research and Forecasting (WRF) model. Over the years, we have tested different model physics and tried different initial and boundary conditions. Not surprisingly, we found that the model's forecasts are more sensitive to the initial conditions than to the model physics. In the 2012 real-time experiment, WRF-DART (Data Assimilation Research Testbed) at 15 km was employed to produce initial conditions for a twice-a-day forecast at 3 km. On June 29, this forecast system captured one of the most destructive derecho events on record. In this presentation, we will examine forecast sensitivity to different model initial conditions and try to understand the important features that may contribute to the success of the forecast.

  16. Small-World Synchronized Computing Networks for Scalable Parallel Discrete-Event Simulations

    NASA Astrophysics Data System (ADS)

    Guclu, Hasan; Korniss, Gyorgy; Toroczkai, Zoltan; Novotny, Mark A.

    We study the scalability of parallel discrete-event simulations for arbitrary short-range interacting systems with asynchronous dynamics. When the synchronization topology mimics that of the short-range interacting underlying system, the virtual time horizon (corresponding to the progress of the processing elements) exhibits Kardar-Parisi-Zhang-like kinetic roughening. Although the virtual times, on average, progress at a nonzero rate, their statistical spread diverges with the number of processing elements, hindering efficient data collection. We show that when the synchronization topology is extended to include quenched random communication links between the processing elements, they make a close-to-uniform progress with a nonzero rate, without global synchronization. We discuss in detail a coarse-grained description for the small-world synchronized virtual time horizon and compare the findings to those obtained by simulating the simulations based on the exact algorithmic rules.
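
    The conservative update rule underlying this scalability analysis can be written as a toy model: each processing element advances its local virtual time only when it is not ahead of its neighbors on a ring, optionally augmented with one quenched random "small-world" link per site. The sketch below is an illustrative reconstruction, not the authors' code; exponential local time increments are an assumption.

```python
import random

def simulate_horizon(n=100, steps=20000, small_world=True, seed=1):
    """Toy conservative PDES: site i advances its virtual time tau[i]
    by an exponential increment only if it is not ahead of its
    neighbors (ring neighbors plus, optionally, one quenched random
    link). Returns (mean progress, variance of the time horizon)."""
    rng = random.Random(seed)
    tau = [0.0] * n
    links = [rng.randrange(n) for _ in range(n)]   # quenched random links
    for _ in range(steps):
        i = rng.randrange(n)
        nbrs = [(i - 1) % n, (i + 1) % n]
        if small_world:
            nbrs.append(links[i])
        if all(tau[i] <= tau[j] for j in nbrs):    # conservative rule
            tau[i] += rng.expovariate(1.0)
    mean = sum(tau) / n
    spread2 = sum((t - mean) ** 2 for t in tau) / n
    return mean, spread2
```

    Comparing the returned spread with and without the random links illustrates the paper's point that the small-world synchronization suppresses the divergence of the virtual time horizon's width.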

  17. Numerical simulation diagnostics of a flash flood event in Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Samman, Ahmad

    On 26 January 2011, a severe storm hit the city of Jeddah, the second largest city in the Kingdom of Saudi Arabia. The storm resulted in heavy rainfall, which produced a flash flood in a short period of time. This event caused at least eleven fatalities and more than 114 injuries. Unfortunately, the observed rainfall data are limited to the weather station at King Abdul Aziz International Airport, which is north of the city, while the most extreme precipitation occurred over the southern part of the city. This observation was useful for comparing simulation results even though it does not reflect the severity of the event. The Regional Atmospheric Modeling System (RAMS), developed at Colorado State University, was used to study this storm event. RAMS simulations indicated that a quasi-stationary mesoscale convective system developed over the city of Jeddah and lasted for several hours, and was the source of the huge amount of rainfall. The model computed a total rainfall of more than 110 mm in the southern part of the city, where the flash flood occurred. This precipitation estimate was confirmed by observations from the weather radar. While the annual rainfall in Jeddah during the winter varies from 50 to 100 mm, the amount of rainfall resulting from this storm event exceeded the climatological total annual rainfall. The simulation of this event showed that warm sea surface temperature, combined with high humidity in the lower atmosphere and a large amount of convective available potential energy (CAPE), provided a favorable environment for convection. It also showed the presence of a cyclonic system over the north and eastern parts of the Mediterranean Sea, and a subtropical anti-cyclone over northeastern Africa that contributed to cold air advection, bringing cold air to the Jeddah area. In addition, an anti-cyclone (blocking) centered over the east and southeastern parts of the Arabian Peninsula and the Arabian Sea produced a low level jet over the southern

  18. Application of Parallel Discrete Event Simulation to the Space Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jefferson, D.; Leek, J.

    2010-09-01

    In this paper we describe how and why we chose parallel discrete event simulation (PDES) as the paradigm for modeling the Space Surveillance Network (SSN) in our modeling framework, TESSA (Testbed Environment for Space Situational Awareness). DES is a simulation paradigm appropriate for systems dominated by discontinuous state changes at times that must be calculated dynamically. It is used primarily for complex man-made systems like telecommunications, vehicular traffic, computer networks, and economic models, although it is also useful for natural systems that are not described by equations, such as particle systems, population dynamics, epidemics, and combat models. It is much less well known than simple time-stepped simulation methods, but has the great advantage of being time-scale independent, so that one can freely mix processes that operate at time scales over many orders of magnitude with no runtime performance penalty. In simulating the SSN we model in some detail: (a) the orbital dynamics of up to 10^5 objects, (b) their reflective properties, (c) the ground- and space-based sensor systems in the SSN, (d) the recognition of orbiting objects and determination of their orbits, (e) the cueing and scheduling of sensor observations, (f) the 3-d structure of satellites, and (g) the generation of collision debris. TESSA is thus a mixed continuous-discrete model. But because many different types of discrete objects are involved with such a wide variation in time scale (milliseconds for collisions, hours for orbital periods) it is suitably described using discrete events. The PDES paradigm is surprising and unusual. In any instantaneous runtime snapshot some parts may be far ahead in simulation time while others lag behind, yet the required causal relationships are always maintained and synchronized correctly, exactly as if the simulation were executed sequentially. The TESSA simulator is custom-built, conservatively synchronized, and designed to scale to

  19. Block Oriented Simulation System (BOSS)

    NASA Technical Reports Server (NTRS)

    Ratcliffe, Jaimie

    1988-01-01

    Computer simulation is assuming greater importance as a flexible and expedient approach to modeling system and subsystem behavior. Simulation has played a key role in the growth of complex, multiple access space communications such as those used by the space shuttle and the TRW-built Tracking and Data Relay Satellites (TDRS). A powerful new simulator for use in designing and modeling the communication system of NASA's planned Space Station is being developed. Progress to date on the Block (Diagram) Oriented Simulation System (BOSS) is described.

  20. Examining Passenger Flow Choke Points at Airports Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Brown, Jeremy R.; Madhavan, Poomima

    2011-01-01

    The movement of passengers through an airport quickly, safely, and efficiently is the main function of the various checkpoints (check-in, security, etc.) found in airports. Human error combined with other breakdowns in the complex system of the airport can disrupt passenger flow through the airport, leading to lengthy waiting times, missing luggage, and missed flights. In this paper we present a model of passenger flow through an airport using discrete event simulation that provides a closer look into the possible reasons for breakdowns and their implications for passenger flow. The simulation is based on data collected at Norfolk International Airport (ORF). The primary goal of this simulation is to present ways to optimize the work force to keep passenger flow smooth even during peak travel times and to support emergency preparedness at ORF in case of adverse events. In this simulation we ran three different scenarios: real world, increased check-in stations, and multiple waiting lines. Increased check-in stations increased waiting time and instantaneous utilization, while multiple waiting lines decreased both the waiting time and instantaneous utilization. This simulation was able to show how different changes affected passenger flow through the airport.
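
    The kind of checkpoint model described here can be sketched as a single queue feeding several identical stations, using a priority queue of station-free times. The arrival and service rates below are placeholder assumptions, not the ORF data from the study.

```python
import heapq
import random

def simulate_checkpoint(n_pax=500, n_stations=3, arrival_rate=1.0,
                        service_rate=0.4, seed=42):
    """Single queue feeding n_stations identical stations; Poisson
    arrivals, exponential service. Returns the mean passenger wait.
    All rates are illustrative, not calibrated to real airport data."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_pax):
        t += rng.expovariate(arrival_rate)     # Poisson arrival process
        arrivals.append(t)
    free_at = [0.0] * n_stations               # min-heap of station-free times
    total_wait = 0.0
    for arr in arrivals:
        free = heapq.heappop(free_at)          # earliest-available station
        start = max(arr, free)
        total_wait += start - arr
        heapq.heappush(free_at, start + rng.expovariate(service_rate))
    return total_wait / n_pax
```

    Rerunning with different `n_stations` or queue disciplines is the simulation-side analogue of the scenario comparisons reported in the paper.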

  1. Using WIRED to study Simulated Linear Collider Detector Events

    SciTech Connect

    George, A

    2004-02-05

    The purpose of this project is to enhance the properties of the LCD WIRED Event Display. By extending the functionality of the display, physicists will be able to view events in more detail and interpret data faster. Poor characteristics associated with WIRED can severely affect the way we understand events, but by bringing attention to specific attributes we open doors to new ideas. Events displayed inside the LCD have many different properties; this is why scientists need to be able to distinguish data using a plethora of symbols and other graphics. This paper explains how we can view events differently using clustering and how we display results with track finding. Different source codes extracted from HEP libraries will be analyzed and tested to see which codes display the information needed. Through these changes, certain aspects of WIRED will be recognized more often, allowing for better event displays and, in turn, better physics results.

  2. System time-domain simulation

    NASA Technical Reports Server (NTRS)

    Dawson, C. T.; Eggleston, T. W.; Goris, A. C.; Fashano, M.; Paynter, D.; Tranter, W. H.

    1980-01-01

    Complex systems are simulated by engineers without extensive computer experience. Analyst uses free-form engineering-oriented language to input "black box" description. System Time Domain (SYSTID) Simulation Program generates appropriate algorithms and proceeds with simulation. Program is easily linked to postprocessing routines. SYSTID program is written in FORTRAN IV for batch execution and has been implemented on UNIVAC 1110 under control of EXEC 8, Level 31.

  3. Simulating The SSF Information System

    NASA Technical Reports Server (NTRS)

    Deshpande, Govind K.; Kleine, Henry; Younger, Joseph C.; Sanders, Felicia A.; Smith, Jeffrey L.; Aster, Robert W.; Olivieri, Jerry M.; Paul, Lori L.

    1993-01-01

    Freedom Operations Simulation Test (FROST) computer program simulates operation of SSF information system, tracking every packet of data from generation to destination, for both uplinks and downlinks. Collects various statistics concerning operation of system and provides reports of statistics at intervals specified by user. FROST also incorporates graphical-display capability to enhance interpretation of these statistics. Written in SIMSCRIPT II.5.

  4. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Licensee event report system. 50.73 Section 50.73 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF PRODUCTION AND UTILIZATION FACILITIES Inspections, Records, Reports, Notifications § 50.73 Licensee event report system. (a) Reportable events.(1) The holder of an operating license under this...

  5. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    SciTech Connect

    na

    2005-05-30

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during a volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objective of this analysis was to develop the BDCFs for the

  6. Regional Climate Simulation of the Anomalous Events of 1998 using a Stretched-Grid GCM with Multiple Areas of Interest

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, M. S.; Takacs, L. L.; Govindaraju, R. C.; Atlas, Robert (Technical Monitor)

    2002-01-01

    The GEOS (Goddard Earth Observing System) stretched-grid (SG) GCM, developed and thoroughly tested over the last few years, is used for simulating the major anomalous regional climate events of 1998. The anomalous regional climate events are simulated simultaneously during the 13-month-long (November 1997 - December 1998) SG-GCM simulation by using the new SG design with multiple (four) areas of interest. The following areas/regions of interest (one in each global quadrant) are implemented: U.S./Northern Mexico, the El Nino/Brazil area, India-China, and Eastern Indian Ocean/Australia.

  7. Decision support system for managing oil spill events.

    PubMed

    Keramitsoglou, Iphigenia; Cartalis, Constantinos; Kassomenos, Pavlos

    2003-08-01

    The Mediterranean environment is exposed to various hazards, including oil spills, forest fires, and floods, making the development of a decision support system (DSS) for emergency management an objective of utmost importance. The present work presents a complete DSS for managing marine pollution events caused by oil spills. The system provides all the necessary tools for early detection of oil spills from satellite images, monitoring of their evolution, estimation of the accident consequences, and provision of support to responsible public authorities during clean-up operations. The heart of the system is an image processing-geographic information system, along with individual assistant software tools that perform oil spill evolution simulation and all other necessary numerical calculations as well as cartographic and reporting tasks related to the management of a specific oil spill event. The cartographic information is derived from general maps representing detailed information concerning several regional environmental and land-cover characteristics as well as the financial activities of the application area. Early notification of the authorities with up-to-date, accurate information on the position and evolution of the oil spill, combined with the detailed coastal maps, is of paramount importance for emergency assessment and effective clean-up operations that would prevent environmental hazard. An application was developed for the Region of Crete, an area particularly vulnerable to oil spills due to its location, ecological characteristics, and local economic activities. PMID:14753653

  8. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
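
    The kind of time-explicit schedule modeling described here can be illustrated with a toy Monte Carlo over a sequential development schedule with uncertain slips. The task durations, slip probability, and slip factor below are invented for illustration and are not taken from the F6 tool.

```python
import random

def schedule_monte_carlo(tasks, p_slip=0.2, slip_factor=1.5,
                         n_runs=2000, seed=7):
    """Toy schedule model: tasks run in sequence; each duration is
    multiplied by slip_factor with probability p_slip. Returns the
    mean completion time over n_runs. All numbers are illustrative."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        t = 0.0
        for dur in tasks:
            if rng.random() < p_slip:   # schedule disruption event
                dur *= slip_factor
            t += dur
        total += t
    return total / n_runs
```

    In a trade study, the delivery-time distribution produced this way would be combined with an operational value model to compare candidate designs.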

  9. Event Plane Resolution Simulations for The Fast Interaction Trigger Detector of ALICE at the LHC

    NASA Astrophysics Data System (ADS)

    Sulaimon, Isiaka; Harton, Austin; Garcia, Edmundo; Alice-Fit Collaboration

    2016-03-01

    CERN (the European Center for Nuclear Research) is a global laboratory that studies proton and heavy ion collisions at the Large Hadron Collider (LHC). ALICE (A Large Ion Collider Experiment) is one of the four large experiments of the LHC. ALICE is dedicated to the study of the transition of matter to the Quark Gluon Plasma in heavy ion collisions. In the present ALICE detector there are two sub-detectors (the T0 and V0) that provide the minimum bias trigger, multiplicity trigger, beam-gas event rejection, collision time for other sub-detectors, online multiplicity, and event plane determination. In order to adapt these functionalities to the collision rates expected for the LHC upgrade after 2020, it is planned to replace these systems with a single detector system, called the Fast Interaction Trigger (FIT). In this presentation we describe the performance parameters of the FIT upgrade, show the proposed characteristics of the T0-Plus, and present the simulations that support the conceptual design of this detector. In particular, we describe the performance simulations of the event plane resolution. This material is based upon work supported by the National Science Foundation under Grants NSF-PHY-0968903 and NSF-PHY-1305280.

  10. Simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Tentner, A.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed is based on human factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scalable to take advantage of emerging massively parallel processor (MPP) systems.

  11. Parallelized event chain algorithm for dense hard sphere and polymer systems

    SciTech Connect

    Kampmann, Tobias A.; Boltz, Horst-Holger; Kierfeld, Jan

    2015-01-15

    We combine parallelization and cluster Monte Carlo for hard sphere systems and present a parallelized event chain algorithm for the hard disk system in two dimensions. For parallelization we use a spatial partitioning approach into simulation cells. We find that it is crucial for correctness to ensure detailed balance on the level of Monte Carlo sweeps by drawing the starting sphere of event chains within each simulation cell with replacement. We analyze the performance gains for the parallelized event chain and find a criterion for an optimal degree of parallelization. Because of the cluster nature of event chain moves, massive parallelization will not be optimal. Finally, we discuss first applications of the event chain algorithm to dense polymer systems, i.e., bundle-forming solutions of attractive semiflexible polymers.
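
    The event chain move itself is easiest to see in one dimension: a displacement budget ℓ is carried by a rod until it hits its neighbor, at which point the remaining budget is transferred ("lifted") to that neighbor. The sketch below is an illustrative 1D serial analogue; the paper's algorithm is the 2D hard-disk version with spatial partitioning.

```python
import random

def event_chain_move(x, sigma, L, ell, rng):
    """One event-chain move for hard rods of length sigma on a ring of
    circumference L. x holds rod positions in cyclic order; a total
    displacement budget ell is carried rightward, transferring to the
    next rod at each contact. Modifies x in place."""
    n = len(x)
    i = rng.randrange(n)
    remaining = ell
    while remaining > 0.0:
        j = (i + 1) % n
        gap = (x[j] - x[i]) % L - sigma   # free space ahead of rod i
        if gap < 0.0:
            gap = 0.0                     # guard against round-off at contact
        step = min(remaining, gap)
        x[i] = (x[i] + step) % L
        remaining -= step
        if remaining > 0.0:               # contact: budget lifts to rod j
            i = j
    return x
```

    Because the rod never moves past its neighbor, cyclic order is preserved and total free space is an invariant of the move, which makes a convenient correctness check.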

  12. Connecting Macroscopic Observables and Microscopic Assembly Events in Amyloid Formation Using Coarse Grained Simulations

    PubMed Central

    Bieler, Noah S.; Knowles, Tuomas P. J.; Frenkel, Daan; Vácha, Robert

    2012-01-01

    The pre-fibrillar stages of amyloid formation have been implicated in cellular toxicity, but have proved to be challenging to study directly in experiments and simulations. Rational strategies to suppress the formation of toxic amyloid oligomers require a better understanding of the mechanisms by which they are generated. We report Dynamical Monte Carlo simulations that allow us to study the early stages of amyloid formation. We use a generic, coarse-grained model of an amyloidogenic peptide that has two internal states: the first one representing the soluble random coil structure and the second one the β-sheet conformation. We find that this system exhibits a propensity towards fibrillar self-assembly following the formation of a critical nucleus. Our calculations establish connections between the early nucleation events and the kinetic information available in the later stages of the aggregation process that are commonly probed in experiments. We analyze the kinetic behaviour in our simulations within the framework of the theory of classical nucleated polymerisation, and are able to connect the structural events at the early stages of amyloid growth with the resulting macroscopic observables such as the effective nucleus size. Furthermore, the free-energy landscapes that emerge from these simulations allow us to identify pertinent properties of the monomeric state that could be targeted to suppress oligomer formation. PMID:23071427
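
    The classical nucleated-polymerisation framework invoked in the kinetic analysis can be summarized by two rate equations: primary nucleation creates fibrils at rate k_n m^{n_c}, and fibrils elongate at both ends at rate 2 k_+ m P, with free monomer m depleted by fibril mass M. A forward-Euler sketch with arbitrary illustrative parameter values (not fitted to the paper's data):

```python
def nucleated_polymerisation(m_total=1.0, k_n=1e-4, k_plus=1.0, n_c=2,
                             dt=0.01, t_end=50.0):
    """Forward-Euler integration of the classical nucleated-
    polymerisation rate equations:
        dP/dt = k_n * m**n_c          (primary nucleation)
        dM/dt = 2 * k_plus * m * P    (elongation at both fibril ends)
    with free monomer m = m_total - M. Parameters are illustrative."""
    P = M = 0.0
    t, out = 0.0, []
    while t < t_end:
        m = m_total - M                 # monomer conservation
        P += dt * k_n * m ** n_c
        M += dt * 2.0 * k_plus * m * P
        t += dt
        out.append((t, M))
    return out
```

    The resulting sigmoidal fibril-mass curve is the macroscopic observable that the paper connects back to microscopic nucleation events such as the effective nucleus size.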

  13. Towards High Performance Discrete-Event Simulations of Smart Electric Grids

    SciTech Connect

    Perumalla, Kalyan S; Nutaro, James J; Yoginath, Srikanth B

    2011-01-01

    Future electric grid technology is envisioned on the notion of a smart grid in which responsive end-user devices play an integral part of the transmission and distribution control systems. Detailed simulation is often the primary choice in analyzing small network designs, and the only choice in analyzing large-scale electric network designs. Here, we identify and articulate the high-performance computing needs underlying high-resolution discrete event simulation of smart electric grid operation in large network scenarios such as the entire Eastern Interconnect. We focus on the simulator's most computationally intensive operation, namely, the dynamic numerical solution for the electric grid state, for both time integration and event detection. We explore solution approaches using general-purpose dense and sparse solvers, and propose a scalable solver specialized for the sparse structures of actual electric networks. Based on experiments with an implementation in the THYME simulator, we identify performance issues and possible solution approaches for smart grid experimentation in the large.
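The coupling of time integration with event detection that the abstract highlights can be sketched generically: integrate forward in time, and when a guard function changes sign within a step, locate the event time by bisection on the interpolated state. This is a standard textbook technique, shown here with forward Euler; it is not THYME's actual solver, and all names are illustrative.

```python
def integrate_with_event(f, guard, x0, t0, t1, dt):
    """Forward-Euler time integration with event detection.
    f(t, x): right-hand side; guard(x): event indicator (event at zero crossing).
    Returns (event_time, event_state), or None if no sign change occurs.
    Sign changes landing exactly on a grid point are not specially handled."""
    t, x = t0, x0
    while t < t1:
        x_new = x + dt * f(t, x)
        if guard(x) * guard(x_new) < 0:        # sign change: event inside (t, t+dt)
            lo, hi, xlo = t, t + dt, x
            for _ in range(50):                # bisection on the sub-step
                mid = 0.5 * (lo + hi)
                xm = x + (mid - t) * f(t, x)   # linear interpolation of the step
                if guard(xlo) * guard(xm) < 0:
                    hi = mid                   # event in (lo, mid)
                else:
                    lo, xlo = mid, xm          # event in (mid, hi)
            return 0.5 * (lo + hi), xm
        t, x = t + dt, x_new
    return None
```

For a production solver one would use a higher-order integrator and dense output for the interpolation, but the detect-then-bisect structure is the same.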

  14. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism

    PubMed Central

    van Beek, Johannes H. G. M.; Supandi, Farahaniza; Gavai, Anand K.; de Graaf, Albert A.; Binsl, Thomas W.; Hettling, Hannes

    2011-01-01

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work; the rest is converted to heat. We calculated heat transfer by conduction and blood flow inside the body, and heat transfer from the skin by radiation, convection and sweat evaporation, resulting in temperature changes in 25 body compartments. We simulated a mountain time trial to Alpe d'Huez during the Tour de France. To approach the time realized by Lance Armstrong in 2004, very high oxygen uptake must be sustained by the simulated cyclist. Temperature was predicted to reach 39°C in the brain, and 39.7°C in leg muscle. In addition to the macroscopic simulation, we analysed the buffering of bursts of high adenosine triphosphate hydrolysis by creatine kinase during cyclical muscle activity at the biochemical pathway level. To investigate the low oxygen to carbohydrate ratio for the brain, which takes up lactate during exercise, we calculated the flux distribution in cerebral energy metabolism. Computational modelling of the human body, describing heat exchange and energy metabolism, makes simulation of endurance sports events feasible. PMID:21969677
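The compartmental heat-balance idea can be illustrated with a drastically reduced two-compartment (core/skin) lumped model: metabolic heat is produced in the core, exchanged with the skin by conduction and blood flow, and lost from the skin to the environment. All parameter values below are rough placeholders chosen only to make the sketch run; they are not the paper's 25-compartment model or its fitted coefficients.

```python
def core_skin_temps(metabolic_heat, dt=1.0, steps=3600,
                    c_core=3.5e5, c_skin=3.5e4,     # heat capacities [J/K], placeholders
                    k_core_skin=100.0,              # core-skin conductance [W/K], placeholder
                    k_skin_env=150.0,               # skin-environment loss [W/K], placeholder
                    t_env=25.0):
    """Illustrative two-compartment heat balance, Euler-integrated.
    metabolic_heat: heat produced in the core [W]. Returns (T_core, T_skin) [degC]."""
    t_core, t_skin = 37.0, 33.0
    for _ in range(steps):
        q_cs = k_core_skin * (t_core - t_skin)   # core -> skin flow [W]
        q_se = k_skin_env * (t_skin - t_env)     # skin -> environment flow [W]
        t_core += dt * (metabolic_heat - q_cs) / c_core
        t_skin += dt * (q_cs - q_se) / c_skin
    return t_core, t_skin
```

With these placeholder values, an hour at high heat production drives the core temperature upward, while low heat production lets it drift down toward the environment, reproducing the qualitative behavior the paper models in far more detail.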

  15. Healthcare system simulation using Witness

    NASA Astrophysics Data System (ADS)

    Khakdaman, Masoud; Zeinahvazi, Milad; Zohoori, Bahareh; Nasiri, Fardokht; Yew Wong, Kuan

    2013-02-01

    Simulation techniques have a proven track record in manufacturing industry as well as in other areas such as healthcare system improvement. In this study, a simulation model of a health center in Malaysia is developed using the WITNESS simulation software, which has shown its flexibility and capability in manufacturing industry. The modelling procedure starts with process mapping and data collection and continues with model development, verification, validation and experimentation. Finally, results and possible future improvements are presented.

  16. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  17. Tutorial in medical decision modeling incorporating waiting lines and queues using discrete event simulation.

    PubMed

    Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter

    2010-01-01

    In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques are used to evaluate systems that include queuing or waiting, for example, discrete event simulation. Including queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers get an understanding of queue characteristics, modeling features, and their strengths. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities and to incorporate waiting lines and queues in the decision-analytic modeling example. PMID:20345550
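As a concrete illustration of the kind of discrete event simulation the tutorial applies, the sketch below simulates an M/M/1 queue (Poisson arrivals, a single exponential server) with an event list ordered by time. It is the generic textbook setting, not the paper's coronary-intervention model, and all names are our own.

```python
import heapq
import random

def mm1_des(lam, mu, n_customers, seed=0):
    """Event-driven simulation of an M/M/1 queue: arrival and departure
    events are processed in time order from a priority queue.
    Returns the mean time customers spend waiting before service."""
    rng = random.Random(seed)
    events = [(rng.expovariate(lam), 0, "arrival")]  # (time, tiebreak, kind)
    waiting = []                 # arrival times of customers queued for service
    busy = False
    arrived = served = 0
    total_wait = 0.0
    while served < n_customers:
        t, _, kind = heapq.heappop(events)
        if kind == "arrival":
            arrived += 1
            if arrived < n_customers:    # schedule the next arrival
                heapq.heappush(events, (t + rng.expovariate(lam), arrived, "arrival"))
            if busy:
                waiting.append(t)        # join the queue
            else:
                busy = True              # start service immediately (zero wait)
                heapq.heappush(events, (t + rng.expovariate(mu), -arrived, "departure"))
        else:                            # departure: free the server
            served += 1
            if waiting:
                a = waiting.pop(0)       # FIFO: serve the longest-waiting customer
                total_wait += t - a
                heapq.heappush(events, (t + rng.expovariate(mu), -served, "departure"))
            else:
                busy = False
    return total_wait / served
```

For an M/M/1 queue the mean wait in queue is W_q = ρ/(μ − λ) with ρ = λ/μ, so the simulated estimate can be checked against the closed formula, exactly the analytical/simulation interplay the tutorial describes.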

  18. A multiprocessor operating system simulator

    SciTech Connect

    Johnston, G.M.; Campbell, R.H. (Dept. of Computer Science)

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the Choices family of operating systems for loosely and tightly coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  19. A Multiprocessor Operating System Simulator

    NASA Technical Reports Server (NTRS)

    Johnston, Gary M.; Campbell, Roy H.

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall semester of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the 'Choices' family of operating systems for loosely- and tightly-coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  20. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand
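One of the extensions mentioned, probabilistic task durations elicited via the Delphi method, can be sketched as a Monte Carlo roll-up over triangular (min, mode, max) estimates for a serial task chain. The function and its parameters below are illustrative assumptions, not NASA's actual tooling or schedule data.

```python
import random

def campaign_timeline(tasks, n_trials=10000, seed=0):
    """Monte Carlo roll-up of a serial task chain whose durations are
    triangular (min, mode, max) estimates, e.g. elicited via Delphi.
    tasks: list of (min, mode, max) tuples, in any consistent time unit.
    Returns (mean, 90th-percentile) total duration."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        # random.triangular takes (low, high, mode)
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks))
    totals.sort()
    mean = sum(totals) / n_trials
    p90 = totals[int(0.9 * n_trials)]
    return mean, p90
```

Reporting a percentile alongside the mean is what makes the roll-up useful for budgeting limited resources: schedules are typically committed against a high percentile, not the average.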

  1. Discrete-event simulation of nuclear-waste transport in geologic sites subject to disruptive events. Final report

    SciTech Connect

    Aggarwal, S.; Ryland, S.; Peck, R.

    1980-06-19

    This report outlines a methodology to study the effects of disruptive events on nuclear waste material in stable geologic sites. The methodology is based upon developing a discrete event model that can be simulated on the computer. This methodology allows a natural development of simulation models that use computer resources in an efficient manner. Accurate modeling in this area depends in large part upon accurate modeling of ion transport behavior in the storage media. Unfortunately, developments in this area are not at a stage where there is any consensus on proper models for such transport. Consequently, our work is directed primarily towards showing how disruptive events can be properly incorporated in such a model, rather than towards providing a predictive tool at this stage. When and if proper geologic parameters can be determined, then it would be possible to use this as a predictive model. Assumptions and their bases are discussed, and the mathematical and computer model are described.
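In such a discrete event model, rare disruptive events over a long geologic horizon are naturally generated as a Poisson process, with exponentially distributed inter-arrival times. The sketch below shows only that sampling step (rate and horizon values are hypothetical); the report's model would couple each sampled event to the ion-transport behaviour of the site.

```python
import random

def disruptive_event_times(rate_per_yr, horizon_yr, seed=0):
    """Sample disruptive-event times as a homogeneous Poisson process
    over the simulation horizon, via exponential inter-arrival times.
    Returns the sorted event times (years) within the horizon."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_per_yr)   # next inter-arrival gap
        if t > horizon_yr:
            return times
        times.append(t)
```

Each returned time would then trigger a state change in the site model (e.g. altered transport parameters), which is exactly the event-scheduling pattern that makes discrete event simulation efficient here: computation happens only at events, not at every year of the horizon.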

  2. Impulsive events in the evolution of a forced nonlinear system

    SciTech Connect

    Longcope, D.W.; Sudan, R.N.

    1992-03-16

    Long-time numerical solutions of a low-dimensional model of the reduced MHD equations show that, when this system is driven quasistatically, the response is punctuated by impulsive events. The statistics of these events indicate a Poisson process; the frequency of these events scales as ΔE_M^(-1), where ΔE_M is the energy released in one event.

  3. An integrated system for hydrological analysis of flood events

    NASA Astrophysics Data System (ADS)

    Katsafados, Petros; Chalkias, Christos; Karymbalis, Efthymios; Gaki-Papanastassiou, Kalliopi; Mavromatidis, Elias; Papadopoulos, Anastasios

    2010-05-01

    The significant increase of extreme flood events during recent decades has led to an urgent social and economic demand for improved prediction and sustainable prevention. Remedial actions require accurate estimation of the spatiotemporal variability of runoff volume and local peaks, which can be analyzed through integrated simulation tools. Such advanced modeling systems not only allow investigation of the dynamics controlling the behavior of these complex processes, they can also be used as early warning systems. Moreover, simulation is assumed to be the appropriate method to derive quantitative estimates of various atmospheric and hydrologic parameters, especially in the absence of reliable and accurate measurements of precipitation and flow rates. Such sophisticated techniques enable flood risk assessment and improve decision-making support on protection actions. This study presents an integrated system for the simulation of the essential atmospheric and soil parameters in the context of hydrological flood modeling. The system consists of two main cores: a numerical weather prediction model coupled with a geographical information system for the accurate simulation of groundwater advection and rainfall runoff estimation. Synoptic and mesoscale atmospheric motions are simulated with a non-hydrostatic limited area model on a very high resolution domain of integration. The model includes advanced schemes for the microphysics and the surface layer physics description as well as the longwave and shortwave radiation budget estimation. It is also fully coupled with a land-surface model in order to resolve the surface heat fluxes and simulate the air-land energy exchange processes. Detailed atmospheric and soil parameters derived from the atmospheric model are used as input data for the GIS-based runoff modeling. Geographical information system (GIS) technology is used for further hydrological analysis and estimation of direct

  4. AP1000 Design Basis Event Simulation at the APEX-1000 Test Facility

    SciTech Connect

    Wright, Richard F.; Groome, John

    2004-07-01

    The AP1000 is a 1000 MWe advanced nuclear power plant that uses passive safety features to enhance plant safety and to provide significant and measurable improvements in plant simplification, reliability, investment protection and plant costs. The AP1000 relies heavily on the 600 MWe AP600 which received design certification in 1999. A critical part of the AP600 design certification process involved the testing of the passive safety systems. A one-fourth height, one-fourth pressure test facility, APEX-600, was constructed at the Oregon State University to study design basis events, and to provide a body of data to be used to validate the computer models used to analyze the AP600. This facility was extensively modified to reflect the design changes for AP1000 including higher power in the electrically heated rods representing the reactor core, and changes in the size of the pressurizer, core makeup tanks and automatic depressurization system. Several design basis events are being simulated at APEX-1000 including a double-ended direct vessel injection (DEDVI) line break and a 2-inch cold leg break. These tests show that the core remains covered with ample margin until gravity injection is established regardless of the initiating event. The tests also show that liquid entrainment from the upper plenum which is proportional to the reactor power does not impact the ability of the passive core cooling system to keep the core covered. (authors)

  5. A Performance Study of Event Processing Systems

    NASA Astrophysics Data System (ADS)

    Mendes, Marcelo R. N.; Bizarro, Pedro; Marques, Paulo

    Event processing engines are used in diverse mission-critical scenarios such as fraud detection, traffic monitoring, or intensive care units. However, these scenarios have very different operational requirements in terms of, e.g., types of events, queries/patterns complexity, throughput, latency and number of sources and sinks. What are the performance bottlenecks? Will performance degrade gracefully with increasing loads? In this paper we make a first attempt to answer these questions by running several micro-benchmarks on three different engines, while we vary query parameters like window size, window expiration type, predicate selectivity, and data values. We also perform some experiments to assess engines scalability with respect to number of queries and propose ways for evaluating their ability in adapting to changes in load conditions. Lastly, we show that similar queries have widely different performances on the same or different engines and that no engine dominates the other two in all scenarios.

  6. Simulation of heavy rainfall events over Indian region: a benchmark skill with a GCM

    NASA Astrophysics Data System (ADS)

    Goswami, Prashant; Kantha Rao, B.

    2015-10-01

    Extreme rainfall events (ERE) contribute a significant component of the Indian summer monsoon rainfall. Thus an important requirement for regional climate simulations is to attain desirable quality and reliability in simulating extreme rainfall events. While global circulation models (GCMs) with coarse resolution are not preferred for simulation of extreme events, it is expected that the global domain in a GCM would allow better representation of scale interactions, resulting in adequate skill in simulating localized events in spite of lower resolution. At the same time, a GCM with skill in simulation of extreme events will provide a more reliable tool for seamless prediction. The present work provides an assessment of a GCM for simulating 40 ERE that occurred over India during 1998-2013. It is found that, expectedly, the GCM forecasts underestimate the observed (TRMM) rainfall in most cases, but not always. Somewhat surprisingly, the forecasts of location are quite accurate in spite of low resolution (~50 km). An interesting result is that the highest skill of the forecasts is realized at 48 h lead rather than at 24 or 96 h lead. Diagnostics of dynamical fields like convergence shows that the forecasts can capture contrasting features on pre-event, event and post-event days. The forecast configuration used is similar to one that has been used for long-range monsoon forecasting and tropical cyclones in earlier studies; the present results on ERE forecasting, therefore, provide an indication of the potential application of the model for seamless prediction.

  7. Stochastic Event Counter for Discrete-Event Systems Under Unreliable Observations

    SciTech Connect

    Tae-Sic Yoo; Humberto E. Garcia

    2008-06-01

    This paper addresses the issues of counting the occurrence of special events in the framework of partially observed discrete-event dynamical systems (DEDS). First, we develop a novel recursive procedure that updates the active counter information state sequentially with available observations. In general, the cardinality of the active counter information state is unbounded, which makes the exact recursion computationally infeasible. To overcome this difficulty, we develop an approximate recursive procedure that regulates and bounds the size of the active counter information state. Using the approximate active counter information state, we give an approximate minimum mean square error (MMSE) counter. The developed algorithms are then applied to count special routing events in a material flow system.

  8. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    NASA Astrophysics Data System (ADS)

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September, 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury where infection may occur, if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed.

  9. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model.

    PubMed

    Sanzari, Jenine K; Diffenderfer, Eric S; Hagan, Sarah; Billings, Paul C; Gridley, Daila S; Seykora, John T; Kennedy, Ann R; Cengel, Keith A

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September, 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury where infection may occur, if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. PMID:26256624

  10. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    PubMed Central

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-01-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September, 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury where infection may occur, if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. PMID:26256624

  11. The influence of spectral nudging in simulating Vb-events with COSMO-CLM

    NASA Astrophysics Data System (ADS)

    Paumann, Manuela; Anders, Ivonne; Hofstätter, Michael; Chimani, Barbara

    2015-04-01

    In previous studies certain European cyclones have been investigated in terms of related extreme precipitation events in Austria. Those systems passing the Mediterranean are of special interest as the atmospheric moisture content is increased. It has been shown in recent investigations that state-of-the-art RCMs can approximately reproduce observed heavy precipitation characteristics. This provides a basic confidence in the model's ability to capture future changes of such events under increased greenhouse gas conditions as well. In this contribution we focus on high spatial and temporal scales and assess the currently achievable accuracy in the simulation of Vb-events. The state-of-the-art regional climate model CCLM is applied in a hindcast mode to the case of individual Vb-events in August 2002 and May/June 2013. Besides the conventional forcing of the regional climate model at its lateral boundaries, a spectral nudging technique is applied. This means that inside the model domain the regional model is forced towards the analysis at large scales, whereas the small scales are unaffected. The simulations for the Vb-events mentioned above, covering the European domain, have been varied systematically by changing the nudging factor, the number of nudged waves, the nudged variables, and other parameters. The resulting precipitation amounts have been compared to the E-OBS gridded European precipitation data set and a recent high spatially resolved precipitation data set for Austria (GPARD-6). Varying the spectral nudging setup in the short-term Vb-cases helps us, on the one hand, to learn something about 3D processes during Vb-events (e.g. vorticity and formation) and, on the other hand, to identify model deficiencies. The results show that increasing the number of nudged waves from 1 to 7, as well as the choice of the variables used in the nudging process, has a large influence on the development of the low pressure system and the related precipitation patterns. On the contrary, the nudging

  12. The influence of spectral nudging in simulating individual Vb-events with COSMO-CLM

    NASA Astrophysics Data System (ADS)

    Paumann, Manuela; Anders, Ivonne; Hofstätter, Michael; Chimani, Barbara

    2014-05-01

    In previous studies certain European cyclones have been investigated in terms of related extreme precipitation events in Austria. Those systems passing the Mediterranean are of special interest as the atmospheric moisture content is increased. It has been shown in recent investigations that state-of-the-art RCMs can approximately reproduce observed heavy precipitation characteristics. This provides a basic confidence in the model's ability to capture future changes of such events under increased greenhouse gas conditions as well. In this contribution we focus on high spatial and temporal scales and assess the currently achievable accuracy in the simulation of Vb-events. The state-of-the-art regional climate model CCLM is applied in a hindcast mode to the case of individual Vb-events in August 2002 and May/June 2013. Besides the conventional forcing of the regional climate model at its lateral boundaries, a spectral nudging technique is applied. This means that inside the model domain the regional model is forced towards the analysis at large scales, whereas the small scales are unaffected. The simulations for the Vb-events mentioned above, covering the European domain, have been varied systematically by changing the nudging factor, the number of nudged waves, the nudged variables, and other parameters. The resulting precipitation amounts have been compared to the E-OBS gridded European precipitation data set and a recent high spatially resolved precipitation data set for Austria (GPARD-6). Varying the spectral nudging setup in the short-term Vb-cases helps us, on the one hand, to learn something about 3D processes during Vb-events (e.g. vorticity and formation) and, on the other hand, to identify model deficiencies. The results show that increasing the number of nudged waves from 1 to 7, as well as the choice of the variables used in the nudging process, has a large influence on the development of the low pressure system and the related precipitation patterns. On the contrary, the nudging

  13. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    PubMed Central

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, only individual sensors are usually located at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, making it desirable to reduce the number of commutations of the control signals from security and economic points of view. Therefore, and in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597
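The core event-based idea, recomputing the control action only when the measurement has moved significantly since the last event, can be sketched with a send-on-delta trigger around a proportional law. This is a generic illustration of the principle, not the paper's controller; all names and values are ours.

```python
def event_based_controller(measurements, setpoint, delta, kp):
    """Send-on-delta event-based proportional control sketch.
    A new control action is computed only when the measurement moves more
    than `delta` away from its value at the last event; between events the
    previous action is held, reducing actuator commutations."""
    actions = []
    last_event_value = None   # measurement at the most recent event
    u = 0.0                   # held control action
    for y in measurements:
        if last_event_value is None or abs(y - last_event_value) > delta:
            last_event_value = y
            u = kp * (setpoint - y)   # recompute only on events
        actions.append(u)
    return actions
```

With a slowly varying greenhouse temperature, most samples fall inside the delta band, so the relay-driven actuators switch far less often than under periodic control while the tracking error stays bounded by the band.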

  14. MCNP6. Simulating Correlated Data in Fission Events

    SciTech Connect

    Rising, Michael Evan; Sood, Avneet

    2015-12-03

    This report is a series of slides discussing the MCNP6 code and its status in simulating fission. Applications of interest include global security and nuclear nonproliferation, detection of special nuclear material (SNM), passive and active interrogation techniques, and coincident neutron and photon leakage.

  15. Repetition-Related Reductions in Neural Activity during Emotional Simulations of Future Events

    PubMed Central

    2015-01-01

    Simulations of future experiences are often emotionally arousing, and the tendency to repeatedly simulate negative future outcomes has been identified as a predictor of the onset of symptoms of anxiety. Nonetheless, next to nothing is known about how the healthy human brain processes repeated simulations of emotional future events. In this study, we present a paradigm that can be used to study repeated simulations of the emotional future in a manner that overcomes phenomenological confounds between positive and negative events. The results show that pulvinar nucleus and orbitofrontal cortex respectively demonstrate selective reductions in neural activity in response to frequently as compared to infrequently repeated simulations of negative and positive future events. Implications for research on repeated simulations of the emotional future in both non-clinical and clinical populations are discussed. PMID:26390294

  16. Modeling and simulation of single-event effect in CMOS circuit

    NASA Astrophysics Data System (ADS)

    Suge, Yue; Xiaolin, Zhang; Yuanfu, Zhao; Lin, Liu; Hanning, Wang

    2015-11-01

    This paper reviews the status of research in modeling and simulation of single-event effects (SEE) in digital devices and integrated circuits. After a brief historical overview of SEE simulation, simulation approaches at different levels are detailed. At the material level, physical simulation covers the two primary mechanisms by which ionizing radiation releases charge in a semiconductor device (direct and indirect ionization). At the device level, the focus is on the main emerging physical phenomena affecting nanometer devices (the bipolar transistor effect and the charge sharing effect) and the methods envisaged for taking them into account. At the circuit level, methods for predicting the single-event response, covering the production and propagation of single-event transients (SETs) in sequential and combinatorial logic, are detailed, and soft error rate trends with scaling are particularly addressed.
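    At circuit level, SET injection is commonly modeled with a double-exponential current pulse; the sketch below uses that standard analytical form (the parameter values in the usage note are illustrative, not taken from the paper), which integrates to the collected charge q:

    ```python
    import math

    def set_current_pulse(t, q, tau_fall, tau_rise):
        """Double-exponential current pulse, a standard analytical form for
        injecting a single-event transient (SET) in circuit simulation.
        Assumes tau_rise < tau_fall; q is the total collected charge, so
        the pulse integrates to q over [0, infinity)."""
        return (q / (tau_fall - tau_rise)) * (
            math.exp(-t / tau_fall) - math.exp(-t / tau_rise))
    ```

    For example, with an (illustrative) collected charge of 100 fC and time constants of 50 ps and 200 ps, numerically integrating the pulse recovers the collected charge.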

  17. A System for Interactive Behaviour Simulation.

    ERIC Educational Resources Information Center

    Bierschenk, Bernhard

    A psycho-ecological model is used as the basis for a simulation of interactive behavior strategies. The basic unit is an event, and each event has been recorded on closed circuit television videotape. The three basic paradigms of behavioral science--association, structure, and process--are used to anchor the simulations. The empirical foundation…

  18. Mesoscale Simulations of a Wind Ramping Event for Wind Energy Prediction

    SciTech Connect

    Rhodes, M; Lundquist, J K

    2011-09-21

    Ramping events, or rapid changes of wind speed and wind direction over a short period of time, present challenges to power grid operators in regions with significant penetrations of wind energy in the power grid portfolio. Improved predictions of wind power availability require adequate predictions of the timing of ramping events. For the ramping event investigated here, the Weather Research and Forecasting (WRF) model was run at three horizontal resolutions in 'mesoscale' mode: 8100m, 2700m, and 900m. Two Planetary Boundary Layer (PBL) schemes, the Yonsei University (YSU) and Mellor-Yamada-Janjic (MYJ) schemes, were run at each resolution as well. Simulations were not 'tuned' with nuanced choices of vertical resolution or tuning parameters so that these simulations may be considered 'out-of-the-box' tests of a numerical weather prediction code. Simulations are compared with sodar observations during a wind ramping event at a 'West Coast North America' wind farm. Despite differences in the boundary-layer schemes, no significant differences were observed in the abilities of the schemes to capture the timing of the ramping event. As collaborators have identified, the boundary conditions of these simulations probably dominate the physics of the simulations. They suggest that future investigations into characterization of ramping events employ ensembles of simulations, and that the ensembles include variations of boundary conditions. Furthermore, the failure of these simulations to capture not only the timing of the ramping event but the shape of the wind profile during the ramping event (regardless of its timing) indicates that the set-up and execution of such simulations for wind power forecasting requires skill and tuning of the simulations for a specific site.

  19. Systems Engineering Simulator (SES) Simulator Planning Guide

    NASA Technical Reports Server (NTRS)

    McFarlane, Michael

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the SES. The Simulator Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  20. Simulation of January 1-7, 1978 events

    NASA Technical Reports Server (NTRS)

    Chao, J. K.; Moldwin, M. B.; Akasofu, S.-I.

    1987-01-01

    The solar wind disturbances of January 1 to 7, 1978 are reconstructed by a modeling method. First, the interplanetary magnetic field (IMF) background pattern, including a corotating shock, is reproduced using the Stanford source surface map. Then, two solar flares, with onset times on January 1 at 0717 UT (S17 deg E10 deg) and 2147 UT (S17 deg E32 deg), respectively, are selected to generate two interplanetary transient shocks. It is shown that these two shocks interacted with the corotating shock, resulting in a series of interplanetary events observed by four spacecraft: Helios 1 and 2, IMP-8 (Interplanetary Monitoring Platform 8), and Voyager 2. Results show that these three shock waves interact and coalesce in interplanetary space such that Helios 2 and Voyager 2 observed only one shock while Helios 1 and IMP-8 observed two shocks. All shocks observed by the four spacecraft, except the corotating shock at Helios 1, are either a transient shock or a shock formed from the coalescence of the transient shocks with the corotating shock. The method is useful in reconstructing a very complicated chain of interplanetary events observed by a number of spacecraft.

  1. Simulation of LHC events on a million threads

    NASA Astrophysics Data System (ADS)

    Childers, J. T.; Uram, T. D.; LeCompte, T. J.; Papka, M. E.; Benjamin, D. P.

    2015-12-01

    Demand for Grid resources is expected to double during LHC Run II as compared to Run I; the capacity of the Grid, however, will not double. The HEP community must consider how to bridge this computing gap by targeting larger compute resources and using the available compute resources as efficiently as possible. Argonne's Mira, the fifth fastest supercomputer in the world, can run roughly five times the number of parallel processes that the ATLAS experiment typically uses on the Grid. We ported Alpgen, a serial x86 code, to run as a parallel application under MPI on the Blue Gene/Q architecture. By analysis of the Alpgen code, we reduced the memory footprint to allow running 64 threads per node, utilizing the four hardware threads available per core on the PowerPC A2 processor. Event generation and unweighting, typically run as independent serial phases, are coupled together in a single job in this scenario, reducing intermediate writes to the filesystem. By these optimizations, we have successfully run LHC proton-proton physics event generation at the scale of a million threads, filling two-thirds of Mira.

  2. Efficient event-driven simulations shed new light on microtubule organization in the plant cortical array

    NASA Astrophysics Data System (ADS)

    Tindemans, Simon H.; Deinum, Eva E.; Lindeboom, Jelmer J.; Mulder, Bela M.

    2014-04-01

    The dynamics of the plant microtubule cytoskeleton is a paradigmatic example of the complex spatiotemporal processes characterising life at the cellular scale. This system is composed of large numbers of spatially extended particles, each endowed with its own intrinsic stochastic dynamics, and is capable of non-equilibrium self-organisation through collisional interactions of these particles. To elucidate the behaviour of such a complex system requires not only conceptual advances, but also the development of appropriate computational tools to simulate it. As the number of parameters involved is large and the behaviour is stochastic, it is essential that these simulations be fast enough to allow for an exploration of the phase space and the gathering of sufficient statistics to accurately pin down the average behaviour as well as the magnitude of fluctuations around it. Here we describe a simulation approach that meets this requirement by adopting an event-driven methodology that encompasses both the spontaneous stochastic changes in microtubule state and the deterministic collisions. In contrast with finite time step simulations this technique is intrinsically exact, as well as several orders of magnitude faster, which enables ordinary PC hardware to simulate systems of ~10^3 microtubules on a time scale ~10^3 times faster than real time. In addition we present new tools for the analysis of microtubule trajectories on curved surfaces. We illustrate the use of these methods by addressing a number of outstanding issues regarding the importance of various parameters on the transition from an isotropic to an aligned and oriented state.
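    The event-driven methodology can be illustrated with a minimal, generic sketch (not the authors' microtubule code): pending events sit in a priority queue ordered by time, and the simulation jumps directly from one event to the next instead of advancing in fixed steps, which is exact for exponentially distributed waiting times:

    ```python
    import heapq
    import random

    def event_driven_run(t_end, rate, seed=0):
        """Minimal event-driven loop: pop the earliest scheduled event,
        process it, and schedule the particle's next stochastic state
        change. No time is wasted on steps where nothing happens, and
        the exponential waiting times are sampled exactly."""
        rng = random.Random(seed)
        queue = [(rng.expovariate(rate), 0)]  # (event_time, particle id)
        n_events = 0
        while queue:
            t, pid = heapq.heappop(queue)
            if t > t_end:
                break  # next event lies beyond the simulated interval
            n_events += 1
            # each processed event schedules this particle's next change
            heapq.heappush(queue, (t + rng.expovariate(rate), pid))
        return n_events
    ```

    With a unit event rate over 100 time units, the loop processes on the order of 100 events; a fixed-time-step scheme would instead need a step much smaller than the mean waiting time to approximate the same statistics.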

  3. Argonne simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Canfield, T.; Brown-VanHoozer, A.; Tentner, A.

    1996-04-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that include human-factors studies to support safety and operational research. Realistic modeling of variations of the posted driving speed is based on human-factor studies that take into consideration weather, road conditions, the driver's personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  4. Analysis of Extreme Events in Regional Climate Model Simulations for the Pacific Northwest using weatherathome

    NASA Astrophysics Data System (ADS)

    Mera, R. J.; Mote, P.; Weber, J.

    2011-12-01

    One of the most prominent impacts of climate change over the Pacific Northwest is the potential for an elevated number of extreme precipitation events over the region. Planning for natural hazards such as the increasing number of floods related to high-precipitation events has, in general, focused on avoiding development in floodplains and conditioning development to withstand inundation with a minimum of losses. Nationwide, the Federal Emergency Management Agency (FEMA) estimates that about one quarter of its payments cover damage that has occurred outside mapped floodplains. It is clear that traditional flood-based planning will not be sufficient to predict and avoid future losses resulting from climate-related hazards such as high-precipitation events. In order to address this problem, the present study employs regional climate model output for future climate change scenarios to aid with the development of a map-based inventory of future hazard risks that can contribute to the development of a "planning-scale" decision support system for the Oregon Department of Land Conservation and Development (DLCD). Climate model output is derived from the climateprediction.net (CPDN) weatherathome project, an innovative climate science experiment that utilizes volunteer computers from users worldwide to produce superensembles of hundreds of thousands of regional climate simulations of the Western United States climate from 1950 to 2050. The spatial and temporal distribution of extreme weather events is analyzed for the Pacific Northwest to diagnose the model's capabilities as an input for map products such as impacts on hydrology. Special attention is given to the intensity and frequency of Atmospheric River events in historical and future climate contexts.

  5. Constructive episodic simulation: temporal distance and detail of past and future events modulate hippocampal engagement.

    PubMed

    Addis, Donna Rose; Schacter, Daniel L

    2008-01-01

    Behavioral, lesion and neuroimaging evidence show striking commonalities between remembering past events and imagining future events. In a recent event-related fMRI study, we instructed participants to construct a past or future event in response to a cue. Once an event was in mind, participants made a button press, then generated details (elaboration) and rated them. The elaboration of past and future events recruited a common neural network. However, regions within this network may respond differentially to event characteristics, such as the amount of detail generated and temporal distance, depending on whether the event is in the past or future. To investigate this further, we conducted parametric modulation analyses, with temporal distance and detail as covariates, and focused on the medial temporal lobes and frontopolar cortex. The analysis of detail (independent of temporal distance) showed that the left posterior hippocampus was responsive to the amount of detail comprising both past and future events. In contrast, the left anterior hippocampus responded differentially to the amount of detail comprising future events, possibly reflecting the recombination of details into a novel future event. The analysis of temporal distance revealed that the increasing recency of past events correlated with activity in the right parahippocampal gyrus (Brodmann area (BA) 35/36), while activity in the bilateral hippocampus was significantly correlated with the increasing remoteness of future events. We propose that the hippocampal response to the distance of future events reflects the increasing disparateness of details likely included in remote future events, and the intensive relational processing required for integrating such details into a coherent episodic simulation of the future. These findings provide further support for the constructive episodic simulation hypothesis (Schacter and Addis (2007) Philos Trans R Soc Lond B Biol Sci 362:773-786) and highlight the

  6. How well do CORDEX models simulate extreme rainfall events over the East Coast of South Africa?

    NASA Astrophysics Data System (ADS)

    Abba Omar, Sabina; Abiodun, Babatunde J.

    2016-01-01

    This study assesses the capability of regional climate models (RCMs) in simulating the characteristics of widespread extreme rainfall events over the East Coast of South Africa. Simulations of nine RCMs from the Coordinated Regional Downscaling Experiment (CORDEX) were analyzed for the study. All the simulations cover 12 years (1996-2008). Using the 95th percentile of daily rainfall as the threshold of extreme events and the simultaneous occurrence of extreme events over 50 % of the East Coast as widespread extreme events (WERE), we compared the characteristics of simulated WERE with observations (GPCP and TRMM) and with the reanalysis (ERAINT) that forced the simulations. Most RCMs perform well in simulating the seasonal variation of WEREs over the East Coast but perform poorly in simulating the interannual variability. Based on their rainfall synoptic patterns over Southern Africa, the WEREs in the East Coast can be generally classified into four groups. The first group connects the WEREs with tropical rainfall activities over the subcontinent. The second group links WEREs with frontal rainfall south of the subcontinent. The third group links the WEREs with both tropical and temperate rainfall activities while the fourth group represents isolated WEREs. The RCMs show different capabilities in simulating the frequency of WERE in each group, some perform better than ERAINT while some perform worse. Results of this study could provide information on the usability of RCMs in downscaling the impact of climate change on widespread extreme rainfall events over South Africa.

  7. Statistics of Record-Breaking Events in the Self-Organized Critical Systems

    NASA Astrophysics Data System (ADS)

    Shcherbakov, R.; Newman, W. I.; Turcotte, D. L.; Davidsen, J.; Tiampo, K.; Rundle, J. B.

    2010-12-01

    Record-breaking events generated by the dynamics of driven nonlinear threshold systems are extracted and analyzed. They are compared to the record-breaking events extracted from sequences of independent identically distributed (i.i.d.) random variables drawn from the Weibull distribution. Several statistical measures of record-breaking events are derived analytically and confirmed through numerical simulations for Weibull and power-law distributed random variables. Driven nonlinear threshold systems usually exhibit avalanche-type behavior, where slow buildup of energy is punctuated by an abrupt release of energy through avalanche events which usually follow scale-invariant statistics. From the simulations of these systems it is possible to extract a sequence of record-breaking avalanches, where each subsequent record-breaking event is larger in magnitude than the previous one and all events in between are smaller than the current record-breaking event. In the present work, several cellular automata, among them the sandpile model, the Manna model, the Olami-Feder-Christensen (OFC) model, and the forest-fire model, are analyzed to investigate the record-breaking statistics of model avalanches exhibiting temporal and spatial correlations. It is found that the statistics of record-breaking events for these cellular automata exhibit behavior different from that observed for i.i.d. random variables, which signifies their complex spatio-temporal dynamics. The most pronounced deviations are observed in the case of the OFC model, with a strong dependence on the conservation parameter of the model.
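    The record-extraction rule described in the abstract (each record is larger in magnitude than every preceding avalanche) can be sketched in a few lines; this is a generic illustration, not the authors' analysis code:

    ```python
    def record_breaking_events(magnitudes):
        """Return the record-breaking subsequence of an avalanche
        catalogue as (index, magnitude) pairs: an event is a record
        if it is strictly larger than every earlier event."""
        records = []
        current_max = float("-inf")
        for i, m in enumerate(magnitudes):
            if m > current_max:
                current_max = m
                records.append((i, m))
        return records
    ```

    Applied to a synthetic catalogue such as [3, 1, 4, 4, 7, 2, 9], only the strictly increasing running maxima are kept; the statistics of how often and by how much such records occur are then compared against the i.i.d. baseline.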

  8. Using Wavelet Analysis To Assist in Identification of Significant Events in Molecular Dynamics Simulations.

    PubMed

    Heidari, Zahra; Roe, Daniel R; Galindo-Murillo, Rodrigo; Ghasemi, Jahan B; Cheatham, Thomas E

    2016-07-25

    Long time scale molecular dynamics (MD) simulations of biological systems are becoming increasingly commonplace due to the availability of both large-scale computational resources and significant advances in the underlying simulation methodologies. Therefore, it is useful to investigate and develop data mining and analysis techniques to quickly and efficiently extract the biologically relevant information from the incredible amount of generated data. Wavelet analysis (WA) is a technique that can quickly reveal significant motions during an MD simulation. Here, the application of WA on well-converged long time scale (tens of μs) simulations of a DNA helix is described. We show how WA combined with a simple clustering method can be used to identify both the physical and temporal locations of events with significant motion in MD trajectories. We also show that WA can not only distinguish and quantify the locations and time scales of significant motions, but by changing the maximum time scale of WA a more complete characterization of these motions can be obtained. This allows motions of different time scales to be identified or ignored as desired. PMID:27286268

  9. Simulations of Diffusion in Solar Energetic Particle Events

    NASA Astrophysics Data System (ADS)

    Pei, C.; Jokipii, J.; Giacalone, J.

    2007-12-01

    New observations by high-sensitivity instruments onboard the ACE spacecraft show that Fe and O may share similar injection profiles close to the solar surface, and that diffusion dominates the transport of these particles (Mason et al. 2006). Multi-spacecraft observations by Helios and IMP-8 also confirm that spatial diffusion is important (Wibberenz & Cane 2006). The "reservoir" phenomenon or "spatial invariance" states that during the decay phase of individual gradual solar energetic particle events, the intensities measured by different spacecraft are nearly equal, even if these spacecraft are separated by several AU in radius and by 70 degrees in latitude. Results from our multidimensional numerical model, based on Parker's transport equation, with reasonable values of κ⊥ and κ∥, are compared with observations from Ulysses, IMP-8, and ACE. We demonstrate that most of the features of the "reservoir" phenomenon can be reproduced by a transport model which includes drift, energy loss, and spatial diffusion.

  10. An event generator for simulations of complex β-decay experiments

    NASA Astrophysics Data System (ADS)

    Jordan, D.; Algora, A.; Tain, J. L.

    2016-08-01

    This article describes a Monte Carlo event generator for the design, optimization and performance characterization of beta decay spectroscopy experimental set-ups. The event generator has been developed within the Geant4 simulation architecture and provides new features and greater flexibility in comparison with the currently available decay generator.

  11. Real time digital propulsion system simulation for manned flight simulators

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.; Hart, C. E.

    1978-01-01

    A real time digital simulation of a STOL propulsion system was developed which generates significant dynamics and internal variables needed to evaluate system performance and aircraft interactions using manned flight simulators. The simulation ran at a real-to-execution time ratio of 8.8. The model was used in a piloted NASA flight simulator program to evaluate the simulation technique and the propulsion system digital control. The simulation is described and results shown. Limited results of the flight simulation program are also presented.

  12. Calculation of 239Pu fission observables in an event-by-event simulation

    SciTech Connect

    Vogt, R; Randrup, J; Pruet, J; Younes, W

    2010-03-31

    The increased interest in more exclusive fission observables has demanded more detailed models. We describe a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including any interesting correlations. The various model assumptions are described and the potential utility of the model is illustrated. As a concrete example, we use formal statistical methods, experimental data on neutron production in neutron-induced fission of 239Pu, along with FREYA, to develop quantitative insights into the relation between reaction observables and detailed microscopic aspects of fission. Current measurements of the mean number of prompt neutrons emitted in fission taken together with less accurate current measurements for the prompt post-fission neutron energy spectrum, up to the threshold for multi-chance fission, place remarkably fine constraints on microscopic theories.

  13. Systems simulations supporting NASA telerobotics

    NASA Technical Reports Server (NTRS)

    Harrison, F. W., Jr.; Pennington, J. E.

    1987-01-01

    Two simulation and analysis environments have been developed to support telerobotics research at the Langley Research Center. One is a high-fidelity, nonreal-time, interactive model called ROBSIM, which combines user-generated models of workspace environment, robots, and loads into a working system and simulates the interaction among the system components. Models include user-specified actuator, sensor, and control parameters, as well as kinematic and dynamic characteristics. Kinematic, dynamic, and response analyses can be selected, with system configuration, task trajectories, and arm states displayed using computer graphics. The second environment is a real-time, manned Telerobotic Systems Simulation (TRSS) which uses the facilities of the Intelligent Systems Research Laboratory (ISRL). It utilizes a hierarchical structure of functionally distributed computers communicating over both parallel and high-speed serial data paths to enable studies of advanced telerobotic systems. Multiple processes perform motion planning, operator communications, forward and inverse kinematics, control/sensor fusion, and I/O processing while communicating via common memory. Both ROBSIM and TRSS, including their capability, status, and future plans are discussed. Also described is the architecture of ISRL and recent telerobotic system studies in ISRL.

  14. The ISOPHOT Mapping Simulation System

    NASA Astrophysics Data System (ADS)

    Gabriel, C.; Hur, M.

    2002-12-01

    From version 9.0 onwards, the ISOPHOT Interactive Analysis (PIA) package offers its users an integrated mapping simulation system, capable of generating sky images including several point/extended sources on a flat/gradient background, simulating what ISOPHOT would have recorded under given instrument and spacecraft raster configurations. While the benefits of performing simulations for assessing the efficiency, accuracy, confusion level, etc., of different mapping algorithms and deconvolution techniques in and outside PIA are mostly of interest to calibrators and instrument specialists, the system is also very important for a general observer, because this highly user-friendly system provides the possibility of simulating his/her observation by matching the selected observing mode.

  15. Assessment of extreme precipitation events over Amazon simulated by global climate models from HIGEM family

    NASA Astrophysics Data System (ADS)

    Custodio, M. D. S.; Ambrizzi, T.; Da Rocha, R.

    2015-12-01

    The increased horizontal resolution of climate models aims to improve simulation accuracy and the understanding of the non-linear processes during interactions between different spatial scales within the climate system. Until now, these interactions have not been well represented in low-horizontal-resolution GCMs. Variations of extreme climatic events have been described and analyzed in the scientific literature. In a scenario of global warming it is necessary to understand and explain extreme events and to know whether global models can represent these events. The purpose of this study was to understand the impact of horizontal resolution in the high-resolution coupled and atmospheric global models of the HiGEM project in simulating atmospheric patterns and processes of interaction between spatial scales, and moreover to evaluate the performance of coupled and uncoupled versions of the High-Resolution Global Environmental Model in capturing the signal of interannual and intraseasonal variability of precipitation over the Amazon region. The results indicated that the grid refinement and ocean-atmosphere coupling contribute to a better representation of seasonal patterns of both precipitation and temperature over the Amazon region. Moreover, the climate models analyzed represent the climatic characteristics of this region better than other models (regional and global), which indicates a breakthrough in the development of high-resolution climate models. Both coupled and uncoupled models capture the observed signal of the ENSO and MJO oscillations, although with reversed phase in some cases. The interannual variability analysis showed that coupled simulations intensify the impact of ENSO in the Amazon. On the intraseasonal scale, although the simulations intensify this signal, the coupled models present larger similarities with observations than the atmospheric models for precipitation extremes. The simulation of ENSO in GCMs can be attributed to their high

  16. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported that astronauts "were awakened again", as they had been the day before. Fearing that something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Major projects everywhere present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  17. StratBAM: A Discrete-Event Simulation Model to Support Strategic Hospital Bed Capacity Decisions.

    PubMed

    Devapriya, Priyantha; Strömblad, Christopher T B; Bailey, Matthew D; Frazier, Seth; Bulger, John; Kemberling, Sharon T; Wood, Kenneth E

    2015-10-01

    The ability to accurately measure and assess current and potential health care system capacities is an issue of local and national significance. Recent joint statements by the Institute of Medicine and the Agency for Healthcare Research and Quality have emphasized the need to apply industrial and systems engineering principles to improving health care quality and patient safety outcomes. To address this need, a decision support tool was developed for planning and budgeting of current and future bed capacity, and evaluating potential process improvement efforts. The Strategic Bed Analysis Model (StratBAM) is a discrete-event simulation model created after a thorough analysis of patient flow and data from Geisinger Health System's (GHS) electronic health records. Key inputs include: timing, quantity and category of patient arrivals and discharges; unit-level length of care; patient paths; and projected patient volume and length of stay. Key outputs include: admission wait time by arrival source and receiving unit, and occupancy rates. Electronic health records were used to estimate parameters for probability distributions and to build empirical distributions for unit-level length of care and for patient paths. Validation of the simulation model against GHS operational data confirmed its ability to model real-world data consistently and accurately. StratBAM was successfully used to evaluate the system impact of forecasted patient volumes and length of stay in terms of patient wait times, occupancy rates, and cost. The model is generalizable and can be appropriately scaled for larger and smaller health care settings. PMID:26310949
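
    The core of such a bed-capacity model can be sketched with a toy event loop; all parameter values and the FIFO admission rule here are illustrative assumptions, not StratBAM's fitted inputs:

    ```python
    import heapq
    import random

    def simulate_beds(n_beds=20, arrival_rate=2.0, mean_los=8.0,
                      horizon=1000.0, seed=1):
        """Minimal discrete-event simulation of hospital bed occupancy.

        Patients arrive as a Poisson process (per hour), wait for a free
        bed, then occupy it for an exponential length of stay.  All
        parameter values are illustrative, not from StratBAM.
        """
        rng = random.Random(seed)
        free = n_beds
        queue = []                      # arrival times of waiting patients
        events = [(rng.expovariate(arrival_rate), "arrival")]
        waits, busy_time, last_t = [], 0.0, 0.0
        while events:
            t, kind = heapq.heappop(events)
            if t > horizon:
                break
            busy_time += (n_beds - free) * (t - last_t)
            last_t = t
            if kind == "arrival":
                heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
                queue.append(t)
            else:                       # a discharge frees one bed
                free += 1
            while free and queue:       # admit waiting patients, FIFO
                free -= 1
                waits.append(t - queue.pop(0))
                heapq.heappush(events, (t + rng.expovariate(1.0 / mean_los), "discharge"))
        mean_wait = sum(waits) / len(waits) if waits else 0.0
        occupancy = busy_time / (horizon * n_beds)
        return mean_wait, occupancy
    ```

    Outputs such as admission wait time and occupancy rate fall out of the same bookkeeping the abstract describes; a production model would replace the exponential assumptions with the empirical unit-level distributions.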

  18. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.

  19. Development of a robust and automated infrasound event catalogue using the International Monitoring System

    NASA Astrophysics Data System (ADS)

    Arrowsmith, Stephen; Euler, Garrett; Marcillo, Omar; Blom, Philip; Whitaker, Rod; Randall, George

    2015-03-01

    Methods for detecting, associating and locating infrasound events recorded on the global International Monitoring System (IMS) infrasound network are presented. By using likelihood arguments, and reducing the use of empirically determined parameters, our techniques enable us to formally quantify the false alarm rate at both station and network levels, and to calculate confidence areas for event localization. We outline a new association technique that uses graph theory to associate arrivals at multiple spatially separated stations, and perform Monte Carlo simulations to quantify the performance of the scheme under different scenarios. The detection, association and location techniques are applied to 10 large events in the Reviewed Event Bulletin of the Comprehensive Nuclear-Test-Ban Treaty Organization. Of the 10 events, seven were automatically detected and associated. By analysing the three missed events, we identify possible improvements to the algorithms.
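
    The graph-based association step can be illustrated with a deliberately simplified sketch: detections become graph nodes, edges link arrivals whose time difference is physically plausible, and connected components spanning at least two stations become candidate events. The station names, times, and the single `max_lag` criterion are illustrative assumptions, not the authors' likelihood-based scheme:

    ```python
    from collections import deque

    def associate(detections, max_lag=3600.0):
        """Group single-station detections into candidate events.

        `detections` is a list of (station, arrival_time) pairs.  Two
        detections are linked when they come from different stations and
        their arrival-time difference is within `max_lag` seconds, a
        stand-in for the maximum inter-station travel time.
        """
        n = len(detections)
        adj = [[] for _ in range(n)]
        for i in range(n):
            for j in range(i + 1, n):
                si, ti = detections[i]
                sj, tj = detections[j]
                if si != sj and abs(ti - tj) <= max_lag:
                    adj[i].append(j)
                    adj[j].append(i)
        seen, events = set(), []
        for i in range(n):
            if i in seen:
                continue
            comp, q = [], deque([i])    # breadth-first component search
            seen.add(i)
            while q:
                u = q.popleft()
                comp.append(u)
                for v in adj[u]:
                    if v not in seen:
                        seen.add(v)
                        q.append(v)
            stations = {detections[k][0] for k in comp}
            if len(stations) >= 2:      # an event needs >= 2 stations
                events.append(sorted(detections[k] for k in comp))
        return events
    ```

    Real travel-time consistency would depend on station geometry and celerity ranges rather than a single lag bound.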

  20. Experience producing simulated events for the DZero experiment on the SAM-Grid

    SciTech Connect

    Garzoglio, G.; Terekhov, I.; Snow, J.; Jain, S.; Nishandar, A.; /Texas U., Arlington

    2004-12-01

    Most of the simulated events for the DZero experiment at Fermilab have historically been produced by the "remote" collaborating institutions. One of the principal challenges reported concerns the maintenance of the local software infrastructure, which generally differs from site to site. As the distributed computing community's understanding of distributively owned and shared resources progresses, the adoption of grid technologies to produce Monte Carlo events for high energy physics experiments becomes increasingly attractive. SAM-Grid is a software system developed at Fermilab that integrates standard grid technologies for job and information management with SAM, the data handling system of the DZero and CDF experiments. During the past few months, this grid system has been tailored for the Monte Carlo production of DZero. Since the initial phase of deployment, this experience has exposed an interesting series of requirements on the SAM-Grid services, the standard middleware, the resources and their management, and the analysis framework of the experiment. As of today, the inefficiency due to the grid infrastructure has been reduced to as little as 1%. In this paper, we present our statistics and the "lessons learned" in running large high energy physics applications on a grid infrastructure.

  1. System for detection of hazardous events

    DOEpatents

    Kulesz, James J.; Worley, Brian A.

    2006-05-23

    A system for detecting the occurrence of anomalies, includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.
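
    The neighbour-query behaviour described in the patent can be sketched as a simple polling pass over the node graph; the node names and the `responsive` map below are purely illustrative:

    ```python
    def assess_network(links, responsive):
        """Each live node polls its adjacent nodes and reports any
        neighbour that fails to answer the query.

        `links` maps a node to its adjacent nodes; `responsive` records
        which nodes currently answer.  A dead node cannot query, so its
        neighbours must be reported by other nodes.
        """
        alerts = []
        for node, neighbours in links.items():
            if not responsive.get(node, False):
                continue                      # a dead node cannot query
            for n in neighbours:
                if not responsive.get(n, False):
                    alerts.append((node, n))  # node reports neighbour n down
        return sorted(alerts)
    ```

    A silent link (rather than a dead node) would show up the same way, which is why the patent has controllers assess both the adjacent nodes and the communication links.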

  2. System For Detection Of Hazardous Events

    DOEpatents

    Kulesz, James J. [Oak Ridge, TN]; Worley, Brian A. [Knoxville, TN]

    2005-08-16

    A system for detecting the occurrence of anomalies, includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  3. Probabilities for large events in driven threshold systems

    NASA Astrophysics Data System (ADS)

    Rundle, John B.; Holliday, James R.; Graves, William R.; Turcotte, Donald L.; Tiampo, Kristy F.; Klein, William

    2012-08-01

    Many driven threshold systems display a spectrum of avalanche event sizes, often characterized by power-law scaling. An important problem is to compute probabilities of the largest events (“Black Swans”). We develop a data-driven approach to the problem by transforming to the event index frame, and relating this to Shannon information. For earthquakes, we find the 12-month probability for magnitude m>6 earthquakes in California increases from about 30% after the last event, to 40%-50% prior to the next one.
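
    A minimal version of working in the event index frame is to use counts of small events between successive large ones as the clock, and to estimate the conditional probability of the next large event empirically. This sketch illustrates the idea only and is not the paper's estimator:

    ```python
    def large_event_prob(counts, elapsed, window):
        """Conditional probability that the next large event arrives
        within the next `window` small events, given that `elapsed`
        small events have occurred since the last large one.

        `counts` holds the small-event counts observed between past
        consecutive large events (the "event index frame").
        """
        # intervals that survived at least `elapsed` small events
        survivors = [c for c in counts if c > elapsed]
        if not survivors:
            return 1.0
        hits = sum(1 for c in survivors if c <= elapsed + window)
        return hits / len(survivors)
    ```

    Converting `window` from small-event counts to calendar months then only requires the mean small-event rate, which is how a "12-month probability" can rise as small events accumulate after the last large one.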

  4. Aided targeting system simulation evaluation

    NASA Technical Reports Server (NTRS)

    Demaio, Joe; Becker, Curtis

    1994-01-01

    Simulation research was conducted at the Crew Station Research and Development Facility on the effectiveness and ease of use of three targeting systems. A manual system required the aviator to scan a target array area with a simulated second generation forward looking infrared (FLIR) sensor, locate and categorize targets, and construct a target hand-off list. The interface between the aviator and the system was like that of an advanced scout helicopter (manual mode). Two aided systems detected and categorized targets automatically. One system used only the FLIR sensor and the second used FLIR fused with Longbow radar. The interface for both was like that of an advanced scout helicopter aided mode. Exposure time while performing the task was reduced substantially with the aided systems, with no loss of target hand-off list accuracy. The fused sensor system showed lower time to construct the target hand-off list and a slightly lower false alarm rate than the other systems. A number of issues regarding system sensitivity and criterion, and operator interface design are discussed.

  5. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    NASA Astrophysics Data System (ADS)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in the event-based systems can lower system efficiency both in terms of data exchange rate and performance. In this paper, an integral-based event triggering control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise and effectively reduces the flow of communication between plant and controller, and also improves output performance. Using a Lyapunov approach, stability in the mean square sense is proved. A simulated example illustrates the properties of our approach.
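
    The mechanism can be illustrated with a toy scalar example: the sensor re-transmits its (noisy) measurement to the controller only when the running integral of the squared measurement error crosses a threshold, which filters out noise-induced triggering. All gains and thresholds below are illustrative assumptions, not the paper's design:

    ```python
    import random

    def simulate_integral_trigger(a=-1.0, b=1.0, k=2.0, dt=1e-3,
                                  steps=20000, threshold=1e-3,
                                  noise_std=0.05, seed=0):
        """Euler simulation of a scalar plant x' = a*x + b*u whose
        measurement is re-sent to the controller only when the running
        integral of the squared measurement error exceeds `threshold`.
        """
        rng = random.Random(seed)
        x, x_sent, integral, sends = 1.0, 1.0, 0.0, 0
        for _ in range(steps):
            y = x + rng.gauss(0.0, noise_std)    # noisy measurement
            e = y - x_sent                        # error vs last transmission
            integral += e * e * dt
            if integral > threshold:              # event: transmit and reset
                x_sent, integral, sends = y, 0.0, sends + 1
            u = -k * x_sent                       # controller uses sent value
            x += (a * x + b * u) * dt
        return abs(x), sends / steps
    ```

    A plain static trigger on |e| would fire on every large noise sample; integrating the error first is what makes the communication rate robust to measurement noise.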

  6. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  7. An intelligent simulation training system

    NASA Technical Reports Server (NTRS)

    Biegel, John E.

    1990-01-01

    The Department of Industrial Engineering at the University of Central Florida, Embry-Riddle Aeronautical University and General Electric (SCSD) have been funded by the State of Florida to build an Intelligent Simulation Training System. The objective was and is to make the system generic except for the domain expertise. Researchers accomplished this objective in their prototype. The system is modularized and therefore it is easy to make any corrections, expansions or adaptations. The funding by the state of Florida has exceeded $3 million over the past three years and through the 1990 fiscal year. UCF has expended in excess of 15 work years on the project. The project effort has been broken into three major tasks. General Electric provides the simulation. Embry-Riddle Aeronautical University provides the domain expertise. The University of Central Florida has constructed the generic part of the system which is comprised of several modules that perform the tutoring, evaluation, communication, status, etc. The generic parts of the Intelligent Simulation Training Systems (ISTS) are described.

  8. Cascading events in linked ecological and socioeconomic systems

    USGS Publications Warehouse

    Peters, D.P.C.; Sala, O.E.; Allen, C.D.; Covich, A.; Brunson, M.

    2007-01-01

    Cascading events that start at small spatial scales and propagate non-linearly through time to influence larger areas often have major impacts on ecosystem goods and services. Events such as wildfires and hurricanes are increasing in frequency and magnitude as systems become more connected through globalization processes. We need to improve our understanding of these events in order to predict their occurrence, minimize potential impacts, and allow for strategic recovery. Here, we synthesize information about cascading events in systems located throughout the Americas. We discuss a variety of examples of cascading events that share a common feature: they are often driven by linked ecological and human processes across scales. In this era of globalization, we recommend studies that explicitly examine connections across scales and examine the role of connectivity among non-contiguous as well as contiguous areas. ?? The Ecological Society of America.

  9. Active magnetic bearing-supported rotor with misaligned cageless backup bearings: A dropdown event simulation model

    NASA Astrophysics Data System (ADS)

    Halminen, Oskari; Kärkkäinen, Antti; Sopanen, Jussi; Mikkola, Aki

    2015-01-01

    Active magnetic bearings (AMB) offer considerable benefits compared to regular mechanical bearings. On the other hand, they require backup bearings to avoid damage resulting from a failure in the component itself, or in the power or control system. During a rotor-bearing contact event - when the magnetic field has disappeared and the rotor drops on the backup bearings - the structure of the backup bearings has an impact on the dynamic actions of the rotor. In this paper, the dynamics of an active magnetic bearing-supported rotor during contact with backup bearings is studied with a simulation model. Modeling of the backup bearings is done using a comprehensive cageless ball bearing model. The elasticity of the rotor is described using the finite element method (FEM) and the degrees of freedom (DOF) of the system are reduced using component mode synthesis. Verification of the misaligned cageless backup bearings model is done by comparing the simulation results against the measurement results. The verified model with misaligned cageless backup bearings is found to correspond to the features of a real system.

  10. Simulator verification techniques study. Integrated simulator self test system concepts

    NASA Technical Reports Server (NTRS)

    Montoya, G.; Wenglinski, T. H.

    1974-01-01

    Software and hardware requirements for implementing hardware self tests are presented in support of the development of training and procedures development simulators for the space shuttle program. Self test techniques for simulation hardware and the validation of simulation performance are stipulated. The requirements of an integrated simulator self test system are analyzed. Readiness tests, fault isolation tests, and incipient fault detection tests are covered.

  11. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    NASA Technical Reports Server (NTRS)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

    On February 1st, 2010 U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes for the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty regarding the Constellation program presented a major challenge to the DES team: continue the development of this program-of-record simulation while remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make it flexible enough to support possible future vehicles while at the same time being specific enough to support the program-of-record. The challenge was compounded by the fact that the model was being developed through the traditional DES process orientation, which lacks the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure by identifying what was generic, finding natural logic break points, and standardizing the interlogic numbering system. The outcome of this work was a model that not only was ready to be easily modified to support any future rocket programs, but also was extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides to builders of traditional process-oriented discrete event simulation models.

  12. Solar system events at high spatial resolution

    SciTech Connect

    Baines, K H; Gavel, D T; Getz, A M; Gibbartd, S G; MacIntosh, B; Max, C E; McKay, C P; Young, E F; de Pater, I

    1999-02-19

    Until relatively recent advances in technology, astronomical observations from the ground were limited in image resolution by the blurring effects of earth's atmosphere. The blur extent, ranging typically from 0.5 to 2 seconds of arc at the best astronomical sites, precluded ground-based observations of the details of the solar system's moons, asteroids, and outermost planets. With the maturing of a high resolution image processing technique called speckle imaging, the resolution limitation of the atmosphere can now be largely overcome. Over the past three years the authors have used speckle imaging to observe Titan, a moon of Saturn with an atmospheric density comparable to Earth's; Io, the volcanically active innermost moon of Jupiter; and Neptune, a gas giant outer planet with continually changing planet-encircling storms. These observations were made at the world's largest telescope, the Keck telescope in Hawaii, and represent the highest resolution infrared images of these objects ever taken.

  13. An analysis of strong wind events simulated in a GCM near Casey in the Antarctic

    SciTech Connect

    Murphy, B.F.; Simmonds, I.

    1993-02-01

    Strong wind events occurring near Casey (Antarctica) in a long July GCM simulation have been studied to determine the relative roles played by the synoptic situation and the katabatic flow in producing these episodes. It has been found that the events are associated with strong katabatic and strong gradient flow operating together. Both components are found to increase threefold on average for these strong winds, and although the geostrophic flow is the stronger, it rarely produces strong winds without katabatic flow becoming stronger than it is in the mean. The two wind components do not flow in the same direction; indeed, there is some cancellation between them, since katabatic flow acts in a predominant downslope direction, while the geostrophic wind acts across slope. The stronger geostrophic flow is associated with higher-than-average pressures over the continent and the approach of a strong cyclonic system toward the coast and a blocking system downstream. The anomalous synoptic patterns leading up to the occasions display a strong wavenumber 4 structure. The very strong katabatic flow appears to be related to the production of a supply of cold air inland from Casey by the stronger-than-average surface temperature inversions inland a few days before the strong winds occur. The acceleration of this negatively buoyant air mass down the steep, ice-sheet escarpment results in strong katabatic flow near the coast. 24 refs., 11 figs.

  14. The Impact of Land Cover Change on a Simulated Storm Event in the Sydney Basin

    NASA Astrophysics Data System (ADS)

    Gero, A. F.; Pitman, A. J.

    2006-02-01

    The Regional Atmospheric Modeling System (RAMS) was run at a 1-km grid spacing over the Sydney basin in Australia to assess the impact of land cover change on a simulated storm event. The simulated storm used NCEP NCAR reanalysis data, first with natural (i.e., pre-European settlement in 1788) land cover and then with satellite-derived land cover representing Sydney's current land use pattern. An intense convective storm develops in the model in close proximity to Sydney's dense urban central business district under current land cover. The storm is absent under natural land cover conditions. A detailed investigation of why the change in land cover generates a storm was performed using factorial analysis, which revealed the storm to be sensitive to the presence of agricultural land in the southwest of the domain. This area interacts with the sea breeze and affects the horizontal divergence and moisture convergence—the triggering mechanisms of the storm. The existence of the storm over the dense urban area of Sydney is therefore coincidental. The results herein support efforts to develop parameterization of urban surfaces in high-resolution simulations of Sydney's meteorological environment but also highlight the need to improve the parameterization of other types of land cover change at the periphery of the urban area, given that these types dominate the explanation of the results.

  15. Performance and efficiency of geotextile-supported erosion control measures during simulated rainfall events

    NASA Astrophysics Data System (ADS)

    Obriejetan, Michael; Rauch, Hans Peter; Florineth, Florin

    2013-04-01

    Erosion control systems consisting of technical and biological components are widely accepted and proven to work well if installed properly with regard to site-specific parameters. A wide range of implementation measures exists for this protection purpose, and new, particularly technical, solutions are constantly introduced to the market. Nevertheless, the vegetation aspects of erosion control measures are frequently disregarded and deserve greater consideration against the backdrop of developing and implementing adaptation strategies for an environment altered by climate-change-associated effects. Technical auxiliaries such as the geotextiles typically used for slope protection (nettings, blankets, turf reinforcement mats, etc.) address specific functions, and given their structural and material diversity, differing effects on sediment yield, surface runoff, and vegetation development seem evident. Nevertheless, there is a knowledge gap concerning the mutual interaction between technical and biological components, and specific comparable data on the erosion-reducing effects of technical-biological erosion protection systems are insufficient. In this context, an experimental arrangement was set up to study the correlated influences of geotextiles and vegetation and to determine their combined effects on surface runoff and soil loss during simulated heavy rainfall events. Sowing vessels filled with topsoil serve as testing facilities, with various organic and synthetic geotextiles applied and a reliable, drought-resistant seed mixture sown. Regular vegetation monitoring as well as two rainfall simulation runs, with four repetitions of each variant, were conducted. A portable rainfall simulator with a standardized rainfall intensity of 240 mm h-1 and a three-minute rainfall duration was used to stress these systems at different stages of plant development at an inclination of 30 degrees. First results show

  16. Stochastic simulation in systems biology

    PubMed Central

    Székely, Tamás; Burrage, Kevin

    2014-01-01

    Natural systems are, almost by definition, heterogeneous: this can be either a boon or an obstacle to be overcome, depending on the situation. Traditionally, when constructing mathematical models of these systems, heterogeneity has typically been ignored, despite its critical role. However, in recent years, stochastic computational methods have become commonplace in science. They are able to appropriately account for heterogeneity; indeed, they are based around the premise that systems inherently contain at least one source of heterogeneity (namely, intrinsic heterogeneity). In this mini-review, we give a brief introduction to theoretical modelling and simulation in systems biology and discuss the three different sources of heterogeneity in natural systems. Our main topic is an overview of stochastic simulation methods in systems biology. There are many different types of stochastic methods. We focus on one group that has become especially popular in systems biology, biochemistry, chemistry and physics. These discrete-state stochastic methods do not follow individuals over time; rather they track only total populations. They also assume that the volume of interest is spatially homogeneous. We give an overview of these methods, with a discussion of the advantages and disadvantages of each, and suggest when each is more appropriate to use. We also include references to software implementations of them, so that beginners can quickly start using stochastic methods for practical problems of interest. PMID:25505503
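
    The best-known member of this discrete-state family is Gillespie's direct method (the stochastic simulation algorithm, SSA). A minimal sketch for a birth-death model of gene expression, with illustrative rate constants, looks like this:

    ```python
    import random

    def gillespie(k_prod=10.0, k_deg=0.1, x0=0, t_end=200.0, seed=42):
        """Gillespie direct-method SSA for a birth-death process:
        0 -> X at rate k_prod, and X -> 0 at rate k_deg * x.

        Tracks only the total copy number x, assuming a spatially
        homogeneous, well-mixed volume, exactly as the discrete-state
        methods discussed in the text do.
        """
        rng = random.Random(seed)
        t, x = 0.0, x0
        while t < t_end:
            a1, a2 = k_prod, k_deg * x      # reaction propensities
            a0 = a1 + a2
            t += rng.expovariate(a0)        # time to next reaction
            if t >= t_end:
                break
            if rng.random() * a0 < a1:      # pick which reaction fires
                x += 1
            else:
                x -= 1
        return x
    ```

    The stationary distribution here is Poisson with mean k_prod/k_deg, so runs fluctuate around 100 copies; approximate accelerations such as tau-leaping trade this exactness for speed.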

  17. Improving outpatient phlebotomy service efficiency and patient experience using discrete-event simulation.

    PubMed

    Yip, Kenneth; Pang, Suk-King; Chan, Kui-Tim; Chan, Chi-Kuen; Lee, Tsz-Leung

    2016-08-01

    Purpose - The purpose of this paper is to present a simulation modeling application to reconfigure the outpatient phlebotomy service of an acute regional and teaching hospital in Hong Kong, with an aim to improve service efficiency, shorten patient queuing time and enhance workforce utilization. Design/methodology/approach - The system was modeled as an inhomogeneous Poisson process and a discrete-event simulation model was developed to simulate the current setting, and to evaluate how various performance metrics would change if switched from a decentralized to a centralized model. Variations were then made to the model to test different workforce arrangements for the centralized service, so that managers could decide on the service's final configuration via an evidence-based and data-driven approach. Findings - This paper provides empirical insights about the relationship between staffing arrangement and system performance via a detailed scenario analysis. One particular staffing scenario was chosen by managers as it was considered to strike the best balance between performance and the workforce scheduled. The resulting centralized phlebotomy service was successfully commissioned. Practical implications - This paper demonstrates how analytics could be used for operational planning at the hospital level. The authors show that a transparent and evidence-based scenario analysis, made available through analytics and simulation, greatly facilitates management and clinical stakeholders in arriving at the ideal service configuration. Originality/value - The authors provide a robust method for evaluating the relationship between workforce investment, queuing reduction and workforce utilization, which is crucial for managers when deciding the delivery model for any outpatient-related service. PMID:27477930
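
    An inhomogeneous Poisson arrival process of the kind used here can be sampled by Lewis-Shedler thinning: draw candidate arrivals at a constant majorising rate and accept each with probability rate(t)/rate_max. The morning-peak rate profile below is an invented stand-in for the hospital's fitted arrival pattern:

    ```python
    import math
    import random

    def thinned_arrivals(rate_fn, t_end, rate_max, seed=7):
        """Sample arrival times on [0, t_end] from an inhomogeneous
        Poisson process with intensity rate_fn(t) <= rate_max,
        using the thinning (acceptance-rejection) method.
        """
        rng = random.Random(seed)
        t, arrivals = 0.0, []
        while True:
            t += rng.expovariate(rate_max)         # candidate at max rate
            if t > t_end:
                return arrivals
            if rng.random() < rate_fn(t) / rate_max:
                arrivals.append(t)                 # accepted arrival

    def morning_peak(t):
        # illustrative patients-per-minute profile over an 8-hour clinic
        return 2.0 + 1.5 * math.sin(math.pi * t / 480.0)

    arrivals = thinned_arrivals(morning_peak, t_end=480.0, rate_max=3.5)
    ```

    Feeding such arrival streams into the queue model is what lets the simulation reproduce time-of-day congestion rather than only average load.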

  18. Optimized Hypervisor Scheduler for Parallel Discrete Event Simulations on Virtual Machine Platforms

    SciTech Connect

    Yoginath, Srikanth B; Perumalla, Kalyan S

    2013-01-01

    With the advent of virtual machine (VM)-based platforms for parallel computing, it is now possible to execute parallel discrete event simulations (PDES) over multiple virtual machines, in contrast to executing in native mode directly over hardware as has traditionally been done over the past decades. While mature VM-based parallel systems now offer new, compelling benefits such as serviceability, dynamic reconfigurability and overall cost effectiveness, the runtime performance of parallel applications can be significantly affected. In particular, most VM-based platforms are optimized for general workloads, but PDES execution exhibits unique dynamics significantly different from other workloads. Here we first present results from experiments that highlight the gross deterioration of the runtime performance of VM-based PDES simulations when executed using traditional VM schedulers, quantitatively showing the poor scaling properties of the scheduler as the number of VMs is increased. The mismatch is fundamental in nature in the sense that any fairness-based VM scheduler implementation would exhibit it with PDES runs. We also present a new scheduler optimized specifically for PDES applications, and describe its design and implementation. Experimental results obtained from running PDES benchmarks (PHOLD and vehicular traffic simulations) over VMs show over an order of magnitude improvement in the run time of the PDES-optimized scheduler relative to the regular VM scheduler, with over a 20× reduction in the run time of simulations using up to 64 VMs. The observations and results are timely in the context of emerging systems such as cloud platforms and VM-based high performance computing installations, highlighting to the community the need for PDES-specific support, and the feasibility of significantly reducing the runtime overhead for scalable PDES on VM platforms.

  19. Event-triggered consensus tracking of multi-agent systems with Lur'e nonlinear dynamics

    NASA Astrophysics Data System (ADS)

    Huang, Na; Duan, Zhisheng; Wen, Guanghui; Zhao, Yu

    2016-05-01

    In this paper, the distributed consensus tracking problem for networked Lur'e systems is investigated based on event-triggered information interactions. An event-triggered control algorithm is designed with the advantages of reducing controller update frequency and sensor energy consumption. By using the S-procedure and the Lyapunov functional method, some sufficient conditions are derived to guarantee that consensus tracking is achieved under a directed communication topology. Meanwhile, it is shown that Zeno behaviour of the triggering time sequences is excluded under the proposed event-triggered rule. Finally, some numerical simulations on coupled Chua's circuits are performed to illustrate the effectiveness of the theoretical algorithms.
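
    The flavour of event-triggered coordination can be conveyed with a much simpler linear stand-in for the paper's Lur'e dynamics: single-integrator agents on a ring that rebroadcast their state only when it has drifted past a threshold. Topology, gains, and threshold here are illustrative assumptions:

    ```python
    def event_triggered_consensus(steps=400, dt=0.02, threshold=0.05):
        """Four single-integrator agents on a ring graph.

        Each agent rebroadcasts its state only when it has drifted more
        than `threshold` from its last broadcast value; every agent
        steers using only the last-broadcast neighbour states.
        """
        neighbours = {0: (1, 3), 1: (0, 2), 2: (1, 3), 3: (2, 0)}
        x = [0.0, 1.0, 2.0, 3.0]       # true states
        xb = list(x)                   # last broadcast values
        events = 0
        for _ in range(steps):
            for i in range(4):
                if abs(x[i] - xb[i]) > threshold:   # trigger: rebroadcast
                    xb[i] = x[i]
                    events += 1
            # consensus protocol driven by broadcast values only
            u = [sum(xb[j] - xb[i] for j in neighbours[i]) for i in range(4)]
            x = [xi + dt * ui for xi, ui in zip(x, u)]
        spread = max(x) - min(x)
        return spread, events
    ```

    The states converge to a neighbourhood of consensus whose size scales with the threshold, while the number of broadcasts stays far below one per agent per step, which is the communication saving the paper quantifies for the nonlinear case.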

  20. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering in the Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. The propulsion system design environment is created with commercially available software called iSIGHT, a generic computational framework, together with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and to facilitate information transfer among the various codes. The resulting modeling and simulation (M&S) environment, in conjunction with the response surface method, provides the designer/decision-maker an analytical means to examine the entire design space from a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.

  1. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2010-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring, WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the QBO as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed WP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce a realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the QBO is estimated.

  2. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2011-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring, WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the QBO as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed WP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce a realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the QBO is estimated.

  3. Network-based event-triggered filtering for Markovian jump systems

    NASA Astrophysics Data System (ADS)

    Wang, Huijiao; Shi, Peng; Agarwal, Ramesh K.

    2016-06-01

    The problem of event-triggered H∞ filtering for networked Markovian jump systems is studied in this paper. A dynamic discrete event-triggered scheme is designed to choose the transmitted data for different Markovian jumping modes. A time-delay modelling method is employed to cast the event-triggered scheme and the network-induced behaviour, such as transmission delay, data packet dropout and disorder, into a networked Markovian time-delay jump system. Furthermore, a sufficient condition is derived to guarantee that the resulting filtering error system is stochastically stable with a prescribed performance index. A co-design method for the H∞ filter and the event-triggered scheme is then proposed. The effectiveness and potential of the theoretical results obtained are illustrated by a simulation example.

  4. MERTIS: system theory and simulation

    NASA Astrophysics Data System (ADS)

    Paproth, Carsten; Säuberlich, Thomas; Jahn, Herbert; Helbert, Jörn

    2010-09-01

    The deep-space ESA mission BepiColombo to planet Mercury will carry the advanced infrared remote sensing instrument MERTIS (MErcury Radiometer and Thermal infrared Imaging Spectrometer). The mission has the goal of exploring the planet's interior and surface structure and its environment. MERTIS will investigate Mercury's surface layer within a spectral range of 7-14 μm to specify and map Mercury's mineralogical composition with a spatial resolution of 500 m. Due to the limited mass and power budget, the micro-bolometer detector array will only be temperature-stabilized, not cooled. A theoretical description of the instrument is necessary to estimate its performance, especially the signal-to-noise ratio. For that purpose, theoretical models are derived from system theory. For a better evaluation and understanding of the instrument performance, simulations are performed to compute the passage of radiation from a hypothetical mineralogical surface composition through the optical system, the influence of the inner instrument radiation, and the conversion of the overall radiation into a detector voltage and digital output signal. The results of the simulation can support the optimization of the instrument parameters and could also assist the analysis of gathered scientific data. The simulation tool can likewise be used for performance estimations of MERTIS-like systems in future projects.
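    As a hedged illustration of the first ingredient of such a performance estimate, the sketch below integrates blackbody radiance over the instrument's 7-14 μm band; the surface temperatures are illustrative and this is not the MERTIS performance model itself.

```python
# Back-of-envelope sketch (not the MERTIS performance model): band-integrated
# blackbody radiance over a 7-14 micron range, a first ingredient of any
# signal-to-noise estimate.  Surface temperatures below are illustrative.

import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(lam, T):
    """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * T)) - 1)

def band_radiance(T, lam_lo=7e-6, lam_hi=14e-6, n=500):
    """Trapezoidal integration of B over the band, W m^-2 sr^-1."""
    dlam = (lam_hi - lam_lo) / n
    s = 0.5 * (planck(lam_lo, T) + planck(lam_hi, T))
    for i in range(1, n):
        s += planck(lam_lo + i * dlam, T)
    return s * dlam

# Mercury dayside vs. nightside (illustrative temperatures, K)
print(band_radiance(700.0), band_radiance(100.0))
```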

  5. A Nonlinear Propulsion System Simulation Technique for Piloted Simulators

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.

    1981-01-01

    In the past, propulsion system simulations used in flight simulators have been extremely simple. This resulted in a loss of simulation realism, since significant engine and aircraft interactions were neglected and important internal engine parameters were not computed. More detailed propulsion system simulations are needed to permit evaluations of modern aircraft propulsion systems in a simulated flight environment. A real-time digital simulation technique has been developed which provides the capabilities needed to evaluate propulsion system performance and aircraft system interaction on manned flight simulators. A parameter correlation technique is used with real and pseudo dynamics in a stable integration convergence loop. The technique has been applied to a multivariable propulsion system for use in a piloted NASA flight simulator program. Cycle time is 2.0 ms on a Univac 1110 computer and 5.7 ms on the simulator computer, a Xerox Sigma 8. The model is stable and accurate with time steps up to 50 ms. The program evaluated the simulation technique and the propulsion system digital control. The simulation technique and model used in that program are described and results from the simulation are presented.

  6. Hierarchical Discrete Event Supervisory Control of Aircraft Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Yasar, Murat; Tolani, Devendra; Ray, Asok; Shah, Neerav; Litt, Jonathan S.

    2004-01-01

    This paper presents a hierarchical application of Discrete Event Supervisory (DES) control theory for intelligent decision and control of a twin-engine aircraft propulsion system. A dual layer hierarchical DES controller is designed to supervise and coordinate the operation of two engines of the propulsion system. The two engines are individually controlled to achieve enhanced performance and reliability, necessary for fulfilling the mission objectives. Each engine is operated under a continuously varying control system that maintains the specified performance and a local discrete-event supervisor for condition monitoring and life extending control. A global upper level DES controller is designed for load balancing and overall health management of the propulsion system.

  7. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near-optimal control scheme for uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor are tuned aperiodically at every triggered instant. An adaptive event-trigger condition to decide the trigger instants is derived, so that a suitable number of events is generated to ensure a desired accuracy of approximation. Near-optimal performance is achieved without using value and/or policy iterations. A detailed analysis of the nontrivial inter-event times, with an explicit formula showing the reduction in computation, is also given. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. Simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP. PMID:26285220

  8. Simulation of debris flow events in Sicily by cellular automata model SCIDDICA_SS3

    NASA Astrophysics Data System (ADS)

    Cancelliere, A.; Lupiano, V.; Peres, D. J.; Stancanelli, L.; Avolio, M.; Foti, E.; Di Gregorio, S.

    2013-12-01

    Debris flow models are widely used for hazard mapping or for evaluating the effectiveness of risk mitigation measures. Several models analyze the dynamics of debris flow runout by solving partial differential equations. When using such models, difficulties arise in estimating the kinematic and geotechnical soil parameters of real phenomena. In order to overcome such difficulties, alternative semi-empirical approaches can be employed, such as macroscopic Cellular Automata (CA). In particular, for CA simulation purposes, the runout of debris flows emerges from local interactions in a dynamical system subdivided into elementary parts, whose state evolves within a spatial and temporal discretum. The attributes of each cell (substates) describe its physical characteristics. For computational reasons, the natural phenomenon is split into a number of elementary processes, whose proper composition makes up the CA transition function. By simultaneously applying this function to all the cells, the evolution of the phenomenon can be simulated in terms of modifications of the substates. In this study, we present an application of the macroscopic CA semi-empirical model SCIDDICA_SS3 to the Peloritani Mountains area on the island of Sicily, Italy. The model was applied using detailed data from the 1 October 2009 debris flow event, which was triggered by a rainfall event of about 250 mm falling in 9 hours and caused the death of 37 people. This region is characterized by river valleys with steep hillslopes (30°-60°), catchment basins of small extent (0.5-12 km2) and soils of easily eroded metamorphic material. CA usage implies a calibration phase, which identifies an optimal set of parameters capable of adequately reproducing the considered case, and a validation phase, which tests the model on a sufficient (and different) number of cases similar in terms of physical and geomorphological properties. 
The performance of the model can be measured in terms of a fitness
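    The transition-function idea can be made concrete with a toy example. The redistribution rule below is a deliberately simplified sketch in the spirit of macroscopic CA, not SCIDDICA_SS3's actual transition function; the grid, substates and outflow fraction are invented.

```python
# Toy macroscopic-CA step (not SCIDDICA's transition function): each cell
# holds substates (altitude, debris thickness), and one synchronous update
# moves a fraction of the debris toward the lowest neighbouring cell.

def ca_step(alt, deb, frac=0.5):
    """One synchronous update on a 2-D grid; returns the new debris field."""
    rows, cols = len(alt), len(alt[0])
    out = [row[:] for row in deb]
    for i in range(rows):
        for j in range(cols):
            if deb[i][j] <= 0:
                continue
            h_here = alt[i][j] + deb[i][j]
            # von Neumann neighbourhood
            nbrs = [(i + di, j + dj) for di, dj in ((1,0),(-1,0),(0,1),(0,-1))
                    if 0 <= i + di < rows and 0 <= j + dj < cols]
            lo = min(nbrs, key=lambda p: alt[p[0]][p[1]] + deb[p[0]][p[1]])
            h_lo = alt[lo[0]][lo[1]] + deb[lo[0]][lo[1]]
            if h_lo < h_here:                     # debris flows only downhill
                q = frac * min(deb[i][j], h_here - h_lo)
                out[i][j] -= q
                out[lo[0]][lo[1]] += q
    return out

alt = [[3.0, 2.0, 1.0], [3.0, 2.0, 1.0], [3.0, 2.0, 1.0]]  # sloping plane
deb = [[1.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]  # debris source
for _ in range(5):
    deb = ca_step(alt, deb)
print(round(sum(map(sum, deb)), 9))   # total debris mass is conserved
```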

  9. Characteristics and dependencies of error in satellite-based flood event simulations

    NASA Astrophysics Data System (ADS)

    Mei, Yiwen; Nikolopoulos, Efthymios I.; Anagnostou, Emmanouil N.; Zoccatelli, Davide; Borga, Marco

    2016-04-01

    The error in satellite-precipitation-driven flood simulations over complex terrain is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e. rainfall and runoff cumulative depth and time series shape). Overall, the error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference, while the cumulative depth is mostly underestimated. The study shows a dampening effect in both the systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and for the rain flood events with high runoff coefficients. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.

  10. A systems neurophysiology approach to voluntary event coding.

    PubMed

    Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian

    2016-07-15

    Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding" (TEC) being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding predominantly included stimulus categorization, feature unbinding and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). On a systems-neurophysiological level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but did not modulate the other processes. The perceptual categorization stage appears central for voluntary event-file coding. PMID:27153981

  11. Simulation of moisture in alpine rock walls during freeze-thaw events

    NASA Astrophysics Data System (ADS)

    Schnepfleitner, Harald; Rode, Matthias; Sass, Oliver

    2014-05-01

    Rock moisture distribution during freeze-thaw events is the key to understanding frost weathering and subsequent rockfall. Data on moisture levels of natural rock walls are scarce and difficult to measure. An innovative and cheap way to avoid these problems is the use of simulation calculations; although they are an abstraction of the real system, they are widely used in natural science. A novel way to simulate moisture in natural rock walls is the use of the software WUFI, which was developed to model the moisture behaviour of building materials. However, the considerable know-how behind these commercial applications has not been exploited for geomorphological research to date. The necessary input data for the simulation are climate data in hourly resolution (temperature, rainfall, wind, irradiation) and material properties (porosity, sorption and diffusivity parameters) of the prevailing rock. Two different regions were analysed, the Gesäuse (Johnsbachtal: 700 m, limestone and dolomite) and the Sonnblick (3000 m, gneiss and granite). We aimed at comparing the two regions in terms of general susceptibility to frost weathering, the influence of aspect, inclination and rock parameters, and the possible impact of climate change. The calculated 1D moisture profiles and the temporal progress of rock moisture, in combination with temperature data, allow the detection of possible periods of active weathering and resulting rockfalls. These results were analyzed based on two different frost weathering theories, the "classical" frost shattering theory (requiring a high number of freeze-thaw cycles and a pore saturation of 90%) and the segregation ice theory (requiring a long freezing period and a pore saturation threshold of approx. 60%). An additionally considered critical factor for both theories was the frost depth, namely the duration of the "frost cracking window" (between -3 and -10°C) at each site. The results show that in both areas, north-facing rocks are

  12. Simulation System for Training in Laparoscopic Surgery

    NASA Technical Reports Server (NTRS)

    Basdogan, Cagatay; Ho, Chih-Hao

    2003-01-01

    A computer-based simulation system creates a visual and haptic virtual environment for training a medical practitioner in laparoscopic surgery. Heretofore, it has been common practice to perform training in partial laparoscopic surgical procedures by use of a laparoscopic training box that encloses a pair of laparoscopic tools, objects to be manipulated by the tools, and an endoscopic video camera. However, the surgical procedures simulated by use of a training box are usually poor imitations of the actual ones. The present computer-based system improves training by presenting a more realistic simulated environment to the trainee. The system includes a computer monitor that displays a real-time image of the affected interior region of the patient, showing laparoscopic instruments interacting with organs and tissues, as would be viewed by use of an endoscopic video camera and displayed to a surgeon during a laparoscopic operation. The system also includes laparoscopic tools that the trainee manipulates while observing the image on the computer monitor (see figure). The instrumentation on the tools consists of (1) position and orientation sensors that provide input data for the simulation and (2) actuators that provide force feedback to simulate the contact forces between the tools and tissues. The simulation software includes components that model the geometries of surgical tools, components that model the geometries and physical behaviors of soft tissues, and components that detect collisions between them. Using the measured positions and orientations of the tools, the software detects whether they are in contact with tissues. In the event of contact, the deformations of the tissues and contact forces are computed by use of the geometric and physical models. The image on the computer screen shows tissues deformed accordingly, while the actuators apply the corresponding forces to the distal ends of the tools. 
For the purpose of demonstration, the system has been set
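    The contact-force computation described above can be illustrated with a minimal penalty-force sketch. This is not the system's actual tissue model: the spherical organ proxy, the stiffness k and the geometry are invented for the example.

```python
# Minimal sketch of the penalty-force idea used in haptic rendering (not the
# system's tissue model): test the tool tip against a spherical organ proxy
# and, on penetration, return a spring force along the surface normal.

import math

def contact_force(tip, center, radius, k=200.0):
    """Return (fx, fy, fz) pushing the tool tip out of the sphere."""
    d = [t - c for t, c in zip(tip, center)]
    dist = math.sqrt(sum(x * x for x in d))
    pen = radius - dist                    # penetration depth
    if pen <= 0 or dist == 0:
        return (0.0, 0.0, 0.0)             # no contact (or degenerate case)
    n = [x / dist for x in d]              # outward surface normal
    return tuple(k * pen * ni for ni in n) # Hooke-style penalty force

print(contact_force((0.0, 0.0, 1.2), (0.0, 0.0, 0.0), 1.0))  # tip outside
print(contact_force((0.0, 0.0, 0.9), (0.0, 0.0, 0.0), 1.0))  # tip inside
```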

  13. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291
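    The decoupling the abstract describes, a network layer that only routes timed spike events while each neuron object owns its internal dynamics, can be sketched with a priority queue. The code below is a generic event-driven kernel, not NEVESIM's API; the thresholds, weights and delays are invented.

```python
# Generic event-driven simulation kernel in the style the abstract describes
# (not NEVESIM's API): spikes are events on a priority queue ordered by time;
# the network layer only routes them, each neuron owns its own dynamics.

import heapq

class Neuron:
    def __init__(self, threshold=1.0):
        self.v, self.threshold = 0.0, threshold
    def receive(self, w):
        """Integrate a weighted spike; return True if this neuron fires."""
        self.v += w
        if self.v >= self.threshold:
            self.v = 0.0
            return True
        return False

def simulate(n_neurons, synapses, initial_spikes, t_max=10.0):
    """synapses: {src: [(dst, weight, delay), ...]}; spikes: (t, target, w)."""
    neurons = [Neuron() for _ in range(n_neurons)]
    queue = list(initial_spikes)
    heapq.heapify(queue)
    fired = []
    while queue:
        t, tgt, w = heapq.heappop(queue)     # always the earliest event
        if t > t_max:
            break
        if neurons[tgt].receive(w):
            fired.append((t, tgt))
            for dst, wt, delay in synapses.get(tgt, []):
                heapq.heappush(queue, (t + delay, dst, wt))
    return fired

# tiny feed-forward chain: neuron 0 -> 1 -> 2 with 0.5 time-unit delays
syn = {0: [(1, 1.0, 0.5)], 1: [(2, 1.0, 0.5)]}
print(simulate(3, syn, [(0.0, 0, 1.0)]))
```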

  14. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  15. Estimating Flood Quantiles on the Basis of Multi-Event Rainfall Simulation - Case Study

    NASA Astrophysics Data System (ADS)

    Jarosińska, Elżbieta; Pierzga, Katarzyna

    2015-12-01

    This paper presents an approach to estimating the probability distribution of annual maximum discharges Q based on rainfall-runoff modelling using multiple rainfall events. The approach is based on prior knowledge of the probability distribution of annual maximum daily rainfall totals P in a natural catchment, random disaggregation of the totals into hourly values, and rainfall-runoff modelling. The presented Multi-Event Simulation of Extreme Flood (MESEF) method combines the design-event method, based on single-rainfall-event modelling, with the continuous simulation method used for estimating maximum discharges of a given exceedance probability with rainfall-runoff models. In the paper, the flood quantiles were estimated using the MESEF method and then compared to the flood quantiles estimated using a classical statistical method based on observed data.
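    The multi-event idea can be sketched as a small Monte Carlo experiment: draw many annual-maximum rainfall events from an assumed distribution, push each through a (deliberately crude) event-scale runoff model, and read flood quantiles off the simulated discharges. The Gumbel parameters, the rational-formula runoff model and the catchment values below are invented, not MESEF's actual components.

```python
# Hedged sketch of the multi-event simulation idea (not MESEF itself):
# sample annual-maximum rainfall, convert each event to a peak discharge
# with a crude rational-formula model, and estimate flood quantiles.

import math, random

def gumbel_sample(mu, beta, rng):
    """Annual maximum daily rainfall P (mm), Gumbel-distributed."""
    return mu - beta * math.log(-math.log(rng.random()))

def runoff(p_mm, area_km2=10.0, coeff=0.4):
    """Toy event model: peak discharge (m^3/s), rational-formula style."""
    intensity = p_mm / 24.0                      # mean mm/h over the day
    return 0.278 * coeff * intensity * area_km2

def flood_quantile(prob_exceed, n=20000, seed=1):
    rng = random.Random(seed)
    q = sorted(runoff(gumbel_sample(40.0, 12.0, rng)) for _ in range(n))
    return q[int((1.0 - prob_exceed) * n)]       # empirical quantile

# 10-year and 100-year floods (exceedance probabilities 0.1 and 0.01)
q10, q100 = flood_quantile(0.1), flood_quantile(0.01)
print(round(q10, 2), round(q100, 2))
```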

  16. Discrete-event simulation of a wide-area health care network.

    PubMed Central

    McDaniel, J G

    1995-01-01

    OBJECTIVE: Predict the behavior and estimate the telecommunication cost of a wide-area message store-and-forward network for health care providers that uses the telephone system. DESIGN: A tool with which to perform large-scale discrete-event simulations was developed. Network models for star and mesh topologies were constructed to analyze the differences in performance and telecommunication cost. The distribution of nodes in the network models approximates the distribution of physicians, hospitals, medical labs, and insurers in the Province of Saskatchewan, Canada. Modeling parameters were based on measurements taken from a prototype telephone network and a survey conducted at two medical clinics. Simulation studies were conducted for both topologies. RESULTS: For either topology, the telecommunication cost of a network in Saskatchewan is projected to be less than $100 (Canadian) per month per node. The estimated telecommunication cost of the star topology is approximately half that of the mesh. Simulations predict that a mean end-to-end message delivery time of two hours or less is achievable at this cost. A doubling of the data volume results in an increase of less than 50% in the mean end-to-end message transfer time. CONCLUSION: The simulation models provided an estimate of network performance and telecommunication cost in a specific Canadian province. At the expected operating point, network performance appeared to be relatively insensitive to increases in data volume. Similar results might be anticipated in other rural states and provinces in North America where a telephone-based network is desired. PMID:7583646
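    A miniature version of such a study can be written as a discrete-event simulation in a few lines. The star-topology sketch below is illustrative only; the message counts, upload and service times are invented and are not the parameters measured in the Saskatchewan prototype.

```python
# Sketch of the kind of discrete-event model described above (parameters
# invented): in a star topology every message is uploaded to a hub, waits
# for the hub's single modem, and is then delivered in arrival order.

import heapq, random

def simulate_star(n_msgs=1000, upload=0.01, service=0.005, seed=7):
    rng = random.Random(seed)
    # messages arrive at random times over an 8-hour day (times in hours)
    events = [(rng.random() * 8.0, i) for i in range(n_msgs)]
    heapq.heapify(events)
    hub_free, delays = 0.0, []
    while events:
        t_arrive, _i = heapq.heappop(events)   # earliest arrival first
        t_at_hub = t_arrive + upload
        start = max(t_at_hub, hub_free)        # queue behind earlier messages
        hub_free = start + service
        delays.append(hub_free - t_arrive)     # end-to-end delivery time
    return sum(delays) / len(delays)

print(round(simulate_star(), 4))   # mean end-to-end delay, hours
```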

  17. Simulation of linear mechanical systems

    NASA Technical Reports Server (NTRS)

    Sirlin, S. W.

    1993-01-01

    A dynamics and controls analyst is typically presented with a structural dynamics model and must perform various input/output tests and design control laws. The required time/frequency simulations need to be done many times as models change and control designs evolve. This paper examines some simple ways that open and closed loop frequency and time domain simulations can be done using the special structure of the system equations usually available. Routines were developed to run under Pro-Matlab in a mixture of the Pro-Matlab interpreter and FORTRAN (using the .mex facility). These routines are often orders of magnitude faster than trying the typical 'brute force' approach of using built-in Pro-Matlab routines such as bode. This makes the analyst's job easier since not only does an individual run take less time, but much larger models can be attacked, often allowing the whole model reduction step to be eliminated.
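    The kind of structural shortcut the paper exploits can be illustrated for a modally damped system, where the frequency response is a cheap sum over modes instead of a dense complex solve at every frequency. The 2-DOF mode shapes, frequencies and damping ratios below are invented; the point is only that the two computations agree.

```python
# Sketch of the structural trick: for a modally damped system, the frequency
# response is a sum over modes rather than a dense complex solve per
# frequency.  The 2-DOF numbers below are invented.

import math

s = 1 / math.sqrt(2)
PHI = [[s, s], [s, -s]]   # orthonormal mode shapes (columns = modes)
W = [2.0, 5.0]            # modal frequencies, rad/s
Z = [0.02, 0.02]          # modal damping ratios

def frf_modal(f, w):
    """Response x(w) by modal superposition (cheap path)."""
    x = [0j, 0j]
    for i in range(2):
        phi = [PHI[0][i], PHI[1][i]]
        q = (phi[0] * f[0] + phi[1] * f[1]) / (W[i]**2 - w**2 + 2j * Z[i] * W[i] * w)
        x[0] += phi[0] * q
        x[1] += phi[1] * q
    return x

def frf_direct(f, w):
    """Same response by assembling and solving the 2x2 dynamic stiffness."""
    def assemble(diag):
        return [[sum(PHI[r][i] * diag[i] * PHI[c][i] for i in range(2))
                 for c in range(2)] for r in range(2)]
    K = assemble([wi**2 for wi in W])                 # stiffness (M = I)
    C = assemble([2 * zi * wi for zi, wi in zip(Z, W)])
    D = [[K[r][c] - (w**2 if r == c else 0) + 1j * w * C[r][c]
          for c in range(2)] for r in range(2)]
    det = D[0][0] * D[1][1] - D[0][1] * D[1][0]      # Cramer's rule
    return [(D[1][1] * f[0] - D[0][1] * f[1]) / det,
            (D[0][0] * f[1] - D[1][0] * f[0]) / det]

xm = frf_modal([1.0, 0.0], 3.0)
xd = frf_direct([1.0, 0.0], 3.0)
print(all(abs(a - b) < 1e-12 for a, b in zip(xm, xd)))
```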

  18. A View on Future Building System Modeling and Simulation

    SciTech Connect

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  19. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for the complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under given constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines, which appear in the global model as local models. A local model, from the perspective of DEDS theory, is described by the following: a set of system and transition states; an event alphabet that portrays the actions that take a submachine from one state to another; an initial system state; a partial function that maps the current state and event alphabet to the next state; and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that they can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or a feedback DEDS controller (closed-loop control).
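    A local model in this sense is essentially a timed partial transition function, which can be sketched directly. The machine states, event alphabet and durations below are invented, not taken from the paper.

```python
# Minimal local-model sketch in the DEDS style described above: states, an
# event alphabet, a partial transition function, and per-event durations.
# The machine and its timings are invented.

# partial function: (state, event) -> (next_state, duration)
DELTA = {
    ("idle", "load"):     ("loaded", 2.0),
    ("loaded", "weld"):   ("welded", 5.0),
    ("welded", "unload"): ("idle", 1.0),
}

def run(state, events):
    """Apply an event string; return (final_state, elapsed_time)."""
    t = 0.0
    for ev in events:
        if (state, ev) not in DELTA:          # event not enabled in this state
            raise ValueError(f"event {ev!r} undefined in state {state!r}")
        state, dt = DELTA[(state, ev)]
        t += dt
    return state, t

print(run("idle", ["load", "weld", "unload"]))   # one full work cycle
```

    A supervisor in this framework would restrict which events it allows at each state; the partial function above already encodes which events are physically enabled.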

  20. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...(a) and 1 CFR part 51. (3) A notice of any changes made to the material incorporated by reference... injection systems and the low pressure injection function of residual (decay) heat removal systems. (4) ECCS... radioactive material; or (D) Mitigate the consequences of an accident. (vi) Events covered in paragraph...

  4. An abrupt climate event in a coupled ocean-atmosphere simulation without external forcing.

    PubMed

    Hall, A; Stouffer, R J

    2001-01-11

    Temperature reconstructions from the North Atlantic region indicate frequent abrupt and severe climate fluctuations during the last glacial and Holocene periods. The driving forces for these events are unclear and coupled atmosphere-ocean models of global circulation have only simulated such events by inserting large amounts of fresh water into the northern North Atlantic Ocean. Here we report a drastic cooling event in a 15,000-yr simulation of global circulation with present-day climate conditions without the use of such external forcing. In our simulation, the annual average surface temperature near southern Greenland spontaneously fell 6-10 standard deviations below its mean value for a period of 30-40 yr. The event was triggered by a persistent northwesterly wind that transported large amounts of buoyant cold and fresh water into the northern North Atlantic Ocean. Oceanic convection shut down in response to this flow, concentrating the entire cooling of the northern North Atlantic by the colder atmosphere in the uppermost ocean layer. Given the similarity between our simulation and observed records of rapid cooling events, our results indicate that internal atmospheric variability alone could have generated the extreme climate disruptions in this region. PMID:11196636

  5. BEEC: An event generator for simulating the Bc meson production at an e+e- collider

    NASA Astrophysics Data System (ADS)

    Yang, Zhi; Wu, Xing-Gang; Wang, Xian-You

    2013-12-01

    The Bc meson is a doubly heavy quark-antiquark bound state and carries flavors explicitly, which provides a fruitful laboratory for testing potential models and understanding the weak decay mechanisms for heavy flavors. In view of the prospects in Bc physics at hadronic colliders such as the Tevatron and the LHC, Bc physics is attracting more and more attention. It has been shown that a high luminosity e+e- collider running around the Z0-peak is also helpful for studying the properties of the Bc meson and has its own advantages. For this purpose, we write down an event generator for simulating Bc meson production through e+e- annihilation according to relevant publications. We name it BEEC, in which the color-singlet S-wave and P-wave (cb¯)-quarkonium states together with the color-octet S-wave (cb¯)-quarkonium states can be generated. BEEC can also be adopted to generate the similar charmonium and bottomonium states via the semi-exclusive channels e++e-→|(QQ¯)[n]>+Q+Q¯ with Q=b and c respectively. To increase the simulation efficiency, we make the amplitude as compact as possible by using the improved trace technology. BEEC is a Fortran program written in a PYTHIA-compatible format with a modular structure, so one may apply it to various situations or experimental environments conveniently by using the GNU C compiler make. A method to improve the efficiency of generating unweighted events within the PYTHIA environment is proposed. Moreover, BEEC will generate a standard Les Houches Event data file that contains useful information of the meson and its accompanying partons, which can be conveniently imported into PYTHIA to do further hadronization and decay simulation. Catalogue identifier: AEQC_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEQC_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in
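
    The record mentions a method for efficiently generating unweighted events. As a hedged illustration of the standard idea behind unweighting (hit-or-miss acceptance against the maximum weight), here is a toy sketch; the linear weight function, the maximum weight, and the sample size are all invented for illustration and are not BEEC's actual procedure.

```python
import random

def unweight(candidates, weight, w_max, rng):
    """Keep each weighted candidate with probability weight(x) / w_max."""
    return [x for x in candidates if rng.random() * w_max < weight(x)]

rng = random.Random(42)
# Toy phase-space points uniform on [0, 1] with an invented weight w(x) = x.
candidates = [rng.uniform(0.0, 1.0) for _ in range(10000)]
kept = unweight(candidates, lambda x: x, w_max=1.0, rng=rng)
# The surviving events are distributed proportionally to the weight,
# so they can be handed to downstream hadronization with unit weight.
```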

  6. Coupled atmosphere-ocean-wave simulations of a storm event over the Gulf of Lion and Balearic Sea

    USGS Publications Warehouse

    Renault, Lionel; Chiggiato, Jacopo; Warner, John C.; Gomez, Marta; Vizoso, Guillermo; Tintore, Joaquin

    2012-01-01

    The coastal areas of the North-Western Mediterranean Sea are among the most challenging places for ocean forecasting. This region is exposed to severe storm events of short duration. During these events, significant air-sea interactions, strong winds and large sea-state can have catastrophic consequences in the coastal areas. To investigate these air-sea interactions and the oceanic response to such events, we implemented the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System to simulate a severe storm in the Mediterranean Sea that occurred in May 2010. During this event, wind speed reached up to 25 m.s-1, inducing significant sea surface cooling (up to 2°C) over the Gulf of Lion (GoL) and along the storm track, and generating surface waves with a significant height of 6 m. It is shown that the event, associated with a cyclogenesis between the Balearic Islands and the GoL, is relatively well reproduced by the coupled system. A surface heat budget analysis showed that ocean vertical mixing was a major contributor to the cooling tendency along the storm track and in the GoL, where turbulent heat fluxes also played an important role. Sensitivity experiments on the ocean-atmosphere coupling suggested that the coupled system is sensitive to the momentum flux parameterization as well as air-sea and air-wave coupling. Comparisons with available atmospheric and oceanic observations showed that the use of the fully coupled system provides the most skillful simulation, illustrating the benefit of using a fully coupled ocean-atmosphere-wave model for the assessment of these storm events.

  7. Low-dose photons modify liver response to simulated solar particle event protons.

    PubMed

    Gridley, Daila S; Coutrakon, George B; Rizvi, Asma; Bayeta, Erben J M; Luo-Owen, Xian; Makinde, Adeola Y; Baqai, Farnaz; Koss, Peter; Slater, James M; Pecaut, Michael J

    2008-03-01

    The health consequences of exposure to low-dose radiation combined with a solar particle event during space travel remain unresolved. The goal of this study was to determine whether protracted radiation exposure alters gene expression and oxidative burst capacity in the liver, an organ vital in many biological processes. C57BL/6 mice were whole-body irradiated with 2 Gy simulated solar particle event (SPE) protons over 36 h, both with and without pre-exposure to low-dose/low-dose-rate photons ((57)Co, 0.049 Gy total at 0.024 cGy/h). Livers were excised immediately after irradiation (day 0) or on day 21 thereafter for analysis of 84 oxidative stress-related genes using RT-PCR; genes up- or down-regulated by more than twofold were noted. On day 0, genes with increased expression were: photons, none; simulated SPE, Id1; photons + simulated SPE, Bax, Id1, Snrp70. Down-regulated genes at this same time were: photons, Igfbp1; simulated SPE, Arnt2, Igfbp1, Il6, Lct, Mybl2, Ptx3. By day 21, a much greater effect was noted than on day 0. Exposure to photons + simulated SPE up-regulated completely different genes than those up-regulated after either photons or the simulated SPE alone (photons, Cstb; simulated SPE, Dctn2, Khsrp, Man2b1, Snrp70; photons + simulated SPE, Casp1, Col1a1, Hspcb, Il6st, Rpl28, Spnb2). There were many down-regulated genes in all irradiated groups on day 21 (photons, 13; simulated SPE, 16; photons + simulated SPE, 16), with very little overlap among groups. Oxygen radical production by liver phagocytes was significantly enhanced by photons on day 21. The results demonstrate that whole-body irradiation with low-dose-rate photons, as well as time after exposure, had a great impact on liver response to a simulated solar particle event. PMID:18302490

  8. Simulated seismic event release fraction data: Progress report, April 1986-April 1987

    SciTech Connect

    Langer, G.; Deitesfeld, C.A.

    1987-11-15

    The object of this project is to obtain experimental data on the release of airborne particles during seismic events involving plutonium handling facilities. In particular, cans containing plutonium oxide powder may be involved and some of the powder may become airborne. No release fraction data for such scenarios are available and risk assessment calculations for such events lacked specificity describing the physical processes involved. This study has provided initial data based on wind tunnel tests simulating the impact of the debris on simulated cans of plutonium oxide powder. The release fractions are orders of magnitude smaller than previously available estimates. 8 refs., 3 figs., 2 tabs.

  9. On computer-intensive simulation and estimation methods for rare-event analysis in epidemic models.

    PubMed

    Clémençon, Stéphan; Cousien, Anthony; Felipe, Miraine Dávila; Tran, Viet Chi

    2015-12-10

    This article focuses, in the context of epidemic models, on rare events that may possibly correspond to crisis situations from the perspective of public health. In general, no closed analytic form for their occurrence probabilities is available, and crude Monte Carlo procedures fail. We show how recent intensive computer simulation techniques, such as interacting branching particle methods, can be used for estimation purposes, as well as for generating model paths that correspond to realizations of such events. Applications of these simulation-based methods to several epidemic models fitted from real datasets are also considered and discussed thoroughly. PMID:26242476
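
    The interacting branching particle methods cited in this record are too involved for a few lines, but the underlying idea — simulating under a biased law and correcting with likelihood-ratio weights — can be shown with plain importance sampling. The target below (the tail probability of an Exp(1) variable) is a toy stand-in, not an epidemic model from the paper; note that crude Monte Carlo at this sample size would see only a handful of hits.

```python
import math
import random

def rare_event_is(n, threshold, proposal_rate, rng):
    """Estimate P(X > threshold) for X ~ Exp(1) by sampling X ~ Exp(proposal_rate)
    and reweighting each hit by the likelihood ratio f(x)/g(x)."""
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(proposal_rate)  # biased law: heavier tail
        if x > threshold:
            # f(x)/g(x) = exp(-x) / (rate * exp(-rate * x))
            total += math.exp(-x) / (proposal_rate * math.exp(-proposal_rate * x))
    return total / n

rng = random.Random(3)
estimate = rare_event_is(200_000, threshold=10.0, proposal_rate=0.1, rng=rng)
exact = math.exp(-10.0)  # ≈ 4.54e-5
```

    Biasing the proposal toward the rare region makes hits common, while the weights keep the estimator unbiased; the branching particle methods in the record automate a similar reweighting across intermediate levels.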

  10. Decentralised consensus for multiple Lagrangian systems based on event-triggered strategy

    NASA Astrophysics Data System (ADS)

    Liu, Xiangdong; Du, Changkun; Lu, Pingli; Yang, Dapeng

    2016-06-01

    This paper considers the decentralised event-triggered consensus problem for multi-agent systems with Lagrangian dynamics under undirected graphs. First, a distributed, leaderless, event-triggered consensus control algorithm is presented based on the definition of generalised positions and velocities for all agents. There is only one triggering function for both the generalised positions and velocities, and no Zeno behaviour is exhibited under the proposed consensus strategy. Second, an adaptive event-triggered consensus control algorithm is proposed for such multi-agent systems with unknown constant parameters. Third, based on the sliding-mode method, an event-triggered consensus control algorithm is considered for the case with external disturbance. Finally, simulation results are given to illustrate the theoretical results.
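
    The essence of an event-triggered strategy like the one in this record is that control updates happen only when a measurement error crosses a threshold, rather than at every sampling instant. Below is a minimal scalar sketch with invented gains and thresholds, not the paper's Lagrangian dynamics.

```python
# Plant: x' = u with a zero-order-hold control u = -x_last that is recomputed
# only when the error since the last trigger exceeds a state-dependent
# threshold sigma * |x|. All parameters are illustrative.

dt, sigma, steps = 0.01, 0.05, 2000
x = 1.0
x_last = x          # state sampled at the most recent trigger
u = -x_last         # control held constant between triggers
triggers = 0
for _ in range(steps):
    if abs(x - x_last) > sigma * abs(x):  # event-triggering condition
        x_last = x
        u = -x_last
        triggers += 1
    x += dt * u      # Euler step of x' = u
```

    The state still converges toward zero while the control is recomputed at only a few hundred of the 2000 integration steps, which is the communication saving event-triggered schemes aim for.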

  11. Monte Carlo generator ELRADGEN 2.0 for simulation of radiative events in elastic ep-scattering of polarized particles

    NASA Astrophysics Data System (ADS)

    Akushevich, I.; Filoti, O. F.; Ilyichev, A.; Shumeiko, N.

    2012-07-01

    The structure and algorithms of the Monte Carlo generator ELRADGEN 2.0 designed to simulate radiative events in polarized ep-scattering are presented. The full set of analytical expressions for the QED radiative corrections is presented and discussed in detail. Algorithmic improvements implemented to provide faster simulation of hard real photon events are described. Numerical tests show high quality of generation of photonic variables and radiatively corrected cross section. The comparison of the elastic radiative tail simulated within the kinematical conditions of the BLAST experiment at MIT BATES shows a good agreement with experimental data. Catalogue identifier: AELO_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AELO_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1299 No. of bytes in distributed program, including test data, etc.: 11 348 Distribution format: tar.gz Programming language: FORTRAN 77 Computer: All Operating system: Any RAM: 1 MB Classification: 11.2, 11.4 Nature of problem: Simulation of radiative events in polarized ep-scattering. Solution method: Monte Carlo simulation according to the distributions of the real photon kinematic variables that are calculated by the covariant method of QED radiative correction estimation. The approach provides rather fast and accurate generation. Running time: The simulation of 10^8 radiative events for itest:=1 takes up to 52 seconds on a Pentium(R) Dual-Core 2.00 GHz processor.

  12. Simulating Heinrich events in a coupled atmosphere-ocean-ice sheet model

    NASA Astrophysics Data System (ADS)

    Mikolajewicz, Uwe; Ziemen, Florian

    2016-04-01

    Heinrich events are among the most prominent events of long-term climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet - climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under discussion, and their climatic consequences are far from being fully understood. We contribute to answering the open questions by studying Heinrich events in a framework coupling an ice sheet model (ISM) to an atmosphere-ocean-vegetation general circulation model (AOVGCM), where this variability occurs as part of the model-generated internal variability without the need to prescribe external perturbations, as was the standard approach in almost all model studies so far. The setup consists of a northern hemisphere configuration of the modified Parallel Ice Sheet Model (mPISM) coupled to the global coarse resolution AOVGCM ECHAM5/MPIOM/LPJ. The simulations used for this analysis were an ensemble covering substantial parts of the late Glacial, forced with transient insolation and prescribed atmospheric greenhouse gas concentrations. The modeled Heinrich events show a marked influence of the ice discharge on the Atlantic circulation and heat transport, but none of the Heinrich events during the Glacial showed a complete collapse of the North Atlantic meridional overturning circulation. The simulated main consequences of the Heinrich events are a freshening and cooling over the North Atlantic and a drying over northern Europe.

  13. Discrete Event Simulation Models for CT Examination Queuing in West China Hospital

    PubMed Central

    Luo, Li; Tang, Shijun; Shi, Yingkang; Guo, Huili

    2016-01-01

    In CT examination, emergency patients (EPs) have the highest priority in the queuing system, so general patients (GPs) have to wait a long time, which lowers overall patient satisfaction. The aim of this study is to improve patients' satisfaction by designing new queuing strategies for CT examination. We divide the EPs into urgent type and emergency type and then design two queuing strategies: one in which the urgent patients (UPs) wedge into the GPs' queue at a fixed interval (fixed priority model), and one in which patients have dynamic priorities for queuing (dynamic priority model). Based on the data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies. We compare the performance of the different strategies on the basis of the simulation results. The strategy in which patients have dynamic priorities for queuing decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting time and increases their satisfaction. PMID:27547237
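
    The priority queuing this record describes can be illustrated with a minimal discrete event simulation: one scanner, with waiting emergency patients always served before general patients. The arrival times and scan duration below are invented for illustration, not drawn from the hospital's RID data.

```python
import heapq

def simulate(patients, scan_time=10.0):
    """One-scanner sketch. patients = [(arrival, priority)], priority 0 = EP, 1 = GP.
    Returns the mean wait (time until scan starts) per priority class."""
    arrivals = sorted(patients)            # by arrival time (ties: EPs first)
    queue, waits = [], {0: [], 1: []}
    t, i, n = 0.0, 0, len(arrivals)
    while i < n or queue:
        # Admit everyone who has arrived by the time the scanner frees up;
        # if nobody is waiting, jump ahead to the next arrival.
        while i < n and (not queue or arrivals[i][0] <= t):
            arr, pri = arrivals[i]
            heapq.heappush(queue, (pri, arr))
            i += 1
        pri, arr = heapq.heappop(queue)    # lowest priority number first
        start = max(t, arr)
        waits[pri].append(start - arr)
        t = start + scan_time
    return {p: sum(w) / len(w) for p, w in waits.items() if w}

# Hypothetical day: three GPs plus one EP who arrives while the scanner is busy.
mean_wait = simulate([(0, 1), (1, 1), (2, 0), (3, 1)])
```

    Even in this tiny example the EP jumps the GP queue, so GP waits grow; the record's fixed-interval and dynamic-priority strategies are alternative ways of ordering the same heap.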

  15. Accuracy analysis of distributed simulation systems

    NASA Astrophysics Data System (ADS)

    Lin, Qi; Guo, Jing

    2010-08-01

    Existing simulation work has emphasized procedural verification, focusing on the simulation models rather than on the simulation itself. As a result, research on improving simulation accuracy has been limited to individual aspects. Since accuracy is key to simulation credibility assessment and fidelity studies, it is important to give an all-round discussion of the accuracy of distributed simulation systems themselves. First, the major elements of distributed simulation systems are summarized, which serve as the basis for defining, classifying, and describing the accuracy of distributed simulation systems. In Part 2, the framework of accuracy of distributed simulation systems is presented in a comprehensive way, making it easier to analyze and assess the uncertainty of distributed simulation systems. In Part 3, the concept of accuracy of distributed simulation systems is decomposed into four further factors, each analyzed in turn. In Part 4, based on the formalized description of the accuracy-analysis framework, a practical approach is put forward that can be applied to study unexpected or inaccurate simulation results. Following this, a real distributed simulation system based on HLA is taken as an example to verify the usefulness of the proposed approach. The results show that the method works well and is applicable to accuracy analysis of distributed simulation systems.

  16. High-speed event detector for embedded nanopore bio-systems.

    PubMed

    Huang, Yiyun; Magierowski, Sebastian; Ghafar-Zadeh, Ebrahim; Wang, Chengjie

    2015-08-01

    Biological measurements of microscopic phenomena often deal with discrete-event signals. The ability to automatically carry out such measurements at high-speed in a miniature embedded system is desirable but compromised by high-frequency noise along with practical constraints on filter quality and sampler resolution. This paper presents a real-time event-detection method in the context of nanopore sensing that helps to mitigate these drawbacks and allows accurate signal processing in an embedded system. Simulations show at least a 10× improvement over existing on-line detection methods. PMID:26736722

  17. Evaluating the potential effectiveness of using computerized information systems to prevent adverse drug events.

    PubMed Central

    Anderson, J. G.; Jay, S. J.; Anderson, M.; Hunt, T. J.

    1997-01-01

    In this study a dynamic computer simulation model is used to estimate the effectiveness of various information systems applications designed to detect and prevent medication errors that result in adverse drug events (ADEs). The model simulates the four stages of the drug ordering and delivery system: prescribing, transcribing, dispensing and administering drugs. In this study we simulated interventions that have been demonstrated in prior studies to decrease error rates. The results demonstrated that a computerized information system that detected 26% of medication errors and prevented associated ADEs could save 1,226 days of excess hospitalization and $1.4 million in hospital costs annually. Those results suggest that such systems are potentially a cost-effective means of preventing ADEs in hospitals. The results demonstrated the importance of viewing adverse drug events from a systems perspective. Prevention efforts that focus on a single stage of the process had limited impact on the overall error rate. This study suggests that system-wide changes to the drug-ordering and delivery system are required to significantly reduce adverse drug events in a hospital setting. PMID:9357622

  18. Model for the evolution of the time profile in optimistic parallel discrete event simulations

    NASA Astrophysics Data System (ADS)

    Ziganurova, L.; Novotny, M. A.; Shchur, L. N.

    2016-02-01

    We investigate synchronisation aspects of an optimistic algorithm for parallel discrete event simulations (PDES). We present a model for the time evolution in optimistic PDES. This model evaluates the local virtual time profile of the processing elements. We argue that the evolution of the time profile is reminiscent of the surface profile in the directed percolation problem and in unrestricted surface growth. We present results of the simulation of the model and emphasise predictive features of our approach.
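
    The paper's optimistic time-profile model is not reproduced here, but the surface-growth analogy it draws can be illustrated with the simpler conservative-PDES toy model from the related literature: each processing element (PE) advances its local virtual time (LVT) only when it does not lead its neighbors, so no straggler message can arrive from the past. The lattice size, number of sweeps, and increment distribution are illustrative.

```python
import random

def step(lvt, rng):
    """One sweep of the conservative toy model on a ring of PEs: a PE advances
    its LVT by a random amount only if it is a local minimum (<= both neighbors)."""
    n = len(lvt)
    for i in range(n):
        left, right = lvt[(i - 1) % n], lvt[(i + 1) % n]
        if lvt[i] <= left and lvt[i] <= right:   # safe to process next event
            lvt[i] += rng.expovariate(1.0)

rng = random.Random(7)
lvt = [0.0] * 64          # 64 processing elements, all starting at LVT 0
for _ in range(500):
    step(lvt, rng)
width = max(lvt) - min(lvt)   # roughness of the virtual-time "surface"
```

    The spread (`width`) of the LVT profile is the quantity whose growth the record relates to surface-growth universality classes; in the optimistic setting of the paper, rollbacks replace the blocking rule used here.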

  19. State-dependent doubly weighted stochastic simulation algorithm for automatic characterization of stochastic biochemical rare events

    NASA Astrophysics Data System (ADS)

    Roh, Min K.; Daigle, Bernie J.; Gillespie, Dan T.; Petzold, Linda R.

    2011-12-01

    In recent years there has been substantial growth in the development of algorithms for characterizing rare events in stochastic biochemical systems. Two such algorithms, the state-dependent weighted stochastic simulation algorithm (swSSA) and the doubly weighted SSA (dwSSA) are extensions of the weighted SSA (wSSA) by H. Kuwahara and I. Mura [J. Chem. Phys. 129, 165101 (2008)], 10.1063/1.2987701. The swSSA substantially reduces estimator variance by implementing system state-dependent importance sampling (IS) parameters, but lacks an automatic parameter identification strategy. In contrast, the dwSSA provides for the automatic determination of state-independent IS parameters, thus it is inefficient for systems whose states vary widely in time. We present a novel modification of the dwSSA—the state-dependent doubly weighted SSA (sdwSSA)—that combines the strengths of the swSSA and the dwSSA without inheriting their weaknesses. The sdwSSA automatically computes state-dependent IS parameters via the multilevel cross-entropy method. We apply the method to three examples: a reversible isomerization process, a yeast polarization model, and a lac operon model. Our results demonstrate that the sdwSSA offers substantial improvements over previous methods in terms of both accuracy and efficiency.
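
    The sdwSSA itself is beyond a short sketch, but the unweighted stochastic simulation algorithm it builds on can be shown for the record's first example, a reversible isomerization A <-> B. The rates, molecule counts, and end time below are invented; the importance-sampling weights that define the wSSA family are deliberately omitted.

```python
import random

def ssa(a, b, k1, k2, t_end, rng):
    """Gillespie direct method for A <-> B; returns (a, b) at time t_end."""
    t = 0.0
    while True:
        r1, r2 = k1 * a, k2 * b          # propensities: A->B and B->A
        total = r1 + r2
        if total == 0.0:
            return a, b                  # no reaction can fire
        t += rng.expovariate(total)      # exponential waiting time
        if t > t_end:
            return a, b
        if rng.random() * total < r1:
            a, b = a - 1, b + 1          # A -> B fires
        else:
            a, b = a + 1, b - 1          # B -> A fires

rng = random.Random(0)
finals = [ssa(100, 0, 1.0, 1.0, 10.0, rng) for _ in range(50)]
mean_a = sum(a for a, _ in finals) / len(finals)
```

    For equal rates the stationary distribution of A is Binomial(100, 1/2), so the ensemble mean of A settles near 50; the weighted variants in the record bias these reaction choices and correct with likelihood ratios to probe rare states efficiently.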

  20. Event-triggered output feedback control for distributed networked systems.

    PubMed

    Mahmoud, Magdi S; Sabih, Muhammad; Elshafei, Moustafa

    2016-01-01

    This paper addresses the problem of output-feedback communication and control with event-triggered framework in the context of distributed networked control systems. The design problem of the event-triggered output-feedback control is proposed as a linear matrix inequality (LMI) feasibility problem. The scheme is developed for the distributed system where only partial states are available. In this scheme, a subsystem uses local observers and share its information to its neighbors only when the subsystem's local error exceeds a specified threshold. The developed method is illustrated by using a coupled cart example from the literature. PMID:26708304

  1. Simulation System Fidelity Assessment at the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Beard, Steven D.; Reardon, Scott E.; Tobias, Eric L.; Aponso, Bimal L.

    2013-01-01

    Fidelity is a word that is often used but rarely understood when talking about ground-based simulation. Assessing the cueing fidelity of a ground-based flight simulator requires a comparison to actual flight data, either directly or indirectly. Two experiments were conducted at the Vertical Motion Simulator using the GenHel UH-60A Black Hawk helicopter math model, which was directly compared to flight data. Prior to the experiments, the simulator's motion and visual system frequency responses were measured, the aircraft math model was adjusted to account for the simulator motion system delays, and the motion system gains and washouts were tuned for the individual tasks. The tuned motion system fidelity was then assessed against the modified Sinacori criteria. The first experiment showed handling qualities ratings (HQRs) similar to actual flight for the bob-up and sidestep maneuvers. The second experiment showed equivalent HQRs between flight and simulation for the ADS33 slalom maneuver for the two pilot participants. The ADS33 vertical maneuver HQRs were mixed: one pilot rated flight and simulation the same, while the second pilot rated the simulation worse. In addition to recording HQRs in the second experiment, an experimental Simulation Fidelity Rating (SFR) scale developed by the University of Liverpool was tested for applicability to engineering simulators. A discussion of the SFR scale for use on the Vertical Motion Simulator is included in this paper.

  2. Critical event management with geographic information system technology

    NASA Astrophysics Data System (ADS)

    Booth, John F.; Young, Jeffrey M.

    1997-02-01

    Critical event management at the Los Angeles County Regional Criminal Information Clearinghouse (LACRCIC) provides for the deconfliction of operations, such as reverse stings, arrests, undercover buys/busts, searches, surveillances, and site surveys in the Los Angeles, Orange, Riverside, and San Bernardino county area. During these operations, the opportunity for officer-to-officer confrontation is high, possibly causing a worst-case scenario -- officers drawing on each other, resulting in friendly-fire injuries or casualties. In order to prevent local, state, and federal agencies in the Los Angeles area from experiencing this scenario, the LACRCIC provides around-the-clock critical event management services via its secure war room. The war room maintains a multicounty detailed street-level map base and geographic information system (GIS) application to support this effort. Operations are telephoned in by the participating agencies and posted in the critical event management system by war room analysts. The application performs both a proximity search around the address and a commonality-of-suspects search. If a conflict is found, the system alerts the analyst by sounding an audible alarm and flashing the conflicting events on the automated basemap. The analyst then notifies the respective agencies of the conflicting critical events so coordination or rescheduling can occur.
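
    The war room's conflict check, a proximity search plus a commonality-of-suspects search, can be sketched in a few lines. The coordinates, radius, and suspect identifiers below are invented for illustration; a real GIS application would use geodesic distances and spatial indexing rather than a brute-force scan.

```python
import math

def conflicts(ops, radius):
    """ops: list of (x, y, suspects) tuples. Flag every pair of operations that
    is within `radius` of each other OR shares at least one suspect."""
    out = []
    for i in range(len(ops)):
        for j in range(i + 1, len(ops)):
            (x1, y1, s1), (x2, y2, s2) = ops[i], ops[j]
            near = math.hypot(x2 - x1, y2 - y1) <= radius   # proximity search
            common = bool(s1 & s2)                          # shared suspects
            if near or common:
                out.append((i, j))
    return out

# Hypothetical posted operations: two close together, a third far away that
# shares suspect "A" with the first.
ops = [(0.0, 0.0, {"A"}), (0.3, 0.4, {"B"}), (9.0, 9.0, {"A"})]
found = conflicts(ops, radius=1.0)
```

    Each flagged pair would trigger the audible alarm and map flash described above so the analyst can notify both agencies.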

  3. Adaptable, high recall, event extraction system with minimal configuration

    PubMed Central

    2015-01-01

    Background Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporate task-specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance. Results We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method were employed within the context of two different sub-tasks of the BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task-specific configuration and tuning, EventMine achieved 1st place in the PC task and 2nd place in the CG task, obtaining the highest recall in both. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements. Conclusions We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both the covariate shift and weighting methods are useful in facilitating the production of high-recall systems.

  4. Wheelchair type biomedical system with event-recorder function.

    PubMed

    Han, Dong-Kyoon; Kim, Jong-Myoung; Cha, Eun-Jong; Lee, Tae-Soo

    2008-01-01

    The present study is about a biometric system for a wheelchair, which can measure both bio-signal (ECG-Electrocardiogram, BCG-Ballistocardiogram) and kinetic signal (acceleration) simultaneously and send the data to a remote medical server. The equipment was developed with the object of building a system that measures the bio-signal and kinetic signal of a subject who is moving or at rest on a wheelchair and transmits the measured signals to a remote server through a CDMA (Code Division Multiple Access) network. The equipment is composed of body area network and remote medical server. The body area network was designed to obtain bio-signal and kinetic signal simultaneously and, on the occurrence of an event, to transmit data to a remote medical server through a CDMA network. The remote medical server was designed to display event data transmitted from the body area network in real time. The performance of the developed system was evaluated through two experiments. First, we measured battery life on the occurrence of events, and second, we tested whether biometric data are transmitted accurately to the remote server on the occurrence of an event. In the first experiment using the developed equipment, events were triggered 16 times and the battery worked stably for around 29 hours. In the second experiment, when an event took place, the corresponding data were transmitted accurately to the remote medical server through a CDMA network. This system is expected to be usable for the healthcare of those moving on a wheelchair and applicable to a mobile healthcare system. PMID:19162939

  5. Electrical aspects of photovoltaic-system simulation

    NASA Astrophysics Data System (ADS)

    Hart, G. W.; Raghuraman, P.

    1982-06-01

    A TRNSYS simulation was developed to simulate the performance of utility interactive residential photovoltaic energy systems. The PV system is divided into major functional components, which are individually described with computer models. The results of simulation and actual measured data are compared. The electrical influences on the design of such photovoltaic energy systems are given particular attention.

  6. Effects of a simulated agricultural runoff event on sediment toxicity in a managed backwater wetland

    Technology Transfer Automated Retrieval System (TEKTRAN)

    permethrin (both cis and trans isomers), on 10-day sediment toxicity to Hyalella azteca in a managed natural backwater wetland after a simulated agricultural runoff event. Sediment samples were collected at 10, 40, 100, 300, and 500 m from inflow 13 days prior to amendment and 1, 5, 12, 22, and 36 ...

  7. Pesticide trapping efficiency of a modified backwater wetland using a simulated runoff event

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study examined the trapping efficiency of a modified backwater wetland amended with a mixture of three pesticides, atrazine, metolachlor, and fipronil, using a simulated runoff event. The 700 m long, 25 m wide wetland, located along the Coldwater River in Tunica County, Mississippi, was modifie...

  8. Simulation and field monitoring of moisture in alpine rock walls during freeze-thaw events

    NASA Astrophysics Data System (ADS)

    Rode, Matthias; Sass, Oliver

    2013-04-01

    Detachment of rock fragments from alpine rockwalls is mainly assigned to frost weathering. However, the actual process of frost weathering as well as the contribution of further weathering processes (e.g. hydration, thermal fatigue) is poorly understood. Rock moisture distribution during freeze-thaw events is key to understanding weathering. For this purpose, different measuring systems were set up in two study areas (Dachstein - permafrost area (2700m a.s.l.) and Gesäuse - non permafrost area (900m a.s.l.), Styria, Austria) within the framework of the research project ROCKING ALPS (FWF-P24244). We installed small-scale 2D-geoelectric survey lines in north and in south facing rockwalls, supplemented by high resolution temperature and moisture sensors. Moisture is determined by means of resistivity measurements which are difficult to calibrate, but provide good time series. Additional novel moisture sensors were developed which use the heat capacity of the surrounding rock as a proxy of water content. These sensors give point readings from a defined depth and are independent from soluble salt contents. Pore water pressure occurring during freeze-thaw events is recorded by means of pressure transducers (piezometers). First results from the Dachstein show that short term latent heat effects during the phase change have crucial influence on the moisture content. These results are cross-checked by simulation calculations. Based on meteorologic and lithologic input values, the simulation routine calculates, in an iterative procedure, the hourly energy and water transport at different depths, the latter in the liquid and in the vapor phase. The calculated profile lines and chronological sequences of rock moisture allow - in combination with temperature data - to detect possible periods of active weathering. First simulations from the Gesäuse show that maximum values of pore saturation occur from May to September. 
The thresholds of the "classical" frost shattering theory ...

  9. CRRES observation and STEERB simulation of the 9 October 1990 electron radiation belt dropout event

    NASA Astrophysics Data System (ADS)

    Su, Zhenpeng; Xiao, Fuliang; Zheng, Huinan; Wang, Shui

    2011-03-01

    We examine and simulate the electron radiation belt dropout event of 9 October 1990. CRRES observations show that significant depletions of electron fluxes occurred at energies ~0.1-1.0 MeV beyond 6 RE and at energies > ~0.4 MeV within 6 RE. The three-dimensional kinetic radiation belt model STEERB is used to simulate this dropout event, taking into account magnetopause shadowing, adiabatic transport, radial diffusion, and plume and chorus wave-particle interactions. Our results show that the STEERB code can largely reproduce the observed depletion of ~0.1-1.0 MeV electron fluxes throughout the outer radiation belt, suggesting that the competition and combination of these physical mechanisms can well explain this electron radiation belt dropout event.

  10. Behavior coordination of mobile robotics using supervisory control of fuzzy discrete event systems.

    PubMed

    Jayasiri, Awantha; Mann, George K I; Gosine, Raymond G

    2011-10-01

    In order to incorporate the uncertainty and impreciseness present in real-world event-driven asynchronous systems, fuzzy discrete event systems (DESs) (FDESs) have been proposed as an extension to crisp DESs. In this paper, first, we propose an extension to the supervisory control theory of FDES by redefining fuzzy controllable and uncontrollable events. The proposed supervisor is capable of enabling feasible uncontrollable and controllable events with different possibilities. Then, the extended supervisory control framework of FDES is employed to model and control several navigational tasks of a mobile robot using the behavior-based approach. The robot has limited sensory capabilities, and the navigations have been performed in several unmodeled environments. The reactive and deliberative behaviors of the mobile robotic system are weighted through fuzzy uncontrollable and controllable events, respectively. By employing the proposed supervisory controller, a command-fusion-type behavior coordination is achieved. The observability of fuzzy events is incorporated to represent the sensory imprecision. As a systematic analysis of the system, a fuzzy-state-based controllability measure is introduced. The approach is implemented in both simulation and real time. A performance evaluation is performed to quantitatively estimate the validity of the proposed approach over its counterparts. PMID:21421445
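The core enabling rule of such a fuzzy supervisor can be conveyed with a minimal sketch: a controllable event is enabled to the degree allowed by both plant and supervisor, while an uncontrollable event cannot be disabled. This is not the authors' implementation; the event names, the min-based enabling rule, and the `supervise` function are illustrative assumptions in the spirit of fuzzy supervisory control.

```python
def supervise(plant_possibility, supervisor_policy, uncontrollable):
    """Fuzzy supervisory control sketch: a controllable event is enabled
    to degree min(plant possibility, supervisor policy); an uncontrollable
    event keeps its plant possibility, since it cannot be disabled."""
    enabled = {}
    for event, degree in plant_possibility.items():
        if event in uncontrollable:
            enabled[event] = degree
        else:
            enabled[event] = min(degree, supervisor_policy.get(event, 0.0))
    return enabled

# hypothetical robot behaviours modelled as fuzzy events
plant = {"avoid_obstacle": 0.8, "goal_seek": 0.6}
policy = {"goal_seek": 0.4}
out = supervise(plant, policy, uncontrollable={"avoid_obstacle"})
```

Here the reactive behaviour ("avoid_obstacle") passes through at its plant possibility, while the deliberative behaviour ("goal_seek") is attenuated by the supervisor, mirroring the weighting of reactive and deliberative behaviours described above.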

  11. The waveform correlation event detection system global prototype software design

    SciTech Connect

    Beiriger, J.I.; Moore, S.G.; Trujillo, J.R.; Young, C.J.

    1997-12-01

    The WCEDS prototype software system was developed to investigate the usefulness of waveform correlation methods for CTBT monitoring. The WCEDS prototype performs global seismic event detection and has been used in numerous experiments. This report documents the software system design, presenting an overview of the system operation, describing the system functions, tracing the information flow through the system, discussing the software structures, and describing the subsystem services and interactions. The effectiveness of the software design in meeting project objectives is considered, as well as opportunities for code reuse and lessons learned from the development process. The report concludes with recommendations for modifications and additions envisioned for a regional waveform-correlation-based detector.
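The kernel of any waveform-correlation detector is a normalized cross-correlation of a master template slid along a continuous stream, with a detection declared where the coefficient exceeds a threshold. The following is a generic sketch of that kernel, not the WCEDS code; the function name and the toy signals are assumptions.

```python
import math

def normalized_xcorr(stream, template):
    """Slide a waveform template along a stream; return the normalized
    cross-correlation coefficient at each lag (values in [-1, 1])."""
    n = len(template)
    t_mean = sum(template) / n
    t_dev = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(d * d for d in t_dev))
    coeffs = []
    for lag in range(len(stream) - n + 1):
        window = stream[lag:lag + n]
        w_mean = sum(window) / n
        w_dev = [x - w_mean for x in window]
        w_norm = math.sqrt(sum(d * d for d in w_dev))
        dot = sum(a * b for a, b in zip(t_dev, w_dev))
        coeffs.append(dot / (t_norm * w_norm) if t_norm and w_norm else 0.0)
    return coeffs

template = [0.0, 1.0, -1.0, 0.0]
stream = [0.0, 0.0, 0.0, 1.0, -1.0, 0.0, 0.0]
c = normalized_xcorr(stream, template)
peak = max(range(len(c)), key=lambda i: c[i])   # lag of best match
```

A production detector would correlate against many templates over a grid of candidate source locations, but the per-lag computation is the same.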

  12. Explicit spatial scattering for load balancing in conservatively synchronized parallel discrete-event simulations

    SciTech Connect

    Thulasidasan, Sunil; Kasiviswanathan, Shiva; Eidenbenz, Stephan; Romero, Philip

    2010-01-01

    We re-examine the problem of load balancing in conservatively synchronized parallel, discrete-event simulations executed on high-performance computing clusters, focusing on simulations where computational and messaging load tend to be spatially clustered. Such domains are frequently characterized by the presence of geographic 'hot-spots' - regions that generate significantly more simulation events than others. Examples of such domains include simulation of urban regions, transportation networks and networks where interaction between entities is often constrained by physical proximity. Noting that in conservatively synchronized parallel simulations, the speed of execution of the simulation is determined by the slowest (i.e most heavily loaded) simulation process, we study different partitioning strategies in achieving equitable processor-load distribution in domains with spatially clustered load. In particular, we study the effectiveness of partitioning via spatial scattering to achieve optimal load balance. In this partitioning technique, nearby entities are explicitly assigned to different processors, thereby scattering the load across the cluster. This is motivated by two observations, namely, (i) since load is spatially clustered, spatial scattering should, intuitively, spread the load across the compute cluster, and (ii) in parallel simulations, equitable distribution of CPU load is a greater determinant of execution speed than message passing overhead. Through large-scale simulation experiments - both of abstracted and real simulation models - we observe that scatter partitioning, even with its greatly increased messaging overhead, significantly outperforms more conventional spatial partitioning techniques that seek to reduce messaging overhead. 
Further, even if hot-spots change over the course of the simulation, if the underlying feature of spatial clustering is retained, load continues to be balanced with spatial scattering, leading us to the observation that ...
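Scatter partitioning itself is simple to sketch: order the entities spatially, then deal them out round-robin so that co-located (hot-spot) entities land on different processors. A minimal illustration (the entity layout and field names are hypothetical):

```python
def scatter_partition(entities, n_procs):
    """Assign spatially sorted entities to processors round-robin, so
    neighbouring (hot-spot) entities are scattered across the cluster."""
    ordered = sorted(entities, key=lambda e: (e["x"], e["y"]))
    return {e["id"]: i % n_procs for i, e in enumerate(ordered)}

# a 4x2 grid of entities; ids 0 and 4 are spatial neighbours
ents = [{"id": k, "x": k % 4, "y": k // 4} for k in range(8)]
assignment = scatter_partition(ents, n_procs=4)
```

Each processor receives the same number of entities, and adjacent entities (such as ids 0 and 4 in the same column) end up on different processors, which is exactly the property that spreads a geographic hot-spot's event load across the cluster.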

  13. Modelling the dependence and internal structure of storm events for continuous rainfall simulation

    NASA Astrophysics Data System (ADS)

    Gyasi-Agyei, Yeboah; Melching, Charles S.

    2012-09-01

    Pair-copula construction methodology has been explored to model the dependence structure between net storm event depth (R), maximum wet periods' depth (M), and the total wet periods' duration (L), noting that the total storm event depth is RT = R + M. Random variable R was used instead of RT in order to avoid physical boundary effects due to the condition of RT ⩾ M. The flexibility of pair-copula construction allowed the examination of 11 bivariate copulas at the three bivariate stages of the three-dimensional (3D) copula. For the 21 years of hourly rainfall data examined from Cook County, Illinois, USA, three different copulas were found suitable for the bivariate stages. For the internal storm event structure, a Geometric distribution was used to model the net event duration, defined as the difference between the total duration (D) and L. A two-parameter Poisson model was adopted for modelling the distribution of the L wet periods within D, and the first-order autoregressive Lognormal model was applied for the distribution of RT over the L wet periods. Incorporation of an inter-event (I) sub-model completed the continuous rainfall simulation scheme. The strong seasonality in the marginal and dependence model parameters was captured using first harmonic Fourier series, thus reducing the number of parameters. Polynomial functions were fitted to the internal storm event model parameters that did not exhibit seasonal variability. Four hundred simulation runs were carried out in order to verify the developed model. Kolmogorov-Smirnov (KS) tests found that the hypothesis that the observed and simulated storm event quantiles come from the same distribution could not be rejected at the 5% significance level in nearly all cases. 
Gross statistics (dry probability, mean, variance, skewness, autocorrelations, and the intensity-duration-frequency (IDF) curves) of the continuous rainfall time series at several aggregation levels were very well preserved by the developed model.
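The last internal-structure step, distributing the event total over the wet periods with a first-order autoregressive Lognormal model, can be sketched as follows. The parameter values, the seed, and the final rescaling to match the event total are illustrative assumptions, not the paper's calibrated model.

```python
import math
import random

def disaggregate_depth(total_depth, n_wet, rho=0.6, sigma=0.5, seed=0):
    """Internal-structure sketch: draw a first-order autoregressive
    (AR(1)) sequence in log space over the wet periods, exponentiate
    to get Lognormal depths, then rescale so they sum to the event total."""
    rng = random.Random(seed)
    z = rng.gauss(0.0, sigma)
    logs = [z]
    for _ in range(n_wet - 1):
        z = rho * z + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, sigma)
        logs.append(z)
    raw = [math.exp(v) for v in logs]
    scale = total_depth / sum(raw)
    return [scale * r for r in raw]

# distribute a 24 mm event total over 6 wet hours
depths = disaggregate_depth(total_depth=24.0, n_wet=6)
```

The AR(1) structure in log space gives positively correlated, strictly positive hourly depths, which is the qualitative behaviour the abstract describes for RT over the L wet periods.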

  14. Topics in gravitation - numerical simulations of event horizons and parameter estimation for LISA

    NASA Astrophysics Data System (ADS)

    Cohen, Michael Isaac

    2011-08-01

    In Part I, we consider numerical simulations of event horizons. Event horizons are the defining physical features of black hole spacetimes, and are of considerable interest in studying black hole dynamics. Here, we reconsider three techniques to find event horizons in numerical spacetimes, and find that straightforward integration of geodesics backward in time is most robust. We apply this method to various systems, from a highly spinning Kerr hole through to an asymmetric binary black hole inspiral. We find that the exponential rate at which outgoing null geodesics diverge from the event horizon of a Kerr black hole is the surface gravity of the hole. In head-on mergers we are able to track quasi-normal ringing of the merged black hole through seven oscillations, covering a dynamic range of about 10^5. In the head-on "kick" merger, we find that computing the Landau-Lifshitz velocity of the event horizon is very useful for an improved understanding of the kick behaviour. Finally, in the inspiral simulations, we find that the topological structure of the black holes does not produce an intermediate toroidal phase, though the structure is consistent with a potential re-slicing of the spacetime in order to introduce such a phase. We further discuss the topological structure of non-axisymmetric collisions. In Part II, we consider parameter estimation of cosmic string burst gravitational waves in Mock LISA data. A network of observable, macroscopic cosmic (super-)strings may well have formed in the early Universe. If so, the cusps that generically develop on cosmic-string loops emit bursts of gravitational radiation that could be detectable by gravitational-wave interferometers, such as the ground-based LIGO/Virgo detectors and the planned, space-based LISA detector. We develop two versions of a LISA-oriented string-burst search pipeline within the context of the Mock LISA Data Challenges, which rely on the publicly available MultiNest and PyMC software packages

  15. Dust events in Arizona: Long-term satellite and surface observations, and the National Air Quality Forecasting Capability CMAQ simulations

    NASA Astrophysics Data System (ADS)

    Huang, M.; Tong, D.; Lee, P.; Pan, L.; Tang, Y.; Stajner, I.; Pierce, R. B.; McQueen, J.

    2015-12-01

    Dust records in Arizona during 2005-2013 are developed using multiple observation datasets, including the Level 2 Deep Blue aerosol product from the Moderate Resolution Imaging Spectroradiometer (MODIS) and the in-situ measurements at the surface Air Quality System (AQS) and Interagency Monitoring of Protected Visual Environments (IMPROVE) sites in Phoenix. The satellite and surface aerosol observations were anti-correlated with three drought indicators (i.e., MODIS vegetation index, a European satellite soil moisture dataset, and the Palmer Drought Severity Index). During the dusty year of 2007, we show that the dust events were stronger and more frequent in the afternoon hours than in the morning due to faster winds and drier soil, and that the Sonoran and Chihuahuan deserts were important dust source regions during identified dust events in Phoenix, as indicated by NOAA's Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) Model calculations. Based on these findings, we suggest the potential of satellite soil moisture and vegetation index products for interpreting and predicting dust activity. We also emphasize the importance of using hourly observations for better capturing dust events, and expect future hourly geostationary satellite observations to complement the current surface PM and meteorological observations given their broader spatial coverage. Additionally, the performance of the National Air Quality Forecasting Capability (NAQFC) 12 km CMAQ model simulation is evaluated during a recent strong dust event in the western US accompanied by stratospheric ozone intrusion. The current modeling system captured well the temporal variability and the magnitude of aerosol concentrations during this event. Directions for integrating satellite weather and vegetation observations ...

  16. A computer aided treatment event recognition system in radiation therapy

    SciTech Connect

    Xia, Junyi; Mart, Christopher; Bayouth, John

    2014-01-15

    Purpose: To develop an automated system to safeguard radiation therapy treatments by analyzing electronic treatment records and reporting treatment events. Methods: CATERS (Computer Aided Treatment Event Recognition System) was developed to detect treatment events by retrieving and analyzing electronic treatment records. CATERS is designed to make the treatment monitoring process more efficient by automating the search of the electronic record for possible deviations from the physician's intention, such as logical inconsistencies as well as aberrant treatment parameters (e.g., beam energy, dose, table position, prescription change, treatment overrides, etc.). Over a 5 month period (July 2012–November 2012), physicists were assisted by the CATERS software in conducting normal weekly chart checks with the aims of (a) determining the relative frequency of particular events in the authors' clinic and (b) incorporating these checks into CATERS. During this study period, 491 patients were treated at the University of Iowa Hospitals and Clinics for a total of 7692 fractions. Results: All treatment records from the 5 month analysis period were evaluated using all the checks incorporated into CATERS after the training period. In total, 553 events were detected as exceptions, although none of them had a significant dosimetric impact on patient treatments. These events included every known event type that was discovered during the trial period. A frequency analysis of the events showed that the top three types of detected events were couch position override (3.2%), extra cone beam imaging (1.85%), and significant couch position deviation (1.31%). A significant couch deviation is defined as a treatment where the couch vertical exceeded two times the standard deviation of all couch verticals, or the couch lateral/longitudinal exceeded three times the standard deviation of all couch laterals and longitudinals. On average, the application takes about 1 s per patient when ...
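The couch-deviation rule described above reduces to a simple statistical outlier check. A minimal sketch (the data and the function name are hypothetical; the rule uses k = 2 for the vertical and k = 3 for lateral/longitudinal):

```python
import statistics

def flag_couch_deviations(couch_vertical, k=2.0):
    """Flag fractions whose couch vertical deviates from the course mean
    by more than k standard deviations of all recorded verticals."""
    mean = statistics.fmean(couch_vertical)
    sd = statistics.pstdev(couch_vertical)
    if sd == 0:
        return []
    return [i for i, v in enumerate(couch_vertical)
            if abs(v - mean) > k * sd]

# hypothetical couch vertical readings (cm) for one course; the last
# fraction is the outlier a weekly chart check should surface
verticals = [12.0, 12.1, 11.9, 12.0, 12.2, 15.0]
flags = flag_couch_deviations(verticals)
```

Running the same check with k = 3 over lateral and longitudinal readings covers the second half of the rule.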

  17. Exercise-Associated Collapse in Endurance Events: A Classification System.

    ERIC Educational Resources Information Center

    Roberts, William O.

    1989-01-01

    Describes a classification system devised for exercise-associated collapse in endurance events based on casualties observed at six Twin Cities Marathons. Major diagnostic criteria are body temperature and mental status. Management protocol includes fluid and fuel replacement, temperature correction, and leg cramp treatment. (Author/SM)

  18. Designing and Securing an Event Processing System for Smart Spaces

    ERIC Educational Resources Information Center

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  19. A patient classification system for emergency events in home care.

    PubMed

    Sienkiewicz, Josephine; Wilkinson, Ginny; Cubbage, Betsy

    2007-06-01

    The purpose of this article is to describe the development of a uniform classification system that provides a way for home care agencies to classify patient priority needs for evacuation, transport, supportive care, and use of staffing resources in an emergency/disaster situation/bioterroristic event. PMID:17556919

  20. Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu

    2013-04-01

    Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.
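The GLUE framework referenced here follows a standard Monte Carlo recipe: sample parameter sets, score each against the observations with an informal likelihood, keep only the "behavioural" sets above a threshold, and use the normalised weights for uncertainty bounds. A toy sketch (the linear stage-discharge "model", the 1/SSE likelihood, and the threshold are illustrative assumptions, not the LISFLOOD-FP setup):

```python
def glue_weights(param_sets, simulate, observed, threshold):
    """GLUE sketch: score each parameter set with an informal likelihood
    (here 1/SSE), discard non-behavioural sets below the threshold, and
    normalise the weights of the surviving sets."""
    scored = []
    for p in param_sets:
        sim = simulate(p)
        sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
        likelihood = 1.0 / sse if sse > 0 else float("inf")
        if likelihood >= threshold:
            scored.append((p, likelihood))
    total = sum(w for _, w in scored)
    return [(p, w / total) for p, w in scored]

# hypothetical toy "model": water level = a * inflow
observed = [2.0, 4.0, 6.0]
inflow = [1.0, 2.0, 3.0]
candidates = [{"a": a} for a in (1.5, 1.9, 2.1, 2.5)]
behavioural = glue_weights(candidates,
                           lambda p: [p["a"] * q for q in inflow],
                           observed, threshold=1.0)
```

Weighted quantiles of the behavioural simulations then give the prediction bounds on the simulated water levels.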

  1. Soil Organic Carbon Loss and Selective Transportation under Field Simulated Rainfall Events

    PubMed Central

    Nie, Xiaodong; Li, Zhongwu; Huang, Jinquan; Huang, Bin; Zhang, Yan; Ma, Wenming; Hu, Yanbiao; Zeng, Guangming

    2014-01-01

    Study of the lateral movement of soil organic carbon (SOC) during soil erosion can improve the understanding of the global carbon budget. Simulated rainfall experiments on small field plots were conducted to investigate SOC lateral movement under different rainfall intensities and tillage practices. Two rainfall intensities (high intensity (HI) and low intensity (LI)) and two tillage practices (no tillage (NT) and conventional tillage (CT)) were maintained on three plots (2 m width × 5 m length): HI-NT, LI-NT and LI-CT. The rainfall lasted 60 minutes after runoff was generated, and the sediment yield and runoff volume were measured and sampled at 6-min intervals. The SOC concentrations of sediment and runoff as well as the sediment particle size distribution were measured. The results showed that most of the eroded organic carbon (OC) was lost in the form of sediment-bound organic carbon in all events. The amount of lost SOC in the LI-NT event was 12.76 times greater than that in the LI-CT event, whereas this measure in the HI-NT event was 3.25 times greater than that in the LI-NT event. These results suggest that conventional tillage as well as lower rainfall intensity can reduce the amount of SOC lost during short-term soil erosion. Meanwhile, the eroded sediment in all events was enriched in OC, and a higher enrichment ratio of OC (ERoc) in sediment was observed in the LI events than in the HI event, whereas similar ERoc curves were found in the LI-CT and LI-NT events. Furthermore, significant correlations between ERoc and different-size sediment particles were only observed in the HI-NT event. This indicates that the enrichment of OC depends on the erosion process, and the specific enrichment mechanisms for different erosion processes should be studied in future. PMID:25166015

  2. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
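The chance-constraint idea can be illustrated with a toy terminating simulation: choose the cheapest design whose estimated probability of violating a performance limit stays at or below a tolerance alpha, estimating that probability from independent replications. Everything below (the Gaussian demand model, the queue limit, the cost) is a made-up stand-in for the launch-vehicle model, not the paper's formulation:

```python
import random
import statistics

def evaluate_design(n_servers, n_reps=200, seed=1):
    """Run replications of a toy terminating simulation; return the mean
    cost and the empirical estimate of P(max queue > limit)."""
    rng = random.Random(seed)
    violations = 0
    costs = []
    for _ in range(n_reps):
        demand = rng.gauss(10.0, 2.0)        # one terminating replication
        queue = max(0.0, demand - n_servers)
        violations += queue > 2.0            # chance-constraint event
        costs.append(n_servers + queue)
    return statistics.fmean(costs), violations / n_reps

def minimize_servers(alpha=0.05):
    """Smallest resource level whose estimated violation probability
    satisfies the chance constraint P(violation) <= alpha."""
    for k in range(1, 30):
        cost, p_viol = evaluate_design(k)
        if p_viol <= alpha:
            return k, cost, p_viol
    raise RuntimeError("no feasible design in the search range")

k, cost, p = minimize_servers()
```

In a real study the replication count would be chosen from a confidence-interval analysis of the estimated violation probability, as the framework's statistical estimation step prescribes.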

  3. Generalized Fluid System Simulation Program

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok Kumar (Inventor); Bailey, John W. (Inventor); Schallhorn, Paul Alan (Inventor); Steadman, Todd E. (Inventor)

    2004-01-01

    A general purpose program implemented on a computer analyzes steady state and transient flow in a complex fluid network, modeling phase changes, compressibility, mixture thermodynamics and external body forces such as gravity and centrifugal force. A preprocessor provides for the interactive development of a fluid network simulation having nodes and branches. Mass, energy, and species conservation equations are solved at the nodes, and momentum conservation equations are solved in the branches. Contained herein are subroutines for computing "real fluid" thermodynamic and thermophysical properties for 12 fluids, and a number of different source options are provided for modeling momentum sources or sinks in the branches. The system of equations describing the fluid network is solved by a hybrid numerical method that is a combination of the Newton-Raphson and successive substitution methods. Application and verification of this invention are provided through an example problem, which demonstrates that the predictions of the present invention compare most reasonably with test data.
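The Newton-Raphson part of such a network solver can be sketched on a toy two-branch network: one internal node pressure p and one branch flow q, with a quadratic momentum equation per branch. The resistances, boundary pressures, finite-difference Jacobian, and the 2x2 Cramer solve are illustrative assumptions, not GFSSP's actual formulation:

```python
def newton_solve(f, x0, tol=1e-9, max_iter=60, h=1e-7):
    """Newton-Raphson for a 2-equation system f(x) = 0, using a
    finite-difference Jacobian and Cramer's rule for the 2x2 solve."""
    x = list(x0)
    for _ in range(max_iter):
        fx = f(x)
        if max(abs(v) for v in fx) < tol:
            return x
        J = [[0.0, 0.0], [0.0, 0.0]]
        for i in range(2):               # column i: perturb x[i]
            xp = list(x)
            xp[i] += h
            fp = f(xp)
            for j in range(2):
                J[j][i] = (fp[j] - fx[j]) / h
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        dx0 = (-fx[0] * J[1][1] + fx[1] * J[0][1]) / det
        dx1 = (-fx[1] * J[0][0] + fx[0] * J[1][0]) / det
        x = [x[0] + dx0, x[1] + dx1]
    raise RuntimeError("Newton iteration did not converge")

# Toy network: source at pressure 100, sink at 0, one internal node.
# Branch 1 (resistance 1): 100 - p = q**2; branch 2 (resistance 3): p = 3 * q**2
def residuals(x):
    p, q = x
    return [100.0 - p - q * q, p - 3.0 * q * q]

p, q = newton_solve(residuals, [50.0, 3.0])   # converges to p = 75, q = 5
```

The hybrid scheme in the invention alternates such Newton-Raphson updates with successive substitution for the equations that respond better to fixed-point iteration.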

  4. Distributed event-triggered consensus tracking of second-order multi-agent systems with a virtual leader

    NASA Astrophysics Data System (ADS)

    Jie, Cao; Zhi-Hai, Wu; Li, Peng

    2016-05-01

    This paper investigates the consensus tracking problem for second-order multi-agent systems with a virtual leader via event-triggered control. A novel distributed event-triggered transmission scheme is proposed, which is intermittently examined at constant sampling instants. Only partial neighbor information and local measurements are required for event detection. The corresponding event-triggered consensus tracking protocol is then presented to guarantee that second-order multi-agent systems achieve consensus tracking. Numerical simulations are given to illustrate the effectiveness of the proposed strategy. Project supported by the National Natural Science Foundation of China (Grant Nos. 61203147, 61374047, and 61403168).
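The flavour of such an event-triggered scheme can be conveyed with a deliberately simplified sketch: each agent checks its trigger condition only at the sampling instants and rebroadcasts its state only when the gap to its last broadcast exceeds a threshold. The graph, gains, static leader, and first-order dynamics below are illustrative assumptions (the paper treats second-order agents with a moving virtual leader):

```python
def consensus_tracking(adj, x0, leader, steps=400, dt=0.01, threshold=0.05):
    """Event-triggered tracking sketch: control uses each agent's own
    state plus neighbours' last *broadcast* states; a broadcast (event)
    happens only when the local error exceeds the threshold."""
    x = list(x0)
    xb = list(x0)                  # last broadcast states
    events = [0] * len(x)
    for _ in range(steps):
        for i in range(len(x)):    # event detection at the sampling instant
            if abs(x[i] - xb[i]) > threshold:
                xb[i] = x[i]
                events[i] += 1
        u = [sum(xb[j] - x[i] for j in nbrs) + (leader - x[i])
             for i, nbrs in enumerate(adj)]
        x = [xi + dt * ui for xi, ui in zip(x, u)]
    return x, events

# path graph 0-1-2; every agent also pinned to the virtual leader at 2.0
x, events = consensus_tracking(adj=[[1], [0, 2], [1]],
                               x0=[0.0, 1.0, 3.0], leader=2.0)
```

The point of the scheme is visible in the event counts: the agents converge near the leader while broadcasting far fewer than once per sampling instant.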

  5. Parallel and Distributed System Simulation

    NASA Technical Reports Server (NTRS)

    Dongarra, Jack

    1998-01-01

    This exploratory study initiated our research into the software infrastructure necessary to support the modeling and simulation techniques that are most appropriate for the Information Power Grid. Such computational power grids will use high-performance networking to connect hardware, software, instruments, databases, and people into a seamless web that supports a new generation of computation-rich problem solving environments for scientists and engineers. In this context we looked at evaluating the NetSolve software environment for network computing that leverages the potential of such systems while addressing their complexities. NetSolve's main purpose is to enable the creation of complex applications that harness the immense power of the grid, yet are simple to use and easy to deploy. NetSolve uses a modular, client-agent-server architecture to create a system that is very easy to use. Moreover, it is designed to be highly composable in that it readily permits new resources to be added by anyone willing to do so. In these respects NetSolve is to the Grid what the World Wide Web is to the Internet. But like the Web, the design that makes these wonderful features possible can also impose significant limitations on the performance and robustness of a NetSolve system. This project explored the design innovations that push the performance and robustness of the NetSolve paradigm as far as possible without sacrificing the Web-like ease of use and composability that make it so powerful.

  6. Abstracting event-based control models for high autonomy systems

    NASA Technical Reports Server (NTRS)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  7. Interannual and Intraseasonal oscillations and extreme events over South America simulated by HIGEM models.

    NASA Astrophysics Data System (ADS)

    Custodio, Maria; Ambrizzi, Tercio

    2014-05-01

    The climatic system has its fluctuations determined mainly by the complex fluxes from the ocean and atmosphere. The fluxes transport energy, momentum and tracers within and between system components; they occur over a wide range of spatial and temporal scales. Because of this, according to Shaffrey et al. (2009), the development of high-resolution global models is indispensable to simulate the energy transfer to smaller scales and to capture the nonlinear interactions between a wide range of spatial and temporal scales, and between the different components of the climatic system. There are strong reasons to increase the resolution of all the atmospheric and oceanic components of coupled and uncoupled climate models. The South America (SA) climate is characterized by different precipitation regimes, and its variability is strongly influenced by large-scale phenomena on interannual (El Niño-Southern Oscillation, ENSO) and intraseasonal (Madden-Julian Oscillation, MJO) timescales. Normally, coupled and uncoupled GCMs use low horizontal resolution and have difficulty representing these low-frequency variability phenomena. The goal of this work is to evaluate the performance of coupled and uncoupled versions of the High-Resolution Global Environmental Model, denominated NUGEM (~60 km), HiGEM (~90 km) and HadGEM (~135 km), and NUGAM (~60 km), HiGAM (~90 km) and HadGAM (~135 km), respectively, in capturing the signal of interannual and intraseasonal variability of precipitation and temperature over SA. Basically, we want to discuss the impact of sea surface temperature on the annual cycle of atmospheric variables. The simulations were compared with precipitation data from the Climate Prediction Center - Merged Analysis of Precipitation (CMAP) and with temperature data from ERA-Interim, both for the period 1979 to 2008. The precipitation and temperature time series were filtered on the interannual (period > 365 days) and intraseasonal (30 ...

  8. Recurrence time statistics of landslide events simulated by a cellular automaton model

    NASA Astrophysics Data System (ADS)

    Piegari, Ester; Di Maio, Rosa; Avella, Adolfo

    2014-05-01

    The recurrence time statistics of a cellular automaton modelling landslide events are analyzed by performing a numerical analysis in the parameter space and estimating Fano factor behaviors. The model is an extended version of the OFC model, which is a paradigm for SOC in non-conserved systems, but it works differently from the original OFC model in that a finite value of the driving rate is applied. By driving the system to instability with different rates, the model exhibits a smooth transition from a correlated to an uncorrelated regime as the effect of a change in the predominant mechanisms that propagate instability. If the rate at which instability is approached is small, chain processes dominate the landslide dynamics, and power laws govern the probability distributions. However, the power-law regime typical of SOC-like systems is found in a range of return intervals that becomes shorter and shorter as the driving rate increases. Indeed, if the rates at which instability is approached are large, domino processes are no longer active in propagating instability, and large events simply occur because a large number of cells simultaneously reach instability. Such a gradual loss of the effectiveness of the chain propagation mechanism causes the system to gradually enter an uncorrelated regime where recurrence time distributions are characterized by Weibull behaviors. Simulation results are qualitatively compared with those from a recent analysis performed by Witt et al. (Earth Surf. Process. Landforms, 35, 1138, 2010) for the first complete databases of landslide occurrences over a period as large as fifty years. From the comparison with the extensive landslide data set, the numerical analysis suggests that the statistics of such landslide data seem to be described by a crossover region between a correlated regime and an uncorrelated regime, where recurrence time distributions are characterized by power-law and Weibull behaviors for short and long return times ...
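The non-conserved OFC-type automaton described above can be sketched in a few lines: cells accumulate load under a slow uniform drive, and an unstable cell resets and passes only a fraction alpha of its load to each neighbour (alpha < 0.25 makes the dynamics non-conserving). The grid size, alpha, and the drive loop are illustrative choices, not the authors' extended model:

```python
import random

def relax(grid, n, threshold=1.0, alpha=0.2):
    """Topple every unstable cell: reset it to zero and give alpha * load
    to each of its (up to 4) neighbours; return the avalanche size."""
    size = 0
    unstable = [(i, j) for i in range(n) for j in range(n)
                if grid[i][j] >= threshold]
    while unstable:
        hit = set()
        for i, j in unstable:
            load, grid[i][j] = grid[i][j], 0.0
            size += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < n and 0 <= b < n:
                    grid[a][b] += alpha * load
                    hit.add((a, b))
        unstable = [(i, j) for (i, j) in hit if grid[i][j] >= threshold]
    return size

def drive(n=8, n_events=2000, seed=3):
    """Slow uniform drive: raise every cell until the most loaded one
    reaches the threshold, then record the resulting avalanche size."""
    rng = random.Random(seed)
    grid = [[rng.uniform(0.0, 0.5) for _ in range(n)] for _ in range(n)]
    sizes = []
    for _ in range(n_events):
        i_max, j_max = max(((i, j) for i in range(n) for j in range(n)),
                           key=lambda ij: grid[ij[0]][ij[1]])
        gap = 1.0 - grid[i_max][j_max]
        for row in grid:
            for j in range(n):
                row[j] += gap
        grid[i_max][j_max] = 1.0     # avoid floating-point shortfall
        sizes.append(relax(grid, n))
    return sizes

sizes = drive()
```

Recurrence-time statistics of "large" events are then obtained by thresholding `sizes` and differencing the indices of the exceedances; raising the drive rate (adding a fixed increment per step instead of driving to threshold) reproduces the crossover away from chain-dominated avalanches discussed above.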

  9. DDS: The Dental Diagnostic Simulation System.

    ERIC Educational Resources Information Center

    Tira, Daniel E.

    The Dental Diagnostic Simulation (DDS) System provides an alternative to simulation systems which represent diagnostic case studies of relatively limited scope. It may be used to generate simulated case studies in all of the dental specialty areas with case materials progressing through the gamut of the diagnostic process. The generation of a…

  10. Space shuttle visual simulation system design study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A recommendation and a specification for the visual simulation system design for the space shuttle mission simulator are presented. A recommended visual system is described which most nearly meets the visual design requirements. The cost analysis of the recommended system covering design, development, manufacturing, and installation is reported. Four alternate systems are analyzed.

  11. Simulation of Anomalous Regional Climate Events with a Variable Resolution Stretched Grid GCM

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.

    1999-01-01

    The stretched-grid approach provides efficient down-scaling and consistent interactions between global and regional scales because a single variable-resolution model is used for the integrations. It is a workable alternative to the widely used nested-grid approach, introduced over a decade ago as a pioneering step in regional climate modeling. A variable-resolution General Circulation Model (GCM) employing a stretched grid, with enhanced resolution over the US as the area of interest, is used for simulating two anomalous regional climate events, the US summer drought of 1988 and flood of 1993. A special mode of integration using the stretched-grid GCM and a data assimilation system is developed that imitates the nested-grid framework. The mode is useful for inter-comparison purposes and for underlining the differences between the two approaches. The 1988 and 1993 integrations are performed for the two-month period starting from mid May. The regional resolution used in most of the experiments is 60 km. The major goal, and the result, of the study is efficient down-scaling over the area of interest. The monthly mean prognostic regional fields for the stretched-grid integrations are remarkably close to those of the verifying analyses. Simulated precipitation patterns are successfully verified against gauge precipitation observations. The impact of a finer 40 km regional resolution is investigated for the 1993 integration, and an example of recovering subregional precipitation is presented. The obtained results show that the global variable-resolution stretched-grid approach is a viable candidate for regional and subregional climate studies and applications.

  12. Using Discrete Event Computer Simulation to Improve Patient Flow in a Ghanaian Acute Care Hospital

    PubMed Central

    Best, Allyson M.; Dixon, Cinnamon A.; Kelton, W. David; Lindsell, Christopher J.

    2014-01-01

    Objectives Crowding and limited resources have increased the strain on acute care facilities and emergency departments (EDs) worldwide. These problems are particularly prevalent in developing countries. Discrete event simulation (DES) is a computer-based tool that can be used to estimate how changes to complex healthcare delivery systems, such as EDs, will affect operational performance. Using this modality, our objective was to identify operational interventions that could potentially improve patient throughput of one acute care setting in a developing country. Methods We developed a simulation model of acute care at a district-level hospital in Ghana to test the effects of resource-neutral (e.g. modified staff start times and roles) and resource-additional (e.g. increased staff) operational interventions on patient throughput. Previously captured, de-identified time-and-motion data from 487 acute care patients were used to develop and test the model. The primary outcome was the modeled effect of interventions on patient length of stay (LOS). Results The base-case (no change) scenario had a mean LOS of 292 minutes (95% CI 291, 293). In isolation, neither adding staff, changing staff roles, nor varying shift times substantially affected overall patient LOS: adding two registration workers, history takers, and physicians resulted in only a 23.8 (95% CI 22.3, 25.3) minute LOS decrease. However, when shift start times were coordinated with patient arrival patterns, potential mean LOS decreased by 96 minutes (95% CI 94, 98); and with the simultaneous combination of staff roles (registration and history-taking) there was an overall mean LOS reduction of 152 minutes (95% CI 150, 154). Conclusions Resource-neutral interventions identified through DES modeling have the potential to improve acute care throughput in this Ghanaian municipal hospital. DES offers another approach to identifying potentially effective interventions to improve patient flow in emergency and acute care settings.

  13. SARDA HITL Simulations: System Performance Results

    NASA Technical Reports Server (NTRS)

    Gupta, Gautam

    2012-01-01

    This presentation gives an overview of the 2012 SARDA human-in-the-loop simulation and presents a summary of system performance results from the simulation, including delay, throughput, and fuel consumption.

  14. Discrete event simulation for healthcare organizations: a tool for decision making.

    PubMed

    Hamrock, Eric; Paige, Kerrie; Parks, Jennifer; Scheulen, James; Levin, Scott

    2013-01-01

    Healthcare organizations face challenges in efficiently accommodating increased patient demand with limited resources and capacity. The modern reimbursement environment prioritizes the maximization of operational efficiency and the reduction of unnecessary costs (i.e., waste) while maintaining or improving quality. As healthcare organizations adapt, significant pressures are placed on leaders to make difficult operational and budgetary decisions. In lieu of hard data, decision makers often base these decisions on subjective information. Discrete event simulation (DES), a computerized method of imitating the operation of a real-world system (e.g., healthcare delivery facility) over time, can provide decision makers with an evidence-based tool to develop and objectively vet operational solutions prior to implementation. DES in healthcare commonly focuses on (1) improving patient flow, (2) managing bed capacity, (3) scheduling staff, (4) managing patient admission and scheduling procedures, and (5) using ancillary resources (e.g., labs, pharmacies). This article describes applicable scenarios, outlines DES concepts, and describes the steps required for development. An original DES model developed to examine crowding and patient flow for staffing decision making at an urban academic emergency department serves as a practical example. PMID:23650696
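The mechanics of DES outlined above, a future event list ordered by time and entities competing for scarce resources, can be sketched in a minimal model of patient flow. The structure (arrival and departure events on a priority queue, a pool of clinicians, a FIFO waiting line) is generic DES; the names and parameter values are illustrative, not taken from the article's model.

```python
import heapq
import random
from collections import deque

def simulate_ed(n_patients=2000, mean_interarrival=6.0,
                mean_service=4.0, servers=2, seed=42):
    """Minimal discrete event simulation of an acute care unit.

    Events are (time, kind, patient_id) tuples on a priority queue;
    returns the mean length of stay (waiting time plus service time)."""
    rng = random.Random(seed)
    events = []                      # the future event list
    t = 0.0
    for pid in range(n_patients):    # pre-schedule all arrivals
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, "arrive", pid))
    free = servers
    waiting = deque()
    arrived = {}
    lengths_of_stay = []
    while events:
        now, kind, pid = heapq.heappop(events)
        if kind == "arrive":
            arrived[pid] = now
            waiting.append(pid)
        else:                        # "depart": a clinician becomes free
            free += 1
            lengths_of_stay.append(now - arrived[pid])
        while free and waiting:      # start service for queued patients
            nxt = waiting.popleft()
            free -= 1
            service = rng.expovariate(1.0 / mean_service)
            heapq.heappush(events, (now + service, "depart", nxt))
    return sum(lengths_of_stay) / len(lengths_of_stay)

print(round(simulate_ed(), 1))
```

Changing `servers` or the shift of arrivals relative to staffing is exactly the kind of "what if" such a model lets decision makers vet before implementation.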

  15. Simulations of Transient Phenomena in Liquid Rocket Feed Systems

    NASA Technical Reports Server (NTRS)

    Ahuja, V.; Hosangadi, A.; Cavallo, P. A.; Daines, R.

    2006-01-01

    Valve systems in rocket propulsion systems and testing facilities are constantly subject to dynamic events resulting from the timing of valve motion, leading to unsteady fluctuations in pressure and mass flow. Such events can also be accompanied by cavitation, resonance, and system vibration, potentially leading to catastrophic failure. High-fidelity dynamic computational simulations of valve operation can yield important information about valve response to varying flow conditions. Prediction of transient behavior related to valve motion can serve as a guideline for valve scheduling, which is of crucial importance in engine operation and testing. Feed components operating in cryogenic regimes can also experience cavitation-based instabilities leading to large-scale shedding of vapor clouds and pressure oscillations. In this paper, we present simulations of the diverse unsteady phenomena related to valve and feed systems, including valve stall, valve timing studies, and two different forms of cavitation instability in components utilized in the test loop.

  16. Safety monitoring in the Vaccine Adverse Event Reporting System (VAERS).

    PubMed

    Shimabukuro, Tom T; Nguyen, Michael; Martin, David; DeStefano, Frank

    2015-08-26

    The Centers for Disease Control and Prevention (CDC) and the U.S. Food and Drug Administration (FDA) conduct post-licensure vaccine safety monitoring using the Vaccine Adverse Event Reporting System (VAERS), a spontaneous (or passive) reporting system. This means that after a vaccine is approved, CDC and FDA continue to monitor its safety while it is distributed in the marketplace, by collecting and analyzing spontaneous reports of adverse events that occur in persons following vaccination. Various methods and statistical techniques are used to analyze VAERS data, which CDC and FDA use to guide further safety evaluations and inform decisions around vaccine recommendations and regulatory action. VAERS data must be interpreted with caution due to the inherent limitations of passive surveillance. VAERS is primarily a safety signal detection and hypothesis generating system. Generally, VAERS data cannot be used to determine if a vaccine caused an adverse event. VAERS data interpreted alone or out of context can lead to erroneous conclusions about cause and effect as well as the risk of adverse events occurring following vaccination. CDC makes VAERS data available to the public and readily accessible online. We describe fundamental vaccine safety concepts, provide an overview of VAERS for healthcare professionals who provide vaccinations and might want to report or better understand a vaccine adverse event, and explain how CDC and FDA analyze VAERS data. We also describe strengths and limitations, and address common misconceptions about VAERS. Information in this review will be helpful for healthcare professionals counseling patients, parents, and others on vaccine safety and the benefit-risk balance of vaccination. PMID:26209838

  17. Safety monitoring in the Vaccine Adverse Event Reporting System (VAERS)

    PubMed Central

    Shimabukuro, Tom T.; Nguyen, Michael; Martin, David; DeStefano, Frank

    2015-01-01

    The Centers for Disease Control and Prevention (CDC) and the U.S. Food and Drug Administration (FDA) conduct post-licensure vaccine safety monitoring using the Vaccine Adverse Event Reporting System (VAERS), a spontaneous (or passive) reporting system. This means that after a vaccine is approved, CDC and FDA continue to monitor its safety while it is distributed in the marketplace, by collecting and analyzing spontaneous reports of adverse events that occur in persons following vaccination. Various methods and statistical techniques are used to analyze VAERS data, which CDC and FDA use to guide further safety evaluations and inform decisions around vaccine recommendations and regulatory action. VAERS data must be interpreted with caution due to the inherent limitations of passive surveillance. VAERS is primarily a safety signal detection and hypothesis generating system. Generally, VAERS data cannot be used to determine if a vaccine caused an adverse event. VAERS data interpreted alone or out of context can lead to erroneous conclusions about cause and effect as well as the risk of adverse events occurring following vaccination. CDC makes VAERS data available to the public and readily accessible online. We describe fundamental vaccine safety concepts, provide an overview of VAERS for healthcare professionals who provide vaccinations and might want to report or better understand a vaccine adverse event, and explain how CDC and FDA analyze VAERS data. We also describe strengths and limitations, and address common misconceptions about VAERS. Information in this review will be helpful for healthcare professionals counseling patients, parents, and others on vaccine safety and the benefit-risk balance of vaccination. PMID:26209838

  18. Safety Discrete Event Models for Holonic Cyclic Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Ciufudean, Calin; Filote, Constantin

    In this paper the expression “holonic cyclic manufacturing systems” refers to complex assembly/disassembly systems, fork/join systems, kanban systems, and, in general, any discrete event system that transforms raw material and/or components into products. Such a system is said to be cyclic if it produces the same sequence of products indefinitely. This paper considers the scheduling of holonic cyclic manufacturing systems and describes a new approach using the Petri net formalism. We propose an approach to frame the optimum schedule of holonic cyclic manufacturing systems in order to maximize throughput while minimizing work in process. We also propose an algorithm to verify the optimum schedule.
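A Petri net of the kind described can be simulated with a few lines: a transition is enabled when its input places hold enough tokens, and a schedule is cyclic if firing it returns the shared resources (here, the machine) to their initial marking. The toy net below is our own illustrative example, not the paper's model.

```python
def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Consume tokens from input places, produce tokens in output places."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Toy cyclic cell: a free machine M takes a raw part R and emits a product P.
transitions = {
    "load":   ({"R": 1, "M": 1}, {"busy": 1}),
    "unload": ({"busy": 1}, {"P": 1, "M": 1}),
}
m = {"R": 3, "M": 1, "busy": 0, "P": 0}
for _ in range(3):                       # one load/unload cycle per raw part
    for name in ("load", "unload"):
        pre, post = transitions[name]
        assert enabled(m, pre)           # schedule must respect token counts
        m = fire(m, pre, post)
print(m)
```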

  19. Numerical Simulation and Analysis of the Localized Heavy Precipitation Event in South Korea based on diagnostic variables

    NASA Astrophysics Data System (ADS)

    Roh, Joon-Woo; Choi, Young-Jean

    2016-04-01

    Accurate prediction of precipitation is one of the most difficult and significant tasks in weather forecasting. Heavy precipitation events over the Korean Peninsula arise from various physical mechanisms, including shortwave troughs, quasi-stationary moisture convergence zones among varying air masses, and the direct or indirect effects of tropical cyclones. Many previous studies have used observations, numerical modeling, and statistics to investigate the potential causes of warm-season heavy precipitation in South Korea. Notably, the frequency of warm-season torrential rainfall events exceeding 30 mm/h has increased threefold in Seoul, a metropolitan city in South Korea, in the last 30 years. Localized heavy rainfall events in South Korea generally arise from mesoscale convective systems embedded in synoptic-scale disturbances along the Changma front, or from convective instabilities resulting from unstable air masses. In order to investigate a localized heavy precipitation system in the Seoul metropolitan area, analysis and numerical experiments were performed for a typical event on 20 June 2014. This case is characterized by a baroclinic instability structure associated with a short-wave trough from the northwest and moist, warm air supplied by a thermal low to the southwest of the Korean Peninsula. We investigated localized heavy precipitation in a narrow zone of the Seoul urban area using convective-scale numerical simulations based on the Weather Research and Forecasting (WRF) model. The topography and land use data of the revised U.S. Geological Survey (USGS) data set and an appropriate set of physics scheme options for the WRF simulation were selected. The simulation experiments revealed the primary physical structures related to the localized heavy precipitation in the diagnostic fields of storm relative helicity (SRH), updraft helicity (UH), and instantaneous contraction rates (ICON). SRH and UH are dominantly related to

  20. Evaluating the aerosol indirect effect in WRF-Chem simulations of the January 2013 Beijing air pollution event.

    NASA Astrophysics Data System (ADS)

    Peckham, Steven; Grell, Georg; Xie, Ying; Wu, Jian-Bin

    2015-04-01

    In January 2013, an unusual weather pattern over northern China produced unusually cool, moist conditions for the region. Recent peer-reviewed manuscripts report that during this period Beijing experienced a historically severe haze and smog event, with observed monthly average fine particulate matter (PM2.5) concentrations exceeding 225 micrograms per cubic meter. MODIS satellite observations yielded AOD values of approximately 1.5 to 2 for the same time. In addition, record-breaking hourly average PM2.5 concentrations of more than 700 μg m-3 were observed over eastern and northern China. Clearly, the severity and persistence of this air pollution episode have attracted the interest of the scientific community as well as widespread public attention. Despite the significance of this and similar air pollution events, several questions regarding the ability of numerical weather prediction models to forecast such events remain:
    • What is the importance of including aerosols in weather prediction models?
    • What is the current capability of weather prediction models to simulate aerosol impacts upon the weather?
    • How important is it to include the aerosol feedbacks (direct and indirect effects) in numerical model forecasts?
    In an attempt to address these and other questions, a joint working group of the Commission for Atmospheric Sciences and the World Climate Research Programme has been convened. This Working Group on Numerical Experimentation (WGNE) has selected several events of interest and asked its members to generate numerical simulations of the events and examine the results. As part of this project, weather and pollution simulations were produced at the NOAA Earth System Research Laboratory using the Weather Research and Forecasting (WRF) chemistry model. These particular simulations include the aerosol indirect effect and are being done in collaboration with a group in China that will produce

  1. Numerical simulations of the jetted tidal disruption event Swift J1644+57

    NASA Astrophysics Data System (ADS)

    Mimica, Petar; Aloy, Miguel A.; Giannios, Dimitrios; Metzger, Brian D.

    2016-05-01

    In this work we focus on the technical details of the numerical simulations of the non-thermal transient Swift J1644+57, whose emission is probably produced by a two-component jet powered by a tidal disruption event. In this context we provide details of the coupling between the relativistic hydrodynamic simulations and the radiative transfer code. First, we consider the technical demands of one-dimensional simulations of a fast relativistic jet, and show to what extent (for the same physical parameters of the model) the computed light curves depend on the numerical parameters of the different codes employed. In the second part we explain the difficulties of computing light curves from axisymmetric two-dimensional simulations and discuss a procedure that yields an acceptable trade-off between computational cost and the quality of the results.

  2. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability of a system's design, operating procedures, and control software to system accidents. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates static stresses in flow systems. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations; the distribution of static stresses depends on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  3. SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeff S.

    1992-01-01

    Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.
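The event horizon idea at the heart of Breathing Time Buckets can be sketched schematically: each node's local horizon is the time stamp of its earliest newly generated event, the global horizon is the minimum over all nodes, and pending events strictly earlier than the global horizon cannot be invalidated by any not-yet-delivered message, so they are safe to commit. This is our illustrative reading of the algorithm, not SPEEDES source code, and the names are ours.

```python
def global_event_horizon(new_event_times):
    """new_event_times: per-node lists of time stamps of newly generated events.

    Each node's local horizon is its earliest new event; the global
    horizon is the minimum of the local horizons."""
    local_horizons = [min(ts) for ts in new_event_times if ts]
    return min(local_horizons) if local_horizons else float("inf")

def split_committable(pending, horizon):
    """Split pending (time, event) pairs into safe-to-commit and deferred.

    Events strictly before the horizon cannot be superseded by any message
    still in flight, so committing them preserves causality."""
    safe = [(t, e) for t, e in pending if t < horizon]
    deferred = [(t, e) for t, e in pending if t >= horizon]
    return safe, deferred

horizon = global_event_horizon([[5.0, 9.0], [7.0], [12.0]])
safe, deferred = split_committable([(3.0, "a"), (6.0, "b"), (12.5, "c")], horizon)
print(horizon, safe, deferred)
```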

  4. Developing clinical competency in crisis event management: an integrated simulation problem-based learning activity.

    PubMed

    Liaw, S Y; Chen, F G; Klainin, P; Brammer, J; O'Brien, A; Samarasekera, D D

    2010-08-01

    This study aimed to evaluate the integration of a simulation-based learning activity on nursing students' clinical crisis management performance in a problem-based learning (PBL) curriculum. It was hypothesized that the clinical performance of first-year nursing students who participated in a simulated learning activity during the PBL session would be superior to that of those who completed the conventional problem-based session. The students were allocated to either simulation with problem-based discussion (SPBD) or problem-based discussion (PBD) for scenarios on respiratory and cardiac distress. Following completion of each scenario, students from both groups were invited to sit an optional individual test involving a systematic assessment and immediate management of a simulated patient facing a crisis event. A total of thirty students participated in the first post-test, related to the respiratory scenario, and thirty-three participated in the second post-test, related to the cardiac scenario. Their clinical performances were scored using a checklist. Mean test scores for students completing the SPBD were significantly higher than for those completing the PBD in both the first post-test (SPBD 20.08, PBD 18.19) and the second post-test (SPBD 27.56, PBD 23.07). Incorporation of simulation learning activities into problem-based discussion appears to be an effective educational strategy for teaching nursing students to assess and manage crisis events. PMID:19916052

  5. An Event-Driven Hybrid Molecular Dynamics and Direct Simulation Monte Carlo Algorithm

    SciTech Connect

    Donev, A; Garcia, A L; Alder, B J

    2007-07-30

    A novel algorithm is developed for the simulation of polymer chains suspended in a solvent. The polymers are represented as chains of hard spheres tethered by square wells and interact with the solvent particles through hard-core potentials. The algorithm uses event-driven molecular dynamics (MD) for the simulation of the polymer chain and the interactions between the chain beads and the surrounding solvent particles. The interactions between the solvent particles themselves are not treated deterministically as in event-driven algorithms; rather, the momentum and energy exchange in the solvent is determined stochastically using the Direct Simulation Monte Carlo (DSMC) method. The coupling between the solvent and the solute is consistently represented at the particle level; however, unlike full MD simulations of both the solvent and the solute, the spatial structure of the solvent is ignored. The algorithm is described in detail and applied to the study of the dynamics of a polymer chain tethered to a hard wall subjected to uniform shear. The algorithm closely reproduces full MD simulations with two orders of magnitude greater efficiency. The results do not confirm the existence of periodic (cycling) motion of the polymer chain.
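The core kernel of any such event-driven hard-sphere scheme is predicting the time of the next collision, which reduces to a quadratic in the relative coordinates: the spheres touch when |Δr + Δv·t| equals the contact distance. A minimal sketch (our own, with illustrative names):

```python
import math

def collision_time(r1, v1, r2, v2, d):
    """Time until two hard spheres at positions r1, r2 with velocities
    v1, v2 reach contact distance d, or math.inf if they never collide."""
    dr = [a - b for a, b in zip(r2, r1)]     # relative position
    dv = [a - b for a, b in zip(v2, v1)]     # relative velocity
    b = sum(x * y for x, y in zip(dr, dv))
    if b >= 0:                               # moving apart: no collision
        return math.inf
    dv2 = sum(x * x for x in dv)
    dr2 = sum(x * x for x in dr)
    disc = b * b - dv2 * (dr2 - d * d)
    if disc < 0:                             # closest approach misses contact
        return math.inf
    return (-b - math.sqrt(disc)) / dv2      # earliest root of the quadratic

# Head-on approach: gap of 3 closed at unit speed -> collision at t = 3.
print(collision_time((0, 0, 0), (1, 0, 0), (4, 0, 0), (0, 0, 0), 1.0))
```

An event-driven loop keeps such predicted times on a priority queue and advances the system from one collision to the next.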

  6. Computer simulation of initial events in the biochemical mechanisms of DNA damage

    NASA Technical Reports Server (NTRS)

    Chatterjee, A.; Holley, W. R.

    1993-01-01

    Understanding the systematic and quantitative correlation between the physical events of energy deposition by ionizing radiation and the ensuing chemical and biochemical processes leading to DNA damage is one of the goals in radiation research. Significant progress has been made toward achieving the stated goal by using theoretical modeling techniques. These techniques are strongly dependent on computer simulation procedures. A review of such techniques with details of various stages of simulation development, including a comparison with available experimental data, is presented in this article.

  7. Computer simulation of initial events in the biochemical mechanisms of DNA damage.

    PubMed

    Chatterjee, A; Holley, W R

    1993-01-01

    Understanding the systematic and quantitative correlation between the physical events of energy deposition by ionizing radiation and the ensuing chemical and biochemical processes leading to DNA damage is one of the goals in radiation research. Significant progress has been made toward achieving the stated goal by using theoretical modeling techniques. These techniques are strongly dependent on computer simulation procedures. A review of such techniques with details of various stages of simulation development, including a comparison with available experimental data, is presented in this article. PMID:11537895

  8. Address-event-based platform for bioinspired spiking systems

    NASA Astrophysics Data System (ADS)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. On the other hand, the use of a commercial personal computer implies dependence on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA, and serial links, making the system faster and stand-alone (independent of a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the network communication based on address events and, at the same time, for mapping and transforming the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA.
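Concretely, an AER event is just a neuron address multiplexed onto a shared bus; a common convention packs the 2-D coordinates of the spiking neuron into one word. The bit layout below is illustrative only, not the layout used by the chips described:

```python
def encode_event(x, y, x_bits=7):
    """Pack a 2-D neuron address (x, y) into a single AER word.

    The low x_bits carry the column, the remaining high bits the row."""
    assert 0 <= x < (1 << x_bits), "x exceeds the column field"
    return (y << x_bits) | x

def decode_event(word, x_bits=7):
    """Recover (x, y) from an AER word."""
    return word & ((1 << x_bits) - 1), word >> x_bits

word = encode_event(42, 17)
print(word, decode_event(word))
```

Because only addresses of neurons that actually spike are transmitted, bus traffic scales with activity, which is the bandwidth property the abstract describes.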

  9. Simulation system of airborne FLIR searcher

    NASA Astrophysics Data System (ADS)

    Sun, Kefeng; Li, Yu; Gao, Jiaobo; Wang, Jun; Wang, Jilong; Xie, Junhu; Ding, Na; Sun, Dandan

    2014-11-01

    An airborne forward-looking infrared (FLIR) searcher simulation system can provide a multi-mode simulated test environment that closely approximates the actual field environment, and can simulate the integrated performance and external interfaces of an airborne FLIR system. Furthermore, the airborne FLIR searcher simulation system can support the optimization of image processing algorithms, the test and evaluation of electro-optical systems, and the line testing of software, and can evaluate the performance of the avionics system. The detailed design structure and the information cross-linking relationships of each component are given in this paper. The simulation system is composed of a simulation center, a FLIR actuator, a FLIR emulator, and a display control terminal. The simulation center generates the simulated target and aircraft flight data for the operational states of the airborne FLIR searcher. The FLIR actuator provides the simulation scene: it generates infrared targets and landform-based scanning scenes and responds to commands from the simulation center and the operation control unit. The infrared images generated by the FLIR actuator are processed by the FLIR emulator, which uses a PowerPC hardware framework and processing software based on the VxWorks system. It can detect multiple targets and output DVI video and multi-target detection information corresponding to the working state of the FLIR searcher. The display control terminal displays the multi-target detection information in a two-dimensional situation format and realizes the human-computer interaction function.

  10. A simulation study of two major events in the heliosphere during the present sunspot cycle

    NASA Technical Reports Server (NTRS)

    Akasofu, S. I.; Fillius, W.; Sun, W.; Fry, C.; Dryer, M.

    1985-01-01

    The two major disturbances in the heliosphere during the present sunspot cycle, the event of June to August, 1982, and the event of April to June, 1978, are simulated by the method developed by Hakamada and Akasofu (1982). Specifically, an attempt was made to simulate the effects of six major flares from three active regions in June and July, 1982, and April and May, 1978. A comparison of the results with the solar wind observations at Pioneer 12 (approximately 0.8 au), ISEE-3 (approximately 1 au), Pioneer 11 (approximately 7 to 13 au) and Pioneer 10 (approximately 16 to 28 au) suggests that some major flares occurred behind the disk of the sun during the two periods. The method provides qualitatively some information as to how such a series of intense solar flares can greatly disturb both the inner and outer heliospheres. A long lasting effect on cosmic rays is discussed in conjunction with the disturbed heliosphere.

  11. Selective Attention in Multi-Chip Address-Event Systems

    PubMed Central

    Bartolozzi, Chiara; Indiveri, Giacomo

    2009-01-01

    Selective attention is the strategy used by biological systems to cope with the inherent limits in their available computational resources, in order to efficiently process sensory information. The same strategy can be used in artificial systems that have to process vast amounts of sensory data with limited resources. In this paper we present a neuromorphic VLSI device, the “Selective Attention Chip” (SAC), which can be used to implement these models in multi-chip address-event systems. We also describe a real-time sensory-motor system, which integrates the SAC with a dynamic vision sensor and a robotic actuator. We present experimental results from each component in the system, and demonstrate how the complete system implements a real-time stimulus-driven selective attention model. PMID:22346689

  12. Human visual system-based smoking event detection

    NASA Astrophysics Data System (ADS)

    Odetallah, Amjad D.; Agaian, Sos S.

    2012-06-01

Human action (e.g. smoking, eating, and phoning) analysis is an important task in various application domains such as video surveillance, video retrieval, and human-computer interaction systems. Smoke detection is a crucial task in many video surveillance applications and could greatly raise the level of safety of urban areas, public parks, airplanes, hospitals, schools and other settings. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features change under different lighting and weather conditions. This paper presents a new scheme for a system that detects human smoking events, or small traces of smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. Moreover, the developed method is capable of detecting small smoking events involving uncertain actions and various cigarette sizes, colors, and shapes.
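
    The background-subtraction step described above can be sketched minimally as follows; the threshold, the running-average update rule, and all function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def motion_mask(frame, background, thresh=25):
    """Flag pixels whose absolute deviation from the background
    model exceeds a threshold (simple background subtraction)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > thresh

def update_background(background, frame, alpha=0.05):
    """Running-average background update with learning rate alpha."""
    return ((1 - alpha) * background + alpha * frame).astype(np.uint8)
```

    Regions flagged by the mask would then be passed to the skin- and smoke-based segmentation stages.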

  13. Simulation of seismic events induced by CO2 injection at In Salah, Algeria

    NASA Astrophysics Data System (ADS)

    Verdon, James P.; Stork, Anna L.; Bissell, Rob C.; Bond, Clare E.; Werner, Maximilian J.

    2015-09-01

Carbon capture and storage technology has the potential to reduce anthropogenic CO2 emissions. However, the geomechanical response of the reservoir and sealing caprocks must be modelled and monitored to ensure that injected CO2 is safely stored. To ensure confidence in model results, there is a clear need to develop ways of comparing model predictions with observations from the field. In this paper we develop an approach to simulate microseismic activity induced by injection, which allows us to compare geomechanical model predictions with observed microseismic activity. We apply this method to the In Salah CCS project, Algeria. A geomechanical reconstruction is used to simulate the locations, orientations and sizes of pre-existing fractures in the In Salah reservoir. The initial stress conditions, in combination with a history-matched reservoir flow model, are used to determine when and where these fractures exceed Mohr-Coulomb limits, triggering failure. The sizes and orientations of fractures, and the stress conditions thereon, are used to determine the resulting micro-earthquake focal mechanisms and magnitudes. We compare our simulated event population with observations made at In Salah, finding good agreement between model and observations in terms of event locations, rates of seismicity, and event magnitudes.
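
    The Mohr-Coulomb triggering criterion used to decide when a fracture fails can be sketched as below; the cohesion and friction defaults and the function names are hypothetical, and injection enters through the pore-pressure term:

```python
def effective_normal_stress(sigma_n, pore_pressure):
    """Effective normal stress on a fracture plane; fluid injection
    raises pore pressure and lowers this value (compression positive)."""
    return sigma_n - pore_pressure

def mohr_coulomb_fails(sigma_n_eff, tau, cohesion=0.0, friction=0.6):
    """True when shear stress exceeds the Mohr-Coulomb strength:
    |tau| > cohesion + friction * sigma_n_eff."""
    return abs(tau) > cohesion + friction * sigma_n_eff
```

    A fracture stable under the in-situ stresses can thus be pushed past failure purely by raising pore pressure, which is the mechanism the simulated event population reflects.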

  14. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

The work performed to augment and improve the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, of the data processing loads imposed on these configurations, and of the executive software that controls system operations. Through these simulations, NASA has an extremely cost-effective capability for the design and analysis of computer-based data management systems.

  15. Residential photovoltaic system simulation: Thermal aspects

    NASA Astrophysics Data System (ADS)

    Hart, G. W.; Raghuraman, P.

    1982-04-01

A TRNSYS simulation was developed to model the performance of utility-interactive residential photovoltaic energy systems. The PV system is divided into its major functional components, which are individually described with computer models. These models are described in detail. The simulation results are compared with measured data obtained at MIT Lincoln Laboratory's Northeast Residential Station. The thermal influences on the design of such photovoltaic energy systems are given particular attention.

  16. The waveform correlation event detection system project: Issues in system refinement, tuning, and operation

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Harris, J.M.; Moore, S.G.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.

    1996-08-01

The goal of the Waveform Correlation Event Detection System (WCEDS) Project at Sandia Labs has been to develop a prototype of a full-waveform-correlation-based seismic event detection system which could be used to assess its potential usefulness for CTBT monitoring. The current seismic event detection system in use at the IDC is very sophisticated and provides good results, but there is still significant room for improvement, particularly in reducing the number of false events (currently nearly equal to the number of real events). Our first prototype was developed last year, and since then we have used it for extensive testing from which we have gained considerable insight. The original prototype was based on a long-period detector designed by Shearer (1994), but it has been heavily modified to address problems encountered in application to a data set from the Incorporated Research Institutions for Seismology (IRIS) broadband global network. Important modifications include capabilities for event masking and iterative event detection, continuous near-real-time execution, improved Master Image creation, and individualized station pre-processing. All have been shown to improve bulletin quality. In some cases the system has detected marginal events which may not be detectable by traditional detection systems, but definitive conclusions cannot be made without direct comparisons. For this reason, future work will focus on using the system to process GSETT-3 data for comparison with current event detection systems at the IDC.

  17. Real-Time Multimission Event Notification System for Mars Relay

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.

    2013-01-01

As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users be aware of such state changes. For example, a change by an orbiter team can affect operations for a lander team. This software provides an ambient view of the real-time status of the Mars Relay Network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as they are received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued for later viewing when the user is not online. The software does not do away with e-mail notifications, but augments them with in-line notifications. Further, this software expands the set of events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via e-mail. A common complaint of users was that the system-generated e-mails often "get lost" among other e-mail. This software allows an expanded set of notifications (including user-generated ones) to be displayed in-line in the program. Separating notifications in this way can improve a user's workflow.
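
    The push-when-online, queue-when-offline behaviour described above can be sketched as follows; this is a minimal illustration of the pattern, not the MaROS implementation, and all class and method names are invented:

```python
from collections import defaultdict, deque

class EventNotifier:
    """Push events to logged-in users; queue them for offline users."""

    def __init__(self):
        self.online = set()
        self.queues = defaultdict(deque)    # per-user pending events
        self.delivered = defaultdict(list)  # per-user delivered events

    def login(self, user):
        self.online.add(user)
        # Drain anything queued while the user was away.
        while self.queues[user]:
            self.delivered[user].append(self.queues[user].popleft())

    def logout(self, user):
        self.online.discard(user)

    def notify(self, user, event):
        """Deliver immediately if online, otherwise queue."""
        if user in self.online:
            self.delivered[user].append(event)
        else:
            self.queues[user].append(event)
```

    In a real deployment the delivered list would be replaced by a push to the Web client, with e-mail kept as a parallel channel.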

  18. [Preventing adverse drug events using clinical decision support systems].

    PubMed

    Salili, Ali Reza; Hammann, Felix; Taegtmeyer, Anne B

    2015-12-01

Adverse drug events pose a great risk to patients, are an everyday clinical problem and can have potential legal consequences. Computerized physician order entry or computerized provider order entry (CPOE) in combination with clinical decision support systems (CDSS) are popular and aim to reduce prescribing errors as well as to identify potentially harmful drug-drug interactions. The quantifiable benefit these systems bring to patients has, however, yet to be definitively proven. This article focuses on the current status of CPOE/CDSS, their risks and benefits, their potential for improvement, and their future prospects. PMID:26654813

  19. Convection-Resolving Climate Change Simulations: Intensification of Heavy Hourly Precipitation Events

    NASA Astrophysics Data System (ADS)

    Ban, N.; Schmidli, J.; Schar, C.

    2014-12-01

Reliable climate-change projections of extreme precipitation events are of great interest to decision makers, due to potentially important hydrological impacts such as floods, landslides and debris flows. Low-resolution climate models generally project increases of heavy precipitation events with climate change, but there are large uncertainties related to the limited spatial resolution and the parameterized representation of atmospheric convection. Here we employ a convection-resolving version of the COSMO model across an extended region (1100 km x 1100 km) covering the European Alps to investigate the differences between parameterized and explicit convection in climate-change scenarios. We conduct 10-year-long integrations at resolutions of 12 and 2 km. Validation using ERA-Interim-driven simulations reveals major improvements with the 2 km resolution, in particular regarding the diurnal cycle of mean precipitation and the representation of hourly extremes. In addition, the 2 km simulations replicate the observed super-adiabatic scaling at precipitation stations, i.e. peak hourly events increase faster with temperature than the Clausius-Clapeyron scaling of 7%/K (see Ban et al. 2014). Convection-resolving climate change scenarios are conducted using control (1991-2000) and scenario (2081-2090) simulations driven by a CMIP5 GCM (the MPI-ESM-LR) under the IPCC RCP8.5 scenario. Comparison between the 12 and 2 km resolutions with parameterized and explicit convection, respectively, reveals close agreement in terms of mean summer precipitation amounts (a decrease by 30%) and regarding slight increases of heavy day-long events (amounting to 15% for the 90th percentile of wet-day precipitation). However, the two resolutions yield large differences regarding extreme hourly precipitation, with the 2 km version projecting substantially faster increases of heavy hourly precipitation events (about 30% increases for 90th-percentile hourly events). Ban, N., J. Schmidli and C. Schar
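
    The Clausius-Clapeyron scaling mentioned above (7%/K) can be written as a one-line function, with super-adiabatic scaling simply corresponding to a larger rate; the function name and defaults are illustrative:

```python
def scaled_intensity(base_intensity, delta_t, rate=0.07):
    """Precipitation intensity after delta_t kelvin of warming under an
    exponential scaling of `rate` per kelvin; rate=0.07 is the
    Clausius-Clapeyron value, larger rates are super-adiabatic."""
    return base_intensity * (1.0 + rate) ** delta_t
```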

  20. Relativistic positioning systems: Numerical simulations

    NASA Astrophysics Data System (ADS)

    Puchades Colmenero, Neus

The position of users located on or near the Earth's surface may be found with classic positioning systems (CPS). Certain information broadcast by the satellites of global navigation systems, such as GPS and GALILEO, may be used for positioning. The CPS are based on the Newtonian formalism, although relativistic post-Newtonian corrections are applied when necessary. This thesis contributes to the development of a different positioning approach, which is fully relativistic from the outset. In relativistic positioning systems (RPS), the space-time position of any user (ship, spacecraft, and so on) can be calculated with the help of four satellites, which broadcast their proper times by means of codified electromagnetic signals. In this thesis, we have simulated satellite 4-tuples of the GPS and GALILEO constellations. If a user receives the signals from four satellites simultaneously, the emission proper times read, after decoding, are the user's "emission coordinates". In order to find the user's "positioning coordinates" in an appropriate almost inertial reference system, there are two possibilities: (a) the explicit relation between positioning and emission coordinates (broadcast by the satellites) is found analytically, or (b) numerical codes are designed to calculate the positioning coordinates from the emission ones. Method (a) is only viable in simple ideal cases, whereas (b) allows us to consider realistic situations. In this thesis, we have designed numerical codes with the essential aim of studying two appropriate RPS, which may be generalized. Sometimes, two real users placed in different positions receive the same proper times from the same satellites; we then say that there is bifurcation, and additional data are needed to choose the real user position. In this thesis, bifurcation is studied in detail. We have analyzed in depth two RPS models; in both, the satellites are considered to move in the Schwarzschild space

  1. State-space supervision of reconfigurable discrete event systems

    SciTech Connect

    Garcia, H.E.; Ray, A.

    1995-12-31

The Discrete Event Systems (DES) theory of supervisory and state feedback control offers many advantages for implementing supervisory systems. Algorithmic concepts have been introduced to assure that the supervising algorithms are correct and meet their specifications. It is often assumed that the supervisory specifications are invariant or, at least, fixed until a given supervisory task is completed. However, there are many practical applications where the supervisory specifications are updated in real time. For example, in a Reconfigurable Discrete Event System (RDES) architecture, a bank of supervisors is defined to accommodate each identified operational condition or set of supervisory specifications. This adaptive supervisory control system changes the supervisory configuration to accept coordinating commands or to adjust for changes in the controlled process. This paper addresses reconfiguration at the supervisory level of hybrid systems along with an underlying RDES architecture. It reviews the state-based supervisory control theory and extends it to the RDES paradigm with a view toward process control applications. The paper addresses theoretical issues with a limited number of practical examples. This control approach is particularly suitable for hierarchical reconfigurable hybrid implementations.
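
    A bank of supervisors keyed by operating condition, with reconfiguration switching the active member, can be sketched as below; this is a toy illustration of the RDES idea, not the paper's formalism, and all names are invented:

```python
class ReconfigurableSupervisor:
    """Toy bank of supervisors for a reconfigurable DES.

    Each supervisor maps a plant state to the set of controllable
    events it enables; reconfigure() switches the active supervisor
    when the identified operating condition changes."""

    def __init__(self, bank, initial_condition):
        if initial_condition not in bank:
            raise KeyError(initial_condition)
        self.bank = bank                  # condition -> {state: enabled events}
        self.condition = initial_condition

    def reconfigure(self, condition):
        """Activate the supervisor for a newly identified condition."""
        if condition not in self.bank:
            raise KeyError(condition)
        self.condition = condition

    def enabled(self, state):
        """Controllable events the active supervisor permits in `state`."""
        return self.bank[self.condition].get(state, set())
```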

  2. A systems model of phosphorylation for inflammatory signaling events.

    PubMed

    Sadreev, Ildar I; Chen, Michael Z Q; Welsh, Gavin I; Umezawa, Yoshinori; Kotov, Nikolay V; Valeyev, Najl V

    2014-01-01

    Phosphorylation is a fundamental biochemical reaction that modulates protein activity in cells. While a single phosphorylation event is relatively easy to understand, multisite phosphorylation requires systems approaches for deeper elucidation of the underlying molecular mechanisms. In this paper we develop a mechanistic model for single- and multi-site phosphorylation. The proposed model is compared with previously reported studies. We compare the predictions of our model with experiments published in the literature in the context of inflammatory signaling events in order to provide a mechanistic description of the multisite phosphorylation-mediated regulation of Signal Transducer and Activator of Transcription 3 (STAT3) and Interferon Regulatory Factor 5 (IRF-5) proteins. The presented model makes crucial predictions for transcription factor phosphorylation events in the immune system. The model proposes potential mechanisms for T cell phenotype switching and production of cytokines. This study also provides a generic framework for the better understanding of a large number of multisite phosphorylation-regulated biochemical circuits. PMID:25333362

  3. Evaluating resilience of DNP3-controlled SCADA systems against event buffer flooding

    SciTech Connect

    Yan, Guanhua; Nicol, David M; Jin, Dong

    2010-12-16

The DNP3 protocol is widely used in SCADA systems (particularly electrical power) as a means of communicating observed sensor state information back to a control center. Typical architectures using DNP3 have a two-level hierarchy, where a specialized data aggregator device receives observed state from devices within a local region, and the control center collects the aggregated state from the data aggregator. The DNP3 communication between control center and data aggregator is asynchronous with the DNP3 communication between data aggregator and relays; this raises the possibility of completely filling a data aggregator's buffer of pending events when a relay is compromised or spoofed and sends a flood of (false) events to the data aggregator. This paper investigates how a real-world SCADA device responds to event buffer flooding. A Discrete-Time Markov Chain (DTMC) model is developed to capture this behavior. The DTMC model is validated against a Moebius simulation model and data collected on a real SCADA testbed.
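
    The event-buffer flooding scenario can be illustrated with a simple discrete-time birth-death chain, a simulation stand-in for (not a reproduction of) the paper's DTMC analysis; the parameters and function name are assumptions:

```python
import random

def simulate_buffer(capacity, arrival_p, service_p, steps, seed=0):
    """Discrete-time birth-death chain for a finite event buffer:
    each step one (possibly spoofed) event arrives with probability
    arrival_p, and one pending event is polled off with probability
    service_p. Returns the fraction of steps the buffer spent full."""
    rng = random.Random(seed)
    occupancy, full_steps = 0, 0
    for _ in range(steps):
        if rng.random() < arrival_p and occupancy < capacity:
            occupancy += 1
        if rng.random() < service_p and occupancy > 0:
            occupancy -= 1
        if occupancy == capacity:
            full_steps += 1
    return full_steps / steps
```

    A compromised relay corresponds to arrival_p far exceeding service_p, driving the buffer to spend most of its time full and dropping legitimate events.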

  4. Mutual Events in the Uranian satellite system in 2007

    NASA Astrophysics Data System (ADS)

    Arlot, J. E.

    2008-09-01

The equinox time on the giant planets: when the Sun crosses the equatorial plane of a giant planet, it is the equinox time, occurring every half orbit of the planet, i.e. every 6 years for Jupiter, 14 years for Saturn, 42 years for Uranus and 82 years for Neptune. Except for Neptune, each planet has several major satellites orbiting in its equatorial plane; during the equinox time, the satellites therefore eclipse one another. Since the Earth follows the Sun, a terrestrial observer will also see the satellites occult one another during the same period. These events may be observed with photometric receivers, since the light from the satellites decreases during the events. The light curve provides information on the geometric configuration of the satellites at the time of the event with an accuracy of a few kilometers, independent of the distance of the satellite system. We are thus able to obtain an astrometric observation with an accuracy several times better than direct imaging for positions. Equinox on Uranus in 2007: in 2007 it was equinox time on Uranus. The Sun crossed the equatorial plane of Uranus on December 6, 2007. Since the Uranus-Sun opposition was at the end of August 2007, observations were performed from May to December 2007. Since the declination of Uranus was between -5 and -6 degrees, observations were best made from the southern hemisphere. However, some difficulties had to be overcome: the faintness of the satellites (magnitude between 14 and 16) and the brightness of the planet (magnitude 5), which made photometric observation of the satellites difficult. The use of a K' filter on a large telescope increased the number of observable events. Dynamics of the Uranian satellites: one of the goals of the observations was to evaluate the accuracy of the current dynamical models of the motion of the satellites. This knowledge is important for several reasons: most of the time the Uranian system is

  5. DeMO: An Ontology for Discrete-event Modeling and Simulation

    PubMed Central

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-01-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community. PMID:22919114

  6. Systemic chemokine levels, coronary heart disease, and ischemic stroke events

    PubMed Central

    Canouï-Poitrine, F.; Luc, G.; Mallat, Z.; Machez, E.; Bingham, A.; Ferrieres, J.; Ruidavets, J.-B.; Montaye, M.; Yarnell, J.; Haas, B.; Arveiler, D.; Morange, P.; Kee, F.; Evans, A.; Amouyel, P.; Ducimetiere, P.

    2011-01-01

    Objectives: To quantify the association between systemic levels of the chemokine regulated on activation normal T-cell expressed and secreted (RANTES/CCL5), interferon-γ-inducible protein-10 (IP-10/CXCL10), monocyte chemoattractant protein-1 (MCP-1/CCL2), and eotaxin-1 (CCL11) with future coronary heart disease (CHD) and ischemic stroke events and to assess their usefulness for CHD and ischemic stroke risk prediction in the PRIME Study. Methods: After 10 years of follow-up of 9,771 men, 2 nested case-control studies were built including 621 first CHD events and 1,242 matched controls and 95 first ischemic stroke events and 190 matched controls. Standardized hazard ratios (HRs) for each log-transformed chemokine were estimated by conditional logistic regression. Results: None of the 4 chemokines were independent predictors of CHD, either with respect to stable angina or to acute coronary syndrome. Conversely, RANTES (HR = 1.70; 95% confidence interval [CI] 1.05–2.74), IP-10 (HR = 1.53; 95% CI 1.06–2.20), and eotaxin-1 (HR = 1.59; 95% CI 1.02–2.46), but not MCP-1 (HR = 0.99; 95% CI 0.68–1.46), were associated with ischemic stroke independently of traditional cardiovascular risk factors, hs-CRP, and fibrinogen. When the first 3 chemokines were included in the same multivariate model, RANTES and IP-10 remained predictive of ischemic stroke. Their addition to a traditional risk factor model predicting ischemic stroke substantially improved the C-statistic from 0.6756 to 0.7425 (p = 0.004). Conclusions: In asymptomatic men, higher systemic levels of RANTES and IP-10 are independent predictors of ischemic stroke but not of CHD events. RANTES and IP-10 may improve the accuracy of ischemic stroke risk prediction over traditional risk factors. PMID:21849651

  7. Method for simulating discontinuous physical systems

    DOEpatents

    Baty, Roy S.; Vaughn, Mark R.

    2001-01-01

    The mathematical foundations of conventional numerical simulation of physical systems provide no consistent description of the behavior of such systems when subjected to discontinuous physical influences. As a result, the numerical simulation of such problems requires ad hoc encoding of specific experimental results in order to address the behavior of such discontinuous physical systems. In the present invention, these foundations are replaced by a new combination of generalized function theory and nonstandard analysis. The result is a class of new approaches to the numerical simulation of physical systems which allows the accurate and well-behaved simulation of discontinuous and other difficult physical systems, as well as simpler physical systems. Applications of this new class of numerical simulation techniques to process control, robotics, and apparatus design are outlined.

  8. An Automated Visual Event Detection System for Cabled Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Mariette, J.

    2007-12-01

The permanent presence of underwater cameras on oceanic cabled observatories, such as the Victoria Experimental Network Under the Sea (VENUS) and Eye-In-The-Sea (EITS) on the Monterey Accelerated Research System (MARS), will generate valuable data that can push forward the boundaries of our understanding of the underwater world. However, sightings of underwater animal activity are rare, resulting in the recording of many hours of video with relatively few events of interest. The burden of video management and analysis often requires reducing the amount of video recorded and later analyzed. Often there are not enough human resources to analyze the video, and the strain that video analysis places on human attention demands an automated way to assist in the task. Towards this end, an Automated Visual Event Detection System (AVED) is in development at the Monterey Bay Aquarium Research Institute (MBARI) to address the problem of analyzing cabled observatory video. Here we describe the overall design of the system to process video data and enable science users to analyze the results. We present our results analyzing video from the VENUS observatory and test data from EITS deployments. This automated system for detecting visual events includes a collection of custom and open-source software that can be run three ways: through a Web service, through a Condor-managed pool of AVED-enabled compute servers, or locally on a single computer. The collection of software also includes a graphical user interface to preview or edit detected results and to set up processing options. To optimize the compute-intensive AVED algorithms, a parallel program has been implemented for high-data-rate applications like the EITS instrument on MARS.

  9. An adverse event capture and management system for cancer studies

    PubMed Central

    2015-01-01

Background Comprehensive capture of Adverse Events (AEs) is crucial for monitoring for side effects of a therapy while assessing efficacy. For cancer studies, the National Cancer Institute has developed the Common Terminology Criteria for Adverse Events (CTCAE) as a required standard for recording attributes and grading AEs. The AE assessments should be part of the Electronic Health Record (EHR) system; yet, due to patient-centric EHR design and implementation, many EHRs do not provide straightforward functions to assess ongoing AEs, to indicate a resolution, or to record a grade change for clinical trials. Methods At UAMS, we have implemented a standards-based Adverse Event Reporting System (AERS) that is integrated with the Epic EHR and other research systems to track new and existing AEs, including automated lab result grading in a regulatory compliant manner. Within a patient's chart, providers can launch AERS, which opens the patient's ongoing AEs by default and allows providers to assess (resolved/ongoing) existing AEs. In another tab, it allows providers to create a new AE. Also, we have separated symptoms from diagnoses in the CTCAE to minimize inaccurate designation of clinical observations. Upon completion of assessments, a physician submits the AEs to the EHR via a Health Level 7 (HL7) message and then to other systems utilizing a Representational State Transfer web service. Conclusions AERS currently supports CTCAE versions 3 and 4, with more than 65 cancer studies and 350 patients on those studies. Integrating this type of standard into the EHR aids research and data sharing in a compliant, efficient, and safe manner. PMID:26424052
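
    Automated CTCAE-style lab grading amounts to mapping a result onto ordered severity thresholds. In this sketch the threshold values are illustrative placeholders, not actual CTCAE criteria, and the names are invented:

```python
def grade_lab_value(value, thresholds):
    """Return the highest grade whose lower bound the result meets;
    0 means within normal limits. `thresholds` is an ascending list
    of (grade, lower_bound) pairs."""
    grade = 0
    for g, lower in thresholds:
        if value >= lower:
            grade = g
    return grade

# Illustrative bounds (multiples of the upper limit of normal) for an
# analyte where higher is worse -- NOT actual CTCAE criteria.
EXAMPLE_THRESHOLDS = [(1, 1.0), (2, 1.5), (3, 3.0), (4, 6.0)]
```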

  10. Classification of novel events for structural health monitoring systems

    NASA Astrophysics Data System (ADS)

    Dhruve, Nishant J.; McNeill, Dean K.

    2007-04-01

This article reports on results obtained when applying neural networks to the problem of vehicle classification from SHM measurement data. It builds upon previous work which addressed the issue of reducing the vast amounts of data collected during an SHM process by storing only those events regarded as "interesting," thus decreasing the stored data to a manageable size. This capability is extended here by providing a means to group and classify these novel events using artificial neural network (ANN) techniques. Two types of neural systems are investigated: the first consists of two neural layers employing both supervised and unsupervised learning; the second, an extension of the first, adds a data pre-processing stage. In this latter system, input data are first pre-scaled before being passed to the first network layer. The scaling value is retained and later passed to the second layer as an extra input. The results obtained for vehicle classification using these two methods showed success rates of 60% and 90% for the first and second ANN systems, respectively.
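
    The pre-scaling stage of the second ANN system, which retains the scale factor as an extra input for the later layer, can be sketched as follows; the function name and unit-peak normalisation are illustrative assumptions:

```python
import numpy as np

def prescale_with_factor(x):
    """Scale an input vector to unit peak amplitude and return the
    scale factor, which the second network layer would receive as an
    extra input alongside the scaled data."""
    scale = float(np.max(np.abs(x)))
    if scale == 0.0:
        return x.copy(), 0.0
    return x / scale, scale
```

    Passing the scale alongside the normalised shape lets the classifier separate events with similar waveforms but different magnitudes.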

  11. DKIST Adaptive Optics System: Simulation Results

    NASA Astrophysics Data System (ADS)

    Marino, Jose; Schmidt, Dirk

    2016-05-01

The 4 m class Daniel K. Inouye Solar Telescope (DKIST), currently under construction, will be equipped with an ultra-high-order solar adaptive optics (AO) system. The requirements and capabilities of such a solar AO system are beyond those of any other solar AO system currently in operation. We must rely on solar AO simulations to estimate and quantify its performance. We present performance estimation results for the DKIST AO system obtained with a new solar AO simulation tool. This simulation tool is a flexible and fast end-to-end solar AO simulator which produces accurate solar AO simulations while taking advantage of current multi-core computer technology. It relies on full imaging simulations of the extended-field Shack-Hartmann wavefront sensor (WFS), which directly include important secondary effects such as field-dependent distortions and varying contrast of the WFS sub-aperture images.

  12. Database system support for simulation data

    SciTech Connect

    Murphy, M.C.

    1989-07-01

    This report addresses database system issues arising in the design, implementation and execution of queuing simulation experiments. The primary goal is to identify new features for inclusion in a custom database system implemented using an extensible database system. Simulation data is first identified as a distinct subset of scientific data. An overview of the experimental process is then presented along with a survey of related simulation environments. A queuing simulation paradigm is described in detail in order to identify the distinguishing characteristics of queuing simulation data and the modes of manipulation. This is the basis for a traditional ER/Relational implementation, which in turn serves as the focus of a complete simulation environment. Difficulties encountered in the use of traditional implementation tools motivate the custom database system extensions. 37 refs., 5 figs., 2 tabs.

  13. Did the Solar system form in a sequential triggered star formation event?

    NASA Astrophysics Data System (ADS)

    Parker, Richard J.; Dale, James E.

    2016-02-01

The presence and abundance of the short-lived radioisotopes (SLRs) 26Al and 60Fe during the formation of the Solar system is difficult to explain unless the Sun formed in the vicinity of one or more massive star(s) that exploded as supernovae. Two different scenarios have been proposed to explain the delivery of SLRs to the protosolar nebula: (i) direct pollution of the protosolar disc by supernova ejecta, and (ii) the formation of the Sun in a sequential star formation event in which supernovae shockwaves trigger further star formation which is enriched in SLRs. The sequentially triggered model has been suggested as being more astrophysically likely than the direct pollution scenario. In this paper, we investigate this claim by analysing a combination of N-body and smoothed particle hydrodynamics simulations of star formation. We find that sequential star formation would result in large age spreads (or even bi-modal age distributions for spatially coincident events) due to the dynamical relaxation of the first star formation event(s). Secondly, we discuss the probability of triggering spatially and temporally discrete populations of stars and find this to be possible only in very contrived situations. Taken together, these results suggest that the formation of the Solar system in a triggered star formation event is at least as improbable as the direct pollution of the protosolar disc by a supernova.

  14. Sensitivity of a simulated extreme precipitation event to spatial resolution, parametrisations and assimilation

    NASA Astrophysics Data System (ADS)

    Ferreira, J.; Carvalho, A.; Carvalheiro, L.; Rocha, A.; Castanheira, J.

    2010-09-01

    The first part of this study evaluates the sensitivity of the model to horizontal resolution and physical parametrisations in the prediction of the selected extreme precipitation events. Additionally, two other sensitivity tests were performed with the OP1 configuration: one regarding the cumulus physics parametrisation, which was switched off (i.e. explicit calculation of convective eddies) to compare the results with the operational configuration, and the other with assimilation of surface and upper-air data. The physical processes behind the precipitation in this period were revealed through analysis of the precipitation fields associated with the microphysics and cumulus parametrisations. During the early morning, microphysics plays an important role, whereas the late-morning precipitation is due to a squall-line convective system. As expected, the results show that model resolution affects the amount of predicted precipitation, while the parametrisations affect the location and timing of the extreme precipitation. For this particular event, assimilation seems to degrade the simulation, particularly the precipitation maximum.

  15. Model simulations of the modulating effect of the snow cover in a rain on snow event

    NASA Astrophysics Data System (ADS)

    Wever, N.; Jonas, T.; Fierz, C.; Lehning, M.

    2014-05-01

    In October 2011, the Swiss Alps encountered a marked rain on snow event, when a large snowfall on 8 and 9 October was followed by intense rain on the 10th. This resulted in severe flooding in some parts of Switzerland. Model simulations were carried out for 14 meteorological stations in two regions of the Swiss Alps using the detailed physically-based snowpack model SNOWPACK. The results show that the snow cover has a strong modulating effect on the incoming rainfall signal on sub-daily time scales. The snowpack runoff dynamics appear to depend strongly on the snow depth at the onset of the rain: deeper snow covers have more storage potential and can absorb all rain and meltwater in the first hours, whereas the snowpack runoff from shallow snow covers reacts much more quickly. It was found that after about 4-6 h the snowpack produced runoff, and after about 11-13 h total snowpack runoff exceeded total rainfall as a result of additional snow melt. These values depend strongly on the snow height at the onset of rainfall as well as on precipitation and melt rates. An ensemble model study was carried out in which meteorological forcing and rainfall from other stations were used for repeated simulations at a specific station. Using regression analysis, the individual contributions of rainfall, snow melt and storage could be quantified. It was found that once the snowpack is producing runoff, deep snow covers produce more runoff than shallow ones, which could be associated with a higher contribution of the storage term; this term represents the recession curve of the liquid water storage and snowpack settling. In the event under study, snow melt in deep snow covers also turned out to be higher than in the shallow ones, although this is rather coincidental. Our results show the dual nature of snow covers in rain on snow events. 
Snow covers initially absorb substantial amounts of rain water, but once meltwater is released by the snow cover, the

  16. Integrating Existing Simulation Components into a Cohesive Simulation System

    NASA Technical Reports Server (NTRS)

    McLaughlin, Brian J.; Barrett, Larry K.

    2012-01-01

    A tradition of leveraging component re-use to help manage costs has evolved in the development of complex systems. This tradition continues in the Joint Polar Satellite System (JPSS) Program with the cloning of the Suomi National Polar-orbiting Partnership (NPP) satellite for the JPSS-1 mission, including the instrument complement. One benefit of re-use on a mission is the availability of existing simulation assets from the systems that were previously built. An issue arises from the continual shift of technology over a long mission, or multi-mission, lifecycle: as the missions mature, the requirements for the observatory simulations evolve. The challenge in this environment becomes re-using the existing components in that ever-changing landscape. To meet this challenge, the system must: establish an operational architecture that minimizes impacts on the implementation of individual components, consolidate the satisfaction of new high-impact requirements into system-level infrastructure, and build in a long-term view of system adaptation that spans the full lifecycle of the simulation system. The Flight Vehicle Test Suite (FVTS) within the JPSS Program is defining and executing this approach to ensure a robust simulation capability for the JPSS multi-mission environment.

  17. Simulation and analysis of infrastructure interdependencies using a Petri net simulator in a geographical information system

    NASA Astrophysics Data System (ADS)

    Ge, Yong; Xing, Xitao; Cheng, Qiuming

    2010-12-01

    Society relies greatly upon infrastructure networks that are highly interconnected and mutually dependent in complex ways. Simulation and modeling help in dealing with the complexity of infrastructure networks, especially in the effective response and management of resources for rescue, recovery, and restoration. This paper introduces the Petri net into a geographical information system to develop the GeoPetri Net system, which can be used to simulate the complex geographical relationships among places and nodes. Unlike an ordinary Petri net, the GeoPetri Net deals with places and nodes that have geographical locations and with the geographical relationships between these nodes, and records the statuses of nodes to produce simulated events. A case study involving an education layer with 15 nodes (schools) and a transportation layer with 25 lines (streets) in a geographical information system is presented to substantiate the above conceptual arguments.
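
    The underlying "token game" semantics that an ordinary Petri net contributes to a system like this can be sketched in a few lines. The example below is a minimal, hypothetical illustration (the place and transition names are invented for this sketch and are not from the paper): a transition fires when every input place holds enough tokens, moving tokens to its output places.

```python
# Minimal Petri net "token game". A transition is a pair (pre, post) of
# dicts mapping place name -> token count consumed/produced.
def enabled(marking, transition):
    pre, _ = transition
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, transition):
    """Fire an enabled transition, returning the new marking."""
    pre, post = transition
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical two-layer interdependency: a street closure propagates to
# the school that depends on it.
street_closes = ({"street_open": 1}, {"street_closed": 1})
school_cut_off = ({"street_closed": 1, "school_open": 1}, {"school_isolated": 1})

m0 = {"street_open": 1, "school_open": 1}
m1 = fire(m0, street_closes)        # street fails first
m2 = fire(m1, school_cut_off)       # dependency propagates
```

Chaining `enabled`/`fire` in a loop, with geographical attributes attached to places, is the kind of event production the GeoPetri Net builds on.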

  18. Consistent simulations of multiple proxy responses to an abrupt climate change event.

    PubMed

    LeGrande, A N; Schmidt, G A; Shindell, D T; Field, C V; Miller, R L; Koch, D M; Faluvegi, G; Hoffmann, G

    2006-01-24

    Isotope, aerosol, and methane records document an abrupt cooling event across the Northern Hemisphere at 8.2 kiloyears before present (kyr), while separate geologic lines of evidence document the catastrophic drainage of the glacial Lakes Agassiz and Ojibway into Hudson Bay at approximately the same time. This meltwater pulse may have been the catalyst for a decrease in North Atlantic Deep Water formation and subsequent cooling around the Northern Hemisphere. However, the lack of direct evidence for ocean cooling has led to speculation that this abrupt event was purely local to Greenland, calling the proposed mechanism into question. We simulate the response to this meltwater pulse using a coupled general circulation model that explicitly tracks water isotopes, and with atmosphere-only experiments that calculate changes in atmospheric aerosol deposition (specifically (10)Be and dust) and wetland methane emissions. The simulations produce a short period of significantly diminished North Atlantic Deep Water and are able to quantitatively match paleoclimate observations, including the lack of an isotopic signal in the North Atlantic. This direct comparison with multiple proxy records provides compelling evidence that changes in ocean circulation played a major role in this abrupt climate change event. PMID:16415159

  19. Simulation of the cold climate event 8200 years ago by meltwater outburst from Lake Agassiz

    NASA Astrophysics Data System (ADS)

    Bauer, E.; Ganopolski, A.; Montoya, M.

    2004-09-01

    The cold climate anomaly about 8200 years ago is investigated with CLIMBER-2, a coupled atmosphere-ocean-biosphere model of intermediate complexity. This climate model simulates a cooling of about 3.6 K over the North Atlantic induced by a meltwater pulse from Lake Agassiz routed through the Hudson Strait. The meltwater pulse is assumed to have a volume of 1.6 × 10^14 m^3 and a discharge period of 2 years on the basis of glaciological modeling of the decay of the Laurentide Ice Sheet (LIS). We present a possible mechanism which can explain the centennial duration of the 8.2 ka cold event. The mechanism is related to the existence of an additional equilibrium climate state with reduced North Atlantic Deep Water (NADW) formation and a southward shift of the NADW formation area. Hints at the additional climate state were obtained from the widely varying duration of the pulse-induced cold episode in response to overlaid random freshwater fluctuations in Monte Carlo simulations. The model equilibrium state was attained by releasing a weak multicentury freshwater flux through the St. Lawrence pathway, completed by the meltwater pulse. The existence of such a climate mode appears essential for reproducing climate anomalies in close agreement with paleoclimatic reconstructions of the 8.2 ka event. The results furthermore suggest that the temporal evolution of the cold event was partly a matter of chance.
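
    As a back-of-envelope check on the stated pulse, the quoted volume and discharge period translate into an average freshwater flux of roughly 2.5 Sv (1 Sverdrup = 10^6 m^3/s), which is indeed a very large perturbation to North Atlantic freshwater budgets:

```python
# Average freshwater flux of the Lake Agassiz pulse:
# 1.6e14 m^3 released over ~2 years.
volume_m3 = 1.6e14
seconds = 2 * 365.25 * 86400        # two years in seconds
flux_m3_per_s = volume_m3 / seconds
flux_sv = flux_m3_per_s / 1.0e6     # 1 Sv = 1e6 m^3/s; roughly 2.5 Sv
```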

  20. An investigation into pilot and system response to critical in-flight events, volume 2

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.

    1981-01-01

    Critical in-flight events are studied using mission simulation and written tests of pilot responses. Materials and procedures used in the knowledge tests, written tests, and mission simulations are included.

  1. Using simulation to evaluate warhead monitoring system effectiveness

    SciTech Connect

    Perkins, Casey J.; Brigantic, Robert T.; Keating, Douglas H.; Liles, Karina R.; Meyer, Nicholas J.; Oster, Matthew R.; Waterworth, Angela M.

    2015-07-12

    There is a need to develop and demonstrate technical approaches for verifying potential future agreements to limit and reduce total warhead stockpiles. To facilitate this aim, warhead monitoring systems employ both concepts of operations (CONOPS) and technologies. A systems evaluation approach can be used to assess the relative performance of CONOPS and technologies in their ability to achieve monitoring system objectives, which include: 1) confidence that a treaty accountable item (TAI) initialized by the monitoring system is as declared; 2) confidence that there is no undetected diversion from the monitoring system; and 3) confidence that a TAI is dismantled as declared. Although there are many quantitative methods that can be used to assess system performance against the above objectives, this paper focuses on a simulation perspective, primarily for its ability to support analysis of the probabilities that define the operating characteristics of CONOPS and technologies. This paper describes a discrete event simulation (DES) model comprised of three major sub-models: TAI lifecycle flow, monitoring activities, and declaration behavior. The DES model seeks to capture all processes and decision points associated with the progression of virtual TAIs, with notional characteristics, through the monitoring system from initialization through dismantlement. The simulation tracks TAI progression (i.e., whether the generated test objects are accepted or rejected at the appropriate points) all the way through dismantlement. Evaluation of TAI lifecycles primarily serves to assess how the order, frequency, and combination of functions in the CONOPS affect system performance as a whole. It is important to note, however, that discrete event simulation is also capable, at a basic level, of addressing vulnerabilities in the CONOPS and interdependencies between individual functions. 
This approach is beneficial because it does not rely on complex mathematical

  2. Control of discrete event systems modeled as hierarchical state machines

    NASA Technical Reports Server (NTRS)

    Brave, Y.; Heymann, M.

    1991-01-01

    The authors examine a class of discrete event systems (DESs) modeled as asynchronous hierarchical state machines (AHSMs). For this class of DESs, they provide an efficient method for testing reachability, which is an essential step in many control synthesis procedures. This method exploits the asynchronous nature and hierarchical structure of AHSMs, thereby illustrating the advantage of the AHSM representation over its equivalent (flat) state machine representation. An application of the method is presented in which an online, minimally restrictive solution is proposed for the problem of maintaining a controlled AHSM within prescribed legal bounds.
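
    For context, the baseline that the hierarchical method improves on is reachability over the flat transition relation. The sketch below is not the authors' AHSM algorithm, just a plain breadth-first reachability test on a small, invented flat state machine, to make the problem being optimized concrete:

```python
from collections import deque

def reachable(transitions, start, target):
    """BFS reachability on a flat state machine.

    transitions: dict mapping state -> iterable of (event, next_state).
    Returns True if target can be reached from start.
    """
    seen, frontier = {start}, deque([start])
    while frontier:
        s = frontier.popleft()
        if s == target:
            return True
        for _event, nxt in transitions.get(s, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

# Hypothetical flat machine for illustration.
fsm = {
    "idle":  [("start", "busy")],
    "busy":  [("done", "idle"), ("fault", "error")],
    "error": [("reset", "idle")],
}
```

On a flat machine this costs O(states + transitions); the point of the AHSM representation is that the flat state space explodes combinatorially, which is what the paper's hierarchical test avoids.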

  3. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department.

    PubMed

    Kittipittayakorn, Cholada; Ying, Kuo-Ching

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department. PMID:27195606
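
    The discrete event half of such a hybrid model can be illustrated with a toy single-server clinic. The sketch below is a generic DES with an event list, not the paper's orthopedic model; the arrival and service rates are invented for illustration:

```python
import heapq
import random

def simulate_clinic(n_patients=1000, mean_interarrival=5.0,
                    mean_service=4.0, seed=1):
    """Toy single-doctor DES: exponential arrivals and service, FIFO queue.
    Returns the average patient waiting time (minutes)."""
    rng = random.Random(seed)
    events = []                     # priority queue of (time, order, kind)
    t = 0.0
    for i in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, i, "arrival"))
    queue, busy, waits = [], False, []
    order = n_patients              # unique tie-breaker for departures
    while events:
        now, _, kind = heapq.heappop(events)
        if kind == "arrival":
            queue.append(now)       # remember arrival time
        else:
            busy = False            # doctor frees up
        if not busy and queue:      # start next service if possible
            arrived = queue.pop(0)
            waits.append(now - arrived)
            busy = True
            heapq.heappush(events,
                           (now + rng.expovariate(1.0 / mean_service),
                            order, "departure"))
            order += 1
    return sum(waits) / len(waits)

avg_wait = simulate_clinic()
```

Replacing the exponential draws with empirically fitted walk-in/scheduled arrival streams, and the FIFO rule with agent behaviors, is essentially the DES-to-ABS step the paper describes.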

  4. Numerical simulation of a rare winter hailstorm event over Delhi, India on 17 January 2013

    NASA Astrophysics Data System (ADS)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-12-01

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. The absence of increased surface temperature or low-level moisture incursion during winter cannot generate the deep convection required to sustain a hailstorm; consequently, the NCR sees very few hailstorms in December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm on 17 January 2013 (16:00-18:00 UTC) over the NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options: hail and graupel. The aim of the study is to understand and describe the cause of the hailstorm event over the NCR through a comparative analysis of the two GCE microphysics options. Upon evaluating the model simulations, it is observed that the hail option reproduces the precipitation intensity of the Tropical Rainfall Measuring Mission (TRMM) observation more closely than the graupel option does, and it is able to simulate hail precipitation. Using the model output with the hail option, a detailed investigation of the dynamics of the hailstorm is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones, as the cloud reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form; here they arose from low-level available potential energy and moisture incursion together with upper-level baroclinic instability due to the presence of a western disturbance (WD). This rare configuration is found to lower the tropopause and increase the temperature gradient, leading to winter hailstorm formation.

  5. Simulating rainfall-runoff dynamics of selected flash flood events in Slovakia using the KLEM hydrological model

    NASA Astrophysics Data System (ADS)

    Horvat, O.; Hlavcova, K.; Kohnova, S.; Szolgay, J.; Remiasova, R.

    2009-04-01

    The objective of the HYDRATE project (Hydrometeorological Data Resources and Technologies for Effective Flash Flood Forecasting) is to improve the scientific basis of flash flood forecasting by extending the understanding of past flash flood events and developing a coherent set of technologies and tools for effective early warning systems. To understand rainfall-runoff processes during extreme flash floods that occurred in Slovakia, runoff responses during selected major events were examined using the spatially distributed hydrological model KLEM (Kinematic Local Excess Model; Borga et al., 2007). The distributed hydrological model relies on raster information about landscape topography, soil and vegetation properties, and radar rainfall data. In the model, the SCS Curve Number procedure is applied on a grid for the spatially distributed representation of runoff-generating processes, and runoff routing is represented through a description of the drainage system response. Three extreme events selected from the HYDRATE flash-flood database were simulated with the model: the floods of 20 July 1998 in the Malá Svinka and Dubovický creeks, 24 July 2001 in the Štrbský Creek (both with more than 1000-year return periods) and 19 June 2004 in the Turniansky Creek (with a 100-year return period). The rainfall-runoff characteristics of the floods in the Malá Svinka, Dubovický and Štrbský creek basins were similar and the floods had a similar progress, with runoff coefficients varying from 0.39 to 0.56. In contrast, the highest runoff coefficient in the Turniansky Creek Basin reached only 0.26. The values simulated by the KLEM model were comparable with the maximum peaks estimated on the basis of post-event surveying. The consistency of the estimated and simulated values was evident both in time and space and the methodology has shown its
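
    The SCS Curve Number procedure mentioned above is a standard textbook relation, so it can be shown directly. The numbers in the example (a 100 mm storm on a CN = 80 cell) are invented for illustration and are not from the study:

```python
def scs_runoff_mm(p_mm, cn):
    """SCS Curve Number direct runoff (mm) for storm rainfall p_mm.

    S is the potential maximum retention in mm; Ia = 0.2*S is the
    standard initial-abstraction assumption. No runoff until P > Ia.
    """
    s = 25400.0 / cn - 254.0
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

q = scs_runoff_mm(100.0, 80)        # hypothetical 100 mm storm, CN = 80
runoff_coefficient = q / 100.0      # about 0.5 for this cell
```

Applied cell by cell over a raster of curve numbers, this is the gridded runoff-generation step; the resulting excess is then routed through the drainage network.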

  6. Computer simulation of engine systems

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1980-01-01

    The use of computerized simulations of the steady state and transient performance of jet engines throughout the flight regime is discussed. In addition, installation effects on thrust and specific fuel consumption are accounted for, as well as engine weight, dimensions and cost. The availability throughout government and industry of analytical methods for calculating these quantities is pointed out.

  7. Mutual Events in the Uranian satellite system in 2007

    NASA Astrophysics Data System (ADS)

    Arlot, J. E.

    2008-09-01

    The equinox time on the giant planets. When the Sun crosses the equatorial plane of a giant planet, it is equinox time, occurring every half orbit of the planet, i.e. every 6 years for Jupiter, 14 years for Saturn, 42 years for Uranus and 82 years for Neptune. Except for Neptune, each planet has several major satellites orbiting in its equatorial plane, so during the equinox period the satellites eclipse each other mutually. Since the Earth follows the Sun, a terrestrial observer will also see the satellites occult each other during the same period. These events may be observed with photometric receivers, since the light from the satellites decreases during the events. The light curve provides information on the geometric configuration of the satellites at the time of the event with an accuracy of a few kilometers, independent of the distance of the satellite system. We are thus able to obtain an astrometric observation with an accuracy several times better than direct imaging for positions. Equinox on Uranus in 2007. In 2007 it was equinox time on Uranus: the Sun crossed the equatorial plane of Uranus on December 6, 2007. Since the Uranus-Sun opposition was at the end of August 2007, observations were performed from May to December 2007. Since the declination of Uranus was between -5 and -6 degrees, observations were best made from the southern hemisphere. However, some difficulties had to be overcome: the faintness of the satellites (magnitude between 14 and 16) and the brightness of the planet (magnitude 5) make photometric observation of the satellites difficult. The use of a K' filter on a large telescope made it possible to increase the number of observable events. Dynamics of the Uranian satellites. One of the goals of the observations was to evaluate the accuracy of the current dynamical models of the motion of the satellites. 
This knowledge is important for several reasons: most of the time the Uranian system is

  8. Spatial and Temporal Signatures of Flux Transfer Events in Global Simulations of Magnetopause Dynamics

    NASA Technical Reports Server (NTRS)

    Kuznetsova, Maria M.; Sibeck, David Gary; Hesse, Michael; Berrios, David; Rastaetter, Lutz; Toth, Gabor; Gombosi, Tamas I.

    2011-01-01

    Flux transfer events (FTEs) were originally identified by transient bipolar variations of the magnetic field component normal to the nominal magnetopause, centered on enhancements in the total magnetic field strength. Recent Cluster and THEMIS multi-point measurements have provided a wide range of signatures that are interpreted as evidence of FTE passage (e.g., crater FTEs, traveling magnetic erosion regions). We use the global magnetohydrodynamic (MHD) code BATS-R-US, developed at the University of Michigan, to model the global three-dimensional structure and temporal evolution of FTEs during multi-spacecraft magnetopause crossing events. A comparison of observed and simulated signatures and a sensitivity analysis of the results to the probe location will be presented. We will demonstrate a variety of observable signatures in the magnetic field profile that depend on the space probe's location with respect to the FTE passage. The global structure of FTEs will be illustrated using advanced visualization tools developed at the Community Coordinated Modeling Center.

  9. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes is attracting increasing attention, as it can ensure safety and provide products of consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system provides an effective platform for research on batch-process fault diagnosis.

  10. Extended temperature-accelerated dynamics: Enabling long-time full-scale modeling of large rare-event systems

    SciTech Connect

    Bochenkov, Vladimir; Suetin, Nikolay; Shankar, Sadasivan

    2014-09-07

    A new method, Extended Temperature-Accelerated Dynamics (XTAD), is introduced for modeling the long-timescale evolution of large rare-event systems. The method is based on the Temperature-Accelerated Dynamics approach [M. Sørensen and A. Voter, J. Chem. Phys. 112, 9599 (2000)], but uses full-scale parallel molecular dynamics simulations to probe the potential energy surface of the entire system, combined with adaptive on-the-fly system decomposition for analyzing the energetics of rare events. The method removes limitations on feasible system size and can handle simultaneous diffusion events, including both large-scale concerted and local transitions. Due to its intrinsically parallel algorithm, XTAD not only allows studies of various diffusion mechanisms in solid-state physics, but also opens the avenue for atomistic simulations of a range of technologically relevant processes in materials science, such as thin-film growth on nano- and microstructured surfaces.
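
    The core trick inherited from TAD is the harmonic transition-state-theory extrapolation of event times between temperatures: an event observed at time t_high in hot MD is mapped to t_high · exp(Ea/kB · (1/T_cold − 1/T_hot)) at the temperature of interest. A quick worked example (the barrier and times below are invented for illustration):

```python
import math

K_B = 8.617333e-5  # Boltzmann constant, eV/K

def extrapolate_event_time(t_high, barrier_ev, t_hot, t_cold):
    """TAD-style harmonic-TST extrapolation of an event time from a
    high-temperature MD run to a lower target temperature."""
    return t_high * math.exp(barrier_ev / K_B * (1.0 / t_cold - 1.0 / t_hot))

# Hypothetical numbers: a 0.5 eV barrier crossed at 1 ns of 900 K MD
# corresponds to a waiting time of a few tenths of a millisecond at 300 K.
t_300 = extrapolate_event_time(1e-9, 0.5, 900.0, 300.0)
```

The five-orders-of-magnitude stretch in this toy case is exactly the acceleration that makes the approach attractive for rare-event systems.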

  11. Near-relativistic electron events. Monte Carlo simulations of solar injection and interplanetary transport

    NASA Astrophysics Data System (ADS)

    Agueda, N.

    2008-04-01

    We have developed a Monte Carlo model to simulate the transport of solar near-relativistic (NR; 30-300 keV) electrons along the interplanetary magnetic field (IMF), including adiabatic focusing, pitch-angle dependent scattering, and solar wind effects. By taking into account the angular response of the LEFS60 telescope of the EPAM experiment on board the Advanced Composition Explorer spacecraft, we have been able to transform simulated pitch-angle distributions into sectored intensities as measured by the telescope. We have developed an algorithm that allows us, for the first time, to infer the best-fit transport conditions and the underlying solar injection profile of NR electrons from the deconvolution of observed sectored intensities. We have studied seven NR electron events observed by the LEFS60 telescope between 1998 and 2004 with the aim of estimating the roles that solar flares and CME-driven shocks play in the acceleration and injection of NR electrons, as well as the conditions of electron transport along the IMF. In this set of seven NR electron events, we have identified two types of injection episodes in the derived injection profiles: short (<15 min) and time-extended (>1 h). The injection profiles of three events show both components, an initial injection episode of short duration followed by a second, much longer-lasting episode; two events show only a time-extended injection episode; and the others show injection profiles composed of several short injection episodes. By comparing the timing of the injection with the associated electromagnetic emissions at the Sun, we conclude that short injection episodes are preferentially associated with the injection of flare-accelerated particles, while longer-lasting episodes are provided by CME-driven shocks.

  12. Role of land state in a high resolution mesoscale model for simulating the Uttarakhand heavy rainfall event over India

    NASA Astrophysics Data System (ADS)

    Rajesh, P. V.; Pattnaik, S.; Rai, D.; Osuri, K. K.; Mohanty, U. C.; Tripathy, S.

    2016-04-01

    In 2013, the Indian summer monsoon witnessed a very heavy rainfall event (>30 cm/day) over Uttarakhand in north India, claiming more than 5000 lives and causing property damage worth approximately 40 billion USD. This event was associated with the interaction of two synoptic systems: an intensified subtropical westerly trough over north India and a north-westward moving monsoon depression formed over the Bay of Bengal. The event occurred over highly variable terrain and land surface characteristics. Although global models predicted the large-scale event, they failed to predict the realistic location, timing, amount, intensity and distribution of rainfall over the region. The goal of this study is to assess the impact of land state conditions in simulating this severe event using a high resolution mesoscale model. Land conditions such as multi-layer soil moisture and soil temperature fields were generated with the High Resolution Land Data Assimilation System (HRLDAS). Two experiments were conducted with the Weather Research and Forecasting (WRF) modelling system: (1) CNTL (control, without land data assimilation) and (2) LDAS, with land data assimilation (i.e., with HRLDAS-based soil moisture and temperature fields). The initial soil moisture correlation and root mean square error are 0.73 and 0.05 for LDAS versus 0.63 and 0.053 for CNTL, with a stronger heat low in LDAS. The differences in wind and moisture transport in LDAS favoured increased moisture transport from the Arabian Sea through a convectively unstable region embedded between two low pressure centers over the Arabian Sea and the Bay of Bengal. The improvement in rainfall is significantly correlated with the persistent generation of potential vorticity (PV) in LDAS. Further, PV tendency analysis confirmed that the increased generation of PV is due to the enhanced horizontal PV advection component rather than the diabatic heating terms, owing to modified flow fields. These results suggest that two

  13. Fast and robust microseismic event detection using very fast simulated annealing

    NASA Astrophysics Data System (ADS)

    Velis, Danilo R.; Sabbione, Juan I.; Sacchi, Mauricio D.

    2013-04-01

    The study of microseismic data has become an essential tool in many geoscience fields, including oil reservoir geophysics, mining and CO2 sequestration. In hydraulic fracturing, microseismicity studies permit the characterization and monitoring of reservoir dynamics in order to optimize production and the fluid injection process itself. As the number of events is usually large and the signal-to-noise ratio is in general very low, fast, automated and robust detection algorithms are required for most applications. Real-time functionality is also commonly needed to control the fluid injection in the field. Generally, events are located by means of grid search algorithms that rely on some approximate velocity model. These techniques are very effective and accurate, but computationally intensive when dealing with large three- or four-dimensional grids. Here, we present a fast and robust method that automatically detects and picks an event in 3C microseismic data without any input information about the velocity model. The detection is carried out by means of a very fast simulated annealing (VFSA) algorithm. To this end, we define an objective function that measures the energy of a potential microseismic event along the multichannel signal. This objective function is based on the stacked energy of the envelopes of the signals, calculated within a predefined narrow time window that depends on the source position, receiver geometry and velocity. Once an event has been detected, the source location can be estimated, in a second stage, by inverting the corresponding traveltimes using a standard technique, which would naturally require some knowledge of the velocity model. Since the proposed technique focuses only on the detection of the microseismic events, the velocity model is not required, leading to a fast algorithm that carries out the detection in real time. 
Besides, the strategy is applicable to data with very low signal-to-noise ratios, for it relies
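
    The idea of annealing over a stacked-energy objective can be sketched on synthetic data. The example below is a simplified stand-in for the paper's VFSA, not a reproduction of it: it uses plain simulated annealing with geometric cooling over a single unknown (the event onset time), a fixed per-channel moveout in place of a traveltime model, and squared amplitude in place of a Hilbert-transform envelope. All numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 8-channel record: a common pulse arriving with a fixed
# per-channel moveout of 5 samples, buried in light noise.
n_ch, n_s, t0, moveout, w = 8, 500, 200, 5, 20
pulse = np.hanning(w)
data = 0.05 * rng.standard_normal((n_ch, n_s))
for ch in range(n_ch):
    data[ch, t0 + moveout * ch : t0 + moveout * ch + w] += pulse

def stacked_energy(t):
    """Objective: total energy inside the moveout-corrected window."""
    return sum(float(np.sum(data[ch, t + moveout * ch:
                                     t + moveout * ch + w] ** 2))
               for ch in range(n_ch))

# Simulated annealing over the candidate onset time: local integer
# moves, Metropolis acceptance, geometric cooling, best-so-far kept.
t_cur = int(rng.integers(0, 400))
e_cur = stacked_energy(t_cur)
t_best, e_best = t_cur, e_cur
temp = 1.0
for _ in range(3000):
    t_new = int(np.clip(t_cur + rng.integers(-20, 21), 0, 400))
    e_new = stacked_energy(t_new)
    if e_new > e_cur or rng.random() < np.exp((e_new - e_cur) / temp):
        t_cur, e_cur = t_new, e_new
        if e_cur > e_best:
            t_best, e_best = t_cur, e_cur
    temp *= 0.999
```

The annealer evaluates the objective at only a few thousand candidate times rather than over a full space-time grid, which is the source of the speed-up the abstract describes.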

  14. Simulations of The Extreme Precipitation Event Enhanced by Sea Surface Temperature Anomaly over the Black Sea

    NASA Astrophysics Data System (ADS)

    Hakan Doǧan, Onur; Önol, Barış

    2016-04-01

    Istanbul Technical University, Aeronautics and Astronautics Faculty, Meteorological Engineering, Istanbul, Turkey. In this study, we examined an extreme precipitation case over the Eastern Black Sea region of Turkey using the regional climate model RegCM4. The flood caused by excessive rain on 26 August 2010 killed 12 people, and the landslides in Rize province damaged many buildings. The station-based two-day total precipitation exceeded 200 mm. One of the usual suspects for this extreme event is the positive sea surface temperature (SST) anomaly over the Black Sea, where a significant warming trend has been clear over the last three decades. In August 2010, the monthly mean SST was more than 3 °C above the 1981-2010 mean. We designed three sensitivity simulations with RegCM4 to quantify the effect of the Black Sea as a moisture source. The simulation domain, with 10-km horizontal resolution, covers all the countries bordering the Black Sea, and the simulation period spans the whole of August 2010. The spatial variability of the precipitation produced by the reference simulation (Sim-0) is consistent with the TRMM data. To analyse the sensitivity to SST, we forced the simulations by subtracting 1 °C (Sim-1), 2 °C (Sim-2) and 3 °C (Sim-3) from the ERA-Interim 6-hourly SST data (over the Black Sea only). The sensitivity simulations indicate that daily total precipitation gradually decreases relative to the reference simulation (Sim-0): 3-hourly maximum precipitation rates for Sim-0, Sim-1, Sim-2 and Sim-3 are 32, 25, 13 and 10.5 mm, respectively, over the hotspot region. Although all simulations point in the same direction, the degradation of precipitation intensity is not of the same magnitude in all of them; the 2 °C (Sim-2) threshold proves critical for the SST sensitivity. We also calculated the humidity differences from the simulations, and these

  15. INTEGRATED SYSTEM SIMULATION IN X-RAY RADIOGRAPHY

    SciTech Connect

    T. KWAN; ET AL

    2001-01-01

    An integrated simulation capability is being developed to examine the fidelity of a dynamic radiographic system. This capability consists of a suite of simulation codes which individually model electromagnetic and particle transport phenomena and are chained together to model an entire radiographic event. Our study showed that the electron beam spot size at the converter target plays the key role in determining material edge locations. The angular spectrum is a relatively insensitive factor in radiographic fidelity. We also found that the full energy spectrum of the imaging photons must be modeled to obtain an accurate analysis of material densities.

  16. Power electronics system modeling and simulation

    SciTech Connect

    Lai, Jih-Sheng

    1994-12-31

    This paper introduces control-system-design software packages, SIMNON and MATLAB/SIMULINK, for power electronics system simulation. A complete power electronics system typically consists of a rectifier bridge with its smoothing capacitor, an inverter, and a motor. The system components, whether discrete or continuous, linear or nonlinear, are modeled with mathematical equations. Inverter control methods, such as pulse-width modulation and hysteresis current control, are expressed either as computer algorithms or as digital circuits. After describing the component models and control methods, computer programs are developed for complete system simulation. Simulation results are used mainly to study system performance, such as input and output current harmonics, torque ripple, and speed response. Key computer programs and simulation results are demonstrated for educational purposes.
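A hysteresis current controller of the kind mentioned above can indeed be expressed as a short algorithm; the RL-load parameters and the forward-Euler integration below are illustrative assumptions, not the paper's SIMNON or SIMULINK models.

```python
def simulate_hysteresis(i_ref, band, v_dc, r, l, dt, steps):
    """Hysteresis current control of an RL load fed by a two-level
    inverter leg: apply +Vdc when the current drops below the lower
    band limit, -Vdc when it exceeds the upper limit.
    Returns the simulated current samples."""
    i, v, out = 0.0, v_dc, []
    for _ in range(steps):
        if i > i_ref + band:
            v = -v_dc
        elif i < i_ref - band:
            v = v_dc
        i += (v - r * i) / l * dt  # forward-Euler step of L di/dt = v - R i
        out.append(i)
    return out

# Track a 10 A reference with a +/-0.5 A band (illustrative parameters).
current = simulate_hysteresis(10.0, 0.5, 100.0, 1.0, 0.01, 1e-5, 20000)
```

Once settled, the current chatters inside the hysteresis band around the reference, which is the behavior such control algorithms are simulated to verify.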

  17. Using Discrete Event Simulation to predict KPI's at a Projected Emergency Room.

    PubMed

    Concha, Pablo; Neriz, Liliana; Parada, Danilo; Ramis, Francisco

    2015-01-01

    Discrete Event Simulation (DES) is a powerful tool in the design of clinical facilities. DES enables facilities to be built or adapted to achieve expected Key Performance Indicators (KPIs), such as average waiting times by acuity, average stay times, and others. Our computational model was built and validated using expert judgment and supporting statistical data. One scenario studied resulted in a 50% decrease in the average patient cycle time compared to the original model, mainly by modifying the patient attention model. PMID:26262262
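The core of such a DES model (stochastic arrivals, parallel treatment boxes, waiting-time KPIs) can be sketched with a priority queue; the Poisson/exponential distributions and parameter values below are illustrative assumptions, not the validated hospital model.

```python
import heapq
import random

def simulate_er(n_patients, arrival_rate, service_rate, n_boxes, seed=1):
    """Minimal discrete event simulation of an emergency room: Poisson
    arrivals, exponential treatment times, n_boxes parallel treatment
    boxes, first-come first-served. Returns the average waiting time."""
    rng = random.Random(seed)
    free_at = [0.0] * n_boxes        # next-free time of each box
    heapq.heapify(free_at)
    t, waits = 0.0, []
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)       # next arrival time
        start = max(t, free_at[0])               # earliest free box
        waits.append(start - t)
        heapq.heapreplace(free_at, start + rng.expovariate(service_rate))
    return sum(waits) / len(waits)

# Adding treatment boxes shortens the average wait (a KPI of interest).
w3 = simulate_er(5000, arrival_rate=1.0, service_rate=0.4, n_boxes=3)
w5 = simulate_er(5000, arrival_rate=1.0, service_rate=0.4, n_boxes=5)
```

Scenario comparison of this kind (vary capacity or routing, re-run, compare KPIs) is how a projected facility is sized before it is built.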

  18. Simulations of Wave Propagation in the Jovian Atmosphere after SL9 Impact Events

    NASA Astrophysics Data System (ADS)

    Pond, Jarrad W.; Palotai, C.; Korycansky, D.; Harrington, J.

    2013-10-01

    Our previous numerical investigations into Jovian impacts, including the Shoemaker-Levy 9 (SL9) event (Korycansky et al. 2006, ApJ 646, 642; Palotai et al. 2011, ApJ 731, 3), the 2009 bolide (Pond et al. 2012, ApJ 745, 113), and the ephemeral flashes caused by smaller impactors in 2010 and 2012 (Hueso et al. 2013, submitted to A&A), have covered only up to approximately 3 to 30 seconds after impact. Here, we present further SL9 impact simulations extending to minutes after collision with Jupiter's atmosphere, with a focus on the propagation of shock waves generated as a result of the impact events. Using a similar yet more efficient remapping method than previously presented (Pond et al. 2012; DPS 2012), we move our simulation results onto a larger computational grid, conserving quantities with minimal error. The Jovian atmosphere is extended as needed to accommodate the evolution of the features of the impact event. We restart the simulation, allowing the impact event to continue to progress to greater spatial extents and for longer times, but at lower resolutions. This remap-restart process can be implemented multiple times to achieve the spatial and temporal scales needed to investigate the observable effects of waves generated by the deposition of energy and momentum into the Jovian atmosphere by an SL9-like impactor. As before, we use the three-dimensional, parallel hydrodynamics code ZEUS-MP 2 (Hayes et al. 2006, ApJS 165, 188) to conduct our simulations. Wave characteristics are tracked throughout these simulations. Of particular interest are the wave speeds and wave positions in the atmosphere as a function of time. These properties are compared to the characteristics of the HST rings to see if shock wave behavior within one hour of impact is consistent with waves observed at one hour post-impact and beyond (Hammel et al. 1995, Science 267, 1288).
This research was supported by National Science Foundation Grant AST-1109729 and NASA Planetary Atmospheres Program Grant

  19. Global Positioning System Simulator Field Operational Procedures

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Quinn, David A.; Day, John H. (Technical Monitor)

    2002-01-01

    Global Positioning System (GPS) simulation is an important activity in the development or qualification of GPS signal receivers for space flight. Because a GPS simulator is a critical resource, it is highly desirable to develop a set of field operational procedures to supplement the basic procedures provided by most simulator vendors. Validated field procedures allow better utilization of the GPS simulator in the development of new test scenarios and simulation operations. These procedures expedite simulation scenario development while resulting in scenarios that are more representative of the true design, as well as enabling construction of more complex simulations than previously possible, for example, spacecraft maneuvers. One difficulty in the development of a simulation scenario is specifying the various modes of test vehicle motion and associated maneuvers, which requires that a user specify some (but not all) of a few closely related simulation parameters. Currently this can only be done by trial and error. A stand-alone procedure that implements the simulator maneuver motion equations and solves for the motion profile transient times, jerk, and acceleration would be of considerable value. Another procedure would permit the specification of configuration parameters that determine the simulated GPS signal composition; the resulting signal navigation message, for example, would force the receiver under test to use only the intended C/A-code component of the simulated GPS signal. A representative class of GPS simulation-related field operational procedures is described in this paper. These procedures were developed and used in support of GPS integration and testing for many successful spacecraft missions such as SAC-A, EO-1, AMSAT, VCL, SeaStar, and sounding rockets, using the industry-standard Spirent Global Simulation Systems Incorporated (GSSI) STR-series simulators.
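A stand-alone procedure of the kind proposed, solving a motion profile for its transient times given jerk and acceleration limits, might look like the following generic jerk-limited (S-curve) sketch; it is not the vendor's actual maneuver equations.

```python
def s_curve_times(delta_v, a_max, j_max):
    """Transient times of a jerk-limited (S-curve) velocity change:
    returns (t_jerk, t_accel), where t_jerk is the duration of each of
    the two jerk phases and t_accel the constant-acceleration phase.
    If delta_v is too small to reach a_max, the profile is triangular."""
    t_j = a_max / j_max
    if delta_v >= a_max * t_j:       # trapezoidal acceleration profile
        t_a = delta_v / a_max - t_j
    else:                            # triangular: a_max never reached
        t_j = (delta_v / j_max) ** 0.5
        t_a = 0.0
    return t_j, t_a

# A 10 m/s velocity change with a_max = 2 m/s^2 and j_max = 1 m/s^3
# needs two 2 s jerk phases around a 3 s constant-acceleration phase.
t_jerk, t_accel = s_curve_times(10.0, 2.0, 1.0)
```

With such a closed-form solver, the trial-and-error tuning of closely related maneuver parameters mentioned above is no longer necessary.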

  20. Simulation, Design Abstraction, and SystemC

    ERIC Educational Resources Information Center

    Harcourt, Ed

    2007-01-01

    SystemC is a system-level design and simulation language based on C++. We've been using SystemC for computer organization and design projects for the past several years. Because SystemC is embedded in C++ it contains the powerful abstraction mechanisms of C++ not found in traditional hardware description languages, such as support for…

  1. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    SciTech Connect

    D.L. McGregor

    2000-12-20

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  2. Characteristics of flight simulator visual systems

    NASA Technical Reports Server (NTRS)

    Statler, I. C. (Editor)

    1981-01-01

    The physical parameters of the flight simulator visual system that characterize the system and determine its fidelity are identified and defined. The characteristics of visual simulation systems are discussed in terms of the basic categories of spatial, energy, and temporal properties, corresponding to the three fundamental quantities of length, mass, and time. Each of these parameters is further addressed in relation to its effect, its appropriate units or descriptors, methods of measurement, and its use or importance to image quality.

  3. Simulating Rain Fade In A Communication System

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Kurt A.; Nagy, Lawrence A.; Svoboda, James K.

    1994-01-01

    Automated, computer-controlled assembly of electronic equipment developed for use in simulation testing of downlink portion of Earth/satellite microwave digital communication system. Designed to show effects upon performance of system of rain-induced fading in received signal and increases in transmitted power meant to compensate for rain-induced fading. Design of communication system improved iteratively in response to results of simulations, leading eventually to design ensuring clear, uninterrupted transmission of digital signals.

  4. µπ: A Scalable and Transparent System for Simulating MPI Programs

    SciTech Connect

    Perumalla, Kalyan S

    2010-01-01

    µπ is a scalable, transparent system for experimenting with the execution of parallel programs on simulated computing platforms. The level of simulated detail can be varied for application behavior as well as for machine characteristics. Unique features of µπ are repeatability of execution, scalability to millions of simulated (virtual) MPI ranks, scalability to hundreds of thousands of host (real) MPI ranks, portability of the system to a variety of host supercomputing platforms, and the ability to experiment with scientific applications whose source code is available. The set of source-code interfaces supported by µπ is being expanded to support a wider set of applications, and MPI-based scientific computing benchmarks are being ported. In proof-of-concept experiments, µπ has been successfully exercised to spawn and sustain very large-scale executions of an MPI test program given in source-code form. Low slowdowns are observed, due to its purely discrete event style of execution and to the scalability and efficiency of the underlying parallel discrete event simulation engine, µsik. In the largest runs, µπ has been executed on up to 216,000 cores of a Cray XT5 supercomputer, successfully simulating over 27 million virtual MPI ranks, each virtual rank containing its own thread context, and all ranks fully synchronized by virtual time.

  5. Simulation of the characteristics of low-energy proton induced single event upset

    NASA Astrophysics Data System (ADS)

    Geng, Chao; Xi, Kai; Liu, TianQi; Liu, Jie

    2014-10-01

    Monte Carlo simulation results are reported on the single event upset (SEU) triggered by the direct ionization of low-energy protons. The SEU cross-sections for a 45 nm static random access memory (SRAM) were compared with previous work, which not only validated the simulation approach used herein but also exposed the existence of a saturated cross-section and of multiple bit upsets (MBUs) when the incident energy is below 1 MeV. Additionally, it was observed that the saturated cross-section and the MBUs are related to the energy loss and the critical charge. The amount of deposited charge and its distribution with respect to the critical charge are discussed as supporting evidence.
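The Monte Carlo logic (compare the charge each proton deposits against the critical charge, then scale the upset fraction by the sensitive area) can be sketched as follows; the exponential charge-deposition model is a placeholder assumption, not the transport physics used in the study.

```python
import random

def seu_cross_section(n_protons, q_crit_fc, mean_q_fc, sens_area_cm2, seed=7):
    """Monte Carlo sketch of an SEU cross-section estimate: each proton
    deposits an exponentially distributed charge (placeholder straggling
    model, in fC); an upset is counted when the deposited charge exceeds
    the critical charge. Cross-section = upset fraction x sensitive area."""
    rng = random.Random(seed)
    upsets = sum(rng.expovariate(1.0 / mean_q_fc) > q_crit_fc
                 for _ in range(n_protons))
    return upsets / n_protons * sens_area_cm2

# Raising the critical charge (a harder cell) shrinks the cross-section.
sigma_lo = seu_cross_section(100_000, q_crit_fc=1.0, mean_q_fc=1.0,
                             sens_area_cm2=1.0)
sigma_hi = seu_cross_section(100_000, q_crit_fc=2.0, mean_q_fc=1.0,
                             sens_area_cm2=1.0)
```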

  6. Simulation of temporal characteristics of ion-velocity susceptibility to single event upset effect

    NASA Astrophysics Data System (ADS)

    Geng, Chao; Xi, Kai; Liu, Tian-Qi; Gu, Song; Liu, Jie

    2014-08-01

    Using a Monte Carlo simulation tool, the multi-functional package for SEEs analysis (MUFPSA), we study the temporal characteristics of ion-velocity susceptibility to the single event upset (SEU) effect, including the deposited energy, the traversal time within the device, and the profile of the current pulse. The results show that the average deposited energy decreases with increasing ion velocity, and that incident 209Bi ions have a wider distribution of energy deposition than 132Xe at the same ion velocity. Additionally, the traversal time shows a clear decreasing trend with increasing ion velocity. The ion velocity also influences the current pulse, which exhibits a regular pattern. A detailed discussion is conducted to estimate the relevant linear energy transfer (LET) of the incident ions and the SEU cross section of the test device from experiment and simulation, and to critically consider LET as a metric.

  7. Assessment of extreme precipitation events over Amazon simulated by global climate models from HIGEM family.

    NASA Astrophysics Data System (ADS)

    Custodio, Maria; Ambrizzi, Tercio; da Rocha, Rosmeri

    2015-04-01

    coupled and uncoupled models capture the observed signal of the ENSO and MJO oscillations, although with reversed phase in some cases. The austral summer and winter composites of interannual and intraseasonal anomalies showed, for wet and dry extreme events, the same spatial distribution in models and reanalyses. The interannual variability analysis showed that coupled simulations intensify the impact of the El Niño-Southern Oscillation (ENSO) in the Amazon. On the intraseasonal scale, although the simulations intensify this signal, the coupled models show greater similarity to observations than the atmospheric models for precipitation extremes. There are, however, differences between simulated and observed intraseasonal anomalies, indicating that the models have problems correctly representing the intensity of low-frequency phenomena on this scale. The models' ability to simulate ENSO can be attributed to their high resolution, mainly in the oceanic component, which contributes to better resolution of small-scale ocean vortices. This implies improvements in the forecasting of sea surface temperature (SST) and, as a consequence, in the ability of the atmosphere to respond to this feature.

  8. Global positioning system interference and satellite anomalous event monitor

    NASA Astrophysics Data System (ADS)

    Marti, Lukas M.

    Global Positioning System satellite Signal Quality Monitoring (SQM) is required to ensure the integrity of the received signal for aviation safety-critical systems. Failure mitigation is not addressed, since failure detection ensures system integrity. The GPS Anomalous Event Monitor (GAEM) is introduced, consisting of a GPS receiver serving as an anomaly sensor and a Software Defined Radio that allows a thorough analysis of signal malfunction modes through advanced signal processing techniques. Algorithms are developed for the anomaly sensor to monitor the GPS signal; in case of possible signal inconsistencies the signal is analyzed by the Software Defined Radio. For quality monitoring it is essential to understand the impact of the radio frequency front-end on the received signal and, implicitly, on the signal parameter estimation process; otherwise a signal inconsistency may be flagged that was in fact induced by the monitoring system. Thus, radio frequency front-end induced errors are examined and the statistics of the signal parameter estimators are derived. As the statistics of an anomalous signal are unknown, a non-parametric, non-homoscedastic (unequal variances across the sample space) statistical test is developed. Berry-Esseen bounds are introduced to quantify convergence and to establish confidence levels. The algorithm is applied to the detection of signal anomalies, with emphasis on interference detection. The algorithms to detect GPS signal anomalies are verified with experimental data. The performance of the interference detection algorithms is demonstrated through data collection in a shielded measurement chamber. Actual GPS signals in combination with interference sources such as narrowband, wideband and pulsed interference were broadcast in the chamber. Subsequently, case studies from continuous GPS monitoring are included and observed anomalies are discussed. 
The performance demonstration of the GPS anomalous event monitor is concluded with a

  9. The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model

    PubMed Central

    2012-01-01

    Background In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma but in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. Methods We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Results Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. Conclusions We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. 
Our model simulates the effect of environmental factors (combustion pollutants and allergens), medication compliance, seasonality

  10. Computer simulation of breathing systems for divers

    SciTech Connect

    Sexton, P.G.; Nuckols, M.L.

    1983-02-01

    A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on breathing system performance are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.

  11. Assessing and Optimizing Microarchitectural Performance of Event Processing Systems

    NASA Astrophysics Data System (ADS)

    Mendes, Marcelo R. N.; Bizarro, Pedro; Marques, Paulo

    Event Processing (EP) systems are being progressively used in business critical applications in domains such as algorithmic trading, supply chain management, production monitoring, or fraud detection. To deal with high throughput and low response time requirements, these EP systems mainly use the CPU-RAM sub-system for data processing. However, as we show here, collected statistics on CPU usage and on CPU-RAM communication reveal that available systems are poorly optimized and grossly waste resources. In this paper we quantify some of these inefficiencies and propose cache-aware algorithms and changes to internal data structures to overcome them. We test the system before and after the changes, at both the microarchitecture and application levels, and show that: i) the changes improve microarchitecture metrics such as clocks-per-instruction, cache misses, and TLB misses; and ii) some of these improvements yield very large application-level gains, such as a 44% improvement on stream-to-table joins with a 6-fold reduction in memory consumption, and an order-of-magnitude increase in throughput for moving aggregation operations.
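One cache-aware change of the kind described, replacing pointer-chasing per-event nodes with a preallocated contiguous ring buffer for moving aggregations, can be sketched as follows; this is a generic illustration, not the authors' engine internals.

```python
from array import array

class MovingSum:
    """O(1) moving-window sum over a stream, backed by a preallocated
    contiguous array of doubles (a cache-friendly ring buffer) instead
    of a linked list of per-event nodes."""
    def __init__(self, window):
        self.buf = array("d", [0.0] * window)   # contiguous storage
        self.window, self.i, self.n, self.total = window, 0, 0, 0.0

    def push(self, x):
        if self.n == self.window:
            self.total -= self.buf[self.i]      # evict oldest in place
        else:
            self.n += 1
        self.buf[self.i] = x
        self.i = (self.i + 1) % self.window
        self.total += x
        return self.total
```

Each push touches a single slot of a contiguous buffer, so the working set stays resident in cache regardless of stream length, which is the kind of locality improvement the microarchitecture metrics above measure.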

  12. Solar simulator for concentrator photovoltaic systems.

    PubMed

    Domínguez, César; Antón, Ignacio; Sala, Gabriel

    2008-09-15

    A solar simulator for measuring the performance of large-area concentrator photovoltaic (CPV) modules is presented. Its illumination system is based on a Xenon flash light and a large-area collimator mirror, which simulates natural sunlight. The quality requirements imposed by CPV systems have been characterized: irradiance level and uniformity at the receiver, light collimation, and spectral distribution. The simulator allows fast and cost-effective indoor performance characterization and classification of CPV systems at the production line, as well as module rating carried out by laboratories. PMID:18795026

  13. An interactive beam position monitor system simulator

    SciTech Connect

    Ryan, W.A.; Shea, T.J.

    1993-03-01

    A system simulator has been implemented to aid the development of the RHIC position monitor system. Based on the LabVIEW software package by National Instruments, this simulator allows engineers and technicians to interactively explore the parameter space of a system during the design phase. Adjustable parameters are divided into three categories: beam, pickup, and electronics. The simulator uses these parameters in simple formulas to produce results in both the time domain and the frequency domain. During the prototyping phase, these simulated results can be compared to test data acquired with the same software package. The RHIC position monitor system is presented as an example, but the software is applicable to several other systems as well.

  14. ROBOSIM, a simulator for robotic systems

    NASA Technical Reports Server (NTRS)

    Hinman, Elaine M.; Fernandez, Ken; Cook, George E.

    1991-01-01

    ROBOSIM, a simulator for robotic systems, was developed by NASA to aid in the rapid prototyping of automation. ROBOSIM has allowed the development of improved robotic systems concepts for both earth-based and proposed on-orbit applications while significantly reducing development costs. In a cooperative effort with an area university, ROBOSIM was further developed for use in the classroom as a safe and cost-effective way of allowing students to study robotic systems. Students have used ROBOSIM to study existing robotic systems and systems which they have designed in the classroom. Since an advanced simulator/trainer of this type is beneficial not only to NASA projects and programs but to industry and academia as well, NASA is in the process of developing this technology for wider public use. An update on the simulator's new application areas, the improvements made to the simulator's design, and current efforts to ensure the timely transfer of this technology are presented.

  15. The role of regional climate model setup in simulating two extreme precipitation events in the European Alpine region

    NASA Astrophysics Data System (ADS)

    Awan, Nauman Khurshid; Gobiet, Andreas; Suklitsch, Martin

    2015-01-01

    In this study we investigated the role of domain settings and model physics in simulating two extreme precipitation events. Four regional climate models, all driven with a re-analysis dataset, were used to create an ensemble of 61 high-resolution simulations by varying physical parameterization schemes, domain sizes, nudging and nesting techniques. The two discussed events are three-day time slices taken from approximately 15-month-long climate simulations. The results show that dynamical downscaling significantly improves spatial characteristics such as correlation and variability as well as the location and intensity of maximum precipitation. Spatial variability, which is underestimated by most of the simulations, can be improved by choosing a suitable vertical resolution and suitable convection and microphysics schemes. The results further suggest that for studies focusing on extreme precipitation events relatively small domains or nudging could be advantageous. However, a final conclusion on this issue would be premature, since only two extreme precipitation events were considered.

  17. Simulation of large systems with neural networks

    SciTech Connect

    Paez, T.L.

    1994-09-01

    Artificial neural networks (ANNs) have been shown capable of simulating the behavior of complex, nonlinear, systems, including structural systems. Under certain circumstances, it is desirable to simulate structures that are analyzed with the finite element method. For example, when we perform a probabilistic analysis with the Monte Carlo method, we usually perform numerous (hundreds or thousands of) repetitions of a response simulation with different input and system parameters to estimate the chance of specific response behaviors. In such applications, efficiency in computation of response is critical, and response simulation with ANNs can be valuable. However, finite element analyses of complex systems involve the use of models with tens or hundreds of thousands of degrees of freedom, and ANNs are practically limited to simulations that involve far fewer variables. This paper develops a technique for reducing the amount of information required to characterize the response of a general structure. We show how the reduced information can be used to train a recurrent ANN. Then the trained ANN can be used to simulate the reduced behavior of the original system, and the reduction transformation can be inverted to provide a simulation of the original system. A numerical example is presented.
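The reduce-train-invert workflow described above can be sketched with an SVD (proper orthogonal decomposition) basis; the synthetic rank-3 snapshot matrix below stands in for finite element response data, and the ANN training step itself is omitted.

```python
import numpy as np

# Synthetic stand-in for finite element response snapshots:
# 500 degrees of freedom x 200 time samples, rank 3 by construction.
rng = np.random.default_rng(3)
response = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 200))

# Reduce: keep the r dominant left singular vectors as a basis.
U, s, _ = np.linalg.svd(response, full_matrices=False)
r = 3
basis = U[:, :r]                  # reduction transformation
reduced = basis.T @ response      # r x 200 series: what the ANN would learn

# Invert the reduction to recover a full-order response simulation.
reconstructed = basis @ reduced
rel_err = np.linalg.norm(response - reconstructed) / np.linalg.norm(response)
```

The 500-variable response is characterized by only 3 reduced coordinates, which is the scale at which training a small recurrent network becomes practical.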

  18. Dynamic system simulation of small satellite projects

    NASA Astrophysics Data System (ADS)

    Raif, Matthias; Walter, Ulrich; Bouwmeester, Jasper

    2010-11-01

    A prerequisite to accomplish a system simulation is to have a system model holding all necessary project information in a centralized repository that can be accessed and edited by all parties involved. At the Institute of Astronautics of the Technische Universitaet Muenchen a modular approach for modeling and dynamic simulation of satellite systems has been developed called dynamic system simulation (DySyS). DySyS is based on the platform independent description language SysML to model a small satellite project with respect to the system composition and dynamic behavior. A library of specific building blocks and possible relations between these blocks have been developed. From this library a system model of the satellite of interest can be created. A mapping into a C++ simulation allows the creation of an executable system model with which simulations are performed to observe the dynamic behavior of the satellite. In this paper DySyS is used to model and simulate the dynamic behavior of small satellites, because small satellite projects can act as a precursor to demonstrate the feasibility of a system model since they are less complex compared to a large scale satellite project.

  19. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGESBeta

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  1. Simulation of rainfall-runoff for major flash flood events in Karachi

    NASA Astrophysics Data System (ADS)

    Zafar, Sumaira

    2016-07-01

    The metropolitan city of Karachi has strategic importance for Pakistan. With each passing decade the city has faced urban sprawl and rapid population growth. These rapid changes directly affect the natural resources of the city, including its drainage pattern. Karachi is drained by two major rivers: the Malir River, with a catchment area of 2252 sq km, and the Lyari River, with a catchment area of about 470.4 sq km. These are non-perennial rivers, active only during storms. The conversion of natural surfaces into hard pavement is increasing the rainfall-runoff response: the Curve Number has increased, which now causes flash floods in the urban localities of Karachi. Only one gauge is installed upstream on the river, and it has no discharge record; a single upstream gauge is not sufficient for discharge measurement. To simulate the maximum discharge of the Malir River, rainfall data (1985 to 2014) were collected from the Pakistan Meteorological Department. Major rainfall events were used to simulate the rainfall-runoff. The maximum rainfall-runoff response was recorded during 1994, 2007 and 2013. This runoff causes damage and inundation in the floodplain areas of Karachi. These flash flood events not only damage property but also cause loss of life.
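The sensitivity of runoff to an increased Curve Number can be illustrated with the standard SCS-CN relation (a textbook formula, not code from the study); the storm depth and CN values below are illustrative.

```python
def scs_runoff_mm(p_mm, cn):
    """Direct storm runoff (mm) from the standard SCS Curve Number
    relations: S = 25400/CN - 254 (mm), Ia = 0.2*S, and
    Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else Q = 0."""
    s = 25400.0 / cn - 254.0      # potential maximum retention
    ia = 0.2 * s                  # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Urbanization raising the Curve Number from 70 to 90 more than
# doubles the runoff generated by the same 100 mm storm.
q_natural = scs_runoff_mm(100.0, 70)   # about 33 mm
q_paved = scs_runoff_mm(100.0, 90)     # about 73 mm
```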

  2. Particle simulation of plasmas and stellar systems

    SciTech Connect

    Tajima, T.; Clark, A.; Craddock, G.G.; Gilden, D.L.; Leung, W.K.; Li, Y.M.; Robertson, J.A.; Saltzman, B.J.

    1985-04-01

    A computational technique is introduced which allows the student and researcher an opportunity to observe the physical behavior of a class of many-body systems. A series of examples is offered which illustrates the diversity of problems that may be studied using particle simulation. These simulations were in fact assigned as homework in a course on computational physics.

  3. Instructional Simulation of a Commercial Banking System.

    ERIC Educational Resources Information Center

    Hester, Donald D.

    1991-01-01

    Describes an instructional simulation of a commercial banking system. Identifies the teaching of portfolio theory, market robustness, and the subtleties of institutional constraints and decision making under uncertainty as the project's goals. Discusses the results of applying the simulation in an environment of local and national markets and a…

  4. Crop Simulation Models and Decision Support Systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The first computer simulation models for agricultural systems were developed in the 1970s. These early models simulated potential production for major crops as a function of weather conditions, especially temperature and solar radiation. At a later stage, the water component was added to be able to ...

  5. The Canadian Hospital Executive Simulation System (CHESS).

    PubMed

    Pink, G H; Knotts, U A; Parrish, L G; Shields, C A

    1991-01-01

    The Canadian Hospital Executive Simulation System (CHESS) is a computer-based management decision-making game designed specifically for Canadian hospital managers. The paper begins with an introduction on the development of business and health services industry-specific simulation games. An overview of CHESS is provided, along with a description of its development and a discussion of its educational benefits. PMID:10109530

  6. Using Expert Systems To Build Cognitive Simulations.

    ERIC Educational Resources Information Center

    Jonassen, David H.; Wang, Sherwood

    2003-01-01

    Cognitive simulations are runnable computer programs for modeling human cognitive activities. A case study is reported where expert systems were used as a formalism for modeling metacognitive processes in a seminar. Building cognitive simulations engages intensive introspection, ownership and meaning making in learners who build them. (Author/AEF)

  7. An evaluation of a coupled atmosphere-ocean modelling system for regional climate studies: extreme events in the North Atlantic

    NASA Astrophysics Data System (ADS)

    Mooney, Priscilla A.; Mulligan, Frank J.

    2013-04-01

    We investigate the ability of a coupled regional atmosphere-ocean modelling system to simulate two extreme events in the North Atlantic. In this study we use the Coupled-Ocean-Atmosphere-Wave-Sediment Transport (COAWST; Warner et al.) modelling system with only the atmosphere and ocean models activated. COAWST couples the atmosphere model (Weather Research and Forecasting model; WRF) to the ocean model (Regional Ocean Modelling System; ROMS) with the Model Coupling Toolkit. Results from the coupled system are compared with atmosphere only simulations of North Atlantic storms to evaluate the performance of the coupled modelling system. Two extreme events (Hurricane Katia and Hurricane Irene) were chosen to assess the level of improvement (or otherwise) arising from coupling WRF with ROMS. These two hurricanes involve different dynamics and present different challenges to the modeling system. This provides a robust assessment of the advantages or disadvantages of coupling WRF with ROMS for regional climate modelling studies of extreme events in the North Atlantic. We examine the ability of the coupled modelling system to simulate these two extreme events by comparing modelled storm tracks, storm intensities, wind speeds and sea surface temperatures with observations in all cases. The effect of domain size, and two different planetary boundary layers used in WRF are also reported.

  8. 2007 Mutual events within the binary system of (22) Kalliope

    NASA Astrophysics Data System (ADS)

    Descamps, P.; Marchis, F.; Pollock, J.; Berthier, J.; Birlan, M.; Vachier, F.; Colas, F.

    2008-11-01

    In 2007, the asteroid Kalliope will reach one of its annual equinoxes. As a consequence, its small satellite Linus orbiting in the equatorial plane will undergo a season of mutual eclipses and occultations very similar to the one that the Galilean satellites undergo every 6 years. This paper is aimed at preparing a campaign of observations of these mutual events occurring from February to May 2007. This opportunity occurs only under favorable geometric conditions when the Sun and/or the Earth are close to the orbital plane of the system. This is the first international campaign devoted to the observation of photometric events within an asynchronous asteroidal binary system. We took advantage of a reliable orbit solution of Linus to predict a series of 24 mutual eclipses and 12 mutual occultations observable in the spring of 2007. Thanks to the brightness of Kalliope ( mv≃11), these observations are easy to perform even with a small telescope. Anomalous attenuation events could be observed lasting for about 1-3 h with amplitude up to 0.09 mag. The attenuations are of two distinct types that can clearly be identified as primary and secondary eclipses similar to those that have been previously observed in other minor planet binary systems [Pravec, P., Scheirich, P., Kusnirák, P., Sarounová, L., Mottola, S., Hahn, G., Brown, P., Esquerdo, G., Kaiser, N., Krzeminski, Z., Pray, D.P., Warner, B.D., Harris, A.W., Nolan, M.C., Howell, E.S., Benner, L.A.M., Margot, J.-L., Galád, A., Holliday, W., Hicks, M.D., Krugly, Yu.N., Tholen, D., Whiteley, R., Marchis, F., Degraff, D.R., Grauer, A., Larson, S., Velichko, F.P., Cooney, W.R., Stephens, R., Zhu, J., Kirsch, K., Dyvig, R., Snyder, L., Reddy, V., Moore, S., Gajdos, S., Világi, J., Masi, G., Higgins, D., Funkhouser, G., Knight, B., Slivan, S., Behrend, R., Grenon, M., Burki, G., Roy, R., Demeautis, C., Matter, D., Waelchli, N., Revaz, Y., Klotz, A., Rieugné, M., Thierry, P., Cotrez, V., Brunetto, L., Kober, G., 2006

  9. Extreme event statistics of daily rainfall: dynamical systems approach

    NASA Astrophysics Data System (ADS)

    Cigdem Yalcin, G.; Rabassa, Pau; Beck, Christian

    2016-04-01

    We analyse the probability densities of daily rainfall amounts at a variety of locations on Earth. The observed distributions of the amount of rainfall fit well to a q-exponential distribution with exponent q close to 1.3. We discuss possible reasons for the emergence of this power law. In contrast, the waiting time distribution between rainy days is observed to follow a near-exponential distribution. A careful investigation shows that a q-exponential with q ≈ 1.05 yields the best fit of the data. A Poisson process whose rate fluctuates slightly in a superstatistical way is discussed as a possible model for this. We discuss the extreme value statistics for extreme daily rainfall, which can potentially lead to flooding. This is described by Fréchet distributions, as the corresponding distributions of the amount of daily rainfall decay with a power law. Looking at extreme event statistics of waiting times between rainy days (leading to droughts for very long dry periods), the observed near-exponential decay of waiting times yields extreme event statistics close to Gumbel distributions. We discuss superstatistical dynamical systems as simple models in this context.
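
    The Tsallis q-exponential density used for such fits can be sketched as follows; this generic form (normalized for 1 < q < 2, x ≥ 0) is a standard definition, and the parameter names are generic rather than taken from the paper:

```python
def q_exponential_pdf(x: float, q: float, lam: float) -> float:
    """Tsallis q-exponential density for 1 < q < 2 and x >= 0.

    f(x) = (2 - q) * lam * [1 + (q - 1) * lam * x]^(-1/(q-1)),
    with a power-law tail ~ x^(-1/(q-1)); for q = 1.3 the tail
    exponent is about -3.33, and q -> 1 recovers the exponential.
    """
    return (2.0 - q) * lam * (1.0 + (q - 1.0) * lam * x) ** (-1.0 / (q - 1.0))
```

    A quick numerical check confirms the density integrates to one for q = 1.3, the value reported for the rainfall amounts.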

  10. High Frequency Mechanical Pyroshock Simulations for Payload Systems

    SciTech Connect

    BATEMAN,VESTA I.; BROWN,FREDERICK A.; CAP,JEROME S.; NUSSER,MICHAEL A.

    1999-12-15

    Sandia National Laboratories (SNL) designs mechanical systems with components that must survive high frequency shock environments including pyrotechnic shock. These environments have not been simulated very well in the past at the payload system level because of weight limitations of traditional pyroshock mechanical simulations using resonant beams and plates. A new concept utilizing tuned resonators attached to the payload system and driven with the impact of an airgun projectile allow these simulations to be performed in the laboratory with high precision and repeatability without the use of explosives. A tuned resonator has been designed and constructed for a particular payload system. Comparison of laboratory responses with measurements made at the component locations during actual pyrotechnic events show excellent agreement for a bandwidth of DC to 4 kHz. The bases of comparison are shock spectra. This simple concept applies the mechanical pyroshock simulation simultaneously to all components with the correct boundary conditions in the payload system and is a considerable improvement over previous experimental techniques and simulations.

  11. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Kerner, H.; Weatherbee, J. E.; Taylor, D. S.; Hodges, B.

    1973-01-01

    A deterministic simulator is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Its use as a tool to study and determine the minimum computer system configuration necessary to satisfy the on-board computational requirements of a typical mission is presented. The paper describes how the computer system configuration is determined in order to satisfy the data processing demand of the various shuttle booster subsystems. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources.

  12. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Weatherbee, J. E.; Taylor, D. S.

    1972-01-01

    A deterministic digital simulation model is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Use of the model as a tool in configuring a minimum computer system for a typical mission is demonstrated. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources, i.e., the configuration derived is a minimal one. Other considerations such as increased reliability through the use of standby spares would be taken into account in the definition of a practical system for a given mission.

  13. Simulation of gas hydrate dissociation caused by repeated tectonic uplift events

    NASA Astrophysics Data System (ADS)

    Goto, Shusaku; Matsubayashi, Osamu; Nagakubo, Sadao

    2016-05-01

    Gas hydrate dissociation by tectonic uplift is often used to explain geologic and geophysical phenomena, such as hydrate accumulation probably caused by hydrate recycling and the occurrence of double bottom-simulating reflectors in tectonically active areas. However, little is known of gas hydrate dissociation resulting from tectonic uplift. This study investigates gas hydrate dissociation in marine sediments caused by repeated tectonic uplift events using a numerical model incorporating the latent heat of gas hydrate dissociation. The simulations showed that tectonic uplift causes upward movement of some depth interval of hydrate-bearing sediment immediately above the base of gas hydrate stability (BGHS) to the gas hydrate instability zone because the sediment initially maintains its temperature: in that interval, gas hydrate dissociates while absorbing heat; consequently, the temperature of the interval decreases to that of the hydrate stability boundary at that depth. Until the next uplift event, endothermic gas hydrate dissociation proceeds at the BGHS using heat mainly supplied from the sediment around the BGHS, lowering the temperature of that sediment. The cumulative effects of these two endothermic gas hydrate dissociations caused by repeated uplift events lower the sediment temperature around the BGHS, suggesting that in a marine area in which sediment with a highly concentrated hydrate-bearing layer just above the BGHS has been frequently uplifted, the endothermic gas hydrate dissociation produces a gradual decrease in thermal gradient from the seafloor to the BGHS. Sensitivity analysis for model parameters showed that water depth, amount of uplift, gas hydrate saturation, and basal heat flow strongly influence the gas hydrate dissociation rate and sediment temperature around the BGHS.

  14. Simulating advanced life support systems to test integrated control approaches

    NASA Astrophysics Data System (ADS)

    Kortenkamp, D.; Bell, S.

    Simulations allow for testing of life support control approaches before hardware is designed and built. Simulations also allow for the safe exploration of alternative control strategies during life support operation. As such, they are an important component of any life support research program and testbed. This paper describes a specific advanced life support simulation being created at NASA Johnson Space Center. It is a discrete-event simulation that is dynamic and stochastic. It simulates all major components of an advanced life support system, including crew (with variable ages, weights and genders), biomass production (with scalable plantings of ten different crops), water recovery, air revitalization, food processing, solid waste recycling and energy production. Each component is modeled as a producer of certain resources and a consumer of certain resources. The control system must monitor (via sensors) and control (via actuators) the flow of resources throughout the system to provide life support functionality. The simulation is written in an object-oriented paradigm that makes it portable, extensible and reconfigurable.
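
    The producer/consumer resource flow described above can be sketched as a minimal discrete-event loop. The component, resource names and consumption rates below are invented for illustration and are not the rates or components of the JSC simulation:

```python
import heapq

class Simulation:
    """Minimal discrete-event engine: callbacks fire in time order."""

    def __init__(self, stores):
        self.now = 0.0
        self.stores = dict(stores)  # resource name -> amount on hand
        self.queue = []             # heap of (time, seq, callback)
        self._seq = 0               # tie-breaker so callbacks never compare

    def schedule(self, delay, callback):
        heapq.heappush(self.queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self, until):
        while self.queue and self.queue[0][0] <= until:
            self.now, _, cb = heapq.heappop(self.queue)
            cb(self)

def crew_member(sim):
    # Illustrative hourly rates: consume O2 and water, produce CO2,
    # but only when the required resources are actually available.
    if sim.stores["o2"] >= 0.035 and sim.stores["water"] >= 0.1:
        sim.stores["o2"] -= 0.035
        sim.stores["water"] -= 0.1
        sim.stores["co2"] += 0.042
    sim.schedule(1.0, crew_member)  # next metabolic event in one hour

sim = Simulation({"o2": 10.0, "water": 50.0, "co2": 0.0})
sim.schedule(0.0, crew_member)
sim.run(until=24.0)  # simulate one day
```

    A control system would sit on top of such a loop, reading the stores through simulated sensors and scheduling actuator events (e.g. switching an air revitalization component on) in the same queue.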

  15. Intelligent fuzzy controller for event-driven real time systems

    NASA Technical Reports Server (NTRS)

    Grantner, Janos; Patyra, Marek; Stachowicz, Marian S.

    1992-01-01

    Most of the known linguistic models are essentially static, that is, time is not a parameter in describing the behavior of the object's model. In this paper we show a model for synchronous finite state machines based on fuzzy logic. Such finite state machines can be used to build both event-driven, time-varying, rule-based systems and the control unit section of a fuzzy logic computer. The architecture of a pipelined intelligent fuzzy controller is presented, and the linguistic model is represented by an overall fuzzy relation stored in a single rule memory. A VLSI integrated circuit implementation of the fuzzy controller is suggested. At a clock rate of 30 MHz, the controller can perform 3 MFLIPS on multi-dimensional fuzzy data.

  16. Digital simulation of stiff linear dynamic systems.

    NASA Technical Reports Server (NTRS)

    Holland, L. D.; Walsh, J. R., Jr.; Kerr, J. H.

    1972-01-01

    A method is derived for digital computer simulation of linear time-invariant systems when the insignificant eigenvalues involved in such systems are eliminated by an ALSAP root removal technique. The method is applied to a thirteenth-order dynamic system representing a passive RLC network.

  17. Teaching Environmental Systems Modelling Using Computer Simulation.

    ERIC Educational Resources Information Center

    Moffatt, Ian

    1986-01-01

    A computer modeling course in environmental systems and dynamics is presented. The course teaches senior undergraduates to analyze a system of interest, construct a system flow chart, and write computer programs to simulate real world environmental processes. An example is presented along with a course evaluation, figures, tables, and references.…

  18. IRIS observations and MHD simulations of explosive events in the transition region of the Sun

    NASA Astrophysics Data System (ADS)

    Guo, Lijia; Innes, Davina; Huang, Yi-Min; Bhattacharjee, Amitava

    2016-05-01

    Small-scale explosive events on the Sun are thought to be related to magnetic reconnection. While Petschek reconnection has long been considered as a reconnection mechanism for explosive events on the Sun, the fragmentation of a current sheet in the high-Lundquist-number regime caused by the plasmoid instability has recently been proposed as a possible mechanism for fast reconnection. The actual reconnection sites are too small to be resolved with images, but these two reconnection mechanisms, Petschek and the plasmoid instability, have very different density and velocity structures and so can be distinguished by high-resolution line-profile observations. We use high-resolution sit-and-stare spectral observations of the Si IV line, obtained by the IRIS spectrometer, to identify sites of reconnection and follow the development of line profiles. The aim is to obtain a survey of typical line profiles produced by small-scale reconnection events in the transition region and compare them with synthetic line profiles from numerical simulations of a reconnecting current sheet, to determine whether reconnection occurs via the plasmoid instability or the Petschek mechanism. Direct comparison between IRIS observations and numerical results suggests that the observed Si IV profiles can be reproduced with a fragmented current layer subject to the plasmoid instability but not by the bi-directional jets that characterise the Petschek mechanism. This result suggests that if these small-scale events are reconnection sites, then fast reconnection proceeds via the plasmoid instability, rather than the Petschek mechanism, during small-scale reconnection on the Sun.

  19. Numerical propulsion system simulation - An interdisciplinary approach

    NASA Technical Reports Server (NTRS)

    Nichols, Lester D.; Chamis, Christos C.

    1991-01-01

    The tremendous progress being made in computational engineering and the rapid growth in computing power that is resulting from parallel processing now make it feasible to consider the use of computer simulations to gain insights into the complex interactions in aerospace propulsion systems and to evaluate new concepts early in the design process before a commitment to hardware is made. Described here is a NASA initiative to develop a Numerical Propulsion System Simulation (NPSS) capability.

  20. Numerical propulsion system simulation: An interdisciplinary approach

    NASA Technical Reports Server (NTRS)

    Nichols, Lester D.; Chamis, Christos C.

    1991-01-01

    The tremendous progress being made in computational engineering and the rapid growth in computing power that is resulting from parallel processing now make it feasible to consider the use of computer simulations to gain insights into the complex interactions in aerospace propulsion systems and to evaluate new concepts early in the design process before a commitment to hardware is made. Described here is a NASA initiative to develop a Numerical Propulsion System Simulation (NPSS) capability.

  1. Colorimetric calibration of coupled infrared simulation system

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Fei, Jindong; Gao, Yang; Du, Jian

    2015-10-01

    In order to test 2-color infrared sensors, a coupled infrared simulation system can generate radiometric outputs with wavelengths that range from less than 3 microns to more than 12 microns. There are two channels in the coupled simulation system, optically combined by a dichroic beam combiner. Each channel has an infrared blackbody, a filter, a diaphragm, and diaphragm motors. The system is projected to the sensor under test by a collimator. This makes it difficult to calibrate the system with only a one-band thermal imager, and errors will be caused in the radiance levels measured by the narrow-band thermal imager. This paper describes colorimetric temperature measurement techniques that have been developed to perform radiometric calibrations of such infrared simulation systems. The calibration system consists of two infrared thermal imagers; one is operated in the MW-IR wavelength range, and the other in the LW-IR range.

  2. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1981-01-01

    A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used: the algorithm was developed in an ALGOL-like pseudo-language and then encoded in FORTRAN IV. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all the possible combinations of types and modes of calls through the use of five communication scenarios: single hop system; double hop, single gateway system; double hop, double gateway system; mobile to wireline system; and wireline to mobile system. The transmitter, fading channel, and interference source simulation are also discussed.

  3. The analyses of extreme climate events over China based on CMIP5 historical and future simulations

    NASA Astrophysics Data System (ADS)

    Yang, S.; Dong, W.; Feng, J.; Chou, J.

    2013-12-01

    Extreme climate events seriously affect human society. Based on observations and 12 simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5), climatic extremes and their changes over China are analyzed for the historical period and for future scenarios under three Representative Concentration Pathways (RCPs). Against the background of global warming, the observations show decreasing trends in frost days (FD) and low-temperature threshold days (TN10P), and increasing trends in summer days (SU), high-temperature threshold days (TX90P), heavy precipitation days (R20) and the contribution of heavy precipitation days (P95T). Most coupled models can reproduce the main characteristics of most extreme indices. The models reproduce the mean FD and TX90P values best and capture the basic trends of FD, TN10P, SU and TX90P. High correlation coefficients between simulated results and observations are found for FD, SU and P95T. For the FD and SU indices, most models capture well the spatial differences between the mean states of the 1986-2005 and 1961-1980 periods, but for the other indices the models' ability to simulate spatial differences is less satisfactory and needs improvement. Under the high-emission scenario RCP8.5, the century-scale linear changes of the Multi-Model Ensemble (MME) for FD, SU, TN10P, TX90P, R20 and P95T are -46.9, 46.0, -27.1, 175.4 and 2.9 days and 9.9%, respectively. Owing to the complexity of physical process parameterizations and the limitations of forcing data, a large uncertainty remains in simulations of climatic extremes. Fig. 1: Observed and modeled multi-year average for each index (dotted line: observation). Table 1: Extreme index definitions.
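
    Several of the indices above have simple day-counting definitions in the ETCCDI convention (FD: daily minimum below 0 °C; SU: daily maximum above 25 °C; R20: daily precipitation of at least 20 mm). A minimal counting sketch over daily series:

```python
def climate_extreme_counts(tmin_c, tmax_c, precip_mm):
    """Count three ETCCDI-style indices from daily series:
    FD (frost days), SU (summer days) and R20 (heavy precipitation days)."""
    fd = sum(1 for t in tmin_c if t < 0.0)     # daily minimum below 0 C
    su = sum(1 for t in tmax_c if t > 25.0)    # daily maximum above 25 C
    r20 = sum(1 for p in precip_mm if p >= 20.0)  # precip of 20 mm or more
    return {"FD": fd, "SU": su, "R20": r20}
```

    The percentile-based indices (TN10P, TX90P, P95T) additionally require thresholds computed from a base-period climatology, which this sketch omits.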

  4. Mechanism for stickiness suppression during extreme events in Hamiltonian systems

    NASA Astrophysics Data System (ADS)

    Krüger, Taline Suellen; Galuzio, Paulo Paneque; Prado, Thiago de Lima; Viana, Ricardo Luiz; Szezech, José Danilo; Lopes, Sergio Roberto

    2015-06-01

    In this paper we study how hyperbolic and nonhyperbolic regions in the neighborhood of a resonant island play an important role in allowing or forbidding the stickiness phenomenon around islands in conservative systems. The vicinity of the island is composed of nonhyperbolic areas that almost prevent the trajectory from visiting the island edge. For some specific parameters, tiny channels embedded in the nonhyperbolic area are associated with hyperbolic fixed points localized in the neighborhood of the islands. Such channels allow the trajectory to be injected into the inner portion of the vicinity. Once the trajectory crosses the barrier imposed by the nonhyperbolic regions, it takes a long time to abandon the vicinity of the island, since the barrier also prevents the trajectory from escaping from the neighborhood of the island. In this scenario the nonhyperbolic structures are responsible for the stickiness phenomenon and, more than that, for the strength of the sticky effect. We show that these properties of the phase space allow us to manipulate the existence of extreme events (and the transport associated with them) responsible for the nonequilibrium fluctuation of the system. In fact, we demonstrate that by monitoring very small portions of the phase space (namely, ≈1×10⁻⁵% of it) it is possible to generate a completely diffusive system, eliminating long-time recurrences that result from the stickiness phenomenon.

  5. Mechanism for stickiness suppression during extreme events in Hamiltonian systems.

    PubMed

    Krüger, Taline Suellen; Galuzio, Paulo Paneque; Prado, Thiago de Lima; Viana, Ricardo Luiz; Szezech, José Danilo; Lopes, Sergio Roberto

    2015-06-01

    In this paper we study how hyperbolic and nonhyperbolic regions in the neighborhood of a resonant island play an important role in allowing or forbidding the stickiness phenomenon around islands in conservative systems. The vicinity of the island is composed of nonhyperbolic areas that almost prevent the trajectory from visiting the island edge. For some specific parameters, tiny channels embedded in the nonhyperbolic area are associated with hyperbolic fixed points localized in the neighborhood of the islands. Such channels allow the trajectory to be injected into the inner portion of the vicinity. Once the trajectory crosses the barrier imposed by the nonhyperbolic regions, it takes a long time to abandon the vicinity of the island, since the barrier also prevents the trajectory from escaping from the neighborhood of the island. In this scenario the nonhyperbolic structures are responsible for the stickiness phenomenon and, more than that, for the strength of the sticky effect. We show that these properties of the phase space allow us to manipulate the existence of extreme events (and the transport associated with them) responsible for the nonequilibrium fluctuation of the system. In fact, we demonstrate that by monitoring very small portions of the phase space (namely, ≈1×10⁻⁵% of it) it is possible to generate a completely diffusive system, eliminating long-time recurrences that result from the stickiness phenomenon. PMID:26172768

  6. Simulating system dynamics with arbitrary time step

    NASA Astrophysics Data System (ADS)

    Kantorovich, L.

    2007-02-01

    We suggest a dynamic simulation method that allows efficient and realistic modeling of kinetic processes, such as atomic diffusion, in which time has its actual meaning. Our method is similar in spirit to widely used kinetic Monte Carlo (KMC) techniques; however, in our approach the time step can be chosen arbitrarily. This is advantageous in some cases, e.g., when the transition rates change with time sufficiently fast over the period of the KMC time step (due, for instance, to the time dependence of external factors influencing the kinetics, such as a moving scanning probe microscopy tip or an external time-dependent field), or when the clock time is set by external conditions and it is convenient to use equal time steps, instead of the random steps of the KMC algorithm, in order to build up probability distribution functions. We show that an arbitrary choice of the time step can be afforded by building up the complete list of events, including the "residence site" and multihop transitions. The idea of the method is illustrated with a simple "toy" model of a finite one-dimensional lattice of potential wells with unequal jump rates to either side, which can be studied analytically. We show that our general kinetics method exactly reproduces the solution of the corresponding master equations for any choice of the time step. The final kinetics also matches the standard KMC, in which the time step is chosen in a certain way and the system always advances by a single hop, and this allows a better understanding of that algorithm.
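
    The fixed-time-step idea, including the "residence" (no-hop) event, can be sketched for a one-dimensional lattice like the toy model above. The rates, time step and step count are arbitrary illustrative values; the only constraint is that the total hop probability per step, (r_left + r_right)·dt, stays below one so that the residence probability is well defined:

```python
import random

def fixed_step_kmc(rate_left, rate_right, dt, n_steps, seed=0):
    """Walk on a 1-D lattice with a fixed time step dt.

    Each step the walker hops left with probability rate_left*dt,
    right with probability rate_right*dt, and otherwise takes the
    'residence' event (stays put). Returns the final position.
    """
    assert (rate_left + rate_right) * dt < 1.0, "dt too large for these rates"
    rng = random.Random(seed)
    x = 0
    for _ in range(n_steps):
        u = rng.random()
        if u < rate_left * dt:
            x -= 1
        elif u < (rate_left + rate_right) * dt:
            x += 1
        # else: residence event, position unchanged
    return x
```

    With unequal jump rates the walker drifts, with mean displacement (rate_right - rate_left)·dt per step, which is what the master-equation solution of the toy model predicts.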

  7. Electric-Power System Simulator

    NASA Technical Reports Server (NTRS)

    Caldwell, R. W.; Grumm, R. L.; Biedebach, B. L.

    1984-01-01

    Shows different combinations of generation, storage, and load components: display, video monitor with keyboard input to microprocessor, and video monitor for display of load curves and power generation. Planning tool for electric utilities, regulatory agencies, and laymen in understanding basics of electric-power systems operation.

  8. Effects of long range transboundary pollutants on air quality in Japan - numerical simulation of a yellow sand event

    SciTech Connect

    Ueda, Hiromasa; Kang, Seuk Jea

    1996-12-31

    Air quality in East Asia may worsen drastically as a consequence of the accelerated development of fossil fuel systems and the highest economic and population growth rates in the world. The expansion of these energy systems, combined with a major fuel shift to indigenous coal, will result in significant acid deposition and photochemical oxidant pollution in this region. Frequently, during clear spring days, large-scale wind systems develop that transport pollutants from the East Asian mainland towards the Pacific Ocean. Therefore, in order to evaluate the air quality of the western Pacific Ocean and Japan, the emissions of the adjacent continent must be taken into consideration. The present paper reports on a series of numerical simulations of clear springtime episodes using an Eulerian transport/chemistry/deposition model to obtain the concentration changes of air pollutants over this area. The simulation ran from 9:00 JST on 1 April to midnight on 3 April 1993. On this day a yellow sand event occurred, showing good evidence of long-range transport from the continent toward the western Pacific Ocean. First, the simulation results show fair agreement with the observed values. Second, the numerical simulation clearly showed the formation of a high air pollution belt in East Asia connecting the eastern area of China, the southern area of Korea and the western area of Japan. For NOx the air pollution belt is weak, but it is well displayed for sulfate, nitrate and ozone. Notably, in the region covered by the air pollution belt (western Pacific Ocean, Japan Sea and western Japan) emissions are small, but the concentrations of ozone, sulfate and nitrate are high. The ozone concentration in Japan due to long-range transport from the continent is already near the environmental standard value of 60 ppb. In this area, tropospheric ozone and acid deposition were suggested to become a serious problem in the future.
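
    The transport part of an Eulerian transport/chemistry/deposition model is built from advection schemes acting on a gridded concentration field. A minimal sketch of the simplest such building block, first-order upwind advection on a periodic 1-D grid, is shown below; it is a generic illustration, not the scheme used in the study:

```python
def advect_1d(conc, wind_speed, dx, dt, n_steps):
    """First-order upwind advection of a concentration field on a
    periodic 1-D grid: the simplest Eulerian transport building block."""
    c = list(conc)
    cfl = wind_speed * dt / dx  # Courant number
    assert 0.0 <= cfl <= 1.0, "CFL stability condition violated"
    for _ in range(n_steps):
        # Upwind difference for positive wind; c[i-1] wraps periodically.
        c = [c[i] - cfl * (c[i] - c[i - 1]) for i in range(len(c))]
    return c
```

    The scheme conserves total mass exactly on a periodic grid and moves the center of mass downwind by one Courant number per step, at the cost of numerical diffusion that smears sharp plumes.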

  9. Simulations of Interplanetary Coronal Mass Ejection Events with Simple Model of Solar Wind

    NASA Astrophysics Data System (ADS)

    Ogawa, Tomoya; den, Mitsue; Watari, Shinichi; Yamashita, Kazuyuki

    The propagation of an interplanetary shock wave depends on the solar wind conditions. While a precise solar wind model in numerical simulations would produce good predictions of the arrival time of an interplanetary shock wave caused by a coronal mass ejection (CME), anyone attempting the prediction with a detailed model faces difficulty in supplying input parameters, because such quantities can often be estimated only with insufficient precision. This study aims to build a model whose parameters are available before the shock wave arrives. We performed simulations of interplanetary CME events with a simple model of the solar wind. The model has slow wind blowing out along a solar geodesic line into a global fast wind. The tilt and phase of the plane containing the geodesic line are estimated from a previous solar magnetic field. A CME is put into the simulation region. The CME model we use needs two major parameters, position and velocity: the position is taken to be that of the accompanying X-ray flare, and the velocity is estimated from LASCO observations. Some other parameters were fixed empirically by trial and error. We compare the resulting fluctuations near the Earth with ACE data, and discuss the limits and potential of the model in space weather prediction.

  10. Wireless address event representation system for biological sensor networks

    NASA Astrophysics Data System (ADS)

    Folowosele, Fopefolu; Tapson, Jonathan; Etienne-Cummings, Ralph

    2007-05-01

    We describe wireless networking systems for close-proximity biological sensors, as would be encountered in artificial skin. The sensors communicate to a "base station" that interprets the data and decodes its origin. Using a large bundle of ultra-thin metal wires from the sensors to the "base station" introduces significant technological hurdles for both the construction and maintenance of the system. Fortunately, the Address Event Representation (AER) protocol provides an elegant and biomorphic method for transmitting many impulses (i.e. neural spikes) down a single wire/channel. However, AER does not communicate any sensory information within each spike other than the address of the spike's origin. Therefore, each sensor must provide a number of spikes to communicate its data, typically in the form of inter-spike intervals or spike rate. Furthermore, complex circuitry is required to arbitrate access to the channel when multiple sensors communicate simultaneously, which results in spike delay. This error is exacerbated as the number of sensors per channel increases, mandating more channels and more wires. We contend that despite the effectiveness of the wire-based AER protocol, its natural evolution will be the wireless AER protocol. A wireless AER system: (1) does not require arbitration to handle multiple simultaneous accesses of the channel, (2) uses cross-correlation delay to encode sensor data in every spike (eliminating the error due to arbitration delay), and (3) can be reorganized and expanded with little consequence to the network. The system uses spread spectrum communications principles, implemented with low-power integrate-and-fire neurons. This paper discusses the design, operation and capabilities of such a system. We show that integrate-and-fire neurons can be used both to decode the origin of each spike and to extract the data contained within it.
We also show that there are many technical obstacles to overcome before this version
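
    As a rough illustration of the spread-spectrum decoding idea, a sketch of origin identification by code correlation is shown below. The chip codes are hypothetical, and the paper's system additionally encodes sensor data in cross-correlation delay and uses analog integrate-and-fire neurons rather than digital correlation, none of which this sketch attempts:

```python
def decode_spike_origin(channel, codes):
    """Identify which sensor fired by correlating the shared channel
    against each sensor's unique spreading code.

    channel -- list of chip values observed on the shared medium
    codes   -- dict mapping sensor id -> its +/-1 spreading code
    """
    def correlate(code):
        return sum(c * x for c, x in zip(code, channel))
    # The sensor whose code correlates most strongly is the spike's origin.
    return max(codes, key=lambda s: correlate(codes[s]))

# Hypothetical 8-chip orthogonal codes for three skin sensors.
codes = {
    "s0": [+1, -1, +1, -1, +1, -1, +1, -1],
    "s1": [+1, +1, -1, -1, +1, +1, -1, -1],
    "s2": [+1, +1, +1, +1, -1, -1, -1, -1],
}
# Sensor s1 transmits its code; the others are silent.
channel = codes["s1"]
origin = decode_spike_origin(channel, codes)   # -> "s1"
```

    Because the codes are mutually orthogonal, the correct sensor's correlation (8) dominates the others (0), which is what lets simultaneous transmissions share one channel without arbitration.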

  11. Physics Detector Simulation Facility Phase II system software description

    SciTech Connect

    Scipioni, B.; Allen, J.; Chang, C.; Huang, J.; Liu, J.; Mestad, S.; Pan, J.; Marquez, M.; Estep, P.

    1993-05-01

    This paper presents the Physics Detector Simulation Facility (PDSF) Phase II system software. A key element in the design of a distributed computing environment for the PDSF has been the separation and distribution of the major functions. The facility has been designed to support batch and interactive processing, and to incorporate the file and tape storage systems. By distributing these functions, it is often possible to provide higher throughput and resource availability. Similarly, the design is intended to exploit event-level parallelism in an open distributed environment.

  12. Evaluation of the southern California seismic velocity models through simulation of recorded events

    NASA Astrophysics Data System (ADS)

    Taborda, Ricardo; Azizzadeh-Roodpish, Shima; Khoshnevis, Naeem; Cheng, Keli

    2016-06-01

    Significant effort has been devoted over the last two decades to the development of various seismic velocity models for the region of southern California, United States. These models are mostly used in forward wave propagation simulation studies, but also as base models for tomographic and source inversions. Two of these models, the community velocity models CVM-S and CVM-H, are among the most commonly used for this region. This includes two alternative variations of the original models: the recently released CVM-S4.26, which incorporates results from a sequence of tomographic inversions into CVM-S, and the user-controlled option of CVM-H to replace the near-surface profiles with a VS30-based geotechnical model. Although either one of these models is regarded as acceptable by the modeling community, it is known that they differ in their representation of the crustal structure and sedimentary deposits in the region, and thus can lead to different results in forward and inverse problems. In this paper, we evaluate the accuracy of these models when used to predict the ground motion in the greater Los Angeles region by means of an assessment of a collection of simulations of recent events. In total, we consider 30 moderate-magnitude earthquakes (3.5 < Mw < 5.5) between 1998 and 2014, and compare synthetics with data recorded by seismic networks during these events. The simulations are done using a finite-element parallel code, with numerical models that satisfy a maximum frequency of 1 Hz and a minimum shear wave velocity of 200 m/s. The comparisons between data and synthetics are ranked quantitatively by means of a goodness-of-fit (GOF) criterion. We analyse the regional distribution of the GOF results for all events and all models, and draw conclusions from the results and how they correlate with the models. We find that, in light of our comparisons, the model CVM-S4.26 consistently yields better results.
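
    As an illustration of how synthetics can be scored against recordings, the sketch below maps a simple Pearson correlation between two equal-length waveforms onto a 0-10 scale. This is a hypothetical metric chosen for clarity, not the specific GOF criterion used in the study, and the sample waveforms are invented:

```python
import math

def gof_score(data, synth):
    """Map the normalized cross-correlation of two equal-length
    waveforms onto a 0-10 goodness-of-fit scale (10 = identical shape)."""
    n = len(data)
    md = sum(data) / n
    ms = sum(synth) / n
    num = sum((d - md) * (s - ms) for d, s in zip(data, synth))
    den = math.sqrt(sum((d - md) ** 2 for d in data) *
                    sum((s - ms) ** 2 for s in synth))
    corr = num / den           # Pearson correlation in [-1, 1]
    return 5.0 * (corr + 1.0)  # rescale to [0, 10]

recorded  = [0.0, 1.0, 0.5, -0.8, -0.2, 0.3]
synthetic = [0.1, 0.9, 0.6, -0.7, -0.3, 0.2]  # close, but not identical
score = gof_score(recorded, synthetic)
```

    A production GOF would combine several such scores (amplitude, phase, duration, spectral content) per station and frequency band before mapping them onto the region.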

  13. Evaluation of the Southern California Seismic Velocity Models through Simulation of Recorded Events

    NASA Astrophysics Data System (ADS)

    Taborda, Ricardo; Azizzadeh-Roodpish, Shima; Khoshnevis, Naeem; Cheng, Keli

    2016-03-01

    Significant effort has been devoted over the last two decades to the development of various seismic velocity models for the region of southern California, United States. These models are mostly used in forward wave propagation simulation studies, but also as base models for tomographic and source inversions. Two of these models, the community velocity models CVM-S and CVM-H, are among the most commonly used for this region. This includes two alternative variations of the original models: the recently released CVM-S4.26, which incorporates results from a sequence of tomographic inversions into CVM-S, and the user-controlled option of CVM-H to replace the near-surface profiles with a VS30-based geotechnical (GTL) model. Although either one of these models is regarded as acceptable by the modeling community, it is known that they differ in their representation of the crustal structure and sedimentary deposits in the region, and thus can lead to different results in forward and inverse problems. In this article we evaluate the accuracy of these models when used to predict the ground motion in the greater Los Angeles region by means of an assessment of a collection of simulations of recent events. In total, we consider 30 moderate-magnitude earthquakes (3.5 < Mw < 5.5) between 1998 and 2014, and compare synthetics with data recorded by seismic networks during these events. The simulations are done using a finite-element parallel code, with numerical models that satisfy a maximum frequency of 1 Hz and a minimum shear wave velocity of 200 m/s. The comparisons between data and synthetics are ranked quantitatively by means of a goodness-of-fit (GOF) criterion. We analyze the regional distribution of the GOF results for all events and all models, and draw conclusions from the results and how they correlate with the models. We find that, in light of our comparisons, the model CVM-S4.26 consistently yields better results.

  14. Final Technical Report "Multiscale Simulation Algorithms for Biochemical Systems"

    SciTech Connect

    Petzold, Linda R.

    2012-10-25

    Biochemical systems are inherently multiscale and stochastic. In microscopic systems formed by living cells, the small numbers of reactant molecules can result in dynamical behavior that is discrete and stochastic rather than continuous and deterministic. An analysis tool that respects these dynamical characteristics is the stochastic simulation algorithm (SSA, Gillespie, 1976), a numerical simulation procedure that is essentially exact for chemical systems that are spatially homogeneous or well stirred. Despite recent improvements, as a procedure that simulates every reaction event, the SSA is necessarily inefficient for most realistic problems. There are two main reasons for this, both arising from the multiscale nature of the underlying problem: (1) stiffness, i.e. the presence of multiple timescales, the fastest of which are stable; and (2) the need to include in the simulation both species that are present in relatively small quantities and should be modeled by a discrete stochastic process, and species that are present in larger quantities and are more efficiently modeled by a deterministic differential equation (or at some scale in between). This project has focused on the development of fast and adaptive algorithms, and the fundamental theory upon which they must be based, for the multiscale simulation of biochemical systems. Areas addressed by this project include: (1) Theoretical and practical foundations for accelerated discrete stochastic simulation (tau-leaping); (2) Dealing with stiffness (fast reactions) in an efficient and well-justified manner in discrete stochastic simulation; (3) Development of adaptive multiscale algorithms for spatially homogeneous discrete stochastic simulation; (4) Development of high-performance SSA algorithms.
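
    As a point of reference for the discussion above, a minimal sketch of the basic SSA is shown below. The dimerization network and rate constants are hypothetical, chosen only to illustrate the algorithm's structure, and none of the project's tau-leaping or multiscale accelerations are included:

```python
import random

def ssa(x, reactions, t_end, rng=random.Random(0)):
    """Minimal Gillespie SSA: simulate one trajectory of a well-stirred system.

    x         -- dict of species counts (mutated in place)
    reactions -- list of (propensity_fn, state_change) pairs
    t_end     -- stop time
    """
    t = 0.0
    while t < t_end:
        props = [a(x) for a, _ in reactions]
        a0 = sum(props)
        if a0 == 0:                       # no reaction can fire
            break
        t += rng.expovariate(a0)          # time to the next reaction
        r = rng.uniform(0, a0)            # pick which reaction fires
        for p, (_, change) in zip(props, reactions):
            if r < p:
                for species, delta in change.items():
                    x[species] += delta
                break
            r -= p
    return x

# Hypothetical dimerization network: A + A -> B (rate c1), B -> A + A (rate c2).
c1, c2 = 0.01, 0.1
reactions = [
    (lambda x: c1 * x["A"] * (x["A"] - 1) / 2, {"A": -2, "B": +1}),
    (lambda x: c2 * x["B"],                    {"A": +2, "B": -1}),
]
state = ssa({"A": 100, "B": 0}, reactions, t_end=10.0)
```

    Because every individual reaction event is simulated, the cost grows with the total reaction rate, which is exactly the inefficiency the project's accelerated methods address.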

  15. TRANSIMS: Transportation analysis and simulation system

    SciTech Connect

    Smith, L.; Beckman, R.; Baggerly, K.

    1995-07-01

    This document summarizes the TRansportation ANalysis and SIMulation System (TRANSIMS) Project, the system's major modules, and the project's near-term plans. TRANSIMS will employ advanced computational and analytical techniques to create an integrated regional transportation systems analysis environment. The simulation environment will include a regional population of individual travelers and freight loads with travel activities and plans, whose individual interactions will be simulated on the transportation system, and whose environmental impact will be determined. We will develop an interim operational capability (IOC) for each major TRANSIMS module during the five-year program. When the IOC is ready, we will complete a specific case study to confirm the IOC features, applicability, and readiness.

  16. TRANSIMS: TRansportation ANalysis and SIMulation System

    SciTech Connect

    Smith, L.; Beckman, R.; Anson, D.; Nagel, K.; Williams, M.

    1995-08-01

    This paper summarizes the TRansportation ANalysis and SIMulation System (TRANSIMS) Project, the system's major modules, and the project's near-term plans. TRANSIMS will employ advanced computational and analytical techniques to create an integrated regional transportation systems analysis environment. The simulation environment will include a regional population of individual travelers and freight loads with travel activities and plans, whose individual interactions will be simulated on the transportation system, and whose environmental impact will be determined. We will develop an interim operational capability (IOC) for each major TRANSIMS module during the five-year program. When the IOC is ready, we will complete a specific case study to confirm the IOC features, applicability, and readiness.

  17. Explicit simulation of a midlatitude Mesoscale Convective System

    SciTech Connect

    Alexander, G.D.; Cotton, W.R.

    1996-04-01

    We have explicitly simulated the mesoscale convective system (MCS) observed on 23-24 June 1985 during PRE-STORM, the Preliminary Regional Experiment for the Stormscale Operational and Research Meteorology Program. Stensrud and Maddox (1988), Johnson and Bartels (1992), and Bernstein and Johnson (1994) are among the researchers who have investigated various aspects of this MCS event. We have performed this MCS simulation (and a similar one of a tropical MCS; Alexander and Cotton 1994) in the spirit of the Global Energy and Water Cycle Experiment Cloud Systems Study (GCSS), in which cloud-resolving models are used to assist in the formulation and testing of cloud parameterization schemes for larger-scale models. In this paper, we describe (1) the nature of our 23-24 June MCS simulation and (2) our efforts to date in using our explicit MCS simulations to assist in the development of a GCM parameterization for mesoscale flow branches. The paper is organized as follows. First, we discuss the synoptic situation surrounding the 23-24 June PRE-STORM MCS, followed by a discussion of the model setup and results of our simulation. We then discuss the use of our MCS simulations in developing a GCM parameterization for mesoscale flow branches and summarize our results.

  18. Numerically simulating the sandwich plate system structures

    NASA Astrophysics Data System (ADS)

    Feng, Guo-Qing; Li, Gang; Liu, Zhi-Hui; Niu, Huai-Lei; Li, Chen-Feng

    2010-09-01

    Sandwich plate systems (SPS) are advanced materials that have begun to receive extensive attention in naval architecture and ocean engineering. At present, according to the rules of classification societies, a mixture of shell and solid elements is required to simulate an SPS. Based on the principle of stiffness decomposition, a new numerical simulation method using shell elements was proposed. In accordance with the principle of stiffness decomposition, the total stiffness can be decomposed into the bending stiffness and the shear stiffness. The displacement and stress response related to bending stiffness was calculated with the laminated shell element. The displacement and stress response due to shear was calculated with a computational code written in FORTRAN. The total displacement and stress response of the SPS was then obtained by adding these two parts together. Finally, a rectangular SPS plate and a double-bottom structure were used for a simulation. The results show that the deflection simulated by the proposed elements is larger than that simulated by solid elements and the analytical solution according to Hoff theory, and close to that simulated by the mixture of shell and solid elements; the stress simulated by the proposed elements is close to that of the other simulation methods. Compared with calculations based on a mixture of shell and solid elements, the numerical simulation method given in the paper is therefore more efficient and easier to apply.
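
    The stiffness-decomposition principle (total response = bending part + shear part) can be illustrated on a one-dimensional analogue, a simply supported sandwich beam under uniform load, using standard beam formulas. The load and stiffness values below are hypothetical, and the paper itself applies the decomposition to laminated shell elements rather than beams:

```python
def sandwich_beam_deflection(q, L, D, S):
    """Midspan deflection of a simply supported sandwich beam under a
    uniform load q, split into bending and shear contributions.

    D -- bending stiffness, S -- shear stiffness
    """
    w_bending = 5.0 * q * L**4 / (384.0 * D)  # classical bending term
    w_shear = q * L**2 / (8.0 * S)            # additional shear term
    return w_bending + w_shear, w_bending, w_shear

# Hypothetical SPS-like properties (SI units).
total, wb, ws = sandwich_beam_deflection(q=10e3, L=2.0, D=1.2e6, S=4.0e7)
# A pure bending model would report only wb, underestimating the total
# deflection, which mirrors why the SPS response needs both contributions.
```

    Superposing the two parts is valid here because each is computed from the same load with its own stiffness, which is the essence of the decomposition the paper exploits.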

  19. High Resolution Simulation of a Colorado Rockies Extreme Snow and Rain Event in both a Current and Future Climate

    NASA Astrophysics Data System (ADS)

    Rasmussen, Roy; Ikeda, Kyoko; Liu, Changhai; Gutmann, Ethan; Gochis, David

    2016-04-01

    Modeling of extreme weather events often requires very finely resolved treatment of atmospheric circulation structures in order to reproduce and localize the large moisture fluxes that result in extreme precipitation. This is particularly true for cool-season orographic precipitation processes, where the representation of the landform can significantly impact vertical velocity profiles and cloud moisture entrainment rates. This study presents results from a high-resolution regional climate modeling study of the Colorado Headwaters region using an updated version of the Weather Research and Forecasting (WRF) model run at 4 km horizontal resolution and a hydrological extension package called WRF-Hydro. Previous work has shown that the WRF modeling system can produce credible depictions of winter orographic precipitation over the Colorado Rockies if run at horizontal resolutions < 6 km. Here we present results from a detailed study of an extreme springtime snowfall event that occurred along the Colorado Front Range in March 2003. The impact of warming on total precipitation, snow-rain partitioning and surface hydrological fluxes (evapotranspiration and runoff) is discussed in the context of how potential changes in temperature affect the amount of precipitation, the phase of precipitation (rain vs. snow) and the timing and amplitude of streamflow responses. Using the Pseudo Global Warming technique, the results show that intense precipitation rates increase significantly during the event and that a significant fraction of the snowfall converts to rain, which amplifies the runoff response from one in which runoff is produced gradually to one in which runoff is rapidly translated into streamflow values that approach significant flooding risk. Results from a new, CONUS-scale high-resolution climate simulation of extreme events in a current and future climate will be presented as time permits.

  20. Behavioral and Physiological Responses of Calves to Marshalling and Roping in a Simulated Rodeo Event

    PubMed Central

    Sinclair, Michelle; Keeley, Tamara; Lefebvre, Anne-Cecile; Phillips, Clive J. C.

    2016-01-01

    Simple Summary Rodeos often include a calf roping event, in which calves are first lassoed by a rider on a horse, who then dismounts, ties the calf's legs, lifts it from the ground and releases it back to the ground. We tested whether calves that were familiar with roping experience stress during the roping event, and found increased concentrations of stress hormones in their blood after the roping. We also found increased concentrations of stress hormones in the blood of calves that had never been roped before but were just marshalled across the arena by the horse and rider. We conclude that the roping event in rodeos is stressful for both experienced and naïve calves. Abstract Rodeos are public events at which stockpeople face tests of their ability to manage cattle and horses, some of which relate directly to rangeland cattle husbandry. One of these is calf roping, in which a calf released from a chute is pursued by a horse and rider, who lassoes, lifts and drops the calf to the ground and finally ties it around the legs. Measurements were made of the behavior and stress responses of ten rodeo-naïve calves marshalled by a horse and rider, and ten rodeo-experienced calves that were roped. Naïve calves marshalled by a horse and rider traversed the arena slowly, whereas rodeo-experienced calves ran rapidly until roped. Each activity was repeated once after two hours. Blood samples taken before and after each activity demonstrated increased cortisol, epinephrine and nor-epinephrine in both groups. However, there was no evidence of a continued increase in stress hormones in either group by the start of the repeated activity, suggesting that the elevated stress hormones were not a response to a prolonged effect of the initial blood sampling. It is concluded that both the marshalling of calves naïve to the roping chute by stockpeople and the roping and dropping of experienced calves are stressful in a simulated rodeo calf roping event. PMID:27136590

  1. Automatically Recognizing Medication and Adverse Event Information From Food and Drug Administration’s Adverse Event Reporting System Narratives

    PubMed Central

    Polepalli Ramesh, Balaji; Belknap, Steven M; Li, Zuofeng; Frid, Nadya; West, Dennis P

    2014-01-01

    Background The Food and Drug Administration’s (FDA) Adverse Event Reporting System (FAERS) is a repository of spontaneously-reported adverse drug events (ADEs) for FDA-approved prescription drugs. FAERS reports include both structured reports and unstructured narratives. The narratives often include essential information for evaluation of the severity, causality, and description of ADEs that is not present in the structured data. The timely identification of unknown toxicities of prescription drugs is an important, unsolved problem. Objective The objective of this study was to develop an annotated corpus of FAERS narratives and a biomedical named entity tagger to automatically identify ADE-related information in the FAERS narratives. Methods We developed an annotation guideline and annotated medication information and adverse event related entities in 122 FAERS narratives comprising approximately 23,000 word tokens. A named entity tagger using supervised machine learning approaches was built for detecting medication information and adverse event entities using various categories of features. Results The annotated corpus had an agreement of over 0.9 Cohen’s kappa for medication and adverse event entities. The best performing tagger achieves an overall performance of 0.73 F1 score for detection of medication, adverse event and other named entities. Conclusions In this study, we developed an annotated corpus of FAERS narratives and machine learning based models for automatically extracting medication and adverse event information from the FAERS narratives. Our study is an important step towards enriching the FAERS data for postmarketing pharmacovigilance. PMID:25600332
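
    The entity-detection output of such a tagger is commonly represented with per-token BIO labels; the sketch below shows how labeled tokens collapse into entity spans. The tokens and labels are invented, and the study's tagger learns the labels with supervised machine learning rather than taking them as given:

```python
def bio_to_spans(tokens, tags):
    """Collapse per-token BIO tags into (entity_type, text) spans."""
    spans, current, ctype = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):           # a new entity begins
            if current:
                spans.append((ctype, " ".join(current)))
            current, ctype = [tok], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == ctype:
            current.append(tok)            # entity continues
        else:                              # outside any entity
            if current:
                spans.append((ctype, " ".join(current)))
            current, ctype = [], None
    if current:
        spans.append((ctype, " ".join(current)))
    return spans

# Invented narrative fragment with gold BIO labels.
tokens = ["Patient", "took", "ibuprofen", "and", "developed", "acute", "rash"]
tags   = ["O", "O", "B-DRUG", "O", "O", "B-ADE", "I-ADE"]
entities = bio_to_spans(tokens, tags)
# -> [("DRUG", "ibuprofen"), ("ADE", "acute rash")]
```

    Span-level agreement and F1 scores like those reported above are computed over exactly such recovered spans, not over individual tokens.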

  2. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs of a network simulator, to determine if these two approaches produce significantly different deadlock detection results for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or to determine modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interactions. The paper reviews explanations for these results and implications for modeling and simulation.
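
    The t-way coverage idea can be sketched for t = 2 with a naive greedy generator. The parameter names and domains below are invented, and real covering-array tools use far more efficient algorithms than this exhaustive greedy search:

```python
from itertools import combinations, product

def pairwise_tests(domains):
    """Greedily build a test set covering every 2-way value combination.

    domains -- dict mapping parameter name -> list of possible values
    """
    params = sorted(domains)
    # All (param, value) pair combinations that must be covered.
    uncovered = {
        ((p1, v1), (p2, v2))
        for p1, p2 in combinations(params, 2)
        for v1 in domains[p1] for v2 in domains[p2]
    }
    tests = []
    while uncovered:
        # Pick the full configuration covering the most uncovered pairs.
        best, best_covered = None, set()
        for values in product(*(domains[p] for p in params)):
            test = dict(zip(params, values))
            covered = {
                pair for pair in uncovered
                if all(test[p] == v for p, v in pair)
            }
            if len(covered) > len(best_covered):
                best, best_covered = test, covered
        tests.append(best)
        uncovered -= best_covered
    return tests

# Hypothetical network configuration parameters.
domains = {"topology": ["ring", "star"], "buffer": [1, 4], "retries": [0, 2, 5]}
suite = pairwise_tests(domains)
# Exhaustive testing would need 2 * 2 * 3 = 12 configurations; the pairwise
# suite is smaller while still covering every 2-way interaction.
```

    The gap between the suite size and the full cartesian product is what makes t-way generation attractive when most failures involve only a few interacting parameters.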

  3. Communication Simulations for Power System Applications

    SciTech Connect

    Fuller, Jason C.; Ciraci, Selim; Daily, Jeffrey A.; Fisher, Andrew R.; Hauer, Matthew L.

    2013-05-29

    New smart grid technologies and concepts, such as dynamic pricing, demand response, dynamic state estimation, and wide area monitoring, protection, and control, are expected to require considerable communication resources. As the cost of retrofit can be high, future power grids will require the integration of high-speed, secure connections with legacy communication systems, while still providing adequate system control and security. While considerable work has been performed to create co-simulators for the power domain with load models and market operations, limited work has been performed in integrating communications directly into a power domain solver. The simulation of communication and power systems will become more important as the two systems become more inter-related. This paper will discuss ongoing work at Pacific Northwest National Laboratory to create a flexible, high-speed power and communication system co-simulator for smart grid applications. The framework for the software will be described, including architecture considerations for modular, high performance computing and large-scale scalability (serialization, load balancing, partitioning, cross-platform support, etc.). The current simulator supports the ns-3 (telecommunications) and GridLAB-D (distribution systems) simulators. Ongoing and future work will be described, including planned future expansions for a traditional transmission solver. A test case using the co-simulator, utilizing a transactive demand response system created for the Olympic Peninsula and AEP gridSMART demonstrations, requiring two-way communication between distributed and centralized market devices, will be used to demonstrate the value and intended purpose of the co-simulation environment.

  4. Simulation and intelligent vehicle highway systems

    SciTech Connect

    Rathi, A.K.; Santiago, A.J.

    1992-01-01

    Intelligent Vehicle Highway Systems (IVHS) is based on the premise of using advanced technologies in telecommunication, electronics, and computers to improve the nature and quality of highway travel while making it safer and more efficient. The safety benefits of the IVHS systems are unquestioned; however, there are different levels of optimism about the operational benefits of these systems. While there is a broad consensus that IVHS can improve the flow of traffic, and thus mobility, currently there is very limited empirical evidence or analytical basis to support this optimism. The lack of analytical framework for design, analysis, and evaluation of IVHS concepts will continue to fuel the debate between the skeptics and the advocates of IVHS. Computer simulation is likely to play a major role in the analysis and assessment of the IVHS technologies. In this paper, we attempt to identify the simulation modelling needs to support the IVHS functional areas dealing with traffic flow on highway networks. The paper outlines the envisioned IVHS operational environment. Functional requirements for the simulation modelling system that could be used to support the development and testing of IVHS concepts, namely Advanced Traffic Management Systems (ATMS) and Advanced Traveller Information Systems (ATIS), are defined. Simulation modelling research and development needs to support the design and evaluations of IVHS concepts are described. The paper concludes by presenting on-going work on the traffic simulation models at the Oak Ridge National Laboratory.

  5. Simulation and intelligent vehicle highway systems

    SciTech Connect

    Rathi, A.K.; Santiago, A.J.

    1992-09-01

    Intelligent Vehicle Highway Systems (IVHS) is based on the premise of using advanced technologies in telecommunication, electronics, and computers to improve the nature and quality of highway travel while making it safer and more efficient. The safety benefits of the IVHS systems are unquestioned; however, there are different levels of optimism about the operational benefits of these systems. While there is a broad consensus that IVHS can improve the flow of traffic, and thus mobility, currently there is very limited empirical evidence or analytical basis to support this optimism. The lack of analytical framework for design, analysis, and evaluation of IVHS concepts will continue to fuel the debate between the skeptics and the advocates of IVHS. Computer simulation is likely to play a major role in the analysis and assessment of the IVHS technologies. In this paper, we attempt to identify the simulation modelling needs to support the IVHS functional areas dealing with traffic flow on highway networks. The paper outlines the envisioned IVHS operational environment. Functional requirements for the simulation modelling system that could be used to support the development and testing of IVHS concepts, namely Advanced Traffic Management Systems (ATMS) and Advanced Traveller Information Systems (ATIS), are defined. Simulation modelling research and development needs to support the design and evaluations of IVHS concepts are described. The paper concludes by presenting on-going work on the traffic simulation models at the Oak Ridge National Laboratory.

  6. Simulation Of A Photofission-Based Cargo Interrogation System

    SciTech Connect

    King, Michael; Gozani, Tsahi; Stevenson, John; Shaw, Timothy

    2011-06-01

    A comprehensive model has been developed to characterize and optimize the detection of Bremsstrahlung x-ray induced fission signatures from nuclear materials hidden in cargo containers. An effective active interrogation system should not only induce a large number of fission events but also efficiently detect their signatures. The proposed scanning system utilizes a 9-MV commercially available linear accelerator and the detection of strong fission signals, i.e. delayed gamma rays and prompt neutrons. Because the scanning system is complex and the cargo containers are large and often highly attenuating, the simulation method segments the model into several physical steps, representing each change of radiation particle. Each approximation is carried out separately, resulting in a major reduction in computational time and a significant improvement in tally statistics. The model investigates the effect of various cargo types, densities and distributions on the fission rate and detection rate. Hydrogenous and metallic cargoes, homogeneous and heterogeneous, as well as various locations of the nuclear material inside the cargo container were studied. We show that for the photofission-based interrogation system simulation, the final results are in good agreement not only with a full, single-step simulation but also with experimental results, further validating the full-system simulation.

  7. Single-event response of the SiGe HBT in TCAD simulations and laser microbeam experiment

    NASA Astrophysics Data System (ADS)

    Li, Pei; Guo, Hong-Xia; Guo, Qi; Zhang, Jin-Xin; Xiao, Yao; Wei, Ying; Cui, Jiang-Wei; Wen, Lin; Liu, Mo-Han; Wang, Xin

    2015-08-01

    In this paper the single-event responses of silicon germanium heterojunction bipolar transistors (SiGe HBTs) are investigated by TCAD simulations and a laser microbeam experiment. A three-dimensional (3D) simulation model is established and the single event effect (SEE) simulation is carried out on the basis of SiGe HBT devices; then, together with the laser microbeam test, the charge collection behaviors are analyzed, including the single event transient (SET) induced transient terminal currents and the sensitive area of SEE charge collection. The simulation and experimental results are discussed in detail, and it is demonstrated that the nature of the current transient is controlled by the behavior of the collector-substrate (C/S) junction and charge collection by sensitive electrodes, thereby identifying the sensitive area and electrode of the SiGe HBT in SEE. Project supported by the National Natural Science Foundation of China (Grant No. 61274106).

  8. An Open Source Simulation System

    NASA Technical Reports Server (NTRS)

    Slack, Thomas

    2005-01-01

    This is an investigation into the current state of the art of open-source real-time programming practices. The document covers what technologies are available, how easy it is to obtain, configure, and use them, and some performance measures made on the different systems. A matrix of vendors and their products is included as part of this investigation, but it is not an exhaustive list and represents only a snapshot in time of a field that is changing rapidly. Specifically, three approaches were investigated: 1. Completely open source on generic hardware, downloaded from the net. 2. Open source packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded proprietary source-available software provided by the vendor for our evaluation.

  9. Rare-event Simulation for Stochastic Korteweg-de Vries Equation

    SciTech Connect

    Xu, Gongjun; Lin, Guang; Liu, Jingchen

    2014-01-01

    An asymptotic analysis of the tail probabilities for the dynamics of a soliton wave $U(x,t)$ under a stochastic time-dependent force is developed. The dynamics of the soliton wave $U(x,t)$ is described by the Korteweg-de Vries equation with homogeneous Dirichlet boundary conditions under a stochastic time-dependent force, which is modeled as a time-dependent Gaussian noise with amplitude $\epsilon$. The tail probability considered is $w(b) := P(\sup_{t\in [0,T]} U(x,t) > b)$ as $b\rightarrow \infty$, for some constant $T>0$ and a fixed $x$, which can be interpreted as the tail probability of the amplitude of a water wave on the shallow surface of a fluid or of a long internal wave in a density-stratified ocean. Our goal is to characterize the asymptotic behavior of $w(b)$ and to evaluate the tail probability of the event that the soliton wave exceeds a certain threshold value under a random force term. Such rare-event calculation of $w(b)$ is very useful for fast estimation of the risk of the potential damage that could be caused by the water wave in a density-stratified ocean modeled by the stochastic KdV equation. In this work, the asymptotic approximation of the probability that the soliton wave exceeds a high level $b$ is derived. In addition, we develop a provably efficient rare-event simulation algorithm to compute $w(b)$. The efficiency of the algorithm requires only mild conditions, and therefore it is applicable to a general class of Gaussian processes and many diverse applications.
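
    As a toy illustration of the family of rare-event estimators the abstract refers to, the sketch below uses a mean-shifted mixture proposal to estimate P(max_i X_i > b) for a discretized stationary Gaussian process. The covariance kernel, grid, and threshold are invented for illustration; this is not the authors' algorithm for the stochastic KdV equation.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Hypothetical discretized Gaussian process: OU covariance on a coarse grid.
n = 20
t = np.linspace(0.0, 1.0, n)
cov = np.exp(-np.abs(t[:, None] - t[None, :]) / 0.2)
b = 4.0  # high exceedance threshold

def estimate_tail(num_samples=2000):
    """Importance-sampling estimate of w(b) = P(max_i X_i > b).

    Proposal: pick a grid index I uniformly, then draw from the Gaussian
    with mean E[X | X_I = b] -- a mixture proposal that concentrates
    samples on paths that actually cross the level b.
    """
    means = b * cov  # column i is E[X | X_i = b] because diag(cov) == 1
    idx = rng.integers(0, n, size=num_samples)
    chol = np.linalg.cholesky(cov)
    x = means[:, idx].T + rng.standard_normal((num_samples, n)) @ chol.T
    log_p = multivariate_normal(mean=np.zeros(n), cov=cov).logpdf(x)
    log_mix = np.stack([
        multivariate_normal(mean=means[:, i], cov=cov).logpdf(x)
        for i in range(n)
    ])
    log_q = logsumexp(log_mix, axis=0) - np.log(n)
    weights = np.exp(log_p - log_q) * (x.max(axis=1) > b)
    return weights.mean()
```

    Because every proposal draw is pushed up to the level b somewhere on the grid, the indicator is rarely zero and the weighted estimator has far lower variance than naive Monte Carlo, which would need on the order of 1/w(b) samples to see even one exceedance.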

  10. Reduced salinity increases susceptibility of zooxanthellate jellyfish to herbicide toxicity during a simulated rainfall event.

    PubMed

    Klein, Shannon G; Pitt, Kylie A; Carroll, Anthony R

    2016-02-01

    Accurately predicting how marine biota are likely to respond to changing ocean conditions requires accurate simulation of interacting stressors, exposure regimes and recovery periods. Jellyfish populations have increased in some parts of the world and, despite few direct empirical tests, are hypothesised to be increasing because they are robust to a range of environmental stressors. Here, we investigated the effects of contaminated runoff on a zooxanthellate jellyfish by exposing juvenile Cassiopea sp. medusae to a photosystem II (PSII) herbicide, atrazine, and to the reduced salinity conditions that occur following rainfall. Four levels of atrazine (0 ng/L, 10 ng/L, 2 μg/L, 20 μg/L) and three levels of salinity (35 ppt, 25 ppt, 17 ppt) were varied, mimicking the timeline of light, moderate and heavy rainfall events. Normal conditions were then slowly re-established over four days to mimic the recovery of the ecosystem post-rain, and the experiment continued for a further 7 days to observe potential recovery of the medusae. Pulse-amplitude modulated (PAM) chlorophyll fluorescence, growth and bell contraction rates of medusae were measured. Medusae exposed to the combination of high atrazine and lowest salinity died. After 3 days of exposure, bell contraction rates were reduced by 88% and medusae were 16% smaller in the lowest salinity treatments. By Day 5 of the experiment, all medusae that survived the initial pulse event began to recover quickly. Although atrazine decreased effective quantum yield (YII) under normal salinity conditions, YII was further reduced when medusae were exposed to both low salinity and atrazine simultaneously. Atrazine breakdown products were more concentrated in jellyfish tissues than atrazine at the end of the experiment, suggesting that although bioaccumulation occurred, atrazine was metabolised. Our results suggest that reduced salinity may increase the susceptibility of medusae to herbicide exposure during heavy rainfall events. PMID:26647170

  11. Lunar Rocks: Available for Year of the Solar System Events

    NASA Astrophysics Data System (ADS)

    Allen, J. S.

    2010-12-01

    sections may be requested for use in college and university courses where petrographic microscopes are available for viewing. Requestors should contact Ms. Mary Luckey, Education Sample Curator. Email address: mary.k.luckey@nasa.gov NASA also loans sets of Moon rocks for use in classrooms, libraries, museums, and planetariums through the Lunar Sample Education Program. Lunar samples (three soils and three rocks) are encapsulated in a six-inch diameter clear plastic disk. A CD with PowerPoint presentations, analogue samples from Earth, a classroom activity guide, and additional printed material accompany the disks. Educators may qualify for the use of these disks by attending a content and security certification workshop sponsored by NASA's Aerospace Education Services Program (AESP). Contact Ms. Margaret Maher, AESP Director. Email address: mjm67@psu.edu NASA makes these precious samples available to the public and encourages the use of lunar rocks to highlight Year of the Solar System events. Surely these interesting specimens of another world will enhance the experience of all YSS participants, so please take advantage of these lunar samples and borrow them for events and classes.

  12. Behavioral and Physiological Responses of Calves to Marshalling and Roping in a Simulated Rodeo Event.

    PubMed

    Sinclair, Michelle; Keeley, Tamara; Lefebvre, Anne-Cecile; Phillips, Clive J C

    2016-01-01

    Rodeos are public events at which stockpeople face tests of their ability to manage cattle and horses, some of which relate directly to rangeland cattle husbandry. One of these is calf roping, in which a calf released from a chute is pursued by a horse and rider, who lassoes, lifts and drops the calf to the ground and finally ties it around the legs. Measurements were made of behavior and stress responses of ten rodeo-naïve calves marshalled by a horse and rider, and ten rodeo-experienced calves that were roped. Naïve calves marshalled by a horse and rider traversed the arena slowly, whereas rodeo-experienced calves ran rapidly until roped. Each activity was repeated once after two hours. Blood samples taken before and after each activity demonstrated increased cortisol, epinephrine and nor-epinephrine in both groups. However, there was no evidence of a continued increase in stress hormones in either group by the start of the repeated activity, suggesting that the elevated stress hormones were not a response to a prolonged effect of the initial blood sampling. It is concluded that both the marshalling of calves naïve to the roping chute by stockpeople and the roping and dropping of experienced calves are stressful in a simulated rodeo calf roping event. PMID:27136590

  13. Applications Of Monte Carlo Radiation Transport Simulation Techniques For Predicting Single Event Effects In Microelectronics

    SciTech Connect

    Warren, Kevin; Reed, Robert; Weller, Robert; Mendenhall, Marcus; Sierawski, Brian; Schrimpf, Ronald

    2011-06-01

    MRED (Monte Carlo Radiative Energy Deposition) is Vanderbilt University's Geant4 application for simulating radiation events in semiconductors. Geant4 comprises the best available computational physics models for the transport of radiation through matter. In addition to basic radiation transport physics contained in the Geant4 core, MRED has the capability to track energy loss in tetrahedral geometric objects, includes a cross section biasing and track weighting technique for variance reduction, and additional features relevant to semiconductor device applications. The crucial element of predicting Single Event Upset (SEU) parameters using radiation transport software is the creation of a dosimetry model that accurately approximates the net collected charge at transistor contacts as a function of deposited energy. The dosimetry technique described here is the multiple sensitive volume (MSV) model. It is shown to be a reasonable approximation of the charge collection process and its parameters can be calibrated to experimental measurements of SEU cross sections. The MSV model, within the framework of MRED, is examined for heavy ion and high-energy proton SEU measurements of a static random access memory.
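
    The multiple-sensitive-volume idea can be sketched in a few lines: collected charge is modeled as an efficiency-weighted sum of the energy deposited in each volume, converted to charge via the ~3.6 eV-per-pair ionization energy of silicon (roughly 22.5 MeV of deposited energy per pC). The volume efficiencies and critical charge below are invented placeholders, not calibrated MRED parameters.

```python
MEV_PER_PC = 22.5  # ~3.6 eV per e-h pair in Si => ~22.5 MeV deposited per pC

def collected_charge_pc(deposits_mev, efficiencies):
    """MSV model: charge at the contact is an efficiency-weighted sum of
    the energy deposited in each sensitive volume."""
    return sum(a * e for a, e in zip(efficiencies, deposits_mev)) / MEV_PER_PC

def is_upset(deposits_mev, efficiencies, q_crit_pc):
    """Flag a single event upset when collected charge exceeds the
    critical charge of the struck node."""
    return collected_charge_pc(deposits_mev, efficiencies) > q_crit_pc

# Example: three nested volumes with decreasing collection efficiency.
deposits = [11.25, 22.5, 45.0]   # MeV deposited per volume (hypothetical)
alphas = [1.0, 0.5, 0.1]         # collection efficiencies (hypothetical)
q = collected_charge_pc(deposits, alphas)  # (11.25 + 11.25 + 4.5)/22.5 = 1.2 pC
```

    In a calibrated model the efficiencies and critical charge would be fitted so that simulated upset counts reproduce measured SEU cross-section curves.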

  14. An RCM-E simulation of a steady magnetospheric convection event

    NASA Astrophysics Data System (ADS)

    Yang, J.; Toffoletto, F.; Wolf, R.; Song, Y.

    2009-12-01

    We present simulation results of an idealized steady magnetospheric convection (SMC) event using the Rice Convection Model coupled with an equilibrium magnetic field solver (RCM-E). The event is modeled by placing a plasma distribution with substantially depleted entropy parameter PV5/3 on the RCM's high latitude boundary. The calculated magnetic field shows a highly depressed configuration due to the enhanced westward current around geosynchronous orbit, where the resulting partial ring current is stronger and more symmetric than in a typical substorm growth phase. The magnitude of the BZ component in the mid plasma sheet is large compared to empirical magnetic field models. Contrary to some previous results, there is no deep BZ minimum in the near-Earth plasma sheet. This suggests that the magnetosphere could transition into a strong adiabatic earthward convection mode without significant stretching of the plasma-sheet magnetic field, when flux tubes with depleted plasma content continuously enter the inner magnetosphere from the mid-tail. Virtual AU/AL and Dst indices are also calculated using a synthetic magnetogram code and are compared to typical features in published observations.

  15. Observing System Simulation Experiments: An Overview

    NASA Technical Reports Server (NTRS)

    Prive, Nikki C.; Errico, Ronald M.

    2016-01-01

    An overview of Observing System Simulation Experiments (OSSEs) will be given, with a focus on calibration and validation of OSSE frameworks. Pitfalls and practical considerations will be discussed, including observation error characteristics, incestuousness, and experimental design. The potential use of OSSEs for investigating the behaviour of data assimilation systems will be explored, including some results from experiments using the NASA GMAO OSSE.

  16. Space shuttle visual simulation system design study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The current and near-future state-of-the-art in visual simulation equipment technology is related to the requirements of the space shuttle visual system. Image source, image sensing, and displays are analyzed on a subsystem basis, and the principal conclusions are used in the formulation of a recommended baseline visual system. Perceptibility and visibility are also analyzed.

  17. System simulation and verification facility (SSVF)

    NASA Astrophysics Data System (ADS)

    Irvine, M. M.; Bégin, M.-E.; Eickhoff, J.; de Kruyf, J.

    2002-07-01

    Systems engineering tools can be used in conjunction with concurrent engineering techniques to significantly reduce the cost, schedule and risk of space mission design, development and operation. This paper describes the System Simulation and Verification Facility (SSVF) project performed by the VEGA Group PLC and ASTRIUM GmbH (formerly Dornier Satellitensysteme GmbH) for the European Space Agency's European Space Technology Centre (ESTEC). The SSVF concept integrates a high fidelity hard real-time simulator, a checkout system, a control system and a common mission information base. These components are highly configurable to enable them to support different parts of the mission lifecycle. An SSVF simulator incorporates environment, dynamics and equipment models, and interfaces to hardware, onboard software algorithms, the onboard software itself and onboard processors. SSVF is able to support what-if analyses during design and development activities, and allows the overall system hardware and software design to be validated much earlier in the mission development lifecycle than is currently the case. SSVF also supports hybrid simulations: part software models and part real hardware (breadboard, engineering or flight models). As they become available, these can be tested and validated in a high-fidelity simulated environment. The SSVF project has been performed in two phases.

  18. Thermal enclosure system functional simulation user's manual

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    1994-01-01

    A form and function simulation of the thermal enclosure system (TES) for a microgravity protein crystal growth experiment has been developed as part of an investigation of the benefits and limitations of intravehicular telerobotics to aid in microgravity science and production. A user can specify the time, temperature, and sample rate profile for a given experiment, and menu options and status are presented on an LCD display. This report describes the features and operational procedures for the functional simulation.
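
    A time-temperature setpoint profile of the kind the simulation accepts can be represented as a piecewise-linear schedule; the sketch below (profile values invented, not from the TES specification) interpolates the commanded temperature at an arbitrary time.

```python
from bisect import bisect_right

def setpoint(profile, t):
    """Piecewise-linear interpolation of a [(time, temperature), ...]
    profile, held constant outside the profiled interval."""
    times = [p[0] for p in profile]
    temps = [p[1] for p in profile]
    if t <= times[0]:
        return temps[0]
    if t >= times[-1]:
        return temps[-1]
    i = bisect_right(times, t)
    frac = (t - times[i - 1]) / (times[i] - times[i - 1])
    return temps[i - 1] + frac * (temps[i] - temps[i - 1])

# Hypothetical profile: hold at 22 C, then ramp down to 4 C over an hour.
profile = [(0.0, 22.0), (60.0, 22.0), (120.0, 4.0)]
```

    A simulation loop would sample `setpoint(profile, t)` at the user-chosen sample rate and drive the displayed enclosure temperature toward it.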

  19. AMCOM RDEC ladar HWIL simulation system development

    NASA Astrophysics Data System (ADS)

    Kim, Hajin J.; Mobley, Scottie B.; Buford, James A., Jr.

    2003-09-01

    Hardware-in-the-loop (HWIL) testing has, for many years, been an integral part of the modeling and simulation efforts at the U.S. Army Aviation and Missile Command's (AMCOM) Aviation and Missile Research, Engineering, and Development Center (AMRDEC). AMCOM's history includes the development, characterization, and implementation of several unique technologies for the creation of synthetic environments in the visible, infrared, and radio frequency spectral regions, and AMCOM has continued significant efforts in these areas. This paper describes recent advancements at AMCOM's Advanced Simulation Center (ASC) and concentrates on ladar HWIL simulation system development.

  20. Direct Numerical Simulation of Fluid Driven Geomechanical Events during Carbon Sequestration

    NASA Astrophysics Data System (ADS)

    Morris, J. P.; Johnson, S. M.

    2008-12-01

    Geologic storage of supercritical CO2 introduces significant stress perturbations into the target formation and overburden. In some cases, this could activate existing fractures and faults, or drive new fractures through the caprock. We will present results of a recent investigation to identify conditions that will activate existing fractures/faults or make new fractures within the caprock using a range of computational tools. Understanding the geomechanical sources of risk to successful CO2 containment involves a wide range of scales. For example, at the largest scale, bounding fault stability must be considered. Many proposed sequestration targets are bounded by impermeable fault zones that are presumed to become flow paths if they slip. Such geologic features are activated at the field scale by pore pressure elevations. In contrast, fluid driven fracturing events that may introduce new flow paths across a caprock occur at smaller scales. Determining whether the creation of new fractures in the caprock leads to CO2 leakage, however, requires knowledge of how the new fracture intersects and interacts with prior networks of fractures within the caprock. A range of computational tools have been developed at LLNL to consider these scenarios at the most appropriate scales and including the most appropriate physical models. For example, the Livermore Distinct Element Code (LDEC) has been used to simulate the mechanical deformation of fracture networks extending up to 100m. Frac-HMC was developed to evaluate network permeability evolution at the scale of many tens of meters incorporating both mechanical and chemical effects. Massively parallel, non-linear continuum codes that use a smeared fault representation have also been developed to predict activation of multiple faults at the field scale. Most recently, LDEC has been extended to simulate hydraulic driven fracture events at the scale of individual fractures. We will present results spanning the range of small scale

  1. Classification of single-trial auditory events using dry-wireless EEG during real and motion simulated flight.

    PubMed

    Callan, Daniel E; Durantin, Gautier; Terzibas, Cengiz

    2015-01-01

    Application of neuro-augmentation technology based on dry-wireless EEG may be considerably beneficial for aviation and space operations because of the inherent dangers involved. In this study we evaluate classification performance of perceptual events using a dry-wireless EEG system during motion platform based flight simulation and actual flight in an open cockpit biplane to determine if the system can be used in the presence of considerable environmental and physiological artifacts. A passive task involving 200 random auditory presentations of a chirp sound was used for evaluation. The advantage of this auditory task is that it does not interfere with the perceptual motor processes involved with piloting the plane. Classification was based on identifying the presentation of a chirp sound vs. silent periods. Evaluation of Independent component analysis (ICA) and Kalman filtering to enhance classification performance by extracting brain activity related to the auditory event from other non-task related brain activity and artifacts was assessed. The results of permutation testing revealed that single trial classification of presence or absence of an auditory event was significantly above chance for all conditions on a novel test set. The best performance could be achieved with both ICA and Kalman filtering relative to no processing: Platform Off (83.4% vs. 78.3%), Platform On (73.1% vs. 71.6%), Biplane Engine Off (81.1% vs. 77.4%), and Biplane Engine On (79.2% vs. 66.1%). This experiment demonstrates that dry-wireless EEG can be used in environments with considerable vibration, wind, acoustic noise, and physiological artifacts and achieve good single trial classification performance that is necessary for future successful application of neuro-augmentation technology based on brain-machine interfaces. PMID:25741249
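
    A minimal sketch of the single-trial pipeline the study evaluates, ICA unmixing followed by a linear classifier, run here on synthetic "EEG" epochs. The data generator, channel count, and feature choice are invented for illustration, and the study's actual preprocessing also included Kalman filtering.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_epochs, n_channels, n_times = 200, 8, 32

# Synthetic epochs: "chirp" trials carry an evoked bump mixed into channels.
labels = rng.integers(0, 2, n_epochs)  # 1 = chirp presented, 0 = silence
evoked = np.exp(-0.5 * ((np.arange(n_times) - 16) / 3.0) ** 2)
mixing = rng.standard_normal(n_channels)
epochs = rng.standard_normal((n_epochs, n_channels, n_times))
epochs += labels[:, None, None] * mixing[:, None] * evoked

# Unmix with ICA (fit on concatenated samples), then classify single trials.
ica = FastICA(n_components=n_channels, random_state=0, max_iter=1000)
samples = epochs.transpose(0, 2, 1).reshape(-1, n_channels)
sources = ica.fit_transform(samples).reshape(n_epochs, -1)
accuracy = cross_val_score(
    LinearDiscriminantAnalysis(), sources, labels, cv=5).mean()
```

    With real dry-wireless recordings the ICA step is what isolates the auditory response from vibration and muscle artifacts; the cross-validated accuracy plays the role of the single-trial classification rates reported in the abstract.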

  3. River system environmental modeling and simulation methodology

    SciTech Connect

    Rao, N.B.

    1981-01-01

    Several computer models have been built to examine pollution in rivers. However, the current state of the art in this field emphasizes problem solving using specific programs; a general methodology for building and simulating models of river systems is lacking. Thus, the purpose of this research was to develop a methodology that can be used to conceptualize, visualize, construct, and analyze, through simulation, models of pollution in river systems. The conceptualization and visualization of these models was facilitated through a network representation. The implementation of the models was accomplished using the capabilities of an existing simulation language, GASP V. The methodology also provides data management facilities for model outputs through the use of the Simulation Data Language (SDL), and high quality plotting facilities through the use of the graphics package DISSPLA (Display Integrated Software System and Plotting Language). Using this methodology, a river system is modeled as consisting of certain elements, namely reaches, junctions, dams, reservoirs, withdrawals and pollutant sources. All these elements of the river system are described in a standard form which has been implemented on a computer. This model, when executed, produces spatial and temporal distributions of the pollutants in the river system. Furthermore, these outputs can be stored in a database and used to produce high quality plots. The result of this research is a methodology for building, implementing and examining the results of models of pollution in river systems.
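
    The reach/junction network structure described above lends itself to a simple steady-state routing sketch: process reaches in upstream-to-downstream order, add point-source loads, and apply first-order decay within each reach. The names and the one-parameter decay model are illustrative only; this is not GASP V or the paper's methodology.

```python
def topo_order(downstream):
    """Upstream-to-downstream (topological) ordering of reaches."""
    indegree = {r: 0 for r in downstream}
    for nxt in downstream.values():
        if nxt is not None:
            indegree[nxt] += 1
    frontier = [r for r, d in indegree.items() if d == 0]
    order = []
    while frontier:
        r = frontier.pop()
        order.append(r)
        nxt = downstream[r]
        if nxt is not None:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                frontier.append(nxt)
    return order

def route_pollutant(downstream, sources, survival):
    """Steady-state pollutant load (e.g. kg/day) leaving each reach.

    downstream: reach -> next reach (None at the outlet)
    sources:    reach -> pollutant discharged into that reach
    survival:   reach -> fraction surviving transit, exp(-k * travel_time)
    """
    inflow = {r: 0.0 for r in downstream}
    outflow = {}
    for r in topo_order(downstream):
        outflow[r] = (inflow[r] + sources.get(r, 0.0)) * survival[r]
        if downstream[r] is not None:
            inflow[downstream[r]] += outflow[r]
    return outflow

# Two headwater reaches joining at a junction into reach C.
net = {"A": "C", "B": "C", "C": None}
loads = route_pollutant(net,
                        sources={"A": 10.0, "B": 5.0},
                        survival={"A": 0.9, "B": 0.8, "C": 0.5})
```

    Dams, reservoirs and withdrawals would enter the same loop as additional per-reach operators on the inflow before decay is applied.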

  4. Multipurpose simulation systems for regional development forecasting

    SciTech Connect

    Kostina, N.I.

    1995-09-01

    We examine the development of automaton-modeling multipurpose simulation systems as an efficient form of simulation software for MIS. Such systems constitute a single problem-oriented package of applications based on a general simulation model, which is equipped with a task source language, interaction tools, file management tools, and an output document editor. The simulation models are described by the method of probabilistic-automaton modeling, which ensures standard representation of models and standardization of the modeling algorithm. Examples of such systems include the demographic forecasting system DEPROG, the VOKON system for assessing the quality of consumer services in terms of free time, and the SONET system for servicing partially accessible customers. The development of computer-aided systems for production and economic control is now moving to the second stage, namely operationalization of optimization and forecasting problems, whose solution may account for the main economic effect of MIS. Computation and information problems, which were the main focus of the first stage of MIS development, are thus acquiring the role of a source of information for optimization and forecasting problems in addition to their direct contribution to preparation and analysis of current production and economic information.

  5. Simulation of a rapid dropout event for highly relativistic electrons with the RBE model

    NASA Astrophysics Data System (ADS)

    Kang, S.-B.; Fok, M.-C.; Glocer, A.; Min, K.-W.; Choi, C.-R.; Choi, E.; Hwang, J.

    2016-05-01

    A flux dropout is a sudden and sizable decrease in the energetic electron population of the outer radiation belt on a time scale of a few hours. We simulated a flux dropout of highly relativistic >2.5 MeV electrons using the Radiation Belt Environment model, incorporating the pitch angle diffusion coefficients caused by electromagnetic ion cyclotron (EMIC) waves, for the geomagnetic storm event of 23-26 October 2002. This simulation showed a remarkable decrease in the >2.5 MeV electron flux during the main phase of the storm, compared to simulations without EMIC waves. This decrease was independent of magnetopause shadowing or drift loss to the magnetopause. We suggest that the flux decrease was likely to be primarily due to pitch angle scattering into the loss cone by EMIC waves. Furthermore, the >2.5 MeV electron flux calculated with EMIC waves corresponds very well with that observed by the Solar Anomalous and Magnetospheric Particle EXplorer (SAMPEX) spacecraft. EMIC wave scattering is therefore likely one of the key mechanisms for understanding flux dropouts. We modeled EMIC wave intensities using the Kp index; however, the calculated dropout occurs several hours earlier than the observed one. We propose that Kp is not the best parameter for predicting EMIC waves.
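
    The loss mechanism invoked above, pitch angle scattering into the loss cone while EMIC waves are active, is often caricatured as a lifetime model, dj/dt = -j/tau during wave activity. The lifetime and activity window below are invented numbers, not the RBE model's diffusion coefficients.

```python
import math

def flux_after_dropout(t_hours, j0=1.0, tau_hours=2.0, emic_window=(0.0, 6.0)):
    """Electron flux under dj/dt = -j/tau, applied only while EMIC waves
    are active; flux is constant outside the activity window."""
    start, end = emic_window
    active = max(0.0, min(t_hours, end) - start)  # hours of wave activity so far
    return j0 * math.exp(-active / tau_hours)
```

    With a 2-hour lifetime, six hours of wave activity depletes the flux by a factor of exp(-3), i.e. roughly 95 percent, which is the order of magnitude of the dropouts discussed in the abstract.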

  6. Extreme events in a vortex gas simulation of a turbulent half-jet

    NASA Astrophysics Data System (ADS)

    Suryanarayanan, Saikishan; Pathikonda, Gokul; Narasimha, Roddam

    2012-11-01

    Extensive simulations [arXiv:1008.2876v1 [physics.flu-dyn], BAPS.2010.DFD.LE.4] have shown that the temporally evolving vortex gas mixing layer has 3 regimes, including one which has a universal spreading rate. The present study explores the development of spatially evolving mixing layers, using a vortex gas model based on Basu et al. (1995, Appl. Math. Modelling). The effects of the velocity ratio (r) are analyzed via the most extensive simulations of this kind to date, involving up to 10000 vortices and averaging over up to 1000 convective times. While the temporal limit is approached as r approaches unity, striking features such as extreme events involving coherent structures, bending, deviation of the convection velocity from the mean velocity, spatial feedback and greater sensitivity to downstream and free stream boundary conditions are observed in the half-jet (r = 0) limit. A detailed statistical analysis reveals possible causes for the large scatter across experiments, as opposed to the commonly adopted explanation of asymptotic dependence on initial conditions. Supported in part by contract no. Intel/RN/4288.
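
    The basic operation in a vortex gas simulation of this kind is the mutually induced velocity of 2-D point vortices. The desingularization and boundary treatment of the actual half-jet model are omitted; this is a bare Biot-Savart sketch.

```python
import numpy as np

def induced_velocity(pos, gamma):
    """Velocity of each 2-D point vortex induced by all the others:
    u_i = sum_j Gamma_j / (2 pi r_ij^2) * (-(y_i - y_j), x_i - x_j)."""
    dx = pos[:, None, 0] - pos[None, :, 0]   # dx[i, j] = x_i - x_j
    dy = pos[:, None, 1] - pos[None, :, 1]
    r2 = dx ** 2 + dy ** 2
    np.fill_diagonal(r2, np.inf)             # exclude self-induction
    u = np.sum(gamma[None, :] * -dy / (2.0 * np.pi * r2), axis=1)
    v = np.sum(gamma[None, :] * dx / (2.0 * np.pi * r2), axis=1)
    return np.column_stack([u, v])

# Sanity check: a counter-rotating pair of strength +/- 2*pi at unit
# separation translates together at unit speed (here straight up the y axis).
pair = np.array([[0.0, 0.0], [1.0, 0.0]])
vel = induced_velocity(pair, np.array([2.0 * np.pi, -2.0 * np.pi]))
```

    Marching the positions forward with this velocity field (e.g. by Runge-Kutta) gives the time evolution of the vortex gas; statistics over many such runs are what the averaging over convective times above refers to.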

  7. Simulation study on single event burnout in linear doping buffer layer engineered power VDMOSFET

    NASA Astrophysics Data System (ADS)

    Yunpeng, Jia; Hongyuan, Su; Rui, Jin; Dongqing, Hu; Yu, Wu

    2016-02-01

    The addition of a buffer layer can improve a device's secondary breakdown voltage and, thus, its single event burnout (SEB) threshold voltage. In this paper, an N type linear doping buffer layer is proposed. Quasi-stationary avalanche simulations and heavy ion beam simulations show that an optimized linear doping buffer layer is critical. Under the heavy ion strikes that induce SEB, the electric field in a device with an optimized linear doping buffer layer is much lower than in one with an optimized constant doping buffer layer at a given buffer layer thickness and the same biasing voltages. The secondary breakdown voltage and the parasitic bipolar turn-on current are also much higher than those with the optimized constant doping buffer layer. The linear buffer layer is therefore more advantageous for improving the device's SEB performance. Project supported by the National Natural Science Foundation of China (No. 61176071), the Doctoral Fund of Ministry of Education of China (No. 20111103120016), and the Science and Technology Program of State Grid Corporation of China (No. SGRI-WD-71-13-006).

  8. Spatial Aspects in Biological System Simulations

    SciTech Connect

    Resat, Haluk; Costa, Michelle N.; Shankaran, Harish

    2011-01-30

    Mathematical models of the dynamical properties of biological systems aim to improve our understanding of the studied system with the ultimate goal of being able to predict system responses in the absence of experimentation. Despite the enormous advances that have been made in biological modeling and simulation, the inherently multiscale character of biological systems and the stochasticity of biological processes continue to present significant computational and conceptual challenges. Biological systems often consist of well-organized structural hierarchies, which inevitably lead to multiscale problems. This chapter introduces and discusses the advantages and shortcomings of several simulation methods that are being used by the scientific community to investigate the spatio-temporal properties of model biological systems. We first describe the foundations of the methods and then describe their relevance and possible application areas with illustrative examples from our own research. Possible ways to address the encountered computational difficulties are also discussed.
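
    One standard way to handle the stochasticity of biological processes mentioned above is Gillespie's stochastic simulation algorithm (SSA). Below is a minimal sketch for a reversible A <-> B interconversion; the rate constants and copy numbers are invented for illustration.

```python
import random

def gillespie_ab(k_fwd, k_rev, a0, b0, t_end, seed=0):
    """Exact stochastic simulation (Gillespie SSA) of A <-> B.

    Repeatedly: sum the reaction propensities, draw an exponential
    waiting time, then pick which reaction fired in proportion to
    its propensity.
    """
    rng = random.Random(seed)
    a, b, t = a0, b0, 0.0
    while True:
        p_fwd = k_fwd * a          # propensity of A -> B
        p_rev = k_rev * b          # propensity of B -> A
        total = p_fwd + p_rev
        if total == 0.0:
            return a, b            # no reactions possible
        t += rng.expovariate(total)
        if t > t_end:
            return a, b
        if rng.random() * total < p_fwd:
            a, b = a - 1, b + 1
        else:
            a, b = a + 1, b - 1

a_final, b_final = gillespie_ab(1.0, 1.0, 50, 0, t_end=10.0)
```

    Spatial variants of the same idea partition the domain into subvolumes and add diffusion jumps between them as extra reaction channels, which is one route to the spatio-temporal methods surveyed in the chapter.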

  9. 2001 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2002-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to

  10. 2000 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2001-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to

  11. Aid For Simulating Digital Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hartman, Richard M.

    1991-01-01

    The DIVERS translator is a computer program that converts descriptions of digital flight-control systems (DFCS) into a computer program. A language was developed to represent design charts of DFCS. The translator converts DIVERS source code into an easily transportable language while minimizing the probability that results are affected by programmer interpretation. The final translated program is used as a standard of comparison to verify the operation of actual flight-control systems. The approach is applicable to the simulation of other control systems, for example, electrical circuits and logic processes. Written in C.

  12. Event-Driven Random-Access-Windowing CCD Imaging System

    NASA Technical Reports Server (NTRS)

    Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William

    2004-01-01

    A charge-coupled-device (CCD) based high-speed imaging system, called a real-time, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in High-Frame-Rate CCD Camera Having Subwindow Capability (NPO-30564), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during the pixel readout, the present design reduces ROI-readout times to attain higher frame rates. This camera (see figure) includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor/transistor-logic (TTL)-level signals from a field-programmable gate array (FPGA) controller card. These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable.

  13. Software for Simulating Remote Sensing Systems

    NASA Technical Reports Server (NTRS)

    Zanoni, Vicki; Ryan, Robert; Blonski, Slawomir; Russell, Jeffrey; Gasser, Gerald; Greer, Randall

    2003-01-01

    The Application Research Toolbox (ART) is a collection of computer programs that implement algorithms and mathematical models for simulating remote sensing systems. The ART is intended to be especially useful for performing design-tradeoff studies and statistical analyses to support the rational development of design requirements for multispectral imaging systems. Among other things, the ART affords a capability to synthesize coarser-spatial-resolution image-data sets from finer-spatial-resolution data sets and multispectral-image-data products from hyperspectral-image-data products. The ART also provides for synthesis of image-degradation effects, including point-spread functions, misregistration of spectral images, and noise. The ART can utilize real or synthetic data sets, along with sensor specifications, to create simulated data sets. In one example of a typical application, simulated data pertaining to an existing multispectral sensor system are used to verify the data collected by the system in operation. In the case of a proposed sensor system, the simulated data can be used to conduct trade studies and statistical analyses to ensure that the sensor system will satisfy the requirements of potential scientific, academic, and commercial user communities.
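    The coarser-from-finer synthesis the abstract describes can be illustrated with a minimal sketch. This is not the ART's actual algorithm (which also models point-spread functions, misregistration, and noise); it simply block-averages a fine-resolution image to emulate a coarser sensor, which is the simplest form of that capability:

```python
import numpy as np

def downsample(image, factor):
    """Synthesize a coarser-spatial-resolution image by block-averaging,
    approximating a boxcar point-spread function of width `factor`."""
    h, w = image.shape
    h2, w2 = h - h % factor, w - w % factor        # trim to a multiple of factor
    blocks = image[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))                 # average each factor x factor block

fine = np.arange(16, dtype=float).reshape(4, 4)     # toy "fine-resolution" scene
coarse = downsample(fine, 2)                        # 2x2 image of 2x2 block means
```

A real simulation would convolve with the target sensor's measured point-spread function before decimating; block averaging is the limiting case of an ideal boxcar response.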

  14. Shuttle Propulsion System Major Events and the Final 22 Flights

    NASA Technical Reports Server (NTRS)

    Owen, James W.

    2011-01-01

    Numerous lessons have been documented from the Space Shuttle Propulsion elements. Major events include loss of the Solid Rocket Boosters (SRBs) on STS-4 and shutdown of a Space Shuttle Main Engine (SSME) during ascent on STS-51F. On STS-112 only half the pyrotechnics fired during release of the vehicle from the launch pad, a testament to redundancy. STS-91 exhibited freezing of a main combustion chamber pressure measurement, and on STS-93 nozzle tube ruptures necessitated a low-liquid-level oxygen cutoff of the main engines. A number of on-pad aborts were experienced during the early program, resulting in delays. And the two accidents, STS-51L and STS-107, had unique heritage in early program decisions and vehicle configuration. Following STS-51L, significant resources were invested in developing a fundamental physical understanding of solid rocket motor environments and material system behavior. And following STS-107, the risk of ascent debris was better characterized and controlled. Situational awareness during all mission phases improved, and the management team instituted effective risk assessment practices. The last 22 flights of the Space Shuttle, following the Columbia accident, were characterized by remarkable improvement in safety and reliability. Numerous problems were solved in addition to reduction of the ascent debris hazard. The Shuttle system, though not as operable as envisioned in the 1970s, successfully assembled the International Space Station (ISS). By the end of the program, the remarkable Space Shuttle Propulsion system achieved very high performance, was largely reusable, exhibited high reliability, and was a heavy-lift Earth-to-orbit propulsion system. During the program a number of project management and engineering processes were implemented and improved. Technical performance, schedule accountability, cost control, and risk management were effectively managed and implemented. Award fee contracting was implemented to provide

  15. Adaptive periodic event-triggered consensus for multi-agent systems subject to input saturation

    NASA Astrophysics Data System (ADS)

    Yin, Xiuxia; Yue, Dong; Hu, Songlin

    2016-04-01

    This paper investigates distributed adaptive event-triggered consensus control for a class of nonlinear agents, each subject to input saturation. Two kinds of distributed event-triggered control schemes are introduced: one is a continuous-time-based event-triggered scheme and the other is a sampled-data-based event-triggered scheme. Compared with the traditional event-triggered schemes in the existing literature, the parameters of the event-triggered schemes in this paper are adaptively adjusted using event-error-dependent adaptive laws. The problem of simultaneously deriving the controller gain matrix and the event-triggering parameter matrix, while tackling the saturation nonlinearity, is cast as a standard linear matrix inequality (LMI) problem. A convincing simulation example is given to demonstrate the theoretical results.
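    The flavor of a sampled-data event-triggered consensus scheme with input saturation can be sketched in a few lines. This toy uses a fixed triggering threshold rather than the paper's adaptive, event-error-dependent law, and the graph, gains, and limits are illustrative assumptions:

```python
import numpy as np

def simulate(x0, adjacency, steps=200, dt=0.01, u_max=1.0, threshold=0.05):
    """Toy event-triggered consensus: each agent rebroadcasts its state only
    when it has drifted more than `threshold` from its last broadcast value;
    the Laplacian feedback is clipped to model input saturation."""
    x = np.array(x0, dtype=float)
    broadcast = x.copy()                              # last transmitted states
    events = 0
    L = np.diag(adjacency.sum(axis=1)) - adjacency    # graph Laplacian
    for _ in range(steps):
        trigger = np.abs(x - broadcast) > threshold   # event condition per agent
        broadcast[trigger] = x[trigger]               # event: transmit fresh state
        events += int(trigger.sum())
        u = np.clip(-L @ broadcast, -u_max, u_max)    # saturated consensus input
        x = x + dt * u                                # Euler step of agent dynamics
    return x, events

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # 3-agent path graph
x_final, n_events = simulate([1.0, 0.0, -1.0], A)
```

The agents converge toward the initial average (zero here) while transmitting far fewer than `3 * steps` updates, which is the point of event triggering.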

  16. Expert system for scheduling simulation lab sessions

    NASA Technical Reports Server (NTRS)

    Lund, Chet

    1990-01-01

    Implementation and results of an expert system used for scheduling session requests for the Systems Engineering Simulator (SES) laboratory at the NASA Johnson Space Center (JSC) are discussed. Weekly session requests are received from astronaut crew trainers, procedures developers, engineering assessment personnel, software developers, and various others who wish to access the computers, scene generators, and other simulation equipment available to them in the SES lab. The expert system under discussion comprises a data acquisition portion - two Pascal programs run on a personal computer - and a CLIPS program installed on a minicomputer. A brief introduction to the SES lab and its scheduling background is given. A general overview of the system is provided, followed by a detailed description of the constraint-reduction process and of the scheduler itself. Results from a ten-week trial period using this approach are discussed. Finally, a summary of the expert system's strengths and shortcomings is provided.

  17. Simulating Astronomical Adaptive Optics Systems Using Yao

    NASA Astrophysics Data System (ADS)

    Rigaut, François; Van Dam, Marcos

    2013-12-01

    Adaptive Optics systems are at the heart of the coming generation of Extremely Large Telescopes. Given the importance, complexity and required advances of these systems, being able to simulate them faithfully is key to their success, and thus to the success of the ELTs. The types of systems envisioned for the ELTs cover most of the AO breeds, from NGS AO to multiple-guide-star Ground Layer, Laser Tomography and Multi-Conjugate AO systems, with typically a few thousand actuators. This represents a large step up from the current generation of AO systems, and accordingly a challenge for existing AO simulation packages. This is especially true as, in recent years, computer power has not been following Moore's law in its most common understanding: CPU clocks are hovering at about 3 GHz. Although the use of supercomputers is a possible solution for running these simulations, being able to use smaller machines has obvious advantages: cost, access, environmental issues. By using optimised code in an already proven AO simulation platform, we were able to run complex ELT AO simulations on very modest machines, including laptops. The platform is YAO. In this paper, we describe YAO, its architecture, its capabilities, the ELT-specific challenges and optimisations, and finally its performance. As an example, execution speed ranges from 5 iterations per second for a Laser Tomography AO system with 6 LGSs and 60x60-subaperture Shack-Hartmann wavefront sensors (including full physical image formation and detector characteristics) up to over 30 iterations/s for a single-NGS AO system.

  18. Construction and fielding of TRS (thermal radiation simulation) units for the Mill Race high explosive event. Project officer's report

    SciTech Connect

    Dishon, J.F. III

    1981-12-18

    Thermal radiation simulation units were developed and fielded on the MILL RACE event. The units released over 1 billion calories of radiant energy at a peak power of 1.4 × 10^9 watts. The units were fired in conjunction with the 600 ton MILL RACE event to produce blast and thermal radiation loadings on a variety of structures.

  19. Hybrid system modeling, simulation, and visualization: a crane system

    NASA Astrophysics Data System (ADS)

    Hiniduma Udugama Gamage, Sahan S.; Palmer, Patrick R.

    2003-08-01

    Modeling and visualization of a complex hybrid system with different domains of energy flow and signal flow are described in this paper. It is a crane system situated on a barge, complete with the load, electrical power, drive and control systems. A dynamically and functionally accurate model of the crane was developed. The implementation is in the freely available software suite of the Virtual Test Bed (VTB) for simulation and the Visual Extension Engine (VXE) for visualization. The bidirectional interaction of simulator and visualizer is fully utilized in this application. The further challenges confronted in implementing this particular system, and any other complex system, are discussed and possible solutions are suggested.

  20. Re-awakening Magmatic Systems: The Mechanics of an Open-system Event

    NASA Astrophysics Data System (ADS)

    Bergantz, George; Burgisser, Alain; Schleicher, Jillian

    2016-04-01

    The re-awakening of magmatic systems requires new magma input, which often induces mixing with a resident magma existing as a crystal-rich mush. This is expressed by complex phenocryst populations, many of which preserve evidence of multiple episodes of recycling. The unlocking and mobilization of these resident mushes condition the progress of re-awakening; however, these processes are poorly understood. Crystal-rich but mobile systems, dominated by their granular mechanics, are not satisfactorily explained by either fluid-like or solid-like models. We will present a generalizing framework for describing the mechanics of crystal-rich mushes based on the notion of force chains. Force chains arise from crystal-crystal contacts and describe the highly non-uniform way that stress is transmitted in a crystal-rich mush. Using CFD-DEM simulations that resolve crystal-scale mechanics, we will show how the populations of crystal mush force chains and their spatial fabric change during an open-system event. We will show how the various forms of dissipation, such as fluid drag, particle-fluid drag, particle normal and shear lubrication, and contact friction, jointly contribute to the processes of magma mush unlocking, mobilization and fabric formation. We will also describe non-intuitive constitutive behavior such as non-local and non-affine deformation as well as complex rheological transitions from continuous to discontinuous shear thickening as a function of the dimensionless shear rate. One implication of this is that many of the commonly invoked postulates about magma behavior, such as lock-up at a critical crystallinity and suspension rheology, are better understood from a micro-physical (crystal-scale) perspective as a combination of far-field geometrical controls, local frictional thickening and shear jamming, each with distinct time scales. This kind of crystal-based unifying framework can simultaneously recover diverse processes such as strain-localization, shear

  1. Rover Attitude and Pointing System Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam

    2009-01-01

    The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers. It was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code), which improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single-step through its execution, pausing and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress-test the GNC flight algorithms under examination. The software provides facilities to conduct these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards) and the introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality codes can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters, without human supervision.

  2. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data-stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent relative to existing approaches. The purpose of the methodology is to outline how one can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models, along with the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: it creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior; automatically updates/calibrates system models using the latest streaming sensor data; creates device-specific models that capture the exact behavior of devices of the same type; adapts to evolving systems; and can reduce computational complexity (faster simulations).
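    The "continuously update models from streaming telemetry" idea can be sketched with recursive least squares. This is an illustrative stand-in, not the invention's data-stream-mining algorithm, and the features, forgetting factor, and hidden relation below are all assumptions:

```python
import numpy as np

class RLSModel:
    """Minimal recursive-least-squares model y ~ w.x whose weights are
    refined one streaming sample at a time (a 'forgetting factor' lets
    the fit track slowly evolving systems)."""
    def __init__(self, n_features, lam=0.99):
        self.w = np.zeros(n_features)
        self.P = np.eye(n_features) * 1e3            # inverse-correlation estimate
        self.lam = lam                               # forgetting factor

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        k = self.P @ x / (self.lam + x @ self.P @ x)  # gain vector
        self.w += k * (y - self.w @ x)                # correct by prediction error
        self.P = (self.P - np.outer(k, x @ self.P)) / self.lam

    def predict(self, x):
        return self.w @ np.asarray(x, dtype=float)

model = RLSModel(2)
for t in range(200):                    # simulated telemetry stream
    x = np.array([1.0, t % 10])
    model.update(x, 3.0 + 2.0 * x[1])   # hidden relation: y = 3 + 2*x1
```

After a couple hundred samples the learned weights recover the hidden relation, so the model predicts unseen inputs accurately without any prior system knowledge.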

  3. High Level Architecture Distributed Space System Simulation for Simulation Interoperability Standards Organization Simulation Smackdown

    NASA Technical Reports Server (NTRS)

    Li, Zuqun

    2011-01-01

    Modeling and simulation play a very important role in mission design. They not only reduce design cost, but also prepare astronauts for their mission tasks. The SISO Smackdown is a simulation event that facilitates modeling and simulation in academia. The scenario of this year's Smackdown was to simulate a lunar base supply mission. The mission objective was to transfer Earth supply cargo to a lunar base supply depot and retrieve He-3 to take back to Earth. Federates for this scenario include the environment federate, Earth-Moon transfer vehicle, lunar shuttle, lunar rover, supply depot, mobile ISRU plant, exploratory hopper, and communication satellite. These federates were built by teams from around the world, including teams from MIT, JSC, the University of Alabama in Huntsville, the University of Bordeaux in France, and the University of Genoa in Italy. This paper focuses on the lunar shuttle federate, which was programmed by the USRP intern team at NASA JSC. The shuttle was responsible for providing transportation between lunar orbit and the lunar surface. The lunar shuttle federate was built using the NASA standard simulation package called Trick, and it was extended with HLA functions using TrickHLA. HLA functions of the lunar shuttle federate include sending and receiving interactions, publishing and subscribing to attributes, and packing and unpacking fixed-record data. The dynamics model of the lunar shuttle had three degrees of freedom, and state propagation obeyed two-body dynamics. The descending trajectory of the lunar shuttle was designed by first defining a unique descent orbit in 2D space, and then defining a unique orbit in 3D space under the assumption of a non-rotating Moon. Finally, this assumption was removed to define the initial position of the lunar shuttle so that it starts descending one second after it joins the execution. VPN software from SonicWall was used to connect federates with the RTI during testing

  4. Mobilization of PAHs and PCBs from In-Place Contaminated Marine Sediments During Simulated Resuspension Events

    NASA Astrophysics Data System (ADS)

    Latimer, J. S.; Davis, W. R.; Keith, D. J.

    1999-10-01

    A particle entrainment simulator was used to experimentally produce representative estuarine resuspension conditions to investigate the resulting transport of polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs) to the overlying water column. Contaminants were evaluated in bulk sediments, size-fractionated sediments, resuspended particulate material and, in some cases, dissolved phases during the experiments. The two types of sediments used in the experiments, dredged material and bedded estuarine sediment, represented gradients in contaminant loadings and sediment textural characteristics. For the bedded sediment, resuspension tended to winnow the sediments of finer particles. However, in the case of the more highly contaminated dredged material, non-selective resuspension was most common. Resuspension resulted in up to orders-of-magnitude higher particle-bound organic contaminant concentrations in the overlying water column. Dissolved-phase PAH changes during resuspension were variable and at most increased by a factor of three. The sifting process resulted in the partitioning of fine- and coarse-particle contaminant loading. For bedded sediments, accurate predictions of PAH and PCB loadings on resuspended particles were made using the mass of resuspended particles of different sizes and the concentrations of contaminants in the particle pools of the bulk sediment. However, possibly due to contributions from other unmeasured particles (e.g. colloids), predictions were not possible for the dredged material. Thus, knowledge of the redistribution and fate of colloids may be important. The partitioning of PAHs between the dissolved and particulate phases during resuspension events was predicted to within a factor of two from the amount of organic carbon in each of the resuspended samples. These experiments show that contaminant transport is a function of the chemistry and textural characteristics of the bulk sediment and the winnowing action

  5. Towards real-time regional earthquake simulation I: real-time moment tensor monitoring (RMT) for regional events in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi

    2014-01-01

    We have developed a real-time moment tensor monitoring system (RMT) which takes advantage of a grid-based moment tensor inversion technique and real-time broad-band seismic recordings to automatically monitor earthquake activity in the vicinity of Taiwan. The centroid moment tensor (CMT) inversion technique and a grid search scheme are applied to obtain the earthquake source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism. All of these source parameters can be determined simultaneously within 117 s of the occurrence of an earthquake. The monitoring area covers the entire island of Taiwan and the offshore region, from 119.3°E to 123.0°E and 21.0°N to 26.0°N, with depths from 6 to 136 km. A 3-D grid system is implemented in the monitoring area with a uniform horizontal interval of 0.1° and a vertical interval of 10 km. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (fk) method. We compare our results with the Central Weather Bureau (CWB) catalogue data for earthquakes that occurred between 2010 and 2012. The average differences in event origin time and hypocentral location are less than 2 s and 10 km, respectively. The focal mechanisms determined by RMT are also comparable with the Broadband Array in Taiwan for Seismology (BATS) CMT solutions. These results indicate that the RMT system is practical and efficient for monitoring local seismic activity. In addition, the time needed to obtain all the point source parameters is reduced substantially compared to routine earthquake reports. By connecting RMT with a real-time online earthquake simulation (ROS) system, all the source parameters will be forwarded to the ROS to make real-time earthquake simulation feasible. The RMT has operated offline (2010-2011) and online (from January 2012 to the present) at the Institute of Earth Sciences (IES), Academia Sinica
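    The grid-search scheme can be illustrated schematically. The "Green's function" below is a made-up analytic waveform (a real system would use fk-computed synthetics and invert for the full moment tensor); only the grid spacing mirrors the 0.1° / 10 km description above:

```python
import numpy as np

T = np.linspace(0.0, 1.0, 50)          # common time axis for trial waveforms

def predict(lon, lat, dep):
    """Stand-in 'Green's function': a synthetic waveform that depends on the
    trial source coordinates (purely illustrative, not an fk synthetic)."""
    return (np.sin(2 * np.pi * (lon - 119.0) * T)
            + np.cos(2 * np.pi * (lat - 21.0) * T)) * np.exp(-dep * T / 100.0)

def grid_search(observed, lons, lats, depths):
    """Scan the 3-D grid of trial sources and keep the one whose predicted
    waveform best matches the observation (least-squares misfit)."""
    best, best_misfit = None, np.inf
    for lon in lons:
        for lat in lats:
            for dep in depths:
                misfit = np.sum((observed - predict(lon, lat, dep)) ** 2)
                if misfit < best_misfit:
                    best, best_misfit = (lon, lat, dep), misfit
    return best, best_misfit

# grid matching the RMT monitoring region: 0.1 deg horizontal, 10 km vertical
lons = np.arange(119.3, 123.05, 0.1)
lats = np.arange(21.0, 26.05, 0.1)
deps = np.arange(6.0, 140.0, 10.0)
obs = predict(121.5, 23.5, 26.0)                 # synthetic "observed" event
best, misfit = grid_search(obs, lons, lats, deps)
```

Because the synthetic event lies exactly on a grid node, the search recovers it with essentially zero misfit; real data would give a nonzero misfit minimum at the best-fitting node.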

  6. Participatory ergonomics simulation of hospital work systems: The influence of simulation media on simulation outcome.

    PubMed

    Andersen, Simone Nyholm; Broberg, Ole

    2015-11-01

    Current applications of work system simulation in participatory ergonomics (PE) design employ a variety of simulation media. However, the actual influence of the media attributes on the simulation outcome has received less attention. This study investigates two simulation media: full-scale mock-ups and table-top models. The aim is to compare how the media attributes of fidelity and affordance influence ergonomics identification and evaluation in PE design of hospital work systems. The results illustrate how the full-scale mock-ups' high fidelity of room layout and affordance of tool operation support ergonomics identification and evaluation related to the work system entities of space and technologies & tools. The table-top models' high fidelity of function relations and affordance of a helicopter view support ergonomics identification and evaluation related to the entity of organization. Furthermore, the study addresses the form of the identified and evaluated conditions, which are either identified challenges or tangible design criteria. PMID:26154230

  7. Simulations of two types of El Niño events by an optimal forcing vector approach

    NASA Astrophysics Data System (ADS)

    Tian, Ben; Duan, Wansuo; Xu, Hui

    2014-05-01

    In this paper, an optimal forcing vector (OFV) approach is proposed. The OFV offsets tendency errors and optimizes the agreement of the model simulation with observation. We apply the OFV approach to the well-known Zebiak-Cane model and simulate several observed eastern Pacific (EP) El Niño and central Pacific (CP) El Niño events during 1980-2004. It is found that the Zebiak-Cane model with a proper initial condition often reproduces the EP-El Niño events; however, the Zebiak-Cane model fails to reproduce the CP-El Niño events. The model may be much more influenced by model errors when simulating the CP-El Niño events. As expected, when we use the OFV to correct the Zebiak-Cane model, the model reproduces the three CP-El Niño events well. Furthermore, the simulations of the corresponding winds and thermocline depths are also acceptable. In particular, the thermocline depth simulations for the three CP-El Niño events lead us to believe that the discharge process of the equatorial heat content associated with the CP-El Niño is not efficient, and they emphasize the role of the zonal advection in the development of the CP-El Niño events. The OFVs associated with the three CP-El Niño events often exhibit a sea surface temperature anomaly (SSTA) tendency with positive anomalies in the equatorial eastern Pacific; therefore, the SST tendency errors occurring in the equatorial eastern Pacific may dominate the uncertainties of the Zebiak-Cane model while simulating CP-El Niño events. A further investigation demonstrates that one of the model errors offset by the OFVs has a pattern similar to the SST cold-tongue cooling mode, which may then provide one of the climatological conditions for the frequent occurrence of CP-El Niño events. The OFV may therefore be a useful tool for correcting forecast models and thus for helping improve their forecast skill.
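    The spirit of the OFV approach, offsetting the model's tendency error with a forcing term so the simulation tracks observation, can be sketched with a scalar toy model. Everything below (the decay rates, the step-by-step forcing choice) is an illustrative assumption, not the Zebiak-Cane implementation:

```python
# A scalar toy: the "true" system decays at rate 0.5 but the biased model
# decays at rate 0.2, so the model carries a tendency error of +0.3*x.
def truth_tendency(x):
    return -0.5 * x

def model_tendency(x):
    return -0.2 * x

dt, steps = 0.1, 100
x_obs, x_free, x_ofv = 1.0, 1.0, 1.0
forcings = []
for _ in range(steps):
    x_obs_next = x_obs + dt * truth_tendency(x_obs)        # "observations"
    x_free += dt * model_tendency(x_free)                  # uncorrected model
    # forcing chosen so the corrected model step lands on the observation
    f = (x_obs_next - (x_ofv + dt * model_tendency(x_ofv))) / dt
    forcings.append(f)
    x_ofv += dt * (model_tendency(x_ofv) + f)
    x_obs = x_obs_next
```

Here the forcing is solved exactly at each step; in the OFV approach it is found by optimization over the whole simulation window, and its spatial pattern is what diagnoses the dominant tendency errors discussed in the abstract above.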

  8. Simulations of two types of El Niño events by an optimal forcing vector approach

    NASA Astrophysics Data System (ADS)

    Duan, Wansuo; Tian, Ben; Xu, Hui

    2014-09-01

    In this paper, an optimal forcing vector (OFV) approach is proposed. The OFV offsets tendency errors and optimizes the agreement of the model simulation with observation. We apply the OFV approach to the well-known Zebiak-Cane model and simulate several observed eastern Pacific (EP) El Niño and central Pacific (CP) El Niño events during 1980-2004. It is found that the Zebiak-Cane model with a proper initial condition often reproduces the EP-El Niño events; however, the Zebiak-Cane model fails to reproduce the CP-El Niño events. The model may be much more influenced by model errors when simulating the CP-El Niño events. As expected, when we use the OFV to correct the Zebiak-Cane model, the model reproduces the three CP-El Niño events well. Furthermore, the simulations of the corresponding winds and thermocline depths are also acceptable. In particular, the thermocline depth simulations for the three CP-El Niño events lead us to believe that the discharge process of the equatorial heat content associated with the CP-El Niño is not efficient, and they emphasize the role of the zonal advection in the development of the CP-El Niño events. The OFVs associated with the three CP-El Niño events often exhibit a sea surface temperature anomaly (SSTA) tendency with positive anomalies in the equatorial eastern Pacific; therefore, the SST tendency errors occurring in the equatorial eastern Pacific may dominate the uncertainties of the Zebiak-Cane model while simulating CP-El Niño events. A further investigation demonstrates that one of the model errors offset by the OFVs has a pattern similar to the SST cold-tongue cooling mode, which may then provide one of the climatological conditions for the frequent occurrence of CP-El Niño events. The OFV may therefore be a useful tool for correcting forecast models and thus for helping improve their forecast skill.

  9. Theory and Simulations of Solar System Plasmas

    NASA Technical Reports Server (NTRS)

    Goldstein, Melvyn L.

    2011-01-01

    "Theory and simulations of solar system plasmas" aims to highlight results from microscopic to global scales, achieved by theoretical investigations and numerical simulations of the plasma dynamics in the solar system. The theoretical approach must allow evidencing the universality of the phenomena being considered, whatever the region is where their role is studied; at the Sun, in the solar corona, in the interplanetary space or in planetary magnetospheres. All possible theoretical issues concerning plasma dynamics are welcome, especially those using numerical models and simulations, since these tools are mandatory whenever analytical treatments fail, in particular when complex nonlinear phenomena are at work. Comparative studies for ongoing missions like Cassini, Cluster, Demeter, Stereo, Wind, SDO, Hinode, as well as those preparing future missions and proposals, like, e.g., MMS and Solar Orbiter, are especially encouraged.

  10. LHC RF System Time-Domain Simulation

    SciTech Connect

    Mastorides, T.; Rivetta, C.; /SLAC

    2010-09-14

    Non-linear time-domain simulations have been developed for the Positron-Electron Project (PEP-II) and the Large Hadron Collider (LHC). These simulations capture the dynamic behavior of the RF station-beam interaction and are structured to reproduce the technical characteristics of the system (noise contributions, non-linear elements, and more). As such, they provide useful results and insight for the development and design of future LLRF feedback systems. They are also a valuable tool for the study of diverse longitudinal beam dynamics effects such as coupled-bunch impedance-driven instabilities and single-bunch longitudinal emittance growth. Results from these studies and related measurements from PEP-II and LHC have been presented in several publications. This report presents an example of the time-domain simulation implementation for the LHC.

  11. Electric System Intra-hour Operation Simulator

    Energy Science and Technology Software Center (ESTSC)

    2014-03-07

    ESIOS is a software program developed at Pacific Northwest National Laboratory (PNNL) that performs intra-hour dispatch and automatic generation control (AGC) simulations for electric power system frequency regulation and load/variable generation following. The program dispatches generation resources at one-minute intervals to meet control performance requirements, while incorporating stochastic models of forecast errors and variability in generation, load, interchange and market behaviors. The simulator also contains an operator model that mimics manual actions to adjust resource dispatch and maintain system reserves. Besides simulating intra-hour dispatch of the generation fleet, ESIOS can also be used as a test platform for the design and verification of energy storage, demand response, and other technologies helping to accommodate variable generation.
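    The dispatch-plus-regulation structure described above can be sketched in a few lines: an economic set point at one-minute intervals, a stochastic forecast-error model, and a clipped regulation signal standing in for AGC. The model, parameter values and error statistics below are invented for illustration and are not ESIOS internals.

```python
import numpy as np

# Minute-interval dispatch with a stochastic load-forecast error (a sketch
# in the spirit of ESIOS; all numbers are arbitrary).
rng = np.random.default_rng(1)
minutes = 60
load_forecast = 1000 + 50 * np.sin(np.linspace(0, np.pi, minutes))  # MW
forecast_error = rng.normal(0.0, 5.0, minutes)   # stochastic error, MW
actual_load = load_forecast + forecast_error

dispatch = np.empty(minutes)
ace = np.empty(minutes)        # residual area control error after regulation
reg_capacity = 20.0            # MW of regulating reserve available
for t in range(minutes):
    dispatch[t] = load_forecast[t]               # economic set point
    imbalance = actual_load[t] - dispatch[t]     # seen by AGC
    regulation = np.clip(imbalance, -reg_capacity, reg_capacity)
    ace[t] = imbalance - regulation              # what regulation cannot cover
```

    When the forecast errors stay inside the reserve band, the residual ACE is zero; the interesting operating points are the minutes where they do not.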

  12. An investigation into pilot and system response to critical in-flight events. Volume 2: Appendix

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Griffin, W. C.

    1981-01-01

    Materials relating to the study of pilot and system response to critical in-flight events (CIFE) are given. An annotated bibliography and a trip summary outline are presented, as are knowledge surveys with accompanying answer keys. Performance profiles of pilots and performance data from the simulations of CIFE's are given. The paper and pencil testing materials are reproduced. Conditions for the use of the additive model are discussed. A master summary of data for the destination diversion scenario is given. An interview with an aircraft mechanic demonstrates the feasibility of system problem diagnosis from a verbal description of symptoms and shows the information seeking and problem solving logic used by an expert to narrow the list of probable causes of aircraft failure.

  13. BOLIVAR-tool for analysis and simulation of metocean extreme events

    NASA Astrophysics Data System (ADS)

    Lopatoukhin, Leonid; Boukhanovsky, Alexander

    2015-04-01

    Metocean extreme events arise from the combination of multivariate and multiscale processes that depend on each other at different scales (short-term, synoptic, annual and year-to-year variability). There is no simple method for their estimation with controllable tolerance; thus, extreme analysis in practice is sometimes reduced to the exploration of various methods and models with the aim of decreasing the uncertainty of the estimates. A researcher therefore needs multifaceted computational tools that cover the various branches of extreme analysis. BOLIVAR is multi-functional computational software for researchers and engineers who explore extreme environmental conditions to design and build offshore structures and floating objects. It contains a set of computational modules implementing various methods for extreme analysis (IDM, AMS, POT, MENU and SINTEF), and a set of modules for the stochastic and hydrodynamic simulation of metocean processes at various scales; in this sense BOLIVAR is a Problem Solving Environment (PSE). It simplifies the resource-consuming computational experiments needed to explore metocean extremes in univariate and multivariate cases. There are field ARMA models for short-term variability, a spatial-temporal random-pulse model for synoptic variability (alternation of storms and calms), and a cyclostationary model of annual and year-to-year variability. The combination of the above-mentioned modules and data sources allows one to estimate: omnidirectional and directional extremes (with T-year return periods); multivariate extremes (sets of parameters) and their impacts on marine structures and floating objects; and extremes of spatial-temporal fields (including the trajectories of T-year storms). An employment of concurrent methods for
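    As one concrete branch of such extreme analysis, the POT method mentioned above fits a Generalized Pareto Distribution to threshold exceedances and converts the fit into a T-year return level. The sketch below uses synthetic wave-height data and standard SciPy tools; it is not BOLIVAR code, and the threshold choice and data model are invented.

```python
import numpy as np
from scipy.stats import genpareto

# Peaks-over-threshold (POT) sketch on a synthetic daily wave-height record.
rng = np.random.default_rng(2)
years = 30
waves = rng.weibull(1.5, size=years * 365) * 2.0   # surrogate Hs series [m]
threshold = float(np.quantile(waves, 0.95))
excess = waves[waves > threshold] - threshold

# Fit a Generalized Pareto Distribution to the exceedances (location fixed
# at zero, since the data are already excesses over the threshold).
shape, loc, scale = genpareto.fit(excess, floc=0.0)

# T-year return level from the exceedance rate and the fitted quantile.
T = 100.0
rate = len(excess) / years                 # exceedances per year
p = 1.0 - 1.0 / (T * rate)                 # GPD quantile probability
return_level = threshold + genpareto.ppf(p, shape, loc=0.0, scale=scale)
```

    In a multi-method tool, this estimate would be compared against AMS or IDM results for the same series to gauge the uncertainty of the T-year value.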

  14. Aviation spectral camera infinity target simulation system

    NASA Astrophysics Data System (ADS)

    Liu, Xinyue; Ming, Xing; Liu, Jiu; Guo, Wenji; Lv, Gunbo

    2014-11-01

    With the development of science and technology, the applications of aviation spectral cameras are becoming more widespread, and developing a dynamic-target test system is increasingly important. An aviation spectral camera infinity target simulation system can be used to test the resolution and the modulation transfer function of the camera. The construction and working principle of the infinity target simulation system are introduced in detail. The dynamic target generator, based on a digital micromirror device (DMD), and the required performance of the collimation system are analyzed and reported. The DMD-based dynamic target generator has the advantages of convenient image replacement, small size and flexibility. According to the requirements of the tested camera, a full-field infinity dynamic target test plan is completed by rotating and moving the mirror.

  15. Signatures of small-scale heating events in EUV spectral lines as modeled from 3D MHD simulations

    NASA Astrophysics Data System (ADS)

    Guerreiro, Nuno; Haberreiter, Margit; Hansteen, Viggo; Curdt, Werner; Schmutz, Werner

    2014-05-01

    We aim at understanding the implications of small-scale heating events in the solar atmosphere for variations of the solar spectral irradiance. We present a technique for the identification and characterization of these events in 3D simulations of the solar atmosphere. Accurately determining the properties of these events in time and space will help us understand how spectral lines, in particular in the EUV, respond to them, and what kind of spectral signatures one would expect to find in observations such as those from SOHO/SUMER and, eventually, from future space missions, for example by SPICE on board Solar Orbiter.

  16. Identification and characterisation of small-scale heating events in the solar atmosphere from 3D MHD simulations

    NASA Astrophysics Data System (ADS)

    Guerreiro, Nuno; Haberreiter, Margit; Hansteen, Viggo; Schmutz, Werner

    2015-04-01

    We study the properties of small-scale heating events in the solar atmosphere on the nanoflare and microflare energy scales using 3D MHD simulations. We put forward a method to identify and track the heating events in time, in order to study their lifetimes, frequency distributions and spectral signatures. These results aim to support the interpretation of observations from future space missions, such as the EUI and SPICE instruments onboard Solar Orbiter, and to improve our knowledge of the role of small-scale heating events in the heating of the corona.

  17. Simulation of 2003 Bam (Iran) earthquake using empirical Green's function method via very small and near-fault events

    NASA Astrophysics Data System (ADS)

    Riahi, Ali; Sadeghi, Hossein; Hosseini, Sayyed Keivan

    2015-06-01

    The 2003 Bam, Iran, earthquake (Mw = 6.6) was recorded by the BAM accelerometer station. Since the causative fault was located just below the city, the accelerometer recorded the main shock, a foreshock and several local aftershocks. To study the rupture scenario, we simulated all components of the observed main-shock waveform via the empirical Green's function method. Twenty-eight selected aftershocks and the single foreshock were used to simulate the main shock in the frequency range of 0.5-5 Hz. Since the events were very close to the station, some small events may not share the path effects of the main shock; it is therefore essential to apply appropriate corrections to the waveforms to alleviate path-difference effects. The starting point of the rupture is identified at the centre of the strong-motion generation area, approximately 5 km south of the BAM station and at a depth of about 7 km. The horizontal simulated components imply that the main shock was located west of the BAM station. In contrast, significant variation in the amplitude ratio of the EW and NS components may be used to discuss possible dissimilarity in the focal mechanisms of the small events. Most aftershocks with mechanisms similar to the main shock, that is, with a similar EW/NS maximum amplitude ratio, are able to reproduce certain peaks of both horizontal components, whereas some small events with different mechanisms are only able to reproduce the peaks of at most one horizontal component. Some changes were applied to the empirical Green's function method to incorporate two small events using a combined fault model; while the two aftershocks have different mechanisms, some combinations improve the simulations. The rupture initiating at the middle of the fault plane, and the improved simulations obtained by combining two fault surfaces with different focal mechanisms, may suggest a bilateral rupture and a combination of two focal mechanisms for the main shock of the Bam earthquake.
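    The core of the empirical Green's function method, summing lagged copies of a small-event record over the fault plane, can be illustrated schematically. The record, subfault geometry and rupture velocity below are all invented, and the amplitude scaling, path and focal-mechanism corrections discussed above are omitted.

```python
import numpy as np

# Toy empirical Green's function (EGF) summation: a large event is built
# from copies of a small-event record, each delayed by the rupture travel
# time to its subfault. Schematic only; not the paper's implementation.
rng = np.random.default_rng(3)
dt = 0.01                                  # sample interval [s]
n = 200
egf = rng.standard_normal(n) * np.exp(-0.02 * np.arange(n))  # small event

n_sub = 16                                 # subfaults along the rupture
rupture_vel = 2.5e3                        # rupture velocity [m/s]
sub_len = 1.0e3                            # subfault length [m]
lags = np.round(np.arange(n_sub) * sub_len / (rupture_vel * dt)).astype(int)

synth = np.zeros(n + lags[-1])
for i0 in lags:
    synth[i0:i0 + n] += egf                # unit contribution per subfault
```

    Changing the origin of the lags (one fault end versus the centre) is what distinguishes unilateral from bilateral rupture scenarios in this kind of summation.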

  18. The systems biology simulation core algorithm

    PubMed Central

    2013-01-01

    Background With the increasing availability of high dimensional time course data for metabolites, genes, and fluxes, the mathematical description of dynamical systems has become an essential aspect of research in systems biology. Models are often encoded in formats such as SBML, whose structure is very complex and difficult to evaluate due to many special cases. Results This article describes an efficient algorithm to solve SBML models that are interpreted in terms of ordinary differential equations. We begin our consideration with a formal representation of the mathematical form of the models and explain all parts of the algorithm in detail, including several preprocessing steps. We provide a flexible reference implementation as part of the Systems Biology Simulation Core Library, a community-driven project providing a large collection of numerical solvers and a sophisticated interface hierarchy for the definition of custom differential equation systems. To demonstrate the capabilities of the new algorithm, it has been tested with the entire SBML Test Suite and all models of BioModels Database. Conclusions The formal description of the mathematics behind the SBML format facilitates the implementation of the algorithm within specifically tailored programs. The reference implementation can be used as a simulation backend for Java™-based programs. Source code, binaries, and documentation can be freely obtained under the terms of the LGPL version 3 from http://simulation-core.sourceforge.net. Feature requests, bug reports, contributions, or any further discussion can be directed to the mailing list simulation-core-development@lists.sourceforge.net. PMID:23826941
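    Interpreting a model as ordinary differential equations, as the algorithm above does for SBML, amounts to assembling a right-hand side from reaction fluxes and handing it to a numerical solver. A minimal sketch for a reversible reaction A <-> B, using SciPy rather than the library's own API, is:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-species reversible reaction A <-> B interpreted as ODEs
# (rate constants are arbitrary illustration values).
k_f, k_r = 2.0, 1.0   # forward / reverse rate constants

def rhs(t, y):
    a, b = y
    flux = k_f * a - k_r * b         # net reaction velocity
    return [-flux, flux]             # stoichiometry: A loses, B gains

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], rtol=1e-8, atol=1e-10)
a_end, b_end = sol.y[:, -1]

# At equilibrium k_f * a = k_r * b, so b/a -> 2 with total mass conserved
```

    An SBML interpreter does the same thing at scale: the preprocessing steps mentioned above exist to turn the format's many special cases (rules, events, units) into such a well-defined right-hand side.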

  19. Rotor systems research aircraft simulation mathematical model

    NASA Technical Reports Server (NTRS)

    Houck, J. A.; Moore, F. L.; Howlett, J. J.; Pollock, K. S.; Browne, M. M.

    1977-01-01

    An analytical model developed for evaluating and verifying advanced rotor concepts is discussed. The model was used in both open-loop and real-time man-in-the-loop simulations during the design of the Rotor Systems Research Aircraft. Future applications include pilot training, preflight evaluation of test programs, and the evaluation of promising concepts before their implementation on the flight vehicle.

  20. Probabilistic dynamics of mistuned bladed disc systems using subset simulation

    NASA Astrophysics Data System (ADS)

    Yuan, Jie; Allegri, Giuliano; Scarpa, Fabrizio; Rajasekaran, Ramesh; Patsias, Sophoclis

    2015-08-01

    The work describes an assessment of subset simulation (SubSim) techniques to increase the computational efficiency of predictions of the probabilistic dynamic behaviour of mistuned bladed-disc systems. SubSim is an adaptive stochastic procedure for efficiently computing small failure probabilities, which are expressed as a product of larger conditional failure probabilities by introducing intermediate failure events. The original version of SubSim with a classical modified Markov chain Monte Carlo (MCMC) method is used in this work to generate samples related to the intermediate failure events. A two-degree-of-freedom model with lumped parameters identified from a high-fidelity finite element model is used to represent a bladed disc. The statistics of the maximum forced frequency-response amplitudes are evaluated for different levels of blade mistuning, introduced as stiffness perturbations of the blades. Direct Monte Carlo simulations (MCS) are used to benchmark the results from SubSim. The proposed methodology is shown to capture the statistical properties of the mistuned blades efficiently, with fewer than 5% of the samples required by direct MCS. Trade-off parametric studies of the SubSim method indicate that 2000 samples at each level yield overall good computational efficiency and accuracy for the bladed-disc system considered in this work. The study confirms that SubSim techniques can be effectively used in the stochastic analysis of bladed-disc systems with uncertainty in the blade configurations.
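    A compact sketch of subset simulation with a component-wise (modified) Metropolis chain is given below, for a generic small failure probability P[g(X) > b] with standard normal inputs. The demand function, threshold, proposal spread and sample sizes are illustrative and unrelated to the bladed-disc model of the paper.

```python
import numpy as np

# Subset simulation: P[fail] is estimated as a product of conditional
# probabilities over nested intermediate failure events.
rng = np.random.default_rng(4)

def g(x):                     # demand function; failure when g > b
    return x.sum(axis=-1)

b = 6.0                       # failure threshold (rare for 2-D std normals)
N, p0 = 1000, 0.1             # samples per level, conditional probability
n_keep = int(N * p0)

x = rng.standard_normal((N, 2))
p_hat = 1.0
for _ in range(20):                           # at most 20 levels
    vals = g(x)
    order = np.argsort(vals)[::-1]
    if vals[order[n_keep - 1]] >= b:          # enough samples already fail
        p_hat *= np.mean(vals >= b)
        break
    thresh = vals[order[n_keep - 1]]          # intermediate failure level
    p_hat *= p0
    seeds = x[order[:n_keep]]                 # seeds lie inside the event
    chains, cur = [seeds], seeds.copy()
    for _ in range(int(1 / p0) - 1):          # grow chains to N samples
        cand = cur + 0.8 * rng.standard_normal(cur.shape)
        logr = 0.5 * (cur**2 - cand**2)       # std-normal Metropolis ratio
        acc = np.log(rng.random(cur.shape)) < logr
        prop = np.where(acc, cand, cur)       # component-wise accept
        ok = g(prop) >= thresh                # stay inside the event
        cur = np.where(ok[:, None], prop, cur)
        chains.append(cur.copy())
    x = np.concatenate(chains)
```

    For this example, direct MCS would need millions of samples to see a single failure; SubSim reaches the same order of magnitude with a few thousand.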

  1. Plans for wind energy system simulation

    NASA Technical Reports Server (NTRS)

    Dreier, M. E.

    1978-01-01

    A digital computer code and a special-purpose hybrid computer are introduced. The digital computer program, the Root Perturbation Method (RPM), is an implementation of the classic Floquet procedure that circumvents numerical problems associated with the extraction of Floquet roots. The hybrid computer, the Wind Energy System Time-domain simulator (WEST), yields real-time loads and deformation information essential to design and system-stability investigations.

  2. Impact of cloud microphysics and cumulus parameterization on simulation of heavy rainfall event during 7-9 October 2007 over Bangladesh

    NASA Astrophysics Data System (ADS)

    Mahbub Alam, M.

    2014-03-01

    In the present study, the Advanced Research WRF (ARW) version 3.2.1 has been used to simulate the heavy rainfall event that occurred between 7 and 9 October 2007 in the southern part of Bangladesh. The WRF-ARW modelling system with six different microphysics (MP) schemes and two different cumulus parameterization (CP) schemes in a nested configuration was chosen for simulating the event. The model domains consist of an outer and an inner domain having 9 and 3 km horizontal resolution, respectively, with 28 vertical sigma levels. The impacts of cloud microphysical processes on the precipitation, wind and reflectivity, and on the kinematic and thermodynamic characteristics of the event have been studied. Sensitivity experiments have been conducted with the WRF model to test the impact of the microphysical and cumulus parameterization schemes in capturing the extreme weather event. NCEP FNL data were used for the initial and boundary conditions, and the model was run for 72 h from 0000 UTC on 7 October 2007. The simulated rainfall shows that the WSM6-KF combination gives the best results of all combinations, followed by Lin-KF; WSM3-KF simulated the least area-averaged rainfall of all the MP schemes coupled with the KF scheme. A sharp peak of relative humidity up to 300 hPa has been simulated along the vertical line of maximum updraft for all MPs coupled with the KF and BMJ schemes. The simulated rain-water and cloud-water mixing ratios were maximum at the position where the vertical velocity and reflectivity were also maximum. The production of the rain-water mixing ratio depends on the MP schemes as well as the CP schemes. Rainfall depends on the rain-water mixing ratio between 950 and 500 hPa; the rain-water mixing ratio above the 500 hPa level has no effect on surface rain.

  3. Assessment of WRF microphysics schemes to simulate extreme precipitation events from the perspective of GMI radiative signatures

    NASA Astrophysics Data System (ADS)

    Choi, Y.; Shin, D. B.; Joh, M.

    2015-12-01

    Numerical simulations of precipitation depend to a large degree on the assumed cloud microphysics schemes representing the formation, growth and fallout of cloud droplets and ice crystals. Recent studies show that the assumed cloud microphysics play a major role not only in forecasting precipitation, especially in cases of extreme precipitation events, but also in the quality of passive microwave rainfall estimation. Evaluations of the various Weather Research and Forecasting (WRF) model microphysics schemes in this study are based on a method that was originally developed to construct the a priori databases of precipitation profiles and associated brightness temperatures (TBs) for precipitation retrievals. This methodology generates three-dimensional (3D) precipitation fields by matching the GPM Dual-frequency Precipitation Radar (DPR) reflectivity profiles with those calculated from cloud resolving model (CRM)-derived hydrometeor profiles. The method eventually provides 3D simulated precipitation fields over the DPR scan swaths; that is, atmospheric and hydrometeor profiles can be generated at each DPR pixel based on the CRM and DPR reflectivity profiles. The generated raining systems over the DPR observation fields can be applied to any radiometer that is not accompanied by a radar, for microwave radiative calculation with consideration of each sensor's channels and field of view. Assessment of the WRF model microphysics schemes for several typhoon cases in terms of GMI emission and scattering signals will be discussed.

  4. EMERGENCY BRAKING IN ADULTS VERSUS NOVICE TEEN DRIVERS: RESPONSE TO SIMULATED SUDDEN DRIVING EVENTS

    PubMed Central

    Kandadai, Venk; McDonald, Catherine C.; Winston, Flaura K.

    2015-01-01

    Motor vehicle crashes remain the leading cause of death among teens in the United States, and newly licensed drivers are the group most at risk. Because their driving skills are new and often untested, their ability to react properly in an emergency remains a research question. Since it is impossible to expose human subjects to critical, life-threatening driving scenarios, researchers have increasingly used driving simulators to assess driving skills. This paper summarizes the results of a driving scenario in a study comparing the driving performance of novice teen drivers (n=21; 16-17 years old with 90 days of provisional licensure) with that of experienced adult drivers (n=17; 25-50 years old with at least 5 years of PA licensure, at least 100 miles driven per week, and no self-reported collisions in the previous 3 years). As part of a 30- to 35-minute simulated drive that encompassed the most common scenarios resulting in serious crashes, participants were exposed to a sudden car event: as the participant drove on a suburban road, a car surged from a driveway hidden by a fence on the right side of the road. To avoid the crash, participants had to brake hard, exhibiting dynamic control over both attentional and motor resources. The results showed strong differences between the experienced adult and novice teen drivers in the brake pressure applied: when placed in the same situation, the novice teens decelerated on average 50% less than the experienced adults (p<0.01). PMID:26709330

  5. A simulation of data acquisition system for SSC experiments

    SciTech Connect

    Watase, Y.; Ikeda, H.

    1989-04-01

    A simulation of some parts of the data acquisition system was performed using the general-purpose simulation language GPSS. Several results of the simulation of the data acquisition system for the SSC experiment are discussed.

  6. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  7. Modular Aero-Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Guo, Ten-Huei

    2006-01-01

    The Modular Aero-Propulsion System Simulation (MAPSS) is a graphical simulation environment designed for the development of advanced control algorithms and rapid testing of these algorithms on a generic computational model of a turbofan engine and its control system. MAPSS is a nonlinear, non-real-time simulation comprising a Component Level Model (CLM) module and a Controller-and-Actuator Dynamics (CAD) module. The CLM module simulates the dynamics of engine components at a sampling rate of 2,500 Hz. The controller submodule of the CAD module simulates a digital controller, which has a typical update rate of 50 Hz. The sampling rate for the actuators in the CAD module is the same as that of the CLM. MAPSS provides a graphical user interface that affords easy access to engine-operation, engine-health, and control parameters; is used to enter such input model parameters as power lever angle (PLA), Mach number, and altitude; and can be used to change controller and engine parameters. Output variables are selectable by the user. Output data as well as any changes to constants and other parameters can be saved and reloaded into the GUI later.
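    The multi-rate structure described above, a plant integrated at 2,500 Hz with a controller updating at 50 Hz under zero-order hold, can be sketched as a nested loop. The first-order "engine" and integral controller below are placeholders, not the MAPSS CLM or CAD modules, and all gains are invented.

```python
# Multi-rate simulation sketch: plant stepped at 2,500 Hz, digital
# controller updated every 50th plant step (50 Hz) with zero-order hold.
dt = 1.0 / 2500.0          # plant step
ratio = 2500 // 50         # plant steps per controller update = 50

setpoint = 1.0
x = 0.0                    # plant state (e.g. normalized spool speed)
u = 0.0                    # held actuator command
ki, integ = 4.0, 0.0       # simple integral controller (arbitrary gain)

for step in range(2500):   # one second of simulated time
    if step % ratio == 0:  # 50 Hz controller update
        integ += (setpoint - x) / 50.0
        u = ki * integ     # command held between updates (ZOH)
    x += dt * (u - x)      # first-order plant dynamics at 2,500 Hz
```

    Keeping the actuator command on the fast loop while the control law runs on the slow loop is what lets such a simulation expose sample-rate effects (delay, inter-sample ripple) that a single-rate model hides.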

  8. Java simulations of embedded control systems.

    PubMed

    Farias, Gonzalo; Cervin, Anton; Arzén, Karl-Erik; Dormido, Sebastián; Esquembre, Francisco

    2010-01-01

    This paper introduces a new open-source Java library suited for the simulation of embedded control systems. The library is based on the ideas and architecture of TrueTime, a Matlab toolbox devoted to this topic, and allows Java programmers to simulate the performance of control processes that run in a real-time environment. Such simulations can considerably improve the learning and design of multitasking real-time systems. The choice of Java considerably increases the usability of our library, not only because many educators already program in this language, but also because the library can be easily used by Easy Java Simulations (EJS), a popular modeling and authoring tool that is increasingly used in the field of control education. EJS allows instructors, students, and researchers with less programming experience to create advanced interactive simulations in Java. The paper describes the ideas, implementation, and sample use of the new library both for pure Java programmers and for EJS users. The JTT library and some examples are available online at http://lab.dia.uned.es/jtt. PMID:22163674

  10. Cassini radar : system concept and simulation results

    NASA Astrophysics Data System (ADS)

    Melacci, P. T.; Orosei, R.; Picardi, G.; Seu, R.

    1998-10-01

    The Cassini mission is an international venture, involving NASA, the European Space Agency (ESA) and the Italian Space Agency (ASI), for the investigation of the Saturn system and, in particular, Titan. The Cassini radar will be able to see through Titan's thick, optically opaque atmosphere, allowing us to better understand the composition and morphology of its surface; the interpretation of the results, however, owing to the complex interplay of many different factors determining the radar echo, will not be possible without extensive modeling of the radar system's operation and of the surface reflectivity. In this paper, a simulator of the multimode Cassini radar is described, after a brief review of our current knowledge of Titan and a discussion of the contribution of the Cassini radar to answering currently open questions. The simulator has been implemented on a RISC 6000 computer, considering only the active modes of operation, that is, altimeter and synthetic aperture radar. In the instrument simulation, strict reference has been made to the presently planned sequence of observations and to the radar settings, including burst and single-pulse duration, pulse bandwidth, pulse repetition frequency and all other parameters which may be changed, and possibly optimized, according to the operating mode. The observed surfaces are simulated by a facet model, allowing the generation of surfaces with Gaussian or non-Gaussian roughness statistics, together with the possibility of assigning to the surface an average behaviour which can represent, for instance, a flat surface or a crater. The results of the simulation are discussed in order to check the analytical evaluations of the models of the average received echoes and of the attainable performance. In conclusion, the simulation results should allow the validation of the theoretical evaluations of the capabilities of microwave instruments, when

  11. On the use of Paleo DEMS for Simulation of historical Tsunami Events

    NASA Astrophysics Data System (ADS)

    Wronna, Martin; Baptista, Maria Ana; Götz, Joachim

    2016-04-01

    In this study, we present a methodology to reconstruct a Paleo Digital Elevation Model (PDEM), accounting for geomorphological changes between the present and the desired paleo period. We aim to simulate a historical tsunami propagation in the same geomorphological context as at the time of the event. The methodology combines historical data and GPS measurements with more recent LIDAR data to build PDEMs. Antique maps are georeferenced to obtain the locations of landform and building features; analysis and interpretation of historical accounts, descriptions and old pictures serve to attribute approximate elevations to those features and to estimate the original outline of a given site. River mouths and water-course outlines can be rebuilt from the boundaries given in the antique maps, and analysis of present-day river mouths with similar characteristics permits the reconstruction of the antique water courses. GPS-RTK measurements along chosen river mouths in similar geomorphological environments are used to derive their inclination. We applied this methodology to the 1 November 1755 flooding of Cascais, Portugal. Our results show that, using the PDEM, we can reproduce the inundation described in most of the historical accounts. This study received funding from the project ASTARTE (Assessment, Strategy and Risk Reduction for Tsunamis in Europe), a collaborative project, Grant 603839, FP7-ENV2013 6.4-3.

  12. Event Generators for Simulating Heavy Ion Interactions of Interest in Evaluating Risks in Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.; Pinsky, Lawrence; Andersen, Victor; Empl, Anton; Lee, Kerry; Smirmov, Georgi; Zapp, Neal; Ferrari, Alfredo; Tsoulou, Katerina; Roesler, Stefan; Vlachoudis, Vasilis

    2005-01-01

    Simulating the space radiation environment with Monte Carlo codes, such as FLUKA, requires the ability to model the interactions of heavy ions as they penetrate spacecraft and crew members' bodies. Monte-Carlo-type transport codes use total interaction cross sections to determine probabilistically when a particular type of interaction has occurred. Then, at that point, a distinct event generator is employed to determine separately the results of that interaction. The space radiation environment contains a full spectrum of radiation types, including relativistic nuclei, which are the most important component for the evaluation of crew doses. Interactions between incident protons and target nuclei in the spacecraft materials and crew members' bodies are well understood. However, the situation is substantially less comfortable for incident heavier nuclei (heavy ions). We have been engaged in developing several related heavy-ion interaction models based on a Quantum Molecular Dynamics-type approach for energies up through about 5 GeV per nucleon (GeV/A), as part of a NASA Consortium that includes a parallel program of cross-section measurements to guide and verify this code development.

  13. Topological events in two-dimensional grain growth: Experiments and simulations

    SciTech Connect

    Fradkov, V.E.; Glicksman, M.E.; Palmer, M.; Rajan, K. . Materials Engineering Dept.)

    1994-08-01

    Grain growth in polycrystals is a process that occurs as a result of the vanishing of small grains. The mean topological class of vanishing two-dimensional (2-D) grains was found experimentally to be about 4.5, suggesting that most vanishing grains are either 4- or 5-sided. A recent theory of 2-D grain growth is explicitly based on this fact, treating the switchings as random events. The process of shrinking of 4- and 5-sided two-dimensional grains was observed experimentally on polycrystalline films of transparent, pure succinonitrile (SCN). Grain shrinking was also studied theoretically and simulated by computer (both dynamic and Monte Carlo simulations). It was found that most shrinking grains are topologically stable and remain within their topological class until they are much smaller than their neighbors. The authors discuss the differences found between the behavior of 2-D polycrystals, a 2-D ideal soap froth, and a 2-D section of a 3-D grain structure.
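    The shrinkage of few-sided grains follows standard 2-D grain-growth theory, the von Neumann-Mullins relation dA/dt = k(n - 6): grains with fewer than six sides lose area at a constant rate set only by their topological class. A minimal numerical illustration, with arbitrary k and initial areas, is:

```python
# von Neumann-Mullins shrinkage of few-sided 2-D grains: dA/dt = k*(n - 6).
k = 0.1          # mobility constant, area units per unit time (arbitrary)
dt = 0.01

def shrink_time(n_sides, area0):
    """Integrate dA/dt = k*(n_sides - 6) until the grain vanishes."""
    if n_sides >= 6:
        raise ValueError("only few-sided (n < 6) grains shrink")
    area, t = area0, 0.0
    rate = k * (n_sides - 6)          # negative for n < 6
    while area > 0.0:
        area += rate * dt
        t += dt
    return t

t4 = shrink_time(4, 1.0)    # 4-sided grain shrinks at rate 0.2
t5 = shrink_time(5, 1.0)    # 5-sided grain shrinks at half that rate
```

    The constant-rate shrinkage is why a grain typically keeps its topological class until it is much smaller than its neighbors, as observed in the SCN experiments above.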

  14. Three dimensional numerical simulation of the April 2000 CME event with a magnetized plasma blob model

    NASA Astrophysics Data System (ADS)

    Shen, Fang

    A three-dimensional, time-dependent numerical magnetohydrodynamic (MHD) model with an asynchronous and parallel time-marching method is used to investigate the propagation of coronal mass ejections (CMEs) through the nonhomogeneous background solar wind. The solar wind background, with a self-consistent source-surface structure as the initial-boundary condition, is first computed from the source surface at 2.5 Rs to the Earth's orbit (215 Rs) and beyond. The CMEs are simulated by means of a very simple flux-rope model: a high-density, high-velocity magnetized plasma blob is superposed on the background steady-state solar wind with an initial velocity and launch direction. The dynamical interaction of a CME with the background solar wind flow between 2.5 and 220 Rs is investigated. We chose the well-defined halo-CME event of 4-6 April 2000 as a test case. In this validation study, we find that this 3D MHD model, with the asynchronous and parallel time-marching method, the self-consistent source surface and the simple flux-rope model, provides a relatively satisfactory comparison with ACE spacecraft observations at the L1 point.

  15. Hybrid stochastic simulations of intracellular reaction-diffusion systems

    PubMed Central

    Kalantzis, Georgios

    2009-01-01

    With the observation that stochasticity is important in biological systems, stochastic chemical kinetics has begun to receive wider interest. While Monte Carlo discrete-event simulations most accurately capture the variability of molecular species, they become computationally costly for complex reaction-diffusion systems with large populations of molecules. Continuous-time models, on the other hand, are computationally efficient but fail to capture any variability in the molecular species. In this study a novel hybrid stochastic approach is introduced for simulating reaction-diffusion systems. We developed a dynamic partitioning strategy using fractional propensities, so that high-frequency processes are simulated mostly with deterministic rate-based equations and low-frequency processes mostly with the exact stochastic algorithm of Gillespie. This preserves the stochastic behavior of cellular pathways while remaining applicable to large populations of molecules. In this article we describe the hybrid algorithmic approach and demonstrate its accuracy and efficiency, compared with the Gillespie algorithm, for two different systems: first, a model of intracellular viral kinetics with two steady states, and second, a compartmental model of the postsynaptic spine head for studying the dynamics of Ca2+ and NMDA receptors. PMID:19414282
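
    The exact stochastic algorithm of Gillespie referred to above can be illustrated with the simplest birth-death system, 0 -> X at rate k_prod and X -> 0 at rate k_deg·X. The rate constants below are arbitrary, chosen only for the demonstration; a hybrid scheme of the kind the paper describes would instead route high-propensity channels to rate equations.

```python
import random

def gillespie_birth_death(k_prod=10.0, k_deg=0.1, x0=0, t_end=100.0, seed=1):
    """Gillespie's direct method for the birth-death process
    0 -> X (propensity k_prod) and X -> 0 (propensity k_deg * x).
    The stationary distribution is Poisson with mean k_prod / k_deg."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while t < t_end:
        a1, a2 = k_prod, k_deg * x      # reaction propensities
        a0 = a1 + a2
        t += rng.expovariate(a0)        # exponential waiting time to next event
        if rng.random() * a0 < a1:
            x += 1                      # production event fired
        else:
            x -= 1                      # degradation event fired
    return x

print(gillespie_birth_death())   # fluctuates around k_prod / k_deg = 100
```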

  16. Simulation of Flywheel Energy Storage System Controls

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Wolff, Frederick J.; Dravid, Narayan

    2001-01-01

    This paper presents the progress made in the controller design and operation of a flywheel energy storage system. The switching logic for the converter bridge circuit has been redefined to reduce line-current harmonics, even at the highest operating speed of the permanent-magnet motor-generator. An electromechanical machine model is used to simulate charging and discharging of the inertial energy stored in the flywheel. Controlling the magnitude of the phase currents regulates the rate of charge and discharge. The resulting improvements are demonstrated by simulation.
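
    The control relationship described above, where phase-current magnitude sets the motor torque and hence the rate of energy transfer, can be sketched with a toy rigid-rotor model. The inertia, torque constant, and current values below are invented for illustration and are not taken from the NASA system.

```python
def flywheel_charge(J=0.5, k_t=0.05, i_phase=10.0, dt=1e-3, t_end=10.0):
    """Toy charge cycle: torque T = k_t * I, rotor dynamics J dw/dt = T,
    stored kinetic energy E = (1/2) J w^2.  Raising the commanded phase
    current raises the torque and thus the charging rate."""
    omega = 0.0
    for _ in range(int(t_end / dt)):
        torque = k_t * i_phase          # torque proportional to phase current
        omega += (torque / J) * dt      # rotor spin-up (explicit Euler step)
    return omega, 0.5 * J * omega ** 2  # final speed [rad/s], energy [J]

omega, energy = flywheel_charge()
print(round(omega, 2), round(energy, 2))   # ~10 rad/s and ~25 J after 10 s
```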

  17. Simulation of traffic control signal systems

    NASA Technical Reports Server (NTRS)

    Connolly, P. J.; Concannon, P. A.; Ricci, R. C.

    1974-01-01

    In recent years there has been considerable interest in the development and testing of control strategies for networks of urban traffic signal systems by simulation. Simulation is an inexpensive and timely method for evaluating such traffic control strategies, since traffic phenomena are too complex to be defined by analytical models and since a controlled experiment may be hazardous, expensive, and slow to produce meaningful results. This paper describes the application of an urban traffic corridor program to evaluate the effectiveness of different traffic control strategies for the Massachusetts Avenue TOPICS Project.

  18. Computer simulation of bounded plasma systems

    SciTech Connect

    Lawson, W.S.

    1987-03-05

    The physical and numerical problems of kinetic simulation of a bounded electrostatic plasma system in one planar dimension are examined, and solutions to them are presented. These problems include particle absorption, reflection, and emission at boundaries, the solution of Poisson's equation under non-periodic boundary conditions, and the treatment of an external circuit connecting the boundaries. Some comments are also made regarding the problems of higher dimensions. The methods described here are implemented in a code named PDW1, which is available from Professor C.K. Birdsall, Plasma Theory and Simulation Group, Cory Hall, University of California, Berkeley, CA 94720.
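
    The non-periodic Poisson solve mentioned above reduces, in one planar dimension, to a tridiagonal linear system with Dirichlet (electrode-potential) boundary conditions. The following is a generic finite-difference sketch in the spirit of such bounded-plasma field solvers, not the actual PDW1 implementation.

```python
def solve_poisson_1d(rho, dx, phi_left, phi_right, eps0=8.854e-12):
    """Finite-difference Poisson solve on a bounded (non-periodic) domain:
        d^2 phi / dx^2 = -rho / eps0,  phi(0) = phi_left, phi(L) = phi_right.
    The resulting [1, -2, 1] tridiagonal system is solved with the Thomas
    algorithm in O(n) operations."""
    n = len(rho)                                   # number of interior nodes
    # Right-hand side with the Dirichlet boundary potentials folded in.
    d = [-rho[i] / eps0 * dx * dx for i in range(n)]
    d[0] -= phi_left
    d[-1] -= phi_right
    # Forward elimination (Thomas algorithm).
    c = [0.0] * n
    c[0] = 1.0 / (-2.0)
    d[0] = d[0] / (-2.0)
    for i in range(1, n):
        m = -2.0 - c[i - 1]
        c[i] = 1.0 / m
        d[i] = (d[i] - d[i - 1]) / m
    # Back substitution: d now holds phi at the interior nodes.
    for i in range(n - 2, -1, -1):
        d[i] -= c[i] * d[i + 1]
    return d

phi = solve_poisson_1d([0.0, 0.0, 0.0], 1.0, 0.0, 1.0)
print(phi)   # zero charge -> linear potential, approximately [0.25, 0.5, 0.75]
```

With zero charge density the potential interpolates linearly between the electrode values, which makes a convenient correctness check before adding particle charge deposition.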

  19. Parachute system design, analysis, and simulation tool

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-01-01

    For over twenty years, designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash, and line sail. In addition to these codes, material-property databases have been acquired. Recently we initiated a project to integrate these codes and databases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS, describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials database that includes high-rate-of-strain data.

  20. Uncertainty analysis of numerical model simulations and HFR measurements during high energy events

    NASA Astrophysics Data System (ADS)

    Donncha, Fearghal O.; Ragnoli, Emanuele; Suits, Frank; Updyke, Teresa; Roarty, Hugh

    2013-04-01

    The identification and decomposition of sensor and model shortcomings is a fundamental component of any coastal monitoring and predictive system. In this research, numerical model simulations are combined with high-frequency radar (HFR) measurements to provide insight into the statistical accuracy of the remote sensing unit. A combination of classical tidal analysis and quantitative measures of correlation evaluates the performance of both across the bay. A network of high-frequency radars is deployed within the Chesapeake study site, on the east coast of the United States, as a backbone component of the Integrated Ocean Observing System (IOOS). This system provides real-time synoptic measurements of surface currents in the zonal and meridional directions at hourly intervals in areas where at least two stations overlap, and radial components elsewhere. In conjunction with this, numerical simulations using EFDC (Environmental Fluid Dynamics Code), an advanced three-dimensional model, provide additional detail on flows, encompassing both surface dynamics and volumetric transports, while eliminating certain fundamental errors inherent in the HFR system such as geometric dilution of precision (GDOP) and range dependencies. The aim of this research is an uncertainty estimate of both datasets, allowing for a degree of inaccuracy in each. The analysis focuses on comparisons between the vector and radial components of flow returned by the HFR relative to numerical predictions, providing insight into the reported accuracy of both the raw radial data and the post-processed vector current data computed by combining the radial data. Of particular interest is any loss of accuracy due to this post-processing. Linear regression techniques decompose the surface currents by dominant flow process (tide and wind); statistical analysis and cross-correlation techniques measure agreement between the processed signal and the dominant forcing parameters. The tidal signal