Science.gov

Sample records for event system simulation

  1. Synchronous Parallel System for Emulation and Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    2001-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to the state variables of the simulation object attributable to the event object and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
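
    A minimal sketch of the horizon computation described above, under the usual reading that events no later than the global event horizon can no longer be superseded and are safe to commit. All names are illustrative; the patent specifies behavior, not an API:

    def global_event_horizon(pending_by_node):
        """Earliest local event horizon across all nodes.

        pending_by_node: one list of freshly generated event timestamps
        per node. A node's local horizon is its earliest new timestamp;
        the global horizon is the earliest of the local horizons.
        """
        local = [min(ts) for ts in pending_by_node if ts]
        return min(local) if local else float("inf")

    # Three nodes with newly generated events:
    pending = [[12.0, 17.5], [9.3, 44.0], [15.1]]
    geh = global_event_horizon(pending)                   # 9.3
    safe = [t for ts in pending for t in ts if t <= geh]  # committable now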

  2. Synchronous parallel system for emulation and discrete event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1992-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to state variables of the simulation object attributable to the event object, and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring the events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.

  3. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in a method of performing object-oriented simulation and a system having inter-connected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.

  4. Parallel/distributed simulation via event-reservation approach for parametric study of discrete event systems

    NASA Astrophysics Data System (ADS)

    Bhatti, Ghulam M.; Vakili, Pirooz

    1997-06-01

    There are significant opportunities for the development of parallel/distributed simulation algorithms in the context of parametric study of discrete event systems. In such studies, simulation of multiple (often a large number of) parametric variants is required in order to, for example, identify significant parameters (factor screening), determine directions for response improvement (gradient estimation), find optimal parameter settings (response optimization), or construct a model of the response (meta-modeling). The computational burden in this case is to a large extent due to the large number of alternatives that need to be simulated. An effective strategy in this context is to concurrently simulate a number of parametric variants: the structural similarity of the variants often allows for a significant amount of sharing of the simulation work, and the code for concurrent simulation of the variants can often be implemented in a parallel/distributed environment. In this paper, we describe two methods of parallel/distributed/concurrent simulation called the standard clock (SC) and the general shared clock (GSC) simulation. Both approaches rely on event reservation: in contrast to most discrete-event simulation methods, which are based on event scheduling, in the SC and GSC simulation the occurrence instances of all events are reserved on the time axis. These instances may or may not be used. This event-reservation approach frees the clock mechanism of the simulation from needing feedback from the state-update mechanism. Due to this autonomy of the clock mechanism, a single clock can be used to drive a number (possibly large) of variants concurrently and in parallel. The autonomy of the clock mechanism is also the key to the different implementation strategies we adopt. To illustrate, we describe the simulation of parametric versions of wireless communication networks in message-passing and shared-memory environments.
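
    The event-reservation idea admits a compact sketch: below, a single uniformized Poisson clock reserves event instants for several M/M/1 variants that differ only in service rate, and an instant a variant cannot use becomes a null event for it. This is a toy reconstruction of the standard clock (SC) idea, not the authors' code:

    import random

    def standard_clock(lam, mus, t_end):
        """One clock drives all parametric variants (service rates mus)."""
        LAM = lam + max(mus)                 # uniformizing clock rate
        t, queues = 0.0, [0] * len(mus)
        while t < t_end:
            t += random.expovariate(LAM)     # reserved instant, all variants
            u = random.random()              # one coin shared by all variants
            for i, mu in enumerate(mus):
                if u < lam / LAM:
                    queues[i] += 1           # arrival, used by every variant
                elif u < (lam + mu) / LAM and queues[i] > 0:
                    queues[i] -= 1           # departure this variant can use
                # else: a null event for variant i
        return queues

    print(standard_clock(lam=1.0, mus=[1.5, 2.0, 4.0], t_end=1000.0))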

  5. Multi-threaded, discrete event simulation of distributed computing systems

    NASA Astrophysics Data System (ADS)

    Legrand, Iosif; MONARC Collaboration

    2001-10-01

    The LHC experiments have envisaged computing systems of unprecedented complexity, for which it is necessary to provide a realistic description and modeling of data access patterns, and of many jobs running concurrently on large scale distributed systems and exchanging very large amounts of data. A process oriented approach for discrete event simulation is well suited to describe various activities running concurrently, as well as the stochastic arrival patterns specific to this type of simulation. Threaded objects or "Active Objects" can provide a natural way to map the specific behaviour of distributed data processing into the simulation program. The simulation tool developed within MONARC is based on Java(TM) technology, which provides adequate tools for developing a flexible and distributed process oriented simulation. Proper graphics tools, and ways to analyze data interactively, are essential in any simulation project. The design elements, status and features of the MONARC simulation tool are presented. The program allows realistic modeling of complex data access patterns by multiple concurrent users in large scale computing systems in a wide range of possible architectures, from centralized to highly distributed. Comparison between queuing theory and realistic client-server measurements is also presented.
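
    MONARC's engine is Java-based; purely to illustrate the process-oriented style with "active objects", here is a toy kernel in Python in which each process is a coroutine that yields hold times. All names are invented for the sketch:

    import heapq

    def process(env, name, delays):
        """An 'active object': a generator describing one activity."""
        for d in delays:
            yield d                        # suspend for d time units
            print(f"{name} resumed at t={env['now']:.2f}")

    def run(procs):
        """Resume the earliest-scheduled process until none remain."""
        env = {"now": 0.0}
        agenda = [(0.0, i, make(env)) for i, make in enumerate(procs)]
        heapq.heapify(agenda)
        while agenda:
            t, i, g = heapq.heappop(agenda)
            env["now"] = t
            try:
                hold = next(g)             # run process to its next yield
                heapq.heappush(agenda, (t + hold, i, g))
            except StopIteration:
                pass                       # process finished

    run([lambda env: process(env, "jobA", [1.0, 2.0]),
         lambda env: process(env, "jobB", [1.5])])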

  6. Simulating rare events in equilibrium or nonequilibrium stochastic systems.

    PubMed

    Allen, Rosalind J; Frenkel, Daan; ten Wolde, Pieter Rein

    2006-01-14

    We present three algorithms for calculating rate constants and sampling transition paths for rare events in simulations with stochastic dynamics. The methods do not require a priori knowledge of the phase-space density and are suitable for equilibrium or nonequilibrium systems in stationary state. All the methods use a series of interfaces in phase space, between the initial and final states, to generate transition paths as chains of connected partial paths, in a ratchetlike manner. No assumptions are made about the distribution of paths at the interfaces. The three methods differ in the way that the transition path ensemble is generated. We apply the algorithms to kinetic Monte Carlo simulations of a genetic switch and to Langevin dynamics simulations of intermittently driven polymer translocation through a pore. We find that the three methods are all of comparable efficiency, and that all the methods are much more efficient than brute-force simulation.

  7. An event-based hydrologic simulation model for bioretention systems.

    PubMed

    Roy-Poirier, A; Filion, Y; Champagne, P

    2015-01-01

    Bioretention systems are designed to treat stormwater and provide attenuated drainage between storms. Bioretention has shown great potential at reducing the volume and improving the quality of stormwater. This study introduces the bioretention hydrologic model (BHM), a one-dimensional model that simulates the hydrologic response of a bioretention system over the duration of a storm event. BHM is based on the RECARGA model, but has been adapted for improved accuracy and integration of pollutant transport models. BHM contains four completely-mixed layers and accounts for evapotranspiration, overflow, exfiltration to native soils and underdrain discharge. Model results were evaluated against field data collected over 10 storm events. Simulated flows were particularly sensitive to antecedent water content and drainage parameters of bioretention soils, which were calibrated through an optimisation algorithm. Temporal disparity was observed between simulated and measured flows, which was attributed to preferential flow paths formed within the soil matrix of the field system. Modelling results suggest that soil water storage is the most important short-term hydrologic process in bioretention, with exfiltration having the potential to be significant in native soils with sufficient permeability.

  8. Rare event simulation of the chaotic Lorenz 96 dynamical system

    NASA Astrophysics Data System (ADS)

    Wouters, Jeroen; Bouchet, Freddy

    2015-04-01

    The simulation of rare events is becoming increasingly important in the climate sciences. Several sessions are devoted to rare and extreme events at this meeting and the IPCC has devoted a special report to risk management of extreme events (SREX). Brute force simulation of rare events can however be very costly. To obtain satisfactory statistics on a 1/1000y event, one needs to perform simulations over several thousands of years. Recently, a class of rare event simulation algorithms has been introduced that could yield significant increases in performance with respect to brute force simulations (see e.g. [1]). In these algorithms an ensemble of simulations is evolved in parallel, while at certain interaction times ensemble members are killed and cloned so as to have better statistics in the region of phase space that is relevant to the rare event of interest. We will discuss the implementational issues and performance gains for these algorithms. We also present results on a first application of a rare event simulation algorithm to a toy model for chaos in the atmosphere, the Lorenz 96 model. We demonstrate that for the estimation of the histogram tail of the energy observable, the algorithm gives a significant error reduction. We will furthermore discuss first results and an outlook on the application of rare event simulation algorithms to study blocking atmospheric circulation and heat wave events in the PlaSim climate model [2]. [1] Del Moral, P. & Garnier, J. Genealogical particle analysis of rare events. The Annals of Applied Probability 15, 2496-2534 (2005). [2] http://www.mi.uni-hamburg.de/Planet-Simul.216.0.html
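
    A hedged sketch of the selection step shared by this class of algorithms: at each interaction time, ensemble members are weighted by a score favoring the region of interest, then killed or cloned by resampling. The exponential form of the weight and the score function are assumptions of the sketch; see [1] for the exact estimator:

    import math, random

    def interaction_step(members, score, k=1.0):
        """Kill/clone an ensemble at one interaction time.

        Members are resampled in proportion to exp(k * score), cloning
        promising ones and killing the rest. Clones must evolve with
        independent noise afterwards. Unbiased rare-event estimates
        reweight survivors by their accumulated selection weights and
        the product of the per-step mean weights.
        """
        w = [math.exp(k * score(m)) for m in members]
        mean_w = sum(w) / len(w)
        clones = random.choices(members, weights=w, k=len(members))
        return clones, mean_w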

  9. Enhancing Complex System Performance Using Discrete-Event Simulation

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E

    2010-01-01

    In this paper, we utilize discrete-event simulation (DES) merged with human factors analysis to provide the venue within which the separation and deconfliction of the system/human operating principles can occur. A concrete example is presented to illustrate the performance enhancement gains for an aviation cargo flow and security inspection system achieved through the development and use of a process DES. The overall performance of the system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered, such as system capacity, residual capacity, and the total number of pallets waiting for inspection in the queue. These metrics are performance indicators of the system's ability to service current needs and respond to additional requests. We studied and analyzed different scenarios by changing various model parameters such as the pieces-per-pallet ratio, the number of inspectors and cargo handling personnel, the number of forklifts, the number and types of detection systems, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures identified effective ways to meet inspection requirements while maintaining or reducing overall operational cost and eliminating any shipping delays associated with any proposed changes in inspection requirements. With this understanding, effective operational strategies can be developed to optimally use personnel while still maintaining plant efficiency, reducing process interruptions, and holding or reducing costs.
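
    As a concrete miniature of this kind of process DES (a single inspection station, far simpler than the paper's model; all parameters invented):

    import heapq, random

    def inspect_queue(n_pallets, mean_arrive, mean_inspect, n_inspectors):
        """Event-scheduling simulation of one inspection queue."""
        fel, t = [], 0.0                       # future event list
        for _ in range(n_pallets):             # schedule arrivals up front
            t += random.expovariate(1.0 / mean_arrive)
            heapq.heappush(fel, (t, "arrival"))
        queue, busy, waits = [], 0, []
        while fel:
            now, kind = heapq.heappop(fel)
            if kind == "arrival":
                queue.append(now)
            else:
                busy -= 1                      # departure frees an inspector
            while queue and busy < n_inspectors:
                waits.append(now - queue.pop(0))
                busy += 1
                heapq.heappush(fel,
                    (now + random.expovariate(1.0 / mean_inspect), "departure"))
        return sum(waits) / len(waits)         # mean wait for inspection

    print(inspect_queue(5000, mean_arrive=1.0, mean_inspect=2.5, n_inspectors=3))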

  10. Simulating adverse event spontaneous reporting systems as preferential attachment networks: application to the Vaccine Adverse Event Reporting System.

    PubMed

    Scott, J; Botsis, T; Ball, R

    2014-01-01

    Spontaneous Reporting Systems [SRS] are critical tools in the post-licensure evaluation of medical product safety. Regulatory authorities use a variety of data mining techniques to detect potential safety signals in SRS databases. Assessing the performance of such signal detection procedures requires simulated SRS databases, but simulation strategies proposed to date each have limitations. We sought to develop a novel SRS simulation strategy based on plausible mechanisms for the growth of databases over time. We developed a simulation strategy based on the network principle of preferential attachment. We demonstrated how this strategy can be used to create simulations based on specific databases of interest, and provided an example of using such simulations to compare signal detection thresholds for a popular data mining algorithm. The preferential attachment simulations were generally structurally similar to our targeted SRS database, although they had fewer nodes of very high degree. The approach was able to generate signal-free SRS simulations, as well as mimicking specific known true signals. Explorations of different reporting thresholds for the FDA Vaccine Adverse Event Reporting System suggested that using proportional reporting ratio [PRR] > 3.0 may yield better signal detection operating characteristics than the more commonly used PRR > 2.0 threshold. The network analytic approach to SRS simulation based on the principle of preferential attachment provides an attractive framework for exploring the performance of safety signal detection algorithms. This approach is potentially more principled and versatile than existing simulation approaches. The utility of network-based SRS simulations needs to be further explored by evaluating other types of simulated signals with a broader range of data mining approaches, and comparing network-based simulations with other simulation strategies where applicable.
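
    For reference, the PRR statistic whose thresholds are compared above is computed from a 2x2 table of report counts; a sketch with invented counts:

    def prr(a, b, c, d):
        """Proportional reporting ratio.

        a: target product & target event   b: target product, other events
        c: other products & target event   d: other products, other events
        """
        return (a / (a + b)) / (c / (c + d))

    # Flag the product-event pair if PRR exceeds the chosen threshold.
    print(prr(a=30, b=970, c=60, d=8940))   # 0.030 / 0.00667 = 4.5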

  11. On constructing optimistic simulation algorithms for the discrete event system specification

    SciTech Connect

    Nutaro, James J

    2008-01-01

    This article describes a Time Warp simulation algorithm for discrete event models that are described in terms of the Discrete Event System Specification (DEVS). The article shows how the total state transition and total output function of a DEVS atomic model can be transformed into an event processing procedure for a logical process. A specific Time Warp algorithm is constructed around this logical process, and it is shown that the algorithm correctly simulates a DEVS coupled model that consists entirely of interacting atomic models. The simulation algorithm is presented abstractly; it is intended to provide a basis for implementing efficient and scalable parallel algorithms that correctly simulate DEVS models.
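
    A bare-bones sketch of the logical-process mechanics such an algorithm builds on: optimistic execution with state saving and rollback on straggler messages. Anti-messages, re-execution of rolled-back events, and GVT-based fossil collection are deliberately elided, and all names are illustrative:

    import copy

    class LogicalProcess:
        def __init__(self, state, handler):
            self.lvt = 0.0                 # local virtual time
            self.state = state
            self.handler = handler         # (state, msg) -> new state
            self.log = []                  # [(timestamp, msg, saved state)]

        def receive(self, ts, msg):
            if ts < self.lvt:              # straggler: roll back past it
                while self.log and self.log[-1][0] > ts:
                    _, undone, saved = self.log.pop()
                    self.state = saved     # restore pre-event state
                    # anti-messages for outputs of `undone` would go here,
                    # and `undone` would be re-enqueued for re-execution
            self.log.append((ts, msg, copy.deepcopy(self.state)))
            self.state = self.handler(self.state, msg)
            self.lvt = ts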

  12. Parametric Parallel Simulation of Discrete Event Systems on SIMD Supercomputers

    DTIC Science & Technology

    1994-05-01

    [Fragmented excerpt; equations (5.20) and (5.21), giving the probabilities of an arrival at node i, of an accepted departure from node i joining node j, and of a null event in terms of the uniformized rate q_max,BE, are garbled in the source.] The departure rate from node j is 0 when that node is in state 0 and μ_j otherwise: Departure Rate from Node j = 0 * π(0_j) + μ_j * (1 - π(0_j)), where π(0_j) denotes the probability that node j is empty.

  13. The use of discrete-event simulation modeling to compare handwritten and electronic prescribing systems.

    PubMed

    Ghany, Ahmad; Vassanji, Karim; Kuziemsky, Craig; Keshavjee, Karim

    2013-01-01

    Electronic prescribing (e-prescribing) is expected to bring many benefits to Canadian healthcare, such as a reduction in errors and adverse drug reactions. As there currently is no functioning e-prescribing system in Canada that is completely electronic, we are unable to evaluate the performance of a live system. An alternative approach is to use simulation modeling for evaluation. We developed two discrete-event simulation models, one of the current handwritten prescribing system and one of a proposed e-prescribing system, to compare the performance of these two systems. We were able to compare the number of processes in each model, workflow efficiency, and the distribution of patients or prescriptions. Although we were able to compare these models to each other, using discrete-event simulation software was challenging. We were limited in the number of variables we could measure. We discovered non-linear processes and feedback loops in both models that could not be adequately represented using discrete-event simulation software. Finally, interactions between entities in both models could not be modeled using this type of software. We have come to the conclusion that a more appropriate approach to modeling both the handwritten and electronic prescribing systems would be to use a complex adaptive systems approach using agent-based modeling or systems-based modeling.

  14. An Early Warning System for Loan Risk Assessment Based on Rare Event Simulation

    NASA Astrophysics Data System (ADS)

    Zhou, Hong; Qiu, Yue; Wu, Yueqin

    System simulation is an important tool for risk assessment. In this paper, a new method is presented to deal with credit risk assessment problems for commercial banks based on rare event simulation. The failure probability of repaying loans of a listed company is taken as the criterion to measure the level of credit risk. The rare-event concept is adopted to construct the model of credit risk identification in commercial banks, and a cross-entropy scheme is designed to implement the rare event simulation, based on which the loss probability can be assessed. Numerical experiments have shown that the method has a strong capability to identify the credit risk for commercial banks and offers a good tool for early warning.
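
    The cross-entropy idea can be shown on a generic toy problem, estimating P(X >= gamma) for an exponential X by iteratively tilting the sampling mean toward the tail. This stands in for, but is not, the paper's credit-risk model:

    import math, random

    def ce_rare_event(gamma, u=1.0, n=10_000, rho=0.1):
        """Cross-entropy estimate of p = P(X >= gamma), X ~ Exp(mean u)."""
        v = u                                  # current tilted mean
        for _ in range(100):
            xs = sorted(random.expovariate(1.0 / v) for _ in range(n))
            level = xs[int((1 - rho) * n)]     # (1 - rho) sample quantile
            if level >= gamma:
                break                          # tilt now reaches the rare set
            elite = [x for x in xs if x >= level]
            w = [(v / u) * math.exp(x / v - x / u) for x in elite]  # f_u / f_v
            v = sum(wi * x for wi, x in zip(w, elite)) / sum(w)
        lr = lambda x: (v / u) * math.exp(x / v - x / u)
        ys = [random.expovariate(1.0 / v) for _ in range(n)]
        return sum(lr(y) for y in ys if y >= gamma) / n

    print(ce_rare_event(gamma=20.0))   # approx exp(-20) ~ 2.1e-9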

  15. Discrete-event simulation for the design and evaluation of physical protection systems

    SciTech Connect

    Jordan, S.E.; Snell, M.K.; Madsen, M.M.; Smith, J.S.; Peters, B.A.

    1998-08-01

    This paper explores the use of discrete-event simulation for the design and control of physical protection systems for fixed-site facilities housing items of significant value. It begins by discussing several modeling and simulation activities currently performed in designing and analyzing these protection systems and then discusses capabilities that design/analysis tools should have. The remainder of the article then discusses in detail how some of these new capabilities have been implemented in software to achieve a prototype design and analysis tool. The simulation software technology provides a communications mechanism between a running simulation and one or more external programs. In the prototype security analysis tool, these capabilities are used to facilitate human-in-the-loop interaction and to support a real-time connection to a virtual reality (VR) model of the facility being analyzed. This simulation tool can be used for both training (in real-time mode) and facility analysis and design (in fast mode).

  16. Discrete event simulation as a tool in optimization of a professional complex adaptive system.

    PubMed

    Nielsen, Anders Lassen; Hilwig, Helmer; Kissoon, Niranjan; Teelucksingh, Surujpal

    2008-01-01

    Similar urgent needs for improvement of health care systems exist in the developed and developing world. The culture and the organization of an emergency department in developing countries can best be described as a professional complex adaptive system, where each agent (employee) is ignorant of the behavior of the system as a whole; no one understands the entire system. Each agent's action is based on the state of the system at the moment (e.g., lack of medicine, unavailable laboratory investigations, lack of beds, and lack of staff in certain functions). An important question is how one can improve the emergency service within the given constraints. The use of simulation is one new approach to studying issues amenable to improvement. Discrete event simulation was used to simulate part of the patient flow in an emergency department. A simple model was built using a prototyping approach. The simulation showed that a minor rotation among the nurses could reduce the mean number of visitors that had to be referred to alternative flows within the hospital from 87 to 37 per day, with a mean staff utilization between 95.8% (the nurses) and 87.4% (the doctors). We conclude that, even faced with resource constraints and a lack of accessible data, discrete event simulation is a tool that can be used successfully to study the consequences of changes in very complex and self-organizing professional complex adaptive systems.

  17. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  18. An Adaptive Simulation Framework for the Exploration of Extreme and Unexpected Events in Dynamic Engineered Systems.

    PubMed

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2017-01-01

    The end states reached by an engineered system during an accident scenario depend not only on the sequence of events composing the scenario, but also on their timing and magnitudes. Including these additional features within an overarching framework can render the analysis infeasible in practical cases, due to the high dimension of the system state-space and the computational effort correspondingly needed to explore the possible system evolutions in search of the interesting (and very rare) ones leading to failure. To tackle this hurdle, in this article we introduce a framework for efficiently probing the space of event sequences of a dynamic system by means of a guided Monte Carlo simulation. The framework is semi-automatic and allows embedding the analyst's prior knowledge about the system and his/her objectives of analysis. Specifically, it allows the simulation effort to be adaptively and intelligently allocated to those sequences leading to outcomes of interest for the objectives of the analysis, e.g., typically those that are more safety-critical (and/or rare). The diversification emerging in the filling of the state-space by the preference-guided exploration also allows the retrieval of critical system features, which can be useful to analysts and designers for taking appropriate measures of prevention and mitigation against dangerous and/or unexpected consequences. A dynamic system for gas transmission is considered as a case study to demonstrate the application of the method.

  19. Validation of ground-motion simulations for historical events using SDoF systems

    USGS Publications Warehouse

    Galasso, C.; Zareian, F.; Iervolino, I.; Graves, R.W.

    2012-01-01

    The study presented in this paper is among the first in a series of studies toward the engineering validation of the hybrid broadband ground‐motion simulation methodology by Graves and Pitarka (2010). This paper provides a statistical comparison between seismic demands of single degree of freedom (SDoF) systems subjected to past events using simulations and actual recordings. A number of SDoF systems are selected considering the following: (1) 16 oscillation periods between 0.1 and 6 s; (2) elastic case and four nonlinearity levels, from mildly inelastic to severely inelastic systems; and (3) two hysteretic behaviors, in particular, nondegrading–nonevolutionary and degrading–evolutionary. Demand spectra are derived in terms of peak and cyclic response, as well as their statistics for four historical earthquakes: 1979 Mw 6.5 Imperial Valley, 1989 Mw 6.8 Loma Prieta, 1992 Mw 7.2 Landers, and 1994 Mw 6.7 Northridge.

  1. Synchronization Of Parallel Discrete Event Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S.

    1992-01-01

    Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation in progress. Combines best of optimistic and conservative synchronization strategies while avoiding major disadvantages. Well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.

  2. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    PubMed

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches to FD can be distinguished: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS, which utilizes physically based water quality and hydraulics simulation models, can outperform the signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time.

  3. Teleradiology system analysis using a discrete event-driven block-oriented network simulator

    NASA Astrophysics Data System (ADS)

    Stewart, Brent K.; Dwyer, Samuel J., III

    1992-07-01

    Performance evaluation and trade-off analysis are the central issues in the design of communication networks. Simulation plays an important role in computer-aided design and analysis of communication networks and related systems, allowing testing of numerous architectural configurations and fault scenarios. We are using the Block Oriented Network Simulator (BONeS, Comdisco, Foster City, CA) software package to perform discrete, event-driven Monte Carlo simulations in capacity planning, trade-off analysis and evaluation of alternate architectures for a high-speed, high-resolution teleradiology project. A queuing network model of the teleradiology system has been devised, simulations executed, and results analyzed. The wide area network link uses a switched, dial-up N X 56 kbps inverse multiplexer where the number of digital voice-grade lines (N) can vary from one (DS-0) through 24 (DS-1). The proposed goal of such a system is 200 films (2048 X 2048 X 12-bit) transferred between a remote and local site in an eight hour period with a mean delay time less than five minutes. It is found that: (1) the DS-1 service limit is around 100 films per eight hour period with a mean delay time of 412 +/- 39 seconds, short of the goal stipulated above; (2) compressed video teleconferencing can be run simultaneously with image data transfer over the DS-1 wide area network link without impacting the performance of the described teleradiology system; (3) there is little sense in upgrading to a higher bandwidth WAN link like DS-2 or DS-3 for the current system; and (4) the goal of transmitting 200 films in an eight hour period with a mean delay time less than five minutes can be achieved simply if the laser printer interface is updated from the current DR-11W interface to a much faster SCSI interface.
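
    A back-of-envelope line-rate check of the numbers quoted above (pure transfer time, ignoring the protocol overhead, contention and queueing that the simulation does model) shows why simulation was needed: raw bandwidth alone looks far more generous than the observed 100-film limit.

    bits_per_film = 2048 * 2048 * 12          # about 50.3 Mbit per film
    ds1_rate = 24 * 56_000                    # N = 24 channels at 56 kbps
    t_film = bits_per_film / ds1_rate         # about 37.4 s per film
    films_per_8h = 8 * 3600 / t_film          # about 770 films at line rate
    print(t_film, films_per_8h)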

  4. Simulation of a 36 h solar particle event at LLUMC using a proton beam scanning system

    NASA Astrophysics Data System (ADS)

    Coutrakon, G. B.; Benton, E. R.; Gridley, D. S.; Hickey, T.; Hubbard, J.; Koss, P.; Moyers, M. F.; Nelson, G. A.; Pecaut, M. J.; Sanders, E.; Shahnazi, K.

    2007-08-01

    A radiation biology experiment was performed in the research room of the proton therapy facility at Loma Linda University Medical Center to simulate the proton exposure produced by a solar particle event. The experiment used two scanning magnets for X and Y deflection of the proton beam and covered a usable target area of nearly 1 m². The magnet scanning control system consisted of LabVIEW 6.0 software running on a PC. The goal of this experiment was to study the immune system response of 48 mice simultaneously exposed to 2 Gy of protons that simulated the dose rate and energy spectrum of the September 1989 solar particle event. The 2 Gy dose was delivered to the entrance of the mice cages over 36 h. Both ion chamber and TLD measurements indicated that the dose delivered was within 9% of the intended value. A spot scanning technique using one spot per accelerator cycle (2.2 s) was used to deliver doses as low as 1 μGy per beam spot. Rapid beam termination (less than 5 ms) on each spot was obtained by energizing a quadrupole in the proton synchrotron once the dose limit was reached for each spot. A parallel plate ion chamber placed adjacent to the mice cages provided fluence (or dose) measurements for each beam energy during each hour of the experiment. An intensity modulated spot scanning technique can be used in a variety of ways for radiation biology and a second experiment is being designed with this proton beam scanning system to simultaneously irradiate four groups of mice with different dose rates within the 1 m² area. Also, large electronic devices being tested for radiation damage have been exposed in this beam without the use of patch fields. The same scanning system has potential application for intensity modulated proton therapy (IMPT) as well. This paper discusses the beam delivery system and dosimetry of the irradiation.

  5. Using Discrete Event Simulation to Model Attacker Interactions with Cyber and Physical Security Systems

    DOE PAGES

    Perkins, Casey; Muller, George

    2015-10-08

    The number of connections between physical and cyber security systems is rapidly increasing due to centralized control from automated and remotely connected means. As the number of interfaces between systems continues to grow, the interactions and interdependencies between them cannot be ignored. Historically, physical and cyber vulnerability assessments have been performed independently. This independent evaluation omits important aspects of the integrated system, where the impacts resulting from malicious or opportunistic attacks are not easily known or understood. Here, we describe a discrete event simulation model that uses information about integrated physical and cyber security systems, attacker characteristics and simple response rules to identify key safeguards that limit an attacker's likelihood of success. Key features of the proposed model include comprehensive data generation to support a variety of sophisticated analyses, and full parameterization of safeguard performance characteristics and attacker behaviours to evaluate a range of scenarios. Lastly, we also describe the core data requirements and the network of networks that serves as the underlying simulation structure.

  6. Using Discrete Event Simulation to Model Attacker Interactions with Cyber and Physical Security Systems

    SciTech Connect

    Perkins, Casey; Muller, George

    2015-10-08

    The number of connections between physical and cyber security systems is rapidly increasing due to centralized control from automated and remotely connected means. As the number of interfaces between systems continues to grow, the interactions and interdependencies between them cannot be ignored. Historically, physical and cyber vulnerability assessments have been performed independently. This independent evaluation omits important aspects of the integrated system, where the impacts resulting from malicious or opportunistic attacks are not easily known or understood. Here, we describe a discrete event simulation model that uses information about integrated physical and cyber security systems, attacker characteristics and simple response rules to identify key safeguards that limit an attacker's likelihood of success. Key features of the proposed model include comprehensive data generation to support a variety of sophisticated analyses, and full parameterization of safeguard performance characteristics and attacker behaviours to evaluate a range of scenarios. Lastly, we also describe the core data requirements and the network of networks that serves as the underlying simulation structure.

  7. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.

  8. The IDES framework: A case study in development of a parallel discrete-event simulation system

    SciTech Connect

    Nicol, D.M.; Johnson, M.M.; Yoshimura, A.S.

    1997-12-31

    This tutorial describes considerations in the design and development of the IDES parallel simulation system. IDES is a Java-based parallel/distributed simulation system designed to support the study of complex large-scale enterprise systems. Using the IDES system as an example, the authors discuss how anticipated model and system constraints molded the design decisions with respect to modeling, synchronization, and communication strategies.

  9. Standardized Simulated Events for Provocative Testing of Medical Care System Rescue Capabilities

    DTIC Science & Technology

    2005-01-01

    [Fragmented excerpt; a results table is garbled in the source.] Observed simulated event behavior: an apneic event was initiated on room air and on 100 percent O2 (the PaO2 values are truncated in the source). Hypoxia and hypotension were defined as SpO2 and systolic BP below thresholds defined per Pediatric Advanced Life Support. The remaining table residue lists monitored parameters: ETCO2, continuous auscultation, SpO2 with continuous tone/beep and alarm, SpO2 plethysmograph, and SpO2 heart rate.

  10. Program For Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.

    1991-01-01

    User does not have to add any special logic to aid in synchronization. Time Warp Operating System (TWOS) computer program is special-purpose operating system designed to support parallel discrete-event simulation. Complete implementation of Time Warp mechanism. Supports only simulations and other computations designed for virtual time. Time Warp Simulator (TWSIM) subdirectory contains sequential simulation engine interface-compatible with TWOS. TWOS and TWSIM written in, and support simulations in, C programming language.

  11. Incorporating discrete event simulation into quality improvement efforts in health care systems.

    PubMed

    Rutberg, Matthew Harris; Wenczel, Sharon; Devaney, John; Goldlust, Eric Jonathan; Day, Theodore Eugene

    2015-01-01

    Quality improvement (QI) efforts are an indispensable aspect of health care delivery, particularly in an environment of increasing financial and regulatory pressures. The ability to test predictions of proposed changes to flow, policy, staffing, and other process-level changes using discrete event simulation (DES) has shown significant promise and is well reported in the literature. This article describes how to incorporate DES into QI departments and programs in order to support QI efforts, develop high-fidelity simulation models, conduct experiments, make recommendations, and support adoption of results. The authors describe how DES-enabled QI teams can partner with clinical services and administration to plan, conduct, and sustain QI investigations.

  12. Weighted-ensemble Brownian dynamics simulation: sampling of rare events in nonequilibrium systems.

    PubMed

    Kromer, Justus A; Schimansky-Geier, Lutz; Toral, Raul

    2013-06-01

    We provide an algorithm based on weighted-ensemble (WE) methods, to accurately sample systems at steady state. Applying our method to different one- and two-dimensional models, we succeed in calculating steady-state probabilities of order 10^-300 and reproduce the Arrhenius law for rates of order 10^-280. Special attention is paid to the simulation of nonpotential systems where no detailed balance assumption exists. For this large class of stochastic systems, the stationary probability distribution density is often unknown and cannot be used as preknowledge during the simulation. We compare the algorithm's efficiency with standard Brownian dynamics simulations and the original WE method.
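
    The WE bookkeeping can be sketched as a bin-wise split/merge that conserves probability weight exactly, which is what lets such simulations resolve probabilities of order 10^-300. The original method pairs explicit splits with merges; this toy version resamples within bins instead, and the walker representation and bin_of function are assumptions of the sketch:

    import random

    def we_resample(walkers, bin_of, per_bin=4):
        """walkers: list of (position, weight). Total weight in every
        occupied bin is preserved exactly while the walker count per bin
        is reset to per_bin (splitting heavy walkers, merging light ones)."""
        bins = {}
        for x, w in walkers:
            bins.setdefault(bin_of(x), []).append((x, w))
        out = []
        for members in bins.values():
            W = sum(w for _, w in members)
            xs = random.choices([x for x, _ in members],
                                weights=[w for _, w in members], k=per_bin)
            out.extend((x, W / per_bin) for x in xs)
        return out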

  13. Integration of scheduling and discrete event simulation systems to improve production flow planning

    NASA Astrophysics Data System (ADS)

    Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.

    2016-08-01

    The increased availability of data and of computer-aided technologies such as MRP I/II, ERP and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. Integration of production scheduling with computer modelling, simulation and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology is proposed, based on a semi-automatic model generation method, for eliminating problems associated with the complexity of the model and with the labour-intensive and time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied. This approach is illustrated through examples of practical implementation of the proposed method using the KbRS scheduling system and the Enterprise Dynamics simulation system.

  14. Simulation-Based Model Checking for Nondeterministic Systems and Rare Events

    DTIC Science & Technology

    2016-03-24

    very large systems; this research is critical to allow statistical methods to reason about realistic systems involving nondeterminism and low...about low-probability events. Outcome/Impact: Statistical model checking methods scale better than traditional analytic methods for very large systems...implementation was based on the text by Edelkamp and Schrödl [2]. We were hampered by a substantial error in the book’s presentation of the algorithm. We have

  15. Dynamic simulation recalls condensate piping event

    SciTech Connect

    Farrell, R.J.; Reneberg, K.O.; Moy, H.C.

    1994-05-01

    This article describes how experience gained from simulating and reconstructing a condensate piping event will be used by Consolidated Edison to analyze control system problems. A cooperative effort by Con Edison and the Chemical Engineering Department at Polytechnic University used the Modular Modeling System (MMS) to investigate the probable cause of a Con Edison condensate piping event. Con Edison commissioned the work to serve as a case study for the more general problem of control systems analysis using dynamic simulation and MMS.

  16. StochKit2: software for discrete stochastic simulation of biochemical systems with events.

    PubMed

    Sanft, Kevin R; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R

    2011-09-01

    StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), and tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/. Contact: petzold@engineering.ucsb.edu.
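
    For orientation, the direct-method SSA that StochKit2 provides in optimized form can be written in a few lines; the toy model below is invented:

    import random

    def ssa_direct(x, props, stoich, t_end):
        """x: dict of species counts; props: propensity functions a_j(x);
        stoich: per-reaction species changes."""
        t = 0.0
        while True:
            a = [p(x) for p in props]
            a0 = sum(a)
            if a0 == 0.0:
                break                          # nothing can fire
            dt = random.expovariate(a0)        # time to next reaction
            if t + dt > t_end:
                break
            t += dt
            u, j, acc = random.random() * a0, 0, a[0]
            while acc < u:                     # pick j with prob a_j / a0
                j += 1
                acc += a[j]
            for s, d in stoich[j].items():
                x[s] += d                      # apply stoichiometry
        return x

    # Toy system: A + A -> B at c = 0.005, B -> 0 at c = 0.1
    print(ssa_direct({"A": 100, "B": 0},
                     [lambda s: 0.005 * s["A"] * (s["A"] - 1) / 2,
                      lambda s: 0.1 * s["B"]],
                     [{"A": -2, "B": 1}, {"B": -1}], t_end=10.0))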

  17. Anticipating the Chaotic Behaviour of Industrial Systems Based on Stochastic, Event-Driven Simulations

    NASA Astrophysics Data System (ADS)

    Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra

    2004-08-01

    In logistics and industrial production, managers must deal with the impact of stochastic events to improve performance and reduce costs. In fact, production and logistics systems are generally designed treating some parameters as deterministic. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, especially for estimated parameters (e.g., Market Request). The proposed methodology can determine the impact of stochastic events on the system by evaluating the chaotic threshold level. Such an approach, based on the application of a new and innovative methodology, can be implemented to find the condition under which chaos makes the system uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and contingency plan estimation. In this paper the authors illustrate the methodology with respect to a real industrial case: a production problem related to the logistics of distributed chemical processing.

  18. A State Event Detection Algorithm for Numerically Simulating Hybrid Systems with Model Singularities

    DTIC Science & Technology

    2007-01-01

    introduced there as well. However, in these early works as well as in Hay and Griffin [1979], Joglekar and Reklaitis [1984], and Prestin and Berzine...1995. Nonlinear Control Systems. Springer, London. Joglekar, G. and Reklaitis, G. 1984. A simulator for batch and semi-continuous processes

  19. Forward flux sampling for rare event simulations.

    PubMed

    Allen, Rosalind J; Valeriani, Chantal; Rein Ten Wolde, Pieter

    2009-11-18

    Rare events are ubiquitous in many different fields, yet they are notoriously difficult to simulate because few, if any, events are observed in a conventional simulation run. Over the past several decades, specialized simulation methods have been developed to overcome this problem. We review one recently developed class of such methods, known as forward flux sampling. Forward flux sampling uses a series of interfaces between the initial and final states to calculate rate constants and generate transition paths for rare events in equilibrium or nonequilibrium systems with stochastic dynamics. This review draws together a number of recent advances, summarizes several applications of the method and highlights challenges that remain to be overcome.
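
    The interface mechanics of direct FFS reduce to a short skeleton. Here run_to_next, the dynamics hook that evolves a configuration until it either crosses the next interface or falls back into the initial basin A, is an assumed user-supplied function:

    import random

    def ffs(run_to_next, x0, n_store, k_trials, lambdas):
        """lambdas: ordered interface values of the order parameter.
        Returns the product of interface-crossing probabilities; the rate
        constant is this product times the measured flux through lambda_0."""
        configs = [x0] * n_store           # states stored at interface 0
        p_total = 1.0
        for lam_next in lambdas[1:]:
            hits = []
            for _ in range(k_trials):      # fire partial paths forward
                crossed, x = run_to_next(random.choice(configs), lam_next)
                if crossed:
                    hits.append(x)         # paths grow in a ratchet-like way
            if not hits:
                return 0.0                 # no forward path; need more trials
            p_total *= len(hits) / k_trials
            configs = hits
        return p_total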

  20. Algorithmic scalability in globally constrained conservative parallel discrete event simulations of asynchronous systems.

    PubMed

    Kolakowska, A; Novotny, M A; Korniss, G

    2003-04-01

    We consider parallel simulations for asynchronous systems employing L processing elements that are arranged on a ring. Processors communicate only among the nearest neighbors and advance their local simulated time only if it is guaranteed that this does not violate causality. In simulations with no constraints, in the infinite L limit the utilization scales [Korniss et al., Phys. Rev. Lett. 84, 1351 (2000)], but the width of the virtual time horizon diverges (i.e., the measurement phase of the algorithm does not scale). In this work, we introduce a moving Delta-window global constraint, which modifies the algorithm so that the measurement phase scales as well. We present results of systematic studies in which the system size (i.e., L and the volume load per processor) as well as the constraint are varied. The Delta constraint eliminates the extreme fluctuations in the virtual time horizon, provides a bound on its width, and controls the average progress rate. The width of the Delta window can serve as a tuning parameter that, for a given volume load per processor, could be adjusted to optimize the utilization, so as to maximize the efficiency. This result may find numerous applications in modeling the evolution of general spatially extended short-range interacting systems with asynchronous dynamics, including dynamic Monte Carlo studies.
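
    A toy reproduction of the model (not the paper's code): L sites on a ring carry local virtual times, a site advances only if it does not exceed either neighbor, and the moving Delta window additionally freezes any site too far ahead of the slowest one:

    import random

    def ring_utilization(L, steps, delta=float("inf")):
        """Mean fraction of sites that advance per parallel step."""
        tau = [0.0] * L
        util = 0.0
        for _ in range(steps):
            t_min = min(tau)
            ok = [i for i in range(L)
                  if tau[i] <= tau[(i - 1) % L]
                  and tau[i] <= tau[(i + 1) % L]
                  and tau[i] - t_min < delta]   # the Delta-window constraint
            for i in ok:                        # simultaneous update
                tau[i] += random.expovariate(1.0)
            util += len(ok) / L
        return util / steps

    print(ring_utilization(L=1000, steps=2000))             # approx 0.25
    print(ring_utilization(L=1000, steps=2000, delta=5.0))  # bounded horizon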

  1. Simulations of rare events in fiber optics by interacting particle systems

    NASA Astrophysics Data System (ADS)

    Garnier, Josselin; Moral, Pierre Del

    2006-11-01

    In this paper we study the robustness of linear pulses, solitons, and dispersion-managed solitons, under the influence of random perturbations. First, we address the problem of the estimation of the outage probability due to polarization-mode dispersion. Second, we compare the pulse broadening due to random fluctuations of the group-velocity dispersion. We use an original interacting particle system to estimate the tails of the probability density functions of the pulse widths. A new adaptive Monte Carlo method is applied that forces the simulations to probe the regions of practical importance by selection and mutation steps.

  2. Agent-based modeling to simulate contamination events and evaluate threat management strategies in water distribution systems.

    PubMed

    Zechman, Emily M

    2011-05-01

    In the event of contamination of a water distribution system, decisions must be made to mitigate the impact of the contamination and to protect public health. Making threat management decisions while a contaminant spreads through the network is a dynamic and interactive process. Response actions taken by the utility managers and water consumption choices made by the consumers will affect the hydraulics, and thus the spread of the contaminant plume, in the network. A modeling framework that allows the simulation of a contamination event under the effects of actions taken by utility managers and consumers will be a useful tool for the analysis of alternative threat mitigation and management strategies. This article presents a multiagent modeling framework that combines agent-based, mechanistic, and dynamic methods. Agents select actions based on a set of rules that represent an individual's autonomy, goal-based desires, and reaction to the environment and the actions of other agents. Consumer behaviors including ingestion, mobility, reduction of water demands, and word-of-mouth communication are simulated. Management strategies are evaluated, including opening hydrants to flush the contaminant and broadcasting public warnings. As actions taken by consumer agents and utility operators affect demands and flows in the system, the mechanistic model is updated. Management strategies are evaluated based on the exposure of the population to the contaminant. The framework is designed to consider the typical issues involved in water distribution threat management and provides valuable analysis of threat containment strategies for water distribution system contamination events.

  3. Algorithm and simulation development in support of response strategies for contamination events in air and water systems.

    SciTech Connect

    Waanders, Bart Van Bloemen

    2006-01-01

    Chemical/Biological/Radiological (CBR) contamination events pose a considerable threat to our nation's infrastructure, especially in large internal facilities, external flows, and water distribution systems. Because physical security can only be enforced to a limited degree, deployment of early warning systems is being considered. However, to achieve reliable and efficient functionality, several complex questions must be answered: (1) where should sensors be placed, (2) how can sparse sensor information be efficiently used to determine the location of the original intrusion, (3) what are the model and data uncertainties, (4) how should these uncertainties be handled, and (5) how can our algorithms and forward simulations be sufficiently improved to achieve real time performance? This report presents the results of a three-year algorithmic and application development effort to support the identification, mitigation, and risk assessment of CBR contamination events. The main thrust of this investigation was to develop (1) computationally efficient algorithms for strategically placing sensors, (2) an identification process for contamination events using sparse observations, (3) a characterization of uncertainty through accurate demand forecasts and investigation of uncertain simulation model parameters, (4) risk assessment capabilities, and (5) reduced order modeling methods. The development effort was focused on water distribution systems, large internal facilities, and outdoor areas.

  4. Weighted next reaction method and parameter selection for efficient simulation of rare events in biochemical reaction systems.

    PubMed

    Xu, Zhouyi; Cai, Xiaodong

    2011-07-25

    The weighted stochastic simulation algorithm (wSSA) recently developed by Kuwahara and Mura and the refined wSSA proposed by Gillespie et al. based on the importance sampling technique open the door for efficient estimation of the probability of rare events in biochemical reaction systems. In this paper, we first apply the importance sampling technique to the next reaction method (NRM) of the stochastic simulation algorithm and develop a weighted NRM (wNRM). We then develop a systematic method for selecting the values of importance sampling parameters, which can be applied to both the wSSA and the wNRM. Numerical results demonstrate that our parameter selection method can substantially improve the performance of the wSSA and the wNRM in terms of simulation efficiency and accuracy.
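
    Schematically, the importance-sampling step shared by the wSSA and the wNRM biases only the choice of reaction, keeps the time advance governed by the true propensities, and folds the discrepancy into a trajectory weight. One step, with the caller applying reaction j's stoichiometry and accumulating w:

    import random

    def weighted_step(x, t, w, props, biased):
        """props: true propensities a_j(x); biased: predilection b_j(x)."""
        a = [p(x) for p in props]
        b = [p(x) for p in biased]
        a0, b0 = sum(a), sum(b)
        t += random.expovariate(a0)         # unbiased time advance
        u, j, acc = random.random() * b0, 0, b[0]
        while acc < u:                      # select j with prob b_j / b0
            j += 1
            acc += b[j]
        w *= (a[j] / a0) * (b0 / b[j])      # likelihood-ratio correction
        return j, t, w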

  5. Discrete-event simulation applied to analysis of waiting lists. Evaluation of a prioritization system for cataract surgery.

    PubMed

    Comas, Mercè; Castells, Xavier; Hoffmeister, Lorena; Román, Rubén; Cots, Francesc; Mar, Javier; Gutiérrez-Moreno, Santiago; Espallargues, Mireia

    2008-12-01

    To outline the methods used to build a discrete-event simulation model for use in decision-making in the context of waiting list management strategies for cataract surgery by comparing a waiting list prioritization system with the routinely used first-in, first-out (FIFO) discipline. The setting was the Spanish health system. The model reproduced the process of cataract, from incidence of need of surgery (meeting indication criteria), through demand, inclusion on a waiting list, and surgery. "Nonexpressed Need" represented the population that, even with need, would not be included on a waiting list. Parameters were estimated from administrative data and research databases. The impact of introducing a prioritization system on the waiting list compared with the FIFO system was assessed. For all patients entering the waiting list, the main outcome variable was waiting time weighted by priority score. A sensitivity analysis with different scenarios of mean waiting time was used to compare the two alternatives. The prioritization system shortened waiting time (weighted by priority score) by 1.55 months (95% CI: 1.47 to 1.62) compared with the FIFO system. This difference was statistically significant for all scenarios (which were defined from a waiting time of 4 months to 24 months under the FIFO system). A tendency to greater time savings in scenarios with longer waiting times was observed. Discrete-event simulation is useful in decision-making when assessing health services. Introducing a waiting list prioritization system produced greater benefit than allocating surgery by waiting time only. Use of the simulation model would allow the impact of proposed policies to reduce waiting lists or assign resources more efficiently to be tested.

  6. A discrete event method for wave simulation

    SciTech Connect

    Nutaro, James J

    2006-01-01

    This article describes a discrete event interpretation of the finite difference time domain (FDTD) and digital wave guide network (DWN) wave simulation schemes. The discrete event method is formalized using the discrete event system specification (DEVS). The scheme is shown to have errors that are proportional to the resolution of the spatial grid. A numerical example demonstrates the relative efficiency of the scheme with respect to FDTD and DWN schemes. The potential for the discrete event scheme to reduce numerical dispersion and attenuation errors is discussed.
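
    For reference, the FDTD-style update that the article reinterprets as a discrete event system is a leapfrog stencil; this small sketch (grid size, Courant number, and the Gaussian initial pulse are arbitrary choices, not taken from the article) advances a 1-D wave between reflecting boundaries:

      import numpy as np

      NX, STEPS = 200, 400
      C = 0.9                                   # Courant number c*dt/dx (<= 1)
      u = np.exp(-0.01 * (np.arange(NX) - NX // 2) ** 2)   # initial pulse
      u_prev = u.copy()                         # zero initial velocity

      for _ in range(STEPS):
          u_next = np.empty_like(u)
          u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                          + C**2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
          u_next[0] = u_next[-1] = 0.0          # fixed boundaries
          u_prev, u = u, u_next

      print("peak amplitude after propagation:", float(np.max(np.abs(u))))

    A discrete event formulation instead schedules per-cell updates only when a cell's state changes appreciably, which is the kind of efficiency contrast the numerical example in the article explores.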

  7. Agent Frameworks for Discrete Event Social Simulations

    DTIC Science & Technology

    2010-03-01

    of a general modeling approach to social simulation that embeds a multi-agent system within a DES framework, and propose several reusable agent... agent system to simulate changes in the beliefs, values, and interests (BVIs) of large social groups (Alt, Jackson, Hudak, & Lieberman, 2010)... to events from A. 2.3 Cultural Geography Model: The Cultural Geography (CG) Model is an implementation of a DESS that uses an embedded multi

  8. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    NASA Technical Reports Server (NTRS)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.
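
    The flavour of the analysis can be pictured with a toy Monte Carlo sketch: sample each commodity's consumption per launch attempt and ask how often all storage still covers a scrub followed by a retry. Every quantity, distribution, and capacity below is a hypothetical placeholder, not a GSDO/SLS figure:

      import random

      COMMODITIES = ("hydrogen", "oxygen", "nitrogen", "helium")
      CAPACITY = {"hydrogen": 100.0, "oxygen": 180.0,
                  "nitrogen": 60.0, "helium": 40.0}
      MEAN_USE = {"hydrogen": 35.0, "oxygen": 60.0,
                  "nitrogen": 15.0, "helium": 9.0}

      def campaign_supported(rng, attempts=2):
          # attempts=2 models one scrub plus a 48-hour turnaround and retry
          used = {c: 0.0 for c in COMMODITIES}
          for _ in range(attempts):
              for c in COMMODITIES:
                  used[c] += rng.gauss(MEAN_USE[c], 0.15 * MEAN_USE[c])
          return all(used[c] <= CAPACITY[c] for c in COMMODITIES)

      rng = random.Random(42)
      N = 10000
      ok = sum(campaign_supported(rng) for _ in range(N))
      print(f"fraction of campaigns the commodities support: {ok / N:.3f}")

    Treating the four commodities jointly, as here, is what distinguishes the integrated analysis from the earlier one-commodity-at-a-time deterministic studies.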

  9. Workflow in Clinical Trial Sites & Its Association with Near Miss Events for Data Quality: Ethnographic, Workflow & Systems Simulation

    PubMed Central

    Araujo de Carvalho, Elias Cesar; Batilana, Adelia Portero; Claudino, Wederson; Lima Reis, Luiz Fernando; Schmerling, Rafael A.; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    Background With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify/address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Methodology/Principal Findings Two sites in Sao Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visits to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration in source documents, (b) multiplicity of data repositories, and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrated a reduction of the rework problem. Conclusions/Significance Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. The clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage

  10. Simulations of coronal disconnection events

    SciTech Connect

    Linker, J.A.; Van Hoven, G.; McComas, D.J.

    1992-09-01

    The lack of evidence for magnetic disconnection of coronal mass ejections (CMEs) from the Sun has long been a puzzle, as it implies a buildup of the interplanetary magnetic field (IMF) magnitude over time. Such a buildup is ruled out by observations. Magnetic reconnection above helmet streamer configurations could provide a mechanism for maintaining the observed relative constancy of the IMF [McComas et al., 1989]; McComas et al. [1991] showed observational evidence of reconnection above a streamer. The authors investigate this interpretation using time-dependent MHD simulations. They model the opening of new magnetic flux on the Sun (as might occur in a CME or other transient event) as an increase in magnetic flux at the poles of a simulated corona. They find that this perturbation can in fact cause reconnection above an equatorial helmet streamer, and the resultant density signature is similar to the observations of McComas et al. [1991].

  11. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    DTIC Science & Technology

    2014-09-18

    Kelton, & Kelton, 1991; R. E. Shannon, 1998). Banks writes extensively on this topic and the DOD MSCO references the process as a best practice (Morse... research into DEVS, writing the DEVS pseudocode and how using DEVS can increase the validity of QKD simulation. 3. Quantum Key Distribution: A... 1990). "Cryptology," Chapter 13 of Handbook of Theoretical Computer Science (ed. J. Van Leeuwen), vol. 1 (Elsevier, 1990), 717-755. Retrieved 12 March

  12. Performance and cost evaluation of health information systems using micro-costing and discrete-event simulation.

    PubMed

    Rejeb, Olfa; Pilet, Claire; Hamana, Sabri; Xie, Xiaolan; Durand, Thierry; Aloui, Saber; Doly, Anne; Biron, Pierre; Perrier, Lionel; Augusto, Vincent

    2017-05-17

    Innovation and health-care funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many health-care organizations consider the application of ICT a crucial means of enhancing health-care management. The purpose of this paper is to provide a methodology for assessing the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose an integrated HIS performance evaluation approach combining formal modeling using Architecture of Integrated Information Systems (ARIS) models, a micro-costing approach for cost evaluation, and a Discrete-Event Simulation (DES) approach. The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to assess the impact of the HIS on the patient pathway. We demonstrate that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and the quality of service is higher (measured through the amount of information available during the consultation to formulate the diagnosis). The method also allows the most cost-effective ICT elements to be identified, improving care process quality while minimizing costs. The methodology is flexible enough to be applied to other health-care systems.

  13. Numerical Simulations of Two Wildfire Events Using a Combined Modeling System (HIGRAD/BEHAVE)

    SciTech Connect

    Reisner, J.; Bossert, J.; Winterkamp, J.

    1997-12-31

    The ability to accurately forecast the spread of a wildfire would significantly reduce human suffering and loss of life, the destruction of property, and expenditures for assessment and recovery. To help achieve this goal we have developed a model which accurately simulates the interactions between winds and the heat source associated with a wildfire. We have termed our new model HIGRAD, for HIgh resolution model for strong GRADient applications. HIGRAD employs a sophisticated numerical technique to prevent numerical oscillations from occurring in the vicinity of the fire. Of importance for fire modeling, HIGRAD uses a numerical technique which allows for the use of a compressible equation set, but without the time-step restrictions associated with the propagation of sound waves.

  14. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
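
    The consume-and-create cycle such an engine is built on fits in a few lines; the sketch below is a generic time-ordered event queue with an invented valve model, not EDSE itself:

      import heapq

      class Simulator:
          def __init__(self):
              self.now, self.queue = 0.0, []

          def schedule(self, delay, action):
              heapq.heappush(self.queue, (self.now + delay, id(action), action))

          def run(self, until):
              while self.queue and self.queue[0][0] <= until:
                  self.now, _, action = heapq.heappop(self.queue)
                  action(self)        # consuming one event may create others

      state = {"valve_open": False}

      def open_valve(sim):
          state["valve_open"] = True
          print(f"t={sim.now:.1f}: valve opened")
          sim.schedule(5.0, close_valve)   # event creation during consumption

      def close_valve(sim):
          state["valve_open"] = False
          print(f"t={sim.now:.1f}: valve closed")

      sim = Simulator()
      sim.schedule(1.0, open_valve)
      sim.run(until=10.0)

    Iterative simulation, in these terms, amounts to re-running this loop from a synchronized model state each time fresh monitoring data arrive.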

  15. Ocean Dynamics Simulation during an Extreme Bora Event using a Two-Way Coupled Atmosphere-Ocean Modeling System

    NASA Astrophysics Data System (ADS)

    Licer, Matjaz; Smerkol, Peter; Fettich, Anja; Ravdas, Michalis; Papapostolou, Alexandros; Mantziafou, Anneta; Cedilnik, Jure; Strajnar, Benedikt; Jeromel, Maja; Pristov, Neva; Jerman, Jure; Petan, Saso; Malacic, Vlado; Sofianos, Sarantis

    2015-04-01

    The response of the Adriatic Sea to cold north-easterly Bora wind forcing has been modelled numerous times, but usually using one-way coupling techniques. One of the most significant events of the kind took place in February 2012, when hurricane-force Bora was blowing over the Northern Adriatic almost continuously for over three weeks, causing extreme air-sea interactions leading to severe water cooling (below 4 degrees Celsius) and extensive dense water formation (with density anomalies above 30.5 kg/m3). The intensity of the atmosphere-ocean interactions during such conditions calls for a two-way atmosphere-ocean coupling approach. We compare the performance of (a) a fully two-way coupled atmosphere-ocean modelling system and (b) a one-way coupled ocean model (forced by hourly output from the atmospheric model) against the available in-situ measurements (coastal buoy, CTD). The models used were ALADIN (4.4 km resolution) on the atmospheric side and POM (1/30° × 1/30° resolution) on the ocean side. The atmosphere-ocean coupling was implemented using the OASIS3-MCT model coupling toolkit. We show that the atmosphere-ocean two-way coupling significantly improves the simulated temperature and density response of the ocean since it represents short-term transient features much better than the offline version of the ocean model.

  16. A Simbol-X Event Simulator

    SciTech Connect

    Puccetti, S.; Giommi, P.; Fiore, F.

    2009-05-11

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in simulating data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to ray-tracing simulators. These properties make our simulator well suited to supporting the user in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  17. A Simbol-X Event Simulator

    NASA Astrophysics Data System (ADS)

    Puccetti, S.; Fiore, F.; Giommi, P.

    2009-05-01

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in simulating data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to ray-tracing simulators. These properties make our simulator well suited to supporting the user in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  18. Parallel discrete event simulation using shared memory

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1988-01-01

    With traditional event-list techniques, evaluating a detailed discrete-event simulation model can often require hours or even days of computation time. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared-memory experiments using the Chandy-Misra distributed-simulation algorithm to simulate networks of queues is presented. Parameters of the study include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.

  19. Running Parallel Discrete Event Simulators on Sierra

    SciTech Connect

    Barnes, P. D.; Jefferson, D. R.

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  20. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.

  1. A Discrete Event Simulation Model for Evaluating the Performances of an M/G/C/C State Dependent Queuing System

    PubMed Central

    Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli

    2013-01-01

    M/G/C/C state-dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern discrete-event simulation (DES) software. We designed an approach to address this limitation and used it to construct the M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impact of various arrival rates on the throughput, the blocking probability, the expected service time, and the expected number of entities in a complex network topology. Results indicated that for each network there is a range of arrival rates where the simulation results fluctuate drastically across replications, causing discrepancies between the simulation and analytical results. Detailed results showing how closely the simulation and analytical results tally, in both tabular and graphical forms, together with scientific justifications, are documented and discussed. PMID:23560037
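
    State-dependent rates are easy to express in a hand-rolled simulation even where packaged DES tools resist them. The sketch below uses an M/M/C/C variant (exponential service, for brevity) in which each occupant's departure rate falls linearly with crowding; the capacity, rates, and congestion rule are illustrative:

      import random

      C, LAM, MU1 = 50, 4.0, 0.12    # capacity, arrival rate, solo departure rate

      def mu(n):
          # per-occupant rate declines as the section fills (state dependence)
          return MU1 * (C - n + 1) / C

      def simulate(t_end=20000.0, seed=3):
          rng = random.Random(seed)
          n, t, blocked, arrivals = 0, 0.0, 0, 0
          while t < t_end:
              dep_rate = n * mu(n)
              total = LAM + dep_rate
              t += rng.expovariate(total)
              if rng.random() < LAM / total:    # next event: arrival
                  arrivals += 1
                  if n == C:
                      blocked += 1              # arrival lost: section full
                  else:
                      n += 1
              else:                             # next event: departure (n > 0)
                  n -= 1
          return blocked / arrivals

      print(f"estimated blocking probability: {simulate():.4f}")

    Replication-to-replication scatter of this estimate near the congestion transition echoes the fluctuation phenomenon the authors report for their Arena model.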

  2. Simulation of event-based and long-term spatial redistribution of Chernobyl-derived radiocaesium within catchments using geographical information system embedded models

    NASA Astrophysics Data System (ADS)

    van der Perk, Marcel; Slávik, Ondrej

    2003-04-01

    The Chernobyl accident contaminated vast areas of Europe with radiocaesium (137Cs) in 1986. To evaluate long-term and event-based redistribution of Chernobyl-derived 137Cs at the catchment scale, two geographical information system embedded models have been developed. The first model simulates 137Cs redistribution using a monthly time step based on a long-term soil erosion model. The second model simulates lateral radiocaesium transport at the event scale based on the existing Limburg soil erosion model. This model accounts for surface runoff, soil erosion and deposition, and radiocaesium exchange between the topsoil layer, runoff water, and suspended sediment. Both models have been tested and applied to the Mochovce catchment, western Slovakia. The spatial distribution of 137Cs activity in soil simulated by the long-term model was used as input for the event-based model to assess the changes in 137Cs transport during rainfall events between 1986 and 2002. Soil erosion events in the first months after initial fallout input before ploughing caused a considerable decline in the 137Cs soil inventories, which were estimated at 8.9% of the total initial inventory. The majority of 137Cs transport during rainfall events occurs in particulate form. Both the absolute amounts of particulate 137Cs transport and the fraction of particulate 137Cs transport were shown to be positively related to suspended sediment transport. Between 1986 and 2002, dissolved 137Cs transport has declined by a factor of about 26, which can be largely attributed to the increased sorption to sediment particles. Particulate 137Cs transport has declined by a factor of about two, which can be largely attributed to the decrease in soil 137Cs. The 137Cs inventories in soil have decreased by a factor between three and four at the steep hillslopes, but have remained at about the same level as the initial fallout input at the valley bottoms.

  3. Assessing the Effectiveness of Biosurveillance Via Discrete Event Simulation

    DTIC Science & Technology

    2011-03-01

    Master's thesis by Jason H. Dao, March 2011 (thesis advisor: Ronald D. Fricker, Jr.). ...the potential for disastrous outcomes is greater than it has ever been. In order to confront this threat, biosurveillance systems are utilized to

  4. Discrete-Event Simulation in Chemical Engineering.

    ERIC Educational Resources Information Center

    Schultheisz, Daniel; Sommerfeld, Jude T.

    1988-01-01

    Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)

  5. Seismic event classification system

    DOEpatents

    Dowla, F.U.; Jarpe, S.P.; Maurer, W.

    1994-12-13

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.
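
    The preprocessing chain described above (time-frequency distribution, binarization, then the magnitude of a 2-D FFT as a shift-invariant image) can be sketched directly with NumPy/SciPy; the synthetic chirp-in-noise trace, sample rate, and median threshold are stand-ins for real seismograms:

      import numpy as np
      from scipy.signal import spectrogram

      fs = 100.0                                  # assumed sample rate, Hz
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(0)
      event = np.exp(-t / 15) * np.sin(2 * np.pi * (1 + 0.2 * t) * t)
      trace = event + 0.3 * rng.standard_normal(t.size)

      f, tt, S = spectrogram(trace, fs=fs, nperseg=256)   # time-frequency image
      binary = (S > np.median(S)).astype(float)           # binary representation
      feature = np.abs(np.fft.fft2(binary))               # shift-invariant |2-D FFT|

      print("feature image shape fed to the SONN:", feature.shape)

    Because a circular shift of the binary image only changes the phase of its 2-D Fourier transform, the magnitude image is insensitive to when the event occurs within the analysis window.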

  6. Seismic event classification system

    DOEpatents

    Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William

    1994-01-01

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.

  7. An Overview of Importance Splitting for Rare Event Simulation

    ERIC Educational Resources Information Center

    Morio, Jerome; Pastel, Rudy; Le Gland, Francois

    2010-01-01

    Monte Carlo simulations are a classical tool to analyse physical systems. When unlikely events are to be simulated, the importance sampling technique is often used instead of Monte Carlo. Importance sampling has some drawbacks when the problem dimensionality is high or when the optimal importance sampling density is complex to obtain. In this…
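
    Importance splitting sidesteps the density-selection problem mentioned above: trajectories that cross intermediate levels are restarted there, and the rare-event probability is estimated as the product of level-crossing fractions. A minimal fixed-effort sketch for a negative-drift random walk (drift, levels, kill boundary, and sample size all assumed) follows:

      import random

      DRIFT, L, KILL = -0.1, 12.0, -4.0
      LEVELS = [3.0, 6.0, 9.0, L]
      N = 1000                                 # trajectories per stage

      def run_to_next(x, target, rng):
          # advance until the walk crosses `target` (success) or KILL (failure)
          while KILL < x < target:
              x += rng.gauss(DRIFT, 1.0)
          return x >= target, x

      def splitting_estimate(seed=11):
          rng = random.Random(seed)
          entrances, estimate = [0.0], 1.0
          for target in LEVELS:
              hits = []
              for _ in range(N):
                  ok, x = run_to_next(rng.choice(entrances), target, rng)
                  if ok:
                      hits.append(x)
              if not hits:
                  return 0.0
              estimate *= len(hits) / N        # conditional crossing probability
              entrances = hits                 # next stage restarts from here
          return estimate

      print(f"splitting estimate: {splitting_estimate():.3e}")

    No likelihood ratios appear: the walk always evolves under its true law, which is why splitting remains usable when a good importance sampling density is hard to construct.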

  9. Scanning picosecond tunable laser system for simulating MeV heavy ion-induced charge collection events as a function of temperature.

    PubMed

    Laird, Jamie Stuart; Chen, Yuan; Scheick, Leif; Vo, Tuan; Johnston, Allan

    2008-08-01

    A new methodology for using scanning picosecond laser microscopy to simulate cosmic-ray-induced radiation effects as a function of temperature is described in detail. The system is built around diffraction-limited focusing of the output from a broadband (690-960 nm) ultrafast Ti:sapphire Tsunami laser pumped by a 532 nm Millennia laser. An acousto-optic modulator is used to provide pulse picking down to event rates necessary for the technologies and effects under study. The temperature dependence of the charge generation process for ions and photons is briefly reviewed and the need for wavelength tunability is discussed. Appropriate wavelength selection is critical for proper emulation of ion events over a wide temperature range. The system developed is detailed and illustrated by way of example on a deep-submicron complementary metal-oxide semiconductor test structure.

  10. MHD simulation of the Bastille day event

    SciTech Connect

    Linker, Jon; Torok, Tibor; Downs, Cooper; Lionello, Roberto; Titov, Viacheslav; Caplan, Ronald M.; Mikić, Zoran; Riley, Pete

    2016-03-25

    We describe a time-dependent, thermodynamic, three-dimensional MHD simulation of the July 14, 2000 coronal mass ejection (CME) and flare. The simulation starts with a background corona developed using an MDI-derived magnetic map for the boundary condition. Flux ropes using the modified Titov-Demoulin (TDm) model are used to energize the pre-event active region, which is then destabilized by photospheric flows that cancel flux near the polarity inversion line. More than 10^33 ergs are impulsively released in the simulated eruption, driving a CME at 1500 km/s, close to the observed speed of 1700 km/s. The post-flare emission in the simulation is morphologically similar to the observed post-flare loops. The resulting flux rope that propagates to 1 AU is similar in character to the flux rope observed at 1 AU, but the simulated ICME center passes 15° north of Earth.

  11. MHD simulation of the Bastille day event

    NASA Astrophysics Data System (ADS)

    Linker, Jon; Torok, Tibor; Downs, Cooper; Lionello, Roberto; Titov, Viacheslav; Caplan, Ronald M.; Mikić, Zoran; Riley, Pete

    2016-03-01

    We describe a time-dependent, thermodynamic, three-dimensional MHD simulation of the July 14, 2000 coronal mass ejection (CME) and flare. The simulation starts with a background corona developed using an MDI-derived magnetic map for the boundary condition. Flux ropes using the modified Titov-Demoulin (TDm) model are used to energize the pre-event active region, which is then destabilized by photospheric flows that cancel flux near the polarity inversion line. More than 10^33 ergs are impulsively released in the simulated eruption, driving a CME at 1500 km/s, close to the observed speed of 1700 km/s. The post-flare emission in the simulation is morphologically similar to the observed post-flare loops. The resulting flux rope that propagates to 1 AU is similar in character to the flux rope observed at 1 AU, but the simulated ICME center passes 15° north of Earth.

  12. Discrete-Event Simulation Applied to Apparel Manufacturing

    DTIC Science & Technology

    1990-06-01

    manufacturing, e.g., machine tools, vehicles, appliances, etc. Very few applications of simulation and, particularly, of discrete-event simulation in the... industry has shown renewed interest in applications of computer-based tools to manufacturing systems. Simulation, which has been a widely used tool in... other industries, has received considerable attention for its possible applications in apparel manufacturing. To date, however, little application

  13. Heinrich events modeled in transient glacial simulations

    NASA Astrophysics Data System (ADS)

    Ziemen, Florian; Kapsch, Marie; Mikolajewicz, Uwe

    2017-04-01

    Heinrich events are among the most prominent events of climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet — climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under debate, and their climatic consequences are far from being fully understood. We address open questions by studying Heinrich events in a coupled ice sheet model (ISM) atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model generated internal variability. The framework consists of a northern hemisphere setup of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. To make these long simulations feasible, the atmosphere is accelerated by a factor of 10 relative to the other model components using a periodical-synchronous coupling technique. To disentangle effects of the Heinrich events and the deglaciation, we focus on the events occurring before the deglaciation. The modeled Heinrich events show a peak ice discharge of about 0.05 Sv and raise the sea level by 2.3 m on average. The resulting surface water freshening reduces the Atlantic meridional overturning circulation and ocean heat release. The reduction in ocean heat release causes a sub-surface warming and decreases the air temperature and precipitation regionally and downstream into Eurasia. The surface elevation decrease of the ice sheet enhances moisture transport onto the ice sheet and thus increases precipitation over the Hudson Bay area, thereby accelerating the recovery after an event.

  14. Event-by-Event Simulation of Induced Fission

    SciTech Connect

    Vogt, R; Randrup, J

    2007-12-13

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  15. Event-by-Event Simulation of Induced Fission

    SciTech Connect

    Vogt, Ramona; Randrup, Joergen

    2008-04-17

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either de-excite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission pre-fragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  16. Discrete Event Simulation of Distributed Team Communication

    DTIC Science & Technology

    2012-03-22

    executable system architecture approach to discrete event system modeling using SysML in conjunction with colored Petri nets. In Systems Conference, 2008 2nd... operators. Mitchell found that IMPRINT predictions of communication times and frequencies correlated with recorded communications amongst a platoon of

  17. An evaluation of an expert system for detecting critical events during anesthesia in a human patient simulator: a prospective randomized controlled study.

    PubMed

    Görges, Matthias; Winton, Pamela; Koval, Valentyna; Lim, Joanne; Stinson, Jonathan; Choi, Peter T; Schwarz, Stephan K W; Dumont, Guy A; Ansermino, J Mark

    2013-08-01

    Perioperative monitoring systems produce a large amount of uninterpreted data, use threshold alarms prone to artifacts, and rely on the clinician to continuously visually track changes in physiological data. To address these deficiencies, we developed an expert system that provides real-time clinical decisions for the identification of critical events. We evaluated the efficacy of the expert system for enhancing critical event detection in a simulated environment. We hypothesized that anesthesiologists would identify critical ventilatory events more rapidly and accurately with the expert system. We used a high-fidelity human patient simulator to simulate an operating room environment. Participants managed 4 scenarios (anesthetic vapor overdose, tension pneumothorax, anaphylaxis, and endotracheal tube cuff leak) in random order. In 2 of their 4 scenarios, participants were randomly assigned to the expert system, which provided trend-based alerts and potential differential diagnoses. Time to detection and time to treatment were measured. Workload questionnaires and structured debriefings were completed after each scenario, and a usability questionnaire at the conclusion of the session. Data were analyzed using a mixed-effects linear regression model; Fisher exact test was used for workload scores. Twenty anesthesiology trainees and 15 staff anesthesiologists with a combined median (range) of 36 (29-66) years of age and 6 (1-38) years of anesthesia experience participated. For the endotracheal tube cuff leak, the expert system caused mean reductions of 128 (99% confidence interval [CI], 54-202) seconds in time to detection and 140 (99% CI, 79-200) seconds in time to treatment. In the other 3 scenarios, a best-case decrease of 97 seconds (lower 99% CI) in time to diagnosis for anaphylaxis and a worst-case increase of 63 seconds (upper 99% CI) in time to treatment for anesthetic vapor overdose were found. Participants were highly satisfied with the expert system (median

  18. Optical eye simulator for laser dazzle events.

    PubMed

    Coelho, João M P; Freitas, José; Williamson, Craig A

    2016-03-20

    An optical simulator of the human eye and its application to laser dazzle events are presented. The simulator combines optical design software (ZEMAX) with a scientific programming language (MATLAB) and allows the user to implement and analyze a dazzle scenario using practical, real-world parameters. Contrary to conventional analytical glare analysis, this work uses ray tracing and the scattering model and parameters for each optical element of the eye. The theoretical background of each such element is presented in relation to the model. The overall simulator's calibration, validation, and performance analysis are achieved by comparison with a simpler model based upon CIE disability glare data. Results demonstrate that this kind of advanced optical eye simulation can be used to represent laser dazzle and has the potential to extend the range of applicability of analytical models.

  19. Complete event simulations of nuclear fission

    NASA Astrophysics Data System (ADS)

    Vogt, Ramona

    2015-10-01

    For many years, the state of the art for treating fission in radiation transport codes has involved sampling from average distributions. In these average fission models energy is not explicitly conserved and everything is uncorrelated because all particles are emitted independently. However, in a true fission event, the energies, momenta and multiplicities of the emitted particles are correlated. Such correlations are interesting for many modern applications. Event-by-event generation of complete fission events makes it possible to retain the kinematic information for all particles emitted: the fission products as well as prompt neutrons and photons. It is therefore possible to extract any desired correlation observables. Complete event simulations can be included in general Monte Carlo transport codes. We describe the general functionality of currently available fission event generators and compare results for several important observables. This work was performed under the auspices of the US DOE by LLNL, Contract DE-AC52-07NA27344. We acknowledge support of the Office of Defense Nuclear Nonproliferation Research and Development in DOE/NNSA.

  20. Conservative parallel discrete-event simulation: Principles and practice

    SciTech Connect

    Wagner, D.B.

    1989-01-01

    Simulation is one of the most important computational technologies in use today. Unfortunately, its importance is matched by its appetite for computational resources. These factors make parallel simulation a topic with far-reaching consequences in all fields of science and engineering. This thesis is concerned with one approach to this problem, conservative loose event-driven parallel simulation, the objective of which is to apply multiple processors to a single simulation run in an effort to reduce its time-to-completion. There are several factors that make parallel simulation difficult. First, the fact that a physical system has a high degree of concurrency does not necessarily mean that a simulation of that system will benefit from parallelism. The author introduces two simple analytic techniques that can be used to bound from above the speedup potential of parallel simulations. Second, a parallel simulation requires synchronization to ensure that the results obtained are equivalent to those of a sequential simulation of the problem. He argues that the availability of inexpensive, medium-scale, shared-memory multiprocessors mandates a re-examination of synchronization algorithms for conservative loose event-driven parallel simulation. His investigations lead to a novel synchronization technique called lazy blocking avoidance. His measurements show that lazy blocking avoidance performs at least as well as, and often substantially better than, two other synchronization methods that have been widely discussed in the literature: deadlock detection and recovery, and eager blocking avoidance.

  1. Distributed discrete event simulation. Final report

    SciTech Connect

    De Vries, R.C.

    1988-02-01

    The presentation given here is restricted to discrete event simulation. The complexity of and time required for many present and potential discrete simulations exceeds the reasonable capacity of most present serial computers. The desire, then, is to implement the simulations on a parallel machine. However, certain problems arise in an effort to program the simulation on a parallel machine. In one category of methods, deadlocks can arise, and some mechanism is required either to detect deadlock and recover from it or to avoid deadlock through information passing. In the second category of methods, potentially incorrect simulations are allowed to proceed. If the situation is later determined to be incorrect, recovery from the error must be initiated. In either case, computation and information passing are required which would not be required in a serial implementation. The net effect is that the parallel simulation may not be much better than a serial simulation. In an effort to determine alternate approaches, important papers in the area were reviewed. As a part of that review process, each of the papers was summarized. The summary of each paper is presented in this report in the hopes that those doing future work in the area will be able to gain insight that might not otherwise be available, and to aid in deciding which papers would be most beneficial to pursue in more detail. The papers are broken down into categories and then by author. Conclusions reached after examining the papers and other material, such as direct talks with an author, are presented in the last section. Also presented there are some ideas that surfaced late in the research effort. These promise to be of some benefit in limiting information which must be passed between processes and in better understanding the structure of a distributed simulation. Pursuit of these ideas seems appropriate.

  2. Parallel discrete-event simulation of FCFS stochastic queueing networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments), which has proven itself to be effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. It is shown how lookahead can be computed for FCFS queueing network simulations; performance data demonstrating the method's effectiveness under moderate to heavy loads are given; and performance tradeoffs between the quality of lookahead and the cost of computing it are discussed.
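
    The appointment idea reduces to a one-line bound. The sketch below (names invented for illustration) assumes the departure of the job currently in service has already been announced, so the next message a non-preemptive FCFS server can emit is bounded away from the current time:

      def lookahead(now, busy_until, min_service_time):
          # `busy_until` is the departure time of the job in service (or `now`
          # when idle); any job arriving later still needs at least
          # `min_service_time`, so neighbours may safely simulate up to this
          # bound (the "appointment") without risking causality violations.
          return max(now, busy_until) + min_service_time

      print(lookahead(now=10.0, busy_until=14.5, min_service_time=2.0))  # 16.5

    The quality-versus-cost tradeoff discussed above shows up here directly: a tighter bound (e.g., using the actual sampled service time of the next queued job) lets neighbours advance further but costs more to compute.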

  3. Parallel Stochastic discrete event simulation of calcium dynamics in neuron.

    PubMed

    Ishlam Patoary, Mohammad Nazrul; Tropper, Carl; McDougal, Robert A; Zhongwei, Lin; Lytton, William W

    2017-09-26

    The intra-cellular calcium signaling pathways of a neuron depend on both biochemical reactions and diffusion. Some quasi-isolated compartments (e.g. spines) are so small and calcium concentrations are so low that one extra molecule diffusing in by chance can make a nontrivial difference in its concentration (percentage-wise). These rare events can affect dynamics discretely in such a way that they cannot be evaluated by a deterministic simulation. Stochastic models of such a system provide a more detailed understanding of these systems than existing deterministic models because they capture their behavior at a molecular level. Our research focuses on the development of a high-performance parallel discrete event simulation environment, Neuron Time Warp (NTW), which is intended for use in the parallel simulation of stochastic reaction-diffusion systems such as intracellular calcium signaling. NTW is integrated with NEURON, a simulator which is widely used within the neuroscience community. We simulate two models, a calcium buffer and a calcium wave model. The calcium buffer model is employed in order to verify the correctness and performance of NTW by comparing it to a serial deterministic simulation in NEURON. We also derived a discrete event calcium wave model from a deterministic model using the stochastic IP3R structure.
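
    A toy version of the model class (not NTW itself) helps fix ideas: calcium ions hop between two small compartments (diffusion events) and bind a buffer in each (reaction events), simulated exactly with Gillespie's method. Rates and counts are illustrative assumptions:

      import random

      D, KB = 0.5, 0.002    # per-molecule hop rate; per-pair binding rate

      def simulate(ca=(40, 5), buf=(20, 20), t_end=20.0, seed=5):
          rng = random.Random(seed)
          ca, buf, t = list(ca), list(buf), 0.0
          while t < t_end:
              props = [D * ca[0],            # Ca hop 0 -> 1
                       D * ca[1],            # Ca hop 1 -> 0
                       KB * ca[0] * buf[0],  # binding in compartment 0
                       KB * ca[1] * buf[1]]  # binding in compartment 1
              total = sum(props)
              if total == 0:
                  break
              t += rng.expovariate(total)
              r, acc = rng.random() * total, 0.0
              for i, p in enumerate(props):
                  acc += p
                  if r < acc:
                      break
              if i == 0:
                  ca[0] -= 1; ca[1] += 1
              elif i == 1:
                  ca[1] -= 1; ca[0] += 1
              else:
                  c = i - 2
                  ca[c] -= 1; buf[c] -= 1    # Ca + B -> CaB leaves the free pools
          return ca, buf

      print("final free Ca and buffer counts:", simulate())

    In a compartment holding only a handful of ions, a single hop event visibly shifts the concentration, which is exactly the discreteness argument made above; the parallel engine's job is to run many such compartments concurrently under Time Warp synchronization.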

  4. Rare event simulation in radiation transport

    SciTech Connect

    Kollman, Craig

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities are chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
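
    The likelihood-ratio bookkeeping described above can be shown on a one-dimensional walk with killing. Under the biased law, absorption is suppressed and forward steps are favoured; each sampled transition multiplies the weight by (true probability)/(biased probability). All parameters are illustrative assumptions:

      import random

      P_FWD, P_KILL, K = 0.35, 0.10, 20    # true dynamics; P_BACK is the rest
      B_FWD, B_KILL = 0.60, 0.02           # biased dynamics used for sampling

      def weighted_run(rng):
          x, w = 1, 1.0
          while 0 < x < K:
              u = rng.random()
              if u < B_KILL:
                  return 0.0               # absorbed: contributes nothing
              elif u < B_KILL + B_FWD:
                  w *= P_FWD / B_FWD       # stepped deeper into the shield
                  x += 1
              else:
                  w *= (1 - P_FWD - P_KILL) / (1 - B_FWD - B_KILL)
                  x -= 1
          return w if x == K else 0.0      # weight only if the walk penetrates

      rng = random.Random(2)
      N = 20000
      est = sum(weighted_run(rng) for _ in range(N)) / N
      print(f"importance-sampled penetration probability: {est:.3e}")

    Penetrating paths are common under the sampling law but carry small weights, so the average stays unbiased while the variance shrinks relative to naive simulation; the zero-variance result mentioned above corresponds to choosing the bias from the exact solution itself.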

  5. Lightning Potential Index performances in multimicrophysical cloud-resolving simulations of a back-building mesoscale convective system: The Genoa 2014 event

    NASA Astrophysics Data System (ADS)

    Lagasio, M.; Parodi, A.; Procopio, R.; Rachidi, F.; Fiori, E.

    2017-04-01

    Severe weather events are responsible for hundreds of fatalities and millions of euros of damage every year on the Mediterranean basin. Lightning activity is a characteristic phenomenon of severe weather and often accompanies torrential rainfall, which, under certain conditions like terrain type, slope, drainage, and soil saturation, may turn into flash flood. Building on the existing relationship between significant lightning activity and deep convection and precipitation, the performance of the Lightning Potential Index, as a measure of the potential for charge generation and separation that leads to lightning occurrence in clouds, is here evaluated for the V-shape back-building Mesoscale Convective System which hit Genoa city (Italy) in 2014. An ensemble of Weather Research and Forecasting simulations at cloud-permitting grid spacing (1 km) with different microphysical parameterizations is performed and compared to the available observational radar and lightning data. The results allow gaining a deeper understanding of the role of lightning phenomena in the predictability of V-shape back-building Mesoscale Convective Systems often producing flash flood over western Mediterranean complex topography areas. Moreover, they support the relevance of accurate lightning forecasting for the predictive ability of these severe events.

  6. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, with extended Petri nets used for the modelling and control of complex robotic systems. Such a system is structured, controlled, and analysed in this paper using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete event system model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple to apply on general computers, the analysis, modelling, and control of complex manufacturing systems can be achieved using Petri nets. Discrete event systems are a pragmatic tool for modelling industrial systems, and Petri nets are used for system modelling here because the system is a discrete event system. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels and sections that are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the robot's timing; transport and transmission times obtained by spot measurement yield graphics showing the average time for transport activity, using the parameter sets of finished products individually.
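
    The token-game semantics underlying these models is compact enough to sketch outright; the load/process/unload net below is invented for illustration and simply fires enabled transitions until none remain:

      PLACES = {"part_waiting": 2, "robot_free": 1, "processing": 0, "done": 0}
      TRANSITIONS = {
          "load":   {"in": ["part_waiting", "robot_free"], "out": ["processing"]},
          "unload": {"in": ["processing"], "out": ["done", "robot_free"]},
      }

      def enabled(name):
          return all(PLACES[p] > 0 for p in TRANSITIONS[name]["in"])

      def fire(name):
          for p in TRANSITIONS[name]["in"]:
              PLACES[p] -= 1                 # consume one token per input place
          for p in TRANSITIONS[name]["out"]:
              PLACES[p] += 1                 # produce one token per output place

      while True:
          ready = [t for t in TRANSITIONS if enabled(t)]
          if not ready:
              break
          fire(ready[0])
          print(ready[0], "->", PLACES)

    A hierarchical model in the sense of the abstract assigns a net like this to each subsystem and coordinates them through shared places at the higher conceptual level; a timed variant would attach delays to transitions.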

  7. Discrete event simulation in an artificial intelligence environment: Some examples

    SciTech Connect

    Roberts, D.J.; Farish, T.

    1991-01-01

    Several Los Alamos National Laboratory (LANL) object-oriented discrete-event simulation efforts have been completed during the past three years. One of these systems has been put into production and has a growing customer base. Another (started two years earlier than the first project) was completed but has not yet been used. This paper will describe these simulation projects. Factors which were pertinent to the success of the one project, and to the failure of the second project will be discussed (success will be measured as the extent to which the simulation model was used as originally intended). 5 figs.

  8. Detecting plastic events in emulsions simulations

    NASA Astrophysics Data System (ADS)

    Lulli, Matteo; Bernaschi, Massimo; Sbragaglia, Mauro

    2016-11-01

    Emulsions are complex systems which are formed by a number of non-coalescing droplets dispersed in a solvent, leading to non-trivial effects in the overall flowing dynamics. Such systems possess a yield stress below which an elastic response to an external forcing occurs, while above the yield stress the system flows as a non-Newtonian fluid, i.e. the stress is not proportional to the shear. In the solid-like regime the network of the droplets' interfaces stores the energy coming from the work exerted by an external forcing, which can be used to move the droplets in a non-reversible way, i.e. causing plastic events. The Kinetic-Elasto-Plastic (KEP) theory is an effective theory describing some features of the flowing regime, relating the rate of plastic events to a scalar field called fluidity, f = γ̇/σ, i.e. the inverse of an effective viscosity. Boundary conditions have a non-trivial role not captured by the KEP description. In this contribution we will compare numerical results against experiments concerning the Poiseuille flow of emulsions in microchannels with complex boundary geometries. Using an efficient computational tool we can show non-trivial results on plastic events for different realizations of the rough boundaries. The research leading to these results has received funding from the European Research Council under the European Community's Seventh Framework Programme (FP7/2007-2013)/ERC Grant Agreement no. [279004].

  9. Sensitivity analysis of some critical factors affecting simulated intrusion volumes during a low pressure transient event in a full-scale water distribution system.

    PubMed

    Ebacher, G; Besner, M C; Clément, B; Prévost, M

    2012-09-01

    Intrusion events caused by transient low pressures may result in the contamination of a water distribution system (DS). This work aims at estimating the range of potential intrusion volumes that could result from a real downsurge event caused by a momentary pump shutdown. A model calibrated with transient low pressure recordings was used to simulate total intrusion volumes through leakage orifices and submerged air vacuum valves (AVVs). Four critical factors influencing intrusion volumes were varied: the external head of (untreated) water on leakage orifices, the external head of (untreated) water on submerged air vacuum valves, the leakage rate, and the diameter of AVVs' outlet orifice (represented by a multiplicative factor). Leakage orifices' head and AVVs' orifice head levels were assessed through fieldwork. Two sets of runs were generated as part of two statistically designed experiments. A first set of 81 runs was based on a complete factorial design in which each factor was varied over 3 levels. A second set of 40 runs was based on a latin hypercube design, better suited for experimental runs on a computer model. The simulations were conducted using commercially available transient analysis software. Responses, measured by total intrusion volumes, ranged from 10 to 366 L. A second degree polynomial was used to analyze the total intrusion volumes. Sensitivity analyses of both designs revealed that the relationship between the total intrusion volume and the four contributing factors is not monotonic, with the AVVs' orifice head being the most influential factor. When intrusion through both pathways occurs concurrently, interactions between the intrusion flows through leakage orifices and submerged AVVs influence intrusion volumes. When only intrusion through leakage orifices is considered, the total intrusion volume is more largely influenced by the leakage rate than by the leakage orifices' head. The latter mainly impacts the extent of the area affected by
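
    The second experimental design mentioned above can be reproduced in a few lines: a Latin hypercube splits each factor's range into n equal-probability strata, samples once per stratum, and pairs the strata at random across factors. The factor ranges below are invented placeholders, not the study's values:

      import numpy as np

      def latin_hypercube(n, bounds, rng):
          dims = len(bounds)
          # one sample per stratum for each factor, strata shuffled per factor
          strata = rng.permuted(np.tile(np.arange(n), (dims, 1)), axis=1).T
          u = (strata + rng.random((n, dims))) / n
          lo = np.array([b[0] for b in bounds])
          hi = np.array([b[1] for b in bounds])
          return lo + u * (hi - lo)

      rng = np.random.default_rng(0)
      bounds = [(0.5, 3.0),    # head on leakage orifices (assumed range)
                (0.5, 3.0),    # head on submerged AVVs (assumed range)
                (5.0, 40.0),   # leakage rate (assumed range)
                (0.5, 2.0)]    # AVV outlet-orifice factor (assumed range)
      design = latin_hypercube(40, bounds, rng)
      print(design.shape)      # 40 runs x 4 factors, as in the second experiment

    Compared with the 3-level full factorial of the first experiment, the hypercube covers each factor's whole range with far fewer runs, which is why it suits computer experiments like these transient simulations.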

  10. Autocharacterization feasibility system on Hunters Trophy event

    SciTech Connect

    Mills, R.A.

    1993-09-01

    An automated system to characterize cable systems at NTS has been developed to test the feasibility of such a system. A rack of electronic equipment including a fast pulse generator, digital sampling scope, coaxial switch matrix and GPIB controller was installed downhole at NTS for the Hunters Trophy event. It was used to test automated characterization. Recorded measurements of simulation and other instrument data were gathered to determine if a full scale automated system would be practical in full scale underground nuclear effects tests. The benefits of such a full scale system would be fewer personnel required downhole; more instrument control in the uphole recording room; faster acquisition of cable parameter data.

  11. Empirical study of simulated two-planet microlensing events

    SciTech Connect

    Zhu, Wei; Gould, Andrew; Penny, Matthew; Mao, Shude; Gendron, Rieul

    2014-10-10

    We undertake the first study of two-planet microlensing models recovered from simulations of microlensing events generated by realistic multiplanet systems in which 292 planetary events, including 16 two-planet events, were detected from 6690 simulated light curves. We find that when two planets are recovered, their parameters are usually close to those of the two planets in the system most responsible for the perturbations. However, in 1 of the 16 examples, the apparent mass of both detected planets was more than doubled by the unmodeled influence of a third, massive planet. This fraction is larger than but statistically consistent with the roughly 1.5% rate of serious mass errors due to unmodeled planetary companions for the 274 cases from the same simulation in which a single planet is recovered. We conjecture that an analogous effect due to unmodeled stellar companions may occur more frequently. For 7 out of 23 cases in which two planets in the system would have been detected separately, only one planet was recovered because the perturbations due to the two planets had similar forms. This is a small fraction (7/274) of all recovered single-planet models, but almost a third of all events that might plausibly have led to two-planet models. Still, in these cases, the recovered planet tends to have parameters similar to one of the two real planets most responsible for the anomaly.

  12. Rare Event Simulation in Radiation Transport

    NASA Astrophysics Data System (ADS)

    Kollman, Craig

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable that is, with overwhelming probability, equal to zero. These problems often have high-dimensional state spaces and irregular geometries, so analytic solutions are not possible, and Monte Carlo simulation must be used to estimate the radiation dosage transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs, and the results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero-variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution. In the final chapter, an attempt is made to generalize this algorithm to a continuous state space.
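
    The likelihood-ratio reweighting described in this abstract can be illustrated with a toy shield-penetration model (our sketch; the walk parameters and shield depth are illustrative assumptions, not taken from the dissertation):

      import random

      # Toy shield model (assumed for illustration): a particle takes n_steps
      # unit steps, +1 (into the shield) with probability p, -1 otherwise; the
      # rare event is reaching depth D. Sampling with a biased step probability
      # p_sim and reweighting by the likelihood ratio keeps the estimate unbiased.
      def penetration_prob(p_sim, p_true=0.3, n_steps=30, depth=12, n_runs=100000):
          total = 0.0
          for _ in range(n_runs):
              pos, weight = 0, 1.0
              for _ in range(n_steps):
                  if random.random() < p_sim:
                      pos += 1
                      weight *= p_true / p_sim                  # forward-step ratio
                  else:
                      pos -= 1
                      weight *= (1 - p_true) / (1 - p_sim)      # backward-step ratio
                  if pos >= depth:
                      total += weight   # scored: weight corrects the sampling bias
                      break
          return total / n_runs

      print(penetration_prob(p_sim=0.3))   # naive Monte Carlo: few or no hits
      print(penetration_prob(p_sim=0.6))   # biased toward penetration, reweighted

    With p_sim equal to the true step probability the estimator reduces to naive Monte Carlo and rarely scores a hit; biasing the walk toward the shield and reweighting by the accumulated likelihood ratio keeps the estimate unbiased while greatly reducing its variance.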

  13. The LCLS Timing Event System

    SciTech Connect

    Dusatko, John; Allison, S.; Browne, M.; Krejcik, P.; /SLAC

    2012-07-23

    The Linac Coherent Light Source requires precision timing trigger signals for various accelerator diagnostics and controls at the SLAC National Accelerator Laboratory. A new timing system has been developed that meets these requirements. This system is based on COTS hardware combined with custom-designed units. An added challenge has been the requirement that the LCLS timing system co-exist with, and 'know' about, the existing SLC timing system. This paper describes the architecture, construction, and performance of the LCLS timing event system.

  14. The effects of parallel processing architectures on discrete event simulation

    NASA Astrophysics Data System (ADS)

    Cave, William; Slatt, Edward; Wassmer, Robert E.

    2005-05-01

    As systems become more complex, particularly those containing embedded decision algorithms, mathematical modeling presents a rigid framework that often impedes representation to a sufficient level of detail. Using discrete event simulation, one can build models that more closely represent physical reality, with actual algorithms incorporated in the simulations. Higher levels of detail increase simulation run time. Hardware designers have succeeded in producing parallel and distributed processor computers with theoretical speeds well into the teraflop range. However, the practical use of these machines on all but some very special problems is extremely limited. The inability to use this power is due to the great difficulty of translating real-world problems into software that makes effective use of highly parallel machines. This paper addresses the application of parallel processing to simulations of real-world systems of varying inherent parallelism. It provides a brief background in modeling and simulation validity and describes a parameter that can be used in discrete event simulation to trade absolute time synchronization for greater opportunities for parallel processing, within the constraints of validity. It focuses on the effects of model architecture, run-time software architecture, and parallel processor architecture on speed, while providing an environment in which modelers can achieve sufficient model accuracy to produce valid simulation results. It describes an approach to simulation development that captures subject-area expert knowledge to leverage the inherent parallelism in systems in the following ways: * Data structures are separated from instructions to track which instruction sets share what data; this is used to determine independence, and thus the potential for concurrent processing, at run time. * Model connectivity (independence) can be inspected visually to determine if the inherent parallelism of a physical system is properly

  15. Data Systems Dynamic Simulator

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Clark, Melana; Davenport, Bill; Message, Philip

    1993-01-01

    The Data System Dynamic Simulator (DSDS) is a discrete event simulation tool. It was developed for NASA for the specific purpose of evaluating candidate architectures for data systems of the Space Station era. DSDS provides three methods for meeting this requirement. First, the user has access to a library of standard pre-programmed elements. These elements represent tailorable components of NASA data systems and can be connected in any logical manner. Second, DSDS supports the development of additional elements. This allows the more sophisticated DSDS user the option of extending the standard element set. Third, DSDS supports data stream simulation. 'Data streams' is the name given to a technique that ignores packet boundaries but is sensitive to rate changes. Because rate changes are rare compared to packet arrivals in a typical NASA data system, data stream simulations require only a fraction of the CPU run time. Additionally, the data stream technique is considerably more accurate than another commonly used optimization technique.

  16. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remain underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices.
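
    For readers who want a feel for what a DES engine does under the hood, the following minimal sketch (ours; the arrival and scan times are illustrative, not from the article) implements the core of any discrete-event simulator: a clock that jumps between time-stamped events held in a priority queue:

      import heapq

      # Minimal discrete-event core (illustrative, not the software discussed
      # above): events are (time, action) pairs in a priority queue, and the
      # simulation clock jumps from one event time to the next.
      class Simulator:
          def __init__(self):
              self.now = 0.0
              self._queue = []
              self._n = 0                      # tie-breaker for simultaneous events

          def schedule(self, delay, action):
              heapq.heappush(self._queue, (self.now + delay, self._n, action))
              self._n += 1

          def run(self, until):
              while self._queue and self._queue[0][0] <= until:
                  self.now, _, action = heapq.heappop(self._queue)
                  action()

      # Toy radiology flow: a patient arrives every 12 min; each scan takes 9 min.
      sim = Simulator()
      def arrival():
          print(f"{sim.now:5.1f}  patient arrives")
          sim.schedule(9.0, lambda: print(f"{sim.now:5.1f}  scan complete"))
          sim.schedule(12.0, arrival)          # schedule the next arrival
      sim.schedule(0.0, arrival)
      sim.run(until=60.0)

    A real model would add resource queues (e.g., a single contended scanner) and statistics collection on top of this core, which is exactly what the packages discussed in the article provide.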

  17. Event-based Simulation Model for Quantum Optics Experiments

    NASA Astrophysics Data System (ADS)

    De Raedt, H.; Michielsen, K.

    2011-03-01

    We present a corpuscular simulation model of optical phenomena that does not require knowledge of the solution of a wave equation for the whole system and reproduces the results of Maxwell's theory by generating detection events one by one. The event-based corpuscular model gives a unified description of multiple-beam fringes of a plane parallel plate and of single-photon Mach-Zehnder interferometer, Wheeler's delayed choice, photon tunneling, quantum eraser, two-beam interference, double-slit, Einstein-Podolsky-Rosen-Bohm, and Hanbury Brown-Twiss experiments. We also discuss the possibility of refuting our corpuscular model.

  18. Terminal Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Meyers, Ronald

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED), a special type of 'man-made' system designed to serve specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics, which allows one to relax Lipschitz conditions at some discrete points.
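
    A one-line example (ours, not the paper's) makes the non-Lipschitz idea concrete. The terminal attractor

      \dot{x} = -x^{1/3}, \qquad x(0) = x_0 > 0,
      \quad\Longrightarrow\quad
      x(t) = \left( x_0^{2/3} - \tfrac{2}{3}\, t \right)^{3/2}

    reaches the equilibrium x = 0 in the finite time t* = (3/2) x_0^{2/3}; because \partial\dot{x}/\partial x = -(1/3) x^{-2/3} diverges as x approaches 0, the Lipschitz condition fails at the equilibrium, which is what lets trajectories terminate at (and be restarted from) discrete points.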

  19. Detectability of Discrete Event Systems with Dynamic Event Observation

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2009-01-01

    Our previous work considers detectability of discrete event systems, which is to determine the current state and subsequent states of a system based on event observation. We assume that event observation is static, that is, if an event is observable, then all its occurrences are observable. However, in practical systems such as sensor networks, event observation often needs to be dynamic, that is, occurrences of the same event may or may not be observable, depending on the state of the system. In this paper, we generalize static event observation to dynamic event observation and consider the detectability problem under dynamic event observation. We define four types of detectability. To check the detectabilities, we construct an observer, with exponential complexity. To reduce the computational complexity, we can also construct a detector, with polynomial complexity, to check the strong detectabilities. Dynamic event observation can be implemented in two possible ways: passive observation and active observation. For active observation, we discuss how to find minimal event observation policies that preserve the four types of detectability. PMID:20161618
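
    The observer referred to above is essentially a subset construction over the possible current states; a minimal sketch of one observation step (ours; the three-state automaton and event names are illustrative) looks like this:

      # Sketch of the current-state observer (subset construction); the
      # automaton and event names below are illustrative.
      def unobservable_reach(states, delta, observable):
          frontier, reach = list(states), set(states)
          while frontier:
              s = frontier.pop()
              for e, t in delta.get(s, {}).items():
                  if e not in observable and t not in reach:
                      reach.add(t)
                      frontier.append(t)
          return reach

      def observer_step(states, event, delta, observable):
          """Advance the set of states consistent with one observed event."""
          nxt = {delta[s][event] for s in states if event in delta.get(s, {})}
          return unobservable_reach(nxt, delta, observable)

      # Event 'u' is unobservable; 'a' and 'b' are observable.
      delta = {0: {'a': 1, 'u': 2}, 1: {'b': 0}, 2: {'a': 1}}
      obs = {'a', 'b'}
      estimate = unobservable_reach({0}, delta, obs)
      for ev in ['a', 'b', 'a']:
          estimate = observer_step(estimate, ev, delta, obs)
          print(ev, sorted(estimate))   # current-state estimate after each event

    Roughly speaking, the system is detectable when such estimates eventually stay singletons; under dynamic observation the set `obs` is no longer fixed but varies with the state of the system.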

  20. Global MHD simulation of flux transfer events at the high-latitude magnetopause observed by the Cluster spacecraft and the SuperDARN radar system

    NASA Astrophysics Data System (ADS)

    Daum, P.; Wild, J. A.; Penz, T.; Woodfield, E. E.; Rème, H.; Fazakerley, A. N.; Daly, P. W.; Lester, M.

    2008-07-01

    A global magnetohydrodynamic numerical simulation is used to study the large-scale structure and formation location of flux transfer events (FTEs) in synergy with in situ spacecraft and ground-based observations. During the main period of interest, on 14 February 2001 from 0930 to 1100 UT, the Cluster spacecraft were approaching the Northern Hemisphere high-latitude magnetopause in the postnoon sector on an outbound trajectory. Throughout this period the magnetic field, electron, and ion sensors on board Cluster observed characteristic signatures of FTEs. A few minutes after these observations, the Super Dual Auroral Radar Network (SuperDARN) system indicated flow disturbances in the conjugate ionospheres. These "two-point" observations on the ground and in space were closely correlated and were caused by ongoing unsteady reconnection in the vicinity of the spacecraft. The three-dimensional structures and dynamics of the observed FTEs and the associated reconnection sites are studied using the Block-Adaptive-Tree-Solarwind-Roe-Upwind-Scheme (BATS-R-US) MHD code in combination with a simple open flux tube motion model (Cooling). Using these two models, the spatial and temporal evolution of the FTEs is estimated. The models fill the gaps left by the measurements and allow a "point-to-point" mapping between the instruments in order to investigate the global structure of the phenomenon. The modeled results agree well with previous theoretical and observational studies addressing individual features of FTEs.

  1. High-level simulation of JWST event-driven operations

    NASA Astrophysics Data System (ADS)

    Henry, R.; Kinzel, W.

    2012-09-01

    The James Webb Space Telescope (JWST) has an event-driven architecture: an onboard Observation Plan Executive (OPE) executes an Observation Plan (OP) consisting of a sequence of observing units (visits). During normal operations, ground action to update the OP is expected to be necessary only about once a week. This architecture is designed to tolerate uncertainty in visit duration, and occasional visit failures due to inability to acquire guide stars, without creating gaps in the observing timeline. The operations concept is complicated by the need for occasional scheduling of time-critical science and engineering visits that cannot tolerate much slippage without inducing gaps, and also by onboard momentum management. A prototype Python tool called the JWST Observation Plan Execution Simulator (JOPES) has recently been developed to simulate OP execution at a high level and analyze the response of the Observatory and OPE to both nominal and contingency scenarios. Incorporating both deterministic and stochastic behavior, JOPES has the potential to be a powerful tool for several purposes: requirements analysis, system verification, systems engineering studies, and test data generation. It has already been successfully applied to a study of overhead estimation bias: whether to use conservative or average-case estimates for timing components that are inherently uncertain, such as those involving guide-star acquisition. JOPES is being enhanced to support interfaces to the operational Proposal Planning Subsystem (PPS) now being developed, with the objective of "closing the loop" between testing and simulation by feeding simulated event logs back into the PPS.

  2. A Software Framework for Blast Event Simulation

    DTIC Science & Technology

    2006-11-01

    The vehicles in the simulations will be modeled using the MPMICE MPM techniques (for approximate vehicle models), DYNA3D FE code and off-the-shelf...and decoupled preprocessor-based models, to a comprehensive, tightly coupled simulation tool based on the C-SAFE and DYNA3D codes. The model...involves integrating the MPMICE code, DYNA3D (for tightly coupled simulations), LS-DYNA (for uncoupled simulations) and a reduced order model

  3. Stochastic discrete event simulation of germinal center reactions

    NASA Astrophysics Data System (ADS)

    Figge, Marc Thilo

    2005-05-01

    We introduce a generic reaction-diffusion model for germinal center reactions and perform numerical simulations within a stochastic discrete event approach. In contrast to the frequently used deterministic continuum approach, each single reaction event is monitored in space and time in order to simulate the correct time evolution of this complex biological system. Germinal centers play an important role in the immune system by performing a reaction that aims at improving the affinity between antibodies and antigens. Our model captures experimentally observed features of this reaction, such as the development of the remarkable germinal center morphology and the maturation of antibody-antigen affinity in the course of time. We model affinity maturation within a simple affinity class picture and study it as a function of the distance between the initial antibody-antigen affinity and the highest possible affinity. The model reveals that this mutation distance may be responsible for the experimentally observed all-or-none behavior of germinal centers; i.e., they generate either mainly output cells of high affinity or no high-affinity output cells at all. Furthermore, the exact simulation of the system dynamics allows us to study the hypothesis of cell recycling in germinal centers as a mechanism for affinity optimization. A comparison of three possible recycling pathways indicates that affinity maturation is optimized by a recycling pathway that has previously not been taken into account in deterministic continuum models.
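
    Monitoring "each single reaction event" in time is in the spirit of Gillespie-type stochastic simulation; as a point of reference, the skeleton of that event-by-event scheme for a toy, non-spatial birth-death process looks like this (our sketch with illustrative rates, not the paper's germinal-center model, which tracks events in space as well):

      import math, random

      # Event-by-event stochastic simulation (Gillespie's direct method) for a
      # toy birth-death process; rates are illustrative only.
      def gillespie(x0, birth, death, t_end):
          t, x, history = 0.0, x0, [(0.0, x0)]
          while t < t_end:
              rates = [birth, death * x]                     # reaction propensities
              total = sum(rates)
              if total == 0.0:
                  break
              t += -math.log(1.0 - random.random()) / total  # exponential waiting time
              x += 1 if random.random() * total < rates[0] else -1
              history.append((t, x))
          return history

      for t, x in gillespie(x0=10, birth=2.0, death=0.1, t_end=5.0)[:10]:
          print(f"t={t:.3f}  population={x}")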

  4. Single event effects and laser simulation studies

    NASA Technical Reports Server (NTRS)

    Kim, Q.; Schwartz, H.; Mccarty, K.; Coss, J.; Barnes, C.

    1993-01-01

    The single event upset (SEU) linear energy transfer threshold (LETTH) of radiation-hardened 64K static random access memories (SRAMs) was measured with a picosecond pulsed dye laser system, and the results were compared with standard heavy ion accelerator (Brookhaven National Laboratory (BNL)) measurements of the same SRAMs. With heavy ions, the LETTH of the Honeywell HC6364 was 27 MeV-sq cm/mg at 125 C, compared with a value of 24 MeV-sq cm/mg obtained with the laser. In the case of the second type of 64K SRAM, the IBM 6401CRH, no upsets were observed at 125 C with the highest-LET ions used at BNL. In contrast, the pulsed dye laser tests indicated a value of 90 MeV-sq cm/mg at room temperature for the SEU-hardened IBM SRAM. No latchups or multiple SEUs were observed on any of the SRAMs, even under worst-case conditions. The results of this study suggest that the laser can be used as an inexpensive laboratory SEU prescreen tool in certain cases.

  5. Numerical simulation of dust events in the Middle East

    NASA Astrophysics Data System (ADS)

    Hamidi, Mehdi; Kavianpour, Mohammad Reza; Shao, Yaping

    2014-06-01

    In this paper, the severe dust event of 3-8 July 2009 in the Middle East is simulated using the WRF-DuMo model. To improve the model capacity in dust emission estimates, the effect of soil salt on threshold friction velocity for wind erosion is taken into consideration. A soil-salt propagation map and the other input parameters are compiled based on remote sensing and a Geographic Information System. The satellite images and synoptic data are used for the validation of the model results. Synoptic analysis is done for the Middle East and the synoptic systems for the severe dust event are identified. Comparison of the model results with the observed data shows that in the Aral-Caspian Sea area, central Iran and the Dead Sea Basin, dust emission is suppressed due to the high soil-salt content. The model shows better performances when the soil-salt effect is considered.

  6. Synchronization of autonomous objects in discrete event simulation

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.

  7. An adaptive synchronization protocol for parallel discrete event simulation

    SciTech Connect

    Bisset, K.R.

    1998-12-01

    Simulation, especially discrete event simulation (DES), is used in a variety of disciplines where numerical methods are difficult or impossible to apply. One problem with this method is that a sufficiently detailed simulation may take hours or days to execute, and multiple runs may be needed in order to generate the desired results. Parallel discrete event simulation (PDES) has been explored for many years as a method to decrease the time taken to execute a simulation. Many protocols have been developed which work well for particular types of simulations, but perform poorly when used for other types of simulations. Often it is difficult to know a priori whether a particular protocol is appropriate for a given problem. In this work, an adaptive synchronization method (ASM) is developed which works well on an entire spectrum of problems. The ASM determines, using an artificial neural network (ANN), the likelihood that a particular event is safe to process.

  8. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-10-17

    This quarter, we have focused on several tasks: (1) Building a high-quality catalog of earthquake source parameters for the Middle East and East Asia. In East Asia, we computed source parameters using the CAP method for a set of events studied by Herrman et al. (MRR, 2006) using a complete waveform technique. Results indicated excellent agreement with the moment magnitudes in the range 3.5-5.5; below magnitude 3.5 the scatter increases. For events with more than 2-3 observations at different azimuths, we found good agreement of focal mechanisms. Depths were generally consistent, although differences of up to 10 km were found. These results suggest that CAP modeling provides estimates of source parameters at least as reliable as complete waveform modeling techniques. However, East Asia and the Yellow Sea Korean Paraplatform (YSKP) region studied are relatively laterally homogeneous and may not benefit from the CAP method’s flexibility to shift waveform segments to account for path-dependent model errors. A more challenging region is the Middle East, where strong variations in sedimentary basins, crustal thickness, and crustal and mantle seismic velocities greatly impact regional wave propagation. We applied the CAP method to a set of events in and around Iran and found good agreement between estimated focal mechanisms and those reported by the Global Centroid Moment Tensor (CMT) catalog. We found a possible bias in the moment magnitudes that may be due to the thick low-velocity crust of the Iranian Plateau. (2) Testing methods on a lifetime regional data set. In particular, the recent 2/21/08 Nevada event and aftershock sequence occurred in the middle of USArray, producing over a thousand records per event. The tectonic setting is quite similar to Central Iran and thus provides an excellent testbed for CAP+ at ranges out to 10°, including extensive observations of crustal thinning and thickening and various Pnl complexities. Broadband modeling in 1D, 2D

  9. Distribution system simulator

    NASA Technical Reports Server (NTRS)

    Bahrami, K. A.; Kirkham, H.; Rahman, S.

    1986-01-01

    In a series of tests performed under the auspices of the Department of Energy, power line carrier propagation was observed to be anomalous under certain circumstances. To investigate the cause, a distribution system simulator was constructed. This physical simulator accurately represented the distribution system from below the power frequency to above 50 kHz; effects such as phase-to-phase coupling and skin effect were modeled. Construction details of the simulator and experimental results from its use are presented.

  10. The Advanced Photon Source event system

    SciTech Connect

    Lenkszus, F.R.; Laird, R.

    1995-12-31

    The Advanced Photon Source, like many other facilities, requires a means of transmitting timing information to distributed control system I/O controllers. The APS event system provides the means of distributing medium resolution/accuracy timing events throughout the facility. It consists of VME event generators and event receivers which are interconnected with 100 Mbit/sec fiber optic links at distances of up to 650 m in either a star or a daisy chain configuration. The system's event throughput rate is 10 Mevents/sec with a peak-to-peak timing jitter down to 100 ns, depending on the source of the event. It is integrated into the EPICS-based APS control system through record and device support. Event generators broadcast timing events over fiber optic links to event receivers which are programmed to decode specific events. Event generators generate events in response to external inputs, from internal programmable event sequence RAMs, and from VME bus writes. The event receivers can be programmed to generate both pulse and set/reset level outputs to synchronize hardware, and to generate interrupts to initiate EPICS record processing. In addition, each event receiver contains a time stamp counter which is used to provide synchronized time stamps to EPICS records.

  11. Simulating Ellerman bomb-like events

    NASA Astrophysics Data System (ADS)

    Danilovic, S.

    2017-05-01

    Context. Ellerman bombs (EBs) seem to be part of a whole spectrum of phenomena that might share the same underlying physical mechanism: magnetic reconnection. Aims: The aim of this study is to investigate whether the proposed mechanism, applied to the circumstances of EBs, produces the observed characteristics. Methods: To this end, comprehensive three-dimensional magnetohydrodynamic (3D MHD) simulations were used. Two different cases are presented: the quiet Sun and an active region. Results: Both runs confirm that EB-like brightenings coincide with hot and dense plasma, in agreement with predictions of 1D and 2D modelling. The simulated EB-like phenomena take on the observed flame-like form, which depends on the complexity of the ongoing reconnection and the viewing angle. At the layers sampled by the Hα wings, near the temperature minimum and below, the magnetic field topology seems always to be the same: the field lines there trace the base of the current sheet and the reconnected Ω-loops. Conclusions: The EB features are caused by reconnection of strong-field patches of opposite polarity in the regions where the surface flows are strongest. The weakest cases among them can be reproduced quantitatively by the current simulations.

  12. An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation

    DOE PAGES

    Nutaro, James

    2014-11-03

    In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system built within the adevs simulation package supports models with state-events and time-events that comprise high-index differential-algebraic systems. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.
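
    A rough illustration of the state-event idea (our sketch, not the OpenModelica/adevs implementation): a continuous model is advanced between events, and a zero crossing of a guard function triggers a discrete event at which the state is remapped:

      # Sketch: a continuous model (bouncing ball) whose zero crossings are
      # reported as state-events; step size and impact law are illustrative.
      def simulate(t_end, dt=1e-3):
          t, h, v, events = 0.0, 1.0, 0.0, []
          while t < t_end:
              h_new = h + v * dt
              v_new = v - 9.81 * dt
              if h_new <= 0.0 < h:              # zero crossing => state-event
                  events.append(("bounce", round(t, 3)))
                  v_new = -0.8 * v_new          # impact law applied at the event
                  h_new = 0.0
              t, h, v = t + dt, h_new, v_new
          return events

      print(simulate(3.0))

    A production integrator would locate the crossing time precisely and hand the event to the discrete event engine rather than detecting it on a fixed grid, but the division of labor is the same.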

  13. Instrumented Architectural Simulation System

    NASA Technical Reports Server (NTRS)

    Delagi, B. A.; Saraiya, N.; Nishimura, S.; Byrd, G.

    1987-01-01

    Simulation of systems at an architectural level can offer an effective way to study critical design choices if (1) the performance of the simulator is adequate to examine designs executing significant code bodies, not just toy problems or small application fragments, (2) the details of the simulation include the critical details of the design, (3) the view of the design presented by the simulator instrumentation leads to useful insights on the problems with the design, and (4) there is enough flexibility in the simulation system that the asking of unplanned questions is not suppressed by the weight of the mechanics involved in making changes either in the design or its measurement. A simulation system with these goals is described, together with the approach to its implementation. Its application to the study of a particular class of multiprocessor hardware system architectures is illustrated.

  14. Multi-Transiting Systems and Exoplanet Mutual Events

    NASA Astrophysics Data System (ADS)

    Coughlin, Jared; Ragozzine, D.; Holman, M. J.

    2011-01-01

    Until recently, studies of transiting exoplanets (planets that cross in front of their host star) have focused almost exclusively on systems where there is only one transiting planet. Those studies that have considered additional planets have mostly done so with the goal of determining the perturbing effects that additional planets would have upon the orbit, and therefore the light curve, of the transiting planet. This work considers, in detail, a specific type of event known as an exoplanet mutual event. Such events occur when one planet passes in front of another. While such events can occur whether or not these planets are transiting, predicting and understanding these events is best done in systems with multiple transiting planets. We estimate, through an ensemble simulation, how frequently exoplanet mutual events occur and which systems are most likely to undergo them. We also investigate what information can be learned in such systems, not only about the planets themselves but also about the orbital architecture. We conclude that while ODT (overlapping double-transit) events occur with a much lower frequency than PPO (planet-planet occultation) events, ODT mutual events are capable of producing detectable signals and Kepler will detect a few, and we recommend that candidate systems for these events, such as KOI 191, be observed in short cadence (Steffen et al. 2010; Holman et al. 2010). This work is supported in part by the NSF REU and DOD ASSURE programs under NSF grant no. 0754568 and by the Smithsonian Institution.

  15. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-04-15

    The recent Nevada earthquake (M=6) produced an extraordinary set of crustal guided waves. In this study, we examine the three-component data at all the USArray stations in terms of how well existing models perform in predicting the various phases: Rayleigh waves, Love waves, and Pnl waves. To establish the source parameters, we applied the Cut and Paste code up to a distance of 5° for an average local crustal model, which produced a normal mechanism (strike=35°, dip=41°, rake=-85°) at a depth of 9 km and Mw=5.9. Assuming this mechanism, we generated synthetics at all distances for a number of 1D and 3D models. The Pnl observations fit the synthetics for the simple models well, both in timing (VPn=7.9 km/s) and in waveform, out to a distance of about 5°. Beyond this distance a great deal of complexity can be seen to the northwest, apparently caused by shallow subducted slab material. These paths require considerable crustal thinning and higher P-velocities. Small delays and advances outline the various tectonic provinces to the south, the Colorado Plateau, etc., with velocities compatible with those reported by Song et al. (1996). Five-second Rayleigh waves (Airy phase) can be observed throughout the whole array and show a great deal of variation (up to 30 s). In general, the Love waves are better behaved than the Rayleigh waves. We are presently adding higher frequencies to the source description by including source complexity. Preliminary inversions suggest rupture to the northeast with a shallow asperity. We are also inverting the aftershocks to extend the frequencies to 2 Hz and beyond, following the calibration method outlined in Tan and Helmberger (2007). This will allow accurate directivity measurements for events with magnitude larger than 3.5. Thus, we will address the energy decay with distance as a function of frequency band for the various source types.

  16. Optimization of Operations Resources via Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
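
    The essence of the approach is that each candidate vector of integer resource levels is scored by running the simulation; a bare-bones sketch of such a loop (ours; the stub cost function stands in for the discrete event model, and all parameters are illustrative) follows:

      import random

      # Toy genetic algorithm minimizing a resource-level cost returned by a
      # (stub) discrete event simulation; all names and values are illustrative.
      def simulate_cost(levels):                 # stand-in for a DES run
          return sum((x - 3) ** 2 for x in levels) + random.random()  # noisy

      def ga(n_vars=4, pop_size=20, gens=30, bounds=(0, 10)):
          pop = [[random.randint(*bounds) for _ in range(n_vars)]
                 for _ in range(pop_size)]
          for _ in range(gens):
              scored = sorted(pop, key=simulate_cost)
              parents = scored[:pop_size // 2]           # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = random.sample(parents, 2)
                  cut = random.randrange(1, n_vars)      # one-point crossover
                  child = a[:cut] + b[cut:]
                  if random.random() < 0.2:              # mutation
                      child[random.randrange(n_vars)] = random.randint(*bounds)
                  children.append(child)
              pop = parents + children
          return min(pop, key=simulate_cost)

      print(ga())   # integer resource levels near the (noisy) optimum

    Because the objective is a stochastic simulation output, each "fitness" evaluation is noisy, which is exactly the setting in which the paper argues genetic algorithms outperform pattern search.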

  17. Simulating Single-Event Upsets in Bipolar RAM's

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.

    1986-01-01

    Simulation technique saves testing. Uses interactive version of SPICE (Simulation Program with Integrated Circuit Emphasis). Device and subcircuit models available in software used to construct macromodel for an integrated bipolar transistor. Time-dependent current generators placed inside transistor macromodel to simulate charge collection from ion track. Significant finding of experiments is that standard design practice of reducing power in unaddressed bipolar RAM cell increases sensitivity of cell to single-event upsets.
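
    Ion-induced charge collection of the kind injected by those current generators is commonly approximated by a double-exponential pulse; a small sketch (our illustration; the time constants and deposited charge are assumptions, not values from the article):

      import math

      # Double-exponential current pulse often used to model charge collection
      # from an ion strike; normalized so the pulse integrates to the charge q.
      def set_current(t, q=0.5e-12, tau_rise=5e-12, tau_fall=200e-12):
          """Single-event transient current (A) at time t (s)."""
          norm = q / (tau_fall - tau_rise)
          return norm * (math.exp(-t / tau_fall) - math.exp(-t / tau_rise))

      for i in range(5):
          t = i * 100e-12
          print(f"t={t*1e12:5.0f} ps  I={set_current(t)*1e6:7.3f} uA")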

  18. Upstream gyrating ion events: Cluster observations and simulations

    SciTech Connect

    Sauer, K.; Fraenz, M.; Dubinin, E.; Korth, A.; Mazelle, C.; Reme, H.; Dandouras, I.

    2005-08-01

    Localized events of low-frequency quasi-monochromatic waves with periods in the 30 s range, observed by Cluster in the upstream region of Earth, are analyzed. They are associated with a gyro-motion of the two ion populations consisting of the incoming solar wind protons and the back-streaming ions from the shock. A coordinate system is chosen in which one axis is parallel to the ambient magnetic field B0 and the other is in the vsw x B0 direction. The variation of the plasma parameters is compared with the results of two-fluid Hall-MHD simulations using different beam densities and velocities. Keeping the (relative) beam density fixed (e.g., α=0.005), non-stationary 'shock-like' structures are generated if the beam velocity exceeds a certain threshold of about ten times the Alfven velocity. Below the threshold, the localized events represent stationary, nonlinear waves (oscillitons) in a beam-plasma system in which the Reynolds stresses of the plasma and beam ions are balanced by the magnetic field stress.

  19. Event-by-event simulation of single-neutron experiments to test uncertainty relations

    NASA Astrophysics Data System (ADS)

    De Raedt, H.; Michielsen, K.

    2014-12-01

    Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not require the knowledge of the solution of a wave equation, nor does it rely on detailed concepts of quantum theory. In particular, the data from these non-quantum simulations satisfy uncertainty relations derived in the context of quantum theory. Invited paper presented at QTAP-6.

  20. Downtime Event Management Notification System (DEMN)

    PubMed Central

    Jennings, Karen A.

    2001-01-01

    Information Technology Services (ITS) supports 117 applications, the network, and the servers for the health center. ITS wasn't always aware of the impact when a system was taken down. There was no comprehensive, coordinated effort to communicate events, putting mission-critical systems at risk and at times burdening the server team with too many events on the same day. DEMN was built using Cold Fusion® and the Remedy® Action Response system. DEMN is used for planning and approval of downtime events, as well as a vehicle to communicate unplanned events to the Help Desk and, subsequently, the end users.

  1. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance for push-back events and a significantly high overall detection performance for runway departure events. The overall detection performance of SMS for push-back events is approximately 55%, while the overall detection performance for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events, as well as the timeliness of the Aircraft Situation Display to Industry data source for SMS predictions.

  2. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    SciTech Connect

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present the application of a discrete event simulator (DES) to performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or if the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research; some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST to simulate the multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.

  3. Rare Event Simulation for T-cell Activation

    NASA Astrophysics Data System (ADS)

    Lipsmeier, Florian; Baake, Ellen

    2009-02-01

    The problem of statistical recognition is considered, as it arises in immunobiology, namely, the discrimination of foreign antigens against a background of the body's own molecules. The precise mechanism of this foreign-self-distinction, though one of the major tasks of the immune system, continues to be a fundamental puzzle. Recent progress has been made by van den Berg, Rand, and Burroughs (J. Theor. Biol. 209:465-486, 2001), who modelled the probabilistic nature of the interaction between the relevant cell types, namely, T-cells and antigen-presenting cells (APCs). Here, the stochasticity is due to the random sample of antigens present on the surface of every APC, and to the random receptor type that characterises individual T-cells. It has been shown previously (van den Berg et al. in J. Theor. Biol. 209:465-486, 2001; Zint et al. in J. Math. Biol. 57:841-861, 2008) that this model, though highly idealised, is capable of reproducing important aspects of the recognition phenomenon, and of explaining them on the basis of stochastic rare events. These results were obtained with the help of a refined large deviation theorem and were thus asymptotic in nature. Simulations have, so far, been restricted to the straightforward simple sampling approach, which does not allow for sample sizes large enough to address more detailed questions. Building on the available large deviation results, we develop an importance sampling technique that allows for a convenient exploration of the relevant tail events by means of simulation. With its help, we investigate the mechanism of statistical recognition in some depth. In particular, we illustrate how a foreign antigen can stand out against the self background if it is present in sufficiently many copies, although no a priori difference between self and nonself is built into the model.

  4. Simulation and study of small numbers of random events

    NASA Technical Reports Server (NTRS)

    Shelton, R. D.

    1986-01-01

    Random events were simulated by computer and subjected to various statistical methods to extract important parameters. Various forms of curve fitting were explored, such as least squares, least distance from a line, and maximum likelihood. Problems considered were dead time, exponential decay, and spectrum extraction from cosmic ray data, using both binned data and data from individual events. Computer programs, mostly of an iterative nature, were developed to perform these simulations and extractions and are partially listed as appendices. The mathematical basis for the computer programs is given.

  5. Event-by-event fission simulation code, generates complete fission events

    SciTech Connect

    2013-04-01

    FREYA is a computer code that generates complete fission events. The output includes the energy and momentum of the final-state particles: fission products, prompt neutrons, and prompt photons. The version of FREYA to be released is a module for MCNP6.

  6. High Resolution Modeling of Tropical Cyclones Using Rare Event Simulation

    NASA Astrophysics Data System (ADS)

    Plotkin, D. A.; Abbot, D. S.; Weare, J.

    2014-12-01

    Tropical cyclones (TCs) present a challenge to modeling with general circulation models (GCMs) because they involve processes and structures that are too fine for GCMs to resolve. TCs have fine structures - e.g., the eye, eyewall, and rain bands - with length scales on the order of 10 km, while GCMs have typical resolutions on the order of 50-100 km. High-resolution GCM runs that are sufficiently long to exhibit multiple TCs can be prohibitively computationally expensive. Thus, while GCMs exhibit TC-like vortices with spatial and temporal frequencies similar to observed TCs, the ability of GCMs to reproduce fine TC structures remains largely untested. In this study, we use recently developed rare event analysis and simulation methods to selectively simulate TCs in GCMs at very high resolution. These rare event simulation methods have been developed mostly in the context of computational chemistry, but are broadly applicable. They allow (either by careful manipulation of the model or by selection of trajectories) direct and detailed interrogation of the event of interest without introducing error and without the need to simulate for long periods of time to see the event. By creating targeted, high-resolution GCM simulations with many TCs, we hope to determine whether or not GCMs can capture fine TC structures such as eyewalls and individual rain bands.

  7. Modeling Anti-Air Warfare With Discrete Event Simulation and Analyzing Naval Convoy Operations

    DTIC Science & Technology

    2016-06-01

    Keywords: Simkit, component-based approach, layered defense systems, formation movements, design of experiments, simulation output analysis. Simple movement and detection in discrete event simulation using Simkit.

  8. Estimating rare events in biochemical systems using conditional sampling

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.

    2017-01-01

    The paper focuses on the development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute-force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
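
    A bare-bones version of subset simulation on a toy Gaussian problem (our sketch; the response function, level probability p0, sample sizes, and the simplified modified Metropolis move are all illustrative) shows how the rare-event probability factors into conditional pieces:

      import math, random

      def g(x):                                  # toy response function
          return sum(x) / math.sqrt(len(x))      # ~N(0,1) for std normal inputs

      def mm_step(x, level, sigma=1.0):
          """Modified Metropolis move targeting N(0,I) conditioned on g>=level."""
          y = []
          for xi in x:
              cand = xi + sigma * random.gauss(0, 1)
              # per-component acceptance with standard-normal density ratio
              if random.random() < math.exp((xi * xi - cand * cand) / 2):
                  y.append(cand)
              else:
                  y.append(xi)
          return y if g(y) >= level else x       # reject if it leaves the level set

      def subset_sim(b=4.0, n_dim=10, n=2000, p0=0.1):
          xs = [[random.gauss(0, 1) for _ in range(n_dim)] for _ in range(n)]
          prob = 1.0
          for _ in range(50):                    # at most 50 levels
              xs.sort(key=g, reverse=True)
              seeds = xs[: int(p0 * n)]
              level = g(seeds[-1])               # adaptive intermediate level
              if level >= b:                     # final level: count exceedances
                  return prob * sum(g(x) >= b for x in xs) / n
              prob *= p0                         # one more conditional factor
              xs = []
              for s in seeds:                    # regrow population by MCMC
                  x = s
                  for _ in range(int(1 / p0)):
                      x = mm_step(x, level)
                      xs.append(x)
          return prob

      print(subset_sim())    # compare with the exact 1 - Phi(4), about 3.2e-5

    Each pass multiplies the running estimate by p0 and regrows the population by MCMC inside the current level set, so an event of probability around 1e-5 is reached in roughly five inexpensive stages instead of millions of brute-force runs.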

  9. Variability of simulants used in recreating stab events.

    PubMed

    Carr, D J; Wainwright, A

    2011-07-15

    Forensic investigators commonly use simulants/backing materials on which to mount fabrics and/or garments when recreating damage due to stab events. Such work may be conducted in support of an investigation to connect a particular knife to a stabbing event by comparing the severance morphology obtained in the laboratory to that observed in the incident. There does not appear to have been a comparison of the effect of simulant type on the morphology of severances in fabrics and simulants, nor of the variability of simulants. This work investigates three simulants (pork, gelatine, expanded polystyrene) and two knife blades (carving, bread), and how severances in the simulants and in an apparel fabric typically used to manufacture T-shirts (single jersey) were affected by (i) simulant type and (ii) blade type. Severances were formed using a laboratory impact apparatus to ensure a consistent impact velocity, and hence impact energy, independently of the other variables. The impact velocity was chosen so that the force measured was similar to that measured in human performance trials. Force-time and energy-time curves were analysed and severance morphology (y, z directions) investigated. Simulant type and knife type significantly affected the critical forensic measurement of severance length (y direction) in the fabric and 'skin' (Tuftane). The use of EPS resulted in the lowest variability in the data; further, the severances recorded in both the fabric and the Tuftane more accurately reflected the dimensions of the impacting knives.

  10. "Orpheus" cardiopulmonary bypass simulation system.

    PubMed

    Morris, Richard W; Pybus, David A

    2007-12-01

    In this paper we describe a high-fidelity perfusion simulation system intended for use in the training and continuing education of perfusionists. The system comprises a hydraulic simulator, an electronic interface unit, and a controlling computer with associated real-time computer models. It is designed for use within an actual operating theatre or within a specialized simulation facility. The hydraulic simulator can be positioned on an operating table and physically connected to the circuit of the institutional heart-lung machine. The institutional monitoring system is used to display the arterial and central venous pressures, the ECG, and the nasopharyngeal temperature using appropriate connections. The simulator is able to reproduce the full spectrum of normal and abnormal events that may present during the course of cardiopulmonary bypass. The system incorporates a sophisticated blood gas model that accurately predicts the behavior of a modern hollow-fiber oxygenator; output from this model is displayed in the manner of an in-line blood gas electrode and is updated every 500 ms. The perfusionist is able to administer a wide variety of drugs during a simulation session, including vasoconstrictors (metaraminol, epinephrine, and phenylephrine), a vasodilator (sodium nitroprusside), chronotropes (epinephrine and atropine), an inotrope (epinephrine), and modifiers of coagulation (heparin and protamine). Each drug has a pharmacokinetic profile based on a three-compartment model plus an effect compartment. The simulation system has potential roles in the skill training of perfusionists, the development of crisis management protocols, the certification and accreditation of perfusionists, and the evaluation of new perfusion equipment and/or techniques.
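
    A three-compartment model plus effect compartment of the kind mentioned reduces to four coupled linear ODEs; a bare-bones sketch (ours; the rate constants and dosing are illustrative assumptions, not Orpheus values) stepped with forward Euler:

      # Bare-bones three-compartment pharmacokinetic model plus effect
      # compartment; all rate constants and the dosing schedule are
      # illustrative assumptions, not values from the Orpheus system.
      def pk_step(c, dose_rate, dt=0.5,
                  k10=0.1, k12=0.05, k21=0.03, k13=0.02, k31=0.01, ke0=0.2):
          c1, c2, c3, ce = c
          dc1 = dose_rate - (k10 + k12 + k13) * c1 + k21 * c2 + k31 * c3
          dc2 = k12 * c1 - k21 * c2
          dc3 = k13 * c1 - k31 * c3
          dce = ke0 * (c1 - ce)        # effect site lags the central compartment
          return (c1 + dc1 * dt, c2 + dc2 * dt, c3 + dc3 * dt, ce + dce * dt)

      state = (0.0, 0.0, 0.0, 0.0)
      for step in range(10):           # short infusion, then washout
          state = pk_step(state, dose_rate=1.0 if step < 4 else 0.0)
          print(f"central={state[0]:.3f}  effect={state[3]:.3f}")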

  11. Simulations and Characteristics of Large Solar Events Propagating Throughout the Heliosphere and Beyond (Invited)

    NASA Astrophysics Data System (ADS)

    Intriligator, D. S.; Sun, W.; Detman, T. R.; Dryer, M.; Intriligator, J.; Deehr, C. S.; Webber, W. R.; Gloeckler, G.; Miller, W. D.

    2015-12-01

    Large solar events can have severe adverse global impacts at Earth. These solar events can also propagate throughout the heliosphere and into the interstellar medium. We focus on the July 2012 and Halloween 2003 solar events. We simulate these events starting from the vicinity of the Sun at 2.5 Rs. We compare our three-dimensional (3D) time-dependent simulations to available spacecraft (s/c) observations at 1 AU and beyond. Based on the comparisons of the predictions from our simulations with in-situ measurements, we find that the effects of these large solar events can be observed in the outer heliosphere, the heliosheath, and even into the interstellar medium. We use two simulation models. The HAFSS (HAF Source Surface) model is a kinematic model. HHMS-PI (Hybrid Heliospheric Modeling System with Pickup protons) is a numerical magnetohydrodynamic solar wind (SW) simulation model. Both HHMS-PI and HAFSS are ideally suited for these analyses since, starting at 2.5 Rs from the Sun, they model the slowly evolving background SW and the impulsive, time-dependent events associated with solar activity. Our models naturally reproduce the dynamic 3D spatially asymmetric effects observed throughout the heliosphere. Pre-existing SW background conditions have a strong influence on the propagation of shock waves from solar events. Time-dependence is a crucial aspect of interpreting s/c data. We show comparisons of our simulation results with STEREO A, ACE, Ulysses, and Voyager s/c observations.

  12. Active optics simulation system

    NASA Technical Reports Server (NTRS)

    Chi, C. H.

    1973-01-01

    The active optics simulation system (AOSS) is a set of computer programs and associated software to be used in the development, design, and evaluation of a primary mirror control system for a large space telescope (e.g., the tentatively proposed 3-meter telescope). Mathematical models of the component subsystems and solutions of the physical processes that occur within the mirror surface control system were obtained, and based on these models AOSS simulates the behavior of the entire mirror surface control system as well as the behavior of the component subsystems. The program has a modular structure so that any subsystem module can be replaced or modified with minimum disruption of the rest of the simulation program.

  13. Expert System Prototype for False Event Discrimination.

    DTIC Science & Technology

    1985-11-14

    This report discusses a prototype expert system for event discrimination. We wanted to determine whether applying an expert system to handle and...other potential sources of erroneous information. The expert system is an apt vehicle for growth of systems knowledge, for quick decision making, and

  14. The cost of conservative synchronization in parallel discrete event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approaches the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.
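
    The synchronization being costed here proceeds in rounds of roughly the following shape (our sketch; the logical-process structure and lookahead value are illustrative): every logical process may safely execute all events earlier than the minimum next-event time across processes plus the lookahead:

      import heapq

      # One round of a synchronous conservative protocol (illustrative): events
      # below the global bound (min next-event time + lookahead) are safe.
      def safe_bound(lps, lookahead):
          next_times = [lp["queue"][0][0] if lp["queue"] else float("inf")
                        for lp in lps]
          return min(next_times) + lookahead

      def process_round(lps, lookahead):
          bound = safe_bound(lps, lookahead)
          for lp in lps:
              while lp["queue"] and lp["queue"][0][0] < bound:
                  t, name = heapq.heappop(lp["queue"])
                  print(f"LP {lp['id']} processes {name} at t={t}")
          return bound

      lps = [{"id": 0, "queue": [(1.0, "a"), (4.0, "b")]},
             {"id": 1, "queue": [(2.5, "c")]}]
      for lp in lps:
          heapq.heapify(lp["queue"])
      process_round(lps, lookahead=2.0)   # only events with t < 3.0 are safe

    The per-round costs visible here, computing the bound, scanning event lists, and idling when a process has no safe events, are exactly the overhead terms the paper's stochastic model shows become asymptotically negligible as the workload grows.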

  15. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of desktop modeling and simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.

  16. Fission Reaction Event Yield Algorithm, FREYA - For event-by-event simulation of fission

    NASA Astrophysics Data System (ADS)

    Verbeke, J. M.; Randrup, J.; Vogt, R.

    2015-06-01

    From nuclear materials accountability to detection of special nuclear material, SNM, the need for better modeling of fission has grown over the past decades. Current radiation transport codes compute average quantities with great accuracy and performance, but performance and averaging come at the price of limited interaction-by-interaction modeling. For fission applications, these codes often lack the capability of modeling interactions exactly: energy is not conserved, energies of emitted particles are uncorrelated, prompt fission neutron and photon multiplicities are uncorrelated. Many modern applications require more exclusive quantities than averages, such as the fluctuations in certain observables (e.g. the neutron multiplicity) and correlations between neutrons and photons. The new computational model, FREYA (Fission Reaction Event Yield Algorithm), aims to meet this need by modeling complete fission events. Thus it automatically includes fluctuations as well as correlations resulting from conservation of energy and momentum. FREYA has been integrated into the LLNL Fission Library, and will soon be part of MCNPX2.7.0, MCNP6, TRIPOLI-4.9, and Geant4.10.

  17. Disaster Response Modeling Through Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Wang, Jeffrey; Gilmer, Graham

    2012-01-01

    Organizations today are required to plan against a rapidly changing, high-cost environment. This is especially true for first responders to disasters and other incidents, where critical decisions must be made in a timely manner to save lives and resources. Discrete-event simulations enable organizations to make better decisions by visualizing complex processes and the impact of proposed changes before they are implemented. A discrete-event simulation using Simio software has been developed to effectively analyze and quantify the imagery capabilities of domestic aviation resources conducting relief missions. This approach has helped synthesize large amounts of data to better visualize process flows, manage resources, and pinpoint capability gaps and shortfalls in disaster response scenarios. Simulation outputs and results have supported decision makers in the understanding of high risk locations, key resource placement, and the effectiveness of proposed improvements.

  18. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  19. Non-Lipschitz Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, M.; Meyers, R.

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man-made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics, which allows one to relax Lipschitz conditions at some discrete points.
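
    A standard example of such terminal dynamics (a generic illustration, not taken from the paper) is the one-dimensional system

        \dot{x} = -x^{1/3}, \qquad x(0) = x_0 > 0,

    which violates the Lipschitz condition at x = 0 but admits the closed-form solution

        x(t) = \left( x_0^{2/3} - \tfrac{2}{3}\, t \right)^{3/2},

    reaching the equilibrium x = 0 exactly at the finite time t^* = \tfrac{3}{2}\, x_0^{2/3}. This finite settling time, impossible under Lipschitz dynamics, is what allows a continuous Newtonian model to represent sharply terminated discrete events.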

  20. Towards Flexible Exascale Stream Processing System Simulation

    SciTech Connect

    Li, Cheng-Hong; Nair, Ravi; Ohba, Noboyuki; Shvadron, Uzi; Zaks, Ayal; Schenfeld, Eugen

    2012-01-01

    Stream processing is an important emerging computational model for performing complex operations on and across multi-source, high-volume, unpredictable dataflows. We present Flow, a platform for parallel and distributed stream processing system simulation that provides a flexible modeling environment for analyzing stream processing applications. The Flow stream processing system simulator is a high-performance, scalable simulator that automatically parallelizes chunks of the model space and incurs near-zero synchronization overhead for acyclic stream application graphs. We show promising parallel and distributed event rates exceeding 149 million events per second on a cluster with 512 processor cores.

  1. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    SciTech Connect

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah; Carns, Philip; Ross, Robert; Li, Jianping Kelvin; Ma, Kwan-Liu

    2016-11-13

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform an analysis of the instrumentation overhead.
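
    The rollback behavior that such instrumentation measures can be illustrated with a toy optimistic logical process (a schematic sketch, not ROSS code): it processes events speculatively, saves state, and rolls back when a straggler with an earlier timestamp arrives. Re-execution of rolled-back events and anti-messages are omitted for brevity.

        class OptimisticLP:
            """Toy logical process: processes events optimistically, rolls back on stragglers."""

            def __init__(self):
                self.state, self.lvt = 0, 0.0  # state and local virtual time
                self.processed = []            # (timestamp, state_before) for rollback
                self.rollbacks = 0

            def handle(self, ts, delta):
                if ts < self.lvt:  # straggler: roll back past its timestamp
                    self.rollbacks += 1
                    while self.processed and self.processed[-1][0] > ts:
                        _, state_before = self.processed.pop()
                        self.state = state_before  # restore saved state
                    self.lvt = self.processed[-1][0] if self.processed else 0.0
                self.processed.append((ts, self.state))
                self.state += delta
                self.lvt = ts

        lp = OptimisticLP()
        for ts, d in [(1.0, 5), (3.0, 2), (2.0, 1)]:  # 2.0 arrives late -> rollback
            lp.handle(ts, d)
        print(lp.rollbacks)  # 1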

  2. Simulating an Extreme Wind Event in a Topographically Complex Region

    NASA Astrophysics Data System (ADS)

    Lennard, Christopher

    2014-07-01

    Complex topography modifies local weather characteristics such as air temperature, rainfall and airflow within a larger regional extent. The Cape Peninsula around Cape Town, South Africa, is a complex topographical feature responsible for the modification of rainfall and wind fields largely downstream of the peninsula. During the passage of a cold front on 2 October 2002, an extreme wind event associated with tornado-like damage occurred in the suburb of Manenberg; however, synoptic conditions did not indicate the convective activity typically associated with a tornado. A numerical regional climate model was run at very high horizontal resolution (500 m) to investigate the dynamics of the event. The model simulated an interaction between the topography of the peninsula and a change in airflow direction associated with the passage of the cold front. A small region of cyclonic circulation was simulated over Manenberg, embedded in an area of negative vorticity and a leeward gravity wave. The feature lasted 14 min and moved in a north-to-south direction. Vertically, it was not evident above 220 m. The model assessment describes this event as a shallow but intense cyclonic vortex generated in the lee of the peninsula through an interaction between the peninsula and a change in wind direction as the cold front made landfall. The model did not simulate wind speeds associated with the observed damage, suggesting that the horizontal grid resolution ought to be at the scale of the event to more completely understand such microscale airflow phenomena.

  3. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  4. SPICE: Simulation Package for Including Flavor in Collider Events

    NASA Astrophysics Data System (ADS)

    Engelhard, Guy; Feng, Jonathan L.; Galon, Iftah; Sanford, David; Yu, Felix

    2010-01-01

    We describe SPICE: Simulation Package for Including Flavor in Collider Events. SPICE takes as input two ingredients: a standard flavor-conserving supersymmetric spectrum and a set of flavor-violating slepton mass parameters, both of which are specified at some high "mediation" scale. SPICE then combines these two ingredients to form a flavor-violating model, determines the resulting low-energy spectrum and branching ratios, and outputs HERWIG and SUSY Les Houches files, which may be used to generate collider events. The flavor-conserving model may be any of the standard supersymmetric models, including minimal supergravity, minimal gauge-mediated supersymmetry breaking, and anomaly-mediated supersymmetry breaking supplemented by a universal scalar mass. The flavor-violating contributions may be specified in a number of ways, from specifying charges of fields under horizontal symmetries to completely specifying all flavor-violating parameters. SPICE is fully documented and publicly available, and is intended to be a user-friendly aid in the study of flavor at the Large Hadron Collider and other future colliders.
    Program summary
    Program title: SPICE
    Catalogue identifier: AEFL_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFL_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 8153
    No. of bytes in distributed program, including test data, etc.: 67 291
    Distribution format: tar.gz
    Programming language: C++
    Computer: Personal computer
    Operating system: Tested on Scientific Linux 4.x
    Classification: 11.1
    External routines: SOFTSUSY [1,2] and SUSYHIT [3]
    Nature of problem: Simulation programs are required to compare theoretical models in particle physics with present and future data at particle colliders. SPICE determines the masses and decay branching ratios of

  5. Extreme events evaluation over African cities with regional climate simulations

    NASA Astrophysics Data System (ADS)

    Bucchignani, Edoardo; Mercogliano, Paola; Simonis, Ingo; Engelbrecht, Francois

    2013-04-01

    The warming of the climate system in recent decades is evident from observations and is mainly related to the increase of anthropogenic greenhouse gas concentrations (IPCC, 2012). Given the expected climate change conditions on the African continent, as underlined in different publications, and their associated socio-economic impacts, an evaluation of the specific effects on some strategic African cities over the medium and long term is of crucial importance for the development of adaptation strategies. Assessments usually focus on average climate properties rather than on variability or extremes, but the latter often have greater impacts on society than average values. Global Coupled Models (GCM) are generally used to simulate future climate scenarios as they guarantee physical consistency between variables; however, due to their coarse spatial resolution, their output cannot be used for impact studies on local scales, which makes it necessary to generate higher-resolution climate change data. Regional Climate Models (RCM) better describe phenomena forced by orography or by coastlines, or related to convection. They can therefore provide more detailed information on climate extremes, which are hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws. The systematic bias of the RCM in representing the local climatology is reduced using adequate statistical techniques based on the comparison of the simulated results with long observational time series. In the framework of the EU-FP7 CLUVA (Climate Change and Urban Vulnerability in Africa) project, regional projections of climate change at high resolution (about 8 km) have been performed for selected areas surrounding five African cities. At CMCC, the regional climate model COSMO-CLM, a non-hydrostatic model, has been employed. For each domain, two simulations have been performed, considering the RCP4.5 and RCP8.5 emission scenarios.
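
    The statistical bias-adjustment step mentioned here is commonly implemented as empirical quantile mapping (an assumption; the abstract does not name the exact technique). A minimal sketch:

        import numpy as np

        def quantile_map(model_hist, obs, model_future):
            """Empirical quantile mapping: map each model value's quantile in the
            historical simulation onto the observed distribution."""
            model_hist = np.sort(np.asarray(model_hist))
            q = np.searchsorted(model_hist, model_future) / len(model_hist)
            q = np.clip(q, 0.0, 1.0)  # quantile levels of the future values
            return np.quantile(np.asarray(obs), q)  # corrected values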

  6. Anomalous event diagnosis for environmental satellite systems

    NASA Technical Reports Server (NTRS)

    Ramsay, Bruce H.

    1993-01-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS) is responsible for the operation of the NOAA geostationary and polar orbiting satellites. NESDIS provides a wide array of operational meteorological and oceanographic products and services and operates various computer and communication systems on a 24-hour, seven days per week schedule. The Anomaly Reporting System contains a database of anomalous events regarding the operations of the Geostationary Operational Environmental Satellite (GOES), communication, or computer systems that have degraded or caused the loss of GOES imagery. Data is currently entered manually via an automated query user interface. There are 21 possible symptoms (e.g., No Data), and 73 possible causes (e.g., Sectorizer - World Weather Building) of an anomalous event. The determination of an event's cause(s) is made by the on-duty computer operator, who enters the event in a paper based daily log, and by the analyst entering the data into the reporting system. The determination of the event's cause(s) impacts both the operational status of these systems, and the performance evaluation of the on-site computer and communication operations contractor.

  7. The annoyance of multiple noisy events. [ratings for simulated flyovers

    NASA Technical Reports Server (NTRS)

    Ahumada, A., Jr.; Nagel, D. C.

    1979-01-01

    A total of 24 subjects (17 M, 7 F) was tested in an experimental study of annoyance ratings of multiple noisy events (30 sets of noise bursts). The scaling technique known as functional measurement was used to determine whether annoyance integrates additively over events and, if so, to measure the power-law exponent which relates the levels of the events to the additive scale values. To this end, groups of three noises were presented at three levels in a factorial arrangement to check the additivity hypothesis and to estimate the scaling function. Also, a series of sets of noises of constant level but varying in set size was considered. The functional measurement of annoyance ratings of sets of three simulated flyovers showed that the integration of annoyance can be represented as an additive process in terms of scale values that are power functions of the sound power with a power-law exponent near 0.7.
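
    In equation form, the reported finding is that total annoyance behaves approximately as an additive combination of power-transformed sound powers,

        A \approx \sum_{i=1}^{n} k\, P_i^{\beta}, \qquad \beta \approx 0.7,

    where P_i is the sound power of the i-th event and k is a scaling constant. For example, under this model doubling the power of every event in a set multiplies the total annoyance scale value by 2^{0.7} ≈ 1.62.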

  8. Flash heat simulation events in the north Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Mazon, Jordi; Pino, David

    2013-04-01

    According to the definition of a flash heat event proposed by Mazon et al. at the European Meteorology Meetings (2011 and 2012), based on the case that occurred in the northeast of the Iberian Peninsula on 27th August 2010, several other flash heat events have been detected by automatic weather stations around the Mediterranean basin (southern Italy, the island of Crete, southern Greece, and the northeast of the Iberian Peninsula). A flash heat event covers those events in which a large increase of temperature lasts for a spatial and temporal scale between a heat wave (defined by the WMO as a phenomenon in which the daily maximum temperature of more than five consecutive days exceeds the average maximum temperature by 5°C, with respect to the 1961-1990 period) and a heat burst (defined by the AMS as a rare atmospheric event characterized by gusty winds and a rapid increase in temperature and decrease in humidity that can last some minutes). Thus a flash heat event may be considered a rapid modification of the temperature that lasts several hours, less than 48 hours but usually less than 24 hours. Two different flash heat events have been simulated with the WRF mesoscale model in the Mediterranean basin. The results show that two different mechanisms are the main causes of these flash heat events. The first event occurred on 23rd March 2008 on the island of Crete due to a strong Foehn effect caused by strong south and southeast winds, during which the maximum temperature increased over some hours of the night to 32°C. The second occurred on 1st August 2012 in the northeast of the Iberian Peninsula, caused by the rapid displacement of a warm ridge from North Africa that lasted around 24 hours.

  9. Analyses Of Transient Events In Complex Valve and Feed Systems

    NASA Technical Reports Server (NTRS)

    Ahuja, Vineet; Hosangadi, Ashvin; Cavallo, Peter; Daines, Russell

    2005-01-01

    Valve systems in rocket propulsion systems and testing facilities are constantly subject to dynamic events resulting from the timing of valve motion, leading to unsteady fluctuations in pressure and mass flow. Such events can also be accompanied by cavitation, resonance, and system vibration, potentially leading to catastrophic failure. High-fidelity dynamic computational simulations of valve operation can yield important information about valve response to varying flow conditions. Prediction of transient behavior related to valve motion can serve as a guideline for valve scheduling, which is of crucial importance in engine operation and testing. In this paper, we present simulations of the diverse unsteady phenomena related to valve and feed systems, including valve stall, valve timing studies, and cavitation instabilities in components utilized in the test loop.

  10. Three Dimensional Simulation of the Baneberry Nuclear Event

    SciTech Connect

    Lomov, I

    2003-07-16

    Baneberry, a 10-kiloton nuclear event, was detonated at a depth of 278 m at the Nevada Test Site on December 18, 1970. Shortly after detonation, radioactive gases emanating from the cavity were released into the atmosphere through a shock-induced fissure near surface ground zero. Extensive geophysical investigations, coupled with a series of 1D and 2D computational studies were used to reconstruct the sequence of events that led to the catastrophic failure. However, the geological profile of the Baneberry site is complex and inherently three-dimensional, which meant that some geological features had to be simplified or ignored in the 2D simulations. This left open the possibility that features unaccounted for in the 2D simulations could have had an important influence on the eventual containment failure of the Baneberry event. This paper presents results from a high-fidelity 3D Baneberry simulation based on the most accurate geologic and geophysical data available. The results are compared with available data, and contrasted against the results of the previous 2D computational studies.

  11. Interoperability Standards for Medical Simulation Systems

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Diallo, Saikou Y.; Padilla, Jose J.

    2012-01-01

    The Modeling and Simulation Community successfully developed and applied interoperability standards like the Distributed Interactive Simulation (DIS) protocol (IEEE 1278) and the High Level Architecture (HLA) (IEEE 1516). These standards were applied for world-wide distributed simulation events for several years. However, this paper shows that some of the assumptions and constraints underlying the philosophy of these current standards are not valid for Medical Simulation Systems. This paper describes the standards, the philosophy and the limits for medical applications and recommends necessary extensions of the standards to support medical simulation.

  12. Automated estimation of rare event probabilities in biochemical systems

    NASA Astrophysics Data System (ADS)

    Daigle, Bernie J.; Roh, Min K.; Gillespie, Dan T.; Petzold, Linda R.

    2011-01-01

    In biochemical systems, the occurrence of a rare event can be accompanied by catastrophic consequences. Precise characterization of these events using Monte Carlo simulation methods is often intractable, as the number of realizations needed to witness even a single rare event can be very large. The weighted stochastic simulation algorithm (wSSA) [J. Chem. Phys. 129, 165101 (2008)] and its subsequent extension [J. Chem. Phys. 130, 174103 (2009)] alleviate this difficulty with importance sampling, which effectively biases the system toward the desired rare event. However, extensive computation coupled with substantial insight into a given system is required, as there is currently no automatic approach for choosing wSSA parameters. We present a novel modification of the wSSA—the doubly weighted SSA (dwSSA)—that makes possible a fully automated parameter selection method. Our approach uses the information-theoretic concept of cross entropy to identify parameter values yielding minimum variance rare event probability estimates. We apply the method to four examples: a pure birth process, a birth-death process, an enzymatic futile cycle, and a yeast polarization model. Our results demonstrate that the proposed method (1) enables probability estimation for a class of rare events that cannot be interrogated with the wSSA, and (2) for all examples tested, reduces the number of runs needed to achieve comparable accuracy by multiple orders of magnitude. For a particular rare event in the yeast polarization model, our method transforms a projected simulation time of 600 years to three hours. Furthermore, by incorporating information-theoretic principles, our approach provides a framework for the development of more sophisticated influencing schemes that should further improve estimation accuracy.
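
    The core of the wSSA idea (a generic sketch with placeholder propensity and bias inputs, not the paper's examples) is to bias the reaction selection toward the rare event while accumulating a likelihood-ratio weight that keeps the probability estimate unbiased:

        import random

        def weighted_ssa_step(x, t, propensities, biases, rng):
            """One biased SSA step: pick a reaction from biased propensities and
            return the likelihood-ratio weight factor for this step."""
            a = propensities(x)                          # true propensities a_j(x)
            b = [aj * gj for aj, gj in zip(a, biases)]   # biased propensities
            a0, b0 = sum(a), sum(b)
            t += rng.expovariate(a0)                     # time uses the TRUE total rate
            r, acc, j = rng.random() * b0, 0.0, -1
            while acc <= r:                              # select reaction under the bias
                j += 1
                acc += b[j]
            weight = (a[j] / a0) / (b[j] / b0)           # likelihood ratio for this choice
            return j, t, weight

        # A rare-event probability is then estimated as the average over many
        # trajectories of (event indicator) * (product of step weights); the
        # dwSSA's contribution is choosing `biases` automatically via cross entropy.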

  13. Automated estimation of rare event probabilities in biochemical systems

    PubMed Central

    Daigle, Bernie J.; Roh, Min K.; Gillespie, Dan T.; Petzold, Linda R.

    2011-01-01

    In biochemical systems, the occurrence of a rare event can be accompanied by catastrophic consequences. Precise characterization of these events using Monte Carlo simulation methods is often intractable, as the number of realizations needed to witness even a single rare event can be very large. The weighted stochastic simulation algorithm (wSSA) [J. Chem. Phys. 129, 165101 (2008)] and its subsequent extension [J. Chem. Phys. 130, 174103 (2009)] alleviate this difficulty with importance sampling, which effectively biases the system toward the desired rare event. However, extensive computation coupled with substantial insight into a given system is required, as there is currently no automatic approach for choosing wSSA parameters. We present a novel modification of the wSSA—the doubly weighted SSA (dwSSA)—that makes possible a fully automated parameter selection method. Our approach uses the information-theoretic concept of cross entropy to identify parameter values yielding minimum variance rare event probability estimates. We apply the method to four examples: a pure birth process, a birth-death process, an enzymatic futile cycle, and a yeast polarization model. Our results demonstrate that the proposed method (1) enables probability estimation for a class of rare events that cannot be interrogated with the wSSA, and (2) for all examples tested, reduces the number of runs needed to achieve comparable accuracy by multiple orders of magnitude. For a particular rare event in the yeast polarization model, our method transforms a projected simulation time of 600 years to three hours. Furthermore, by incorporating information-theoretic principles, our approach provides a framework for the development of more sophisticated influencing schemes that should further improve estimation accuracy. PMID:21280690

  14. A Deuteron Quasielastic Event Simulation for CLAS12

    NASA Astrophysics Data System (ADS)

    Alam, Omair; Gilfoyle, Gerard

    2014-09-01

    An experiment to measure the neutron magnetic form factor (GnM) is planned for the new CLAS12 detector in Hall B at Jefferson Lab. This form factor is extracted from the ratio of quasielastic electron-neutron to electron-proton scattering off a liquid deuterium (LD2) target. The QUasiElastic Event Generator (queeg) models the internal motion of the nucleons in deuterium. It extends a previous version used at Jefferson Lab. The program generates events that are used as input to the Geant4 Monte Carlo (gemc), a program that simulates the particles' interactions with each component of CLAS12, including the target material. The source code for queeg was modified to produce output in the LUND format, set the position of the center of the LD2 target, and simulate a realistic deuterium target. The event vertex was randomly distributed along the beamline in the target region, and von Neumann rejection was used to select random points in the plane transverse to the beamline within a fixed radius from the beam. An initial study of the impact of the target structure and material revealed only limited effects.

  15. Advances in Discrete-Event Simulation for MSL Command Validation

    NASA Technical Reports Server (NTRS)

    Patrikalakis, Alexander; O'Reilly, Taifun

    2013-01-01

    In the last five years, the discrete event simulator SEQuence GENerator (SEQGEN), developed at the Jet Propulsion Laboratory to plan deep-space missions, has greatly increased uplink operations capacity to deal with increasingly complicated missions. In this paper, we describe how the Mars Science Laboratory (MSL) project makes full use of an interpreted environment to simulate changes in more than fifty thousand flight software parameters and conditional command sequences, to predict the result of executing a conditional branch in a command sequence, and to warn users whenever one or more simulated spacecraft states change in an unexpected manner. Using these new SEQGEN features, operators plan more activities in one sol than ever before.

  17. Simulated Changes in Extreme Temperature and Precipitation Events at 6 ka

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.; Bell, J. L.; Sloan, L. C.

    2003-12-01

    Paleoenvironmental archives record a range of information about past environments. Three key influences shaping paleoclimate records at a given time plane are the mean state of the climate system, interannual variability, and the frequency and seasonality of extreme climate events. We have employed a high-resolution regional climate model (RCM) to test the sensitivity of extreme climate events to 6 ka orbital forcing, using western North America as a case study. Extreme precipitation and temperature events were defined by the distribution of daily precipitation and temperature values in the control simulation. Simulated anomalies (6 ka - control) in the number of extreme precipitation events per year were positive throughout the RCM domain, as were anomalies in the percent of annual precipitation delivered by extreme precipitation events. These annual-scale positive anomalies in extreme precipitation were driven by changes in the seasonality of extreme precipitation events at 6 ka, with January, October and November showing the greatest positive anomalies in the percent of monthly precipitation delivered by extreme precipitation events. The frequency and length of extreme temperature events in the western United States were also sensitive to 6 ka orbital forcing. Positive anomalies in the frequency of extreme maximum daily temperature values occurred inland in the RCM domain, with peak anomalies of 24 days/year centered over the Great Basin. Likewise, the number of days/year on which the maximum daily temperature exceeded 32°C increased over land by 24%, with the average heat wave up to 12 days longer in the 6 ka simulation than in the control simulation. Finally, mean first and last freeze dates were later inland in the 6 ka simulation than in the control simulation.
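
    The event definition used here, with thresholds taken from the control run's daily distribution, can be made concrete with a short sketch (the 95th percentile below is an illustrative cutoff; the abstract does not state the exact one):

        import numpy as np

        def extreme_event_anomaly(control_daily, experiment_daily, pct=95.0, days_per_year=365.25):
            """Anomaly (experiment - control) in extreme events per year, with the
            extreme threshold defined from the CONTROL simulation only."""
            control_daily = np.asarray(control_daily)
            experiment_daily = np.asarray(experiment_daily)
            threshold = np.percentile(control_daily, pct)
            ctrl_rate = np.sum(control_daily > threshold) / (len(control_daily) / days_per_year)
            exp_rate = np.sum(experiment_daily > threshold) / (len(experiment_daily) / days_per_year)
            return exp_rate - ctrl_rate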

  18. Identification of coronal heating events in 3D simulations

    NASA Astrophysics Data System (ADS)

    Kanella, Charalambos; Gudiksen, Boris V.

    2017-07-01

    Context. The solar coronal heating problem has been an open question in the science community since 1939. One of the proposed models for the transport and release of mechanical energy generated in the sub-photospheric layers and photosphere is the magnetic reconnection model that incorporates Ohmic heating, which releases a part of the energy stored in the magnetic field. In this model many unresolved flaring events occur in the solar corona, releasing enough energy to heat the corona. Aims: The problem with the verification and quantification of this model is that we cannot resolve small-scale events due to limitations of the current observational instrumentation. Flaring events have scaling behavior extending from large X-class flares down to the so far unobserved nanoflares. Histograms of observable characteristics of flares show power-law behavior for energy release rate, size, and total energy. Depending on the power-law index of the energy release, nanoflares might be an important candidate for coronal heating; we seek to find that index. Methods: In this paper we employ a numerical three-dimensional (3D) magnetohydrodynamic (MHD) simulation produced by the numerical code Bifrost, which enables us to look into smaller structures, and a new technique to identify the 3D heating events at a specific instant. The quantity we explore is the Joule heating, a term calculated directly by the code, which is explicitly correlated with magnetic reconnection because it depends on the curl of the magnetic field. Results: We are able to identify 4136 events in a volume of 24 × 24 × 9.5 Mm3 (i.e., 768 × 786 × 331 grid cells) of a specific snapshot. We find a power-law slope of the released energy per second equal to αP = 1.5 ± 0.02, and two power-law slopes of the identified volume equal to αV = 1.53 ± 0.03 and αV = 2.53 ± 0.22. The identified energy events do not represent all the released energy, but of the identified events, the total energy of the largest events

  19. Generalized Detectability for Discrete Event Systems

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectabilities: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on observer whose construction is of exponential complexity, while the new algorithms are based on a new automaton called detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory on detectability of discrete event systems becomes more applicable in solving many practical problems. PMID:21691432
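
    The state-estimate construction underlying these definitions can be sketched in a few lines: the observer tracks the set of states consistent with the observed string (unobservable events and the polynomial detector construction are omitted for brevity; this is a generic sketch, not the paper's algorithm):

        def observe(transitions, estimate, event):
            """One observer step for a possibly nondeterministic automaton.
            `transitions` maps (state, event) -> set of successor states."""
            return {s2 for s in estimate for s2 in transitions.get((s, event), set())}

        # Nondeterministic example: from state 0, event 'a' may lead to 1 or 2.
        T = {(0, "a"): {1, 2}, (1, "b"): {3}, (2, "b"): {3}}
        est = {0}
        for e in ["a", "b"]:
            est = observe(T, est, e)
        print(est)  # {3}: the current state is determined after observing "ab"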

  20. Sequential Events Control System (SECS) Overview

    NASA Technical Reports Server (NTRS)

    Interbartolo, Michael

    2009-01-01

    This slide presentation covers the Sequential Events Control System (SECS), the Apollo spacecraft subsystem that controls the automatically sequenced functions during the mission and during any aborts that could be performed. Included in this presentation are its general architecture, its integration into and use of the spacecraft's other systems, and details on the functions it is responsible for controlling during the mission. The objectives are to describe the system's architecture, the major components in the system, and the major system functions.

  1. Event-driven simulation of cerebellar granule cells.

    PubMed

    Carrillo, Richard R; Ros, Eduardo; Tolu, Silvia; Nieus, Thierry; D'Angelo, Egidio

    2008-01-01

    Around half of the neurons of a human brain are granule cells (approximately 10^11 granule neurons) [Kandel, E.R., Schwartz, J.H., Jessell, T.M., 2000. Principles of Neural Science. McGraw-Hill Professional Publishing, New York]. In order to study in detail the functional role of the intrinsic features of this cell, we have developed a pre-compiled behavioural model based on the simplified granule-cell model of Bezzi et al. [Bezzi, M., Nieus, T., Arleo, A., D'Angelo, E., Coenen, O.J.-M.D., 2004. Information transfer at the mossy fiber-granule cell synapse of the cerebellum. 34th Annual Meeting, Society for Neuroscience, San Diego, CA, USA]. We can use an efficient event-driven simulation scheme based on lookup tables (EDLUT) [Ros, E., Carrillo, R.R., Ortigosa, E.M., Barbour, B., Agís, R., 2006. Event-driven simulation scheme for spiking neural networks using lookup tables to characterize neuronal dynamics. Neural Computation 18 (12), 2959-2993]. For this purpose it is necessary to compile into tables the data obtained through a massive numerical calculation of the simplified cell model. This allows network simulations requiring minimal numerical calculation. Three major features are considered functionally relevant in the simplified granule cell model: bursting, subthreshold oscillations and resonance. In this work we describe how the cell model is compiled into tables while keeping these key properties of the neuron model.
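
    The lookup-table scheme can be sketched as follows (with placeholder exponential dynamics standing in for the granule-cell model; EDLUT's actual tables cover bursting, oscillations and resonance): the offline numerical work is spent filling the table, so that at run time each input spike costs only an interpolation.

        import numpy as np

        TAU = 0.02                              # membrane time constant (s), placeholder
        DTS = np.linspace(0.0, 0.2, 2001)       # elapsed-time axis of the table
        DECAY_TABLE = np.exp(-DTS / TAU)        # precompiled offline, as in EDLUT's tables

        def update_on_event(v, t_last, t_event, w_syn):
            """Event-driven update: no integration between events, just a lookup."""
            decay = np.interp(t_event - t_last, DTS, DECAY_TABLE)  # table interpolation
            return v * decay + w_syn, t_event   # decay the state, then add the spike

        v, t_last = 0.0, 0.0
        for t_event, w in [(0.01, 1.0), (0.03, 1.0), (0.10, 0.5)]:
            v, t_last = update_on_event(v, t_last, t_event, w)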

  2. High resolution simulations of extreme weather event in south Sardinia

    NASA Astrophysics Data System (ADS)

    Dessy, C.

    2010-05-01

    In the last decade, like most regions of Mediterranean Europe, Sardinia has experienced severe precipitation events generating flash floods that result in loss of lives and large economic damage. A numerical meteorological operational set-up is applied in the local weather service with the aim of improving its operational short-range weather forecasts, with particular attention to intense, mostly rare and potentially severe, events. In the early hours of 22 October 2008 an intense and almost stationary mesoscale convective system affected particularly the south of Sardinia; heavy precipitation caused a flash flood with fatalities and numerous property damages. The event was particularly intense: about 400 mm of rain in 12 hours (with a peak of 150 mm in one hour) were measured by the regional network of weather stations, and these values are extremely meaningful since they are about seven times the climatological monthly rainfall for that area and nearly the climatological annual rainfall. With the aim of significantly improving quantitative precipitation forecasting, different set-ups of a high-resolution convection-resolving model (MM5) initialised with different initial and boundary conditions (ECMWF and NCAR) were evaluated. This paper discusses the meteorological system related to the mentioned event using different numerical weather models (GCM and LAM) combined with conventional data, Doppler radar and Meteosat images. Preliminary results indicate that a suitable set-up of a non-hydrostatic model can forecast severe convection events about one day in advance and produce more realistic rainfall than the current operational configuration, improving on the ECMWF GCM forecasts. It could therefore drive an operational alert system intended to limit the risks associated with heavy precipitation events.

  3. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    SciTech Connect

    Wilke, Jeremiah J; Kenny, Joseph P.

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, a design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e., to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the Structural Simulation Toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics, such as call graphs, to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.

  4. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.

  5. Coupling expert systems and simulation

    NASA Technical Reports Server (NTRS)

    Kawamura, K.; Beale, G.; Padalkar, S.; Rodriguez-Moscoso, J.; Hsieh, B. J.; Vinz, F.; Fernandez, K. R.

    1988-01-01

    A prototype coupled system called NESS (NASA Expert Simulation System) is described. NESS assists the user in running digital simulations of dynamic systems, interprets the output data to performance specifications, and recommends a suitable series compensator to be added to the simulation model.

  6. 3D Simulation of External Flooding Events for the RISMC Pathway

    SciTech Connect

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad; Smith, Curtis; Lin, Linyu

    2015-09-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated physics simulation toolkits. In this report, we describe such approaches specific to flooding-based analysis using a method called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and provides visual understanding that helps validate the flooding analysis.
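
    In standard Smoothed Particle Hydrodynamics (the generic formulation; the report's implementation details are not given in this abstract), the water is represented by particles and field values are kernel-weighted sums over neighbors, e.g. the density estimate

        \rho_i = \sum_j m_j \, W(|\mathbf{r}_i - \mathbf{r}_j|, h),

    where m_j are particle masses, h is the smoothing length, and W is a compactly supported smoothing kernel. Pressure forces then follow from symmetrized kernel gradients, which is what makes free-surface flooding flows tractable without a fixed mesh.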

  7. Thermodynamic MHD Simulation of the Bastille Day Event

    NASA Astrophysics Data System (ADS)

    Torok, Tibor; Downs, Cooper; Lionello, Roberto; Linker, Jon A.; Mikic, Zoran; Titov, Viacheslav S.; Riley, Pete

    2014-05-01

    The "Bastille Day" event on July 14, 2000 is one of the most extensively studied solar eruptions. It originated in a complex active region close to disk center and produced an X5.7 flare, a fast halo CME, and an intense geomagnetic storm. We have recently begun to model this challenging event, with the final goal to simulate its whole evolution, from the pre-eruptive state to the CME's arrival at 1 AU. To this end, we first produce a steady-state MHD solution of the background corona that incorporates realistic energy transport ("thermodynamic MHD"), photospheric magnetic field measurements, and the solar wind. In order to model the pre-eruptive magnetic field, we then insert into this solution a stable, elongated flux rope that resides above the highly curved polarity inversion line of the active region. Finally, we produce an eruption by imposing photospheric flows that slowly converge towards the polarity inversion line. In this presentation we describe our method, compare the simulation results with the observations, and discuss the challenges and limitations involved in modeling such complex and powerful eruptions.

  8. Simulation and event reconstruction inside the PandaRoot framework

    NASA Astrophysics Data System (ADS)

    Spataro, S.

    2008-07-01

    The PANDA detector will be located at the future GSI accelerator FAIR. Its primary objective is the investigation of the strong interaction using anti-proton beams with incoming momenta of up to 15 GeV/c. The PANDA offline simulation framework is called 'PandaRoot', as it is based upon the ROOT 5.14 package. It is characterized by high versatility: it allows the user to perform simulation and analysis and to run different event generators (EvtGen, Pluto, UrQmd) and different transport models (Geant3, Geant4, Fluka) with the same code, and thus to compare results simply by changing a few macro lines, without recompiling at all. Moreover, auto-configuration scripts allow the full framework to be installed easily on different Linux distributions and with different compilers (the framework was installed and tested on more than 10 Linux platforms) without further manipulation. The final data are in a tree format, easily accessible and readable through simple clicks in the ROOT browser. The presentation will report on the current status of the computing development inside the PandaRoot framework, in terms of detector implementation and event reconstruction.

  9. Modeling solar energetic particle events using ENLIL heliosphere simulations

    NASA Astrophysics Data System (ADS)

    Luhmann, J. G.; Mays, M. L.; Odstrcil, D.; Li, Yan; Bain, H.; Lee, C. O.; Galvin, A. B.; Mewaldt, R. A.; Cohen, C. M. S.; Leske, R. A.; Larson, D.; Futaana, Y.

    2017-07-01

    Solar energetic particle (SEP) event modeling has gained renewed attention in part because of the availability of a decade of multipoint measurements from STEREO and L1 spacecraft at 1 AU. These observations are coupled with improving simulations of the geometry and strength of heliospheric shocks obtained by using coronagraph images to send erupted material into realistic solar wind backgrounds. The STEREO and ACE measurements in particular have highlighted the sometimes surprisingly widespread nature of SEP events. It is thus an opportune time for testing SEP models, which typically focus on 1-100 MeV protons, both for physical insight into these observations and for potentially useful space radiation environment forecasting tools. Some approaches emphasize the concept of particle acceleration and propagation from close to the Sun, while others emphasize the local field line connection to a traveling, evolving shock source. Among the latter is the previously introduced SEPMOD treatment, based on the widely accessible and well-exercised WSA-ENLIL-cone model. SEPMOD produces SEP proton time profiles at any location within the ENLIL domain. Here we demonstrate a SEPMOD version that accommodates multiple, concurrent shock sources occurring over periods of several weeks. The results illustrate the importance of considering longer-duration time periods and multiple CME contributions in analyzing, modeling, and forecasting SEP events.

  10. Transportation Analysis Simulation System

    SciTech Connect

    2004-08-23

    TRANSIMS version 3.1 is an integrated set of analytical and simulation models and supporting databases. The system is designed to create a virtual metropolitan region with a representation of each of the region's individuals, their activities, and the transportation infrastructure they use. TRANSIMS puts into practice a new, disaggregate approach to travel demand modeling using agent-based micro-simulation technology. The TRANSIMS methodology creates a virtual metropolitan region with a representation of the transportation infrastructure and the population, at the level of households and individual travelers. Trips are planned to satisfy the population's activity patterns at the individual traveler level. TRANSIMS then simulates the movement of travelers and vehicles across the transportation network using multiple modes, including car, transit, bike and walk, on a second-by-second basis. Metropolitan planners must plan the growth of their cities according to the stringent transportation system planning requirements of the Intermodal Surface Transportation Efficiency Act of 1991, the Clean Air Act Amendments of 1990, and other similar laws and regulations. These require each state and its metropolitan regions to work together to develop short- and long-term transportation improvement plans. The plans must (1) estimate the future transportation needs for travelers and goods movements, (2) evaluate ways to manage and reduce congestion, (3) examine the effectiveness of building new roads and transit systems, and (4) limit the environmental impact of the various strategies. Such consistent and accurate transportation improvement plans require an analytical capability that properly accounts for travel demand, human behavior, traffic and transit operations, major investments, and environmental effects. Other existing planning tools use aggregated information and representative behavior to predict average response and average use of transportation facilities. They do not account

  11. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning. The emergency evacuation of large commercial shopping areas, a typical service system, is one of the hot research topics. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation from a commercial shopping mall. Pedestrian walking is based on Cellular Automata and the event-driven model. In this paper, the event-driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For the simulation of pedestrians' movement routes, the model takes into account the purchase intentions of customers and the density of pedestrians. The evacuation model combining Cellular Automata with a Dynamic Floor Field and the event-driven model can reflect the behavioral characteristics of customers and clerks in normal situations and during emergency evacuation. The distribution of individual evacuation times as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
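
    A common transition rule for CA pedestrian models with a dynamic floor field (the standard Kirchner-style formulation, given here for orientation; the paper's exact rule may differ) assigns each neighboring cell (i, j) the probability

        p_{ij} = N \exp(k_S S_{ij}) \exp(k_D D_{ij}) (1 - n_{ij}) \xi_{ij},

    where S is the static field (e.g. distance to exits or goods), D the dynamic field deposited by passing pedestrians, n_{ij} the cell occupancy, \xi_{ij} an obstacle mask, k_S and k_D sensitivity parameters, and N a normalization constant.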

  12. Simulation of Ionospheric Response During Solar Eclipse Events

    NASA Astrophysics Data System (ADS)

    Kordella, L.; Earle, G. D.; Huba, J.

    2016-12-01

    Total solar eclipses are rare, short-duration events that present interesting case studies of ionospheric behavior, because the structure of the ionosphere is determined and stabilized by varying energies of solar radiation (Lyman alpha, X-ray, U.V., etc.). The ionospheric response to eclipse events is a source of scientific intrigue that has been studied in various capacities over the past 50 years. Unlike the daily terminator crossings, eclipses cause highly localized, steep gradients of ionization efficiency due to their comparatively small solar zenith angle. However, the corona remains present even at full obscuration, meaning that the energy reduction never falls to the levels seen at night. Previous eclipse studies performed by research groups in the US, UK, China and Russia have shown a range of effects, some counter-intuitive and others contradictory. In the shadowed region of an eclipse (i.e., the umbra) it is logical to assume a reduction in ionization rates correlating with the reduction of incident solar radiation. Results have shown that even this straightforward hypothesis may not be true; effects on plasma distribution, motion and temperature are more appreciable than might be expected. Recent advancements in ionospheric simulation codes present the opportunity to investigate how geophysical conditions and geomagnetic location shape the resulting eclipse-time ionosphere. Here we present computational simulation results using the Naval Research Lab (NRL) ionospheric modeling codes Sami2 and Sami3 (Sami2 is Another Model of the Ionosphere) modified with spatio-temporal photoionization attenuation functions derived from theory and empirical data.
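
    The spatio-temporal attenuation idea can be caricatured as a multiplicative mask on the photoionization rate (a toy form with an assumed residual coronal fraction; the actual functions used with Sami2/Sami3 are derived from theory and empirical data):

        def euv_attenuation(obscuration, coronal_floor=0.1):
            """Toy attenuation factor for photoionization during an eclipse: falls
            with disk obscuration but never below the residual coronal fraction
            (the 0.1 floor is an assumption, not the NRL value)."""
            return coronal_floor + (1.0 - coronal_floor) * (1.0 - obscuration)

        # At totality (obscuration = 1.0) ionization drops to the coronal floor
        # rather than to zero, unlike a nighttime terminator crossing.
        print(euv_attenuation(1.0))  # 0.1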

  13. Weather Climate Interactions and Extreme Events in the Climate System

    NASA Astrophysics Data System (ADS)

    Roundy, P. E.

    2015-12-01

    The most pronounced local impacts of climate change would occur in association with extreme weather events superimposed on the altered climate. Thus a major thrust of recent efforts in the climate community has been to assess how extreme regional events such as cold air outbreaks, heat waves, tropical cyclones, floods, droughts, and severe weather might change with the climate. Many of these types of events are poorly simulated in climate models because of insufficient spatial resolution and insufficient quality of the parameterization of sub-grid-scale convection and radiation processes. This talk summarizes examples, selected from those discussed below, of how weather and climate events can be interconnected so that the physics of natural climate and weather phenomena depend on each other, thereby complicating our ability to simulate extreme events. A major focus of the chapter is on the Madden-Julian oscillation (MJO), which is associated with alternating eastward-moving planetary-scale regions of enhanced and suppressed moist deep convection favoring warm pool regions in the tropics. The MJO modulates weather events around the world and influences the evolution of interannual climate variability. We first discuss how the MJO evolves together with the seasonal cycle, the El Niño/Southern Oscillation (ENSO), and the extratropical circulation, then continue with a case study illustration of how El Niño is intrinsically coupled to intraseasonal and synoptic weather events such as the MJO and westerly wind bursts. This interconnectedness in the system implies that modeling many types of regional extreme weather events requires more than simply downscaling coarse climate model signals to nested regional models, because extreme outcomes in a region can depend on poorly simulated extreme weather in distant parts of the world. The authors hope that an improved understanding of these types of interactions between signals across scales of time and space will ultimately yield

  14. Cardiovascular Events in Systemic Lupus Erythematosus

    PubMed Central

    Fernández-Nebro, Antonio; Rúa-Figueroa, Íñigo; López-Longo, Francisco J.; Galindo-Izquierdo, María; Calvo-Alén, Jaime; Olivé-Marqués, Alejandro; Ordóñez-Cañizares, Carmen; Martín-Martínez, María A.; Blanco, Ricardo; Melero-González, Rafael; Ibáñez-Rúan, Jesús; Bernal-Vidal, José Antonio; Tomero-Muriel, Eva; Uriarte-Isacelaya, Esther; Horcada-Rubio, Loreto; Freire-González, Mercedes; Narváez, Javier; Boteanu, Alina L.; Santos-Soler, Gregorio; Andreu, José L.; Pego-Reigosa, José M.

    2015-01-01

    Abstract This article estimates the frequency of cardiovascular (CV) events that occurred after diagnosis in a large Spanish cohort of patients with systemic lupus erythematosus (SLE) and investigates the main risk factors for atherosclerosis. RELESSER is a nationwide multicenter, hospital-based registry of SLE patients. This is a cross-sectional study. Demographic and clinical variables, the presence of traditional risk factors, and CV events were collected. A CV event was defined as a myocardial infarction, angina, stroke, and/or peripheral artery disease. Multiple logistic regression analysis was performed to investigate the possible risk factors for atherosclerosis. From 2011 to 2012, 3658 SLE patients were enrolled. Of these, 374 (10.9%) patients suffered at least a CV event. In 269 (7.4%) patients, the CV events occurred after SLE diagnosis (86.2% women, median [interquartile range] age 54.9 years [43.2–66.1], and SLE duration of 212.0 months [120.8–289.0]). Strokes (5.7%) were the most frequent CV event, followed by ischemic heart disease (3.8%) and peripheral artery disease (2.2%). Multivariate analysis identified age (odds ratio [95% confidence interval], 1.03 [1.02–1.04]), hypertension (1.71 [1.20–2.44]), smoking (1.48 [1.06–2.07]), diabetes (2.2 [1.32–3.74]), dyslipidemia (2.18 [1.54–3.09]), neurolupus (2.42 [1.56–3.75]), valvulopathy (2.44 [1.34–4.26]), serositis (1.54 [1.09–2.18]), antiphospholipid antibodies (1.57 [1.13–2.17]), low complement (1.81 [1.12–2.93]), and azathioprine (1.47 [1.04–2.07]) as risk factors for CV events. We have confirmed that SLE patients suffer a high prevalence of premature CV disease. Both traditional and nontraditional risk factors contribute to this higher prevalence. Although it needs to be verified with future studies, our study also shows—for the first time—an association between diabetes and CV events in SLE patients. PMID:26200625

  15. Simulating neural systems with Xyce.

    SciTech Connect

    Schiek, Richard Louis; Thornquist, Heidi K.; Mei, Ting; Warrender, Christina E.; Aimone, James Bradley; Teeter, Corinne; Duda, Alex M.

    2012-12-01

    Sandia's parallel circuit simulator, Xyce, can address large-scale neuron simulations in a new way, extending the range within which one can perform high-fidelity, multi-compartment neuron simulations. This report documents the implementation of neuron devices in Xyce and their use in the simulation and analysis of neuron systems.

  16. Corpuscular event-by-event simulation of quantum optics experiments: application to a quantum-controlled delayed-choice experiment

    NASA Astrophysics Data System (ADS)

    De Raedt, Hans; Delina, M.; Jin, Fengping; Michielsen, Kristel

    2012-11-01

    A corpuscular simulation model of optical phenomena that does not require knowledge of the solution of a wave equation of the whole system, and that reproduces the results of Maxwell's theory by generating detection events one by one, is discussed. The event-based corpuscular model gives a unified description of multiple-beam fringes of a plane parallel plate, and of single-photon Mach-Zehnder interferometer, Wheeler's delayed-choice, photon tunneling, quantum eraser, two-beam interference, Einstein-Podolsky-Rosen-Bohm, and Hanbury Brown-Twiss experiments. The approach is illustrated by applying it to a recent proposal for a quantum-controlled delayed-choice experiment, demonstrating that this thought experiment, too, can be understood in terms of particle processes only.

  17. Modeling a Million-Node Slim Fly Network Using Parallel Discrete-Event Simulation

    SciTech Connect

    Wolfe, Noah; Carothers, Christopher; Mubarak, Misbah; Ross, Robert; Carns, Philip

    2016-05-15

    As supercomputers close in on exascale performance, the increased number of processors and processing power translates to an increased demand on the underlying network interconnect. The Slim Fly network topology, a new low-diameter and low-latency interconnection network, is gaining interest as one possible solution for next-generation supercomputing interconnect systems. In this paper, we present a high-fidelity, flit-level Slim Fly model leveraging the Rensselaer Optimistic Simulation System (ROSS) and Co-Design of Exascale Storage (CODES) frameworks. We validate our Slim Fly model against the results of the Kathareios et al. Slim Fly model at moderately sized network scales. We further scale the model up to an unprecedented 1 million compute nodes, and through visualization of network simulation metrics such as link bandwidth, packet latency, and port occupancy, we gain insight into network behavior at the million-node scale. We also show linear strong scaling of the Slim Fly model on an Intel cluster, achieving a peak event rate of 36 million events per second using 128 MPI tasks to process 7 billion events. Detailed analysis of the underlying discrete-event simulation performance shows that a million-node Slim Fly model simulation can execute in 198 seconds on the Intel cluster.

  18. Simulating and Forecasting Flooding Events in the City of Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Ghostine, Rabih; Viswanadhapalli, Yesubabu; Hoteit, Ibrahim

    2014-05-01

    Metropolitan cities in the Kingdom of Saudi Arabia, such as Jeddah and Riyadh, are experiencing flooding events more frequently, caused by strong convective storms that produce intense precipitation over a short span of time. The flooding in the city of Jeddah in November 2009 was described by civil defense officials as the worst in 27 years. As of January 2010, 150 people were reported killed and more than 350 were missing. Another flooding event, less damaging but comparably spectacular, occurred one year later (Jan 2011) in Jeddah. Anticipating floods before they occur could minimize human and economic losses through the implementation of appropriate protection, provision, and rescue plans. We have developed a coupled hydro-meteorological model for simulating and predicting flooding events in the city of Jeddah. We use the Weather Research and Forecasting (WRF) model, assimilating all available data in the Jeddah region, to simulate the storm events. The resulting rainfall is then used at 10-minute intervals to drive an advanced numerical shallow-water model discretized on an unstructured grid using different numerical schemes based on finite-element or finite-volume techniques. The model was integrated on a high-resolution grid whose size varies between 0.5 m within the streets of Jeddah and 500 m outside the city. This contribution will present the flooding simulation system and the simulation results, focusing on how the different numerical schemes affect system performance in terms of accuracy and computational efficiency.

  19. Constraints on Cumulus Parameterization from Simulations of Observed MJO Events

    NASA Technical Reports Server (NTRS)

    Del Genio, Anthony; Wu, Jingbo; Wolf, Audrey B.; Chen, Yonghua; Yao, Mao-Sung; Kim, Daehyun

    2015-01-01

    Two recent activities offer an opportunity to test general circulation model (GCM) convection and its interaction with large-scale dynamics for observed Madden-Julian oscillation (MJO) events. This study evaluates the sensitivity of the Goddard Institute for Space Studies (GISS) GCM to entrainment, rain evaporation, downdrafts, and cold pools. Single Column Model versions that restrict weakly entraining convection produce the most realistic dependence of convection depth on column water vapor (CWV) during the Atmospheric Radiation Measurement MJO Investigation Experiment at Gan Island. Differences among models are primarily at intermediate CWV where the transition from shallow to deeper convection occurs. GCM 20-day hindcasts during the Year of Tropical Convection that best capture the shallow–deep transition also produce strong MJOs, with significant predictability compared to Tropical Rainfall Measuring Mission data. The dry anomaly east of the disturbance on hindcast day 1 is a good predictor of MJO onset and evolution. Initial CWV there is near the shallow–deep transition point, implicating premature onset of deep convection as a predictor of a poor MJO simulation. Convection weakly moistens the dry region in good MJO simulations in the first week; weakening of large-scale subsidence over this time may also affect MJO onset. Longwave radiation anomalies are weakest in the worst model version, consistent with previous analyses of cloud/moisture greenhouse enhancement as the primary MJO energy source. The authors’ results suggest that both cloud-/moisture-radiative interactions and convection–moisture sensitivity are required to produce a successful MJO simulation.

  1. LAN attack detection using Discrete Event Systems.

    PubMed

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairings static, modify the existing ARP, or require patching the operating systems of all hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks which does not require any extra constraint such as static IP-MAC pairings or changes to ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, which would render the same DES model for both cases; to create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic to the LAN. A DES detector is then built to determine, from observed ARP-related events, whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine/spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MITM), denial of service, etc. The scheme is successfully validated in a test bed. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
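
    The probe-and-compare idea at the heart of the detector can be sketched in a few lines of Python. This is a hedged illustration only: the event names, the send_probe hook, and the two-MACs-for-one-IP verdict are assumptions standing in for the paper's actual DES model and probing protocol.

      # Toy sketch of DES-style ARP spoof detection via active probing.
      # All names and the verdict logic are illustrative assumptions.
      from collections import defaultdict

      class ArpProbeDetector:
          def __init__(self, send_probe):
              self.send_probe = send_probe        # callback that probes an IP on the LAN
              self.pending = set()                # IPs with an outstanding probe
              self.seen = defaultdict(set)        # IP -> set of MACs observed in replies

          def on_arp_reply(self, ip, mac):
              # Entry event: any IP-MAC binding triggers a verification probe.
              self.seen[ip].add(mac)
              if ip not in self.pending:
                  self.pending.add(ip)
                  self.send_probe(ip)             # the genuine owner will answer too

          def on_probe_response(self, ip, mac):
              # The observed event sequence decides the detector state.
              self.seen[ip].add(mac)
              if len(self.seen[ip]) > 1:
                  return f"SPOOFED: {ip} claimed by {sorted(self.seen[ip])}"
              return f"NORMAL: {ip} -> {mac}"

    Driving on_probe_response with two different MACs for the same IP flags spoofing, mirroring the distinct event sequences that active probing is designed to create.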

  2. Simulating spontaneous aseismic and seismic slip events on evolving faults

    NASA Astrophysics Data System (ADS)

    Herrendörfer, Robert; van Dinther, Ylona; Pranger, Casper; Gerya, Taras

    2017-04-01

    Plate motion along tectonic boundaries is accommodated by different slip modes: steady creep, seismic slip, and slow slip transients. Owing to mostly indirect observations and the difficulty of scaling laboratory results to nature, it remains enigmatic which fault conditions favour certain slip modes. Therefore, we are developing a numerical modelling approach capable of simulating different slip modes together with long-term fault evolution in a large-scale tectonic setting. We extend the 2D, continuum mechanics-based, visco-elasto-plastic thermo-mechanical model that was designed to simulate slip transients in large-scale geodynamic simulations (van Dinther et al., JGR, 2013). We improve the numerical approach to treat the non-linear problem of plasticity accurately (see also EGU 2017 abstract by Pranger et al.). To resolve a wide slip-rate spectrum on evolving faults, we develop an invariant reformulation of the conventional rate-and-state dependent friction (RSF) and adapt the time step (Lapusta et al., JGR, 2000). A crucial part of this development is a conceptual ductile fault zone model that relates slip rates along discrete planes to the effective macroscopic plastic strain rates in the continuum. We first test our implementation in a simple 2D setup with a single fault zone that has a predefined initial thickness. Results show that, for steady creep and very slow slip transients, deformation localizes to a bell-shaped strain-rate profile across the fault zone, which suggests that a length scale across the fault zone may exist. This continuum length scale would overcome the common mesh-dependency in plasticity simulations and calls into question the conventional treatment of aseismic slip on infinitely thin fault zones. We test the introduction of a diffusion term (similar to the damage description in Lyakhovsky et al., JMPS, 2011) into the state evolution equation and its effect on (de-)localization during faster slip events. We compare

  3. Simulated cold events in the northern North Atlantic during the last millennium

    NASA Astrophysics Data System (ADS)

    Moreno-Chamarro, Eduardo; Zanchettin, Davide; Lohmann, Katja; Jungclaus, Johann

    2014-05-01

    Paleoceanographic data show large inter-decadal cold excursions in sea-surface temperatures (SSTs) in the western subpolar gyre region and north of Iceland throughout the last millennium. A series of such events could have contributed to the demise of the Norse settlements on Greenland during the 13th to the 15th century through the associated deteriorating environmental conditions in the region. However, the spatial extent, attribution, and mechanism(s) of these cold events are not known. In this contribution, we use climate model simulations to clarify the role of the ocean and of coupled ocean-atmosphere dynamics in triggering these cold events, and to assess whether they can be explained by internal climate variability alone. Specifically, we investigate North Atlantic-Arctic climate variability in a 1000-year control run describing an unperturbed pre-industrial climate, and in a 3-member ensemble of full-forcing transient simulations of the last millennium. Simulations are performed with the Max Planck Institute Earth System Model for paleo-applications. In the control and transient simulations, we identified cold events of similar amplitude and duration to those in the reconstructed data. Spatial patterns and temporal evolutions of simulated cold events are similar in both simulation types. In the transient runs, furthermore, they do not robustly coincide with periods of strong external forcing (e.g., major volcanic eruptions). We therefore conclude that such events can emerge from internally generated regional climate variability alone. Local ocean-atmosphere coupled processes in the North Atlantic subpolar gyre region appear to be a key part of the mechanism of the simulated cold events. In particular, they are typically associated with the onset of prolonged positive sea-level pressure anomalies over the North Atlantic and an associated weaker and south-eastward displaced subpolar gyre. The salt transport reduction by the Irminger Current together with an intensification of the

  4. Importance of Model Simulations in Cassini In-Flight Mission Events

    NASA Technical Reports Server (NTRS)

    Brown, Jay; Wang, Eric; Hernandez, Juan; Lee, Allan Y.

    2009-01-01

    Simulation environments have been an integral part of Cassini's heritage. From the time of flight software development and testing to the beginning of the spacecraft's extended mission operations, both software simulation (softsim) and hardware-in-the-loop testbeds have played vital roles in verifying and validating key mission events. Satellite flybys and mission-critical events have established the need to model Titan's atmospheric torque, Enceladus' plume density, and other key parametric spacecraft environments. This paper focuses on enhancements to Cassini's Flight Software Development System (FSDS) and Integrated Test Laboratory (ITL) to model key event attributes which establish valid test environments and ensure safe spacecraft operability. Comparisons between simulated and in-flight data are presented which substantiate model validity.

  5. Characteristics of rainfall events in regional climate model simulations for the Czech Republic

    NASA Astrophysics Data System (ADS)

    Svoboda, Vojtěch; Hanel, Martin; Máca, Petr; Kyselý, Jan

    2017-02-01

    Characteristics of rainfall events in an ensemble of 23 regional climate model (RCM) simulations are evaluated against observed data in the Czech Republic for the period 1981-2000. Individual rainfall events are identified using the concept of minimum inter-event time (MIT), and only heavy events (the 15% of events with the largest event depths) during the warm season (May-September) are considered. Inasmuch as an RCM grid box represents a spatial average, the effects of areal averaging of rainfall data on event characteristics are investigated using the observed data. Rainfall events from the RCM simulations are then compared to those from the at-site and area-average observations. The simulated number of heavy events and the seasonal total precipitation due to heavy events are on average represented relatively well, despite higher spatial variation compared to observations. RCM-simulated event depths are comparable to the area-average observations, while event durations are overestimated and other characteristics related to rainfall intensity are significantly underestimated. The differences between RCM-simulated and at-site observed rainfall event characteristics are in general dominated by the biases of the climate models rather than by the areal-averaging effect. Most of the rainfall event characteristics in the majority of the RCM simulations show an altitude-dependence pattern similar to that in the observed data. The number of heavy events and the seasonal total precipitation due to heavy events increase with altitude, and this dependence is captured better by the RCM simulations with higher spatial resolution.

  6. Numerical simulations of fast transient events in the sun.

    NASA Astrophysics Data System (ADS)

    Casillas-Perez, G. A.; Jeyakumar, S.; Perez-Enriquez, R.

    2016-12-01

    Fast transients are dynamical phenomena that show up as high brightness-temperature increments over durations of less than a second. In the Sun these events have been observed in the radio band in various forms, such as radio spikes, often accompanying other phenomena like normal radio bursts and solar flares. The study of solar fast radio transients is important for understanding the physical processes occurring in the solar corona and their possible relation to other solar phenomena in which large amounts of energy are released. In this work, we report a code developed to study the evolution of an electron beam pulse injected into the solar corona. We show the tests used to validate the code and some results obtained from the numerical simulations carried out with it.

  7. Adaptive importance sampling Monte Carlo simulation of rare transition events.

    PubMed

    de Koning, Maurice; Cai, Wei; Sadigh, Babak; Oppelstrup, Tomas; Kalos, Malvin H; Bulatov, Vasily V

    2005-02-15

    We develop a general theoretical framework for the recently proposed importance sampling method for enhancing the efficiency of rare-event simulations [W. Cai, M. H. Kalos, M. de Koning, and V. V. Bulatov, Phys. Rev. E 66, 046703 (2002)], and discuss practical aspects of its application. We define the success/fail ensemble of all possible successful and failed transition paths of any duration and demonstrate that in this formulation the rare-event problem can be interpreted as a "hit-or-miss" Monte Carlo quadrature calculation of a path integral. The fact that the integrand contributes significantly only for a very tiny fraction of all possible paths then naturally leads to a "standard" importance sampling approach to Monte Carlo (MC) quadrature and the existence of an optimal importance function. In addition to showing that the approach is general and expected to be applicable beyond the realm of Markovian path simulations, for which the method was originally proposed, the formulation reveals a conceptual analogy with the variational MC (VMC) method. The search for the optimal importance function in the former is analogous to finding the ground-state wave function in the latter. In two model problems we discuss practical aspects of finding a suitable approximation for the optimal importance function. For this purpose we follow the strategy that is typically adopted in VMC calculations: the selection of a trial functional form for the optimal importance function, followed by the optimization of its adjustable parameters. The latter is accomplished by means of an adaptive optimization procedure based on a combination of steepest-descent and genetic algorithms.
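
    As a concrete illustration of the "hit-or-miss" quadrature idea, the sketch below estimates a rare-event probability by drawing from a tilted importance density and weighting each hit by the likelihood ratio. The toy target, P(X > 4) for a standard normal, and the exponential tilt are textbook stand-ins for the paper's path-integral formulation, not a reproduction of it.

      # Importance-sampled "hit-or-miss" estimate of P(X > a), X ~ N(0, 1).
      import math, random

      def rare_event_prob(a=4.0, n=100_000):
          mu = a                                   # tilt the sampling density to N(a, 1)
          total = 0.0
          for _ in range(n):
              x = random.gauss(mu, 1.0)            # draw from the importance density
              if x > a:                            # a "hit" (successful path)
                  total += math.exp(-mu * x + 0.5 * mu * mu)  # likelihood ratio p(x)/q(x)
          return total / n

      print(rare_event_prob())                     # about 3.2e-5; naive sampling would
                                                   # need ~1e7 draws to see a few hits

    The paper goes one step further: instead of a fixed tilt, the importance function itself is parameterized and optimized adaptively, much as a trial wave function is optimized in variational Monte Carlo.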

  8. WCEDS: A waveform correlation event detection system

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.; Astiz, L.; Shearer, P.M.

    1995-08-01

    We have developed a working prototype of a grid-based global event detection system based on waveform correlation. The algorithm comes from a long-period detector, but we have recast it in a full matrix formulation which can reduce the number of multiplications needed by better than two orders of magnitude for realistic monitoring scenarios. The reduction is made possible by eliminating redundant multiplications in the original formulation. All unique correlations for a given origin time are stored in a correlation matrix (C), which is formed by a full matrix product of a Master Image matrix (M) and a data matrix (D). The detector value at each grid point is calculated by following a different summation path through the correlation matrix. Master Images can be derived either empirically or synthetically. Our testing has used synthetic Master Images because their influence on the detector is easier to understand. We tested the system using the matrix formulation with continuous data from the IRIS (Incorporated Research Institutions for Seismology) broadband global network to monitor a 2-degree evenly spaced surface grid with a time discretization of 1 sps; we successfully detected the largest event in a two-hour segment from October 1993. The output at the correct grid point was at least 33% larger than at adjacent grid points, and the output at the correct grid point at the correct origin time was more than 500% larger than the output at the same grid point immediately before or after. Analysis of the C matrix for the origin time of the event demonstrates that there are many significant "false" correlations of observed phases with incorrect predicted phases. These false correlations dull the sensitivity of the detector and so must be dealt with if our system is to attain detection thresholds consistent with a Comprehensive Test Ban Treaty (CTBT).
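
    The matrix formulation reduces to a few lines of linear algebra: one product yields every unique correlation, and each grid point then sums its own path of entries through C. The shapes and the path-index layout in this sketch are illustrative assumptions, not the prototype's actual data layout.

      # Schematic of the matrix formulation: C = M @ D holds all unique
      # correlations for one origin time; a grid point's detector value is
      # the sum along its own (template, channel) path through C.
      import numpy as np

      n_templates, n_lags, n_channels = 50, 2000, 30
      M = np.random.randn(n_templates, n_lags)     # Master Image waveforms
      D = np.random.randn(n_lags, n_channels)      # continuous data, one column per channel

      C = M @ D                                    # every unique correlation at once

      def detector_value(path):
          # path: (template, channel) pairs predicted by the grid point's travel times
          rows, cols = zip(*path)
          return C[list(rows), list(cols)].sum()

      print(detector_value([(0, 0), (3, 7), (12, 21)]))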

  9. Cellular Dynamic Simulator: An Event Driven Molecular Simulation Environment for Cellular Physiology

    PubMed Central

    Byrne, Michael J.; Waxham, M. Neal; Kubota, Yoshihisa

    2010-01-01

    In this paper, we present the Cellular Dynamic Simulator (CDS) for simulating diffusion and chemical reactions within crowded molecular environments. CDS is based on a novel event-driven algorithm specifically designed for precise calculation of the timing of collisions, reactions, and other events for each individual molecule in the environment. Generic mesh-based compartments allow the creation/importation of very simple or detailed cellular structures that exist in a 3D environment. Multiple levels of compartments and static obstacles can be used to create a dense environment to mimic cellular boundaries and the intracellular space. The CDS algorithm takes into account volume exclusion and molecular crowding that may impact signaling cascades in small sub-cellular compartments such as dendritic spines. With the CDS, we can simulate simple enzyme reactions, aggregation, and channel transport, as well as highly complicated chemical reaction networks of both freely diffusing and membrane-bound multi-protein complexes. Components of the CDS are generally defined such that the simulator can be applied to a wide range of environments in terms of scale and level of detail. Through an initialization GUI, a simple simulation environment can be created and populated within minutes, yet is powerful enough to design complex 3D cellular architecture. The initialization tool allows visual confirmation of the environment construction prior to execution by the simulator. This paper describes the CDS algorithm and design implementation, provides an overview of the types of features available, and highlights the utility of those features in demonstrations. PMID:20361275
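
    The scheduling core that such an event-driven simulator needs, a time-ordered priority queue with invalidation of superseded events, can be sketched briefly. The physics is stubbed out below with an exponential waiting time; only the queue mechanics are meant to be representative, and all names are illustrative.

      # Bare-bones event-driven core: pop the earliest event, process it,
      # schedule the molecule's next event; stale entries are skipped via a
      # per-molecule version number.
      import heapq, itertools, random

      def next_event_time(now):
          return now + random.expovariate(1.0)     # stub for a real collision-time solver

      queue, counter, version = [], itertools.count(), {}

      def schedule(molecule, t):
          version[molecule] = version.get(molecule, 0) + 1
          heapq.heappush(queue, (t, next(counter), molecule, version[molecule]))

      for m in range(3):
          schedule(m, next_event_time(0.0))

      t = 0.0
      while queue and t < 10.0:
          t, _, m, v = heapq.heappop(queue)
          if v != version[m]:
              continue                             # superseded event: discard
          # ...update molecule m, resolve its collision or reaction here...
          schedule(m, next_event_time(t))          # enqueue its next event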

  11. Production of Nitrogen Oxides by Laboratory Simulated Transient Luminous Events

    NASA Astrophysics Data System (ADS)

    Peterson, H.; Bailey, M.; Hallett, J.; Beasley, W.

    2007-12-01

    Restoration of the polar stratospheric ozone layer has occurred at rates below those originally expected following reductions in chlorofluorocarbon (CFC) usage, so additional reactions affecting ozone depletion must now also be considered. This research examines nitrogen oxides (NOx) produced in the middle atmosphere by transient luminous events (TLEs), with NOx production in this layer contributing to the loss of stratospheric ozone. In particular, NOx produced by sprites in the mesosphere would be transported to the polar stratosphere via the global meridional circulation and downward diffusion. A pressure-controlled vacuum chamber was used to simulate middle-atmosphere pressures, while a power supply and in-chamber electrodes were used to simulate TLEs in the pressure-controlled environment. Chemiluminescence NOx analyzers were used to sample NOx produced by the chamber discharges: originally a Monitor Labs Model 8440E, later a Thermo Environment Model 42. Total NOx production for each discharge, as well as NOx per ampere of current and NOx per joule of discharge energy, were plotted. Absolute NOx production was greatest for discharge environments with upper-tropospheric pressures (100-380 torr), while NOx/J was greatest for discharge environments with stratospheric pressures (around 10 torr). The different production efficiencies in NOx/J as a function of pressure pointed to three different production regimes, each with its own reaction mechanisms: one for tropospheric pressures, one for stratospheric pressures, and one for upper-stratospheric to mesospheric pressures (no greater than 1 torr).

  12. Improved transition path sampling methods for simulation of rare events.

    PubMed

    Chopra, Manan; Malshe, Rohit; Reddy, Allam S; de Pablo, J J

    2008-04-14

    The free energy surfaces of a wide variety of systems encountered in physics, chemistry, and biology are characterized by the existence of deep minima separated by numerous barriers. One of the central aims of recent research in computational chemistry and physics has been to determine how transitions occur between deep local minima on rugged free energy landscapes, and transition path sampling (TPS) Monte Carlo methods have emerged as an effective means for numerical investigation of such transitions. The shortcomings of TPS-like approaches generally stem from their high computational demands. Two new algorithms are presented in this work that improve the efficiency of TPS simulations. The first algorithm uses biased shooting moves to render the sampling of reactive trajectories more efficient. The second algorithm is shown to substantially improve the accuracy of the transition state ensemble by introducing a subset of local transition path simulations in the transition state. The system considered in this work consists of a two-dimensional rough energy surface that is representative of numerous systems encountered in applications. Taken together, these algorithms provide gains in efficiency of over two orders of magnitude when compared to traditional TPS simulations.
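
    For orientation, the elementary move that both algorithms build on can be written compactly for a toy system: a one-way shooting move on a 1-D double well under overdamped Langevin dynamics. The potential, parameters, and uniform shooting-point selection below are deliberate simplifications; the paper's first algorithm replaces the uniform choice with a biased one.

      # Toy one-way shooting move for transition path sampling.
      import math, random

      beta, dt, L = 2.0, 1e-3, 2000                # inverse temperature, time step, path length
      force = lambda x: -4 * x * (x * x - 1)       # from V(x) = (x^2 - 1)^2
      in_A = lambda x: x < -0.8                    # reactant region
      in_B = lambda x: x > 0.8                     # product region

      def step(x):
          return x + force(x) * dt + math.sqrt(2 * dt / beta) * random.gauss(0, 1)

      def shoot(path):
          # Keep the path up to a random slice, regrow the rest with fresh noise.
          k = random.randrange(1, L)
          new = path[:k]
          x = new[-1]
          for _ in range(L - k):
              x = step(x)
              new.append(x)
          # Accept only if the trial path is still reactive (connects A to B).
          return new if in_A(new[0]) and in_B(new[-1]) else path

    Starting from any reactive path, repeated shoot() calls sample the transition path ensemble; efficiency gains of the kind reported come from biasing which slice is chosen and how the trajectory is regrown.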

  13. Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations

    NASA Astrophysics Data System (ADS)

    Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.

    2013-12-01

    There is growing concern over possible severe societal consequences of adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamic (MHD) models have been at the forefront of this transition and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge, even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a ground footprint comparable to (or larger than) that of the Carrington superstorm. Results are presented for an initial simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated ground-induced geoelectric field to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model's consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in Earth conductors such as power transmission grids.

  14. Features, Events, and Processes: System Level

    SciTech Connect

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  15. DIGITAL SIMULATION AND SYSTEM THEORY.

    DTIC Science & Technology

    SYSTEMS ENGINEERING, *MATHEMATICAL MODELS, SIMULATION, DIGITAL COMPUTERS, COMPUTER PROGRAMMING, PROGRAMMING LANGUAGES, COMPUTER LOGIC, STOCHASTIC PROCESSES, CALCULUS OF VARIATIONS, STATISTICAL ANALYSIS, OPERATIONS RESEARCH.

  16. Development of a Blast Event Simulation Process for Multi-Scale Modeling of Composite Armor for Light Weight Vehicles (PREPRINT)

    DTIC Science & Technology

    2011-03-15

    offset from the mine [Westine et al., 1985]. Both empirical models were integrated with the LS-DYNA commercial code. The CTH hydrocode [McGlaun et... properties at the micro-level. A Blast Event Simulation system (BEST) that facilitates the easy use of LS-DYNA or ABAQUS for conducting a complete sequence... and ABAQUS solvers for blast event simulations instead of CTH is that LS-DYNA and ABAQUS are commercially readily accessible codes, have...

  17. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    SciTech Connect

    Jaros, W.

    2005-08-30

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS, focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, "Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation" (BSC 2005 [DIRS 174995]), "Clad Degradation--FEPs Screening Arguments" (BSC 2004 [DIRS 170019]), and "Waste-Form Features, Events, and Processes" (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes have resulted from a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project (YMP) administrative procedures as they

  18. Simulation of a continuous lignite excavation system

    SciTech Connect

    Michalakopoulos, T.N.; Arvaniti, S.E.; Panagiotou, G.N.

    2005-07-01

    A discrete-event simulation model using the GPSS/H simulation language has been developed for an excavation system at a multi-level terrace mine. The continuous excavation system consists of five bucket-wheel excavators and a network of 22 km of belt conveyors. Ways of dealing with the continuous material flow and frequent changes of material type are considered. The principal model output variables are the production and arrival rates of mineral and waste at the transfer point. Animation and comparison with previous production data have been used to validate the model. 14 refs., 6 figs., 1 tab.

  19. Event communication in a regional disease surveillance system.

    PubMed

    Loschen, Wayne; Coberly, Jacqueline; Sniegoski, Carol; Holtry, Rekha; Sikes, Marvin; Happel Lewis, Sheryl

    2007-10-11

    When real-time disease surveillance is practiced in neighboring states within a region, public health users may benefit from easily sharing their concerns and findings regarding potential health threats. To better understand the need for this capability, an event communications component (ECC) was added to the National Capital Region Disease Surveillance System, an operational biosurveillance system employed in the District of Columbia and in surrounding Maryland and Virginia counties. Through usage analysis and user survey methods, we assessed the value of the enhanced system in daily operational use and during two simulated exercises. Results suggest that the component has utility for regular users of the system and point to several refinements for future implementations.

  20. Teaching sexual history-taking skills using the Sexual Events Classification System.

    PubMed

    Fidler, Donald C; Petri, Justin Daniel; Chapman, Mark

    2010-01-01

    The authors review the literature on educational programs for teaching sexual history-taking skills and describe novel techniques for teaching these skills. Psychiatric residents enrolled in a brief sexual history-taking course that included instruction on the Sexual Events Classification System, feedback on residents' video-recorded interviews with simulated patients, discussion of videos that simulated bad interviews, and a competency scoring form used to score a video of a simulated interview. After the course, residents completed an anonymous survey to assess the usefulness of the experience. After the course, most residents felt more comfortable taking sexual histories. They described the Sexual Events Classification System and simulated interviews as practical methods for teaching sexual history-taking skills. The Sexual Events Classification System and simulated patient experiences may serve as a practical model for teaching sexual history-taking skills to general psychiatric residents.

  1. Design and implementation of a distributed Complex Event Processing system

    NASA Astrophysics Data System (ADS)

    Li, Yan; Shang, Yanlei

    2017-01-01

    Extracting valuable information from the massive event streams produced by sources such as sensors and bank transactions is of significant importance. Complex Event Processing (CEP), a method of detecting complex events in streams of simple events, provides a way to process such data quickly and efficiently in real time. However, a single-node CEP system cannot meet the demands of processing massive event streams from numerous event sources. Therefore, this article designs a distributed CEP system which combines Siddhi, a CEP engine, with Storm, a distributed real-time computation framework. This system constructs topologies automatically from the event streams and execution plans provided by users and processes the event streams in parallel. Compared with a single-node CEP system, the distributed system achieves better performance.
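
    To make the idea concrete, the sketch below implements the kind of windowed pattern operator a CEP engine such as Siddhi evaluates, and which Storm would distribute across worker nodes. The event schema, window length, and threshold are invented for illustration.

      # Minimal CEP-style pattern: emit a complex event when one user produces
      # 3 "login_failed" events within a 10-second sliding window.
      from collections import defaultdict, deque

      WINDOW, THRESHOLD = 10.0, 3
      recent = defaultdict(deque)                  # user -> timestamps of recent failures

      def on_event(ts, user, kind):
          if kind != "login_failed":
              return None
          q = recent[user]
          q.append(ts)
          while q and ts - q[0] > WINDOW:          # slide the time window
              q.popleft()
          if len(q) >= THRESHOLD:
              q.clear()
              return f"COMPLEX EVENT: repeated failures for {user} at t={ts}"
          return None

      stream = [(1.0, "bob", "login_failed"), (4.2, "bob", "login_failed"),
                (9.9, "bob", "login_failed"), (12.0, "alice", "login_ok")]
      for ev in stream:
          alert = on_event(*ev)
          if alert:
              print(alert)

    In the distributed setting, the engine partitions such operators by key (here, the user) so that each node matches patterns over its own slice of the stream.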

  2. Block Oriented Simulation System (BOSS)

    NASA Technical Reports Server (NTRS)

    Ratcliffe, Jaimie

    1988-01-01

    Computer simulation is assuming greater importance as a flexible and expedient approach to modeling system and subsystem behavior. Simulation has played a key role in the growth of complex, multiple access space communications such as those used by the space shuttle and the TRW-built Tracking and Data Relay Satellites (TDRS). A powerful new simulator for use in designing and modeling the communication system of NASA's planned Space Station is being developed. Progress to date on the Block (Diagram) Oriented Simulation System (BOSS) is described.

  3. Event-based simulation of neutron experiments: interference, entanglement and uncertainty relations

    NASA Astrophysics Data System (ADS)

    Michielsen, Kristel; De Raedt, Hans

    2014-04-01

    We discuss a discrete-event simulation approach which has been shown to give a unified cause-and-effect description of many quantum optics and single-neutron interferometry experiments. The event-based simulation algorithm does not require knowledge of the solution of a wave equation of the whole system, yet reproduces the corresponding statistical distributions by generating detection events one by one. It is shown that single-particle interference and entanglement, two important quantum phenomena, emerge via information exchange between individual particles and devices such as beam splitters, polarizers, and detectors. We demonstrate this by reproducing the results of several single-neutron interferometry experiments, including one that demonstrates interference and one that demonstrates the violation of a Bell-type inequality. We also present event-based simulation results of a single-neutron experiment designed to test the validity of Ozawa's universally valid error-disturbance relation, an uncertainty relation derived using the theory of general quantum measurements.
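
    The flavor of generating detection events one by one can be conveyed with a deliberately simplified toy: particles pass a polarizer and a detector clicks event by event, with the accumulated counts converging to Malus's law. Note the hedge: the actual event-based approach uses deterministic learning machines in the devices, not the per-event probability sampling used in this stand-in.

      # Toy event-by-event run: clicks accumulate one at a time, yet the
      # frequencies approach cos^2(theta) (Malus's law).
      import math, random

      def run(theta_deg, n=200_000):
          p = math.cos(math.radians(theta_deg)) ** 2
          clicks = sum(random.random() < p for _ in range(n))
          return clicks / n                        # detection frequency

      for angle in (0, 30, 60, 90):
          expected = math.cos(math.radians(angle)) ** 2
          print(angle, round(run(angle), 3), round(expected, 3))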

  4. Simulating The SSF Information System

    NASA Technical Reports Server (NTRS)

    Deshpande, Govind K.; Kleine, Henry; Younger, Joseph C.; Sanders, Felicia A.; Smith, Jeffrey L.; Aster, Robert W.; Olivieri, Jerry M.; Paul, Lori L.

    1993-01-01

    Freedom Operations Simulation Test (FROST) computer program simulates operation of SSF information system, tracking every packet of data from generation to destination, for both uplinks and downlinks. Collects various statistics concerning operation of system and provides reports of statistics at intervals specified by user. FROST also incorporates graphical-display capability to enhance interpretation of these statistics. Written in SIMSCRIPT II.5.

  6. Predicting Liver Transplant Capacity Using Discrete Event Simulation

    PubMed Central

    Diaz, Hector Toro; Mayorga, Maria; Barritt, A. Sidney; Orman, Eric S.; Wheeler, Stephanie B.

    2014-01-01

    The number of liver transplants (LTs) performed in the US increased until 2006 but has since declined despite an ongoing increase in demand. This decline may be due in part to decreased donor liver quality and increasing discard of poor-quality livers. We constructed a Discrete Event Simulation (DES) model informed by current donor characteristics to predict future LT trends through the year 2030. The data source for our model is the United Network for Organ Sharing database, which contains patient-level information on all organ transplants performed in the US. Previous analysis showed that liver discard is increasing and that discarded organs are more often from donors who are older or obese, have diabetes, or donated after cardiac death. Given that the prevalence of these factors is increasing, the DES model quantifies the reduction in the number of LTs performed through 2030. In addition, the model estimates the total number of future donors needed to maintain the current volume of LTs, and the effect of a hypothetical scenario of improved reperfusion technology. We also forecast the number of patients on the waiting list and compare this to the estimated number of LTs to illustrate the impact that decreased LTs will have on patients needing transplants. By altering assumptions about the future donor pool, this model can be used to develop policy interventions to prevent a further decline in this life-saving therapy. To our knowledge, there are no similar predictive models of future LT use based on epidemiologic trends. PMID:25391681

  7. Sediment transport in grassed swales during simulated runoff events.

    PubMed

    Bäckström, M

    2002-01-01

    Particle trapping in nine different grassed swales was measured successfully with a standardised runoff event simulation procedure. The percentage of total suspended solids removed ranged from 79 to 98%. It was found that sedimentation processes, rather than grass filtration, governed the overall particle trapping efficiency. The highest particle trapping efficiency was observed in the field swales with dense, fully developed turf. A high infiltration rate was beneficial for particle trapping, and an increased swale length made it possible for smaller particles to be captured. A densely vegetated, ten-metre-long swale receiving a stormwater flow of 1.0 litres per second may capture a majority of the waterborne particles with settling velocities larger than 0.1 metres per hour. A simple model of particle trapping efficiency in grassed swales was developed and tested. It was found that mean swale residence time could be used as a design parameter for particle removal in grassed swales. The suggested exponential relationship between mean swale residence time and the particle settling velocity associated with a certain trapping efficiency is so far only valid for a limited range of swale designs and residence times.
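
    Purely as a hedged illustration of the kind of exponential residence-time relationship suggested here, a first-order settling model can be written as follows; the functional form and the depth parameter h are assumptions for illustration, not the paper's fitted model.

      # Hypothetical first-order settling model: fraction trapped as a function
      # of settling velocity v_s (m/s), mean residence time T (s), depth h (m).
      import math

      def trapping_efficiency(v_s, T, h=0.05):
          return 1.0 - math.exp(-v_s * T / h)

      # A particle settling at 0.1 m/h in a swale with a 6-minute residence time:
      print(trapping_efficiency(v_s=0.1 / 3600, T=360))   # about 0.18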

  8. Discrete Event Supervisory Control Applied to Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Shah, Neerav

    2005-01-01

    The theory of discrete event supervisory (DES) control was applied to the optimal control of a twin-engine aircraft propulsion system and demonstrated in simulation. The supervisory control, which is implemented as a finite-state automaton, oversees the behavior of a system and manages it in such a way that it maximizes a performance criterion, similar to a traditional optimal control problem. DES controllers can be nested such that a high-level controller supervises multiple lower-level controllers. This structure can be expanded to control huge, complex systems, providing optimal performance and increasing autonomy with each additional level. The DES control strategy for propulsion systems was validated using a distributed testbed consisting of multiple computers, each representing a module of the overall propulsion system, to simulate real-time hardware-in-the-loop testing. In the first experiment, DES control was applied to the operation of a nonlinear simulation of a turbofan engine (running in closed loop using its own feedback controller) to minimize engine structural damage caused by a combination of thermal and structural loads. This enables increased on-wing time for the engine through better management of engine-component life usage. Thus, the engine-level DES acts as a life-extending controller through its interaction with and manipulation of the engine's operation.
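
    The supervisor itself is just a finite-state automaton that enables or disables controllable events according to the current state and a performance criterion. The sketch below shows that skeleton; the states, events, and damage threshold are illustrative assumptions, not the NASA design.

      # Skeleton of a DES supervisor: a transition table plus a rule that
      # disables transitions into a damaging state once life usage is high.
      ALLOWED = {
          ("idle", "throttle_up"): "cruise",
          ("cruise", "throttle_up"): "high_load",
          ("cruise", "throttle_down"): "idle",
          ("high_load", "throttle_down"): "cruise",
      }

      def supervise(state, event, damage):
          nxt = ALLOWED.get((state, event))
          if nxt is None:
              return state, False                  # undefined event: ignore
          if nxt == "high_load" and damage > 0.8:
              return state, False                  # supervisor disables this event
          return nxt, True

      state, ok = supervise("cruise", "throttle_up", damage=0.85)
      print(state, ok)                             # ('cruise', False): event blocked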

  9. Validation of a DICE Simulation Against a Discrete Event Simulation Implemented Entirely in Code.

    PubMed

    Möller, Jörgen; Davis, Sarah; Stevenson, Matt; Caro, J Jaime

    2017-07-01

    Modeling is an essential tool for health technology assessment, and various techniques for conceptualizing and implementing such models have been described. Recently, a new method has been proposed, the discretely integrated condition event (DICE) simulation, which enables frequently employed approaches to be specified using a common, simple structure that can be entirely contained and executed within widely available spreadsheet software. To assess whether a DICE simulation provides results equivalent to those of an existing discrete event simulation, a comparison was undertaken. A model of osteoporosis and its management, programmed entirely in Visual Basic for Applications and made public by the National Institute for Health and Care Excellence (NICE) Decision Support Unit, was downloaded and used to guide construction of its DICE version in Microsoft Excel®. The DICE model was then run using the same inputs and settings, and the results were compared. The DICE version produced results that are nearly identical to the original ones, with differences that would not affect the decision direction of the incremental cost-effectiveness ratios (<1% discrepancy), despite the stochastic nature of the models. The main limitation of the simple DICE version is its slow execution speed. DICE simulation did not alter the results and, thus, should provide a valid way to design and implement decision-analytic models without requiring specialized software or custom programming. Additional efforts need to be made to speed up execution.
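
    The DICE structure is simple enough to sketch outside a spreadsheet: a model is just a table of conditions (values that persist between events) and a time-ordered list of events that update them and may spawn further events. All disease-specific numbers below are invented placeholders, not values from the NICE model.

      # Minimal DICE-style loop: conditions accrue between events; events
      # change conditions and may schedule new events.
      import heapq

      conditions = {"age": 70.0, "alive": True, "cost": 0.0}
      events = [(2.0, "fracture"), (5.0, "death")]          # (time in years, name)
      heapq.heapify(events)

      t = 0.0
      while events and conditions["alive"]:
          t_next, name = heapq.heappop(events)
          conditions["cost"] += 100.0 * (t_next - t)        # accrual between events
          conditions["age"] += t_next - t
          t = t_next
          if name == "fracture":
              conditions["cost"] += 5000.0
              heapq.heappush(events, (t + 1.0, "follow_up"))  # events can spawn events
          elif name == "death":
              conditions["alive"] = False

      print(conditions)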

  10. A Process Improvement Study on a Military System of Clinics to Manage Patient Demand and Resource Utilization Using Discrete-Event Simulation, Sensitivity Analysis, and Cost-Benefit Analysis

    DTIC Science & Technology

    2015-03-12

    military installation. Mild medical incidents can range from flu/cold incidents that do not require hospital care to food poisoning at a local restaurant ...

  11. Sensitivity of a Simulated Derecho Event to Model Initial Conditions

    NASA Astrophysics Data System (ADS)

    Wang, Wei

    2014-05-01

    Since 2003, the MMM division at NCAR has been experimenting with cloud-permitting-scale weather forecasting using the Weather Research and Forecasting (WRF) model. Over the years, we have tested different model physics and tried different initial and boundary conditions. Not surprisingly, we found that the model's forecasts are more sensitive to the initial conditions than to model physics. In the 2012 real-time experiment, WRF-DART (Data Assimilation Research Testbed) at 15 km was employed to produce initial conditions for twice-daily forecasts at 3 km. On June 29, this forecast system captured one of the most destructive derecho events on record. In this presentation, we will examine forecast sensitivity to different model initial conditions and try to understand the important features that may contribute to the success of the forecast.

  12. Synthetic seismograms of Jan. 6, 2016 DPRK event calculated by the Earth Simulator

    NASA Astrophysics Data System (ADS)

    Tsuboi, S.; Matsumoto, H.; Rozhkov, M.; Stachnik, J.; Baker, B.

    2016-12-01

    We calculate broadband synthetic seismograms for the Jan. 6, 2016 DPRK event (Mw(USGS) 5.1) using the spectral-element method (Komatitsch & Tromp, 2001) on the Earth Simulator system at JAMSTEC. The simulations are performed on 8,100 processors, which require 2,025 nodes of the Earth Simulator. We use one chunk with an angular distance of 40 degrees to compute synthetic seismograms. On this number of nodes, a simulation of 10 minutes of wave propagation accurate at periods of 3.0 seconds and longer requires about 2 hours of CPU time. We use the CMT solution of Rozhkov et al. (2016) as the source model for this event. This source model has a 43% CLVD component, a 19% double-couple component, and a 38% isotropic component. The hypocenter depth of this solution is 1.4 km, but we put the hypocenter at the surface for this computation. Comparisons of the synthetic waveforms with the observations at station Inchon in Korea (epicentral distance 4.2 degrees) show that the arrival times of the Pn and Pg waves match the observations well, which demonstrates that the crustal structure we have used for this computation models the actual structure well. The surface waves observed at this station are also modeled well in the synthetics, which shows that the CMT solution we have used correctly captures the source characteristics of this event. However, the amplitudes of the Pn and Pg waves in the synthetics are smaller than observed, which indicates that short-period energy is underrepresented in this computation. We are trying to increase the number of grid points in the mesh so that our synthetic seismograms can model much shorter periods. We will compute synthetic seismograms for previous events in this region and discuss the differences from the 2016 event.

  13. A Madden-Julian oscillation event realistically simulated by a global cloud-resolving model.

    PubMed

    Miura, Hiroaki; Satoh, Masaki; Nasuno, Tomoe; Noda, Akira T; Oouchi, Kazuyoshi

    2007-12-14

    A Madden-Julian Oscillation (MJO) is a massive weather event consisting of deep convection coupled with atmospheric circulation, moving slowly eastward over the Indian and Pacific Oceans. Despite its enormous influence on many weather and climate systems worldwide, it has proven very difficult to simulate an MJO because of assumptions about cumulus clouds in global meteorological models. Using a model that allows direct coupling of the atmospheric circulation and clouds, we successfully simulated the slow eastward migration of an MJO event. Topography, the zonal sea surface temperature gradient, and interplay between eastward- and westward-propagating signals controlled the timing of the eastward transition of the convective center. Our results demonstrate the potential for making month-long MJO predictions when global cloud-resolving models with realistic initial conditions are used.

  14. Application of Parallel Discrete Event Simulation to the Space Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jefferson, D.; Leek, J.

    2010-09-01

    In this paper we describe how and why we chose parallel discrete event simulation (PDES) as the paradigm for modeling the Space Surveillance Network (SSN) in our modeling framework, TESSA (Testbed Environment for Space Situational Awareness). DES is a simulation paradigm appropriate for systems dominated by discontinuous state changes at times that must be calculated dynamically. It is used primarily for complex man-made systems like telecommunications, vehicular traffic, computer networks, and economic models, although it is also useful for natural systems that are not described by equations, such as particle systems, population dynamics, epidemics, and combat models. It is much less well known than simple time-stepped simulation methods but has the great advantage of being time-scale independent, so that one can freely mix processes that operate at time scales spanning many orders of magnitude with no runtime performance penalty. In simulating the SSN we model in some detail: (a) the orbital dynamics of up to 10^5 objects, (b) their reflective properties, (c) the ground- and space-based sensor systems in the SSN, (d) the recognition of orbiting objects and determination of their orbits, (e) the cueing and scheduling of sensor observations, (f) the 3-D structure of satellites, and (g) the generation of collision debris. TESSA is thus a mixed continuous-discrete model. But because many different types of discrete objects are involved with such a wide variation in time scale (milliseconds for collisions, hours for orbital periods), it is suitably described using discrete events. The PDES paradigm is surprising and unusual. In any instantaneous runtime snapshot some parts may be far ahead in simulation time while others lag behind, yet the required causal relationships are always maintained and synchronized correctly, exactly as if the simulation were executed sequentially. The TESSA simulator is custom-built, conservatively synchronized, and designed to scale to
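
    The sequential core that PDES parallelizes is compact: events are processed strictly in time-stamp order, and handling one event may schedule others arbitrarily far in the future. A conservative engine like TESSA's runs one such loop per node and releases an event only when no earlier message can still arrive. The entities and handlers below are illustrative, not TESSA's.

      # Skeleton discrete-event loop: a time-ordered heap of (time, handler) pairs.
      import heapq, itertools

      queue, tiebreak = [], itertools.count()

      def schedule(t, handler, *args):
          heapq.heappush(queue, (t, next(tiebreak), handler, args))

      def sensor_pass(t, sat):
          print(f"t={t:8.1f}s  sensor observes satellite {sat}")
          schedule(t + 5400.0, sensor_pass, sat)   # next pass, one orbit later

      schedule(0.0, sensor_pass, "A")
      schedule(300.0, sensor_pass, "B")

      horizon = 20000.0
      while queue:
          now, _, handler, args = heapq.heappop(queue)
          if now > horizon:
              break
          handler(now, *args)                      # may schedule further events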

  15. On the nature of medial temporal lobe contributions to the constructive simulation of future events

    PubMed Central

    Schacter, Daniel L.; Addis, Donna Rose

    2009-01-01

    A rapidly growing number of studies indicate that imagining or simulating possible future events depends on much of the same neural machinery as does remembering past events. One especially striking finding is that the medial temporal lobe (MTL), which has long been linked to memory function, appears to be similarly engaged during future event simulation. This paper focuses on the role of two MTL regions—the hippocampus and parahippocampal cortex—in thinking about the future and building mental simulations. PMID:19528005

  16. Numerical simulation diagnostics of a flash flood event in Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Samman, Ahmad

    On 26 January 2011, a severe storm hit the city of Jeddah, the second largest city in the Kingdom of Saudi Arabia. The storm produced heavy rainfall and a flash flood in a short period of time, causing at least eleven fatalities and more than 114 injuries. Unfortunately, the observed rainfall data are limited to the weather station at King Abdul Aziz International Airport, north of the city, while the most extreme precipitation occurred over the southern part of the city. This observation was still useful for comparison with the simulation results, even though it does not reflect the severity of the event. The Regional Atmospheric Modeling System (RAMS), developed at Colorado State University, was used to study this storm. The RAMS simulations indicated that a quasi-stationary mesoscale convective system developed over the city of Jeddah, lasted for several hours, and was the source of the very large rainfall amount. The model computed a total rainfall of more than 110 mm in the southern part of the city, where the flash flood occurred; this precipitation estimate was confirmed by weather radar observations. While winter rainfall in Jeddah typically totals 50 to 100 mm, the rainfall from this single storm exceeded the climatological total annual rainfall. The simulation showed that warm sea surface temperatures, combined with high humidity in the lower atmosphere and a large amount of convective available potential energy (CAPE), provided a favorable environment for convection. It also showed the presence of a cyclonic system over the northern and eastern parts of the Mediterranean Sea and a subtropical anticyclone over northeastern Africa that contributed to cold-air advection into the Jeddah area. In addition, an anticyclone (blocking) centered over the eastern and southeastern parts of the Arabian Peninsula and the Arabian Sea produced a low-level jet over the southern

  17. Examining Passenger Flow Choke Points at Airports Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Brown, Jeremy R.; Madhavan, Poornima

    2011-01-01

    The movement of passengers through an airport quickly, safely, and efficiently is the main function of the various checkpoints (check-in, security, etc.) found in airports. Human error combined with other breakdowns in the complex system of the airport can disrupt passenger flow, leading to lengthy waiting times, missing luggage, and missed flights. In this paper we present a discrete event simulation model of passenger flow through an airport that provides a closer look into the possible reasons for breakdowns and their implications for passenger flow. The simulation is based on data collected at Norfolk International Airport (ORF). The primary goal of this simulation is to present ways to optimize the workforce to keep passenger flow smooth even during peak travel times, and to support emergency preparedness at ORF in case of adverse events. We ran three different scenarios: real world, increased check-in stations, and multiple waiting lines. Increasing the number of check-in stations increased waiting time and instantaneous utilization, while multiple waiting lines decreased both. The simulation was thus able to show how different changes affect passenger flow through the airport.
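
    The kind of scenario comparison described above can be sketched with a simple first-come-first-served multi-server queue: vary the number of check-in stations and measure the mean waiting time. This is a generic illustration, not the authors' ORF model; the arrival and service rates below are invented.

        import heapq, random

        def simulate_checkin(n_stations, n_passengers=20000, arrival_rate=1.5,
                             service_rate=0.4, seed=1):
            """FCFS multi-server queue: returns mean passenger waiting time.
            arrival_rate: passengers/min; service_rate: per station per min."""
            rng = random.Random(seed)
            free_at = [0.0] * n_stations       # when each station next becomes free
            heapq.heapify(free_at)
            t, total_wait = 0.0, 0.0
            for _ in range(n_passengers):
                t += rng.expovariate(arrival_rate)           # next arrival
                start = max(t, free_at[0])                   # earliest free station
                total_wait += start - t
                heapq.heapreplace(free_at, start + rng.expovariate(service_rate))
            return total_wait / n_passengers

        for c in (4, 5, 6):
            print(f"{c} stations: mean wait {simulate_checkin(c):6.2f} min")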

  18. Simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Tentner, A.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles) and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations in the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scalable to take advantage of emerging massively parallel processor (MPP) systems.
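
    The message-passing design described here, vehicles as autonomous processes exchanging messages with a Traffic Management Center, can be sketched as a toy synchronous round-based loop. This is not the framework's code; the class names, link names, and naive re-routing rule are invented for illustration.

        from collections import deque

        class Network:
            """Toy message fabric standing in for inter-process message exchange."""
            def __init__(self):
                self.nodes = {}
            def register(self, node):
                self.nodes[node.name] = node
                node.network = self
            def send(self, dest, msg):
                self.nodes[dest].inbox.append(msg)

        class Vehicle:
            def __init__(self, name, route):
                self.name, self.route, self.inbox = name, route, deque()
            def step(self):
                while self.inbox:
                    kind, data = self.inbox.popleft()
                    if kind == "advisory" and data["link"] in self.route:
                        self.route.remove(data["link"])      # naive re-route
                        print(f"{self.name}: avoiding congested {data['link']}")
                self.network.send("TMC", ("position", {"vehicle": self.name}))

        class TMC:
            def __init__(self):
                self.name, self.inbox = "TMC", deque()
            def step(self):
                probes = [m for m in self.inbox if m[0] == "position"]
                self.inbox.clear()
                print(f"TMC tracking {len(probes)} probe vehicle(s)")
                for v in ("car-1", "car-2"):                 # broadcast an advisory
                    self.network.send(v, ("advisory", {"link": "I-55N"}))

        net = Network()
        tmc = TMC()
        v1 = Vehicle("car-1", ["I-55N", "US-34"])
        v2 = Vehicle("car-2", ["I-80E"])
        for n in (tmc, v1, v2):
            net.register(n)
        for _ in range(2):                                   # two synchronous rounds
            v1.step(); v2.step(); tmc.step()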

  19. Analysis and Simulations of Space Radiation Induced Single Event Transients

    NASA Astrophysics Data System (ADS)

    Perez, Reinaldo

    2016-05-01

    Spacecraft electronics are affected by the space radiation environment. Among the different types of radiation effects that can affect spacecraft electronics are single event transients (SETs), and the space environment is responsible for many of the single event transients which can upset the performance of spacecraft avionics hardware. In this paper we first explore the origins of single event transients, then the modeling of a single event transient in digital and analog circuits. The paper also addresses the crosstalk that can develop among digital circuits in the presence of an SET, and ends with a brief discussion of SET hardening. The goal of the paper is to provide methodologies for assessing single event transients and their effects so that spacecraft avionics engineers can develop either hardware or software countermeasures in their designs.
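
    One common way to model an SET at circuit level, and a plausible reading of the modeling this abstract refers to (the paper's exact formulation is not given here), is a double-exponential current pulse injected at the struck node. The sketch below implements that standard approximation; the collected charge and time constants are illustrative values only.

        import math

        def set_current(t, q_coll=0.5e-12, tau_alpha=2.0e-10, tau_beta=5.0e-11):
            """Double-exponential single-event-transient current pulse (amps).
            q_coll: collected charge (C); tau_alpha/tau_beta: time constants (s).
            All parameter values here are illustrative assumptions."""
            if t < 0:
                return 0.0
            return (q_coll / (tau_alpha - tau_beta)) * (math.exp(-t / tau_alpha)
                                                        - math.exp(-t / tau_beta))

        # Peak current of a 0.5 pC strike, sampled on a 10 ps grid:
        ts = [i * 1e-11 for i in range(100)]
        print(f"peak SET current ~ {max(set_current(t) for t in ts) * 1e3:.2f} mA")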

  20. Using WIRED to study Simulated Linear Collider Detector Events

    SciTech Connect

    George, A

    2004-02-05

    The purpose of this project is to enhance the LCD WIRED event display. By extending the functionality of the display, physicists will be able to view events in more detail and interpret data faster. Shortcomings of WIRED can severely affect the way we understand events, but by bringing attention to specific attributes we open doors to new ideas. Events displayed inside the LCD have many different properties, which is why scientists need to be able to distinguish data using a variety of symbols and other graphics. This paper explains how events can be viewed differently using clustering and how results can be displayed with track finding. Different source codes extracted from HEP libraries will be analyzed and tested to see which display the needed information. Through these changes, certain aspects of WIRED will be recognized more readily, allowing better event displays that lead to better physics results.

  1. A Study on Discrete Event Simulation (DES) in a High-Level Architecture (HLA) Networked Simulation

    DTIC Science & Technology

    2010-12-01

  2. A Multiprocessor Operating System Simulator

    NASA Technical Reports Server (NTRS)

    Johnston, Gary M.; Campbell, Roy H.

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall semester of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the 'Choices' family of operating systems for loosely- and tightly-coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  3. A multiprocessor operating system simulator

    SciTech Connect

    Johnston, G.M.; Campbell, R.H. (Dept. of Computer Science)

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the Choices family of operating systems for loosely and tightly coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  4. Regional Climate Simulation of the Anomalous Events of 1998 using a Stretched-Grid GCM with Multiple Areas of Interest

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, M. S.; Takacs, L. L.; Govindaraju, R. C.; Atlas, Robert (Technical Monitor)

    2002-01-01

    The GEOS (Goddard Earth Observing System) stretched-grid (SG) GCM, developed and thoroughly tested over the last few years, is used to simulate the major anomalous regional climate events of 1998. The anomalous regional climate events are simulated simultaneously during a 13-month (November 1997 - December 1998) SG-GCM run by using the new SG design with multiple (four) areas of interest. The following areas/regions of interest (one in each global quadrant) are implemented: U.S./Northern Mexico, the El-Nino/Brazil area, India-China, and the Eastern Indian Ocean/Australia.

  5. Event-by-event simulation of nonclassical effects in two-photon interference experiments

    NASA Astrophysics Data System (ADS)

    Michielsen, Kristel; Jin, Fengping; Delina, M.; De Raedt, Hans

    2012-11-01

    A corpuscular simulation model for second-order intensity interference phenomena is discussed. It is shown that both the visibility V = 1/2 predicted for two-photon interference experiments with two independent sources and the visibility V = 1 predicted for two-photon interference experiments with a parametric down-conversion source can be explained in terms of a locally causal, modular, adaptive, corpuscular, classical (non-Hamiltonian) dynamical system. Hence, there is no need to invoke quantum theory to explain the so-called nonclassical effects in the interference of signal and idler photons in parametric down-conversion. A revision of the commonly accepted criterion of the nonclassical nature of light is needed.

  6. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.

  7. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    SciTech Connect

    na

    2005-05-30

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during a volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes the ERMYN conceptual model and mathematical model in detail. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The objective of this analysis was to develop the BDCFs for the

  8. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
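
    A minimal sketch of the kind of time-explicit trade study described, a development schedule whose tasks have uncertain durations and can be hit by delay events, might look like the following Monte Carlo over a serial task chain. It is not the DARPA F6 tool; the task list, disruption probability, and delay magnitude are assumptions.

        import random

        def simulate_schedule(tasks, p_disrupt=0.1, disrupt_delay=30.0,
                              runs=10000, seed=7):
            """Monte Carlo over a serial development schedule.
            tasks: list of (low, mode, high) duration estimates in days."""
            rng = random.Random(seed)
            finishes = []
            for _ in range(runs):
                t = 0.0
                for low, mode, high in tasks:
                    t += rng.triangular(low, high, mode)
                    if rng.random() < p_disrupt:     # e.g. a slipped delivery
                        t += disrupt_delay
                finishes.append(t)
            finishes.sort()
            return finishes[len(finishes) // 2], finishes[int(0.9 * len(finishes))]

        tasks = [(20, 30, 50), (40, 60, 100), (10, 15, 30)]  # design, build, test
        p50, p90 = simulate_schedule(tasks)
        print(f"median completion {p50:.0f} days, 90th percentile {p90:.0f} days")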

  9. Decision support system for managing oil spill events.

    PubMed

    Keramitsoglou, Iphigenia; Cartalis, Constantinos; Kassomenos, Pavlos

    2003-08-01

    The Mediterranean environment is exposed to various hazards, including oil spills, forest fires, and floods, making the development of a decision support system (DSS) for emergency management an objective of utmost importance. The present work presents a complete DSS for managing marine pollution events caused by oil spills. The system provides all the necessary tools for early detection of oil spills from satellite images, monitoring of their evolution, estimation of the accident's consequences, and support to the responsible public authorities during clean-up operations. The heart of the system is an image processing-geographic information system, together with assistant software tools that perform oil spill evolution simulation and all other necessary numerical calculations as well as cartographic and reporting tasks related to the management of a specific oil spill event. The cartographic information is derived from general maps representing detailed information on regional environmental and land-cover characteristics as well as economic activities of the application area. Early notification of the authorities with up-to-date, accurate information on the position and evolution of an oil spill, combined with detailed coastal maps, is of paramount importance for emergency assessment and effective clean-up operations that would prevent environmental damage. An application was developed for the Region of Crete, an area particularly vulnerable to oil spills due to its location, ecological characteristics, and local economic activities.

  10. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism

    PubMed Central

    van Beek, Johannes H. G. M.; Supandi, Farahaniza; Gavai, Anand K.; de Graaf, Albert A.; Binsl, Thomas W.; Hettling, Hannes

    2011-01-01

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work, the rest is converted to heat. We calculated heat transfer by conduction and blood flow inside the body, and heat transfer from the skin by radiation, convection and sweat evaporation, resulting in temperature changes in 25 body compartments. We simulated a mountain time trial to Alpe d'Huez during the Tour de France. To approach the time realized by Lance Armstrong in 2004, very high oxygen uptake must be sustained by the simulated cyclist. Temperature was predicted to reach 39°C in the brain, and 39.7°C in leg muscle. In addition to the macroscopic simulation, we analysed the buffering of bursts of high adenosine triphosphate hydrolysis by creatine kinase during cyclical muscle activity at the biochemical pathway level. To investigate the low oxygen to carbohydrate ratio for the brain, which takes up lactate during exercise, we calculated the flux distribution in cerebral energy metabolism. Computational modelling of the human body, describing heat exchange and energy metabolism, makes simulation of endurance sports events feasible. PMID:21969677
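
    The abstract's ~23 per cent efficiency figure implies a simple energy split that can be checked with back-of-envelope arithmetic. The sketch below does so for an assumed sustained mechanical output; the 350 W figure is illustrative, not from the paper, whose compartment model is far more detailed.

        # Back-of-envelope split implied by the ~23% efficiency figure.
        mech_power = 350.0                      # W, sustained cycling output (assumed)
        efficiency = 0.23                       # fraction of metabolic energy to work
        metabolic = mech_power / efficiency     # total metabolic rate, W
        heat = metabolic - mech_power           # rate of heat production, W
        print(f"metabolic ~{metabolic:.0f} W, heat load ~{heat:.0f} W")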

  11. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism.

    PubMed

    van Beek, Johannes H G M; Supandi, Farahaniza; Gavai, Anand K; de Graaf, Albert A; Binsl, Thomas W; Hettling, Hannes

    2011-11-13

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work, the rest is converted to heat. We calculated heat transfer by conduction and blood flow inside the body, and heat transfer from the skin by radiation, convection and sweat evaporation, resulting in temperature changes in 25 body compartments. We simulated a mountain time trial to Alpe d'Huez during the Tour de France. To approach the time realized by Lance Armstrong in 2004, very high oxygen uptake must be sustained by the simulated cyclist. Temperature was predicted to reach 39°C in the brain, and 39.7°C in leg muscle. In addition to the macroscopic simulation, we analysed the buffering of bursts of high adenosine triphosphate hydrolysis by creatine kinase during cyclical muscle activity at the biochemical pathway level. To investigate the low oxygen to carbohydrate ratio for the brain, which takes up lactate during exercise, we calculated the flux distribution in cerebral energy metabolism. Computational modelling of the human body, describing heat exchange and energy metabolism, makes simulation of endurance sports events feasible.

  12. Towards High Performance Discrete-Event Simulations of Smart Electric Grids

    SciTech Connect

    Perumalla, Kalyan S; Nutaro, James J; Yoginath, Srikanth B

    2011-01-01

    Future electric grid technology is envisioned on the notion of a smart grid in which responsive end-user devices play an integral part in the transmission and distribution control systems. Detailed simulation is often the primary choice in analyzing small network designs, and the only choice in analyzing large-scale electric network designs. Here, we identify and articulate the high-performance computing needs underlying high-resolution discrete event simulation of smart electric grid operation in large network scenarios such as the entire Eastern Interconnect. We focus on the simulator's most computationally intensive operation, namely, the dynamic numerical solution for the electric grid state, for both time integration and event detection. We explore solution approaches using general-purpose dense and sparse solvers, and propose a scalable solver specialized for the sparse structures of actual electric networks. Based on experiments with an implementation in the THYME simulator, we identify performance issues and possible solution approaches for smart grid experimentation in the large.
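
    The computational core the authors focus on, repeatedly solving for the grid state with solvers that exploit the sparse structure of real networks, can be illustrated with a generic sparse solve. The banded matrix below is a stand-in for a network conductance matrix, not THYME's actual formulation.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import spsolve

        # Stand-in for one time step of a grid-state solve: a sparse, banded
        # conductance-like matrix G and an injection vector b (illustrative).
        n = 100000
        main = 2.0 * np.ones(n)
        off = -1.0 * np.ones(n - 1)
        G = diags([off, main, off], [-1, 0, 1], format="csc")
        b = np.zeros(n)
        b[0] = 1.0                    # unit injection at one bus
        x = spsolve(G, b)             # sparse factorization exploits the structure
        print("state at first three buses:", x[:3])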

  13. The simulation of a MCS event in the South America using a radiative transfer model

    NASA Astrophysics Data System (ADS)

    Silveira, B. B.; Aravéquia, J. A.

    2011-12-01

    Mesoscale convective systems (MCS) play an important role in the total precipitation of some regions of the world. Southeastern South America is one of these regions, because the environment in this area favors the development of MCS. Satellite imagery is an important data source for the identification and characterization of these systems; in such images, MCS are characterized by low values of brightness temperature (BT). One channel used to identify these systems is channel 4 (infrared) of the imager aboard the GOES-10 satellite. With the objective of identifying an MCS in a 12-h atmospheric model forecast, a simulation of the BT in channel 4 of GOES-10 was carried out using a radiative transfer model. The MCS event chosen occurred between 9 and 10 November 2008 and reached northern Argentina and Paraguay; it was identified using the outputs of FORTACC (Forecast and Tracking of Active Convective Cells). The BT simulation used the radiative transfer model CRTM version 2.0.2 (Community Radiative Transfer Model) from the JCSDA (Joint Center for Satellite Data Assimilation), driven by a 12-hour forecast from the ETA model, an operational model of CPTEC/INPE (Centro de Previsão de Tempo e Estudos Climáticos/Instituto Nacional de Pesquisas Espaciais). The ETA model has a 20x20 km horizontal resolution and 19 vertical levels. The simulated BT values from the CRTM indicate the region where the MCS occurred. However, the BT values are overestimated by the CRTM: the simulated values are quantitatively higher than those observed in channel 4 of GOES-10. The area with BT values associated with the MCS is smaller than that observed in the satellite image, and the shape of the system was also not simulated satisfactorily.

  14. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  15. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    PubMed

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.

  16. Parallelized event chain algorithm for dense hard sphere and polymer systems

    SciTech Connect

    Kampmann, Tobias A.; Boltz, Horst-Holger; Kierfeld, Jan

    2015-01-15

    We combine parallelization and cluster Monte Carlo for hard sphere systems and present a parallelized event chain algorithm for the hard disk system in two dimensions. For parallelization we use a spatial partitioning approach into simulation cells. We find that it is crucial for correctness to ensure detailed balance on the level of Monte Carlo sweeps by drawing the starting sphere of event chains within each simulation cell with replacement. We analyze the performance gains for the parallelized event chain and find a criterion for an optimal degree of parallelization. Because of the cluster nature of event chain moves, massive parallelization will not be optimal. Finally, we discuss first applications of the event chain algorithm to dense polymer systems, i.e., bundle-forming solutions of attractive semiflexible polymers.
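
    The event chain move itself is easiest to see in one dimension. The sketch below implements it for hard rods on a ring: a displacement budget is carried by one rod until it collides with the next, at which point the remaining budget transfers to the rod it hit. This is a serial, 1D simplification for illustration only; the paper's contribution is the 2D hard-disk case with spatial partitioning into parallel cells.

        import random

        def event_chain_move(x, sigma, box, chain_length, rng):
            """One event-chain move for 1D hard rods of length sigma on a ring.
            x: rod left-edge positions in circular order. The displacement
            budget is handed from rod to rod at each collision."""
            n = len(x)
            i = rng.randrange(n)                 # random starting rod
            budget = chain_length
            while budget > 0.0:
                j = (i + 1) % n                  # right neighbour on the ring
                gap = (x[j] - x[i] - sigma) % box
                step = min(budget, gap)
                x[i] = (x[i] + step) % box
                budget -= step
                i = j                            # collision: lift moves to hit rod
            return x

        rng = random.Random(0)
        n, sigma, box = 10, 0.5, 10.0
        x = [i * (box / n) for i in range(n)]    # evenly spaced initial rods
        for _ in range(1000):
            event_chain_move(x, sigma, box, chain_length=2.0, rng=rng)
        print("sample positions:", [round(v, 2) for v in x[:5]])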

  17. Tutorial in medical decision modeling incorporating waiting lines and queues using discrete event simulation.

    PubMed

    Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter

    2010-01-01

    In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques are used to evaluate systems that include queuing or waiting, for example, discrete event simulation. To include queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers get an understanding of queue characteristics, modeling features, and their strengths. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities, to incorporate waiting lines and queues in the decision-analytic modeling example.
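
    For the simplest queuing system the tutorial's machinery applies to, the M/M/1 queue, the analytic mean wait in queue is Wq = rho / (mu - lambda), and a short event-by-event simulation can be checked against it. The rates below are invented for illustration.

        import random

        def mm1_mean_wait(lam, mu, n_customers=200000, seed=3):
            """Simulate an M/M/1 queue; compare with the analytic mean wait
            in queue, Wq = rho / (mu - lam), from queuing theory."""
            rng = random.Random(seed)
            t = depart = total_wait = 0.0
            for _ in range(n_customers):
                t += rng.expovariate(lam)                 # arrival
                start = max(t, depart)                    # single server, FCFS
                total_wait += start - t
                depart = start + rng.expovariate(mu)      # service completion
            return total_wait / n_customers

        lam, mu = 4.0, 5.0                                # patients/h, treatments/h
        rho = lam / mu
        print(f"simulated Wq = {mm1_mean_wait(lam, mu):.3f} h, "
              f"analytic Wq = {rho / (mu - lam):.3f} h")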

  18. The degree of disparateness of event details modulates future simulation construction, plausibility, and recall.

    PubMed

    van Mulukom, Valerie; Schacter, Daniel L; Corballis, Michael C; Addis, Donna Rose

    2016-01-01

    Several episodic simulation studies have suggested that the plausibility of future events may be influenced by the disparateness of the details comprising the event. However, no study had directly investigated this idea. In the current study, we designed a novel episodic combination paradigm that varied the disparateness of details through a social sphere manipulation. Participants recalled memory details from three different social spheres. Details were recombined either within spheres or across spheres to create detail sets for which participants imagined future events in a second session. Across-sphere events were rated as significantly less plausible than within-sphere events and were remembered less often. The presented paradigm, which increases control over the disparateness of details in future event simulations, may be useful for future studies concerned with the similarity of simulations to previous events and their plausibility.

  19. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying the numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other less-detailed models. The DES team continues to innovate and expand

  20. Discrete-event simulation of nuclear-waste transport in geologic sites subject to disruptive events. Final report

    SciTech Connect

    Aggarwal, S.; Ryland, S.; Peck, R.

    1980-06-19

    This report outlines a methodology to study the effects of disruptive events on nuclear waste material in stable geologic sites. The methodology is based upon developing a discrete-event model that can be simulated on the computer. This methodology allows a natural development of simulation models that use computer resources in an efficient manner. Accurate modeling in this area depends in large part upon accurate modeling of ion transport behavior in the storage media. Unfortunately, developments in this area are not at a stage where there is any consensus on proper models for such transport. Consequently, our work is directed primarily towards showing how disruptive events can be properly incorporated in such a model, rather than as a predictive tool at this stage. When and if proper geologic parameters can be determined, it would be possible to use this as a predictive model. Assumptions and their bases are discussed, and the mathematical and computer model are described.

  1. Torque Simulator for Rotating Systems

    NASA Technical Reports Server (NTRS)

    Davis, W. T.

    1982-01-01

    New torque brake simulates varying levels of friction in bearings of rotating body. Rolling-tail torque brake uses magnetic force to produce friction between rotating part and stationary part. Simulator electronics produce positive or negative feedback signal, depending on direction of rotation. New system allows for first time in-depth study of effects of tail-fin spin rates on pitch-, yaw-, and roll-control characteristics.

  2. Simulation Systems for Cognitive Psychology

    DTIC Science & Technology

    1982-08-01

    This technical report (No. UPITT/LRDC/ONR/APS-12) surveys simulation systems for cognitive psychology, including the first generation of specialized psychological simulation languages. The research was sponsored by the Personnel and Training Research Programs, Psychological Sciences Division, Office of Naval Research.

  3. Hierarchical simulation of large system

    NASA Technical Reports Server (NTRS)

    Saab, Daniel G.

    1991-01-01

    The main problem facing current CAD tools for VLSIs is the large amount of memory required when dealing with large systems, primarily due to the circuit representation used by most current tools. This paper discusses an approach for hierarchical switch-level simulation of digital circuits. The approach exploits the hierarchy to reduce the memory requirements of the simulation, allowing the simulation of circuits that are too large to simulate at one flat level. The approach has been implemented in a hierarchical switch-level simulator, CHAMP, which runs on a SUN workstation. The program performs mixed mode simulation: parts of the circuit can be simulated faster at a behavioral level by supplying a high level software description. CHAMP allows assignable delays, and bidirectional signal flow inside circuit blocks that are represented as transistor networks as well as across the boundaries of higher level blocks. CHAMP is also unique in that it simulates directly from the hierarchical circuit description without flattening to a single level.

  4. Forward flux sampling-type schemes for simulating rare events: efficiency analysis.

    PubMed

    Allen, Rosalind J; Frenkel, Daan; ten Wolde, Pieter Rein

    2006-05-21

    We analyze the efficiency of several simulation methods which we have recently proposed for calculating rate constants for rare events in stochastic dynamical systems in or out of equilibrium. We derive analytical expressions for the computational cost of using these methods and for the statistical error in the final estimate of the rate constant for a given computational cost. These expressions can be used to determine which method to use for a given problem, to optimize the choice of parameters, and to evaluate the significance of the results obtained. We apply the expressions to the two-dimensional nonequilibrium rare event problem proposed by Maier and Stein [Phys. Rev. E 48, 931 (1993)]. For this problem, our analysis gives accurate quantitative predictions for the computational efficiency of the three methods.
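
    A stripped-down version of the forward flux sampling scheme analyzed here can be written for a one-dimensional double-well potential: estimate the flux through the first interface from a long run in the initial basin, then multiply by the conditional probabilities of reaching each successive interface before falling back. The potential, interface placement, and trial counts below are illustrative choices, not the paper's benchmark problem.

        import random, math

        rng = random.Random(42)
        beta, dt, D = 5.0, 1e-3, 1.0            # inverse temperature, step, diffusion

        def force(x):                            # V(x) = (x^2 - 1)^2, a double well
            return -4.0 * x * (x * x - 1.0)

        def step(x):                             # overdamped Langevin (Euler)
            return x + D * beta * force(x) * dt + math.sqrt(2 * D * dt) * rng.gauss(0, 1)

        lambdas = [-0.8, -0.4, 0.0, 0.4, 0.8]    # interfaces from basin A to B
        basin_A = -0.9

        # Phase 1: flux through the first interface from a long run in basin A.
        x, t, crossings, seeds = -1.0, 0.0, 0, []
        was_inside = True
        while crossings < 200:
            x = step(x); t += dt
            if x < basin_A:
                was_inside = True
            if was_inside and x >= lambdas[0]:
                crossings += 1; seeds.append(x); was_inside = False
        flux = crossings / t

        # Phase 2: conditional crossing probabilities between interfaces.
        rate = flux
        for i in range(len(lambdas) - 1):
            successes, new_seeds = 0, []
            for _ in range(300):
                x = rng.choice(seeds)
                while basin_A < x < lambdas[i + 1]:
                    x = step(x)
                if x >= lambdas[i + 1]:
                    successes += 1; new_seeds.append(x)
            rate *= successes / 300
            seeds = new_seeds or seeds           # guard against extinction
        print(f"estimated A->B rate constant ~ {rate:.3e} per unit time")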

  5. Assessing mid-latitude dynamics in extreme event attribution systems

    NASA Astrophysics Data System (ADS)

    Mitchell, Daniel; Davini, Paolo; Harvey, Ben; Massey, Neil; Haustein, Karsten; Woollings, Tim; Jones, Richard; Otto, Fredi; Guillod, Benoit; Sparrow, Sarah; Wallom, David; Allen, Myles

    2017-06-01

    Atmospheric modes of variability relevant for extreme temperature and precipitation events are evaluated in models currently being used for extreme event attribution. A 100-member initial condition ensemble of the global circulation model HadAM3P is compared with both the multi-model ensemble from the Coupled Model Inter-comparison Project, Phase 5 (CMIP5) and the CMIP5 atmosphere-only counterparts (AMIP5). The use of HadAM3P allows huge ensembles to be computed relatively fast, thereby providing unique insights into the dynamics of extremes. The analysis focuses on mid-northern latitudes (primarily Europe) during winter, and is compared with ERA-Interim reanalysis. The tri-modal Atlantic eddy-driven jet distribution is remarkably well captured in HadAM3P, but not in the CMIP5 or AMIP5 multi-model mean, although individual models fare better. The well known underestimation of blocking in the Atlantic region is apparent in CMIP5 and AMIP5, and also, to a lesser extent, in HadAM3P. Pacific blocking features are well reproduced in all modeling initiatives. Blocking duration is biased towards too many short-lived events in all three modelling systems. Associated storm tracks are too zonal over the Atlantic in the CMIP5 and AMIP5 ensembles, but better simulated in HadAM3P, with the exception of being too weak over Western Europe. In all cases, the CMIP5 and AMIP5 performances were almost identical, suggesting that the biases in the atmospheric modes considered here are not strongly coupled to SSTs, and that other model characteristics such as resolution are perhaps more important. For event attribution studies, it is recommended that rather than taking statistics over all available CMIP5 or AMIP5 models, only models capable of producing the relevant dynamical phenomena be employed.

  6. An integrated system for hydrological analysis of flood events

    NASA Astrophysics Data System (ADS)

    Katsafados, Petros; Chalkias, Christos; Karymbalis, Efthymios; Gaki-Papanastassiou, Kalliopi; Mavromatidis, Elias; Papadopoulos, Anastasios

    2010-05-01

    The significant increase in extreme flood events during recent decades has led to an urgent social and economic demand for improved prediction and sustainable prevention. Remedial actions require accurate estimation of the spatiotemporal variability of runoff volume and local peaks, which can be analyzed through integrated simulation tools. Such advanced modeling systems allow investigation of the dynamics controlling the behavior of these complex processes and can also be used as early warning systems. Moreover, simulation is the appropriate method for deriving quantitative estimates of various atmospheric and hydrologic parameters, especially in the absence of reliable and accurate measurements of precipitation and flow rates. Such techniques enable flood risk assessment and improve decision-making support for protection actions. This study presents an integrated system for simulating the essential atmospheric and soil parameters in the context of hydrological flood modeling. The system consists of two main cores: a numerical weather prediction model coupled with a geographic information system for the accurate simulation of groundwater advection and rainfall-runoff estimation. Synoptic and mesoscale atmospheric motions are simulated with a non-hydrostatic limited-area model on a very high resolution domain of integration. The model includes advanced schemes for microphysics and surface-layer physics as well as estimation of the longwave and shortwave radiation budget. It is also fully coupled with a land-surface model in order to resolve the surface heat fluxes and simulate air-land energy exchange processes. Detailed atmospheric and soil parameters derived from the atmospheric model are used as input data for the GIS-based runoff modeling. Geographic information system (GIS) technology is used for further hydrological analysis and estimation of direct

  7. Representing Dynamic Social Networks in Discrete Event Social Simulation

    DTIC Science & Technology

    2010-12-01

    The action choice component of the conceptual model is based on the theory of planned behavior (TPB) (Ajzen 1991), which has been used in applied settings in the areas of marketing and behavior modification programs (exercise adoption, smoking cessation) (Ajzen 2006).

  8. Modeling extreme "Carrington-type" space weather events using three-dimensional global MHD simulations

    NASA Astrophysics Data System (ADS)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Kuznetsova, Maria M.; Glocer, Alex

    2014-06-01

    There is a growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure. In the last two decades, significant progress has been made toward the first-principles modeling of space weather events, and three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, thereby playing a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for modern global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a Dst footprint comparable to the Carrington superstorm of September 1859, based on the estimate by Tsurutani et al. (2003). Results are presented for a simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field on the ground to such extreme driving conditions. The model setup is further tested using input data for an observed space weather event, the Halloween storm of October 2003, to verify the MHD model's consistency and to draw additional guidance for future work. This extreme space weather MHD model setup is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in ground-based conductor systems such as power transmission grids. Our ultimate goal is therefore to explore the level of geoelectric fields that can be induced by an assumed storm of the reported magnitude, i.e., Dst ≈ -1600 nT.

  9. Impulsive events in the evolution of a forced nonlinear system

    SciTech Connect

    Longcope, D.W.; Sudan, R.N.

    1992-03-16

    Long-time numerical solutions of a low-dimensional model of the reduced MHD equations show that, when this system is driven quasistatically, the response is punctuated by impulsive events. The statistics of these events indicate a Poisson process; the frequency of these events scales as ΔE_M^{-1}, where ΔE_M is the energy released in one event.
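
    Given the reported Poisson statistics with frequency scaling as ΔE_M^{-1}, a synthetic event sequence is straightforward to generate: exponential waiting times with a rate inversely proportional to the per-event energy release. The constant of proportionality below is arbitrary, chosen only for illustration.

        import random

        def sample_event_times(delta_E, t_max, c=1.0, seed=11):
            """Poisson sequence of impulsive events whose mean frequency
            scales as 1/delta_E (c is an arbitrary illustrative constant)."""
            rng = random.Random(seed)
            rate = c / delta_E
            t, times = 0.0, []
            while True:
                t += rng.expovariate(rate)      # exponential waiting times
                if t > t_max:
                    return times
                times.append(t)

        for dE in (0.5, 1.0, 2.0):
            n = len(sample_event_times(dE, t_max=1000.0))
            print(f"dE={dE:4.1f}: {n} events, empirical rate {n / 1000.0:.3f}")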

  10. Systems Engineering Simulator (SES) Simulator Planning Guide

    NASA Technical Reports Server (NTRS)

    McFarlane, Michael

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the SES. The Simulator Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  11. Analytic Perturbation Analysis of Discrete Event Dynamic Systems

    SciTech Connect

    Uryasev, S.

    1994-09-01

    This paper considers a new Analytic Perturbation Analysis (APA) approach for Discrete Event Dynamic Systems (DEDS) with discontinuous sample-path functions with respect to control parameters. The performance functions for DEDS usually are formulated as mathematical expectations, which can be calculated only numerically. APA is based on new analytic formulas for the gradients of expectations of indicator functions; therefore, it is called an analytic perturbation analysis. The gradient of performance function may not coincide with the expectation of a gradient of sample-path function (i.e., the interchange formula for the gradient and expectation sign may not be valid). Estimates of gradients can be obtained with one simulation run of the models.
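
    The paper's caveat that the interchange of gradient and expectation may fail for discontinuous sample paths has a standard textbook counterexample, not taken from the paper, which the following LaTeX lines spell out.

        % Standard counterexample: X ~ U(0,1), sample-path function
        % f(\theta, X) = \mathbf{1}\{X \le \theta\}.
        \[
          J(\theta) = \mathbb{E}\,[\mathbf{1}\{X \le \theta\}] = \theta
          \quad\Longrightarrow\quad \frac{dJ}{d\theta} = 1,
        \]
        \[
          \text{but}\quad
          \frac{\partial}{\partial\theta}\,\mathbf{1}\{X \le \theta\} = 0
          \;\text{for almost every } X
          \quad\Longrightarrow\quad
          \mathbb{E}\!\left[\frac{\partial f}{\partial\theta}\right] = 0 \ne 1 .
        \]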

  12. Using Simulation to Improve Systems.

    PubMed

    Kearney, James A; Deutsch, Ellen S

    2017-10-01

    Attempts to understand and improve health care delivery often focus on the characteristics of the patient and the characteristics of the health care providers, but larger systems surround and integrate with patients and providers. Components of health care delivery systems can support or interfere with efforts to provide optimal health care. Simulation in situ, involving real teams participating in simulations in real care settings, can be used to identify latent safety threats and improve the work environment while simultaneously supporting participant learning. Thoughtful planning and skilled debriefing are essential.

  13. Argonne simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Canfield, T.; Brown-VanHoozer, A.; Tentner, A.

    1996-04-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a standalone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimal route planning, where the data are updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that incorporate human-factors studies to support safety and operational research. Realistic modeling of variations in the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like a real vehicle. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one is able to take advantage of emerging massively parallel processor (MPP) systems.

  14. A System for Interactive Behaviour Simulation.

    ERIC Educational Resources Information Center

    Bierschenk, Bernhard

    A psycho-ecological model is used as the basis for a simulation of interactive behavior strategies. The basic unit is an event, and each event has been recorded on closed circuit television videotape. The three basic paradigms of behavioral science--association, structure, and process--are used to anchor the simulations. The empirical foundation…

  15. [The performance of respirator alarms during simulated critical events in CMV/IPPV artificial respiration].

    PubMed

    Bender, H J; Frankenberger, H; Ryll, C; Albrecht, M D

    1993-06-01

    Alarm systems of ventilators enhance the detection of possible critical events during artificial ventilation. Because of their significance, in some countries the alarm detection of ventilators is regulated by federal law. Up to now, no recommendations for the adjustment of alarm limits exist, and only a few detailed investigations of the accuracy of alarm detection are available. METHODS. The responses of four commercially available ventilators (Servoventilator 900C, Siemens, Inc.; Bennett 7200a, Hoyer, Inc.; Veolar, Hamilton, Inc.; EVITA, Dräger, Inc.) to critical events during artificial ventilation of a test lung were evaluated. We measured the alarm time (the time between event creation and alarm response) for ten different simulated critical events, including disconnection, different-sized leaks, failure of the gas supply, and obstruction at different places in the artificial airway. DISCUSSION. All respirators were able to recognise severe critical situations such as hose disconnection, failure of the gas supply, and total airway obstruction within a short time (< 15 s). The recognition of small airway leaks was more difficult for the ventilators, even when the alarm thresholds were closely set. The alarm detection of the EVITA (software 10.0 or less) under conditions of partial airway obstruction may be a source of risk for the patient, as the machine continued supplying inspiration with pressure-limited ventilation even when the pressure threshold was reached.

  16. Simulation of heavy rainfall events over Indian region: a benchmark skill with a GCM

    NASA Astrophysics Data System (ADS)

    Goswami, Prashant; Kantha Rao, B.

    2015-10-01

    Extreme rainfall events (ERE) contribute a significant component of the Indian summer monsoon rainfall. An important requirement for regional climate simulations is therefore to attain desirable quality and reliability in simulating extreme rainfall events. While coarse-resolution global circulation models (GCMs) are not preferred for the simulation of extreme events, the global domain of a GCM allows a better representation of scale interactions, which may result in adequate skill in simulating localized events in spite of the lower resolution. At the same time, a GCM with skill in the simulation of extreme events provides a more reliable tool for seamless prediction. The present work assesses a GCM in simulating 40 ERE that occurred over India during 1998-2013. It is found that, expectedly, the GCM forecasts underestimate the observed (TRMM) rainfall in most cases, but not always. Somewhat surprisingly, the forecasts of location are quite accurate in spite of the low resolution (~50 km). An interesting result is that the highest skill of the forecasts is realized at 48 h lead rather than at 24 or 96 h lead. Diagnostics of dynamical fields like convergence show that the forecasts can capture contrasting features on pre-event, event, and post-event days. The forecast configuration used is similar to one that has been used for long-range monsoon forecasting and tropical cyclones in earlier studies; the present results on ERE forecasting therefore indicate the potential application of the model for seamless prediction.

  17. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    PubMed Central

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-01-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September, 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury where infection may occur, if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. PMID:26256624

  18. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model.

    PubMed

    Sanzari, Jenine K; Diffenderfer, Eric S; Hagan, Sarah; Billings, Paul C; Gridley, Daila S; Seykora, John T; Kennedy, Ann R; Cengel, Keith A

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE, using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. Copyright © 2015 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.

  19. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    NASA Astrophysics Data System (ADS)

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE, using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed.

  20. Evaluating the Capability of Information Technology to Prevent Adverse Drug Events: A Computer Simulation Approach

    PubMed Central

    Anderson, James G.; Jay, Stephen J.; Anderson, Marilyn; Hunt, Thaddeus J.

    2002-01-01

    Background: The annual cost of morbidity and mortality due to medication errors in the U.S. has been estimated at $76.6 billion. Information technology implemented systematically has the potential to significantly reduce medication errors that result in adverse drug events (ADEs). Objective: To develop a computer simulation model that can be used to evaluate the effectiveness of information technology applications designed to detect and prevent medication errors that result in adverse drug events. Methods: A computer simulation model was constructed representing the medication delivery system in a hospital. STELLA, a continuous simulation software package, was used to construct the model. Parameters of the model were estimated from a study of prescription errors on two hospital medical/surgical units and used in the baseline simulation. Five prevention strategies were simulated based on information obtained from the literature. Results: The model simulates the four stages of the medication delivery system: prescribing, transcribing, dispensing, and administering drugs. We simulated interventions that have been demonstrated in prior studies to decrease error rates. The results suggest that an integrated medication delivery system can save up to 1,226 days of excess hospitalization and $1.4 million in associated costs annually in a large hospital. The results of the analyses regarding the effects of the interventions on the additional hospital costs associated with ADEs are somewhat sensitive to the distribution of errors in the hospital, more sensitive to the costs of an ADE, and most sensitive to the proportion of medication errors resulting in ADEs. Conclusions: The results suggest that clinical information systems are potentially a cost-effective means of preventing ADEs in hospitals and demonstrate the importance of viewing medication errors from a systems perspective. Prevention efforts that focus on a single stage of the process had limited impact on the overall rate of ADEs.
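
    The model structure described above — errors arising at four sequential stages, a fraction of uncaught errors producing ADEs, and information systems raising error interception rates — can be sketched as a simple Monte Carlo in Python. All rates and costs below are illustrative placeholders, not the paper's calibrated parameters (the study itself used the continuous simulation package STELLA):

        import random

        STAGES = ["prescribing", "transcribing", "dispensing", "administering"]
        ERROR_RATE = {"prescribing": 0.04, "transcribing": 0.015,
                      "dispensing": 0.01, "administering": 0.03}   # illustrative
        P_ADE = 0.05           # fraction of uncaught errors that cause an ADE (illustrative)
        COST_PER_ADE = 2000.0  # illustrative excess cost per ADE

        def simulate(n_orders, interception):
            """Pass medication orders through the four stages; `interception` is
            the probability that an information system catches an error."""
            ades = 0
            for _ in range(n_orders):
                for stage in STAGES:
                    if random.random() < ERROR_RATE[stage] and random.random() >= interception:
                        if random.random() < P_ADE:
                            ades += 1
                        break                  # one uncaught error per order in this sketch
            return ades, ades * COST_PER_ADE

        for icpt in (0.0, 0.5, 0.8):
            ades, cost = simulate(100_000, icpt)
            print(f"interception {icpt:.0%}: {ades} ADEs, ${cost:,.0f} excess cost")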

  1. Improving prospective memory performance with future event simulation in traumatic brain injury patients.

    PubMed

    Mioni, Giovanna; Bertucci, Erica; Rosato, Antonella; Terrett, Gill; Rendell, Peter G; Zamuner, Massimo; Stablum, Franca

    2017-06-01

    Previous studies have shown that traumatic brain injury (TBI) patients have difficulties with prospective memory (PM). Considering that PM is closely linked to independent living, it is of primary interest to develop strategies that can improve PM performance in TBI patients. This study employed the Virtual Week task as a measure of PM, and we included future event simulation to boost PM performance. Study 1 evaluated the efficacy of the strategy and investigated possible practice effects. Twenty-four healthy participants performed Virtual Week in a no-strategy condition, and 24 healthy participants performed it in a mixed condition (no strategy - future event simulation). In Study 2, 18 TBI patients completed the mixed condition of Virtual Week and were compared with the 24 healthy controls who undertook the mixed condition of Virtual Week in Study 1. All participants also completed a neuropsychological evaluation to characterize the groups on level of cognitive functioning. Study 1 showed that participants in the future event simulation condition outperformed participants in the no-strategy condition, and these results were not attributable to practice effects. Results of Study 2 showed that TBI patients performed PM tasks less accurately than controls, confirming prospective memory impairment in these patients, but that future event simulation can substantially reduce TBI-related deficits in PM performance. The future event simulation strategy also improved the controls' PM performance. These studies showed the value of the future event simulation strategy in improving PM performance in healthy participants as well as in TBI patients.

  2. Real time digital propulsion system simulation for manned flight simulators

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.; Hart, C. E.

    1978-01-01

    A real-time digital simulation of a STOL propulsion system was developed which generates the significant dynamics and internal variables needed to evaluate system performance and aircraft interactions using manned flight simulators. The simulation ran at a real-to-execution time ratio of 8.8. The model was used in a piloted NASA flight simulator program to evaluate the simulation technique and the propulsion system digital control. The simulation is described and results are shown. Limited results of the flight simulation program are also presented.

  3. First-Principles Simulations of Violent Space-Weather Events

    DTIC Science & Technology

    2008-01-01

    charged gases (plasmas) that comprise its ionosphere and magnetosphere. These changes are driven, for the most part, by fluctuations in the flow of magnetic field and plasma from the Sun—the solar wind. The effects of these changes can include direct damage by energetic particles to orbiting ... morphological magnetic-field and plasma signatures. Our results also exhibit homology (repeated similar events originating from a single source), prompt

  4. Evaluating simulation education via electronic surveys immediately following live critical events: a pilot study.

    PubMed

    Happel, Corinne Savides; Lease, Meredith A; Nishisaki, Akira; Braga, Matthew S

    2015-02-01

    Simulation-based medical education has become popular in postgraduate training for medical emergencies; however, the direct impact on learners' clinical performances during live critical events is unknown. Our goal was to evaluate the perceived impact of simulation-based education on pediatric emergencies by auditing pediatric residents immediately after involvement in actual emergency clinical events. Weekly team-based pediatric simulation training for inpatient emergencies was implemented in an academic tertiary care hospital. Immediately after actual pediatric emergency events, each resident involved was audited regarding roles, performed tasks, and perceived effectiveness of earlier simulation-based education. The audit was performed by using a Likert scale. From September 2010 through August 2011, a total of 49 simulation sessions were held. During the same period, 27 pediatric emergency events occurred: 3 code events, 14 rapid response team activations, and 10 emergency transfers to the PICU. Forty-seven survey responses from 20 pediatric residents were obtained after the emergency clinical events. Fifty-three percent of residents felt well prepared, and 45% reported having experienced a similar simulation before the clinical event. A preceding similar simulation experience was perceived as helpful in improving clinical performance. Residents' confidence levels, however, did not differ significantly between those who reported having had a preceding similar simulation and those who had not (median of 4 vs median of 3; P=.16, Wilcoxon rank-sum test). A novel electronic survey was successfully piloted to measure residents' perceptions of simulation education compared with live critical events. Residents perceived that their experiences in earlier similar simulations positively affected their performances during emergencies. Copyright © 2015 by the American Academy of Pediatrics.

  5. Systems simulations supporting NASA telerobotics

    NASA Technical Reports Server (NTRS)

    Harrison, F. W., Jr.; Pennington, J. E.

    1987-01-01

    Two simulation and analysis environments have been developed to support telerobotics research at the Langley Research Center. One is a high-fidelity, non-real-time, interactive model called ROBSIM, which combines user-generated models of workspace environment, robots, and loads into a working system and simulates the interaction among the system components. Models include user-specified actuator, sensor, and control parameters, as well as kinematic and dynamic characteristics. Kinematic, dynamic, and response analyses can be selected, with system configuration, task trajectories, and arm states displayed using computer graphics. The second environment is a real-time, manned Telerobotic Systems Simulation (TRSS) which uses the facilities of the Intelligent Systems Research Laboratory (ISRL). It utilizes a hierarchical structure of functionally distributed computers communicating over both parallel and high-speed serial data paths to enable studies of advanced telerobotic systems. Multiple processes perform motion planning, operator communications, forward and inverse kinematics, control/sensor fusion, and I/O processing while communicating via common memory. Both ROBSIM and TRSS are discussed, including their capabilities, status, and future plans. Also described are the architecture of ISRL and recent telerobotic system studies in ISRL.

  6. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    PubMed Central

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, usually only individual sensors are located at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, and it is desirable to reduce the number of commutations of the control signals from both security and economic points of view. Therefore, and in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system allows cost savings by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597
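
    A minimal sketch of the event-based (send-on-delta) control pattern described above: the controller recomputes and transmits an actuation command only when the measured variable deviates from the value at the last event by more than a threshold, reducing actuator commutations. Plant dynamics, gains and thresholds are all illustrative assumptions, not taken from the paper:

        import random

        def event_based_control(setpoint, delta, steps, kp=0.5):
            """Send-on-delta controller: a new actuation command is computed only
            when the measurement drifts more than `delta` from the last event."""
            temperature = 15.0            # greenhouse air temperature (degC), illustrative
            last_event_temp = temperature
            u = 0.0                       # actuator command (heater fraction, 0..1)
            commutations = 0
            for _ in range(steps):
                # toy plant: heating, loss to a 10 degC exterior, and disturbance noise
                temperature += 0.3 * u - 0.02 * (temperature - 10.0) + random.gauss(0.0, 0.05)
                # event condition: act only when the deviation exceeds the threshold
                if abs(temperature - last_event_temp) > delta:
                    u = max(0.0, min(1.0, u + kp * (setpoint - temperature)))
                    last_event_temp = temperature
                    commutations += 1
            return temperature, commutations

        temp, n_events = event_based_control(setpoint=20.0, delta=0.5, steps=10_000)
        print(f"final temperature {temp:.2f} degC after {n_events} control events")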

  7. Analyzing Noncombatant Evacuation Operations Using Discrete Event Simulation

    DTIC Science & Technology

    2013-12-01

    hotels, stadiums, and other locations where NCEs can stay until transportation to the United States is available (Joint Chiefs of Staff 2007). ...

  8. MCNP6. Simulating Correlated Data in Fission Events

    SciTech Connect

    Rising, Michael Evan; Sood, Avneet

    2015-12-03

    This report is a series of slides discussing the MCNP6 code and its status in simulating fission. Applications of interest include global security and nuclear nonproliferation, detection of special nuclear material (SNM), passive and active interrogation techniques, and coincident neutron and photon leakage.

  9. Simulating Underbelly Blast Events using Abaqus/Explicit - CEL

    DTIC Science & Technology

    2013-01-15

    A simplified hybrid elastic-plastic material model for geologic materials developed by the U.S. Army ERDC was implemented as a VUMAT and used to describe the soil. The simulations agree favorably with the test results and produce higher fidelity solutions than traditional

  10. Stochastic Event Counter for Discrete-Event Systems Under Unreliable Observations

    SciTech Connect

    Tae-Sic Yoo; Humberto E. Garcia

    2008-06-01

    This paper addresses the issues of counting the occurrence of special events in the framework of partially observed discrete-event dynamical systems (DEDS). First, we develop a novel recursive procedure that updates the active counter information state sequentially with available observations. In general, the cardinality of the active counter information state is unbounded, which makes the exact recursion computationally infeasible. To overcome this difficulty, we develop an approximate recursive procedure that regulates and bounds the size of the active counter information state. Using the approximate active counter information state, we give an approximate minimum mean square error (MMSE) counter. The developed algorithms are then applied to count special routing events in a material flow system.
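
    The exact recursion propagates a probability distribution over (state, count) pairs whose support can grow without bound, so the approximate version truncates it. A hypothetical Python sketch of that idea — the transition model, observation likelihoods and pruning rule below are illustrative, not the paper's algorithm:

        import heapq

        def update_counter_belief(belief, obs, trans, obs_like, special, max_size=50):
            """One recursive update of the counter information state.
            belief: {(state, count): prob}; trans: {s: [(s2, p), ...]};
            obs_like: {(state, obs): p}; entering a state in `special`
            increments the count."""
            new = {}
            for (s, c), p in belief.items():
                for s2, pt in trans[s]:
                    c2 = c + 1 if s2 in special else c
                    w = p * pt * obs_like.get((s2, obs), 0.0)
                    if w > 0.0:
                        new[(s2, c2)] = new.get((s2, c2), 0.0) + w
            # bound the information state: keep only the most probable entries
            kept = heapq.nlargest(max_size, new.items(), key=lambda kv: kv[1])
            total = sum(p for _, p in kept)
            return {k: p / total for k, p in kept}

        def mmse_count(belief):
            return sum(c * p for (_, c), p in belief.items())

        trans = {"idle": [("idle", 0.8), ("route", 0.2)], "route": [("idle", 1.0)]}
        obs_like = {("idle", "quiet"): 0.9, ("idle", "noisy"): 0.1,
                    ("route", "quiet"): 0.3, ("route", "noisy"): 0.7}
        belief = {("idle", 0): 1.0}
        for obs in ["quiet", "noisy", "noisy", "quiet"]:
            belief = update_counter_belief(belief, obs, trans, obs_like, special={"route"})
        print(f"MMSE estimate of routing-event count: {mmse_count(belief):.2f}")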

  11. The influence of spectral nudging in simulating Vb-events with COSMO-CLM

    NASA Astrophysics Data System (ADS)

    Paumann, Manuela; Anders, Ivonne; Hofstätter, Michael; Chimani, Barbara

    2015-04-01

    In previous studies certain European cyclones have been investigated in terms of related extreme precipitation events in Austria. Those systems passing the Mediterranean are of special interest as the atmospheric moisture content is increased. It has been shown in recent investigations that state-of-the-art RCMs can approximately reproduce observed heavy precipitation characteristics. This provides a basic confidence in the models' ability to capture future changes of such events under increased greenhouse gas conditions as well. In this contribution we focus on high spatial and temporal scales and assess the currently achievable accuracy in the simulation of Vb-events. The state-of-the-art regional climate model CCLM is applied in hindcast mode to individual Vb-events in August 2002 and May/June 2013. Besides the conventional forcing of the regional climate model at its lateral boundaries, a spectral nudging technique is applied. This means that inside the model area the regional model is forced to accept the analysis for large scales, whereas the nudging has no effect on the small scales. The simulations for the Vb-events mentioned above, covering the European domain, have been varied systematically by changing the nudging factor, the number of nudged waves, the nudged variables, and other parameters. The resulting precipitation amounts have been compared to the E-OBS gridded European precipitation data set and a recent high spatially resolved precipitation data set for Austria (GPARD-6). Varying the spectral nudging setup in the short-term Vb-cases helps us, on the one hand, to learn about 3D processes during Vb-events (e.g. vorticity and formation) and, on the other hand, to identify model deficiencies. The results show that increasing the number of nudged waves from 1 to 7, as well as the choice of the variables used in the nudging process, has a large influence on the development of the low pressure system and the related precipitation patterns. On the contrary, the nudging
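
    The core of spectral nudging can be illustrated in a few lines: the model field is decomposed into spectral components, and only the lowest wavenumbers are relaxed toward the driving analysis while the small scales evolve freely. A one-dimensional toy version in Python (relaxation factor and wavenumber cutoff are illustrative; CCLM's actual implementation differs in detail):

        import numpy as np

        def spectral_nudge(field, analysis, n_waves=7, alpha=0.05):
            """Relax the lowest `n_waves` spectral components of `field`
            toward `analysis`; higher wavenumbers are left untouched."""
            f_hat = np.fft.rfft(field)
            a_hat = np.fft.rfft(analysis)
            mask = np.arange(f_hat.size) <= n_waves        # large scales only
            f_hat[mask] += alpha * (a_hat[mask] - f_hat[mask])
            return np.fft.irfft(f_hat, n=field.size)

        x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
        analysis = np.sin(x)                               # large-scale driving field
        field = np.sin(x + 0.3) + 0.2 * np.sin(20 * x)     # model field with small-scale detail
        nudged = spectral_nudge(field, analysis)
        print(f"rms change from nudging: {np.std(nudged - field):.3f} "
              f"(small-scale component preserved)")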

  12. Automated calculation and simulation systems

    NASA Astrophysics Data System (ADS)

    Ohl, Thorsten

    2003-04-01

    I briefly summarize the parallel sessions on Automated Calculation and Simulation Systems for high-energy particle physics phenomenology at ACAT 2002 (Moscow State University, June 2002), present a short overview of the current status of the field, and try to identify the important trends.

  13. Modeling and simulation of single-event effect in CMOS circuit

    NASA Astrophysics Data System (ADS)

    Suge, Yue; Xiaolin, Zhang; Yuanfu, Zhao; Lin, Liu; Hanning, Wang

    2015-11-01

    This paper reviews the status of research in modeling and simulation of single-event effects (SEE) in digital devices and integrated circuits. After a brief historical overview of SEE simulation, simulation approaches at different levels are detailed. These include material-level physical simulation, where the two primary methods by which ionizing radiation releases charge in a semiconductor device (direct ionization and indirect ionization) are introduced; device-level simulation, where the main emerging physical phenomena affecting nanometer devices (the bipolar transistor effect and the charge sharing effect) and the methods envisaged for taking them into account are the focus; and circuit-level simulation, where methods for predicting the single-event response, covering the production and propagation of single-event transients (SETs) in sequential and combinatorial logic, are detailed. Soft error rate trends with scaling are also addressed.

  14. Repetition-Related Reductions in Neural Activity during Emotional Simulations of Future Events

    PubMed Central

    2015-01-01

    Simulations of future experiences are often emotionally arousing, and the tendency to repeatedly simulate negative future outcomes has been identified as a predictor of the onset of symptoms of anxiety. Nonetheless, next to nothing is known about how the healthy human brain processes repeated simulations of emotional future events. In this study, we present a paradigm that can be used to study repeated simulations of the emotional future in a manner that overcomes phenomenological confounds between positive and negative events. The results show that pulvinar nucleus and orbitofrontal cortex respectively demonstrate selective reductions in neural activity in response to frequently as compared to infrequently repeated simulations of negative and positive future events. Implications for research on repeated simulations of the emotional future in both non-clinical and clinical populations are discussed. PMID:26390294

  15. Enabling parallel simulation of large-scale HPC network systems

    DOE PAGES

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; ...

    2016-04-07

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at a flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  16. Enabling parallel simulation of large-scale HPC network systems

    SciTech Connect

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; Carns, Philip

    2016-04-07

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at a flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  17. Efficient event-driven simulations shed new light on microtubule organization in the plant cortical array

    NASA Astrophysics Data System (ADS)

    Tindemans, Simon H.; Deinum, Eva E.; Lindeboom, Jelmer J.; Mulder, Bela M.

    2014-04-01

    The dynamics of the plant microtubule cytoskeleton is a paradigmatic example of the complex spatiotemporal processes characterising life at the cellular scale. This system is composed of large numbers of spatially extended particles, each endowed with its own intrinsic stochastic dynamics, and is capable of non-equilibrium self-organisation through collisional interactions of these particles. To elucidate the behaviour of such a complex system requires not only conceptual advances, but also the development of appropriate computational tools to simulate it. As the number of parameters involved is large and the behaviour is stochastic, it is essential that these simulations be fast enough to allow for an exploration of the phase space and the gathering of sufficient statistics to accurately pin down the average behaviour as well as the magnitude of fluctuations around it. Here we describe a simulation approach that meets this requirement by adopting an event-driven methodology that encompasses both the spontaneous stochastic changes in microtubule state as well as the deterministic collisions. In contrast with finite time step simulations this technique is intrinsically exact, as well as several orders of magnitude faster, which enables ordinary PC hardware to simulate systems of ~10^3 microtubules on a time scale ~10^3 times faster than real time. In addition we present new tools for the analysis of microtubule trajectories on curved surfaces. We illustrate the use of these methods by addressing a number of outstanding issues regarding the importance of various parameters on the transition from an isotropic to an aligned and oriented state.
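
    At its core, the event-driven methodology contrasted here with finite-time-step integration is a priority queue of time-stamped events processed in chronological order, with each handler free to schedule further events. A generic Python skeleton of that loop (schematic only; the actual microtubule code must also recompute collision events whenever trajectories change):

        import heapq
        import itertools

        class EventDrivenSim:
            """Minimal event-driven core: events fire in time-stamp order and
            each handler may schedule further events."""
            def __init__(self):
                self.now = 0.0
                self._queue = []
                self._ids = itertools.count()   # tie-breaker for equal time stamps

            def schedule(self, delay, handler, *args):
                heapq.heappush(self._queue, (self.now + delay, next(self._ids), handler, args))

            def run(self, until):
                while self._queue and self._queue[0][0] <= until:
                    self.now, _, handler, args = heapq.heappop(self._queue)
                    handler(self, *args)
                self.now = until

        def catastrophe(sim, mt_id):
            print(f"t={sim.now:.3f}: microtubule {mt_id} switches to shrinking")
            sim.schedule(0.8, rescue, mt_id)

        def rescue(sim, mt_id):
            print(f"t={sim.now:.3f}: microtubule {mt_id} rescued, growing again")

        sim = EventDrivenSim()
        for i in range(3):
            sim.schedule(0.5 * (i + 1), catastrophe, i)
        sim.run(until=2.0)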

  18. A Summary of Some Discrete-Event System Control Problems

    NASA Astrophysics Data System (ADS)

    Rudie, Karen

    A summary of the area of control of discrete-event systems is given. In this research area, automata and formal language theory is used as a tool to model physical problems that arise in technological and industrial systems. The key ingredients to discrete-event control problems are a process that can be modeled by an automaton, events in that process that cannot be disabled or prevented from occurring, and a controlling agent that manipulates the events that can be disabled to guarantee that the process under control either generates all the strings in some prescribed language or as many strings as possible in some prescribed language. When multiple controlling agents act on a process, decentralized control problems arise. In decentralized discrete-event systems, it is presumed that the agents effecting control cannot each see all event occurrences. Partial observation leads to some problems that cannot be solved in polynomial time and some others that are not even decidable.
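
    A toy illustration of this setup, using a small example of our own (not from the paper): a machine automaton with controllable events 'start', 'repair' and 'maintain' and uncontrollable events 'finish' and 'break', plus a supervisor that disables 'start' to enforce the specification "at most two jobs between maintenances":

        # plant automaton: (state, event) -> next state
        PLANT = {
            ("idle", "start"): "busy",
            ("busy", "finish"): "idle",
            ("busy", "break"): "down",
            ("down", "repair"): "idle",
            ("idle", "maintain"): "idle",
        }
        CONTROLLABLE = {"start", "repair", "maintain"}  # 'finish'/'break' cannot be disabled

        def enabled_events(state, jobs_since_maint):
            """Supervisor: uncontrollable events are always enabled; controllable
            events are enabled only if they keep the behaviour inside the spec."""
            enabled = {e for (s, e) in PLANT if s == state and e not in CONTROLLABLE}
            if state == "idle":
                enabled.add("maintain")
                if jobs_since_maint < 2:     # spec: at most two jobs between maintenances
                    enabled.add("start")
            if state == "down":
                enabled.add("repair")
            return enabled

        state, jobs = "idle", 0
        for event in ["start", "finish", "start", "break", "repair", "start", "maintain", "start"]:
            if event not in enabled_events(state, jobs):
                print(f"supervisor disables '{event}' in state '{state}'")
                continue
            state = PLANT[(state, event)]
            jobs = 0 if event == "maintain" else jobs + (event == "start")
            print(f"event '{event}' -> state '{state}'")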

  19. Mesoscale Simulations of a Wind Ramping Event for Wind Energy Prediction

    SciTech Connect

    Rhodes, M; Lundquist, J K

    2011-09-21

    Ramping events, or rapid changes of wind speed and wind direction over a short period of time, present challenges to power grid operators in regions with significant penetrations of wind energy in the power grid portfolio. Improved predictions of wind power availability require adequate predictions of the timing of ramping events. For the ramping event investigated here, the Weather Research and Forecasting (WRF) model was run at three horizontal resolutions in 'mesoscale' mode: 8100m, 2700m, and 900m. Two Planetary Boundary Layer (PBL) schemes, the Yonsei University (YSU) and Mellor-Yamada-Janjic (MYJ) schemes, were run at each resolution as well. Simulations were not 'tuned' with nuanced choices of vertical resolution or tuning parameters so that these simulations may be considered 'out-of-the-box' tests of a numerical weather prediction code. Simulations are compared with sodar observations during a wind ramping event at a 'West Coast North America' wind farm. Despite differences in the boundary-layer schemes, no significant differences were observed in the abilities of the schemes to capture the timing of the ramping event. As collaborators have identified, the boundary conditions of these simulations probably dominate the physics of the simulations. They suggest that future investigations into characterization of ramping events employ ensembles of simulations, and that the ensembles include variations of boundary conditions. Furthermore, the failure of these simulations to capture not only the timing of the ramping event but the shape of the wind profile during the ramping event (regardless of its timing) indicates that the set-up and execution of such simulations for wind power forecasting requires skill and tuning of the simulations for a specific site.

  20. Aided targeting system simulation evaluation

    NASA Technical Reports Server (NTRS)

    Demaio, Joe; Becker, Curtis

    1994-01-01

    Simulation research was conducted at the Crew Station Research and Development Facility on the effectiveness and ease of use of three targeting systems. A manual system required the aviator to scan a target array area with a simulated second generation forward looking infrared (FLIR) sensor, locate and categorize targets, and construct a target hand-off list. The interface between the aviator and the system was like that of an advanced scout helicopter (manual mode). Two aided systems detected and categorized targets automatically. One system used only the FLIR sensor and the second used FLIR fused with Longbow radar. The interface for both was like that of an advanced scout helicopter aided mode. Exposure time while performing the task was reduced substantially with the aided systems, with no loss of target hand-off list accuracy. The fused sensor system showed lower time to construct the target hand-off list and a slightly lower false alarm rate than the other systems. A number of issues regarding system sensitivity and criterion, and operator interface design are discussed.

  1. LCG MCDB—a knowledgebase of Monte-Carlo simulated events

    NASA Astrophysics Data System (ADS)

    Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.

    2008-02-01

    In this paper we report on LCG Monte-Carlo Data Base (MCDB) and software which has been developed to operate MCDB. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC Collaborations by experts. In many cases, the modern Monte-Carlo simulation of physical processes requires expert knowledge in Monte-Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make the sophisticated MC event samples available for various physical groups. All the data from MCDB is accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project. Program summary. Program title: LCG Monte-Carlo Data Base. Catalogue identifier: ADZX_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public Licence. No. of lines in distributed program, including test data, etc.: 30 129. No. of bytes in distributed program, including test data, etc.: 216 943. Distribution format: tar.gz. Programming language: Perl. Computer: CPU: Intel Pentium 4, RAM: 1 Gb, HDD: 100 Gb. Operating system: Scientific Linux CERN 3/4. RAM: 1 073 741 824 bytes (1 Gb). Classification: 9. External routines: perl >= 5.8.5; Perl modules DBD-mysql >= 2.9004, File::Basename, GD::SecurityImage, GD::SecurityImage::AC, Linux::Statistics, XML::LibXML > 1.6, XML::SAX, XML::NamespaceSupport; Apache HTTP Server >= 2.0.59; mod auth external >= 2.2.9; edg-utils-system RPM package; gd >= 2.0.28; rpm package CASTOR-client >= 2.1.2-4; arc-server (optional). Nature of problem: Often, different groups of experimentalists prepare similar samples of particle collision events or turn to the same group of authors of Monte-Carlo (MC

  2. Simple Movement and Detection in Discrete Event Simulation

    DTIC Science & Technology

    2005-12-01

    with a description of uniform linear motion in the following section. We will then consider the simplest kind of sensing, the "cookie-cutter." A cookie-cutter sensor sees everything that is within its range R, and must be notified at the precise time a target enters its range. In a time-stepped simulation, cookie-cutter detection is very easy: simply compute the distance between the sensor and the target at each time step. If the target is
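
    For uniform linear motion, the precise entry time needs no time stepping: with relative position p and relative velocity v, the target crosses the sensor boundary when |p + vt| = R, a quadratic in t. A sketch in Python (coordinates and values are illustrative):

        import math

        def entry_time(px, py, vx, vy, R):
            """Earliest t >= 0 at which a target with relative position (px, py)
            and relative velocity (vx, vy) enters a cookie-cutter of range R.
            Solves |p + v t|^2 = R^2: (v.v) t^2 + 2 (p.v) t + (p.p - R^2) = 0."""
            a = vx * vx + vy * vy
            b = 2.0 * (px * vx + py * vy)
            c = px * px + py * py - R * R
            if c <= 0.0:
                return 0.0                     # target already inside the range
            if a == 0.0:
                return None                    # no relative motion: never enters
            disc = b * b - 4.0 * a * c
            if disc < 0.0:
                return None                    # closest approach stays outside R
            t = (-b - math.sqrt(disc)) / (2.0 * a)   # smaller root = boundary entry
            return t if t >= 0.0 else None     # crossing lies in the past

        t = entry_time(px=10.0, py=2.0, vx=-1.0, vy=0.0, R=4.0)
        print(f"target enters sensor range at t = {t:.3f}" if t is not None else "no detection")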

  3. Model Learning for Probabilistic Simulation on Rare Events and Scenarios

    DTIC Science & Technology

    2015-03-06

    significant contributions to risk assessment of natural and/or artificial disasters. Not only to this application, the technique developed in this ... for the objective (1) through its application to risk assessment of severe river floods, which are a representative but rare natural disaster in ... the developed methods to actual applications such as simulations of rare mega-disaster scenarios. (5) Practical demonstrations of the adapted

  4. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    PubMed

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2016-04-08

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
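
    The triggering mechanism can be illustrated independently of the HDP learning machinery: the control input is held constant until the gap between the current state and the state at the last update violates a trigger condition. A linear toy system in Python, with a simple static threshold rule rather than the adaptive threshold derived in the paper:

        import numpy as np

        A = np.array([[1.0, 0.1], [0.0, 0.95]])   # plant: x_{k+1} = A x_k + B u_k
        B = np.array([[0.0], [0.1]])
        K = np.array([[1.0, 2.0]])                # stabilizing feedback gain (assumed given)

        x = np.array([[1.0], [0.0]])
        x_event = x.copy()                        # state sampled at the last trigger
        u = -K @ x_event
        updates = 0
        for k in range(100):
            # trigger condition: update the control law only when the gap is large
            if np.linalg.norm(x - x_event) > 0.05 * np.linalg.norm(x):
                x_event = x.copy()
                u = -K @ x_event
                updates += 1
            x = A @ x + B @ u                     # control is held between events
        print(f"|x| = {np.linalg.norm(x):.4f} after 100 steps with {updates} control updates")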

  5. Medical simulation topic interests in a pediatric healthcare system.

    PubMed

    Deutsch, Ellen S; Olivieri, Jason J; Hossain, Jobayer; Sobolewski, Heather L

    2010-10-01

    Encouraged by evidence suggesting that simulation-based educational programs can translate to improved patient care, interest and investment in medical simulation have grown considerably in the past decade. Despite such growth, simulation is still a developing field, and little is known of the perceived needs for simulation training among practicing clinicians. This study describes medical simulation interests among clinicians in a pediatric health care system. A web-based survey addressing previous medical simulation experience, medical simulation interests, and demographics was distributed to physicians, nurses, and respiratory therapists within a pediatric healthcare system in the Delaware Valley. All three groups expressed the highest level of interest in simulated resuscitation events ("mock codes") and the least interest in simulations involving communication and electronic medical records. "Airway problems" was identified as the most popular medical simulation topic of interest. Although the rank order of interest in medical simulation categories was similar across groups, physicians reported the lowest levels of interest in all simulation categories. Characteristics such as previous simulation experience and group (eg, nurses and respiratory therapists) were associated with medical simulation interests. Years in practice did not impact interest. For adult learners, educational experiences should be targeted to the learners' perceived needs but should also address unrecognized deficits. Collectively, physicians, nurses, and respiratory therapists were most interested in participating in simulations addressing "codes" (emergency resuscitations) and airway management; these perceptions may provide a focus for designing simulation events that appeal to diverse learning styles. Prior experience with medical simulation seems to increase interest in subsequent simulation activities and offers the optimistic possibility that first-hand experience with simulation

  6. Simulation of January 1-7, 1978 events

    NASA Technical Reports Server (NTRS)

    Chao, J. K.; Moldwin, M. B.; Akasofu, S.-I.

    1987-01-01

    The solar wind disturbances of January 1 to 7, 1978 are reconstructed by a modeling method. First, the interplanetary magnetic field (IMF) background pattern, including a corotating shock, is reproduced using the Stanford source surface map. Then, two solar flares with their onset times on January 1, 0717 UT at S17 deg E10 deg and 2147 UT S17 deg E32 deg, respectively, are selected to generate two interplanetary transient shocks. It is shown that these two shocks interacted with the corotating shock, resulting in a series of interplanetary events observed by four spacecraft, Helios 1 and 2, IMP-8 (Interplanetary Monitoring Platform 8), and Voyager 2. Results show that these three shock waves interact and coalesce in interplanetary space such that Helios 2 and Voyager 2 observed only one shock and Helios 1 and IMP-8 observed two shocks. All shocks observed by the four spacecraft, except the corotating shock at Helios 1, are either a transient shock or a shock which is formed from coalescing of the transient shocks with the corotating shock. The method is useful in reconstructing a very complicated chain of interplanetary events observed by a number of spacecraft.

  7. Simulation of LHC events on a million threads

    NASA Astrophysics Data System (ADS)

    Childers, J. T.; Uram, T. D.; LeCompte, T. J.; Papka, M. E.; Benjamin, D. P.

    2015-12-01

    Demand for Grid resources is expected to double during LHC Run II as compared to Run I; the capacity of the Grid, however, will not double. The HEP community must consider how to bridge this computing gap by targeting larger compute resources and using the available compute resources as efficiently as possible. Argonne's Mira, the fifth fastest supercomputer in the world, can run roughly five times the number of parallel processes that the ATLAS experiment typically uses on the Grid. We ported Alpgen, a serial x86 code, to run as a parallel application under MPI on the Blue Gene/Q architecture. By analysis of the Alpgen code, we reduced the memory footprint to allow running 64 threads per node, utilizing the four hardware threads available per core on the PowerPC A2 processor. Event generation and unweighting, typically run as independent serial phases, are coupled together in a single job in this scenario, reducing intermediate writes to the filesystem. By these optimizations, we have successfully run LHC proton-proton physics event generation at the scale of a million threads, filling two-thirds of Mira.

  8. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  9. An intelligent simulation training system

    NASA Technical Reports Server (NTRS)

    Biegel, John E.

    1990-01-01

    The Department of Industrial Engineering at the University of Central Florida, Embry-Riddle Aeronautical University and General Electric (SCSD) have been funded by the State of Florida to build an Intelligent Simulation Training System. The objective was and is to make the system generic except for the domain expertise. Researchers accomplished this objective in their prototype. The system is modularized and therefore it is easy to make any corrections, expansions or adaptations. The funding by the state of Florida has exceeded $3 million over the past three years and through the 1990 fiscal year. UCF has expended in excess of 15 work years on the project. The project effort has been broken into three major tasks. General Electric provides the simulation. Embry-Riddle Aeronautical University provides the domain expertise. The University of Central Florida has constructed the generic part of the system which is comprised of several modules that perform the tutoring, evaluation, communication, status, etc. The generic parts of the Intelligent Simulation Training Systems (ISTS) are described.

  10. Simulator verification techniques study. Integrated simulator self test system concepts

    NASA Technical Reports Server (NTRS)

    Montoya, G.; Wenglinski, T. H.

    1974-01-01

    Software and hardware requirements for implementing hardware self tests are presented in support of the development of training and procedures development simulators for the space shuttle program. Self test techniques for simulation hardware and the validation of simulation performance are stipulated. The requirements of an integrated simulator self test system are analyzed. Readiness tests, fault isolation tests, and incipient fault detection tests are covered.

  11. [Validation of an adverse event reporting system in primary care].

    PubMed

    de Lourdes Rojas-Armadillo, María; Jiménez-Báez, María Valeria; Chávez-Hernández, María Margarita; González-Fondón, Araceli

    2016-01-01

    Patient safety is a priority issue in health systems, due to the costs of harm, institutional weakening, loss of credibility, and the frustration of those who committed an error that resulted in an adverse event. There is no standardized instrument for recording, reporting, and analyzing sentinel or adverse events (AE) in primary care. Our aim was to design and validate a surveillance system for recording sentinel events, adverse events, and near-miss incidents in primary care. We reviewed systems for recording and reporting adverse events in primary care. We then proposed an instrument to record these events and to register faults in structure and process in primary health care units of the Instituto Mexicano del Seguro Social. We showed the VENCER-MF format to 35 subjects. Of them, 100% identified a failure in the care process, 90% recorded a sentinel event, 85% identified the cause of this event, and 75% suggested measures for avoiding the recurrence of adverse events. The instrument yielded a Cronbach's alpha of 0.6 (p = 0.03). The VENCER-MF instrument has good consistency for the identification of adverse events.

  12. A Risk Assessment System with Automatic Extraction of Event Types

    NASA Astrophysics Data System (ADS)

    Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula

    In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting weak signals of emerging risks as early as possible, ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operating on a template graph whose leaves are event types. EventSpotter is based on a general purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.

  13. Power system extreme event screening using graph partitioning

    SciTech Connect

    Lesieutre, Bernard C.; Roy, Sandip; Donde, Vaibhav; Pinar, Ali

    2006-09-06

    We propose a partitioning problem in a power system context that weighs the two objectives of minimizing cuts between partitions and maximizing the power imbalance between partitions. We then pose the problem in a purely graph theoretic sense. We offer an approximate solution through relaxation of the integer problem and suggest refinement using stochastic methods. Results are presented for the IEEE 30-bus and 118-bus electric power systems.
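
    The standard template for relaxing such an integer partitioning problem is spectral: partition the buses by the sign of the Fiedler vector of the graph Laplacian. A generic Python sketch of that template, covering only the cut-minimizing objective and not the paper's power-imbalance term:

        import numpy as np

        def spectral_bipartition(adj):
            """Approximate min-cut bipartition: split by the sign of the Fiedler
            vector (eigenvector of the second-smallest Laplacian eigenvalue)."""
            laplacian = np.diag(adj.sum(axis=1)) - adj
            _, eigvecs = np.linalg.eigh(laplacian)   # eigh returns ascending eigenvalues
            fiedler = eigvecs[:, 1]
            return fiedler >= 0.0                    # boolean side label per bus

        # toy 6-bus network: two triangles joined by a single tie line (2-3)
        adj = np.zeros((6, 6))
        for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
            adj[i, j] = adj[j, i] = 1.0
        side = spectral_bipartition(adj)
        print("partition A:", np.flatnonzero(side), "partition B:", np.flatnonzero(~side))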

  14. Towards a Self-Consistent Simulation Capability of Catastrophic Solar Energetic Particle Events

    NASA Astrophysics Data System (ADS)

    Sokolov, I.; Gombosi, T. I.; Bindi, V.; Borovikov, D.; Kota, J.; Giacalone, J.

    2016-12-01

    Space weather refers to variations in the space environment that can affect technologies or endanger human life and health. Solar energetic particle (SEP) events can affect communications and airline safety. Satellites are affected by radiation damage to electronics and to components that produce power and provide images. Sun and star sensors are blinded during large SEP events. Protons of ≳30 MeV penetrate spacesuits and spacecraft walls. Events like that of August 4, 1972 would have been fatal to moon-walking astronauts. Catastrophic events typically are characterized by hard particle energy spectra potentially containing large fluxes of hundreds-of-MeV to GeV particles. These super-energetic particles can penetrate even into the "safest" areas of spacecraft and produce induced radioactivity. We describe several technologies which are to be combined into a physics-based, self-consistent model to understand and forecast the origin and evolution of SEP events. The Alfvén Wave Solar-wind Model (AWSoM) simulates the chromosphere-to-Earth system using separate electron and ion temperatures and separate parallel and perpendicular temperatures. It solves the energy equations including thermal conduction and coronal heating by Alfvén wave turbulence. It uses adaptive mesh refinement (AMR), which allows us to cover a broad range of spatial scales. The Eruptive Event Generator using the Gibson-Low flux-rope model (EEGGL) allows the user to select an active region on the Sun, select the polarity inversion line where the eruption is observed, and insert a Gibson-Low flux rope to produce the eruption. The Multiple-Field-Lines-Advection Model for Particle Acceleration (M-FLAMPA) solves the particle transport equation along a multitude of interplanetary magnetic field lines originating from the Sun, using time-dependent parameters for the shock and magnetic field obtained from the MHD simulation. It includes a self-consistent coupling of Alfvén wave turbulence to the SEPs

  15. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again," as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  16. Stochastic simulation in systems biology.

    PubMed

    Székely, Tamás; Burrage, Kevin

    2014-11-01

    Natural systems are, almost by definition, heterogeneous: this can be either a boon or an obstacle to be overcome, depending on the situation. Traditionally, when constructing mathematical models of these systems, heterogeneity has typically been ignored, despite its critical role. However, in recent years, stochastic computational methods have become commonplace in science. They are able to appropriately account for heterogeneity; indeed, they are based around the premise that systems inherently contain at least one source of heterogeneity (namely, intrinsic heterogeneity). In this mini-review, we give a brief introduction to theoretical modelling and simulation in systems biology and discuss the three different sources of heterogeneity in natural systems. Our main topic is an overview of stochastic simulation methods in systems biology. There are many different types of stochastic methods. We focus on one group that has become especially popular in systems biology, biochemistry, chemistry and physics. These discrete-state stochastic methods do not follow individuals over time; rather they track only total populations. They also assume that the volume of interest is spatially homogeneous. We give an overview of these methods, with a discussion of the advantages and disadvantages of each, and suggest when each is more appropriate to use. We also include references to software implementations of them, so that beginners can quickly start using stochastic methods for practical problems of interest.

  17. Stochastic simulation in systems biology

    PubMed Central

    Székely, Tamás; Burrage, Kevin

    2014-01-01

    Natural systems are, almost by definition, heterogeneous: this can be either a boon or an obstacle to be overcome, depending on the situation. Traditionally, when constructing mathematical models of these systems, heterogeneity has typically been ignored, despite its critical role. However, in recent years, stochastic computational methods have become commonplace in science. They are able to appropriately account for heterogeneity; indeed, they are based around the premise that systems inherently contain at least one source of heterogeneity (namely, intrinsic heterogeneity). In this mini-review, we give a brief introduction to theoretical modelling and simulation in systems biology and discuss the three different sources of heterogeneity in natural systems. Our main topic is an overview of stochastic simulation methods in systems biology. There are many different types of stochastic methods. We focus on one group that has become especially popular in systems biology, biochemistry, chemistry and physics. These discrete-state stochastic methods do not follow individuals over time; rather they track only total populations. They also assume that the volume of interest is spatially homogeneous. We give an overview of these methods, with a discussion of the advantages and disadvantages of each, and suggest when each is more appropriate to use. We also include references to software implementations of them, so that beginners can quickly start using stochastic methods for practical problems of interest. PMID:25505503
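
    The discrete-state stochastic methods reviewed here track total populations and draw the waiting time to the next reaction from the current propensities; the canonical member of the family is Gillespie's direct method. A minimal Python version for a birth-death process (rate constants are illustrative):

        import random

        def gillespie_birth_death(k_birth, k_death, x0, t_end):
            """Gillespie direct method for X -> X+1 (rate k_birth) and
            X -> X-1 (rate k_death * X); returns the sampled trajectory."""
            t, x = 0.0, x0
            trajectory = [(t, x)]
            while t < t_end:
                a1, a2 = k_birth, k_death * x      # reaction propensities
                a0 = a1 + a2
                if a0 == 0.0:
                    break                          # no reaction can fire
                t += random.expovariate(a0)        # exponential waiting time
                x += 1 if random.random() * a0 < a1 else -1
                trajectory.append((t, x))
            return trajectory

        traj = gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0)
        print(f"{len(traj) - 1} reaction events; final population {traj[-1][1]} "
              f"(stationary mean k_birth/k_death = {10.0 / 0.1:.0f})")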

  18. Assessment of extreme precipitation events over Amazon simulated by global climate models from HIGEM family

    NASA Astrophysics Data System (ADS)

    Custodio, M. D. S.; Ambrizzi, T.; Da Rocha, R.

    2015-12-01

    The increased horizontal resolution of climate models aims to improve simulation accuracy and to help understand the non-linear processes during interactions between different spatial scales within the climate system. Until now, these interactions have not been well represented in low-horizontal-resolution GCMs. Variations of extreme climatic events have been described and analyzed in the scientific literature. In a scenario of global warming it is necessary to understand and explain extreme events and to know whether global models can represent them. The purpose of this study was to understand the impact of horizontal resolution in the high-resolution coupled and atmospheric global models of the HiGEM project in simulating atmospheric patterns and processes of interaction between spatial scales, and to evaluate the performance of coupled and uncoupled versions of the High-Resolution Global Environmental Model in capturing the signal of interannual and intraseasonal variability of precipitation over the Amazon region. The results indicated that the grid refinement and ocean-atmosphere coupling contribute to a better representation of seasonal patterns of both precipitation and temperature over the Amazon region. Moreover, the climatic models analyzed represent the climatic characteristics of this region better than other models (regional and global), indicating a breakthrough in the development of high-resolution climate models. Both coupled and uncoupled models capture the observed signal of the ENSO and MJO oscillations, although with reversed phase in some cases. The interannual variability analysis showed that coupled simulations intensify the impact of ENSO in the Amazon. On the intraseasonal scale, although the simulations intensify this signal, the coupled models present larger similarities with observations than the atmospheric models for precipitation extremes. The simulation of ENSO in GCMs can be attributed to their high

  19. StratBAM: A Discrete-Event Simulation Model to Support Strategic Hospital Bed Capacity Decisions.

    PubMed

    Devapriya, Priyantha; Strömblad, Christopher T B; Bailey, Matthew D; Frazier, Seth; Bulger, John; Kemberling, Sharon T; Wood, Kenneth E

    2015-10-01

    The ability to accurately measure and assess current and potential health care system capacities is an issue of local and national significance. Recent joint statements by the Institute of Medicine and the Agency for Healthcare Research and Quality have emphasized the need to apply industrial and systems engineering principles to improving health care quality and patient safety outcomes. To address this need, a decision support tool was developed for planning and budgeting of current and future bed capacity, and evaluating potential process improvement efforts. The Strategic Bed Analysis Model (StratBAM) is a discrete-event simulation model created after a thorough analysis of patient flow and data from Geisinger Health System's (GHS) electronic health records. Key inputs include: timing, quantity and category of patient arrivals and discharges; unit-level length of care; patient paths; and projected patient volume and length of stay. Key outputs include: admission wait time by arrival source and receiving unit, and occupancy rates. Electronic health records were used to estimate parameters for probability distributions and to build empirical distributions for unit-level length of care and for patient paths. Validation of the simulation model against GHS operational data confirmed its ability to model real-world data consistently and accurately. StratBAM was successfully used to evaluate the system impact of forecasted patient volumes and length of stay in terms of patient wait times, occupancy rates, and cost. The model is generalizable and can be appropriately scaled for larger and smaller health care settings.
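
    The abstract does not include model code; as a rough sketch of the mechanics such a bed-capacity model rests on, the following toy discrete-event loop simulates a single unit with Poisson arrivals, exponential length of stay, and a FIFO admission queue. All parameter values are hypothetical, and StratBAM itself is far richer (multiple units, empirical distributions, patient paths).

      import heapq, random

      def simulate_unit(beds, arrival_rate, mean_los, horizon):
          """One-unit bed model: Poisson arrivals, exponential length of
          stay, FIFO queue when all beds are full; returns the mean wait."""
          events = [(random.expovariate(arrival_rate), "arrival")]
          occupied, queue, waits = 0, [], []
          while events:
              t, kind = heapq.heappop(events)
              if t > horizon:
                  break
              if kind == "arrival":
                  heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
                  if occupied < beds:
                      occupied += 1
                      waits.append(0.0)
                      heapq.heappush(events, (t + random.expovariate(1.0 / mean_los), "discharge"))
                  else:
                      queue.append(t)            # no free bed: patient waits
              else:                              # discharge frees a bed
                  occupied -= 1
                  if queue:                      # admit the longest-waiting patient
                      occupied += 1
                      waits.append(t - queue.pop(0))
                      heapq.heappush(events, (t + random.expovariate(1.0 / mean_los), "discharge"))
          return sum(waits) / len(waits) if waits else 0.0

      print(simulate_unit(beds=20, arrival_rate=2.0, mean_los=8.0, horizon=10000.0))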

  20. An event generator for simulations of complex β-decay experiments

    NASA Astrophysics Data System (ADS)

    Jordan, D.; Algora, A.; Tain, J. L.

    2016-08-01

    This article describes a Monte Carlo event generator for the design, optimization and performance characterization of beta-decay spectroscopy experimental set-ups. The event generator has been developed within the Geant4 simulation architecture and provides new features and greater flexibility in comparison with the currently available decay generator.

  1. Transition Path Sampling and Other Advanced Simulation Techniques for Rare Events

    NASA Astrophysics Data System (ADS)

    Dellago, Christoph; Bolhuis, Peter G.

    Computer simulations of molecular processes such as nucleation in first-order phase transitions or the folding of a protein are often complicated by widely disparate time scales related to important but rare events. Here, we review several recently developed computational methods designed to address the rare-events problem. In doing so, we focus on the transition path sampling methodology.

  2. Tsunami simulations for historical and plausible mega-thrust events originating in the Eastern Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Valle, Brett; Kalligeris, Nikos; Findikakis, Angelos; Okal, Emile; Synolakis, Costas

    2013-04-01

    Tsunamis have been reported at rates of one to two per year in the Mediterranean Sea, on average, over the past 2000 years (Ambraseys and Synolakis, 2010). However, quantification of tsunami hazards in the Eastern Mediterranean Sea remains difficult, as large events are infrequent. Simulations were performed for a series of seismic events originating along the Eastern Hellenic Arc and Western Cyprian Arc. The locations and source characteristics represent plausible mega-thrust events similar to historical events along the Hellenic Arc, including the 365 AD and 1303 AD events. Sensitivity simulations were performed to address uncertainty in the location and source characteristics of the 1303 AD event, and in consideration of potential future events originating along the Eastern Hellenic Arc. Sensitivity simulations were also performed for events originating along the Western Cyprian Arc. The hydrodynamic simulations used a series of codes known as the Method of Splitting Tsunami (MOST) (Titov and Synolakis, 1998). Reported results include wave propagation in the Eastern Mediterranean and tsunami inundation near Alexandria, Egypt, and for neighboring coastlines. References: Ambraseys, N. and C.E. Synolakis (2010), Tsunami Catalogs for the Eastern Mediterranean, Revisited, Journal of Earthquake Engineering 14(3): 309-330; and Titov V.V. and C.E. Synolakis (1998), 'Numerical modeling of tidal wave runup,' J. Waterw. Port Coast. Ocean Eng. 124(4): 157-171.

  3. Representing Ground Robotic Systems in Battlefield Simulations

    DTIC Science & Technology

    2002-08-01

    representations of intelligent system performance for its battlefield simulation tools. These simulation tools differ considerably in their level of ... simulation study, 2) the overall fidelity of the target simulation tool, and 3) the elements of the robotic system that are relevant to the ... simulation study. In this paper, we discuss a framework for modeling robotic system performance in the context of a battlefield simulation tool. We apply

  4. Calculation of 239Pu fission observables in an event-by-event simulation

    SciTech Connect

    Vogt, R; Randrup, J; Pruet, J; Younes, W

    2010-03-31

    The increased interest in more exclusive fission observables has demanded more detailed models. We describe a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including any interesting correlations. The various model assumptions are described and the potential utility of the model is illustrated. As a concrete example, we use formal statistical methods, experimental data on neutron production in neutron-induced fission of 239Pu, along with FREYA, to develop quantitative insights into the relation between reaction observables and detailed microscopic aspects of fission. Current measurements of the mean number of prompt neutrons emitted in fission, taken together with less accurate current measurements of the prompt post-fission neutron energy spectrum, up to the threshold for multi-chance fission, place remarkably fine constraints on microscopic theories.

  5. Time accuracy of a barcode system for recording resuscitation events: laboratory trials.

    PubMed

    Stewart, J A; Short, F A

    1999-11-01

    Barcode systems for recording clinical data from resuscitation attempts offer the prospect of more complete and time-accurate data collection; in addition, collection of data in digital form and the resulting ease of computer processing promise to facilitate data analysis for quality improvement and research. We conducted trials of such a barcode system, recording events during a videotaped, simulated in-hospital resuscitation, with particular attention to time accuracy. Nine subjects watched a videotape of a simulated cardiac resuscitation, recording events first with the barcode system and then with a conventional handwritten form. Recorded times were compared to an accurate record of events (gold standard) from the videotape. Mean absolute errors and standard deviations of errors from the gold standard were significantly smaller with the barcode system (P < 0.01 for both). Numbers of event omissions did not differ significantly. The barcode system is more accurate than conventional handwritten recording in capturing event times from a simulated resuscitation. The system shows promise as a means to improve the time accuracy of resuscitation records.

  6. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and many have now been ported to microcomputers. Graphics and animation capabilities were added to many of these languages to assist users in building models and evaluating simulation results. With all these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then validate the simulation model is always greater than originally anticipated. One approach to minimizing the time requirement is to use pre-defined macros that describe common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the assistant, its elements, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.

  7. Experience producing simulated events for the DZero experiment on the SAM-Grid

    SciTech Connect

    Garzoglio, G.; Terekhov, I.; Snow, J.; Jain, S.; Nishandar, A.; /Texas U., Arlington

    2004-12-01

    Most of the simulated events for the DZero experiment at Fermilab have historically been produced by the "remote" collaborating institutions. One of the principal challenges reported concerns the maintenance of the local software infrastructure, which generally differs from site to site. As the distributed computing community's understanding of distributively owned and shared resources progresses, the adoption of grid technologies to address the production of Monte Carlo events for high energy physics experiments becomes increasingly interesting. SAM-Grid is a software system developed at Fermilab which integrates standard grid technologies for job and information management with SAM, the data handling system of the DZero and CDF experiments. During the past few months, this grid system has been tailored for the Monte Carlo production of DZero. Since the initial phase of deployment, this experience has exposed an interesting series of requirements to the SAM-Grid services, the standard middleware, the resources and their management, and the analysis framework of the experiment. As of today, the inefficiency due to the grid infrastructure has been reduced to as little as 1%. In this paper, we present our statistics and the "lessons learned" in running large high energy physics applications on a grid infrastructure.

  8. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering at the Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. The propulsion system design environment is created with commercially available software called iSIGHT, a generic computational framework, together with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and to facilitate information transfer among the various codes. The resulting modeling and simulation (M&S) environment, in conjunction with the response surface method, provides the designer/decision-maker an analytical means to examine the entire design space from either a subsystem or a system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.

  9. Discrete-Event Simulation Modeling of the Repairable Inventory Process to Enhance the ARGCS Business Case Analysis

    DTIC Science & Technology

    2006-12-01

    NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. MBA Professional Report: Discrete-Event Simulation Modeling of the Repairable ... TYPE AND DATES COVERED: MBA Professional Report. 4. TITLE AND SUBTITLE: Discrete-Event Simulation Modeling of the Repairable Inventory Process to ... Advanced Concept Technology Demonstration; Agile Rapid Global Combat Support; Discrete-Event Simulation Modeling of the Repairable Inventory Process to

  10. Extreme events in excitable systems and mechanisms of their generation.

    PubMed

    Ansmann, Gerrit; Karnatak, Rajat; Lehnertz, Klaus; Feudel, Ulrike

    2013-11-01

    We study deterministic systems, composed of excitable units of FitzHugh-Nagumo type, that are capable of self-generating and self-terminating strong deviations from their regular dynamics without the influence of noise or parameter change. These deviations are rare, short-lasting, and recurrent and can therefore be regarded as extreme events. Employing a range of methods we analyze dynamical properties of the systems, identifying features in the systems' dynamics that may qualify as precursors to extreme events. We investigate these features and elucidate mechanisms that may be responsible for the generation of the extreme events.
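
    For readers unfamiliar with the underlying units, a single uncoupled FitzHugh-Nagumo oscillator can be integrated with a few lines of Python. The parameter values below are common textbook choices, not the ones used in the paper, which couples many such units to produce the extreme events:

      def fhn_step(v, w, current, dt=0.01, a=0.7, b=0.8, eps=0.08):
          """One Euler step of a FitzHugh-Nagumo unit:
          dv/dt = v - v^3/3 - w + I,  dw/dt = eps*(v + a - b*w)."""
          dv = v - v ** 3 / 3.0 - w + current
          dw = eps * (v + a - b * w)
          return v + dt * dv, w + dt * dw

      v, w, excursions = -1.2, -0.6, []
      for step in range(200000):
          v, w = fhn_step(v, w, current=0.5)
          if v > 1.5:                  # crude marker for a large excursion
              excursions.append(step * 0.01)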

  11. Accuracy of harm scores entered into an event reporting system.

    PubMed

    Abbasi, Toni; Adornetto-Garcia, Debra; Johnston, Patricia A; Segovia, Julie H; Summers, Barbara

    2015-04-01

    This quality improvement project evaluated the accuracy of harm scores entered into an event reporting system by inpatient nursing staff at a National Cancer Institute-designated comprehensive cancer center. Nurses scored 10 safety scenarios using 2 versions of the Agency for Healthcare Research and Quality scale to determine interrater reliability. Results indicated inconsistency in the way nurses scored the scenarios, suggesting that the event reporting system may not accurately portray the severity of harm in patient safety events. Nurse executives can use this information to guide the development and implementation of incident reporting systems.

  12. Reading Sky and Seeing a Cloud: On the Relevance of Events for Perceptual Simulation

    PubMed Central

    2016-01-01

    Previous research has shown that processing words with an up/down association (e.g., bird, foot) can influence the subsequent identification of visual targets in a congruent location (at the top/bottom of the screen). However, as facilitation and interference were found under similar conditions, the nature of the underlying mechanisms remained unclear. We propose that word comprehension relies on the perceptual simulation of a prototypical event involving the entity denoted by a word, in order to provide a general account of the different findings. In 3 experiments, participants had to discriminate between 2 target pictures appearing at the top or the bottom of the screen by pressing the left versus right button. Immediately before the targets appeared, they saw an up/down word belonging to the target's event, an up/down word unrelated to the target, or a spatially neutral control word. Prime words belonging to the target's event facilitated identification of targets at a stimulus onset asynchrony (SOA) of 250 ms (Experiment 1), but only when presented in the vertical location where they are typically seen, indicating that targets were integrated into the simulations activated by the prime words. Moreover, at the same SOA, there was a robust facilitation effect for targets appearing in their typical location regardless of the prime type. However, when words were presented for 100 ms (Experiment 2) or 800 ms (Experiment 3), only a location-nonspecific priming effect was found, suggesting that the visual system was not activated. Implications for theories of semantic processing are discussed. PMID:27762581

  13. Markov modeling and discrete event simulation in health care: a systematic comparison.

    PubMed

    Standfield, Lachlan; Comans, Tracy; Scuffham, Paul

    2014-04-01

    The aim of this study was to assess if the use of Markov modeling (MM) or discrete event simulation (DES) for cost-effectiveness analysis (CEA) may alter healthcare resource allocation decisions. A systematic literature search and review of empirical and non-empirical studies comparing MM and DES techniques used in the CEA of healthcare technologies was conducted. Twenty-two pertinent publications were identified. Two publications compared MM and DES models empirically, one presented a conceptual DES and MM, two described a DES consensus guideline, and seventeen drew comparisons between MM and DES through the authors' experience. The primary advantages described for DES over MM were the ability to model queuing for limited resources, capture individual patient histories, accommodate complexity and uncertainty, represent time flexibly, model competing risks, and accommodate multiple events simultaneously. The disadvantages of DES over MM were the potential for model overspecification, increased data requirements, specialized expensive software, and increased model development, validation, and computational time. Where individual patient history is an important driver of future events an individual patient simulation technique like DES may be preferred over MM. Where supply shortages, subsequent queuing, and diversion of patients through other pathways in the healthcare system are likely to be drivers of cost-effectiveness, DES modeling methods may provide decision makers with more accurate information on which to base resource allocation decisions. Where these are not major features of the cost-effectiveness question, MM remains an efficient, easily validated, parsimonious, and accurate method of determining the cost-effectiveness of new healthcare interventions.
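
    The contrast drawn here is easy to see in miniature. A Markov cohort model tracks only state-occupancy proportions under memoryless transitions, as in the hypothetical three-state sketch below, whereas a DES would instead track each individual's history (compare the bed-capacity and staffing examples elsewhere in this list):

      # Hypothetical three-state Markov cohort model (Well -> Sick -> Dead)
      # with yearly cycles and memoryless transition probabilities.
      P = [[0.90, 0.08, 0.02],
           [0.00, 0.85, 0.15],
           [0.00, 0.00, 1.00]]

      cohort = [1.0, 0.0, 0.0]          # the whole cohort starts in Well
      for year in range(20):
          cohort = [sum(cohort[i] * P[i][j] for i in range(3)) for j in range(3)]
      print(cohort)                     # state occupancy after 20 cycles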

  14. A Study on Modeling Approaches in Discrete Event Simulation Using Design Patterns

    DTIC Science & Technology

    2007-12-01

    Port Security (FPPS) ... Figure 52: Event-graph Logic of the FPPS Simulation Application ... Figure 53: FPPS Simulation ... module. For the random-utility package, a discussion of empirical analysis is conducted. Finally, this chapter looks at how the FPPS application has been

  15. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    NASA Astrophysics Data System (ADS)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in event-based systems can lower system efficiency in terms of both data exchange rate and performance. In this paper, an integral-based event-triggered control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise, effectively reduces the flow of communication between plant and controller, and also improves output performance. Using a Lyapunov approach, stability in the mean square sense is proved. A simulation example illustrates the properties of our approach.
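
    The flavor of event-triggered control is easy to demonstrate. The sketch below uses a simplified static-threshold trigger on a scalar plant, not the integral-based rule proposed in the paper (which accumulates the measurement error over time); all numbers are hypothetical:

      import random

      # Scalar plant x[k+1] = a*x[k] + b*u[k] + noise with u = -K * x_hat,
      # where x_hat is the last transmitted state. The static threshold
      # below stands in for the paper's integral-based trigger.
      a, b, K, thresh = 1.1, 1.0, 0.6, 0.05
      x, x_hat, sent = 1.0, 1.0, 0
      for k in range(500):
          if abs(x - x_hat) > thresh:   # trigger fires: transmit a fresh measurement
              x_hat, sent = x, sent + 1
          u = -K * x_hat
          x = a * x + b * u + random.gauss(0.0, 0.01)
      print("transmissions:", sent, "of 500 steps")

    With the stable closed loop (a - b*K = 0.5), the state is regulated while only a fraction of the 500 samples are ever transmitted, which is the data-rate saving the paper quantifies.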

  16. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    NASA Technical Reports Server (NTRS)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

    On February 1st, 2010, U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes for the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved to be controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period, an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty regarding the Constellation program presented a major challenge to the DES team: continue the development of this program-of-record simulation while at the same time remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make the model flexible enough to support possible future vehicles while remaining specific enough to support the program-of-record. The challenge was compounded by the fact that the model was being developed through the traditional DES process-orientation, which lacks the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure by identifying what was generic, finding natural logic break points, and standardizing the inter-logic numbering system. The outcome of this work was a model that not only was ready to be easily modified to support any future rocket programs but was also extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides builders of traditional process-oriented discrete

  17. Performance and efficiency of geotextile-supported erosion control measures during simulated rainfall events

    NASA Astrophysics Data System (ADS)

    Obriejetan, Michael; Rauch, Hans Peter; Florineth, Florin

    2013-04-01

    Erosion control systems consisting of technical and biological components are widely accepted and proven to work well if installed properly with regard to site-specific parameters. A wide range of implementation measures exists for this specific protection purpose, and new, in particular technical, solutions are constantly introduced to the market. Nevertheless, the vegetation aspects of erosion control measures are frequently disregarded and deserve greater consideration against the backdrop of developing and implementing adaptation strategies for an environment altered by climate-change-associated effects. Technical auxiliaries such as the geotextiles typically used for slope protection (nettings, blankets, turf reinforcement mats, etc.) address specific features, and owing to their structural and material diversity, differing effects on sediment yield, surface runoff and vegetation development seem evident. Nevertheless, there is a knowledge gap concerning the mutual interaction between technical and biological components, and specific comparable data on the erosion-reducing effects of technical-biological erosion protection systems are insufficient. In this context, an experimental arrangement was set up to study the correlated influences of geotextiles and vegetation and to determine their (combined) effects on surface runoff and soil loss during simulated heavy rainfall events. Sowing vessels filled with topsoil serve as testing facilities, fitted with various organic and synthetic geotextiles and sown with a reliable drought-resistant seed mixture. Regular vegetation monitoring as well as two rainfall simulation runs with four repetitions of each variant were conducted. A portable rainfall simulator with a standardized rainfall intensity of 240 mm h-1 and a three-minute rainfall duration was used to stress these systems at different stages of plant development at an inclination of 30 degrees. First results show

  18. Simulation System for Training in Laparoscopic Surgery

    NASA Technical Reports Server (NTRS)

    Basdogan, Cagatay; Ho, Chih-Hao

    2003-01-01

    A computer-based simulation system creates a visual and haptic virtual environment for training a medical practitioner in laparoscopic surgery. Heretofore, it has been common practice to perform training in partial laparoscopic surgical procedures by use of a laparoscopic training box that encloses a pair of laparoscopic tools, objects to be manipulated by the tools, and an endoscopic video camera. However, the surgical procedures simulated by use of a training box are usually poor imitations of the actual ones. The present computer-based system improves training by presenting a more realistic simulated environment to the trainee. The system includes a computer monitor that displays a real-time image of the affected interior region of the patient, showing laparoscopic instruments interacting with organs and tissues, as would be viewed by use of an endoscopic video camera and displayed to a surgeon during a laparoscopic operation. The system also includes laparoscopic tools that the trainee manipulates while observing the image on the computer monitor (see figure). The instrumentation on the tools consists of (1) position and orientation sensors that provide input data for the simulation and (2) actuators that provide force feedback to simulate the contact forces between the tools and tissues. The simulation software includes components that model the geometries of surgical tools, components that model the geometries and physical behaviors of soft tissues, and components that detect collisions between them. Using the measured positions and orientations of the tools, the software detects whether they are in contact with tissues. In the event of contact, the deformations of the tissues and contact forces are computed by use of the geometric and physical models. The image on the computer screen shows tissues deformed accordingly, while the actuators apply the corresponding forces to the distal ends of the tools. For the purpose of demonstration, the system has been set

  19. Canister Transfer System Event Sequence Calculation

    SciTech Connect

    Richard Morissette

    2001-08-16

    The "Department of Energy Spent Nuclear Fuel Canister, Transportation, and Monitored Geologic Repository Systems, Structures, and Components Performance Allocation Study" (CRWMS M&O 2000b) allocated performance to both the canisters received at the Monitored Geologic Repository (MGR) and the MGR Canister Transfer System (CTS). The purpose of this calculation is to evaluate an assumed range of canister and CTS performance allocation failure probabilities and determine the effect of these failure probabilities on the frequency of a radionuclide release. Five canister types are addressed in this calculation: high-level radioactive waste (HLW) canisters containing vitrified borosilicate glass, HLW canisters containing immobilized plutonium surrounded by borosilicate glass (Pu/HLW canisters), Department of Energy (DOE) spent nuclear fuel (DSNF) standard canisters (4 sizes), DSNF multi-canister overpacks (MCOs) for N-reactor fuel and other selected DSNF, and naval spent nuclear fuel (SNF) canisters (2 sizes). The quality assurance program applies to this calculation, and the work is performed in accordance with procedure AP-3.12Q, "Calculations". The work done for this calculation was evaluated according to AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities", which determined this activity to be subject to the requirements of DOE/RW-0333P, "Quality Assurance Requirements and Description" (DOE 2000a). This work was performed in accordance with the "Technical Work Plan for: Department of Energy Nuclear Fuel Work Packages" (CRWMS M&O 2000c) for this activity.

  20. Effect of simulation on nursing knowledge and critical thinking in failure to rescue events.

    PubMed

    Schubert, Carolyn R

    2012-10-01

    Failure to rescue events are hospital deaths that result from human error and unsafe patient conditions. A failure to rescue event implies that the last and best chance to avoid tragedy is not acted on in time to avoid a disaster. Patient safety is often compromised by nurses who do not perform accurate assessments (vigilance), do not detect clinical changes (surveillance), or do not display critical thinking (recognition that something is wrong). This project used simulation as a teaching strategy to enhance nursing performance. Medical-surgical nurses took part in a simulated failure to rescue event in which the patient's clinical condition deteriorated rapidly. Nursing knowledge and critical thinking improved after the simulation and showed the effectiveness of simulation as a teaching strategy to address nursing knowledge and critical thinking skills.

  1. A discrete event simulation tool to support and predict hospital and clinic staffing.

    PubMed

    DeRienzo, Christopher M; Shaw, Ryan J; Meanor, Phillip; Lada, Emily; Ferranti, Jeffrey; Tanaka, David

    2017-06-01

    We demonstrate how to develop a simulation tool to help healthcare managers and administrators predict and plan for staffing needs in a hospital neonatal intensive care unit using administrative data. We developed a discrete event simulation model of the nursing staff needed in a neonatal intensive care unit and then validated the model against historical data. The process flow was translated into a discrete event simulation model. Results demonstrated that the model can be used to give a respectable estimate of annual admissions, transfers, and deaths based upon two different staffing levels. The discrete event simulation tool can provide healthcare managers and administrators with (1) a valid method of modeling patient mix, patient acuity, staffing needs, and costs in the present state and (2) a forecast of how changes in a unit's staffing, referral patterns, or patient mix would affect the unit in a future state.

  2. An analysis of strong wind events simulated in a GCM near Casey in the Antarctic

    SciTech Connect

    Murphy, B.F.; Simmonds, I.

    1993-02-01

    Strong wind events occurring near Casey (Antarctica) in a long July GCM simulation have been studied to determine the relative roles played by the synoptic situation and the katabatic flow in producing these episodes. It was found that the events are associated with strong katabatic and strong gradient flow operating together. Both components are found to increase threefold on average during these strong winds, and although the geostrophic flow is the stronger, it rarely produces strong winds without the katabatic flow becoming stronger than its mean. The two wind components do not flow in the same direction; indeed, there is some cancellation between them, since the katabatic flow acts in a predominantly downslope direction, while the geostrophic wind acts across slope. The stronger geostrophic flow is associated with higher-than-average pressures over the continent and the approach of a strong cyclonic system toward the coast with a blocking system downstream. The anomalous synoptic patterns leading up to these occasions display a strong wavenumber-4 structure. The very strong katabatic flow appears to be related to the production of a supply of cold air inland from Casey by stronger-than-average surface temperature inversions inland a few days before the strong winds occur. The acceleration of this negatively buoyant air mass down the steep ice-sheet escarpment results in strong katabatic flow near the coast. 24 refs., 11 figs.

  3. Cascading events in linked ecological and socioeconomic systems

    USGS Publications Warehouse

    Peters, Debra P. C.; Sala, O.E.; Allen, C.D.; Covich, A.; Brunson, M.

    2007-01-01

    Cascading events that start at small spatial scales and propagate non-linearly through time to influence larger areas often have major impacts on ecosystem goods and services. Events such as wildfires and hurricanes are increasing in frequency and magnitude as systems become more connected through globalization processes. We need to improve our understanding of these events in order to predict their occurrence, minimize potential impacts, and allow for strategic recovery. Here, we synthesize information about cascading events in systems located throughout the Americas. We discuss a variety of examples of cascading events that share a common feature: they are often driven by linked ecological and human processes across scales. In this era of globalization, we recommend studies that explicitly examine connections across scales and examine the role of connectivity among non-contiguous as well as contiguous areas. © The Ecological Society of America.

  4. Solar system events at high spatial resolution

    SciTech Connect

    Baines, K H; Gavel, D T; Getz, A M; Gibbard, S G; MacIntosh, B; Max, C E; McKay, C P; Young, E F; de Pater, I

    1999-02-19

    Until relatively recent advances in technology, astronomical observations from the ground were limited in image resolution by the blurring effects of Earth's atmosphere. The blur extent, ranging typically from 0.5 to 2 seconds of arc at the best astronomical sites, precluded ground-based observations of the details of the solar system's moons, asteroids, and outermost planets. With the maturing of a high-resolution image processing technique called speckle imaging, the resolution limitation of the atmosphere can now be largely overcome. Over the past three years we have used speckle imaging to observe Titan, a moon of Saturn with an atmospheric density comparable to Earth's; Io, the volcanically active innermost moon of Jupiter; and Neptune, a gas giant outer planet with continually changing planet-encircling storms. These observations were made at the world's largest telescope, the Keck telescope in Hawaii, and represent the highest-resolution infrared images of these objects ever taken.

  5. Simulation of linear mechanical systems

    NASA Technical Reports Server (NTRS)

    Sirlin, S. W.

    1993-01-01

    A dynamics and controls analyst is typically presented with a structural dynamics model and must perform various input/output tests and design control laws. The required time/frequency simulations need to be done many times as models change and control designs evolve. This paper examines some simple ways that open and closed loop frequency and time domain simulations can be done using the special structure of the system equations usually available. Routines were developed to run under Pro-Matlab in a mixture of the Pro-Matlab interpreter and FORTRAN (using the .mex facility). These routines are often orders of magnitude faster than trying the typical 'brute force' approach of using built-in Pro-Matlab routines such as bode. This makes the analyst's job easier since not only does an individual run take less time, but much larger models can be attacked, often allowing the whole model reduction step to be eliminated.
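
    The original routines were written for Pro-Matlab and FORTRAN; a present-day analogue of the same idea is to exploit the decoupled modal form of structural models so that the frequency response is a sum of scalar second-order terms rather than a matrix solve per frequency point. A NumPy sketch, with hypothetical modal data:

      import numpy as np

      def modal_freq_response(omega_n, zeta, b, c, freqs):
          """H(jw) for a structure in modal form: the modal equations are
          decoupled, so the response is a sum of scalar second-order terms
          and no matrix factorization per frequency point is needed."""
          s = 1j * np.asarray(freqs)[:, None]                    # (nf, 1)
          den = s ** 2 + 2 * zeta * omega_n * s + omega_n ** 2   # (nf, nm)
          return (c * b / den).sum(axis=1)                       # (nf,)

      wn = np.array([2.0, 9.0, 25.0])      # modal frequencies (rad/s)
      zt = np.array([0.02, 0.01, 0.005])   # modal damping ratios
      bi = np.array([0.5, 1.0, 0.3])       # modal input gains
      co = np.array([1.0, 0.4, 0.8])       # modal output gains
      H = modal_freq_response(wn, zt, bi, co, np.logspace(-1, 2, 400))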

  6. Hard Sphere Simulation by Event-Driven Molecular Dynamics: Breakthrough, Numerical Difficulty, and Overcoming the issues

    NASA Astrophysics Data System (ADS)

    Isobe, Masaharu

    Hard sphere/disk systems are among the simplest models and have been used to address numerous fundamental problems in the field of statistical physics. The pioneering numerical works on the solid-fluid phase transition based on Monte Carlo (MC) and molecular dynamics (MD) methods published in 1957 represent historical milestones, which have had a significant influence on the development of computer algorithms and novel tools to obtain physical insights. This chapter addresses Alder's breakthrough works on hard sphere/disk simulation: (i) event-driven molecular dynamics, (ii) the long-time tail, (iii) the molasses tail, and (iv) two-dimensional melting/crystallization. From a numerical viewpoint, there are serious issues that must be overcome for further breakthroughs. Here, we present a brief review of recent progress in this area.
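
    The kernel of event-driven MD is predicting, rather than detecting, collisions. As a sketch of that one step (two dimensions, standard library only; not code from the chapter), the time to contact of a hard-disk pair follows from solving |r + v*t| = sigma for the relative coordinates:

      import math

      def pair_collision_time(r1, r2, v1, v2, sigma):
          """Time until two hard disks of diameter sigma touch, or None."""
          rx, ry = r2[0] - r1[0], r2[1] - r1[1]
          vx, vy = v2[0] - v1[0], v2[1] - v1[1]
          b = rx * vx + ry * vy
          if b >= 0.0:                  # centers are not approaching
              return None
          vv = vx * vx + vy * vy
          disc = b * b - vv * (rx * rx + ry * ry - sigma * sigma)
          if disc < 0.0:                # relative motion misses the disk
              return None
          return (-b - math.sqrt(disc)) / vv

      # Head-on approach: gap 3, closing speed 2, contact at distance 1 -> t = 1.0
      print(pair_collision_time((0, 0), (3, 0), (1, 0), (-1, 0), sigma=1.0))

    An event-driven simulator keeps such predicted collision times in a priority queue and advances the system from event to event, which is what makes the method exact for hard-core interactions.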

  7. A View on Future Building System Modeling and Simulation

    SciTech Connect

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  8. An Object Description Language for Distributed Discrete Event Simulations

    DTIC Science & Technology

    2001-05-24

    given a universe U_P, we define a system S_{R×P_T} over a collection of parameters R ⊆ P' as ... (2-5). We then define r: P_T → S_R such that ... float gvm::View2D::zNear - This is the near clipping plane. float gvm::View2D::zFar - This is the far clipping plane. Public Constructors: gvm::View2D::View2D ... of the view. GLdouble gvm::View3D::scale - Zooming factor of the scale. GLdouble gvm::View3D::zNear - Near clipping plane. GLdouble gvm::View3D::zFar

  9. Nonclassical effects in two-photon interference experiments: an event-by-event simulation

    NASA Astrophysics Data System (ADS)

    Michielsen, K.; Jin, F.; De Raedt, H.

    2013-10-01

    It is shown that both the visibility V = 1/2 predicted for two-photon interference experiments with two independent sources and the visibility V = 1 predicted for two-photon interference experiments with a parametric down-conversion source can be explained in terms of a locally causal, adaptive, corpuscular, classical (non-Hamiltonian) dynamical system. Hence, there is no need to invoke quantum theory to explain the so-called nonclassical effects in the interference of signal and idler photons in parametric down-conversion, and a revision of the commonly accepted criterion of the nonclassical nature of light is called for.

  10. System of systems modeling and simulation.

    SciTech Connect

    Lawton, Craig R.; Campbell, James E.; Anderson, Dennis James; Thompson, Bruce Miles; Longsine, Dennis E.; Shirah, Donald N.; Cranwell, Robert M.

    2005-02-01

    Analyzing the performance of a complex System of Systems (SoS) requires a systems engineering approach. Many such SoS exist in the Military domain. Examples include the Army's next generation Future Combat Systems 'Unit of Action' or the Navy's Aircraft Carrier Battle Group. In the case of a Unit of Action, a system of combat vehicles, support vehicles and equipment are organized in an efficient configuration that minimizes logistics footprint while still maintaining the required performance characteristics (e.g., operational availability). In this context, systems engineering means developing a global model of the entire SoS and all component systems and interrelationships. This global model supports analyses that result in an understanding of the interdependencies and emergent behaviors of the SoS. Sandia National Laboratories will present a robust toolset that includes methodologies for developing a SoS model, defining state models and simulating a system of state models over time. This toolset is currently used to perform logistics supportability and performance assessments of the set of Future Combat Systems (FCS) for the U.S. Army's Program Manager Unit of Action.

  11. Rare event statistics in reaction-diffusion systems.

    PubMed

    Elgart, Vlad; Kamenev, Alex

    2004-10-01

    We present an efficient method to calculate probabilities of large deviations from the typical behavior (rare events) in reaction-diffusion systems. This method is based on a semiclassical treatment of an underlying "quantum" Hamiltonian, encoding the system's evolution. To this end, we formulate the corresponding canonical dynamical system and investigate its phase portrait. This method is presented for a number of pedagogical examples.
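
    To make the construction concrete: in the Doi-Peliti-style formalism this approach builds on, a single-species reaction kA -> mA with rate lambda leads (up to sign and normalization conventions, which vary by author) to a Hamiltonian of the standard form

      H(p,q) = \lambda\, q^{k}\left(p^{m} - p^{k}\right),
      \qquad
      \dot{q} = \frac{\partial H}{\partial p},
      \qquad
      \dot{p} = -\frac{\partial H}{\partial q},

    so that, for example, binary annihilation A + A -> 0 gives H = \lambda q^{2}(1 - p^{2}), and rare-event probabilities follow from the classical action accumulated along the optimal trajectory in the (p,q) phase portrait.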

  12. Event-triggered consensus tracking of multi-agent systems with Lur'e nonlinear dynamics

    NASA Astrophysics Data System (ADS)

    Huang, Na; Duan, Zhisheng; Wen, Guanghui; Zhao, Yu

    2016-05-01

    In this paper, the distributed consensus tracking problem for networked Lur'e systems is investigated based on event-triggered information interactions. An event-triggered control algorithm is designed with the advantages of reducing controller update frequency and sensor energy consumption. Using tools of the S-procedure and the Lyapunov functional method, some sufficient conditions are derived to guarantee that consensus tracking is achieved under a directed communication topology. Meanwhile, it is shown that Zeno behaviour of the triggering time sequences is excluded under the proposed event-triggered rule. Finally, some numerical simulations on coupled Chua's circuits are performed to illustrate the effectiveness of the theoretical algorithms.

  13. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2011-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring, WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed NP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce e realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  14. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2010-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring. WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed WP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce a realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  15. The multinomial simulation algorithm for discrete stochastic simulation of reaction-diffusion systems

    NASA Astrophysics Data System (ADS)

    Lampoudi, Sotiria; Gillespie, Dan T.; Petzold, Linda R.

    2009-03-01

    The Inhomogeneous Stochastic Simulation Algorithm (ISSA) is a variant of the stochastic simulation algorithm in which the spatially inhomogeneous volume of the system is divided into homogeneous subvolumes, and the chemical reactions in those subvolumes are augmented by diffusive transfers of molecules between adjacent subvolumes. The ISSA can be prohibitively slow when the system is such that diffusive transfers occur much more frequently than chemical reactions. In this paper we present the Multinomial Simulation Algorithm (MSA), which is designed to, on the one hand, outperform the ISSA when diffusive transfer events outnumber reaction events, and on the other, to handle small reactant populations with greater accuracy than deterministic-stochastic hybrid algorithms. The MSA treats reactions in the usual ISSA fashion, but uses appropriately conditioned binomial random variables for representing the net numbers of molecules diffusing from any given subvolume to a neighbor within a prescribed distance. Simulation results illustrate the benefits of the algorithm.
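
    The key sampling step can be sketched in a few lines. Below, the number of molecules leaving a subvolume in one step is effectively a binomial draw, and the leavers are then split among the neighbors; the paper's actual conditioning (net transfers within a prescribed distance) is more careful than this standard-library toy:

      import random

      def diffusive_transfers(n, p_leave, neighbors):
          """Sample diffusive transfers out of one subvolume in one step:
          each of n molecules leaves with probability p_leave (a binomial
          draw), and the leavers are split uniformly among the neighbors."""
          leaving = sum(1 for _ in range(n) if random.random() < p_leave)
          counts = {k: 0 for k in neighbors}
          for _ in range(leaving):                  # multinomial split
              counts[random.choice(neighbors)] += 1
          return counts

      print(diffusive_transfers(n=1000, p_leave=0.05, neighbors=["left", "right"]))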

  16. Network-based event-triggered filtering for Markovian jump systems

    NASA Astrophysics Data System (ADS)

    Wang, Huijiao; Shi, Peng; Agarwal, Ramesh K.

    2016-06-01

    The problem of event-triggered H∞ filtering for networked Markovian jump system is studied in this paper. A dynamic discrete event-triggered scheme is designed to choose the transmitted data for different Markovian jumping modes. The time-delay modelling method is employed to describe the event-triggered scheme and the network-related behaviour, such as transmission delay, data package dropout and disorder, into a networked Markovian time-delay jump system. Furthermore, a sufficient condition is derived to guarantee that the resulting filtering error system is stochastically stable with a prescribed performance index. A co-design method for the H∞ filter and the event-triggered scheme is then proposed. The effectiveness and potential of the theoretic results obtained are illustrated by a simulation example.

  17. Simulation of debris flow events in Sicily by cellular automata model SCIDDICA_SS3

    NASA Astrophysics Data System (ADS)

    Cancelliere, A.; Lupiano, V.; Peres, D. J.; Stancanelli, L.; Avolio, M.; Foti, E.; Di Gregorio, S.

    2013-12-01

    Debris flow models are widely used for hazard mapping and for evaluating the effectiveness of risk mitigation measures. Several models analyze the dynamics of debris flow runout by solving partial differential equations. In using such models, difficulties arise in estimating the kinematic and geotechnical soil parameters for real phenomena. To overcome these difficulties, alternative semi-empirical approaches can be employed, such as macroscopic Cellular Automata (CA). In particular, for CA simulation purposes, the runout of debris flows emerges from local interactions in a dynamical system, subdivided into elementary parts, whose state evolves within a spatial and temporal discretum. The attributes of each cell (substates) describe its physical characteristics. For computational reasons, the natural phenomenon is split into a number of elementary processes, whose proper composition makes up the CA transition function. By simultaneously applying this function to all the cells, the evolution of the phenomenon can be simulated in terms of modifications of the substates. In this study, we present an application of the macroscopic CA semi-empirical model SCIDDICA_SS3 to the Peloritani Mountains area of Sicily, Italy. The model was applied using detailed data from the 1 October 2009 debris flow event, which was triggered by a rainfall event of about 250 mm falling in 9 hours and caused the death of 37 persons. This region is characterized by river valleys with large hillslope angles (30°-60°), catchment basins of small extent (0.5-12 km2) and soil composed of easily eroded metamorphic material. CA usage implies a calibration phase, which identifies an optimal set of parameters capable of adequately reproducing the considered case, and a validation phase, which tests the model on a sufficient (and different) number of cases that are similar in terms of physical and geomorphological properties. The performance of the model can be measured in terms of a fitness
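
    The CA scheme itself is simple to state in code, even though SCIDDICA's real substates and elementary processes are far richer. A toy sketch of a simultaneous local transition function on a von Neumann neighborhood follows; the relaxation rule is invented for illustration only:

      def ca_step(grid, transition):
          """Apply a local transition function simultaneously to every cell;
          each new cell state depends on its von Neumann neighborhood."""
          rows, cols = len(grid), len(grid[0])
          def cell(i, j):
              nbrs = [grid[i2][j2]
                      for i2, j2 in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                      if 0 <= i2 < rows and 0 <= j2 < cols]
              return transition(grid[i][j], nbrs)
          return [[cell(i, j) for j in range(cols)] for i in range(rows)]

      # Invented rule: a "debris thickness" substate relaxes toward the
      # neighborhood mean, standing in for SCIDDICA's outflow computation.
      relax = lambda c, nbrs: c + 0.25 * (sum(nbrs) / len(nbrs) - c)
      grid = [[0.0, 0.0, 0.0], [0.0, 4.0, 0.0], [0.0, 0.0, 0.0]]
      grid = ca_step(grid, relax)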

  18. The added value of convection permitting simulations of extreme precipitation events over the eastern Mediterranean

    NASA Astrophysics Data System (ADS)

    Zittis, G.; Bruggeman, A.; Camera, C.; Hadjinicolaou, P.; Lelieveld, J.

    2017-07-01

    Climate change is expected to substantially influence precipitation amounts and distribution. To improve simulations of extreme rainfall events, we analyzed the performance of different convection and microphysics parameterizations of the WRF (Weather Research and Forecasting) model at very high horizontal resolutions (12, 4 and 1 km). Our study focused on the eastern Mediterranean climate change hot-spot. Five extreme rainfall events over Cyprus were identified from observations and were dynamically downscaled from the ERA-Interim (EI) dataset with WRF. We applied an objective ranking scheme, using a 1-km gridded observational dataset over Cyprus and six different performance metrics, to investigate the skill of the WRF configurations. We evaluated the rainfall timing and amounts for the different resolutions, and discuss the observational uncertainty for the particular extreme events by comparing three gridded precipitation datasets (E-OBS, APHRODITE and CHIRPS). Simulations with WRF capture rainfall over the eastern Mediterranean reasonably well for three of the five selected extreme events. For these three cases, the WRF simulations improve on the ERA-Interim data, which strongly underestimate the rainfall extremes over Cyprus. The best model performance is obtained for the January 1989 event, simulated with an average bias of 4% and a modified Nash-Sutcliffe efficiency of 0.72 for the 5-member ensemble of the 1-km simulations. We found overall added value for the convection-permitting simulations, especially over regions of high elevation. Interestingly, for some cases the intermediate 4-km nest was found to outperform the 1-km simulations for the low-elevation coastal parts of Cyprus. Finally, we identified significant and inconsistent discrepancies between the three state-of-the-art gridded precipitation datasets for the tested events, highlighting the observational uncertainty in the region.
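
    The paper uses a modified Nash-Sutcliffe score; for reference, the classical Nash-Sutcliffe efficiency, where 1 is a perfect match and 0 means the model is no better than predicting the observed mean, can be computed as follows (the paper's modified variant is not reproduced here):

      def nash_sutcliffe(sim, obs):
          """Classical Nash-Sutcliffe efficiency; assumes obs is not constant."""
          mean_obs = sum(obs) / len(obs)
          sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
          var = sum((o - mean_obs) ** 2 for o in obs)
          return 1.0 - sse / var

      print(nash_sutcliffe([1.0, 2.5, 3.8], [1.2, 2.4, 4.0]))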

  19. A CORBA event system for ALMA common software

    NASA Astrophysics Data System (ADS)

    Fugate, David W.

    2004-09-01

    The ALMA Common Software notification channel framework provides developers with an easy to use, high-performance, event-driven system supported across multiple programming languages and operating systems. It sits on top of the CORBA notification service and hides nearly all CORBA from developers. The system is based on a push event channel model where suppliers push events onto the channel and consumers process these asynchronously. This is a many-to-many publishing model whereby multiple suppliers send events to multiple consumers on the same channel. Furthermore, these event suppliers and consumers can be coded in C++, Java, or Python on any platform supported by ACS. There are only two classes developers need to be concerned with: SimpleSupplier and Consumer. SimpleSupplier was designed so that ALMA events (defined as IDL structures) could be published in the simplest manner possible without exposing any CORBA to the developer. Essentially all that needs to be known is the channel's name and the IDL structure being published. The API takes care of everything else. With the Consumer class, the developer is responsible for providing the channel's name as well as associating event types with functions that will handle them.
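
    The abstract names the two developer-facing classes but not their exact signatures, so the following self-contained toy only mirrors the described shape of the API: a supplier pushing typed events onto a named channel, and consumers registering per-type handler functions. The method names and the synchronous dispatch here are illustrative, not the ACS interface:

      class Channel:
          registry = {}                            # channel name -> handler table
          def __init__(self, name):
              self.handlers = Channel.registry.setdefault(name, {})

      class SimpleSupplier(Channel):
          def publish_event(self, event):          # push model: supplier pushes events
              for h in self.handlers.get(type(event).__name__, []):
                  h(event)                         # consumers process (asynchronously in ACS)

      class Consumer(Channel):
          def add_subscription(self, event_name, handler):
              self.handlers.setdefault(event_name, []).append(handler)

      class TempEvent:                             # stands in for an IDL structure
          def __init__(self, value):
              self.value = value

      c = Consumer("WEATHER")
      c.add_subscription("TempEvent", lambda ev: print("got", ev.value))
      SimpleSupplier("WEATHER").publish_event(TempEvent(3.14))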

  20. Hierarchical Discrete Event Supervisory Control of Aircraft Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Yasar, Murat; Tolani, Devendra; Ray, Asok; Shah, Neerav; Litt, Jonathan S.

    2004-01-01

    This paper presents a hierarchical application of Discrete Event Supervisory (DES) control theory for intelligent decision and control of a twin-engine aircraft propulsion system. A dual layer hierarchical DES controller is designed to supervise and coordinate the operation of two engines of the propulsion system. The two engines are individually controlled to achieve enhanced performance and reliability, necessary for fulfilling the mission objectives. Each engine is operated under a continuously varying control system that maintains the specified performance and a local discrete-event supervisor for condition monitoring and life extending control. A global upper level DES controller is designed for load balancing and overall health management of the propulsion system.

  1. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  2. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.

  3. Simulation of moisture in alpine rock walls during freeze-thaw events

    NASA Astrophysics Data System (ADS)

    Schnepfleitner, Harald; Rode, Matthias; Sass, Oliver

    2014-05-01

    Rock moisture distribution during freeze-thaw events is the key to understanding frost weathering and subsequent rockfall. Data on moisture levels of natural rock walls are scarce and difficult to measure. An innovative and cheap way to avoid these problems is the use of simulation calculations. Although they are an abstraction of the real system, they are widely used in natural science. A novel way to simulate moisture in natural rock walls is to use the software WUFI, which was developed to understand moisture behavior in building materials. However, the enormous know-how behind these commercial applications has not been exploited for geomorphological research to date. The necessary input data for the simulation are climate data in hourly resolution (temperature, rainfall, wind, irradiation) and material properties (porosity, sorption and diffusivity parameters) of the prevailing rock. Two different regions were analysed, the Gesäuse (Johnsbachtal: 700 m, limestone and dolomite) and the Sonnblick (3000 m, gneiss and granite). We aimed to compare the two regions in terms of general susceptibility to frost weathering, the influence of aspect, inclination and rock parameters, and the possible impact of climate change. The calculated 1D moisture profiles and the temporal progress of rock moisture, in combination with temperature data, allow the detection of possible periods of active weathering and resulting rockfalls. These results were analyzed based on two different frost weathering theories, the "classical" frost shattering theory (requiring a high number of freeze-thaw cycles and a pore saturation of 90%) and the segregation ice theory (requiring a long freezing period and a pore saturation threshold of approx. 60%). An additional critical factor for both theories was the frost depth, namely the duration of the "frost cracking window" (between -3 and -10°C) at each site. The results show that in both areas, north-facing rocks are
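
    The two weathering criteria cited above lend themselves to simple post-processing of the simulated hourly series. The sketch below applies them to rock temperature and pore saturation arrays; the saturation thresholds follow the abstract, while the 72-hour minimum freezing duration and the function names are our own assumptions.

        def frost_cracking_window(temps_c, lo=-10.0, hi=-3.0):
            """Hours spent inside the frost cracking window (-10 to -3 C)."""
            return sum(1 for t in temps_c if lo <= t <= hi)

        def classical_frost_shattering(temps_c, saturation, sat_threshold=0.90):
            """Count freezing crossings of 0 C that occur at high pore saturation."""
            cycles = 0
            for prev, cur, sat in zip(temps_c, temps_c[1:], saturation[1:]):
                if prev > 0.0 >= cur and sat >= sat_threshold:
                    cycles += 1
            return cycles

        def ice_segregation_active(temps_c, saturation, min_hours=72,
                                   sat_threshold=0.60):
            """True if a continuous freezing period of min_hours meets the
            60% saturation threshold (assumed duration; the paper's criterion
            may differ)."""
            run = 0
            for t, s in zip(temps_c, saturation):
                run = run + 1 if (t < 0.0 and s >= sat_threshold) else 0
                if run >= min_hours:
                    return True
            return False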

  4. Role of systems pharmacology in understanding drug adverse events

    PubMed Central

    Berger, Seth I.; Iyengar, Ravi

    2011-01-01

    Systems pharmacology involves the application of systems biology approaches, combining large-scale experimental studies with computational analyses, to the study of drugs, drug targets, and drug effects. Many of these initial studies have focused on identifying new drug targets, new uses of known drugs, and systems-level properties of existing drugs. This review focuses on systems pharmacology studies that aim to better understand drug side effects and adverse events. By studying the drugs in the context of cellular networks, these studies provide insights into adverse events caused by off-targets of drugs as well as adverse events-mediated complex network responses. This allows rapid identification of biomarkers for side effect susceptibility. In this way, systems pharmacology will lead to not only newer and more effective therapies, but safer medications with fewer side effects. PMID:20803507

  5. Simulation System Fidelity Assessment at the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Beard, Steven D.; Reardon, Scott E.; Tobias, Eric L.; Aponso, Bimal L.

    2013-01-01

    Fidelity is a word that is often used but rarely understood when talking about ground-based simulation. Assessing the cueing fidelity of a ground-based flight simulator requires a comparison to actual flight data, either directly or indirectly. Two experiments were conducted at the Vertical Motion Simulator using the GenHel UH-60A Black Hawk helicopter math model, which was directly compared to flight data. Prior to the experiment, the simulator's motion and visual system frequency responses were measured, the aircraft math model was adjusted to account for the simulator motion system delays, and the motion system gains and washouts were tuned for the individual tasks. The tuned motion system fidelity was then assessed against the modified Sinacori criteria. The first experiment showed handling qualities ratings (HQRs) similar to actual flight for the bob-up and sidestep maneuvers. The second experiment showed equivalent HQRs between flight and simulation for the ADS33 slalom maneuver for the two pilot participants. The ADS33 vertical maneuver HQRs were mixed, with one pilot rating the flight and simulation the same while the second pilot rated the simulation worse. In addition to recording HQRs in the second experiment, an experimental Simulation Fidelity Rating (SFR) scale developed by the University of Liverpool was tested for applicability to engineering simulators. A discussion of the SFR scale for use on the Vertical Motion Simulator is included in this paper.

  6. Characteristics and dependencies of error in satellite-based flood event simulations

    NASA Astrophysics Data System (ADS)

    Mei, Yiwen; Nikolopoulos, Efthymios I.; Anagnostou, Emmanouil N.; Zoccatelli, Davide; Borga, Marco

    2016-04-01

    The error in satellite-precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e. rainfall and runoff cumulative depth and time series shape). Overall, the error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference, while the cumulative depth is mostly underestimated. The study shows a dampening effect in both the systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and for the rain flood events with high runoff coefficients. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
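
    As a hedged illustration of an event-based systematic/random error decomposition (one common convention; the paper's exact metrics may differ), the sketch below splits the error of matched event pairs into a mean multiplicative bias and event-to-event scatter. The numbers are invented.

        import numpy as np

        def error_components(satellite, reference):
            satellite = np.asarray(satellite, float)
            reference = np.asarray(reference, float)
            ratio = satellite / reference
            systematic = ratio.mean() - 1.0    # multiplicative bias
            random_part = ratio.std(ddof=1)    # event-to-event scatter
            return systematic, random_part

        # Dampening effect: the runoff (hydrograph) error is smaller than the
        # error of the driving rainfall (hyetograph).
        rain_sys, rain_rnd = error_components([18, 25, 40], [22, 31, 52])
        print(rain_sys, rain_rnd)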

  7. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near optimal control of uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor are tuned aperiodically, only at triggering instants. An adaptive event-trigger condition for deciding the triggering instants is derived, so that a suitable number of events are generated to ensure the desired approximation accuracy. Near optimal performance is achieved without using value and/or policy iterations. A detailed analysis of nontrivial inter-event times, with an explicit formula showing the reduction in computation, is also derived. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. Simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP.
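
    The core event-triggering idea (recompute the control only when the state has drifted sufficiently from the last transmitted value) can be sketched for a simple linear plant. The system matrices, gain, and fixed threshold below are illustrative and are not from the paper, which uses an adaptive NN-based condition.

        import numpy as np

        # Double-integrator-like plant with state feedback; values are invented.
        A = np.array([[1.0, 0.1], [0.0, 1.0]])
        B = np.array([0.0, 0.1])
        K = np.array([-10.0, -5.0])

        def simulate(steps=50, sigma=0.05):
            x = np.array([1.0, 0.0])
            x_held = x.copy()                # last state sent to the controller
            u = K @ x_held
            updates = 0
            for _ in range(steps):
                # Event-trigger rule: recompute the control only when the gap
                # between the current and held state grows too large.
                if np.linalg.norm(x - x_held) > sigma * np.linalg.norm(x):
                    x_held, u, updates = x.copy(), K @ x, updates + 1
                x = A @ x + B * u            # control held constant between events
            return updates

        print("control updates out of 50 steps:", simulate())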

  8. Importance of long-time simulations for rare event sampling in zinc finger proteins.

    PubMed

    Godwin, Ryan; Gmeiner, William; Salsbury, Freddie R

    2016-01-01

    Molecular dynamics (MD) simulation methods have seen significant improvement since their inception in the late 1950s. Constraints of simulation size and duration that once impeded the field have lessened with the advent of better algorithms, faster processors, and parallel computing. With newer techniques and hardware available, MD simulations of more biologically relevant timescales can now sample a broader range of conformational and dynamical changes, including rare events. One concern in the literature has been under which circumstances it is sufficient to perform many shorter-timescale simulations and under which circumstances fewer longer simulations are necessary. Herein, simulations of the zinc finger NEMO (2JVX), comprising multiple runs of length 15, 30, 1000, and 3000 ns, are analyzed to provide clarity on this point.

  9. Event-triggered sliding mode control for a class of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Behera, Abhisek K.; Bandyopadhyay, Bijnan

    2016-09-01

    Event-triggering is a real-time control implementation technique that aims at minimum resource utilisation while ensuring satisfactory performance of the closed-loop system. In this paper, we address the problem of robust stabilisation for a class of nonlinear systems subject to external disturbances using sliding mode control (SMC) with an event-triggering scheme. An event-triggering scheme is developed for SMC to ensure that the sliding trajectory remains confined in the vicinity of the sliding manifold. The event-triggered SMC brings about the sliding mode in the system, and thus the steady-state trajectories of the system remain bounded within a predesigned region in the presence of disturbances. The design of the event parameters is also given considering the practical constraints on control execution, and we show that the next triggering instant is larger than its immediate predecessor by a given positive constant. The analysis is also presented with delay in the control updates taken into account, and an upper bound for the delay is calculated to ensure stability of the system. It is shown that with delay the steady-state bound of the system is larger than in the delay-free case; however, the system trajectories remain bounded, so stability is ensured. The performance of this event-triggered SMC is demonstrated through a numerical simulation.
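
    A minimal sketch of event-triggered SMC on a double integrator (not the paper's plant or trigger rule): the control is recomputed only when the sliding variable has drifted more than a tolerance since the last event, and trajectories stay in a band around the sliding manifold despite a bounded disturbance. All values are invented.

        import numpy as np

        def run(dt=0.01, steps=2000, eps=0.05, k=2.0, c=1.0):
            x = np.array([1.0, 0.0])             # double-integrator state
            s_held = c * x[0] + x[1]             # sliding variable at last event
            u = -k * np.sign(s_held)
            events = 0
            for i in range(steps):
                s = c * x[0] + x[1]
                if abs(s - s_held) > eps:        # event-trigger rule
                    s_held, u, events = s, -k * np.sign(s), events + 1
                d = 0.1 * np.sin(0.5 * i * dt)   # bounded external disturbance
                x = x + dt * np.array([x[1], u + d])
            return events, x

        events, x_final = run()
        print(events, x_final)   # far fewer than 2000 updates; state near s = 0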

  10. Electrical aspects of photovoltaic-system simulation

    NASA Astrophysics Data System (ADS)

    Hart, G. W.; Raghuraman, P.

    1982-06-01

    A TRNSYS simulation was developed to simulate the performance of utility interactive residential photovoltaic energy systems. The PV system is divided into major functional components, which are individually described with computer models. The results of simulation and actual measured data are compared. The electrical influences on the design of such photovoltaic energy systems are given particular attention.

  11. Optimal switching policy for performance enhancement of distributed parameter systems based on event-driven control

    NASA Astrophysics Data System (ADS)

    Mu, Wen-Ying; Cui, Bao-Tong; Lou, Xu-Yang; Li, Wen

    2014-07-01

    This paper aims to improve the performance of a class of distributed parameter systems through the optimal switching of actuators and controllers based on event-driven control. It is assumed that, of the available multiple actuators, only one actuator can receive the control signal and be activated over an unfixed time interval, while the other actuators remain dormant. After incorporating a state observer into the event generator, the event-driven control loop and the minimum inter-event time are ultimately bounded. Based on the event-driven state feedback control, time intervals of unfixed length can be obtained. The optimal switching policy is based on finite horizon linear quadratic optimal control at the beginning of each time subinterval. A simulation example demonstrates the effectiveness of the proposed policy.

  12. A systems neurophysiology approach to voluntary event coding.

    PubMed

    Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian

    2016-07-15

    Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding (TEC)" being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding were predominantly stimulus categorization, feature unbinding and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). On a systems neurophysiology level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but not the other processes. The perceptual categorization stage appears central for voluntary event-file coding.

  13. Large-scale Intelligent Transportation Systems simulation

    SciTech Connect

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles) and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
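
    The design idea of vehicles as autonomous processes can be sketched with generator-based processes driven by a minimal event loop; the network, link times, and greedy route choice below are invented for illustration.

        import heapq

        LINK_TIME = {("A", "B"): 5, ("A", "C"): 7, ("B", "D"): 6, ("C", "D"): 3}

        def vehicle(name, origin, dest):
            """An autonomous vehicle process: picks its own route link by link."""
            node = origin
            while node != dest:
                # Independent route selection: greedily take the cheapest link.
                cost, nxt = min((t, b) for (a, b), t in LINK_TIME.items()
                                if a == node)
                print(f"{name}: {node} -> {nxt} (link time {cost})")
                yield cost               # ask the event loop to advance time
                node = nxt
            print(f"{name}: arrived at {dest}")

        def event_loop(processes):
            """Minimal discrete-event scheduler driving the vehicle processes."""
            queue = [(0, i, p) for i, p in enumerate(processes)]
            heapq.heapify(queue)
            while queue:
                now, i, proc = heapq.heappop(queue)
                try:
                    heapq.heappush(queue, (now + next(proc), i, proc))
                except StopIteration:
                    pass

        event_loop([vehicle("veh1", "A", "D"), vehicle("veh2", "A", "D")])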

  14. Low-dose photons modify liver response to simulated solar particle event protons.

    PubMed

    Gridley, Daila S; Coutrakon, George B; Rizvi, Asma; Bayeta, Erben J M; Luo-Owen, Xian; Makinde, Adeola Y; Baqai, Farnaz; Koss, Peter; Slater, James M; Pecaut, Michael J

    2008-03-01

    The health consequences of exposure to low-dose radiation combined with a solar particle event during space travel remain unresolved. The goal of this study was to determine whether protracted radiation exposure alters gene expression and oxidative burst capacity in the liver, an organ vital in many biological processes. C57BL/6 mice were whole-body irradiated with 2 Gy simulated solar particle event (SPE) protons over 36 h, both with and without pre-exposure to low-dose/low-dose-rate photons ((57)Co, 0.049 Gy total at 0.024 cGy/h). Livers were excised immediately after irradiation (day 0) or on day 21 thereafter for analysis of 84 oxidative stress-related genes using RT-PCR; genes up or down-regulated by more than twofold were noted. On day 0, genes with increased expression were: photons, none; simulated SPE, Id1; photons + simulated SPE, Bax, Id1, Snrp70. Down-regulated genes at this same time were: photons, Igfbp1; simulated SPE, Arnt2, Igfbp1, Il6, Lct, Mybl2, Ptx3. By day 21, a much greater effect was noted than on day 0. Exposure to photons + simulated SPE up-regulated completely different genes than those up-regulated after either photons or the simulated SPE alone (photons, Cstb; simulated SPE, Dctn2, Khsrp, Man2b1, Snrp70; photons + simulated SPE, Casp1, Col1a1, Hspcb, Il6st, Rpl28, Spnb2). There were many down-regulated genes in all irradiated groups on day 21 (photons, 13; simulated SPE, 16; photons + simulated SPE, 16), with very little overlap among groups. Oxygen radical production by liver phagocytes was significantly enhanced by photons on day 21. The results demonstrate that whole-body irradiation with low-dose-rate photons, as well as time after exposure, had a great impact on liver response to a simulated solar particle event.

  15. Simulation for CZT Compton PET (Maximization of the efficiency for PET using Compton event)

    NASA Astrophysics Data System (ADS)

    Yoon, Changyeon; Lee, Wonho; Lee, Taewoong

    2011-10-01

    Multiple interactions in positron emission tomography (PET) using scintillators are generally treated as noise events because the interaction positions and energies of the multiple interactions cannot be obtained individually and the sequence of multiple scattering is not fully known. Therefore, the first interaction position, which is the crucial information for PET image reconstruction, cannot be determined correctly. However, in the case of a pixelized semiconductor detector, such as CdZnTe, the specific position and energy of each of the multiple interactions can be obtained. Moreover, for the emission of two 511 keV photons in PET, if one photon deposits all its energy in one position (photoelectric effect) and the other undergoes Compton scattering followed by the photoelectric effect, the sequence of Compton scattering followed by the photoelectric effect can be determined using the Compton scattering formula. Hence, the correct position of Compton scattering can be determined, and the Compton scattering events that are discarded in conventional PET systems can be recovered in the new system reported in this study. The PET system in this study, which was simulated using the GATE 5.0 code, was composed of 20 mm×10 mm×10 mm CdZnTe detectors consisting of 1 mm×0.5 mm×2.5 mm pixels. The angular uncertainties caused by Doppler broadening, the pixelization effect and energy broadening were estimated and compared. The pixelization effect was the main factor increasing the angular uncertainty and was strongly dependent on the distance between the 1st and 2nd interaction positions. The effect of energy broadening on the angular resolution was less than expected, and that of Doppler broadening was minimal. The number of Compton events was double that of the photoelectric effect assuming full energy absorption. Therefore, the detection efficiency of this new PET system can be improved greatly because both the photoelectric effect and Compton scattering are
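
    The sequencing argument can be made concrete with the Compton formula: assuming one 511 keV photon deposits its full energy in two interactions, the ordering whose implied scattering angle is physical (|cos theta| <= 1) is accepted. The sketch below is an illustration, not the reconstruction code used in the paper; energies are in keV.

        ME_C2 = 511.0   # electron rest energy, keV

        def cos_theta(e_deposit, e_total=511.0):
            """Scattering angle cosine if e_deposit was the first (Compton) hit."""
            e_after = e_total - e_deposit
            return 1.0 - ME_C2 * (1.0 / e_after - 1.0 / e_total)

        def order_interactions(e1, e2):
            """Return the deposits ordered (Compton first) if exactly one
            ordering is kinematically allowed, else None."""
            valid = [pair for pair in ((e1, e2), (e2, e1))
                     if abs(cos_theta(pair[0])) <= 1.0]
            return valid[0] if len(valid) == 1 else None

        # 341 keV exceeds the 511 keV Compton edge (~341 keV), so only the
        # ordering with the 170 keV deposit first is physical.
        print(order_interactions(170.0, 341.0))   # -> (170.0, 341.0)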

  16. Simulation of centennial-scale drought events over eastern China during the past 1500 years

    NASA Astrophysics Data System (ADS)

    Sun, Weiyi; Liu, Jian; Wang, Zhiyuan

    2017-02-01

    The characteristics and causes of centennial-scale drought events over eastern China during the past 1500 years were explored based on simulations of the Community Earth System Model (CESM). The results show that centennial-scale drought events over eastern China occurred during the periods 622-735 (drought period 1, D1) and 1420-1516 (drought period 2, D2), which is comparable with climate proxy data. In D1, the drought center was located in northern China and the Yangtze River valley, whereas in southern China precipitation was much higher than usual. In D2, decreased precipitation was found across almost the whole of eastern China. The direct cause of these two drought events was a weakened East Asian summer monsoon, and the specific process was closely linked to the air-sea interaction of the Indo-Pacific Ocean. In D1, regions of maximum cooling were observed over the western Pacific, which may have led to anomalous subsidence, weakening the Walker circulation and reducing the northward transport of water vapor. Additionally, upward motion occurred over southern China, strengthening convection and increasing precipitation. In D2, owing to the decrease in the SST, subsidence dominated the North Indian Ocean, blocking the low-level cross-equatorial flow, enhancing the tropical westerly anomalies, and reducing the northward transport of moisture. Additionally, descending motion appeared in eastern China, subsequently decreasing precipitation over the whole region. The anomalous cooling of the Indo-Pacific Ocean SST may have been caused by persistently low solar irradiation in D1, whereas in D2 this characteristic may have been influenced not only by persistently low solar irradiation but also by frequent volcanic eruptions.

  17. SIBYLL: An event generator for simulation of high energy cosmic ray cascades

    SciTech Connect

    Fletcher, R.S.; Gaisser, T.K.; Lipari, P.; Stanev, T.

    1994-11-01

    We describe the physical basis and some applications of an efficient event generator designed for Monte Carlo simulations of atmospheric cascades at ultrahigh energies. The event generator (SIBYLL) incorporates many features of the Lund programs, but emphasizes the fragmentation region and the production of minijets. A consistent treatment of hadron-hadron and hadron-nucleus interactions is emphasized. Examples of applications are the calculation of coincident muons observed in deep underground detectors and the simulation of the longitudinal development of air shower components in the atmosphere.

  18. Coupled atmosphere-ocean-wave simulations of a storm event over the Gulf of Lion and Balearic Sea

    NASA Astrophysics Data System (ADS)

    Renault, Lionel; Chiggiato, Jacopo; Warner, John C.; Gomez, Marta; Vizoso, Guillermo; Tintoré, Joaquin

    2012-09-01

    The coastal areas of the North-Western Mediterranean Sea are among the most challenging places for ocean forecasting. This region is exposed to severe storm events of short duration. During these events, significant air-sea interactions, strong winds and a large sea state can have catastrophic consequences in coastal areas. To investigate these air-sea interactions and the oceanic response to such events, we implemented the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System to simulate a severe storm in the Mediterranean Sea that occurred in May 2010. During this event, wind speed reached up to 25 m s-1, inducing significant sea surface cooling (up to 2°C) over the Gulf of Lion (GoL) and along the storm track, and generating surface waves with a significant height of 6 m. It is shown that the event, associated with a cyclogenesis between the Balearic Islands and the GoL, is relatively well reproduced by the coupled system. A surface heat budget analysis showed that ocean vertical mixing was a major contributor to the cooling tendency along the storm track and in the GoL, where turbulent heat fluxes also played an important role. Sensitivity experiments on the ocean-atmosphere coupling suggested that the coupled system is sensitive to the momentum flux parameterization as well as to the air-sea and air-wave coupling. Comparisons with available atmospheric and oceanic observations showed that the fully coupled system provides the most skillful simulation, illustrating the benefit of using a fully coupled ocean-atmosphere-wave model for the assessment of such storm events.

  19. Coupled atmosphere-ocean-wave simulations of a storm event over the Gulf of Lion and Balearic Sea

    USGS Publications Warehouse

    Renault, Lionel; Chiggiato, Jacopo; Warner, John C.; Gomez, Marta; Vizoso, Guillermo; Tintore, Joaquin

    2012-01-01

    The coastal areas of the North-Western Mediterranean Sea are among the most challenging places for ocean forecasting. This region is exposed to severe storm events of short duration. During these events, significant air-sea interactions, strong winds and a large sea state can have catastrophic consequences in coastal areas. To investigate these air-sea interactions and the oceanic response to such events, we implemented the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System to simulate a severe storm in the Mediterranean Sea that occurred in May 2010. During this event, wind speed reached up to 25 m s-1, inducing significant sea surface cooling (up to 2°C) over the Gulf of Lion (GoL) and along the storm track, and generating surface waves with a significant height of 6 m. It is shown that the event, associated with a cyclogenesis between the Balearic Islands and the GoL, is relatively well reproduced by the coupled system. A surface heat budget analysis showed that ocean vertical mixing was a major contributor to the cooling tendency along the storm track and in the GoL, where turbulent heat fluxes also played an important role. Sensitivity experiments on the ocean-atmosphere coupling suggested that the coupled system is sensitive to the momentum flux parameterization as well as to the air-sea and air-wave coupling. Comparisons with available atmospheric and oceanic observations showed that the fully coupled system provides the most skillful simulation, illustrating the benefit of using a fully coupled ocean-atmosphere-wave model for the assessment of such storm events.

  20. Monte Carlo generator ELRADGEN 2.0 for simulation of radiative events in elastic ep-scattering of polarized particles

    NASA Astrophysics Data System (ADS)

    Akushevich, I.; Filoti, O. F.; Ilyichev, A.; Shumeiko, N.

    2012-07-01

    The structure and algorithms of the Monte Carlo generator ELRADGEN 2.0, designed to simulate radiative events in polarized ep-scattering, are presented. The full set of analytical expressions for the QED radiative corrections is presented and discussed in detail. Algorithmic improvements implemented to provide faster simulation of hard real photon events are described. Numerical tests show the high quality of the generation of photonic variables and of the radiatively corrected cross section. A comparison of the elastic radiative tail simulated within the kinematical conditions of the BLAST experiment at MIT BATES shows good agreement with experimental data.
    Catalogue identifier: AELO_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELO_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 1299
    No. of bytes in distributed program, including test data, etc.: 11 348
    Distribution format: tar.gz
    Programming language: FORTRAN 77
    Computer: All
    Operating system: Any
    RAM: 1 MB
    Classification: 11.2, 11.4
    Nature of problem: Simulation of radiative events in polarized ep-scattering.
    Solution method: Monte Carlo simulation according to the distributions of the real photon kinematic variables, which are calculated by the covariant method of QED radiative correction estimation. The approach provides fast and accurate generation.
    Running time: The simulation of 10^8 radiative events for itest:=1 takes up to 52 seconds on a Pentium(R) Dual-Core 2.00 GHz processor.

  1. BEEC: An event generator for simulating the Bc meson production at an e+e- collider

    NASA Astrophysics Data System (ADS)

    Yang, Zhi; Wu, Xing-Gang; Wang, Xian-You

    2013-12-01

    The Bc meson is a doubly heavy quark-antiquark bound state that carries flavor explicitly, providing a fruitful laboratory for testing potential models and understanding the weak decay mechanisms of heavy flavors. In view of the prospects for Bc physics at hadronic colliders such as the Tevatron and the LHC, Bc physics is attracting more and more attention. It has been shown that a high-luminosity e+e- collider running around the Z0 peak is also helpful for studying the properties of the Bc meson and has its own advantages. For this purpose, we present an event generator for simulating Bc meson production through e+e- annihilation according to the relevant publications. We name it BEEC; it can generate the color-singlet S-wave and P-wave (cb¯)-quarkonium states together with the color-octet S-wave (cb¯)-quarkonium states. BEEC can also be adopted to generate similar charmonium and bottomonium states via the semi-exclusive channels e++e-→|(QQ¯)[n]>+Q+Q¯ with Q=b and c, respectively. To increase the simulation efficiency, we simplify the amplitude to be as compact as possible by using improved trace technology. BEEC is a Fortran program written in a PYTHIA-compatible format with a modular structure, so one may apply it to various situations or experimental environments conveniently by using the GNU C compiler make. A method to improve the efficiency of generating unweighted events within the PYTHIA environment is proposed. Moreover, BEEC will generate a standard Les Houches Event data file that contains useful information on the meson and its accompanying partons, which can be conveniently imported into PYTHIA for further hadronization and decay simulation.
    Catalogue identifier: AEQC_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQC_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in

  2. Discrete Event Simulation Models for CT Examination Queuing in West China Hospital

    PubMed Central

    Luo, Li; Tang, Shijun; Shi, Yingkang; Guo, Huili

    2016-01-01

    In CT examination, emergency patients (EPs) have the highest priorities in the queuing system, so general patients (GPs) have to wait for a long time, which leads to a low degree of satisfaction among patients overall. The aim of this study is to improve patient satisfaction by designing new queuing strategies for CT examination. We divide the EPs into an urgent type and an emergency type and then design two queuing strategies: in one, the urgent patients (UPs) wedge into the GPs' queue at a fixed interval (fixed priority model); in the other, patients have dynamic priorities for queuing (dynamic priority model). Based on data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies and compare the performance of the strategies on the basis of the simulation results. The strategy in which patients have dynamic priorities for queuing decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting times and increases their satisfaction. PMID:27547237
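
    The difference between the two strategies can be sketched with a toy selection rule in which a patient's effective priority improves with waiting time (an aging scheme). The priority values and aging rate below are invented; the paper's models are discrete event simulations calibrated on RID data.

        BASE_PRIORITY = {"emergency": 0, "urgent": 1, "general": 2}

        def dynamic_priority(patient_type, wait_minutes, aging_rate=0.02):
            # Dynamic priority model: effective priority improves (decreases)
            # the longer a patient has waited.
            return BASE_PRIORITY[patient_type] - aging_rate * wait_minutes

        def next_patient(waiting, now):
            """Pick the patient with the best dynamic priority at time `now`."""
            return min(waiting,
                       key=lambda p: dynamic_priority(p["type"],
                                                      now - p["arrival"]))

        waiting = [
            {"id": 1, "type": "general", "arrival": 0},
            {"id": 2, "type": "urgent", "arrival": 55},
        ]
        # The long-waiting GP (priority 0.8) overtakes the fresh UP (0.9).
        print(next_patient(waiting, now=60))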

  3. Discrete Event Simulation Models for CT Examination Queuing in West China Hospital.

    PubMed

    Luo, Li; Liu, Hangjiang; Liao, Huchang; Tang, Shijun; Shi, Yingkang; Guo, Huili

    2016-01-01

    In CT examination, emergency patients (EPs) have the highest priorities in the queuing system, so general patients (GPs) have to wait for a long time, which leads to a low degree of satisfaction among patients overall. The aim of this study is to improve patient satisfaction by designing new queuing strategies for CT examination. We divide the EPs into an urgent type and an emergency type and then design two queuing strategies: in one, the urgent patients (UPs) wedge into the GPs' queue at a fixed interval (fixed priority model); in the other, patients have dynamic priorities for queuing (dynamic priority model). Based on data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies and compare the performance of the strategies on the basis of the simulation results. The strategy in which patients have dynamic priorities for queuing decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting times and increases their satisfaction.

  4. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for the future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. A local model, from the perspective of DEDS theory, is described by the following: a set of system and transition states, an event alphabet that portrays the actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for each event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or a feedback DEDS controller (closed-loop control).
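
    The local-model ingredients listed above (states, event alphabet, initial state, partial transition function, and event durations) map naturally onto a small timed-automaton class; the drill-press example below is invented for illustration.

        from dataclasses import dataclass

        @dataclass
        class LocalModel:
            states: set
            alphabet: set
            initial: str
            delta: dict        # (state, event) -> next state (partial function)
            duration: dict     # event -> time required for the event to occur
            state: str = None
            clock: float = 0.0

            def __post_init__(self):
                self.state = self.initial

            def fire(self, event):
                if (self.state, event) not in self.delta:
                    raise ValueError(
                        f"event {event!r} not enabled in state {self.state!r}")
                self.state = self.delta[(self.state, event)]
                self.clock += self.duration[event]

        drill = LocalModel(
            states={"idle", "busy"},
            alphabet={"start", "finish"},
            initial="idle",
            delta={("idle", "start"): "busy", ("busy", "finish"): "idle"},
            duration={"start": 0.0, "finish": 4.5},
        )
        drill.fire("start"); drill.fire("finish")
        print(drill.state, drill.clock)   # idle 4.5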

  6. A geostatistical extreme-value framework for fast simulation of natural hazard events.

    PubMed

    Youngman, Benjamin D; Stephenson, David B

    2016-05-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student's t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements.
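
    A hedged sketch of the simulation recipe follows: spatially dependent uniforms are transformed to generalized Pareto margins. For brevity it uses a Gaussian copula in place of the paper's Student's t-process, and all parameter values are illustrative.

        import numpy as np
        from scipy.stats import norm, genpareto

        rng = np.random.default_rng(0)
        sites = np.linspace(0, 100, 5)          # 1-D site coordinates, km
        corr = np.exp(-np.abs(sites[:, None] - sites[None, :]) / 50.0)

        def simulate_event(shape=0.1, scale=8.0, threshold=20.0):
            z = rng.multivariate_normal(np.zeros(len(sites)), corr)
            u = norm.cdf(z)                     # spatially dependent uniforms
            # Generalized Pareto margins above a fixed threshold.
            return threshold + genpareto.ppf(u, c=shape, scale=scale)

        print(simulate_event())                 # one synthetic gust field, m/s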

  7. Public Health System Response to Extreme Weather Events.

    PubMed

    Hunter, Mark D; Hunter, Jennifer C; Yang, Jane E; Crawley, Adam W; Aragón, Tomás J

    2016-01-01

    Extreme weather events, unpredictable and often far-reaching, constitute a persistent challenge for public health preparedness. The goal of this research is to inform public health systems improvement through the examination of extreme weather events, comparing across cases to identify recurring patterns in event and response characteristics. Structured telephone interviews were conducted with representatives of 45 local health departments in 20 US states to assess the characteristics of recent extreme weather events and the agencies' responses. Response activities were assessed using the Centers for Disease Control and Prevention Public Health Emergency Preparedness Capabilities framework, and challenges typical of this response environment are reported. Respondents described public health system responses to 45 events involving tornadoes, flooding, wildfires, winter weather, hurricanes, and other storms. Events of similar scale were infrequent for a majority (62%) of the communities involved; disruption to critical infrastructure was universal. The Public Health Emergency Preparedness Capabilities considered most essential involved environmental health investigations, mass care and sheltering, surveillance and epidemiology, information sharing, and public information and warning. Unanticipated response activities or operational constraints were common. We characterize extreme weather events as a "quadruple threat" because (1) direct threats to population health are accompanied by damage to public health protective and community infrastructure, (2) event characteristics often impose novel and pervasive burdens on communities, (3) responses rely on critical infrastructures whose failure both creates new burdens and diminishes response capacity, and (4) their infrequency and scale further compromise response capacity. Given the challenges associated with extreme weather events, we suggest opportunities for organizational learning and

  8. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    SciTech Connect

    Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2016-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that recognizes only discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without substantially deviating from conventional probabilistic risk assessment techniques. While this study is presented only as an example analysis, the results appear to demonstrate a high level of reliability of the reactor cavity cooling system (and the reactor system in general) in response to the postulated transient event.

  9. On the problem of discrete-event systems properties preservation

    NASA Astrophysics Data System (ADS)

    Nagul, Nadezhda; Bychkov, Igor

    2017-01-01

    The paper presents a novel approach to a problem that arises generally in the study of dynamical systems, namely the preservation of a system's properties under some transformation. Combining algebra, logic and dynamics, the method of logical-algebraic equations (LAE-method) is developed, serving to synthesize criteria for the preservation of properties of systems connected by a special type of morphism. The LAE-method is applicable to various systems, but we focus on the case of discrete-event systems (DES), i.e., systems that evolve in time through the occurrence of event sequences. We consider the application of the LAE-method to the reduction of supervisors for DES, and the problem of preserving basic DES properties, such as observability and controllability, when sensor readings provide information about the system's state and are available to a supervisor. Decentralized supervisory control is also addressed, in particular the question of whether the properties of local supervisors are inherited by a global supervisor.

  10. Simulating Heinrich events in a coupled atmosphere-ocean-ice sheet model

    NASA Astrophysics Data System (ADS)

    Mikolajewicz, Uwe; Ziemen, Florian

    2016-04-01

    Heinrich events are among the most prominent events of long-term climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet - climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under discussion, and their climatic consequences are far from being fully understood. We contribute to answering the open questions by studying Heinrich events in a framework coupling an ice sheet model (ISM) to an atmosphere-ocean-vegetation general circulation model (AOVGCM), where this variability occurs as part of the model-generated internal variability, without the need to prescribe external perturbations as has been the standard approach in almost all model studies so far. The setup consists of a northern hemisphere configuration of the modified Parallel Ice Sheet Model (mPISM) coupled to the global coarse-resolution AOVGCM ECHAM5/MPIOM/LPJ. The simulations used for this analysis form an ensemble covering substantial parts of the late Glacial, forced with transient insolation and prescribed atmospheric greenhouse gas concentrations. The modeled Heinrich events show a marked influence of the ice discharge on the Atlantic circulation and heat transport, but none of the Heinrich events during the Glacial showed a complete collapse of the North Atlantic meridional overturning circulation. The main simulated consequences of the Heinrich events are a freshening and cooling over the North Atlantic and a drying over northern Europe.

  11. State-dependent doubly weighted stochastic simulation algorithm for automatic characterization of stochastic biochemical rare events

    NASA Astrophysics Data System (ADS)

    Roh, Min K.; Daigle, Bernie J.; Gillespie, Dan T.; Petzold, Linda R.

    2011-12-01

    In recent years there has been substantial growth in the development of algorithms for characterizing rare events in stochastic biochemical systems. Two such algorithms, the state-dependent weighted stochastic simulation algorithm (swSSA) and the doubly weighted SSA (dwSSA) are extensions of the weighted SSA (wSSA) by H. Kuwahara and I. Mura [J. Chem. Phys. 129, 165101 (2008)], 10.1063/1.2987701. The swSSA substantially reduces estimator variance by implementing system state-dependent importance sampling (IS) parameters, but lacks an automatic parameter identification strategy. In contrast, the dwSSA provides for the automatic determination of state-independent IS parameters, thus it is inefficient for systems whose states vary widely in time. We present a novel modification of the dwSSA—the state-dependent doubly weighted SSA (sdwSSA)—that combines the strengths of the swSSA and the dwSSA without inheriting their weaknesses. The sdwSSA automatically computes state-dependent IS parameters via the multilevel cross-entropy method. We apply the method to three examples: a reversible isomerization process, a yeast polarization model, and a lac operon model. Our results demonstrate that the sdwSSA offers substantial improvements over previous methods in terms of both accuracy and efficiency.
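
    The importance-sampling mechanics that the sdwSSA builds on can be shown in a minimal weighted-SSA loop for an immigration-death process: reactions are selected under biased propensities while a likelihood ratio keeps the estimator unbiased. The rates, bias factor, and rare event below are invented; the sdwSSA itself chooses state-dependent biases automatically via the multilevel cross-entropy method.

        import random

        def wssa_trajectory(target=20, k_in=1.0, k_out=0.5, bias=1.6, t_max=10.0):
            """One importance-sampled trajectory of an immigration-death process.
            Returns the trajectory weight if x reaches `target`, else 0."""
            x, t, weight = 1, 0.0, 1.0
            while t < t_max and 0 < x < target:
                a = (k_in, k_out * x)                 # true propensities
                b = (a[0] * bias, a[1] / bias)        # biased propensities
                t += random.expovariate(a[0] + a[1])  # jump time uses true rates
                r = 0 if random.random() < b[0] / (b[0] + b[1]) else 1
                # Likelihood-ratio factor keeps the estimator unbiased.
                weight *= (a[r] / (a[0] + a[1])) / (b[r] / (b[0] + b[1]))
                x += 1 if r == 0 else -1
            return weight if x >= target else 0.0

        n = 5000
        estimate = sum(wssa_trajectory() for _ in range(n)) / n
        print("estimated probability of reaching x=20:", estimate)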

  12. Discrete event simulation for exploring strategies: an urban water management case.

    PubMed

    Huang, Dong-Bin; Scholz, Roland W; Gujer, Willi; Chitwood, Derek E; Loukopoulos, Peter; Schertenleib, Roland; Siegrist, Hansruedi

    2007-02-01

    This paper presents a model structure aimed at offering an overview of the various elements of a strategy and exploring their multidimensional effects through time in an efficient way. It treats a strategy as a set of discrete events planned to achieve a certain strategic goal, and develops a new form of causal network as an interfacing component between decision makers and environment models, e.g., life cycle inventory and material flow models. The causal network receives a strategic plan as input in a discrete manner and then outputs the updated parameter sets to the subsequent environmental models. Accordingly, the potential dynamic evolution of environmental systems caused by various strategies can be simulated stepwise. This provides a way to incorporate discontinuous change into models for environmental strategy analysis, and enhances the interpretability and extendibility of a complex model through its cellular constructs. The approach is exemplified using an urban water management case in Kunming, a major city in Southwest China. Using the presented method, the case study modeled the cross-scale interdependencies of the urban drainage system and regional water balance systems, and evaluated the effectiveness of various strategies for improving the situation of Dianchi Lake.

  13. State-dependent doubly weighted stochastic simulation algorithm for automatic characterization of stochastic biochemical rare events.

    PubMed

    Roh, Min K; Daigle, Bernie J; Gillespie, Dan T; Petzold, Linda R

    2011-12-21

    In recent years there has been substantial growth in the development of algorithms for characterizing rare events in stochastic biochemical systems. Two such algorithms, the state-dependent weighted stochastic simulation algorithm (swSSA) and the doubly weighted SSA (dwSSA) are extensions of the weighted SSA (wSSA) by H. Kuwahara and I. Mura [J. Chem. Phys. 129, 165101 (2008)]. The swSSA substantially reduces estimator variance by implementing system state-dependent importance sampling (IS) parameters, but lacks an automatic parameter identification strategy. In contrast, the dwSSA provides for the automatic determination of state-independent IS parameters, thus it is inefficient for systems whose states vary widely in time. We present a novel modification of the dwSSA--the state-dependent doubly weighted SSA (sdwSSA)--that combines the strengths of the swSSA and the dwSSA without inheriting their weaknesses. The sdwSSA automatically computes state-dependent IS parameters via the multilevel cross-entropy method. We apply the method to three examples: a reversible isomerization process, a yeast polarization model, and a lac operon model. Our results demonstrate that the sdwSSA offers substantial improvements over previous methods in terms of both accuracy and efficiency.

  14. The data system dynamic simulation /DSDS/

    NASA Technical Reports Server (NTRS)

    Hooper, J. W.; Piner, J. R.

    1978-01-01

    The paper describes the development by NASA of the data system dynamic simulation (DSDS) which provides a data system simulation capability for a broad range of programs, with the capability to model and simulate all or any portion of an end-to-end data system to multiple levels of fidelity. Versatility is achieved by specifying parameters which define the performance characteristics of data system components, and by specifying control and data paths in a data system. DSDS helps reduce overall simulation cost and the time required for obtaining a data systems analysis, and helps provide both early realistic representations of data systems and the flexibility to study design changes and operating strategies.

  15. Validation of simulation strategies for the flow in a model propeller turbine during a runaway event

    NASA Astrophysics Data System (ADS)

    Fortin, M.; Houde, S.; Deschênes, C.

    2014-03-01

    Recent research indicates that the useful life of a turbine can be affected by transient events. This study aims to define and validate strategies for simulating the flow within a propeller turbine model in runaway conditions. Using unsteady pressure measurements on two runner blades for validation, different strategies are compared and their results analysed in order to quantify their precision. This paper focuses on justifying the choice of the simulation strategies and on the analysis of preliminary results.

  16. Nonhydrostatic, Mesobeta-Scale Model Simulations of Cloud Ceiling and Visibility for an East Coast Winter Precipitation Event.

    NASA Astrophysics Data System (ADS)

    Stoelinga, Mark T.; Warner, Thomas T.

    1999-04-01

    Experiments are described that provide an example of the baseline skill level for the numerical prediction of cloud ceiling and visibility, where application to aviation-system safety and efficiency is emphasized. Model simulations of a light, mixed-phase, East Coast precipitation event are employed to assess ceiling and visibility predictive skill, and its sensitivity to the use of data assimilation and the use of simple versus complex microphysics schemes. To obtain ceiling and visibility from the model-simulated, state-of-the-atmosphere variables, a translation algorithm was developed based on empirical and theoretical relationships between hydrometeor characteristics and light extinction. The model-simulated ceilings were generally excessively high; however, the visibility simulations were reasonably accurate and comparable to the existing operational terminal forecasts. The benefit of data assimilation for such very short-range forecasts was demonstrated, as was the desirability of employing a reasonably sophisticated microphysics scheme.
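
    The style of translation algorithm described (empirical power laws from hydrometeor content to extinction, then the Koschmieder relation to visibility) can be sketched as follows; the coefficients are placeholders, not the paper's fitted values.

        import math

        # extinction (km^-1) = a * C**b, with C the hydrometeor content (g/m^3);
        # (a, b) pairs below are illustrative placeholders.
        POWER_LAWS = {"cloud": (145.0, 0.88), "rain": (2.2, 0.75),
                      "snow": (10.4, 0.78)}

        def visibility_km(contents):
            beta = sum(a * contents.get(species, 0.0) ** b
                       for species, (a, b) in POWER_LAWS.items()
                       if contents.get(species, 0.0) > 0.0)
            if beta == 0.0:
                return float("inf")
            return -math.log(0.02) / beta   # Koschmieder: 2% contrast threshold

        # In-cloud light rain: visibility drops to a few hundred meters.
        print(round(visibility_km({"rain": 0.5, "cloud": 0.05}), 2))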

  17. High-speed event detector for embedded nanopore bio-systems.

    PubMed

    Huang, Yiyun; Magierowski, Sebastian; Ghafar-Zadeh, Ebrahim; Wang, Chengjie

    2015-08-01

    Biological measurements of microscopic phenomena often deal with discrete-event signals. The ability to automatically carry out such measurements at high-speed in a miniature embedded system is desirable but compromised by high-frequency noise along with practical constraints on filter quality and sampler resolution. This paper presents a real-time event-detection method in the context of nanopore sensing that helps to mitigate these drawbacks and allows accurate signal processing in an embedded system. Simulations show at least a 10× improvement over existing on-line detection methods.
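
    The basic operation such a detector performs, picking blockade events out of a noisy current trace, can be sketched with simple threshold crossing plus hysteresis; the thresholds and synthetic trace below are invented, and real detectors must also contend with filtering and sampler resolution as noted above.

        import numpy as np

        def detect_events(current, baseline, enter_frac=0.8, exit_frac=0.9):
            """Return (start, end) sample indices of blockade events."""
            events, start = [], None
            for i, sample in enumerate(current):
                if start is None and sample < enter_frac * baseline:
                    start = i                                   # entered a blockade
                elif start is not None and sample > exit_frac * baseline:
                    events.append((start, i)); start = None     # left the blockade
            return events

        rng = np.random.default_rng(1)
        trace = 100.0 + rng.normal(0, 2.0, 1000)    # open-pore current, pA
        trace[300:340] -= 40.0                      # one synthetic blockade
        print(detect_events(trace, baseline=100.0)) # ~ [(300, 340)]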

  18. Adaptation in anaesthesia team coordination in response to a simulated critical event and its relationship to clinical performance.

    PubMed

    Burtscher, M J; Manser, T; Kolbe, M; Grote, G; Grande, B; Spahn, D R; Wacker, J

    2011-06-01

    Recent studies in anaesthesia and intensive care indicate that a team's ability to adapt its coordination activities to changing situational demands is crucial for effective teamwork and thus safe patient care. This study addresses the relationship between the adaptation of team coordination and markers of clinical performance in response to a critical event, particularly regarding which types of coordination activities are used and which team members engage in them. Video recordings of 15 two-person anaesthesia teams (anaesthesia trainee plus anaesthesia nurse) performing a simulated induction of general anaesthesia were coded using a structured observation system for coordination activities. The simulation involved a critical event: asystole during laryngoscopy. Clinical performance was assessed using two separate reaction times related to the critical event. Analyses of variance revealed a significant effect of the critical event on team coordination: after the occurrence of the asystole, team members adapted their coordination activities by spending more time on information management, a specific type of coordination activity (F(1,28)=15.17, P=0.001). No significant effect was found for task management. The increase in information management was related to faster decisions regarding how to respond to the critical event, but only for trainees and not for nurses. Our findings support the claim that adaptation of coordination activities is related to improved team performance in healthcare. Moreover, adaptation and its relationship to team performance were found to vary with the type of coordination activity and the team member.

  19. Modeling Large Scale Circuits Using Massively Parallel Discrete-Event Simulation

    DTIC Science & Technology

    2013-06-01

    Simulations were run on up to 1,966,080 cores of the Sequoia Blue Gene/Q supercomputer system. For the PHOLD benchmark model, we demonstrate the ability to process 33 trillion events in 65 seconds, yielding a peak event rate in excess of 504 billion events/second using 120 racks of Sequoia.

  20. An Integrated Approach To Payload System Simulation

    NASA Technical Reports Server (NTRS)

    Lee, M.; Swartz, R. L., Jr.; Teng, A.; Weidner, R. J.

    1996-01-01

    This paper describes a payload system simulation implemented at JPL as part of a comprehensive mission simulation facility. The flight software function includes communication with other process modules, instrument control, and data management. The payload system simulation software consists of: a camera subsystem, a virtual world, and a mission visualization toolset.

  2. DDS: The Dental Diagnostic Simulation System.

    ERIC Educational Resources Information Center

    Tira, Daniel E.

    The Dental Diagnostic Simulation (DDS) System provides an alternative to simulation systems which represent diagnostic case studies of relatively limited scope. It may be used to generate simulated case studies in all of the dental specialty areas with case materials progressing through the gamut of the diagnostic process. The generation of a…

  4. A coupled classification - evolutionary optimization model for contamination event detection in water distribution systems.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2014-03-15

    This study describes a decision support system that raises alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers, followed by a sequence analysis for the classification of contamination events. The contribution of this study is an improvement in contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analyses conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference in size between the two classes' data sets (as there are many more normal/regular than event time measurements), and incorporating the time factor through a time-decay coefficient, ascribing higher importance to recent observations when classifying a time step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was fully automatic. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is prominent in its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability compared to previous modeling attempts at contamination event detection. Copyright © 2013 Elsevier Ltd. All rights reserved.
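
    As a rough illustration of the weighting ideas described above, the sketch below trains a class-weighted SVM with an exponential time-decay sample weight; the data, decay rate, and kernel choice are hypothetical stand-ins, not the authors' calibrated model.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    # Synthetic stand-in data: rows are time steps of six water-quality
    # parameters; y = 1 marks (rare) contamination-event measurements.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))
    y = (rng.random(500) < 0.05).astype(int)

    # class_weight="balanced" blurs the size difference between the classes;
    # an exponential time decay (rate assumed here) upweights recent samples.
    decay = 0.995
    sample_weight = decay ** np.arange(len(X))[::-1]  # newest time step -> 1.0

    clf = SVC(kernel="rbf", class_weight="balanced")
    clf.fit(X, y, sample_weight=sample_weight)
    ```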

  5. Event-triggered output feedback control for distributed networked systems.

    PubMed

    Mahmoud, Magdi S; Sabih, Muhammad; Elshafei, Moustafa

    2016-01-01

    This paper addresses the problem of output-feedback communication and control within an event-triggered framework in the context of distributed networked control systems. The design problem of the event-triggered output-feedback control is posed as a linear matrix inequality (LMI) feasibility problem. The scheme is developed for the distributed system where only partial states are available. In this scheme, a subsystem uses local observers and shares its information with its neighbors only when the subsystem's local error exceeds a specified threshold. The developed method is illustrated using a coupled-cart example from the literature.
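
    The triggering rule itself reduces to a simple relative-error test. The sketch below, with a hypothetical threshold sigma, shows the condition under which a subsystem would broadcast its observer state to its neighbors.

    ```python
    import numpy as np

    def should_transmit(x_hat, x_last_sent, sigma=0.1):
        """Event trigger: send only when the local error since the last
        transmission exceeds a fraction sigma of the current state norm."""
        error = np.linalg.norm(x_hat - x_last_sent)
        return error > sigma * np.linalg.norm(x_hat)

    # Between triggers, neighbors keep using the last transmitted value,
    # so network traffic scales with the threshold rather than the sample rate.
    ```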

  6. Space shuttle visual simulation system design study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A recommendation and a specification for the visual simulation system design for the space shuttle mission simulator are presented. A recommended visual system is described which most nearly meets the visual design requirements. The cost analysis of the recommended system covering design, development, manufacturing, and installation is reported. Four alternate systems are analyzed.

  7. Episodic simulation of future events is impaired in mild Alzheimer's disease

    PubMed Central

    Addis, Donna Rose; Sacchetti, Daniel C.; Ally, Brandon A.; Budson, Andrew E.; Schacter, Daniel L.

    2009-01-01

    Recent neuroimaging studies have demonstrated that both remembering the past and simulating the future activate a core neural network including the medial temporal lobes. Regions of this network, in particular the medial temporal lobes, are prime sites for amyloid deposition and are structurally and functionally compromised in Alzheimer's disease (AD). While we know some functions of this core network, specifically episodic autobiographical memory, are impaired in AD, no study has examined whether future episodic simulation is similarly impaired. We tested the ability of sixteen AD patients and sixteen age-matched controls to generate past and future autobiographical events using an adapted version of the Autobiographical Interview. Participants also generated five remote autobiographical memories from across the lifespan. Event transcriptions were segmented into distinct details, classified as either internal (episodic) or external (non-episodic). AD patients exhibited deficits in both remembering past events and simulating future events, generating fewer internal and external episodic details than healthy older controls. The internal and external detail scores were strongly correlated across past and future events, providing further evidence of the close linkages between the mental representations of past and future. PMID:19497331

  8. Simulation and field monitoring of moisture in alpine rock walls during freeze-thaw events

    NASA Astrophysics Data System (ADS)

    Rode, Matthias; Sass, Oliver

    2013-04-01

    Detachment of rock fragments from alpine rockwalls is mainly assigned to frost weathering. However, the actual process of frost weathering as well as the contribution of further weathering processes (e.g. hydration, thermal fatigue) is poorly understood. Rock moisture distribution during freeze-thaw events is key to understanding weathering. For this purpose, different measuring systems were set up in two study areas (Dachstein - permafrost area (2700m a.s.l.) and Gesäuse - non-permafrost area (900m a.s.l.), Styria, Austria) within the framework of the research project ROCKING ALPS (FWF-P24244). We installed small-scale 2D-geoelectric survey lines in north- and south-facing rockwalls, supplemented by high-resolution temperature and moisture sensors. Moisture is determined by means of resistivity measurements, which are difficult to calibrate but provide good time series. Additional novel moisture sensors were developed which use the heat capacity of the surrounding rock as a proxy of water content. These sensors give point readings from a defined depth and are independent of soluble salt contents. Pore water pressure occurring during freeze-thaw events is recorded by means of pressure transducers (piezometers). First results from the Dachstein show that short-term latent heat effects during the phase change have a crucial influence on the moisture content. These results are cross-checked by simulation calculations. Based on meteorologic and lithologic input values, the simulation routine calculates, in an iterative procedure, the hourly energy and water transport at different depths, the latter in the liquid and in the vapor phase. The calculated profile lines and chronological sequences of rock moisture allow us - in combination with temperature data - to detect possible periods of active weathering. First simulations from the Gesäuse show that maximum values of pore saturation occur from May to September. The thresholds of the "classical" frost shattering theory

  9. Adaptable, high recall, event extraction system with minimal configuration

    PubMed Central

    2015-01-01

    Background Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporated task specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance. Results We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method have been employed within the context of two different sub-tasks of BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task specific configuration and tuning, EventMine achieved the 1st place in the PC task, and 2nd in the CG, achieving the highest recall for both tasks. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements. Conclusions We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both covariate shift and weighting methods are useful in facilitating the production of high recall systems

  10. A Coupled Earthquake-Tsunami Simulation Framework Applied to the Sumatra 2004 Event

    NASA Astrophysics Data System (ADS)

    Vater, Stefan; Bader, Michael; Behrens, Jörn; van Dinther, Ylona; Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Uphoff, Carsten; Wollherr, Stephanie; van Zelst, Iris

    2017-04-01

    Large earthquakes along subduction zone interfaces have generated destructive tsunamis near Chile in 1960, Sumatra in 2004, and northeast Japan in 2011. In order to better understand these extreme events, we have developed tools for physics-based, coupled earthquake-tsunami simulations. This simulation framework is applied to the 2004 Indian Ocean M 9.1-9.3 earthquake and tsunami, a devastating event that resulted in the loss of more than 230,000 lives. The earthquake rupture simulation is performed using an ADER discontinuous Galerkin discretization on an unstructured tetrahedral mesh with the software SeisSol. Advantages of this approach include accurate representation of complex fault and sea floor geometries and a parallelized and efficient workflow in high-performance computing environments. Accurate and efficient representation of the tsunami evolution and inundation at the coast is achieved with an adaptive mesh discretizing the shallow water equations with a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme. With the application of the framework to this historic event, we aim to better understand the involved mechanisms between the dynamic earthquake within the earth's crust, the resulting tsunami wave within the ocean, and the final coastal inundation process. Earthquake model results are constrained by GPS surface displacements and tsunami model results are compared with buoy and inundation data. This research is part of the ASCETE Project, "Advanced Simulation of Coupled Earthquake and Tsunami Events", funded by the Volkswagen Foundation.

  11. Pesticide trapping efficiency of a modified backwater wetland using a simulated runoff event

    USDA-ARS?s Scientific Manuscript database

    This study examined the trapping efficiency of a modified backwater wetland amended with a mixture of three pesticides, atrazine, metolachlor, and fipronil, using a simulated runoff event. The 700 m long, 25 m wide wetland, located along the Coldwater River in Tunica County, Mississippi, was modifie...

  12. Simulations of barrier traversal and reflection times based on event enhanced quantum theory

    NASA Astrophysics Data System (ADS)

    Ruschhaupt, Andreas

    1998-12-01

    The formalism of event enhanced quantum theory is used to simulate traversal and reflection times of electrons through a one-dimensional barrier. The dependence of these times on the parameters of the barrier and the detectors is examined and the results are compared with those of selected approaches.

  13. Effects of a simulated agricultural runoff event on sediment toxicity in a managed backwater wetland

    USDA-ARS?s Scientific Manuscript database

    permethrin (both cis and trans isomers), on 10-day sediment toxicity to Hyalella azteca in a managed natural backwater wetland after a simulated agricultural runoff event. Sediment samples were collected at 10, 40, 100, 300, and 500 m from inflow 13 days prior to amendment and 1, 5, 12, 22, and 36 ...

  14. Critical event management with geographic information system technology

    NASA Astrophysics Data System (ADS)

    Booth, John F.; Young, Jeffrey M.

    1997-02-01

    Critical event management at the Los Angeles County Regional Criminal Information Clearinghouse (LACRCIC) provides for the deconfliction of operations, such as reverse stings, arrests, undercover buys/busts, searches, surveillances, and site surveys in the Los Angeles, Orange, Riverside, and San Bernardino county area. During these operations, the opportunity for officer-to-officer confrontation is high, possibly causing a worst-case scenario -- officers drawing on each other, resulting in friendly-fire injuries or casualties. In order to prevent local, state, and federal agencies in the Los Angeles area from experiencing this scenario, the LACRCIC provides around-the-clock critical event management services via its secure war room. The war room maintains a multicounty detailed street-level map base and geographic information system (GIS) application to support this effort. Operations are telephoned in by the participating agencies and posted in the critical event management system by war room analysts. The application performs both a proximity search around the address and a commonality-of-suspects search. If a conflict is found, the system alerts the analyst by sounding an audible alarm and flashing the conflicting events on the automated basemap. The analyst then notifies the respective agencies of the conflicting critical events so coordination or rescheduling can occur.
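
    The deconfliction check described above amounts to a radius query plus a suspect-overlap test. The sketch below is a hypothetical reconstruction; the 500 m radius, field names, and data layout are assumptions, not details of the LACRCIC system.

    ```python
    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def find_conflicts(new_op, posted_ops, radius_m=500.0):
        """Flag posted operations within the radius or sharing a suspect."""
        conflicts = []
        for op in posted_ops:
            near = haversine_m(new_op["lat"], new_op["lon"],
                               op["lat"], op["lon"]) <= radius_m
            common = bool(set(new_op["suspects"]) & set(op["suspects"]))
            if near or common:
                conflicts.append((op, near, common))
        return conflicts
    ```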

  15. Explicit spatial scattering for load balancing in conservatively synchronized parallel discrete-event simulations

    SciTech Connect

    Thulasidasan, Sunil; Kasiviswanathan, Shiva; Eidenbenz, Stephan; Romero, Philip

    2010-01-01

    We re-examine the problem of load balancing in conservatively synchronized parallel, discrete-event simulations executed on high-performance computing clusters, focusing on simulations where computational and messaging load tend to be spatially clustered. Such domains are frequently characterized by the presence of geographic 'hot-spots' - regions that generate significantly more simulation events than others. Examples of such domains include simulation of urban regions, transportation networks and networks where interaction between entities is often constrained by physical proximity. Noting that in conservatively synchronized parallel simulations, the speed of execution of the simulation is determined by the slowest (i.e., most heavily loaded) simulation process, we study different partitioning strategies in achieving equitable processor-load distribution in domains with spatially clustered load. In particular, we study the effectiveness of partitioning via spatial scattering to achieve optimal load balance. In this partitioning technique, nearby entities are explicitly assigned to different processors, thereby scattering the load across the cluster. This is motivated by two observations, namely, (i) since load is spatially clustered, spatial scattering should, intuitively, spread the load across the compute cluster, and (ii) in parallel simulations, equitable distribution of CPU load is a greater determinant of execution speed than message passing overhead. Through large-scale simulation experiments - both of abstracted and real simulation models - we observe that scatter partitioning, even with its greatly increased messaging overhead, significantly outperforms more conventional spatial partitioning techniques that seek to reduce messaging overhead. Further, even if hot-spots change over the course of the simulation, if the underlying feature of spatial clustering is retained, load continues to be balanced with spatial scattering leading us to the observation that
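
    In its simplest form, scatter partitioning orders entities spatially and deals them out round-robin, so that near neighbours land on different ranks. The lexicographic sort below is a crude stand-in for whatever spatial ordering the authors actually used.

    ```python
    def scatter_partition(entities, n_ranks):
        """Map entity index -> rank so that spatially adjacent entities are
        explicitly placed on different processors, spreading hot-spot load.

        entities: list of (x, y) positions.
        """
        order = sorted(range(len(entities)), key=lambda i: entities[i])
        return {idx: rank % n_ranks for rank, idx in enumerate(order)}
    ```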

  16. Simulating recurrent event data with hazard functions defined on a total time scale.

    PubMed

    Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald

    2015-03-08

    In medical studies with recurrent event data a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on conditional distributions of the inter-event times conditional on the total time of the preceding event or study start. Closed form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals as are often observed in clinical trial data
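
    The core of such an algorithm is inversion of the conditional cumulative hazard: given the total time of the preceding event, draw a unit-exponential variate and solve for the next event time. The paper provides an R script; the Python sketch below illustrates the same idea for a Weibull hazard with a proportional-hazards covariate effect, and its fixed-length risk-free interval is a simplification of the paper's more general setup.

    ```python
    import numpy as np

    def simulate_total_time(a, b, beta, z, t_max, risk_free=0.0, rng=None):
        """Recurrent events with Weibull cumulative hazard (t/b)**a on a
        total time scale, scaled by exp(beta*z), via inversion of the
        conditional cumulative hazard (Andersen-Gill style)."""
        rng = rng or np.random.default_rng()
        m = np.exp(beta * z)            # proportional-hazards multiplier
        t, events = 0.0, []
        while True:
            e = rng.exponential()       # Exp(1) draw
            t = b * ((t / b) ** a + e / m) ** (1.0 / a)
            if t > t_max:
                return events
            events.append(t)
            t += risk_free              # temporarily insusceptible period
    ```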

  17. Widespread, Very Heavy Precipitation Events in Contemporary and Scenario Summer Climates from NARCCAP Simulations

    NASA Astrophysics Data System (ADS)

    Kawazoe, S.; Gutowski, W. J., Jr.

    2015-12-01

    We analyze the ability of regional climate models (RCMs) to simulate very heavy daily precipitation and supporting processes for both contemporary and future-scenario simulations during summer (JJA). RCM output comes from North American Regional Climate Change Assessment Program (NARCCAP) simulations, which are all run at a spatial resolution of 50 km. Analysis focuses on the upper Mississippi basin for summer, between 1982-1998 for the contemporary climate, and 2052-2068 during the scenario climate. We also compare simulated precipitation and supporting processes with those obtained from observed precipitation and reanalysis atmospheric states. Precipitation observations are from the University of Washington (UW) and the Climate Prediction Center (CPC) gridded dataset. Utilizing two observational datasets helps determine if any uncertainties arise from differences in precipitation gridding schemes. Reanalysis fields come from the North American Regional Reanalysis. The NARCCAP models generally reproduce well the precipitation-vs.-intensity spectrum seen in observations, while producing overly strong precipitation at high intensity thresholds. In the future-scenario climate, there is a decrease in frequency for light to moderate precipitation intensities, while an increase in frequency is seen for the higher intensity events. Further analysis focuses on precipitation events exceeding the 99.5 percentile that occur simultaneously at several points in the region, yielding so-called "widespread events". For widespread events, we analyze local and large scale environmental parameters, such as 2-m temperature and specific humidity, 500-hPa geopotential heights, Convective Available Potential Energy (CAPE), vertically integrated moisture flux convergence, among others, to compare atmospheric states and processes leading to such events in the models and observations. The results suggest that an analysis of atmospheric states supporting very heavy precipitation events is a

  18. Formal analysis of imprecise system requirements with Event-B.

    PubMed

    Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan

    2016-01-01

    Formal analysis of functional properties of system requirements needs precise descriptions. However, the stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. The Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for specification and verification of such requirements. First, we introduce a representation of imprecise requirements in set theory. Then we make use of Event-B refinement, providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of a crane controller.

  19. Topics in gravitation - numerical simulations of event horizons and parameter estimation for LISA

    NASA Astrophysics Data System (ADS)

    Cohen, Michael Isaac

    2011-08-01

    In Part I, we consider numerical simulations of event horizons. Event horizons are the defining physical features of black hole spacetimes, and are of considerable interest in studying black hole dynamics. Here, we reconsider three techniques to find event horizons in numerical spacetimes, and find that straightforward integration of geodesics backward in time is most robust. We apply this method to various systems, from a highly spinning Kerr hole through to an asymmetric binary black hole inspiral. We find that the exponential rate at which outgoing null geodesics diverge from the event horizon of a Kerr black hole is the surface gravity of the hole. In head-on mergers we are able to track quasi-normal ringing of the merged black hole through seven oscillations, covering a dynamic range of about 10^5. In the head-on "kick" merger, we find that computing the Landau-Lifshitz velocity of the event horizon is very useful for an improved understanding of the kick behaviour. Finally, in the inspiral simulations, we find that the topological structure of the black holes does not produce an intermediate toroidal phase, though the structure is consistent with a potential re-slicing of the spacetime in order to introduce such a phase. We further discuss the topological structure of non-axisymmetric collisions. In Part II, we consider parameter estimation of cosmic string burst gravitational waves in Mock LISA data. A network of observable, macroscopic cosmic (super-)strings may well have formed in the early Universe. If so, the cusps that generically develop on cosmic-string loops emit bursts of gravitational radiation that could be detectable by gravitational-wave interferometers, such as the ground-based LIGO/Virgo detectors and the planned, space-based LISA detector. We develop two versions of a LISA-oriented string-burst search pipeline within the context of the Mock LISA Data Challenges, which rely on the publicly available MultiNest and PyMC software packages

  20. The waveform correlation event detection system global prototype software design

    SciTech Connect

    Beiriger, J.I.; Moore, S.G.; Trujillo, J.R.; Young, C.J.

    1997-12-01

    The WCEDS prototype software system was developed to investigate the usefulness of waveform correlation methods for CTBT monitoring. The WCEDS prototype performs global seismic event detection and has been used in numerous experiments. This report documents the software system design, presenting an overview of the system operation, describing the system functions, tracing the information flow through the system, discussing the software structures, and describing the subsystem services and interactions. The effectiveness of the software design in meeting project objectives is considered, as well as opportunities for code reuse and lessons learned from the development process. The report concludes with recommendations for modifications and additions envisioned for a regional waveform-correlation-based detector.

  1. Modelling the dependence and internal structure of storm events for continuous rainfall simulation

    NASA Astrophysics Data System (ADS)

    Gyasi-Agyei, Yeboah; Melching, Charles S.

    2012-09-01

    Pair-copula construction methodology has been explored to model the dependence structure between net storm event depth (R), maximum wet periods' depth (M), and the total wet periods' duration (L), noting that the total storm event depth is RT = R + M. Random variable R was used instead of RT in order to avoid physical boundary effects due to the condition of RT ⩾ M. The flexibility of pair-copula construction allowed the examination of 11 bivariate copulas at the three bivariate stages of the three-dimensional (3D) copula. For the 21 years of hourly rainfall data from Cook County, Illinois, USA, that were examined, three different copulas were found suitable for the bivariate stages. For the internal storm event structure, a Geometric distribution was used to model the net event duration, defined as the difference between the total duration (D) and L. A two-parameter Poisson model was adopted for modelling the distribution of the L wet periods within D, and the first-order autoregressive Lognormal model was applied for the distribution of RT over the L wet periods. Incorporation of an inter-event (I) sub-model completed the continuous rainfall simulation scheme. The strong seasonality in the marginal and dependence model parameters was captured using first harmonic Fourier series, thus reducing the number of parameters. Polynomial functions were fitted to the internal storm event model parameters which did not exhibit seasonal variability. Four hundred simulation runs were carried out in order to verify the developed model. Kolmogorov-Smirnov (KS) tests found the hypothesis that the observed and simulated storm event quantiles come from the same distribution cannot be rejected at the 5% significance level in nearly all cases. Gross statistics (dry probability, mean, variance, skewness, autocorrelations, and the intensity-duration-frequency (IDF) curves) of the continuous rainfall time series at several aggregation levels were very well preserved by the developed model.
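
    The internal-structure step can be illustrated with a simplified generator that places the L wet hours within the duration D and spreads the event depth over them with an AR(1) lognormal process. Uniform placement below is a stand-in for the paper's two-parameter Poisson model, and the AR(1) parameters are arbitrary.

    ```python
    import numpy as np

    def storm_internal_structure(total_depth, L, D, rho=0.6, sigma=0.8, rng=None):
        """Return {hour: depth} for L wet hours within a D-hour storm."""
        rng = rng or np.random.default_rng()
        hours = np.sort(rng.choice(D, size=L, replace=False))
        logs = np.empty(L)
        logs[0] = rng.normal(0.0, sigma)
        for i in range(1, L):           # stationary AR(1) in log space
            logs[i] = rho * logs[i - 1] + rng.normal(0.0, sigma * np.sqrt(1 - rho ** 2))
        depths = np.exp(logs)
        depths *= total_depth / depths.sum()   # rescale to the event depth
        return dict(zip(hours.tolist(), depths))
    ```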

  2. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
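
    A chance constraint in this setting requires that the probability of meeting a resource limit exceed a target level, estimated from independent replications. The sketch below checks such a constraint with a one-sided binomial confidence bound; the interface and confidence level are illustrative, not the paper's exact formulation.

    ```python
    import numpy as np

    def meets_chance_constraint(sim, x, limit, target=0.95, n_rep=50, rng=None):
        """sim(x, rng) runs one terminating replication and returns the
        stochastic response; require P[response <= limit] >= target."""
        rng = rng or np.random.default_rng()
        y = np.array([sim(x, rng) for _ in range(n_rep)])
        p_hat = np.mean(y <= limit)
        # One-sided 95% normal lower bound on the satisfaction probability.
        lower = p_hat - 1.645 * np.sqrt(p_hat * (1 - p_hat) / n_rep)
        return lower >= target
    ```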

  3. Discrete event model-based simulation for train movement on a single-line railway

    NASA Astrophysics Data System (ADS)

    Xu, Xiao-Ming; Li, Ke-Ping; Yang, Li-Xing

    2014-08-01

    The aim of this paper is to present a discrete event model-based approach to simulate train movement with energy-saving factors taken into account. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence from the previous trains, the following trains have to be accelerated or braked frequently to control the headway distance, leading to more energy consumption.

  4. Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu

    2013-04-01

    Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS) and post-event surveyed high water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.
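
    A bare-bones GLUE loop samples parameter sets from uniform priors, scores each model run with an informal likelihood, and keeps the behavioural sets with rescaled weights. In the sketch below, the Nash-Sutcliffe likelihood and the 0.5 behavioural threshold are common choices but assumptions here, not the authors' stated ones.

    ```python
    import numpy as np

    def glue(model, priors, observed, n_samples=10000, threshold=0.5, rng=None):
        """model(theta) -> simulated water levels (array like observed);
        priors: list of (low, high) uniform ranges per parameter."""
        rng = rng or np.random.default_rng()
        observed = np.asarray(observed)
        var_obs = np.sum((observed - observed.mean()) ** 2)
        behavioural, weights = [], []
        for _ in range(n_samples):
            theta = [rng.uniform(lo, hi) for lo, hi in priors]
            sim = np.asarray(model(theta))
            nse = 1.0 - np.sum((sim - observed) ** 2) / var_obs
            if nse > threshold:          # keep only behavioural sets
                behavioural.append(theta)
                weights.append(nse)
        w = np.array(weights)
        return behavioural, w / w.sum()  # likelihood-weighted ensemble
    ```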

  5. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability of a system's design, operating procedures, and control software to system accidents. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  6. Simulations of Transient Phenomena in Liquid Rocket Feed Systems

    NASA Technical Reports Server (NTRS)

    Ahuja, V.; Hosangadi, A.; Cavallo, P. A.; Daines, R.

    2006-01-01

    Valve systems in rocket propulsion systems and testing facilities are constantly subject to dynamic events resulting from the timing of valve motion, leading to unsteady fluctuations in pressure and mass flow. Such events can also be accompanied by cavitation, resonance, and system vibration, leading to catastrophic failure. High-fidelity dynamic computational simulations of valve operation can yield important information about valve response to varying flow conditions. Prediction of transient behavior related to valve motion can serve as a guideline for valve scheduling, which is of crucial importance in engine operation and testing. Feed components operating in cryogenic regimes can also experience cavitation-based instabilities leading to large-scale shedding of vapor clouds and pressure oscillations. In this paper, we present simulations of the diverse unsteady phenomena related to valve and feed systems, including valve stall and valve timing studies as well as two different forms of cavitation instabilities in components utilized in the test loop.

  7. Designing and Securing an Event Processing System for Smart Spaces

    ERIC Educational Resources Information Center

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  9. Software for event oriented processing on multiprocessor systems

    SciTech Connect

    Fischler, M.; Areti, H.; Biel, J.; Bracker, S.; Case, G.; Gaines, I.; Husby, D.; Nash, T.

    1984-08-01

    Computing intensive problems that require the processing of numerous essentially independent events are natural customers for large scale multi-microprocessor systems. This paper describes the software required to support users with such problems in a multiprocessor environment. It is based on experience with and development work aimed at processing very large amounts of high energy physics data.

  10. Exercise-Associated Collapse in Endurance Events: A Classification System.

    ERIC Educational Resources Information Center

    Roberts, William O.

    1989-01-01

    Describes a classification system devised for exercise-associated collapse in endurance events based on casualties observed at six Twin Cities Marathons. Major diagnostic criteria are body temperature and mental status. Management protocol includes fluid and fuel replacement, temperature correction, and leg cramp treatment. (Author/SM)

  12. Simulation system of airborne FLIR searcher

    NASA Astrophysics Data System (ADS)

    Sun, Kefeng; Li, Yu; Gao, Jiaobo; Wang, Jun; Wang, Jilong; Xie, Junhu; Ding, Na; Sun, Dandan

    2014-11-01

    The airborne forward-looking infrared (FLIR) searcher simulation system provides a multi-mode simulated test environment that closely approximates the actual field environment, and simulates the integrated performance and external interfaces of the airborne FLIR system. Furthermore, the simulation system supports the optimization of image-processing algorithms, the test and evaluation of electro-optical systems, and in-line software testing, and it can be used to evaluate the performance of the avionics system. The detailed design structure and the information cross-linking relationships of the components are given in this paper. The simulation system is composed of a simulation center, a FLIR actuator, a FLIR emulator, and a display control terminal. The simulation center generates the simulated target and aircraft flight data for the operating states of the airborne FLIR searcher. The FLIR actuator provides the simulation scene: it generates the infrared target and terrain-based scanning scene, and responds to commands from the simulation center and the operation control unit. The infrared image generated by the FLIR actuator is processed by the FLIR emulator, which uses a PowerPC hardware framework and processing software based on the VxWorks system. The emulator can detect multiple targets and output DVI video along with multi-target detection information corresponding to the working state of the FLIR searcher. The display control terminal displays the multi-target detection information in a two-dimensional situation format and provides the human-computer interaction function.

  13. Event-based H2/H∞ controllers for networked control systems

    NASA Astrophysics Data System (ADS)

    Orihuela, L.; Millán, P.; Vivas, C.; Rubio, F. R.

    2014-12-01

    This paper is concerned with event-based H2/H∞ control design for networked systems with interval time-varying delays. The contributions are twofold. First, conditions for uniform ultimately bounded stability are provided in the H2/H∞ event-based context. The relation between the boundedness of the stability region and the threshold that triggers the events is studied. Second, a practical design procedure for event-based H2/H∞ control is provided. The method makes use of Lyapunov-Krasovskii functionals (LKFs) and is characterised by its generality, as only mild assumptions are imposed on the structures of the LKF and the cost functional. The robustness and performance of the proposed technique are shown through numerical simulations.

  14. A computer aided treatment event recognition system in radiation therapy

    SciTech Connect

    Xia, Junyi Mart, Christopher; Bayouth, John

    2014-01-15

    Purpose: To develop an automated system to safeguard radiation therapy treatments by analyzing electronic treatment records and reporting treatment events. Methods: CATERS (Computer Aided Treatment Event Recognition System) was developed to detect treatment events by retrieving and analyzing electronic treatment records. CATERS is designed to make the treatment monitoring process more efficient by automating the search of the electronic record for possible deviations from the physician's intention, such as logical inconsistencies as well as aberrant treatment parameters (e.g., beam energy, dose, table position, prescription change, treatment overrides, etc.). Over a 5 month period (July 2012–November 2012), physicists were assisted by the CATERS software in conducting normal weekly chart checks with the aims of (a) determining the relative frequency of particular events in the authors' clinic and (b) incorporating these checks into CATERS. During this study period, 491 patients were treated at the University of Iowa Hospitals and Clinics for a total of 7692 fractions. Results: All treatment records from the 5 month analysis period were evaluated using all the checks incorporated into CATERS after the training period. In total, 553 events were detected as exceptions, although none of them had a significant dosimetric impact on patient treatments. These events included every known event type that was discovered during the trial period. A frequency analysis of the events showed that the top three types of detected events were couch position overrides (3.2%), extra cone beam imaging (1.85%), and significant couch position deviations (1.31%). A significant couch deviation is defined as a treatment in which the couch vertical exceeded two times the standard deviation of all couch verticals, or the couch lateral/longitudinal exceeded three times the standard deviation of all couch laterals and longitudinals. On average, the application takes about 1 s per patient when
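
    The couch-deviation rule quoted above translates directly into a vectorized check. The sketch below assumes deviations are measured from the patient's mean couch position, which the abstract does not state explicitly.

    ```python
    import numpy as np

    def flag_couch_deviations(vert, lat, lng):
        """Return indices of fractions whose couch vertical deviates by more
        than 2 sigma, or whose lateral/longitudinal deviates by more than
        3 sigma, relative to all treatments of the same patient."""
        vert, lat, lng = map(np.asarray, (vert, lat, lng))
        flags = (np.abs(vert - vert.mean()) > 2 * vert.std()) \
              | (np.abs(lat - lat.mean()) > 3 * lat.std()) \
              | (np.abs(lng - lng.mean()) > 3 * lng.std())
        return np.flatnonzero(flags)
    ```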

  15. Soil Organic Carbon Loss and Selective Transportation under Field Simulated Rainfall Events

    PubMed Central

    Nie, Xiaodong; Li, Zhongwu; Huang, Jinquan; Huang, Bin; Zhang, Yan; Ma, Wenming; Hu, Yanbiao; Zeng, Guangming

    2014-01-01

    Studying the lateral movement of soil organic carbon (SOC) during soil erosion can improve the understanding of the global carbon budget. Simulated rainfall experiments on small field plots were conducted to investigate SOC lateral movement under different rainfall intensities and tillage practices. Two rainfall intensities (high intensity (HI) and low intensity (LI)) and two tillage practices (no tillage (NT) and conventional tillage (CT)) were maintained on three plots (2 m width × 5 m length): HI-NT, LI-NT and LI-CT. The rainfall lasted 60 minutes after runoff was generated; the sediment yield and runoff volume were measured and sampled at 6-min intervals. The SOC concentration of sediment and runoff as well as the sediment particle size distribution were measured. The results showed that most of the eroded organic carbon (OC) was lost in the form of sediment-bound organic carbon in all events. The amount of lost SOC in the LI-NT event was 12.76 times greater than that in the LI-CT event, whereas this measure in the HI-NT event was 3.25 times greater than that in the LI-NT event. These results suggest that conventional tillage as well as lower rainfall intensity can reduce the amount of lost SOC during short-term soil erosion. Meanwhile, the eroded sediment in all events was enriched in OC, and a higher enrichment ratio of OC (ERoc) in sediment was observed in the LI events than in the HI event, whereas similar ERoc curves were found in the LI-CT and LI-NT events. Furthermore, significant correlations between ERoc and different size sediment particles were only observed in the HI-NT event. This indicates that the enrichment of OC is dependent on the erosion process, and the specific enrichment mechanisms with respect to different erosion processes should be studied in future. PMID:25166015

  16. Hydrological Simulation of Flood Events At Large Basins Using Distributed Modelling

    NASA Astrophysics Data System (ADS)

    Vélez, J.; Vélez, I.; Puricelli, M.; Francés, F.

    Recent advances in technology allow the scientific community to advance new procedures to reduce the risk associated with flood events. A conceptual distributed model has been implemented to simulate the hydrological processes involved during floods. The model has been named TETIS. The basin is divided into rectangular cells, all of them connected according to the drainage network. The rainfall-runoff process is modelled using four linked tanks at each cell with different outflow relationships at each tank, which represent the ET, direct runoff, interflow and base flow, respectively. The routing along the channel network is based on basin geomorphologic characteristics coupled to the kinematic wave procedure. The vertical movement within each cell is modelled using simple relationships based on soil properties such as field capacity and saturated hydraulic conductivities, which were previously obtained using land use, lithology, edaphology and basin property maps. The vertical processes included in each cell are capillary storage, infiltration, percolation and underground losses. Finally, snowmelt and reservoir routing have been included. TETIS has been implemented in the flood warning system of the Tagus River, with a basin of 59 200 km2. The time discretization of the input data is 15 minutes, and the cell size is 500x500 m. The basic parameter maps were estimated for the entire basin, and calibration and validation were performed using some recorded events in the upper part of the basin. Calibration confirmed the initial parameter estimation. Additionally, the validation in time and space showed the robustness of these types of models.
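
    A single cell of such a linked-tank scheme might look like the sketch below. The tank names, process ordering and parameters are a loose illustration of the conceptual structure described above, not the TETIS equations themselves.

    ```python
    def cell_step(state, rain, pet, p):
        """One time step of a four-tank cell; all fluxes in mm per step.

        state: storages of the static, gravity and aquifer tanks.
        p: parameters such as field capacity and saturated conductivities.
        Returns the cell's contribution to flow (runoff + interflow + baseflow).
        """
        # Tank 1: capillary (static) storage, emptied only by ET.
        to_static = min(rain, max(p["field_capacity"] - state["static"], 0.0))
        state["static"] += to_static
        state["static"] -= min(pet, state["static"])
        excess = rain - to_static
        # Tank 2: surface; what cannot infiltrate becomes direct runoff.
        infiltration = min(excess, p["ks_infiltration"])
        runoff = excess - infiltration
        # Tank 3: gravity storage; drains as interflow and percolation.
        state["gravity"] += infiltration
        percolation = min(state["gravity"], p["ks_percolation"])
        state["gravity"] -= percolation
        interflow = p["k_interflow"] * state["gravity"]
        state["gravity"] -= interflow
        # Tank 4: aquifer; yields baseflow and underground losses.
        state["aquifer"] += percolation
        baseflow = p["k_baseflow"] * state["aquifer"]
        losses = p["loss_fraction"] * state["aquifer"]
        state["aquifer"] -= baseflow + losses
        return runoff + interflow + baseflow
    ```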

  17. TLEs and early VLF events: Simulating the important impact of transmitter-disturbance-receiver geometry

    NASA Astrophysics Data System (ADS)

    NaitAmor, S.; Ghalila, H.; Cohen, M. B.

    2017-01-01

    Early very low frequency (VLF) events are perturbations to subionospherically propagating VLF radio transmitter signals which sometimes occur when lightning activity is near the transmitter-receiver path. They are often correlated with Transient Luminous Events (TLEs). Recent analyses have focused on a new type of early event whose recovery time persists for many minutes, called LOng Recovery Events (LOREs). The underlying cause of these events is still unclear. Curiously, LOREs sometimes appear on only one path, while the same event observed on a different transmitter-receiver path does not indicate a LORE. In this paper we observe and simulate two cases of early signal perturbations: the first is a typical early VLF event, and the second is a LORE. Both were recorded by two AWESOME VLF receivers in North Africa on 12 December 2009, during the EuroSprite campaign. We combine observations with theoretical modeling to infer the electron density change that most closely reproduces the observed perturbation. Our results explain the cases where LOREs are detected on only one path as resulting from transmitter-receiver geometry, which significantly impacts the modal content and therefore the observed VLF recovery time.

  18. Computer simulation and discrete-event models in the analysis of a mammography clinic patient flow.

    PubMed

    Coelli, Fernando C; Ferreira, Rodrigo B; Almeida, Renan Moritz V R; Pereira, Wagner Coelho A

    2007-09-01

    This work develops a discrete-event computer simulation model for the analysis of mammography clinic performance. Two mammography clinic computer simulation models were developed, based on an existing public-sector clinic of the Brazilian Cancer Institute, located in Rio de Janeiro city, Brazil. Two clinics in a total of seven configurations (number of equipment units and working personnel) were studied. The models simulated changes in patient arrival rates, number of equipment units, available personnel (technicians and physicians), equipment maintenance scheduling schemes and exam repeat rates. Model parameters were obtained by direct measurements and literature reviews. Commercially available simulation software was used for model building. The best patient scheduling (patient arrival rate) for the studied configurations had an average of 29 min for Clinic 1 (consisting of one mammography equipment unit, one to three technicians and one physician) and 21 min for Clinic 2 (two mammography equipment units, one to four technicians and one physician). The exam repeat rate and equipment maintenance scheduling simulations indicated that a large impact on patient waiting time would appear in the smaller-capacity configurations. Discrete-event simulation was a useful tool for defining optimal operating conditions for the studied clinics, indicating the most adequate capacity configurations and equipment maintenance schedules.
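
    The essence of such a model is a small event-driven queue. The sketch below is a deliberately minimal stand-in for the commercial package used in the study: exponential patient arrivals compete for a fixed number of mammography units, and the mean wait is reported. All parameter values are illustrative.

    ```python
    import heapq, random

    def simulate_clinic(n_patients, arrival_rate, exam_time, n_units, seed=1):
        """Mean waiting time [min] for a clinic with n_units machines."""
        rng = random.Random(seed)
        t = 0.0
        free_at = [0.0] * n_units          # next-free time of each unit
        heapq.heapify(free_at)
        waits = []
        for _ in range(n_patients):
            t += rng.expovariate(arrival_rate)   # next patient arrival
            unit_free = heapq.heappop(free_at)   # earliest available unit
            start = max(t, unit_free)
            waits.append(start - t)
            heapq.heappush(free_at, start + exam_time)
        return sum(waits) / len(waits)

    # One patient every ~21 min on average, 15 min exams, two units.
    print(simulate_clinic(1000, arrival_rate=1 / 21, exam_time=15, n_units=2))
    ```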

  19. Residential photovoltaic system simulation: Thermal aspects

    NASA Astrophysics Data System (ADS)

    Hart, G. W.; Raghuraman, P.

    1982-04-01

    A TRNSYS simulation was developed to model the performance of utility-interactive residential photovoltaic energy systems. The PV system is divided into its major functional components, which are individually described with computer models. These models are described in detail. The results of the simulation are compared with actual measured data obtained at MIT Lincoln Laboratory's Northeast Residential Station. The thermal influences on the design of such photovoltaic energy systems are given particular attention.

  20. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

    The performance of personnel in the augmentation and improvement of the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, data processing loads imposed on these configurations, and executive software to control system operations. Through these simulations, NASA has an extremely cost effective capability for the design and analysis of computer-based data management systems.

  1. Polar cap potential saturation during the Bastille Day storm event using global MHD simulation

    NASA Astrophysics Data System (ADS)

    Kubota, Y.; Nagatsuma, T.; Den, M.; Tanaka, T.; Fujita, S.

    2017-04-01

    We investigated the temporal variations and saturation of the cross polar cap potential (CPCP) in the Bastille Day storm event (15 July 2000) by global magnetohydrodynamics (MHD) simulation. The CPCP is considered to depend on the electric field and dynamic pressure of the solar wind as well as on the ionospheric conductivity. Previous studies considered only the ionospheric conductivity due to solar extreme ultraviolet (EUV) variations. In this paper, we dealt with the changes in the CPCP attributable to auroral conductivity variations caused by pressure enhancement in the inner magnetosphere owing to energy injection from the magnetosphere, since this injection is considerably enhanced in a severe magnetic storm event. Our simulation reveals that the auroral conductivity enhancement is significant for the CPCP variation in a severe magnetic storm event. The numerical results concerning the Bastille Day event show that the ionospheric conductivity averaged over the auroral oval is enhanced up to 18 mho in the case of Bz less than -59 nT. On the other hand, the average conductivity without the auroral effect is almost 6 mho throughout the entire period. As a result, the saturated CPCP is about 240 kV in the former case and 704 kV in the latter when Bz is -59 nT. This result indicates that the CPCP variations can be correctly reproduced when the time variation of auroral conductivity caused by pressure enhancement due to the energy injection from the magnetosphere is correctly considered in a severe magnetic storm event.

  2. Monitoring Cave Recharge in the Edwards Aquifer Recharge Zone for Natural and Simulated Rainfall Events

    NASA Astrophysics Data System (ADS)

    Gregory, L.; Veni, G.; Shade, B.; Wilcox, B. P.; Munster, C. L.; Owens, M. K.

    2005-12-01

    Across semi-arid regions of the world, woody plant encroachment is widespread, with potential implications for groundwater recharge and streamflow. In an effort to better understand the interactions between woody plants and recharge, we are monitoring drip rates in shallow caves in the Edwards Aquifer recharge zone of Central Texas. The surface is covered by a dense stand of ashe juniper (Juniperus ashei). In addition, stemflow, throughfall, and surface runoff were monitored for both natural precipitation events and simulated rainfall. Interception and throughfall are measured using a grid of rain gauges and throughfall collectors. Surface runoff was quantified with a 15.24-centimeter H-flume instrumented with an ultrasonic water level sensor. Drip collectors constructed inside the cave collect recharge entering the cave from the ceiling. Large-scale rainfall simulation equipment onsite allows us to "re-create" naturally occurring rainfall events and compare the resulting data with that from the original event. Performing these types of tests provides important information about the cave footprint's ability to transmit recharge waters into the cave. During a simulation, water is applied directly to the cave footprint and not to the entire hillslope as in a natural rain event. We found that recharge for the natural and simulated events was similar. In each case, recharge makes up less than 5% of the water budget, in spite of the fact that there was little, if any, surface runoff. The working hypothesis is that most of the rainfall is routed off the hillslope as lateral subsurface flow.

  3. Microbial ice nucleators scavenged from the atmosphere during simulated rain events

    NASA Astrophysics Data System (ADS)

    Hanlon, Regina; Powers, Craig; Failor, Kevin; Monteil, Caroline L.; Vinatzer, Boris A.; Schmale, David G.

    2017-08-01

    Rain and snow collected at ground level have been found to contain biological ice nucleators. These ice nucleators have been proposed to have originated in clouds, where they may have participated in the formation of precipitation via ice phase nucleation. We conducted a series of field experiments to test the hypothesis that at least some of the microbial ice nucleators (prokaryotes and eukaryotes) present in rain may not originate in clouds but instead be scavenged from the lower atmosphere by rainfall. Thirty-three simulated rain events were conducted over four months off the side of the Smart Road Bridge in Blacksburg, VA, USA. In each event, sterile water was dispensed over the side of the bridge and recovered in sterile containers in an open fallow agricultural field below (a distance of ∼55 m). Microbes scavenged from the simulated rain events were cultured and their ice nucleation activity was examined. Putative microbial ice nucleators were cultured from 94% (31/33) of the simulated rain events, and represented 1.5% (121/8331) of the total colonies assayed. Putative ice nucleators were subjected to additional droplet freezing assays, and those confirmed through these repeated assays represented 0.4% (34/8331) of the total. Mean CFUs scavenged by simulated rain ranged from 2 to 267 CFUs/mL. Scavenged ice nucleators belong to a number of taxa including the bacterial genera Pseudomonas, Pantoea, and Xanthomonas, and the fungal genera Fusarium, Humicola, and Mortierella. An ice-nucleating strain of the fungal genus Penicillium was also recovered from a volumetric air sampler at the study site. This work expands our knowledge of the scavenging properties of rainfall, and suggests that at least some ice nucleators in natural precipitation events may have been scrubbed from the atmosphere during rainfall, and thus are not likely to be involved in precipitation.

  4. Military healthcare providers reporting of adverse events following immunizations to the vaccine adverse event reporting system.

    PubMed

    Li, Rongxia; McNeil, Michael M; Pickering, Susanne; Pemberton, Michael R; Duran, Laurie L; Collins, Limone C; Nelson, Michael R; Engler, Renata J M

    2014-04-01

    We studied military health care provider (HCP) practices regarding the reporting of adverse events following immunization (AEFI). A convenience sample of HCPs was surveyed to assess familiarity with the Vaccine Adverse Event Reporting System (VAERS), the AEFI they were likely to report, the methods used and preferred for reporting, and perceived barriers to reporting. We analyzed factors associated with HCPs reporting AEFI to VAERS. A total of 547 surveys were distributed, with 487 completed and returned, for an 89% response rate. The percentage of HCPs aware of VAERS (54%) varied by occupation. Forty-seven percent of respondents reported knowledge of at least one AEFI, with only 34% of these indicating that they had ever reported to VAERS. More serious events were more likely to be reported. Factors associated with HCPs reporting AEFI in bivariate analysis included familiarity with filing a paper VAERS report, familiarity with filing an electronic VAERS report, familiarity with VAERS in general, and time spent on immunization tasks. In a multivariable analysis, only familiarity with filing a paper VAERS report was statistically significant (odds ratio = 115.3; p < 0.001). Specific educational interventions targeted to military HCPs likely to see AEFI but not currently filing VAERS reports may improve vaccine safety reporting practices. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  5. Event-chain Monte Carlo algorithms for hard-sphere systems.

    PubMed

    Bernard, Etienne P; Krauth, Werner; Wilson, David B

    2009-11-01

    In this paper we present the event-chain algorithms, which are fast Markov-chain Monte Carlo methods for hard spheres and related systems. In a single move of these rejection-free methods, an arbitrarily long chain of particles is displaced, and long-range coherent motion can be induced. Numerical simulations show that event-chain algorithms clearly outperform the conventional Metropolis method. Irreversible versions of the algorithms, which violate detailed balance, improve the speed of the method even further. We also compare our method with a recent implementation of the molecular-dynamics algorithm.
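
    The displacement rule at the heart of the event-chain method is compact enough to sketch. The following Python fragment (an illustrative sketch under simplifying assumptions, not the authors' code) implements the straight-chain variant for hard disks in a periodic square box: each chain moves a disk in the +x direction until it either exhausts the chain budget or collides, in which case the remaining budget is transferred to the disk that was hit. The disk count, density, and chain length are arbitrary illustrative choices.

      import numpy as np

      def advance_chain(pos, sigma, box, active, ell):
          """Carry one event chain of total displacement ell in the +x direction."""
          n = len(pos)
          while ell > 0.0:
              s_min, hit = ell, None
              for j in range(n):
                  if j == active:
                      continue
                  dy = (pos[j, 1] - pos[active, 1] + box / 2) % box - box / 2
                  if abs(dy) >= sigma:
                      continue                  # disk j is not on the collision track
                  dx = (pos[j, 0] - pos[active, 0]) % box
                  s = dx - np.sqrt(sigma**2 - dy**2)
                  if 0.0 < s < s_min:
                      s_min, hit = s, j
              pos[active, 0] = (pos[active, 0] + s_min) % box
              ell -= s_min
              if hit is None:
                  break                         # budget exhausted in free flight
              active = hit                      # lift the chain onto the disk we hit
          return pos

      rng = np.random.default_rng(1)
      box, sigma = 10.0, 0.5                    # box side and disk diameter
      pos = np.array([[2.5 * i + 1.25, 2.5 * j + 1.25]
                      for i in range(4) for j in range(4)])  # non-overlapping start
      for sweep in range(1000):
          advance_chain(pos, sigma, box, active=int(rng.integers(16)), ell=2.0)

    In a production code the O(N) inner scan would be replaced by neighbor lists, but the transfer-of-displacement logic is unchanged.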

  6. Event-triggered reliable control for fuzzy Markovian jump systems with mismatched membership functions.

    PubMed

    Hou, Liyuan; Cheng, Jun; Qi, Wenhai

    2017-01-01

    The problem of event-triggered reliable control for fuzzy Markovian jump systems (FMJSs) with mismatched membership functions (MMFs) is addressed. Based on mode-dependent reliable control and an event-triggered communication scheme, the stability conditions and control design procedure are formulated. More precisely, a general actuator-failure model is designed such that the FMJS is reliable in the sense of stochastic stability while the utilization of network resources is reduced. Furthermore, improved MMFs are introduced to reduce the conservativeness of the obtained results. Finally, simulation results indicate the effectiveness of the proposed methodology.

  7. Reaction Event Counting Statistics of Biopolymer Reaction Systems with Dynamic Heterogeneity.

    PubMed

    Lim, Yu Rim; Park, Seong Jun; Park, Bo Jung; Cao, Jianshu; Silbey, Robert J; Sung, Jaeyoung

    2012-04-10

    We investigate the reaction event counting statistics (RECS) of an elementary biopolymer reaction in which the rate coefficient depends on the states of the biopolymer and the surrounding environment, and we discover a universal kinetic phase transition in the RECS of the reaction system with dynamic heterogeneity. From an exact analysis of a general model of elementary biopolymer reactions, we find that the variance in the number of reaction events depends on the square of the mean number of reaction events when the measurement time is small compared with the relaxation time scale of the rate-coefficient fluctuations, which does not conform to renewal statistics. On the other hand, when the measurement time interval is much greater than the relaxation time of the rate-coefficient fluctuations, the variance becomes linearly proportional to the mean reaction number, in accordance with renewal statistics. Gillespie's stochastic simulation method is generalized for the reaction system with a fluctuating rate coefficient. The simulation results confirm the correctness of the analytic results for the time-dependent mean and variance of the reaction event number distribution. On the basis of the obtained results, we propose a method of quantitative analysis for the reaction event counting statistics of reaction systems with rate-coefficient fluctuations, which enables one to extract information about the magnitude and the relaxation times of the fluctuating reaction rate coefficient without the bias that can be introduced by assuming a particular kinetic model of conformational dynamics and conformation-dependent reactivity. An exact relationship is established between a higher moment of the reaction event number distribution and the multitime correlation of the reaction rate for reaction systems with a nonequilibrium initial state distribution as well as for systems with an equilibrium initial state distribution.
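
    The generalized Gillespie scheme can be illustrated with a minimal two-state model of dynamic heterogeneity: a reaction whose rate coefficient switches between a slow and a fast value at rate gamma. In the sketch below (our illustration with arbitrary rates, not the authors' code), switching and reaction are treated as competing exponential channels, and reaction events are counted over a window T; comparing the variance with the mean for T short versus long relative to 1/gamma reproduces the qualitative crossover described above.

      import numpy as np

      rng = np.random.default_rng(0)

      def count_events(T, k=(0.2, 2.0), gamma=0.5):
          """Count reaction events in [0, T] for a rate coefficient that
          switches between k[0] and k[1] at rate gamma."""
          t, state, n = 0.0, int(rng.integers(2)), 0
          while True:
              r_react, r_switch = k[state], gamma
              r_tot = r_react + r_switch
              t += rng.exponential(1.0 / r_tot)
              if t > T:
                  return n
              if rng.random() < r_react / r_tot:
                  n += 1              # a reaction event fired
              else:
                  state = 1 - state   # the environment switched

      for T in (0.5, 100.0):          # short vs long on the 1/gamma time scale
          counts = np.array([count_events(T) for _ in range(5000)])
          m, v = counts.mean(), counts.var()
          print(f"T={T}: mean={m:.3f}, var={v:.3f}, var/mean={v / m:.2f}")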

  8. Using Discrete Event Computer Simulation to Improve Patient Flow in a Ghanaian Acute Care Hospital

    PubMed Central

    Best, Allyson M.; Dixon, Cinnamon A.; Kelton, W. David; Lindsell, Christopher J.

    2014-01-01

    Objectives Crowding and limited resources have increased the strain on acute care facilities and emergency departments (EDs) worldwide. These problems are particularly prevalent in developing countries. Discrete event simulation (DES) is a computer-based tool that can be used to estimate how changes to complex healthcare delivery systems, such as EDs, will affect operational performance. Using this modality, our objective was to identify operational interventions that could potentially improve patient throughput of one acute care setting in a developing country. Methods We developed a simulation model of acute care at a district-level hospital in Ghana to test the effects of resource-neutral (e.g., modified staff start times and roles) and resource-additional (e.g., increased staff) operational interventions on patient throughput. Previously captured, de-identified time-and-motion data from 487 acute care patients were used to develop and test the model. The primary outcome was the modeled effect of interventions on patient length of stay (LOS). Results The base-case (no change) scenario had a mean LOS of 292 minutes (95% CI 291, 293). In isolation, neither adding staff, changing staff roles, nor varying shift times substantially affected overall patient LOS; adding two registration workers, history takers, and physicians resulted in only a 23.8 (95% CI 22.3, 25.3) minute LOS decrease. However, when shift start times were coordinated with patient arrival patterns, the potential mean LOS decrease was 96 minutes (95% CI 94, 98); and with the simultaneous combination of staff roles (registration and history-taking) there was an overall mean LOS reduction of 152 minutes (95% CI 150, 154). Conclusions Resource-neutral interventions identified through DES modeling have the potential to improve acute care throughput in this Ghanaian municipal hospital. DES offers another approach to identifying potentially effective interventions to improve patient flow in emergency and acute care settings.

  9. Simulation of Anomalous Regional Climate Events with a Variable Resolution Stretched Grid GCM

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.

    1999-01-01

    The stretched-grid approach provides efficient downscaling and consistent interactions between global and regional scales by using one variable-resolution model for the integrations. It is a workable alternative to the widely used nested-grid approach, which was introduced over a decade ago as a pioneering step in regional climate modeling. A variable-resolution General Circulation Model (GCM) employing a stretched grid, with enhanced resolution over the US as the area of interest, is used for simulating two anomalous regional climate events: the US summer drought of 1988 and the flood of 1993. A special mode of integration using the stretched-grid GCM and a data assimilation system is developed that allows the nested-grid framework to be imitated. This mode is useful for intercomparison purposes and for underlining the differences between the two approaches. The 1988 and 1993 integrations are performed for the two-month period starting in mid-May. The regional resolution used in most of the experiments is 60 km. The major goal and the result of the study is efficient downscaling over the area of interest. The monthly mean prognostic regional fields for the stretched-grid integrations are remarkably close to those of the verifying analyses. Simulated precipitation patterns are successfully verified against gauge precipitation observations. The impact of a finer 40 km regional resolution is investigated for the 1993 integration, and an example of recovering subregional precipitation is presented. The results show that the global variable-resolution stretched-grid approach is a viable candidate for regional and subregional climate studies and applications.

  10. Mixed-realism simulation of adverse event disclosure: an educational methodology and assessment instrument.

    PubMed

    Matos, Francisco M; Raemer, Daniel B

    2013-04-01

    Physicians have an ethical duty to disclose adverse events to patients or families. Various strategies have been reported for teaching disclosure, but no instruments have been shown to be reliable for assessing them. The aims of this study were to report a structured method for teaching adverse event disclosure using mixed-realism simulation, to develop and begin to validate an instrument for assessing performance, and to describe the disclosure practice of anesthesiology trainees. Forty-two anesthesiology trainees participated in a two-part exercise with mixed-realism simulation. The first part took place in a simulated operating room, where trainees became enmeshed in a clinical episode with a mannequin patient that led to an adverse event; the second part took place in a simulated postoperative care unit, where the learner was asked to disclose the event to a standardized patient who systematically moved through epochs of grief response. Two raters scored subjects using an assessment instrument we developed that combines a 4-element behaviorally anchored rating scale (BARS) with a 5-stage objective rating scale. The performance scores for elements within the BARS and the 5-stage instrument showed excellent interrater reliability (Cohen's κ = 0.7), appropriate range (mean range for BARS, 4.20-4.47; mean range for the 5-stage instrument, 3.73-4.46), and high internal consistency (P < 0.05). We have demonstrated a comprehensive methodology using mixed-realism simulation that engages learners in an adverse event and allows them to practice disclosure across a structured range of patient responses. We have developed a reliable two-part instrument with strong psychometric properties for assessing disclosure performance.

  11. Efficiency of endoscopy units can be improved with use of discrete event simulation modeling.

    PubMed

    Sauer, Bryan G; Singh, Kanwar P; Wagner, Barry L; Vanden Hoek, Matthew S; Twilley, Katherine; Cohn, Steven M; Shami, Vanessa M; Wang, Andrew Y

    2016-11-01

    Background and study aims: The projected increase in demand for health services obligates healthcare organizations to operate efficiently. Discrete event simulation (DES) is a modeling method that allows for optimization of systems through virtual testing of different configurations before implementation. The objective of this study was to identify strategies to improve the daily efficiency of an endoscopy center with the use of DES. Methods: We built a DES model of a five-procedure-room endoscopy unit at a tertiary-care university medical center. After validating the baseline model, we tested alternate configurations for running the endoscopy suite and evaluated the outcomes associated with each change. The main outcome measures included the adequate number of preparation and recovery rooms, blocked inflow, delay times, blocked outflow, and patient cycle time. Results: Based on a sensitivity analysis, the adequate number of preparation rooms is eight and of recovery rooms is nine for a five-procedure-room unit (a total of 3.4 preparation and recovery rooms per procedure room). Simple changes to procedure scheduling and patient arrival times led to a modest improvement in efficiency. Increasing the preparation/recovery rooms based on the sensitivity analysis led to significant improvements in efficiency. Conclusions: By applying tools such as DES, we can model changes in an environment with complex interactions and find ways to improve the medical care we provide. DES is applicable to any endoscopy unit and would be particularly valuable to those who are trying to improve the efficiency of care and the patient experience.

  13. Discrete event simulation for healthcare organizations: a tool for decision making.

    PubMed

    Hamrock, Eric; Paige, Kerrie; Parks, Jennifer; Scheulen, James; Levin, Scott

    2013-01-01

    Healthcare organizations face challenges in efficiently accommodating increased patient demand with limited resources and capacity. The modern reimbursement environment prioritizes the maximization of operational efficiency and the reduction of unnecessary costs (i.e., waste) while maintaining or improving quality. As healthcare organizations adapt, significant pressures are placed on leaders to make difficult operational and budgetary decisions. In lieu of hard data, decision makers often base these decisions on subjective information. Discrete event simulation (DES), a computerized method of imitating the operation of a real-world system (e.g., healthcare delivery facility) over time, can provide decision makers with an evidence-based tool to develop and objectively vet operational solutions prior to implementation. DES in healthcare commonly focuses on (1) improving patient flow, (2) managing bed capacity, (3) scheduling staff, (4) managing patient admission and scheduling procedures, and (5) using ancillary resources (e.g., labs, pharmacies). This article describes applicable scenarios, outlines DES concepts, and describes the steps required for development. An original DES model developed to examine crowding and patient flow for staffing decision making at an urban academic emergency department serves as a practical example.
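
    The machinery common to the DES studies collected here is small: a simulation clock, a future-event list ordered by time stamp, and event handlers that change state and schedule further events. The sketch below (a generic single-server illustration with assumed arrival and service rates, not any of the cited models) simulates patient flow through one clinic resource and reports the mean length of stay.

      import heapq, random

      random.seed(7)
      ARRIVAL_RATE, SERVICE_RATE, HORIZON = 1 / 6.0, 1 / 5.0, 10_000.0  # per-minute rates

      events = [(random.expovariate(ARRIVAL_RATE), "arrival", 0)]  # (time, kind, id)
      queue, busy, arrived_at, los, next_id = [], False, {}, [], 0

      while events:
          t, kind, pid = heapq.heappop(events)   # always process the earliest event
          if t > HORIZON:
              break
          if kind == "arrival":
              arrived_at[pid] = t
              if busy:
                  queue.append(pid)              # server occupied: patient waits
              else:
                  busy = True
                  heapq.heappush(events, (t + random.expovariate(SERVICE_RATE),
                                          "departure", pid))
              next_id += 1                       # schedule the next arrival
              heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE),
                                      "arrival", next_id))
          else:                                  # departure: record length of stay
              los.append(t - arrived_at.pop(pid))
              if queue:
                  nxt = queue.pop(0)
                  heapq.heappush(events, (t + random.expovariate(SERVICE_RATE),
                                          "departure", nxt))
              else:
                  busy = False

      print(f"patients seen: {len(los)}, mean LOS: {sum(los) / len(los):.1f} min")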

  14. Numerical simulations of the jetted tidal disruption event Swift J1644+57

    NASA Astrophysics Data System (ADS)

    Mimica, Petar; Aloy, Miguel A.; Giannios, Dimitrios; Metzger, Brian D.

    2016-05-01

    In this work we focus on the technical details of the numerical simulations of the non-thermal transient Swift J1644+57, whose emission is probably produced by a two-component jet powered by a tidal disruption event. In this context we provide details of the coupling between the relativistic hydrodynamic simulations and the radiative transfer code. First, we consider the technical demands of one-dimensional simulations of a fast relativistic jet, and show to what extent (for the same physical parameters of the model) the computed light curves depend on the numerical parameters of the different codes employed. In the second part we explain the difficulties of computing light curves from axisymmetric two-dimensional simulations and discuss a procedure that yields an acceptable tradeoff between the computational cost and the quality of the results.

  15. Numerical Simulation and Analysis of the Localized Heavy Precipitation Event in South Korea based on diagnostic variables

    NASA Astrophysics Data System (ADS)

    Roh, Joon-Woo; Choi, Young-Jean

    2016-04-01

    Accurate prediction of precipitation is one of the most difficult and significant tasks in weather forecasting. Heavy precipitation over the Korean Peninsula is caused by various physical mechanisms, including short-wave troughs, quasi-stationary moisture convergence zones among varying air masses, and direct/indirect effects of tropical cyclones. Many previous studies have used observations, numerical modeling, and statistics to investigate the potential causes of warm-season heavy precipitation in South Korea. Notably, the frequency of warm-season torrential rainfall events exceeding 30 mm/h has increased threefold in Seoul, a metropolitan city in South Korea, over the last 30 years. Localized heavy rainfall events in South Korea generally arise from mesoscale convective systems embedded in synoptic-scale disturbances along the Changma front, or from convective instabilities resulting from unstable air masses. In order to investigate a localized heavy precipitation system in the Seoul metropolitan area, analysis and numerical experiments were performed for a typical event on 20 June 2014. This case was characterized by baroclinic instability associated with a short-wave trough from the northwest and moist, warm air supplied by a thermal low from the southwest of the Korean Peninsula. We investigated localized heavy precipitation in a narrow zone of the Seoul urban area using convective-scale numerical simulations based on the Weather Research and Forecasting (WRF) model. The topography and land-use data from the revised U.S. Geological Survey (USGS) data set and an appropriate set of physical scheme options for the WRF simulation were considered. The simulation experiments showed patches of primary physical structures related to the localized heavy precipitation in the diagnostic fields, which are storm relative helicity (SRH), updraft helicity (UH), and instantaneous contraction rates (ICON). SRH and UH are dominantly related to

  16. Wire chamber requirements and tracking simulation studies for tracking systems at the superconducting super collider

    SciTech Connect

    Hanson, G.G.; Niczyporuk, B.B.; Palounek, A.P.T.

    1989-02-01

    Limitations placed on wire chambers by radiation damage and rate requirements in the SSC environment are reviewed. Possible conceptual designs for wire chamber tracking systems which meet these requirements are discussed. Computer simulation studies of tracking in such systems are presented. Simulations of events from interesting physics at the SSC, including hits from minimum bias background events, are examined. Results of some preliminary pattern recognition studies are given. Such computer simulation studies are necessary to determine the feasibility of wire chamber tracking systems for complex events in a high-rate environment such as the SSC. 11 refs., 9 figs., 1 tab.

  17. Synchronized phasor measurements of a power system event

    SciTech Connect

    Burnett, R.O.; Butts, M.M.; Cease, T.W.; Centeno, V.; Michel, G.; Murphy, R.J.; Phadke, A.G.

    1994-08-01

    The paper describes one of the first field measurements of positive sequence voltage phasors at key system buses during a recent switching operation at Plant Scherer of the Georgia Power Company. The phasor measurements were synchronized using time transmissions from the Global Positioning System (GPS) satellites. The data show the first ever observation of power swings recorded via synchronized phasors at several points on an integrated power network. Measurements were made on the Georgia Power Company (GPC) system, the Florida Power and Light (FPL) system, and on the Tennessee Valley Authority (TVA) system. The disturbance was also simulated on a stability program. Results of the simulation, and a comparison with the observed field data are also included.
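
    The positive-sequence voltage phasor reported in such measurements is obtained from the three phase phasors by the symmetrical-component transformation V1 = (Va + a*Vb + a^2*Vc)/3, with a = exp(j*2*pi/3). The sketch below (a minimal illustration, assuming the GPS-synchronized samples are already available and using a one-cycle DFT for phasor estimation) computes V1 for a synthetic balanced signal.

      import numpy as np

      def phasor(samples, n_per_cycle):
          """One-cycle DFT estimate of the fundamental phasor (peak convention)."""
          n = np.arange(n_per_cycle)
          return 2.0 / n_per_cycle * np.sum(
              samples[:n_per_cycle] * np.exp(-2j * np.pi * n / n_per_cycle))

      def positive_sequence(va, vb, vc):
          a = np.exp(2j * np.pi / 3)               # 120-degree rotation operator
          return (va + a * vb + a * a * vc) / 3.0

      # synthetic balanced 60 Hz test signal sampled at 32 samples per cycle
      fs, f0, N = 60 * 32, 60.0, 32
      t = np.arange(N) / fs
      phases = [0.0, -2 * np.pi / 3, 2 * np.pi / 3]  # a-b-c sequence
      va, vb, vc = (np.cos(2 * np.pi * f0 * t + p) for p in phases)
      v1 = positive_sequence(phasor(va, N), phasor(vb, N), phasor(vc, N))
      print(abs(v1), np.angle(v1, deg=True))         # approx. 1.0 magnitude, 0 degrees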

  18. Safety Discrete Event Models for Holonic Cyclic Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Ciufudean, Calin; Filote, Constantin

    In this paper the expression “holonic cyclic manufacturing systems” refers to complex assembly/disassembly systems or fork/join systems, kanban systems, and, in general, to any discrete event system that transforms raw material and/or components into products. Such a system is said to be cyclic if it provides the same sequence of products indefinitely. This paper considers the scheduling of holonic cyclic manufacturing systems and describes a new approach using the Petri net formalism. We propose an approach to frame the optimum schedule of holonic cyclic manufacturing systems in order to maximize the throughput while minimizing the work in process. We also propose an algorithm to verify the optimum schedule.
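
    The Petri-net formalism used for such scheduling problems reduces, in code, to places holding tokens and transitions that fire when every input place is marked. The sketch below (a generic two-machine cyclic cell of our own devising, not the authors' scheduling algorithm) simulates the token game and prints the repeating firing sequence.

      # Petri net as dicts: transition -> (input places, output places)
      NET = {
          "load_m1":   ({"raw", "m1_free"},    {"m1_busy"}),
          "unload_m1": ({"m1_busy"},           {"buffer", "m1_free"}),
          "load_m2":   ({"buffer", "m2_free"}, {"m2_busy"}),
          "unload_m2": ({"m2_busy"},           {"raw", "m2_free"}),  # cyclic: part re-enters
      }
      marking = {"raw": 1, "m1_free": 1, "m2_free": 1, "buffer": 0,
                 "m1_busy": 0, "m2_busy": 0}

      def enabled(t):
          # a transition is enabled when all of its input places hold a token
          return all(marking[p] > 0 for p in NET[t][0])

      def fire(t):
          # firing consumes one token per input place, produces one per output place
          for p in NET[t][0]:
              marking[p] -= 1
          for p in NET[t][1]:
              marking[p] += 1

      for step in range(8):                 # follow the cycle for a few firings
          t = next(t for t in NET if enabled(t))
          fire(t)
          print(step, t, marking)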

  19. Evaluating the aerosol indirect effect in WRF-Chem simulations of the January 2013 Beijing air pollution event.

    NASA Astrophysics Data System (ADS)

    Peckham, Steven; Grell, Georg; Xie, Ying; Wu, Jian-Bin

    2015-04-01

    In January 2013, an unusual weather pattern over Northern China produced unusually cool, moist conditions for the region. Recent peer-reviewed studies report that during this time period Beijing experienced a historically severe haze and smog event, with observed monthly average fine particulate matter (PM2.5) concentrations exceeding 225 micrograms per cubic meter. MODIS satellite observations yielded AOD values of approximately 1.5 to 2 for the same period. In addition, record-breaking hourly average PM2.5 concentrations of more than 700 μg m-3 were observed over eastern and northern China. Clearly, the severity and persistence of this air pollution episode have raised the interest of the scientific community as well as widespread public attention. Despite the significance of this and similar air pollution events, several questions regarding the ability of numerical weather prediction models to forecast such events remain: • What is the importance of including aerosols in weather prediction models? • What is the current capability of weather prediction models to simulate aerosol impacts upon the weather? • How important is it to include the aerosol feedbacks (direct and indirect effects) in numerical model forecasts? In an attempt to address these and other questions, a Joint Working Group of the Commission for Atmospheric Sciences and the World Climate Research Programme has been convened. This Working Group on Numerical Experimentation (WGNE) has selected several events of interest and has asked its members to generate numerical simulations of the events and examine the results. As part of this project, weather and pollution simulations were produced at the NOAA Earth System Research Laboratory using the Weather Research and Forecasting (WRF) chemistry model. These particular simulations include the aerosol indirect effect and are being done in collaboration with a group in China that will produce

  20. Safety monitoring in the Vaccine Adverse Event Reporting System (VAERS)

    PubMed Central

    Shimabukuro, Tom T.; Nguyen, Michael; Martin, David; DeStefano, Frank

    2015-01-01

    The Centers for Disease Control and Prevention (CDC) and the U.S. Food and Drug Administration (FDA) conduct post-licensure vaccine safety monitoring using the Vaccine Adverse Event Reporting System (VAERS), a spontaneous (or passive) reporting system. This means that after a vaccine is approved, CDC and FDA continue to monitor its safety while it is distributed in the marketplace, by collecting and analyzing spontaneous reports of adverse events that occur in persons following vaccination. Various methods and statistical techniques are used to analyze VAERS data, which CDC and FDA use to guide further safety evaluations and inform decisions around vaccine recommendations and regulatory action. VAERS data must be interpreted with caution due to the inherent limitations of passive surveillance. VAERS is primarily a safety signal detection and hypothesis-generating system. Generally, VAERS data cannot be used to determine whether a vaccine caused an adverse event. VAERS data interpreted alone or out of context can lead to erroneous conclusions about cause and effect, as well as about the risk of adverse events occurring following vaccination. CDC makes VAERS data available to the public and readily accessible online. We describe fundamental vaccine safety concepts, provide an overview of VAERS for healthcare professionals who provide vaccinations and might want to report or better understand a vaccine adverse event, and explain how CDC and FDA analyze VAERS data. We also describe strengths and limitations, and address common misconceptions about VAERS. Information in this review will be helpful for healthcare professionals counseling patients, parents, and others on vaccine safety and the benefit-risk balance of vaccination. PMID:26209838

  1. WRF simulation of downslope wind events in coastal Santa Barbara County

    NASA Astrophysics Data System (ADS)

    Cannon, Forest; Carvalho, Leila M. V.; Jones, Charles; Hall, Todd; Gomberg, David; Dumas, John; Jackson, Mark

    2017-07-01

    The National Weather Service (NWS) considers frequent gusty downslope winds, accompanied by rapid warming and decreased relative humidity, among the most significant weather events affecting southern California coastal areas in the vicinity of Santa Barbara (SB). These extreme conditions, commonly known as "sundowners", have affected the evolution of all major wildfires that impacted SB in recent years. Sundowners greatly increase fire, aviation and maritime navigation hazards and are thus a priority for regional forecasting. Currently, the NWS employs the Weather Research Forecasting (WRF) model at 2 km resolution to complement forecasts at regional-to-local scales. However, no systematic study has been performed to evaluate the skill of WRF in simulating sundowners. This research presents a case study of an 11-day period in spring 2004 during which sundowner events were observed on multiple nights. We perform sensitivity experiments for WRF using available observations for validation and demonstrate that WRF is skillful in representing the general mesoscale structure of these events, though important shortcomings exist. Furthermore, we discuss the generation and evolution of sundowners during the case study using the best performing configuration, and compare these results to hindcasts for two major SB fires. Unique, but similar, profiles of wind and stability are observed over SB between case studies despite considerable differences in large-scale circulation, indicating that common conditions may exist across all events. These findings aid in understanding the evolution of sundowner events and are potentially valuable for event prediction.

  2. Effects of microphysics parameterization schemes on the simulation of a heavy rainfall event in Shanghai

    NASA Astrophysics Data System (ADS)

    Kan, Yu; Liu, Chaoshun; Qiao, Fengxue; Liu, Yanan; Gao, Wei; Sun, Zhibin

    2016-09-01

    A typical heavy rainfall event that occurred in Shanghai on 13 September 2009 was simulated using the Weather Research and Forecasting (WRF) model to study the impact of microphysics parameterization on heavy precipitation simulations. Sensitivity experiments were conducted using the Betts-Miller-Janjic (BMJ) cumulus parameterization scheme with three different microphysics schemes (Lin et al., the WRF Single-Moment 5-class scheme (WSM5), and the WRF Single-Moment 6-class scheme (WSM6)) on three nested domains with horizontal resolutions of 36 km, 12 km, and 4 km. The results showed that all three microphysics schemes are able to capture the general pattern of this heavy rainfall event but differ in simulating the location, center, and intensity of precipitation. Specifically, the Lin scheme overestimated the rainfall intensity and placed the rainfall location too far northeast. The WSM5 scheme better simulated the rainfall location but produced a stronger intensity than observed, while the WSM6 scheme better reproduced the rainfall intensity but with an unrealistic rainfall area.
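
    In WRF these scheme choices are made in the &physics block of namelist.input. The sketch below generates that fragment for the three sensitivity runs, assuming the standard WRF option numbering (mp_physics: 2 = Lin, 4 = WSM5, 6 = WSM6; cu_physics: 2 = BMJ); turning the cumulus scheme off on the convection-permitting 4 km domain (third column 0) is common practice, not a detail reported in the abstract.

      # Render the &physics fragment of a WRF namelist.input for each run.
      # Option numbers follow the standard WRF mapping; everything else
      # about the actual study configuration is unknown to us.
      RUNS = {"Lin": 2, "WSM5": 4, "WSM6": 6}

      def physics_block(mp_option, max_dom=3):
          return "\n".join([
              "&physics",
              f" mp_physics = {', '.join([str(mp_option)] * max_dom)},",
              " cu_physics = 2, 2, 0,",   # BMJ on the 36/12 km domains only
              "/",
          ])

      for name, mp in RUNS.items():
          print(f"! {name} microphysics run")
          print(physics_block(mp))
          print()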

  3. SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeff S.

    1992-01-01

    Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.

  5. An Event-Driven Hybrid Molecular Dynamics and Direct Simulation Monte Carlo Algorithm

    SciTech Connect

    Donev, A; Garcia, A L; Alder, B J

    2007-07-30

    A novel algorithm is developed for the simulation of polymer chains suspended in a solvent. The polymers are represented as chains of hard spheres tethered by square wells and interact with the solvent particles through hard-core potentials. The algorithm uses event-driven molecular dynamics (MD) for the simulation of the polymer chain and the interactions between the chain beads and the surrounding solvent particles. The interactions between the solvent particles themselves are not treated deterministically as in event-driven algorithms; rather, the momentum and energy exchange in the solvent is determined stochastically using the Direct Simulation Monte Carlo (DSMC) method. The coupling between the solvent and the solute is consistently represented at the particle level; however, unlike full MD simulations of both the solvent and the solute, the spatial structure of the solvent is ignored. The algorithm is described in detail and applied to the study of the dynamics of a polymer chain tethered to a hard wall subjected to uniform shear. The algorithm closely reproduces full MD simulations with two orders of magnitude greater efficiency. The results do not confirm the existence of periodic (cycling) motion of the polymer chain.
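
    The event-driven MD component of such a hybrid rests on one closed-form prediction: the time at which two ballistically moving hard spheres first reach contact. A minimal sketch of that standard predictor (our illustration, not the authors' code):

      import numpy as np

      def collision_time(r1, v1, r2, v2, sigma):
          """Earliest time t >= 0 at which |(r1 - r2) + (v1 - v2) t| = sigma,
          or None if the pair never reaches contact."""
          dr, dv = r1 - r2, v1 - v2
          b = np.dot(dr, dv)
          if b >= 0.0:                       # moving apart: no collision
              return None
          v2n = np.dot(dv, dv)
          disc = b * b - v2n * (np.dot(dr, dr) - sigma**2)
          if disc < 0.0:                     # miss: closest approach exceeds sigma
              return None
          return (-b - np.sqrt(disc)) / v2n

      # two approaching unit-diameter spheres, gap of 2 closing at unit speed
      t = collision_time(np.array([0., 0., 0.]), np.array([1., 0., 0.]),
                         np.array([3., 0., 0.]), np.array([0., 0., 0.]), sigma=1.0)
      print(t)   # 2.0

    An event-driven MD loop keeps these predicted times in a priority queue and advances the system from one collision event to the next, with no fixed time step.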

  6. Developing clinical competency in crisis event management: an integrated simulation problem-based learning activity.

    PubMed

    Liaw, S Y; Chen, F G; Klainin, P; Brammer, J; O'Brien, A; Samarasekera, D D

    2010-08-01

    This study aimed to evaluate the integration of a simulation-based learning activity on nursing students' clinical crisis management performance in a problem-based learning (PBL) curriculum. It was hypothesized that the clinical performance of first-year nursing students who participated in a simulated learning activity during the PBL session would be superior to that of those who completed the conventional problem-based session. The students were allocated to either simulation with problem-based discussion (SPBD) or problem-based discussion (PBD) for scenarios on respiratory and cardiac distress. Following completion of each scenario, students from both groups were invited to sit an optional individual test involving a systematic assessment and immediate management of a simulated patient facing a crisis event. A total of thirty students participated in the first post-test, related to the respiratory scenario, and thirty-three participated in the second post-test, related to the cardiac scenario. Their clinical performances were scored using a checklist. Mean test scores for students completing the SPBD were significantly higher than for those completing the PBD in both the first post-test (SPBD 20.08, PBD 18.19) and the second post-test (SPBD 27.56, PBD 23.07). Incorporation of simulation learning activities into problem-based discussion appears to be an effective educational strategy for teaching nursing students to assess and manage crisis events.

  7. Event-Based Robust Control for Uncertain Nonlinear Systems Using Adaptive Dynamic Programming.

    PubMed

    Zhang, Qichao; Zhao, Dongbin; Wang, Ding

    2016-10-18

    In this paper, the robust control problem for a class of continuous-time nonlinear system with unmatched uncertainties is investigated using an event-based control method. First, the robust control problem is transformed into a corresponding optimal control problem with an augmented control and an appropriate cost function. Under the event-based mechanism, we prove that the solution of the optimal control problem can asymptotically stabilize the uncertain system with an adaptive triggering condition. That is, the designed event-based controller is robust to the original uncertain system. Note that the event-based controller is updated only when the triggering condition is satisfied, which can save the communication resources between the plant and the controller. Then, a single network adaptive dynamic programming structure with experience replay technique is constructed to approach the optimal control policies. The stability of the closed-loop system with the event-based control policy and the augmented control policy is analyzed using the Lyapunov approach. Furthermore, we prove that the minimal intersample time is bounded by a nonzero positive constant, which excludes Zeno behavior during the learning process. Finally, two simulation examples are provided to demonstrate the effectiveness of the proposed control scheme.
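
    The event-based mechanism itself is easy to demonstrate on a toy plant: the control signal is recomputed only when the gap between the current state and the last-sampled state exceeds a threshold. The sketch below (a scalar linear example with assumed gain and threshold, far simpler than the paper's ADP design) shows the number of control updates falling well below the number of integration steps.

      import numpy as np

      # scalar plant  x' = a x + b u  with stabilizing gain k (u = -k x_hat)
      a, b, k = 1.0, 1.0, 3.0
      dt, T, delta = 1e-3, 5.0, 0.05        # delta: event-triggering threshold

      x, x_hat = 1.0, 1.0                   # x_hat: state at the last trigger
      u, updates = -k * x_hat, 1
      for step in range(int(T / dt)):
          x += dt * (a * x + b * u)         # Euler integration of the plant
          if abs(x - x_hat) > delta:        # event-triggering condition
              x_hat = x                     # sample the state ...
              u = -k * x_hat                # ... and update the control
              updates += 1

      print(f"steps: {int(T / dt)}, control updates: {updates}, final |x|: {abs(x):.4f}")

    The state converges to a small neighborhood of the origin whose size scales with delta, while the controller transmits only a few dozen updates over thousands of integration steps.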

  8. Method for simulating discontinuous physical systems

    DOEpatents

    Baty, Roy S.; Vaughn, Mark R.

    2001-01-01

    The mathematical foundations of conventional numerical simulation of physical systems provide no consistent description of the behavior of such systems when subjected to discontinuous physical influences. As a result, the numerical simulation of such problems requires ad hoc encoding of specific experimental results in order to address the behavior of such discontinuous physical systems. In the present invention, these foundations are replaced by a new combination of generalized function theory and nonstandard analysis. The result is a class of new approaches to the numerical simulation of physical systems which allows the accurate and well-behaved simulation of discontinuous and other difficult physical systems, as well as simpler physical systems. Applications of this new class of numerical simulation techniques to process control, robotics, and apparatus design are outlined.

  9. Computer simulation of initial events in the biochemical mechanisms of DNA damage

    NASA Technical Reports Server (NTRS)

    Chatterjee, A.; Holley, W. R.

    1993-01-01

    Understanding the systematic and quantitative correlation between the physical events of energy deposition by ionizing radiation and the ensuing chemical and biochemical processes leading to DNA damage is one of the goals in radiation research. Significant progress has been made toward achieving the stated goal by using theoretical modeling techniques. These techniques are strongly dependent on computer simulation procedures. A review of such techniques with details of various stages of simulation development, including a comparison with available experimental data, is presented in this article.

  10. Simulation of air admission in a propeller hydroturbine during transient events

    NASA Astrophysics Data System (ADS)

    Nicolle, J.; Morissette, J.-F.

    2016-11-01

    In this study, multiphysics simulations are carried out to model fluid loading and structural stresses on propeller blades during startup and runaway. It is found that air admission plays an important role during these transient events and that biphasic simulations are therefore required. At the speed-no-load regime, a large air pocket with a vertical free surface forms in the centre of the runner, displacing the water flow near the shroud. This significantly affects the torque developed on the blades and thus the structural loading. The resulting pressures are applied to a quasi-static structural model, and good agreement is obtained with experimental strain gauge data.

  11. Discrete event simulation of the Defense Waste Processing Facility (DWPF) analytical laboratory

    SciTech Connect

    Shanahan, K.L.

    1992-02-01

    A discrete event simulation of the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) analytical laboratory has been constructed in the GPSS language. It was used to estimate laboratory analysis times at process analytical hold points and to study the effect of sample number on those times. Typical results are presented for three different simulations representing increasing levels of complexity and for different sampling schemes. Example equipment utilization time plots are also included. SRS DWPF laboratory management and chemists found the simulations very useful for resource and schedule planning.

  12. Methodology Development of Computationally-Efficient Full Vehicle Simulations for the Entire Blast Event

    DTIC Science & Technology

    2015-08-06

    Simulations of the entire blast event must capture the structural deformation of the floor, gravity flight, and slam-down. The current method of choice to simulate the effect of a shallow-buried IED or mine on a Lagrangian vehicle model is a fluid... Keywords: blast event, gravity flight, return-to-ground, restart, ALE, SPH, ATD, TARDEC Generic Hull, ROM.

  14. Numerical simulations of solar energetic particle event timescales associated with ICMEs

    NASA Astrophysics Data System (ADS)

    Qi, Shi-Yang; Qin, Gang; Wang, Yang

    2017-03-01

    Recently, S. W. Kahler studied the timescales of solar energetic particle (SEP) events associated with coronal mass ejections (CMEs) from analysis of spacecraft data, obtaining different timescales for SEP events, such as TO, the onset time from CME launch to SEP onset; TR, the rise time from onset to half the peak intensity (0.5 I_p); and TD, the duration of the SEP intensity above 0.5 I_p. In this work, we solve the transport equation for SEPs considering interplanetary coronal mass ejection (ICME) shocks as energetic particle sources. With our modeling assumptions, our simulations show results similar to Kahler's analysis of spacecraft data: the weighted average of TD increases with both CME speed and width. Moreover, from our simulation results, we suggest that TD is directly dependent on CME speed but not on CME width, a distinction that was not found in the analysis of the observational data.

  15. Stochastic Optimal Regulation of Nonlinear Networked Control Systems by Using Event-Driven Adaptive Dynamic Programming.

    PubMed

    Sahoo, Avimanyu; Jagannathan, Sarangapani

    2017-02-01

    In this paper, an event-driven stochastic adaptive dynamic programming (ADP)-based technique is introduced for nonlinear systems with a communication network within the feedback loop. A near-optimal control policy is designed using an actor-critic framework and ADP with an event-sampled state vector. First, the system dynamics are approximated by using a novel neural network (NN) identifier with an event-sampled state vector. The optimal control policy is then generated via an actor NN by using the NN identifier and a value function approximated by a critic NN through ADP. The stochastic NN identifier, actor, and critic NN weights are tuned at the event-sampled instants, leading to aperiodic weight-tuning laws. Above all, an adaptive event-sampling condition based on the estimated NN weights is designed by using the Lyapunov technique to ensure ultimate boundedness of all the closed-loop signals along with the approximation accuracy. The net result is an event-driven stochastic ADP technique that can significantly reduce computation and network transmissions. Finally, the analytical design is substantiated with simulation results.

  16. Selective Attention in Multi-Chip Address-Event Systems

    PubMed Central

    Bartolozzi, Chiara; Indiveri, Giacomo

    2009-01-01

    Selective attention is the strategy used by biological systems to cope with the inherent limits in their available computational resources, in order to efficiently process sensory information. The same strategy can be used in artificial systems that have to process vast amounts of sensory data with limited resources. In this paper we present a neuromorphic VLSI device, the “Selective Attention Chip” (SAC), which can be used to implement these models in multi-chip address-event systems. We also describe a real-time sensory-motor system, which integrates the SAC with a dynamic vision sensor and a robotic actuator. We present experimental results from each component in the system, and demonstrate how the complete system implements a real-time stimulus-driven selective attention model. PMID:22346689
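
    In an address-event representation (AER) system, each spike travels as an (address, timestamp) pair, and attention can be modeled by leaky integration of activity per address followed by winner-take-all selection. The sketch below is a toy software analogue of that idea (the SAC itself is analog VLSI; the stream, time constant, and addresses here are invented for illustration).

      import math
      from collections import defaultdict

      # a merged address-event stream: (timestamp in microseconds, pixel address)
      stream = [(10, 5), (12, 5), (15, 9), (18, 5), (40, 9),
                (41, 9), (42, 9), (60, 5)]

      TAU = 20.0                          # leak time constant in microseconds
      saliency = defaultdict(float)
      last_t = 0.0
      for t, addr in stream:
          decay = math.exp(-(t - last_t) / TAU)
          for k in saliency:              # leaky integration: old activity fades
              saliency[k] *= decay
          saliency[addr] += 1.0           # each event charges its own address
          last_t = t
          winner = max(saliency, key=saliency.get)   # winner-take-all selection
          print(f"t={t:3d}us  event at address {addr}  ->  attending to {winner}")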

  18. Cost comparison of orthopaedic fracture pathways using discrete event simulation in a Glasgow hospital.

    PubMed

    Anderson, Gillian H; Jenkins, Paul J; McDonald, David A; Van Der Meer, Robert; Morton, Alec; Nugent, Margaret; Rymaszewski, Lech A

    2017-09-07

    Healthcare faces the continual challenge of improving outcomes while aiming to reduce costs. The aim of this study was to determine the micro-cost differences of the Glasgow non-operative trauma virtual pathway in comparison to a traditional pathway. Discrete event simulation was used to model and analyse cost and resource utilisation with an activity-based costing approach. Data for a full comparison before the process change were unavailable, so we used a modelling approach, comparing a virtual fracture clinic (VFC) with a simulated traditional fracture clinic (TFC). The orthopaedic unit VFC pathway pioneered at Glasgow Royal Infirmary has attracted significant attention and interest and is the focus of this cost study. Our study focused exclusively on patients with non-operative trauma attending the emergency department or the minor injuries unit and the subsequent step in the patient pathway. Retrospective studies of patient outcomes as a result of the protocol introductions for specific injuries are presented in association with activity costs from the models. Patients are satisfied with the new pathway, the information provided, and the outcome of their injuries (Evidence Level IV). There was a 65% reduction in the number of first outpatient face-to-face (f2f) attendances in orthopaedics. In the VFC pathway, the resources required per day were significantly lower for all staff groups (p ≤ 0.001). The overall cost of the VFC pathway was £22.84 (95% CI 21.74 to 23.92) per patient, compared with £36.81 (95% CI 35.65 to 37.97) for the TFC pathway. Our results give a clearer picture of the cost advantage of the virtual pathway over a wholly traditional f2f clinic system. The use of simulation-based stochastic costings in healthcare economic analysis has been limited to date, but this study provides evidence for adoption of this method as a basis for its application in other healthcare settings.

  19. Results of simulated abnormal heating events for full-length nuclear fuel rods

    SciTech Connect

    Guenther, R.J.

    1983-01-01

    Full-length nuclear fuel rods were tested in a furnace to simulate the slow heating rates postulated for commercial pressurized water reactor fuel rods exposed to an overheating event in a storage cask. Fuel rod temperatures and internal gas pressures were monitored during the test and are presented along with dimensional data for the cladding. Metallography of the cladding provided data on grain growth, hydriding, oxidation, cladding stresses, and the general nature of the failures.

  20. Validating numerical simulations of snow avalanches using dendrochronology: the Cerro Ventana event in Northern Patagonia, Argentina

    NASA Astrophysics Data System (ADS)

    Casteller, A.; Christen, M.; Villalba, R.; Martínez, H.; Stöckli, V.; Leiva, J. C.; Bartelt, P.

    2008-05-01

    The damage caused by snow avalanches to property and human lives is underestimated in many regions around the world, especially where this natural hazard remains poorly documented. One such region is the Argentinean Andes, where numerous settlements are threatened almost every winter by large snow avalanches. On 1 September 2002, the largest tragedy in the history of Argentinean mountaineering took place at Cerro Ventana, Northern Patagonia: nine persons were killed and seven others injured by a snow avalanche. In this paper, we combine numerical modeling and dendrochronological investigations to reconstruct this event. Using information released by local governmental authorities and compiled in the field, the avalanche event was numerically simulated using the avalanche dynamics programs AVAL-1D and RAMMS. Avalanche characteristics, such as extent and date, were determined using dendrochronological techniques. Model simulation results were compared with documentary and tree-ring evidence for the 2002 event. Our results show a good agreement between the simulated projection of the avalanche and its reconstructed extent using tree-ring records. Differences between the observed and the simulated avalanche, principally related to the snow height deposition in the run-out zone, are mostly attributed to the low resolution of the digital elevation model used to represent the valley topography. The main contributions of this study are (1) to provide the first calibration of numerical avalanche models for the Patagonian Andes and (2) to highlight the potential of Nothofagus pumilio tree-ring records to reconstruct past snow-avalanche events in time and space. Future research should focus on testing this combined approach in other forested regions of the Andes.

  1. Integrating Existing Simulation Components into a Cohesive Simulation System

    NASA Technical Reports Server (NTRS)

    McLaughlin, Brian J.; Barrett, Larry K.

    2012-01-01

    A tradition of leveraging the re-use of components to help manage costs has evolved in the development of complex systems. This tradition continues in the Joint Polar Satellite System (JPSS) Program with the cloning of the Suomi National Polar-orbiting Partnership (NPP) satellite for the JPSS-1 mission, including the instrument complement. One benefit of re-use on a mission is the availability of existing simulation assets from the systems that were previously built. An issue arises from the continual shift of technology over a long mission, or multi-mission, lifecycle. As the missions mature, the requirements for the observatory simulations evolve. The challenge in this environment becomes re-using the existing components in that ever-changing landscape. To meet this challenge, the system must: establish an operational architecture that minimizes impacts on the implementation of individual components, consolidate the satisfaction of new high-impact requirements into system-level infrastructure, and build in a long-term view of system adaptation that spans the full lifecycle of the simulation system. The Flight Vehicle Test Suite (FVTS) within the JPSS Program is defining and executing this approach to ensure a robust simulation capability for the JPSS multi-mission environment.

  2. Tuning dust Schemes in Weather Research Forecast For Simulating Severe Events Over Egypt

    NASA Astrophysics Data System (ADS)

    ElTahan, Muhammed; shokr, Mohammed; Sherif, Atef O.

    2017-04-01

    The Weather Research and Forecasting model coupled with chemistry (WRF-Chem) was used to simulate selected severe dust storm events over Egypt in terms of aerosol optical depth (AOD). Two severe events, on 22 January 2004 and 31 March 2013, are included in this work. The model results are compared against satellite data from the Moderate Resolution Imaging Spectroradiometer (MODIS) on board NASA's Aqua satellite. The spatial resolution of both data sets is 10 km. A sensitivity analysis of the dust emission schemes in the model was performed to identify the scheme best able to simulate the events. The analysis includes three dust schemes: the Goddard Chemistry Aerosol Radiation and Transport (GOCART) dust emissions, the GOCART dust emissions with the Air Force Weather Agency modification (GOCART-AFWA), and the GOCART scheme with the University of Cologne modifications (GOCART-UOC). Each scheme was tested by adjusting the scheme coefficients related to the dust flux. The WRF-Chem simulations underestimate the AOD for all three schemes. By tuning the scheme coefficients, it was always possible to reduce the bias of the model output relative to the satellite data. Different tunings were required for each case, depending on the origin and composition of the dust storm. Model output and MODIS data were also compared against data from the Aerosol Robotic Network (AERONET) Cairo station.

  3. Helmet mounted display systems for helicopter simulation

    NASA Technical Reports Server (NTRS)

    Haworth, Loran A.; Bucher, Nancy; Runnings, David

    1989-01-01

    Simulation scientists continually pursue improved flight simulation technology with the goal of closely replicating the 'real world' physical environment. The presentation/display of visual information for flight simulation is one such area enjoying recent technical improvements that are fundamental for conducting simulated operations close to the terrain. Detailed and appropriate visual information is especially critical for Nap-Of-the-Earth (NOE) helicopter flight simulation where the pilot maintains an 'eyes-out' orientation to avoid obstructions and terrain. This paper elaborates on the visually coupled Wide Field Of View Helmet Mounted Display (WFOVHMD) system technology as a viable visual display system for helicopter simulation. In addition the paper discusses research conducted on the NASA-Ames Vertical Motion Simulator that examined one critical research issue for helmet mounted displays.

  4. Human visual system-based smoking event detection

    NASA Astrophysics Data System (ADS)

    Odetallah, Amjad D.; Agaian, Sos S.

    2012-06-01

    Human action (e.g. smoking, eating, and phoning) analysis is an important task in various application domains such as video surveillance, video retrieval, and human-computer interaction systems. Smoke detection is a crucial task in many video surveillance applications and could greatly raise the level of safety of urban areas, public parks, airplanes, hospitals, schools and others. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features will change under different lighting and weather conditions. This paper presents a new system for detecting human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. The developed method is also capable of detecting small smoking events and uncertain actions with various cigarette sizes, colors, and shapes.
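
    The front end of such a pipeline, background subtraction plus skin-colour segmentation to narrow the search to moving regions near a face or hand, can be sketched with OpenCV. This is a minimal sketch under invented settings, not the authors' implementation; the YCrCb skin range and the dilation radius are illustrative assumptions.

    ```python
    import cv2
    import numpy as np

    # Background subtractor for the motion-detection stage.
    subtractor = cv2.createBackgroundSubtractorMOG2(history=100, varThreshold=25)

    def candidate_smoke_regions(frame_bgr):
        """Mask of moving, non-skin pixels adjacent to skin (face/hand) areas."""
        motion = subtractor.apply(frame_bgr)                       # moving pixels
        ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
        skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))   # common skin range
        near_skin = cv2.dilate(skin, np.ones((31, 31), np.uint8))  # zone around skin
        not_skin = cv2.bitwise_not(skin)
        return cv2.bitwise_and(motion, cv2.bitwise_and(near_skin, not_skin))

    # Smoke test on synthetic frames: a grey scene with a drifting bright blob.
    for t in range(10):
        frame = np.full((240, 320, 3), 80, dtype=np.uint8)
        cv2.circle(frame, (100 + 5 * t, 120), 20, (200, 200, 200), -1)
        mask = candidate_smoke_regions(frame)
    print("candidate pixels in final frame:", int(np.count_nonzero(mask)))
    ```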

  5. A comparison of active adverse event surveillance systems worldwide.

    PubMed

    Huang, Yu-Lin; Moon, Jinhee; Segal, Jodi B

    2014-08-01

    Post-marketing drug surveillance for adverse drug events (ADEs) has typically relied on spontaneous reporting. Recently, regulatory agencies have turned their attention to more preemptive approaches that use existing data for surveillance. We conducted an environmental scan to identify active surveillance systems worldwide that use existing data for the detection of ADEs. We extracted data about the systems' structures, data, and functions, and synthesized the information across systems to identify their common features. We identified nine active surveillance systems. Two systems are US-based: the FDA Sentinel Initiative (including both the Mini-Sentinel Initiative and the Federal Partner Collaboration) and the Vaccine Safety Datalink (VSD). Two are Canadian: the Canadian Network for Observational Drug Effect Studies (CNODES) and the Vaccine and Immunization Surveillance in Ontario (VISION). Two are European: the Exploring and Understanding Adverse Drug Reactions by Integrative Mining of Clinical Records and Biomedical Knowledge (EU-ADR) Alliance and the Vaccine Adverse Event Surveillance and Communication (VAESCO). Additionally, there are the Asian Pharmacoepidemiology Network (AsPEN) and the Shanghai Drug Monitoring and Evaluative System (SDMES). Finally, we identified two systems in the UK: the Vigilance and Risk Management of Medicines (VRMM) Division and the Drug Safety Research Unit (DSRU), an independent academic unit. These surveillance systems mostly use administrative claims or electronic medical records, and most conduct pharmacovigilance on behalf of a regulatory agency. Either a common data model or a centralized model is used to access existing data. The systems have been built using national data alone or via partnership with other countries. However, active surveillance systems using existing data remain rare. North America and Europe have the most population coverage, with Asian countries making good advances.

  6. Event-triggered nonlinear consensus in directed multi-agent systems with combinational state measurements

    NASA Astrophysics Data System (ADS)

    Li, Huaqing; Chen, Guo; Xiao, Li

    2016-10-01

    Event-triggered sampling control is motivated by applications in which agents are equipped with embedded microprocessors having limited computation and storage resources. This paper studies global consensus in multi-agent systems with inherent nonlinear dynamics on general directed networks using a decentralised event-triggered strategy. For each agent, the controller updates are event-based and triggered only at that agent's own event times, using only its locally available current sampled data. A high-performance sampling event condition that needs only local neighbours' states at their own discrete time instants is presented. Furthermore, we introduce two kinds of general algebraic connectivity, for strongly connected networks and for strongly connected components of a directed network containing a spanning tree, to describe the system's ability to reach consensus. A detailed theoretical analysis of consensus is performed and two criteria are derived by virtue of algebraic graph theory, matrix theory and the Lyapunov control approach. It is shown that Zeno behaviour of the triggering time sequence is excluded during the system's whole working process. A numerical simulation is given to show the effectiveness of the theoretical results.
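
    The triggering idea admits a toy simulation. The sketch below uses single-integrator (linear) agents on a directed ring with a constant trigger threshold, a deliberate simplification of the paper's nonlinear dynamics and state-dependent trigger; every number in it is an invented placeholder. Each agent broadcasts its state only when its measurement error exceeds the threshold, so communication occurs at discrete event times rather than at every step.

    ```python
    import numpy as np

    # Directed ring of 4 agents: agent i listens to agent (i+1) mod 4.
    A = np.array([[0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [1, 0, 0, 0]], dtype=float)
    x = np.array([1.0, -2.0, 0.5, 3.0])   # true states
    xhat = x.copy()                        # last broadcast states
    dt, sigma, events = 0.01, 0.05, 0

    for _ in range(2000):
        # u_i = sum_j a_ij * (xhat_j - xhat_i): control uses broadcast states only.
        u = A @ xhat - A.sum(axis=1) * xhat
        x += dt * u
        trig = np.abs(x - xhat) > sigma    # decentralised trigger condition
        xhat[trig] = x[trig]               # event: agent broadcasts current state
        events += int(trig.sum())

    print(f"final spread {x.max() - x.min():.3f} using {events} broadcasts "
          f"(vs {4 * 2000} under continuous sampling)")
    ```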

  7. Real-Time Multimission Event Notification System for Mars Relay

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.

    2013-01-01

    As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes; for example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system with which mission teams can track and act upon events on a moment-by-moment basis. The software retrieves events from MaROS and displays them to the end user. Updates happen in real time: messages are pushed to the user while logged into the system, and queued for later viewing when the user is not online. The software does not do away with e-mail notifications, but augments them with in-line notifications. Further, it expands the set of events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via e-mail, and a common complaint of users was that the system-generated e-mails often "get lost" among other incoming e-mail. This software allows an expanded set of notifications (including user-generated ones) to be displayed in-line within the program. Separating notifications in this way can improve a user's workflow.
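
    The push-or-queue delivery behaviour described here reduces to a small data structure. The sketch below is an illustrative Python stand-in (MaROS itself is a Java back end; the class and method names are invented) showing how notifications are pushed to logged-in users and held for offline ones.

    ```python
    from collections import defaultdict, deque

    class NotificationHub:
        """Toy hub: push to online users, queue for offline ones."""

        def __init__(self):
            self.online = {}                      # user -> delivery callback
            self.pending = defaultdict(deque)     # user -> queued notifications

        def login(self, user, callback):
            self.online[user] = callback
            while self.pending[user]:             # drain messages queued while away
                callback(self.pending[user].popleft())

        def logout(self, user):
            self.online.pop(user, None)

        def notify(self, user, event):
            if user in self.online:
                self.online[user](event)          # real-time, in-line push
            else:
                self.pending[user].append(event)  # hold for later viewing

    hub = NotificationHub()
    hub.notify("lander_team", "orbiter overflight window changed")   # queued
    hub.login("lander_team", lambda e: print("delivered:", e))       # drained now
    ```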

  8. [Implementation of "never events" checklists in a radiotherapy information system].

    PubMed

    Brusadin, G; Bour, M S; Deutsch, E; Kouchit, N; Corbin, S; Lefkopoulos, D

    2017-08-18

    In order to reduce the incidence of major accidents during external radiotherapy treatment, "never events" checklists have been incorporated into the "record and verify" system. This article details that process. Prospects for improvement are also proposed, including a peer-to-peer audit of checklist use and the radiotherapy information system manufacturer's willingness to collaborate in this effort to secure the patient's journey.

  9. Serious adverse event reporting in a medical device information system.

    PubMed

    Pecoraro, Fabrizio; Luzi, Daniela

    2011-01-01

    The paper describes the design of a module that manages Serious Adverse Event (SAE) reporting within a clinical investigation of medical devices. This module is integrated into a Medical Device Information System (MEDIS) that collects the data and documents exchanged between applicants and the National Competent Authority during the clinical investigation lifecycle. To improve information sharing among the different stakeholders and systems, MEDIS was designed and developed on the basis of the HL7 v.3 standards. The paper provides a conceptual model of SAEs based on the HL7 RIM that highlights medical device characteristics.

  10. Power System Extreme Event Detection: The Vulnerability Frontier

    SciTech Connect

    Lesieutre, Bernard C.; Pinar, Ali; Roy, Sandip

    2007-10-17

    In this work we apply graph-theoretic tools to provide a close bound on a frontier relating the number of line outages in a grid to the power disrupted by the outages. This frontier describes the boundary of a space relating the possible severity of a disturbance, in terms of power disruption, from zero to some maximum on the boundary, to the number of line outages involved in the event. We demonstrate the usefulness of this analysis with a complete analysis of a 30-bus system, and present results for larger systems.
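
    For intuition, such a frontier can be traced by brute force on a small network: for each outage count k, search for the set of k line removals that disconnects the most load. The sketch below does this on an invented 6-bus toy system (topology, loads, and the load-shed metric are all illustrative assumptions; the paper's contribution is bounding this frontier analytically rather than enumerating it).

    ```python
    import itertools
    import networkx as nx

    # Toy 6-bus network: bus 0 generates, the others carry load (MW).
    G = nx.Graph()
    G.add_edges_from([(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4), (3, 5), (4, 5)])
    load = {1: 20, 2: 20, 3: 30, 4: 15, 5: 40}

    def disrupted(removed):
        """Load (MW) cut off from the generator bus when these lines are out."""
        H = G.copy()
        H.remove_edges_from(removed)
        reachable = nx.node_connected_component(H, 0)
        return sum(mw for bus, mw in load.items() if bus not in reachable)

    for k in range(1, 4):   # frontier point for each number of line outages
        worst = max(itertools.combinations(G.edges, k), key=disrupted)
        print(f"{k} outages -> up to {disrupted(worst)} MW lost, e.g. lines {worst}")
    ```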

  11. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes is attracting increasing attention; it can ensure safety and provide products of consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical nonlinear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system provides an effective platform for research on batch-process fault diagnosis.
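
    At the core of such a simulator sits a kinetic model integrated in time. The sketch below is a deliberately minimal stand-in, a logistic growth law with Luedeking-Piret product formation whose rate constants are invented (not the software's actual equations), showing the kind of state trajectory a monitoring platform would display.

    ```python
    from scipy.integrate import solve_ivp

    # Illustrative constants: max growth rate (1/h), carrying capacity (g/L),
    # growth-associated and non-growth-associated product yields.
    mu_max, x_max, alpha, beta = 0.11, 15.0, 0.004, 1e-4

    def rhs(t, y):
        X, P = y                              # biomass and penicillin (g/L)
        dX = mu_max * X * (1.0 - X / x_max)   # logistic biomass growth
        dP = alpha * dX + beta * X            # Luedeking-Piret product formation
        return [dX, dP]

    sol = solve_ivp(rhs, (0.0, 200.0), [0.1, 0.0])   # 200-hour batch from inoculum
    X_end, P_end = sol.y[:, -1]
    print(f"final biomass {X_end:.1f} g/L, penicillin {P_end:.3f} g/L")
    ```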

  12. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    NASA Astrophysics Data System (ADS)

    Guerrier, C.; Holcman, D.

    2017-07-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is the multiple scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they explore a large portion of the space before binding to small targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events, representing Brownian particles finding small targets, that are characterized by long-time distributions. These rare events are the bottleneck of numerical simulations: a naive stochastic simulation requires running many Brownian particles together, which is computationally expensive and inefficient. Solving the associated partial differential equations is also difficult, due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on narrow escape theory, which coarse-grains the geometry of the domain into Poissonian rates. The main application concerns diffusion in cellular biology, where, as an example, we compute the distribution of arrival times of calcium ions at small hidden targets that trigger vesicular release.
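
    The coarse-grained picture admits a compact stochastic simulation. In the sketch below (a generic Gillespie-style illustration with invented numbers, not the authors' model), narrow escape theory is assumed to have replaced the diffusion problem by a Poissonian binding rate k per free particle, so successive binding times are exponential with rate k times the number of particles still free.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    k, N = 0.02, 50   # per-particle Poissonian binding rate (1/s), initial particles

    def arrival_times():
        """One Gillespie realisation: times at which each of N particles binds."""
        t, times = 0.0, []
        for n_free in range(N, 0, -1):
            t += rng.exponential(1.0 / (k * n_free))   # waiting time to next binding
            times.append(t)
        return times

    samples = np.array([arrival_times() for _ in range(1000)])
    print(f"mean first-arrival time {samples[:, 0].mean():.2f} s "
          f"(theory 1/(k*N) = {1.0 / (k * N):.2f} s)")
    ```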

  13. A Low Intensity Rainfall Simulator and Runoff Collection System

    NASA Astrophysics Data System (ADS)

    Ponte, V. M.; Piechota, T. C.

    2001-12-01

    In the past, rainfall simulators have been designed for high-intensity events; however, there is a lack of studies on the simulation of low rainfall intensities. This research presents a system capable of performing rainfall simulations over large areas with intensities as low as 15.5 mm/h (0.6 in/h) and a coefficient of variation greater than 0.8. The system consists of towers distributed spatially at the corners of hypothetical equilateral triangles of varying sizes. The simulation area should lie inside the triangles to ensure adequate coverage and minimize wind effects; thus, the minimum configuration for the system is three towers. Each tower consists of three aluminum legs that are 2.5 m (8 ft) high and support a 1/4 GG - SS 10 nozzle, manufactured by Spraying Systems Corporation. A pressure gauge is incorporated in each tower to monitor the flow. The water supply is pumped from a tank into a main hose that has a flow control and pressure gauge. The water is subdivided into individual hoses that supply each tower; the hose diameters vary according to the number of towers used in each simulation. This configuration is capable of simulating a wide range of rainfall intensities over small and large regions. The runoff is collected with a 1.5-in semicircular PVC pipe. The system is applied to a research project that evaluates the impacts of dust suppressants on disturbed lands.

  14. Using simulation to evaluate warhead monitoring system effectiveness

    SciTech Connect

    Perkins, Casey J.; Brigantic, Robert T.; Keating, Douglas H.; Liles, Karina R.; Meyer, Nicholas J.; Oster, Matthew R.; Waterworth, Angela M.

    2015-07-12

    There is a need to develop and demonstrate technical approaches for verifying potential future agreements to limit and reduce total warhead stockpiles. To facilitate this aim, warhead monitoring systems employ both concepts of operations (CONOPS) and technologies. A systems evaluation approach can be used to assess the relative performance of CONOPS and technologies in their ability to achieve monitoring system objectives, which include: 1) confidence that a treaty accountable item (TAI) initialized by the monitoring system is as declared; 2) confidence that there is no undetected diversion from the monitoring system; and 3) confidence that a TAI is dismantled as declared. Although there are many quantitative methods that can be used to assess system performance against these objectives, this paper focuses on a simulation perspective, primarily for its ability to support analysis of the probabilities that define the operating characteristics of CONOPS and technologies. This paper describes a discrete event simulation (DES) model comprising three major sub-models: TAI lifecycle flow, monitoring activities, and declaration behavior. The DES model seeks to capture all processes and decision points associated with the progression of virtual TAIs, with notional characteristics, through the monitoring system from initialization through dismantlement. The simulation tracks TAI progression (i.e., whether the generated test objects are accepted or rejected at the appropriate points) all the way through dismantlement. Evaluation of TAI lifecycles primarily serves to assess how the order, frequency, and combination of functions in the CONOPS affect system performance as a whole. It is important, however, to note that discrete event simulation is also capable (at a basic level) of addressing vulnerabilities in the CONOPS and interdependencies between individual functions as well. This approach is beneficial because it does not rely on complex mathematical
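
    The skeleton of such a lifecycle-flow sub-model is a time-ordered event queue. The sketch below is a toy illustration only: the stages, pass probabilities, and timing distributions are invented placeholders, not programme values.

    ```python
    import heapq
    import random

    random.seed(2)
    STAGES = ["initialization", "storage_monitoring", "dismantlement"]

    # Event queue of (time, TAI id, stage index); five notional TAIs enter.
    events = [(random.uniform(0.0, 10.0), tai, 0) for tai in range(5)]
    heapq.heapify(events)
    log = []

    while events:
        t, tai, stage = heapq.heappop(events)
        if random.random() < 0.95:                 # inspection passes at this stage
            if stage + 1 < len(STAGES):            # schedule the next lifecycle stage
                heapq.heappush(events, (t + random.expovariate(0.1), tai, stage + 1))
            else:
                log.append((t, tai, "dismantled as declared"))
        else:
            log.append((t, tai, f"rejected at {STAGES[stage]}"))

    for t, tai, outcome in sorted(log):
        print(f"t={t:7.2f}  TAI {tai}: {outcome}")
    ```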

  15. DeMO: An Ontology for Discrete-event Modeling and Simulation

    PubMed Central

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-01-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community. PMID:22919114

  16. DeMO: An Ontology for Discrete-event Modeling and Simulation.

    PubMed

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-09-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community.

  17. Three-Dimensional Event-Driven Hybrid Simulations of Magnetized Plasmas

    NASA Astrophysics Data System (ADS)

    Omelchenko, Y. A.; Karimabadi, H.; Vu, H. X.

    2012-12-01

    Existing space weather frameworks are based on global fluid models of the magnetosphere. However, a mature model of the coupling between regions and of the global response of geospace to solar variations requires global kinetic-scale simulations. One reason for this is that most critical plasma processes regulating mass and energy transfer in the magnetosphere take place at relatively thin ion-scale boundaries and discontinuities (e.g., the bow shock, magnetopause, and magnetotail), where ions control the essential physics. The region between these boundaries is also permeated with multiple ion species and ion-scale waves. Since fully kinetic (kinetic electrons and ions) 3D global simulations will remain out of reach for the foreseeable future, hybrid simulations (electron fluid, kinetic ions) have long been considered the next phase in the global modeling of the magnetosphere. Widely varying time and length scales impose severe numerical constraints on global simulations with hybrid codes. To enable larger simulations we developed a unique multi-dimensional asynchronous (event-driven) hybrid code, HYPERS. Here we report preliminary results from the first 3D parallel asynchronous simulations of magnetized plasmas conducted with this new code.
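
    The essence of event-driven (asynchronous) time integration is that each cell or particle advances on its own local clock, ordered by a global event queue, instead of marching in lockstep with one global time step. The sketch below is a drastically simplified cartoon of that idea applied to 1-D diffusion; it is not the HYPERS algorithm, and the grid size, step sizes, and update rule are invented for illustration.

    ```python
    import heapq
    import numpy as np

    n, D, t_end = 32, 1.0, 0.5
    u = np.zeros(n); u[n // 2] = 1.0         # initial spike of density
    t_cell = np.zeros(n)                      # each cell's local simulation time
    dt = np.full(n, 0.2)                      # per-cell step (could vary with physics)

    queue = [(dt[i], i) for i in range(n)]    # event queue of (next update time, cell)
    heapq.heapify(queue)
    while queue:
        t, i = heapq.heappop(queue)
        if t > t_end:
            continue                          # past the horizon: drop the event
        lap = u[(i - 1) % n] - 2.0 * u[i] + u[(i + 1) % n]
        u[i] += D * (t - t_cell[i]) * lap     # advance cell i over its elapsed time
        t_cell[i] = t
        heapq.heappush(queue, (t + dt[i], i))

    print(f"peak density {u.max():.3f} after asynchronous evolution to t={t_end}")
    ```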

  18. Two Hours of Teamwork Training Improves Teamwork in Simulated Cardiopulmonary Arrest Events.

    PubMed

    Mahramus, Tara L; Penoyer, Daleen A; Waterval, Eugene M E; Sole, Mary L; Bowe, Eileen M

    2016-01-01

    Teamwork during cardiopulmonary arrest events is important for resuscitation, but teamwork improvement programs are usually lengthy. This study assessed the effectiveness of a 2-hour teamwork training program. A prospective, pretest/posttest, quasi-experimental design assessed the teamwork training program targeted at resident physicians, nurses, and respiratory therapists. Participants took part in a simulated cardiac arrest. After the simulation, participants and trained observers assessed perceptions of teamwork using the Team Emergency Assessment Measure (TEAM) tool (ratings of 0 [low] to 4 [high]). A debriefing and 45 minutes of teamwork education followed. Participants then took part in a second simulated cardiac arrest scenario, after which participants and observers again assessed teamwork. Seventy-three team members participated: resident physicians (25%), registered nurses (32%), and respiratory therapists (41%). The physicians had significantly less experience on code teams (P < .001). Baseline teamwork scores were 2.57 to 2.72. Participants' mean (SD) scores on the TEAM tool for the first and second simulations were 3.2 (0.5) and 3.7 (0.4), respectively (P < .001). Observers' mean (SD) TEAM scores for the first and second simulations were 3.0 (0.5) and 3.7 (0.3), respectively (P < .001). Program evaluations by participants were positive. A 2-hour simulation-based teamwork educational intervention resulted in improved perceptions of teamwork behaviors. Participants reported that interactions with other disciplines, teamwork behavior education, and debriefing sessions were beneficial for enhancing the program.

  19. Power electronics system modeling and simulation

    SciTech Connect

    Lai, Jih-Sheng

    1994-12-31

    This paper introduces control-system-design software packages, SIMNON and MATLAB/SIMULINK, for power electronics system simulation. A complete power electronics system typically consists of a rectifier bridge along with its smoothing capacitor, an inverter, and a motor. The system components, whether discrete or continuous, linear or nonlinear, are modeled by mathematical equations. Inverter control methods, such as pulse-width modulation and hysteresis current control, are expressed either as computer algorithms or as digital circuits. After describing the component models and control methods, computer programs are developed for complete system simulation. Simulation results are mainly used for studying system performance, such as input and output current harmonics, torque ripple, and speed response. Key computer programs and simulation results are demonstrated for educational purposes.
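
    The rectifier-plus-smoothing-capacitor front end mentioned above is easy to simulate directly by time stepping its governing equations. The sketch below uses an ideal-diode model with invented component values; it is a generic illustration, not one of the paper's SIMNON or SIMULINK programs.

    ```python
    import numpy as np

    f, Vm = 50.0, 325.0    # mains frequency (Hz) and peak voltage (V)
    R, C = 100.0, 470e-6   # load resistance (ohm) and smoothing capacitor (F)
    dt, T = 1e-5, 0.1      # time step and total simulated time (s)

    vc, ripple = 0.0, []
    for t in np.arange(0.0, T, dt):
        vin = abs(Vm * np.sin(2 * np.pi * f * t))   # full-wave rectified source
        if vin > vc:
            vc = vin                                # diode conducts: C follows source
        else:
            vc -= dt * vc / (R * C)                 # diode blocks: C discharges into R
        if t > T / 2:                               # record after start-up transient
            ripple.append(vc)

    print(f"mean DC {np.mean(ripple):.1f} V, ripple {max(ripple) - min(ripple):.1f} V p-p")
    ```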

  1. Evaluating resilience of DNP3-controlled SCADA systems against event buffer flooding

    SciTech Connect

    Yan, Guanhua; Nicol, David M; Jin, Dong

    2010-12-16

    The DNP3 protocol is widely used in SCADA systems (particularly electrical power) as a means of communicating observed sensor state information back to a control center. Typical architectures using DNP3 have a two-level hierarchy, where a specialized data aggregator device receives observed state from devices within a local region, and the control center collects the aggregated state from the data aggregator. The DNP3 communication between the control center and the data aggregator is asynchronous with the DNP3 communication between the data aggregator and relays; this leads to the possibility of completely filling a data aggregator's buffer of pending events when a relay is compromised or spoofed and sends an excessive number of (false) events to the data aggregator. This paper investigates how a real-world SCADA device responds to event buffer flooding. A discrete-time Markov chain (DTMC) model is developed to understand this behavior. The DTMC model is validated against a Moebius simulation model and data collected on a real SCADA testbed.
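
    A DTMC of buffer occupancy can be prototyped in a few lines. The chain below is an invented birth-death toy (per-slot arrival probability p, service probability q, buffer size B), not the paper's validated model; its stationary distribution shows how a flooding relay (p approaching 1) drives the buffer toward full.

    ```python
    import numpy as np

    B, p, q = 20, 0.7, 0.5            # buffer size, arrival prob, service prob
    P = np.zeros((B + 1, B + 1))      # transition matrix over occupancy states
    for n in range(B + 1):
        up = p * (1 - q) if n < B else 0.0    # arrival without service
        down = q * (1 - p) if n > 0 else 0.0  # service without arrival
        P[n, min(n + 1, B)] += up
        P[n, max(n - 1, 0)] += down
        P[n, n] += 1.0 - up - down            # all other slot outcomes

    pi = np.linalg.matrix_power(P, 10_000)[0]  # start empty, run to steady state
    print(f"P(buffer full) ~ {pi[B]:.3f} with p={p}, q={q}")
    ```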

  2. Global Positioning System Simulator Field Operational Procedures

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Quinn, David A.; Day, John H. (Technical Monitor)

    2002-01-01

    Global Positioning System (GPS) simulation is an important activity in the development and qualification of GPS signal receivers for space flight. Because a GPS simulator is a critical resource, it is highly desirable to develop a set of field operational procedures to supplement the basic procedures provided by most simulator vendors. Validated field procedures allow better utilization of the GPS simulator in the development of new test scenarios and simulation operations. These procedures expedite simulation scenario development while producing scenarios that are more representative of the true design, and they enable construction of more complex simulations than previously possible, for example, spacecraft maneuvers. One difficulty in the development of a simulation scenario is specifying the various modes of test vehicle motion and associated maneuvers, which requires that a user specify some (but not all) of a few closely related simulation parameters; currently this can be done only by trial and error. A stand-alone procedure that implements the simulator maneuver motion equations and solves for the motion profile transient times, jerk and acceleration would be of considerable value. Another procedure would permit the specification of configuration parameters that determine the simulated GPS signal composition; the resulting signal navigation message, for example, would force the receiver under test to use only the intended C-code component of the simulated GPS signal. A representative class of GPS simulation-related field operational procedures is described in this paper. These procedures were developed and used in support of GPS integration and testing for many successful spacecraft missions, such as SAC-A, EO-1, AMSAT, VCL, SeaStar, and sounding rockets, using the industry-standard Spirent Global Simulation Systems Incorporated (GSSI) STR series simulators.
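
    A stand-alone maneuver-profile helper of the kind called for above can be sketched directly. The function below solves for the phase durations of a jerk-limited (trapezoidal-acceleration) velocity change; it is a generic kinematics illustration under assumed symmetric limits, not the vendor's or the paper's actual procedure.

    ```python
    def maneuver_times(dv, a_max, j_max):
        """Return (t_jerk, t_const_accel) for a symmetric S-curve velocity change.

        dv: commanded velocity change (m/s), a_max: accel limit (m/s^2),
        j_max: jerk limit (m/s^3).
        """
        t_j = a_max / j_max                 # ramp acceleration up (and later down)
        if dv >= a_max * t_j:               # dv large enough to reach a_max
            t_a = dv / a_max - t_j          # constant-acceleration plateau duration
        else:                               # short maneuver: never reaches a_max
            t_j = (dv / j_max) ** 0.5
            t_a = 0.0
        return t_j, t_a

    # Example: a 100 m/s velocity change limited to 5 m/s^2 and 1 m/s^3.
    t_j, t_a = maneuver_times(100.0, 5.0, 1.0)
    print(f"jerk phases {t_j:.1f} s each, constant-accel phase {t_a:.1f} s, "
          f"total {2 * t_j + t_a:.1f} s")
    ```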

  3. An electronic notebook for physical system simulation

    NASA Astrophysics Data System (ADS)

    Kelsey, Robert L.

    2003-09-01

    A scientist who sets up and runs experiments typically keeps notes of this process in a lab notebook. A scientist who runs computer simulations should be no different. Experiments and simulations both require a set-up process which should be documented along with the results of the experiment or simulation. The documentation is important for knowing and understanding what was attempted, what took place, and how to reproduce it in the future. Modern simulations of physical systems have become more complex due in part to larger computational resources and increased understanding of physical systems. These simulations may be performed by combining the results from multiple computer codes. The machines that these simulations are executed on are often massively parallel/distributed systems. The output result of one of these simulations can be a terabyte of data and can require months of computing. All of these things contribute to the difficulty of keeping a useful record of the process of setting up and executing a simulation for a physical system. An electronic notebook for physical system simulations has been designed to help document the set up and execution process. Much of the documenting is done automatically by the simulation rather than the scientist running the simulation. The simulation knows what codes, data, software libraries, and versions thereof it is drawing together. All of these pieces of information become documented in the electronic notebook. The electronic notebook is designed with and uses the eXtensible Markup Language (XML). XML facilitates the representation, storage, interchange, and further use of the documented information.
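
    The kind of automatically generated, XML-based notebook entry described here can be sketched with the Python standard library. The element and attribute names below are invented for illustration; the article does not publish its schema.

    ```python
    import xml.etree.ElementTree as ET
    from datetime import datetime, timezone

    # Build a notebook entry the way the simulation itself would: by recording
    # the codes, libraries, and inputs it is drawing together (names invented).
    run = ET.Element("simulation_run", started=datetime.now(timezone.utc).isoformat())
    codes = ET.SubElement(run, "codes")
    ET.SubElement(codes, "code", name="hydro_solver", version="3.2.1")
    ET.SubElement(codes, "library", name="mpi", version="4.0")
    inputs = ET.SubElement(run, "inputs")
    ET.SubElement(inputs, "file", path="mesh_large.h5", sha256="placeholder")
    notes = ET.SubElement(run, "notes")
    notes.text = "Restart of run 42 with refined mesh."   # the human-entered part

    ET.indent(run)   # pretty-print (Python 3.9+)
    print(ET.tostring(run, encoding="unicode"))
    ```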

  4. Simulation, Design Abstraction, and SystemC

    ERIC Educational Resources Information Center

    Harcourt, Ed

    2007-01-01

    SystemC is a system-level design and simulation language based on C++. We've been using SystemC for computer organization and design projects for the past several years. Because SystemC is embedded in C++ it contains the powerful abstraction mechanisms of C++ not found in traditional hardware description languages, such as support for…

  5. The Evaluation of a Pulmonary Display to Detect Adverse Respiratory Events Using High Resolution Human Simulator

    PubMed Central

    Wachter, S. Blake; Johnson, Ken; Albert, Robert; Syroid, Noah; Drews, Frank; Westenskow, Dwayne

    2006-01-01

    Objective: The authors developed a picture-graphics display for pulmonary function to present typical respiratory data used in perioperative and intensive care environments. The display utilizes color, shape and emergent alerting to highlight abnormal pulmonary physiology, and serves as an adjunct to traditional operating room displays and monitors. Design: To evaluate the prototype, nineteen clinician volunteers each managed four adverse respiratory events and one normal event using a high-resolution patient simulator that included either the new displays (intervention subjects) or traditional displays (control subjects). Between-group comparisons included (i) time to diagnosis and treatment for each adverse respiratory event; (ii) the number of unnecessary treatments during the normal scenario; and (iii) self-reported workload estimates while managing study events. Measurements: Two expert anesthesiologists reviewed video-taped transcriptions of the volunteers to determine time to treatment and time to diagnosis. Time values were then compared between groups using a Mann-Whitney U test. Estimated workload for both groups was assessed using the NASA-TLX and compared between groups using an ANOVA. P-values < 0.05 were considered significant. Results: Clinician volunteers detected and treated obstructed endotracheal tubes and intrinsic PEEP problems faster with graphical rather than conventional displays (p < 0.05). During the normal scenario simulation, 3 clinicians using the graphical display and 5 clinicians using the conventional display gave unnecessary treatments. Clinician volunteers reported significantly lower subjective workloads using the graphical display for the obstructed endotracheal tube scenario (p < 0.001) and the intrinsic PEEP scenario (p < 0.03). Conclusion: The authors conclude that the graphical pulmonary display may serve as a useful adjunct to traditional displays in identifying adverse respiratory events. PMID:16929038
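
    As a worked micro-example of the between-group test named above, the snippet below runs a Mann-Whitney U comparison on invented time-to-treat samples; the study's raw data are not given in the abstract.

    ```python
    from scipy.stats import mannwhitneyu

    # Hypothetical times-to-treat (seconds) under each display condition.
    conventional = [95, 120, 88, 140, 110, 105, 98, 132]
    graphical = [62, 70, 55, 90, 66, 78, 59, 84]

    stat, p = mannwhitneyu(graphical, conventional, alternative="two-sided")
    print(f"U = {stat:.1f}, p = {p:.4f}")   # p < 0.05 mirrors the reported direction
    ```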

  6. Characteristics of flight simulator visual systems

    NASA Technical Reports Server (NTRS)

    Statler, I. C. (Editor)

    1981-01-01

    The physical parameters of the flight simulator visual system that characterize the system and determine its fidelity are identified and defined. The characteristics of visual simulation systems are discussed in terms of the basic categories of spatial, energy, and temporal properties, corresponding to the three fundamental quantities of length, mass, and time. Each of these parameters is further addressed in relation to its effect, its appropriate units or descriptors, methods of measurement, and its use or importance to image quality.

  7. Mutual Events in the Uranian satellite system in 2007

    NASA Astrophysics Data System (ADS)

    Arlot, J. E.

    2008-09-01

    The equinox time on the giant planets: When the Sun crosses the equatorial plane of a giant planet, it is the equinox time, occurring every half orbit of the planet, i.e. every 6 years for Jupiter, 14 years for Saturn, 42 years for Uranus and 82 years for Neptune. Except for Neptune, each planet has several major satellites orbiting in the equatorial plane, so during the equinox period the satellites eclipse one another. Since the Earth follows the Sun, a terrestrial observer will also see the satellites occult one another during the same period. These events may be observed with photometric receivers, since the light from the satellites decreases during the events. The light curve provides information on the geometric configuration of the satellites at the time of the event with an accuracy of a few kilometers, independent of the distance of the satellite system. We are thus able to obtain an astrometric observation with an accuracy several times better than direct imaging provides. Equinox on Uranus in 2007: In 2007 it was equinox time on Uranus; the Sun crossed the equatorial plane of Uranus on December 6, 2007. Since the Uranus-Sun opposition was at the end of August 2007, observations were performed from May to December 2007. Since the declination of Uranus was between -5 and -6 degrees, observations were best made from the southern hemisphere. However, some difficulties had to be overcome: the faintness of the satellites (magnitude between 14 and 16) and the brightness of the planet (magnitude 5), which make photometric observation of the satellites difficult. The use of a K' filter on a large telescope increased the number of observable events. Dynamics of the Uranian satellites: One of the goals of the observations was to evaluate the accuracy of the current dynamical models of the motion of the satellites. This knowledge is important for several reasons: most of the time the Uranian system is

  8. Systemic chemokine levels, coronary heart disease, and ischemic stroke events

    PubMed Central

    Canouï-Poitrine, F.; Luc, G.; Mallat, Z.; Machez, E.; Bingham, A.; Ferrieres, J.; Ruidavets, J.-B.; Montaye, M.; Yarnell, J.; Haas, B.; Arveiler, D.; Morange, P.; Kee, F.; Evans, A.; Amouyel, P.; Ducimetiere, P.

    2011-01-01

    Objectives: To quantify the association between systemic levels of the chemokine regulated on activation normal T-cell expressed and secreted (RANTES/CCL5), interferon-γ-inducible protein-10 (IP-10/CXCL10), monocyte chemoattractant protein-1 (MCP-1/CCL2), and eotaxin-1 (CCL11) with future coronary heart disease (CHD) and ischemic stroke events and to assess their usefulness for CHD and ischemic stroke risk prediction in the PRIME Study. Methods: After 10 years of follow-up of 9,771 men, 2 nested case-control studies were built including 621 first CHD events and 1,242 matched controls and 95 first ischemic stroke events and 190 matched controls. Standardized hazard ratios (HRs) for each log-transformed chemokine were estimated by conditional logistic regression. Results: None of the 4 chemokines were independent predictors of CHD, either with respect to stable angina or to acute coronary syndrome. Conversely, RANTES (HR = 1.70; 95% confidence interval [CI] 1.05–2.74), IP-10 (HR = 1.53; 95% CI 1.06–2.20), and eotaxin-1 (HR = 1.59; 95% CI 1.02–2.46), but not MCP-1 (HR = 0.99; 95% CI 0.68–1.46), were associated with ischemic stroke independently of traditional cardiovascular risk factors, hs-CRP, and fibrinogen. When the first 3 chemokines were included in the same multivariate model, RANTES and IP-10 remained predictive of ischemic stroke. Their addition to a traditional risk factor model predicting ischemic stroke substantially improved the C-statistic from 0.6756 to 0.7425 (p = 0.004). Conclusions: In asymptomatic men, higher systemic levels of RANTES and IP-10 are independent predictors of ischemic stroke but not of CHD events. RANTES and IP-10 may improve the accuracy of ischemic stroke risk prediction over traditional risk factors. PMID:21849651

  9. Quantum Simulation for Open-System Dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Dong-Sheng; de Oliveira, Marcos Cesar; Berry, Dominic; Sanders, Barry

    2013-03-01

    Simulat