Science.gov

Sample records for event system simulation

  1. Discrete event simulation of continuous systems

    SciTech Connect

    Nutaro, James J

    2007-01-01

    Computer simulation of a system described by differential equations requires that some element of the system be approximated by discrete quantities. There are two system aspects that can be made discrete: time and state. When time is discrete, the differential equation is approximated by a difference equation (i.e., a discrete time system), and the solution is calculated at fixed points in time. When the state is discrete, the differential equation is approximated by a discrete event system. Events correspond to jumps through the discrete state space of the approximation.
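
    As a concrete illustration of the state-discretization idea, the following minimal Python sketch (our own, not code from the paper) integrates dx/dt = -x by scheduling an event whenever the state crosses a fixed quantum D; the example ODE and the quantum size are assumptions chosen for illustration.

        # First-order quantized-state integration of dx/dt = -x.
        # Instead of stepping time on a fixed grid, the state advances by a
        # quantum D, and each quantum crossing is a scheduled event.
        def qss1(x0=1.0, D=0.01, t_end=5.0):
            t, x = 0.0, x0
            trajectory = [(t, x)]
            while t < t_end:
                dxdt = -x                       # f(x) for the example ODE
                if dxdt == 0.0:
                    break                       # equilibrium: no further events
                dt = D / abs(dxdt)              # time until x moves one quantum
                t += dt
                x += D if dxdt > 0 else -D      # jump through the discrete state space
                trajectory.append((t, x))
            return trajectory

        for t, x in qss1()[:5]:
            print(f"t={t:.4f}  x={x:.4f}")

    Note how the event spacing adapts automatically: events thin out as the derivative shrinks, which is the appeal of the event-based view of a continuous system.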

  2. Synchronous parallel system for emulation and discrete event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1992-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to state variables of the simulation object attributable to the event object, and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring the events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
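
    The event-horizon bookkeeping in this claim can be made concrete with a small sketch. The following Python fragment is a hypothetical illustration (the names and data layout are ours, not the patent's): each node keeps a heap of new, uncommitted events keyed by time stamp; the global event horizon is the earliest local horizon, and only events no later than it are safe to commit.

        import heapq

        def local_event_horizon(pending):
            # Earliest time stamp among a node's new events (infinity if none).
            return pending[0][0] if pending else float("inf")

        def global_event_horizon(nodes):
            # Earliest local horizon across all nodes.
            return min(local_event_horizon(p) for p in nodes)

        def commit_safe_events(nodes):
            # Commit events no later than the global horizon; keep the rest pending.
            horizon = global_event_horizon(nodes)
            committed = []
            for pending in nodes:
                while pending and pending[0][0] <= horizon:
                    committed.append(heapq.heappop(pending))
            return horizon, committed

        nodes = [[(3.0, "a")], [(1.5, "b"), (4.0, "c")], [(2.0, "d")]]
        for p in nodes:
            heapq.heapify(p)
        print(commit_safe_events(nodes))    # -> (1.5, [(1.5, 'b')])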

  3. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in a method of performing object-oriented simulation and a system having inter-connected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.

  4. An event-based hydrologic simulation model for bioretention systems.

    PubMed

    Roy-Poirier, A; Filion, Y; Champagne, P

    2015-01-01

    Bioretention systems are designed to treat stormwater and provide attenuated drainage between storms. Bioretention has shown great potential for reducing the volume and improving the quality of stormwater. This study introduces the bioretention hydrologic model (BHM), a one-dimensional model that simulates the hydrologic response of a bioretention system over the duration of a storm event. BHM is based on the RECARGA model, but has been adapted for improved accuracy and integration of pollutant transport models. BHM contains four completely-mixed layers and accounts for evapotranspiration, overflow, exfiltration to native soils and underdrain discharge. Model results were evaluated against field data collected over 10 storm events. Simulated flows were particularly sensitive to antecedent water content and drainage parameters of bioretention soils, which were calibrated through an optimisation algorithm. Temporal disparity was observed between simulated and measured flows, which was attributed to preferential flow paths formed within the soil matrix of the field system. Modelling results suggest that soil water storage is the most important short-term hydrologic process in bioretention, with exfiltration having the potential to be significant in native soils with sufficient permeability. PMID:26524443
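
    To make the layer-by-layer water balance concrete, here is a deliberately simplified sketch (a single storage layer rather than BHM's four; the capacity and loss rates are illustrative assumptions, not calibrated values from the study).

        # Toy per-time-step water balance for one bioretention storage layer.
        def bioretention_step(storage, inflow, dt,
                              capacity=0.30,    # maximum water storage (m)
                              et_rate=1e-7,     # evapotranspiration (m/s)
                              exfil_rate=5e-6,  # exfiltration to native soil (m/s)
                              drain_rate=2e-5): # underdrain discharge (m/s)
            storage += inflow * dt
            storage = max(storage - (et_rate + exfil_rate + drain_rate) * dt, 0.0)
            overflow = max(storage - capacity, 0.0)   # excess leaves at the surface
            return storage - overflow, overflow

        storage = 0.05
        for step in range(3):                         # three 5-minute storm steps
            storage, overflow = bioretention_step(storage, inflow=2e-5, dt=300)
            print(f"storage={storage:.4f} m, overflow={overflow:.5f} m")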

  5. Rare event simulation of the chaotic Lorenz 96 dynamical system

    NASA Astrophysics Data System (ADS)

    Wouters, Jeroen; Bouchet, Freddy

    2015-04-01

    The simulation of rare events is becoming increasingly important in the climate sciences. Several sessions are devoted to rare and extreme events at this meeting and the IPCC has devoted a special report to risk management of extreme events (SREX). Brute force simulation of rare events can however be very costly. To obtain satisfactory statistics on a 1/1000y event, one needs to perform simulations over several thousands of years. Recently, a class of rare event simulation algorithms has been introduced that could yield significant increases in performance with respect to brute force simulations (see e.g. [1]). In these algorithms an ensemble of simulations is evolved in parallel, while at certain interaction times ensemble members are killed and cloned so as to have better statistics in the region of phase space that is relevant to the rare event of interest. We will discuss the implementational issues and performance gains for these algorithms. We also present results on a first application of a rare event simulation algorithm to a toy model for chaos in the atmosphere, the Lorenz 96 model. We demonstrate that for the estimation of the histogram tail of the energy observable, the algorithm gives a significant error reduction. We will furthermore discuss first results and an outlook on the application of rare event simulation algorithms to study blocking atmospheric circulation and heat wave events in the PlaSim climate model [2]. [1] Del Moral, P. & Garnier, J. Genealogical particle analysis of rare events. The Annals of Applied Probability 15, 2496-2534 (2005). [2] http://www.mi.uni-hamburg.de/Planet-Simul.216.0.html
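
    The clone/kill mechanics can be sketched in a few lines. The following is a toy genealogical algorithm in the spirit of [1] (the random-walk dynamics, tilt parameter K and interaction interval are our assumptions, not the authors' setup): members whose observable grows are cloned, the rest are killed, and the accumulated normalisation keeps the tail estimate unbiased.

        import math, random

        random.seed(1)
        N, STEPS, INTERACT, K = 1000, 200, 10, 0.5

        walkers = [0.0] * N          # stand-in for full model states
        marks = [0.0] * N            # each walker's observable at the last interaction
        log_norm = 0.0               # accumulated log of the normalisation factors

        for step in range(1, STEPS + 1):
            walkers = [x + random.gauss(0.0, 0.1) for x in walkers]
            if step % INTERACT == 0:
                # weight by growth of the observable since the last interaction
                w = [math.exp(K * (x - m)) for x, m in zip(walkers, marks)]
                log_norm += math.log(sum(w) / N)
                walkers = random.choices(walkers, weights=w, k=N)   # clone / kill
                marks = list(walkers)

        # Unbiased estimate of the tail probability P(X_T > a): undo the tilt.
        a = 2.0
        est = math.exp(log_norm) * sum(math.exp(-K * x) for x in walkers if x > a) / N
        print(f"P(X_T > {a}) ~ {est:.3e}")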

  6. Enhancing Complex System Performance Using Discrete-Event Simulation

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E

    2010-01-01

    In this paper, we utilize discrete-event simulation (DES) merged with human factors analysis to provide the venue within which the separation and deconfliction of the system/human operating principles can occur. A concrete example is presented to illustrate the performance enhancement gains for an aviation cargo flow and security inspection system achieved through the development and use of a process DES. The overall performance of the system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered such as system capacity, residual capacity, and total number of pallets waiting for inspection in the queue. These metrics are performance indicators of the system's ability to service current needs and respond to additional requests. We studied and analyzed different scenarios by changing various model parameters such as the pieces-per-pallet ratio, number of inspectors and cargo handling personnel, number of forklifts, number and types of detection systems, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures identified effective ways to meet inspection requirements while maintaining or reducing overall operational cost and eliminating any shipping delays associated with any proposed changes in inspection requirements. With this understanding, effective operational strategies can be developed to optimally use personnel while still maintaining plant efficiency, reducing process interruptions, and holding or reducing costs.
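
    The flavour of such a process model is easy to convey with a bare-bones sketch (ours, not the vetted Arena-style model described above); the arrival and inspection rates, the number of inspectors and the shift length are made-up parameters.

        import heapq, random

        random.seed(7)
        ARRIVAL_RATE, SERVICE_RATE = 1 / 6.0, 1 / 10.0   # pallets/min, inspections/min
        INSPECTORS, SHIFT = 2, 480.0                     # staff, shift length (min)

        events = [(random.expovariate(ARRIVAL_RATE), "arrival")]
        queue = busy = done = max_queue = 0

        while events:
            t, kind = heapq.heappop(events)
            if t > SHIFT:
                break
            if kind == "arrival":
                heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))
                if busy < INSPECTORS:        # a free inspector starts at once
                    busy += 1
                    heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "done"))
                else:                        # all inspectors busy: pallet waits
                    queue += 1
                    max_queue = max(max_queue, queue)
            else:                            # an inspection finishes
                done += 1
                if queue:                    # pull the next waiting pallet
                    queue -= 1
                    heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "done"))
                else:
                    busy -= 1

        print(f"pallets inspected: {done}, worst queue length: {max_queue}")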

  7. An Early Warning System for Loan Risk Assessment Based on Rare Event Simulation

    NASA Astrophysics Data System (ADS)

    Zhou, Hong; Qiu, Yue; Wu, Yueqin

    System simulation is an important tool for risk assessment. In this paper, a new method is presented to deal with credit risk assessment problems for commercial banks based on rare event simulation. The failure probability of repaying loans of listed companies is taken as the criterion to measure the level of credit risk. The rare-event concept is adopted to construct the model of credit risk identification in commercial banks, and a cross-entropy scheme is designed to implement the rare event simulation, based on which the loss probability can be assessed. Numerical experiments have shown that the method has a strong capability to identify the credit risk for commercial banks and offers a good tool for early warning.
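
    For readers unfamiliar with the technique, here is a generic textbook cross-entropy scheme for a rare-event probability (it is not the authors' credit-risk model; the Gaussian toy problem and every parameter are assumptions). The sampling density is shifted step by step toward the rare region, and likelihood ratios then correct the estimate.

        import math, random

        random.seed(3)
        N, RHO, GAMMA = 10_000, 0.05, 4.5
        mu = 0.0                                      # sampling mean, adapted below

        for _ in range(10):                           # cross-entropy updating loop
            sample = sorted(random.gauss(mu, 1.0) for _ in range(N))
            level = sample[int((1 - RHO) * N)]        # (1 - rho)-quantile
            if level >= GAMMA:
                break
            elite = [x for x in sample if x >= level] # elite samples
            mu = sum(elite) / len(elite)              # CE update of the mean

        # Importance-sampling estimate under N(mu, 1); the likelihood ratio of
        # N(0,1) against N(mu,1) is exp(-mu*x + mu^2/2).
        hits = (math.exp(-mu * x + mu * mu / 2)
                for x in (random.gauss(mu, 1.0) for _ in range(N)) if x > GAMMA)
        print(f"P(X > {GAMMA}) ~ {sum(hits) / N:.3e}  (exact ~ 3.4e-06)")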

  8. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  9. Validation of ground-motion simulations for historical events using SDoF systems

    USGS Publications Warehouse

    Galasso, C.; Zareian, F.; Iervolino, I.; Graves, R.W.

    2012-01-01

    The study presented in this paper is among the first in a series of studies toward the engineering validation of the hybrid broadband ground‐motion simulation methodology by Graves and Pitarka (2010). This paper provides a statistical comparison between seismic demands of single degree of freedom (SDoF) systems subjected to past events using simulations and actual recordings. A number of SDoF systems are selected considering the following: (1) 16 oscillation periods between 0.1 and 6 s; (2) elastic case and four nonlinearity levels, from mildly inelastic to severely inelastic systems; and (3) two hysteretic behaviors, in particular, nondegrading–nonevolutionary and degrading–evolutionary. Demand spectra are derived in terms of peak and cyclic response, as well as their statistics for four historical earthquakes: 1979 Mw 6.5 Imperial Valley, 1989 Mw 6.8 Loma Prieta, 1992 Mw 7.2 Landers, and 1994 Mw 6.7 Northridge.
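
    The computational building block behind such demand spectra is the response of one oscillator to one record. A minimal sketch (ours; the synthetic "record" and the Newmark average-acceleration integrator are assumptions, not the paper's processing pipeline):

        import math

        def sdof_peak(ag, dt, period, zeta=0.05):
            # Peak |displacement| of a unit-mass elastic SDoF oscillator under
            # ground acceleration ag, via Newmark average-acceleration integration.
            wn = 2.0 * math.pi / period
            m, c, k = 1.0, 2.0 * zeta * wn, wn * wn
            beta, gamma = 0.25, 0.5
            keff = k + gamma * c / (beta * dt) + m / (beta * dt * dt)
            u = v = 0.0
            p_old = -m * ag[0]
            a = (p_old - c * v - k * u) / m
            peak = 0.0
            for g in ag[1:]:
                p_new = -m * g
                dp = (p_new - p_old
                      + (m / (beta * dt) + gamma * c / beta) * v
                      + (m / (2 * beta) + dt * c * (gamma / (2 * beta) - 1.0)) * a)
                du = dp / keff
                dv = (gamma / (beta * dt)) * du - (gamma / beta) * v \
                     + dt * (1.0 - gamma / (2.0 * beta)) * a
                da = du / (beta * dt * dt) - v / (beta * dt) - a / (2.0 * beta)
                u, v, a, p_old = u + du, v + dv, a + da, p_new
                peak = max(peak, abs(u))
            return peak

        # A decaying 2 Hz sinusoid as a stand-in for a recorded accelerogram.
        dt = 0.01
        record = [0.3 * math.sin(2 * math.pi * 2 * i * dt) * math.exp(-0.5 * i * dt)
                  for i in range(1000)]
        for T in (0.1, 0.5, 1.0, 2.0):                # a few oscillation periods (s)
            print(f"T={T:.1f} s  peak displacement = {sdof_peak(record, dt, T):.4f}")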

  10. Self-Adaptive Event-Driven Simulation of Multi-Scale Plasma Systems

    NASA Astrophysics Data System (ADS)

    Omelchenko, Yuri; Karimabadi, Homayoun

    2005-10-01

    Multi-scale plasmas pose a formidable computational challenge. The explicit time-stepping models suffer from the global CFL restriction. Efficient application of adaptive mesh refinement (AMR) to systems with irregular dynamics (e.g. turbulence, diffusion-convection-reaction, particle acceleration etc.) may be problematic. To address these issues, we developed an alternative approach to time stepping: self-adaptive discrete-event simulation (DES). DES has its origins in operations research, war games and telecommunications. We combine finite-difference and particle-in-cell techniques with this methodology under two assumptions: (1) the local time increment dt for a discrete quantity f can be expressed in terms of a physically meaningful quantum value df; (2) f is considered to be modified only when its change exceeds df. Event-driven time integration is self-adaptive as it makes use of causality rules rather than parametric time dependencies. This technique enables asynchronous, flux-conservative update of the solution in accordance with local temporal scales, removes the curse of the global CFL condition, eliminates unnecessary computation in inactive spatial regions and results in robust and fast parallelizable codes. It can be naturally combined with various mesh refinement techniques. We discuss applications of this novel technology to diffusion-convection-reaction systems and hybrid simulations of magnetosonic shocks.

  11. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.

  12. The IDES framework: A case study in development of a parallel discrete-event simulation system

    SciTech Connect

    Nicol, D.M.; Johnson, M.M.; Yoshimura, A.S.

    1997-12-31

    This tutorial describes considerations in the design and development of the IDES parallel simulation system. IDES is a Java-based parallel/distributed simulation system designed to support the study of complex large-scale enterprise systems. Using the IDES system as an example, the authors discuss how anticipated model and system constraints molded the design decisions with respect to modeling, synchronization, and communication strategies.

  13. Program For Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.

    1991-01-01

    User does not have to add any special logic to aid in synchronization. Time Warp Operating System (TWOS) computer program is special-purpose operating system designed to support parallel discrete-event simulation. Complete implementation of Time Warp mechanism. Supports only simulations and other computations designed for virtual time. Time Warp Simulator (TWSIM) subdirectory contains sequential simulation engine interface-compatible with TWOS. TWOS and TWSIM written in, and support simulations in, C programming language.

  14. Spontaneous onset of a Madden-Julian oscillation event in a cloud-system-resolving simulation

    NASA Astrophysics Data System (ADS)

    Miura, Hiroaki; Satoh, Masaki; Katsumata, Masaki

    2009-07-01

    Spontaneous onset of a Madden-Julian Oscillation (MJO) event in November 2006 was reproduced at the proper location and time by a global cloud-resolving model (CRM) used with a relatively coarse horizontal grid. Preconditioning of moisture was simulated about 4 days prior to the onset in the Indian Ocean, in agreement with in-situ observations. To investigate the influence of the zonal Sea Surface Temperature (SST) gradient in the Indian Ocean, we conducted a sensitivity study comparing composites made from five ensemble simulations. It was found that the eastward-moving signal of this MJO event could be obscured if SST were zonally uniform in the western Indian Ocean. The zonal SST gradient has not been considered important in previous studies of MJO onset, but an SST distribution with cooler water on the west side may help enhance convection in the slow eastward-moving envelopes of the MJO.

  15. Weighted-ensemble Brownian dynamics simulation: sampling of rare events in nonequilibrium systems.

    PubMed

    Kromer, Justus A; Schimansky-Geier, Lutz; Toral, Raul

    2013-06-01

    We provide an algorithm based on weighted-ensemble (WE) methods to accurately sample systems at steady state. Applying our method to different one- and two-dimensional models, we succeed in calculating steady-state probabilities of order 10^-300 and reproduce the Arrhenius law for rates of order 10^-280. Special attention is paid to the simulation of nonpotential systems where no detailed balance assumption exists. For this large class of stochastic systems, the stationary probability distribution density is often unknown and cannot be used as preknowledge during the simulation. We compare the algorithm's efficiency with standard Brownian dynamics simulations and the original WE method.
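
    The essence of WE resampling fits in a short sketch (a generic illustration, not the authors' algorithm; the double-well dynamics, bin width, temperature and walker counts are assumed toy values). Each bin along the coordinate is periodically resampled to a fixed number of walkers while conserving the total statistical weight, so improbable regions stay populated.

        import random
        from collections import defaultdict

        random.seed(5)
        M, DT, TAU, BINW, TEMP = 4, 0.01, 20, 0.5, 0.15

        def drift(x):              # force -V'(x) for the double well V = (x^2 - 1)^2
            return -4.0 * x * (x * x - 1.0)

        walkers = [(-1.0, 1.0 / (8 * M))] * (8 * M)  # (position, weight), weights sum to 1

        for cycle in range(200):
            # unbiased overdamped Brownian dynamics between resampling times
            for _ in range(TAU):
                walkers = [(x + drift(x) * DT + random.gauss(0.0, (2 * TEMP * DT) ** 0.5), w)
                           for x, w in walkers]
            # weighted-ensemble resampling, bin by bin
            bins = defaultdict(list)
            for x, w in walkers:
                bins[int(x // BINW)].append((x, w))
            walkers = []
            for members in bins.values():
                wtot = sum(w for _, w in members)
                xs = [x for x, _ in members]
                ws = [w for _, w in members]
                # draw M positions proportionally to weight; share wtot equally
                walkers += [(x, wtot / M) for x in random.choices(xs, weights=ws, k=M)]

        print("weight in the initially empty well:",
              f"{sum(w for x, w in walkers if x > 0):.3e}")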

  16. Dynamic simulation recalls condensate piping event

    SciTech Connect

    Farrell, R.J.; Reneberg, K.O.; Moy, H.C.

    1994-05-01

    This article describes how experience gained from simulating and reconstructing a condensate piping event will be used by Consolidated Edison to analyze control system problems. A cooperative effort by Con Edison and the Chemical Engineering Department at Polytechnic University used the Modular Modeling System (MMS) to investigate the probable cause of a Con Edison condensate piping event. Con Edison commissioned the work to serve as a case study for the more general problem of control systems analysis using dynamic simulation and MMS.

  17. Integration of scheduling and discrete event simulation systems to improve production flow planning

    NASA Astrophysics Data System (ADS)

    Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.

    2016-08-01

    The increased availability of data and of computer-aided technologies such as MRP I/II, ERP and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. Integration of production scheduling with computer modelling, simulation and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology based on a semi-automatic model generation method is proposed for eliminating problems associated with model complexity and with the labour-intensive, time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied. The approach is illustrated through examples of practical implementation using the KbRS scheduling system and the Enterprise Dynamics simulation system.

  18. APEX - a Petri net process modeling tool built on a discrete-event simulation system

    SciTech Connect

    Gish, J.W.

    1996-12-31

    APEX, the Animated Process Experimentation tool, provides a capability for defining, simulating and animating process models. Although APEX was constructed primarily for the modeling and analysis of software process models, we have found that it is much more broadly applicable and is suitable for process modeling tasks outside the domain of software processes. APEX has been constructed as a library of simulation blocks that implement timed hierarchical colored Petri Nets. These Petri Net blocks operate in conjunction with EXTEND, a general purpose continuous and discrete-event simulation tool. EXTEND provides a flexible, powerful and extensible environment with features particularly suitable for the modeling of complex processes. APEX's Petri Net block additions to EXTEND provide an inexpensive capability with well-defined and easily understood semantics that is a powerful, easy to use, flexible means to engage in process modeling and evaluation. The vast majority of software process research has focused on the enactment of software processes. Little has been said about the actual creation and evaluation of software process models necessary to support enactment. APEX has been built by the Software Engineering Process Technology Project at GTE Laboratories which has been focusing on this neglected area of process model definition and analysis. We have constructed high-level software lifecycle models, a set of models that demonstrate differences between four levels of the SEI Capability Maturity Model (CMM), customer care process models, as well as models involving more traditional synchronization and coordination problems such as producer-consumer and 2-phase commit. APEX offers a unique blend of technology from two different disciplines: discrete-event simulation and Petri Net modeling. Petri Nets provide a well-defined and rich semantics in a simple, easy to understand notation. The simulation framework allows for execution, animation, and measurement of the resultant models.
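
    As a flavour of what a timed Petri net block computes, here is a tiny generic interpreter (ours, not APEX or EXTEND code): places hold tokens, and an enabled transition consumes its input tokens immediately but deposits its output tokens only after a fixed delay, via a discrete-event queue.

        import heapq

        marking = {"raw": 3, "machine_free": 1, "busy": 0, "done": 0}
        transitions = {
            "start_job":  {"in": ["raw", "machine_free"], "out": ["busy"], "delay": 0.0},
            "finish_job": {"in": ["busy"], "out": ["done", "machine_free"], "delay": 5.0},
        }

        t, queue = 0.0, []
        while True:
            for name, tr in transitions.items():
                while all(marking[p] > 0 for p in tr["in"]):        # enabled?
                    for p in tr["in"]:
                        marking[p] -= 1                             # consume inputs now
                    heapq.heappush(queue, (t + tr["delay"], name))  # complete later
            if not queue:
                break
            t, name = heapq.heappop(queue)       # advance to the next completion
            for p in transitions[name]["out"]:
                marking[p] += 1                  # deposit outputs

        print(f"net dead at t={t}: {marking}")   # 3 jobs done at t=15.0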

  19. Anticipating the Chaotic Behaviour of Industrial Systems Based on Stochastic, Event-Driven Simulations

    NASA Astrophysics Data System (ADS)

    Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra

    2004-08-01

    In logistics and industrial production managers must deal with the impact of stochastic events to improve performances and reduce costs. In fact, production and logistics systems are generally designed considering some parameters as deterministically distributed. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, and especially for estimated parameters (i.e. Market Request). The proposed methodology can determine the impact of stochastic events in the system by evaluating the chaotic threshold level. Such an approach, based on the application of a new and innovative methodology, can be implemented to find the condition under which chaos makes the system become uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and contingency plan estimation. In this paper the authors illustrate the methodology with respect to a real industrial case: a production problem related to the logistics of distributed chemical processing.

  20. Simulating extreme-mass-ratio systems in full general relativity: tidal disruption events

    NASA Astrophysics Data System (ADS)

    East, William; Pretorius, Frans

    2014-03-01

    Sparked by recent and anticipated observations, there is considerable interest in understanding events where a star is tidally disrupted by a massive black hole. Motivated by this and other applications, we introduce a new method for numerically evolving the full Einstein field equations in situations where the spacetime is dominated by a known background solution. The technique leverages the knowledge of the background solution to subtract off its contribution to the truncation error, thereby more efficiently achieving a desired level of accuracy. We demonstrate how the method can be applied to systems consisting of a solar-type star and a supermassive black hole with mass ratios >= 10^6. The self-gravity of the star is thus consistently modelled within the context of general relativity, and the star's interaction with the black hole computed with moderate computational cost, despite the over five orders of magnitude difference in gravitational potential (as defined by the ratio of mass to radius). We study the tidal deformation of the star during infall, as well as the gravitational wave emission, and discuss ongoing work to understand the importance of strong-field gravity effects on tidal disruption events.

  1. Algorithm and simulation development in support of response strategies for contamination events in air and water systems.

    SciTech Connect

    Waanders, Bart Van Bloemen

    2006-01-01

    Chemical/Biological/Radiological (CBR) contamination events pose a considerable threat to our nation's infrastructure, especially in large internal facilities, external flows, and water distribution systems. Because physical security can only be enforced to a limited degree, deployment of early warning systems is being considered. However, to achieve reliable and efficient functionality, several complex questions must be answered: (1) where should sensors be placed, (2) how can sparse sensor information be efficiently used to determine the location of the original intrusion, (3) what are the model and data uncertainties, (4) how should these uncertainties be handled, and (5) how can our algorithms and forward simulations be sufficiently improved to achieve real time performance? This report presents the results of a three-year algorithm and application development effort to support the identification, mitigation, and risk assessment of CBR contamination events. The main thrust of this investigation was to develop (1) computationally efficient algorithms for strategically placing sensors, (2) an identification process for contamination events using sparse observations, (3) characterization of uncertainty through accurate demand forecasts and investigation of uncertain simulation model parameters, (4) risk assessment capabilities, and (5) reduced order modeling methods. The development effort was focused on water distribution systems, large internal facilities, and outdoor areas.

  2. A discrete event method for wave simulation

    SciTech Connect

    Nutaro, James J

    2006-01-01

    This article describes a discrete event interpretation of the finite difference time domain (FDTD) and digital wave guide network (DWN) wave simulation schemes. The discrete event method is formalized using the discrete event system specification (DEVS). The scheme is shown to have errors that are proportional to the resolution of the spatial grid. A numerical example demonstrates the relative efficiency of the scheme with respect to FDTD and DWN schemes. The potential for the discrete event scheme to reduce numerical dispersion and attenuation errors is discussed.

  3. The global event system

    SciTech Connect

    Winans, J.

    1994-03-02

    The support for the global event system has been designed to allow an application developer to control the APS event generator and receiver boards. This is done by the use of four new record types. These records are customized and are only supported by the device support modules for the APS event generator and receiver boards. The use of the global event system and its associated records should not be confused with the vanilla EPICS events and the associated event records. They are very different.

  4. Workflow in Clinical Trial Sites & Its Association with Near Miss Events for Data Quality: Ethnographic, Workflow & Systems Simulation

    PubMed Central

    Araujo de Carvalho, Elias Cesar; Batilana, Adelia Portero; Claudino, Wederson; Lima Reis, Luiz Fernando; Schmerling, Rafael A.; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    Background With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify/address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Methodology/Principal Findings Two sites in São Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visit to obtain the subject's informed consent, regular data collection sessions following the study protocol and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. Conclusions/Significance Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. The clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage

  5. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    NASA Technical Reports Server (NTRS)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.

  6. Numerical Simulations of Two Wildfire Events Using a Combined Modeling System (HIGRAD/BEHAVE)

    SciTech Connect

    Reisner, J.; Bossert, J.; Winterkamp, J.

    1997-12-31

    The ability to accurately forecast the spread of a wildfire would significantly reduce human suffering and loss of life, the destruction of property, and expenditures for assessment and recovery. To help achieve this goal we have developed a model which accurately simulates the interactions between winds and the heat source associated with a wildfire. We have termed our new model HIGRAD, or HIgh resolution model for strong GRADient applications. HIGRAD employs a sophisticated numerical technique to prevent numerical oscillations from occurring in the vicinity of the fire. Of importance for fire modeling, HIGRAD uses a numerical technique which allows for the use of a compressible equation set, but without the time-step restrictions associated with the propagation of sound waves.

  7. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.

  8. Ocean Dynamics Simulation during an Extreme Bora Event using a Two-Way Coupled Atmosphere-Ocean Modeling System

    NASA Astrophysics Data System (ADS)

    Licer, Matjaz; Smerkol, Peter; Fettich, Anja; Ravdas, Michalis; Papapostolou, Alexandros; Mantziafou, Anneta; Cedilnik, Jure; Strajnar, Benedikt; Jeromel, Maja; Pristov, Neva; Jerman, Jure; Petan, Saso; Malacic, Vlado; Sofianos, Sarantis

    2015-04-01

    The response of the Adriatic Sea to cold north-easterly Bora wind forcing has been modelled numerous times, but usually using one-way coupling techniques. One of the most significant events of the kind took place in February 2012, when hurricane-force Bora was blowing over the Northern Adriatic almost continuously for over three weeks, causing extreme air-sea interactions leading to severe water cooling (below 4 degrees Celsius) and extensive dense water formation (with density anomalies above 30.5 kg/m3). The intensity of the atmosphere-ocean interactions during such conditions calls for a two-way atmosphere-ocean coupling approach. We compare the performance of (a) a fully two-way coupled atmosphere-ocean modelling system and (b) a one-way coupled ocean model (forced by the hourly output of the atmospheric model) against the available in-situ measurements (coastal buoy, CTD). The models used were ALADIN (4.4 km resolution) on the atmospheric side and POM (1/30° × 1/30° resolution) on the ocean side. The atmosphere-ocean coupling was implemented using the OASIS3-MCT model coupling toolkit. We show that two-way atmosphere-ocean coupling significantly improves the simulated temperature and density response of the ocean, since it represents short-term transient features much better than the offline version of the ocean model.

  9. A Simbol-X Event Simulator

    SciTech Connect

    Puccetti, S.; Giommi, P.; Fiore, F.

    2009-05-11

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in simulating data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to ray-tracing simulators. These properties make our simulator advantageous for supporting users in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  10. Running Parallel Discrete Event Simulators on Sierra

    SciTech Connect

    Barnes, P. D.; Jefferson, D. R.

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  11. Discrete event simulation in the artificial intelligence environment

    SciTech Connect

    Egdorf, H.W.; Roberts, D.J.

    1987-01-01

    Discrete Event Simulations performed in an Artificial Intelligence (AI) environment provide benefits in two major areas. The productivity provided by Object Oriented Programming, Rule Based Programming, and AI development environments allows simulations to be developed and maintained more efficiently than conventional environments allow. Secondly, the use of AI techniques allows direct simulation of human decision making processes and Command and Control aspects of a system under study. An introduction to AI techniques is presented. Two discrete event simulations produced in these environments are described. Finally, a software engineering methodology is discussed that allows simulations to be designed for use in these environments. 3 figs.

  12. A Discrete Event Simulation Model for Evaluating the Performances of an M/G/C/C State Dependent Queuing System

    PubMed Central

    Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli

    2013-01-01

    M/G/C/C state dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern discrete-event simulation (DES) software. We designed an approach to cater for this limitation and used it to construct the M/G/C/C state-dependent queuing model in Arena software. Using the model, we have evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time and the expected number of entities in a complex network topology. Results indicated that there is a range of arrival rates for each network where the simulation results fluctuate drastically across replications, causing the simulation results and the analytical results to exhibit discrepancies. Detailed results showing how closely the simulation results and the analytical results tally, in both tabular and graphical forms, and some scientific justifications for these, have been documented and discussed. PMID:23560037
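
    The analytical side that such simulations are checked against can be written down directly. The sketch below uses the standard textbook formulas for an M/G/C/C state-dependent queue with the common linear crowding model f(n) = (C - n + 1)/C; the arrival rates, lone-occupant service time T1 and capacity C are illustrative assumptions.

        import math

        # Occupancy distribution P(n) = P(0) * (lam*T1)^n / (n! * prod_{i<=n} f(i)),
        # with f(n) scaling each occupant's service rate as the system fills up.
        def mgcc_state_dependent(lam, T1, C):
            weights, prod_f = [1.0], 1.0
            for n in range(1, C + 1):
                prod_f *= (C - n + 1) / C             # linear speed reduction
                weights.append((lam * T1) ** n / (math.factorial(n) * prod_f))
            p0 = 1.0 / sum(weights)
            p = [w * p0 for w in weights]
            blocking = p[C]                           # arrivals lost when full
            throughput = lam * (1.0 - blocking)
            mean_n = sum(n * pn for n, pn in enumerate(p))
            return blocking, throughput, mean_n

        for lam in (0.5, 1.0, 2.0):                   # arrival rates (entities/s)
            b, th, n = mgcc_state_dependent(lam, T1=10.0, C=20)
            print(f"lam={lam}: blocking={b:.3f}, throughput={th:.3f}, E[N]={n:.2f}")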

  13. Seismic event classification system

    DOEpatents

    Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William

    1994-01-01

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.
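
    The pre-processing chain described here (time-frequency distribution, binarisation, then the magnitude of a 2-D FFT) can be sketched as follows. This is our reading of the patent text; the window sizes, the threshold and the toy signal are assumptions.

        import numpy as np

        def shift_invariant_features(signal, win=64, hop=16):
            # short-time Fourier magnitude as a simple time-frequency distribution
            frames = [signal[i:i + win] * np.hanning(win)
                      for i in range(0, len(signal) - win, hop)]
            tfd = np.abs(np.fft.rfft(frames, axis=1))
            binary = (tfd > tfd.mean() + tfd.std()).astype(float)  # binarise
            return np.abs(np.fft.fft2(binary))  # 2-D FFT magnitude: shift-invariant

        rng = np.random.default_rng(0)
        t = np.arange(2048)
        event = np.sin(0.2 * t) * np.exp(-((t - 800) / 200.0) ** 2)  # toy "event"
        f1 = shift_invariant_features(event + 0.1 * rng.standard_normal(t.size))
        f2 = shift_invariant_features(np.roll(event, 300) + 0.1 * rng.standard_normal(t.size))
        # feature vectors of time-shifted copies should nearly coincide
        print("relative distance between shifted copies:",
              float(np.linalg.norm(f1 - f2) / np.linalg.norm(f1)))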

  14. Seismic event classification system

    DOEpatents

    Dowla, F.U.; Jarpe, S.P.; Maurer, W.

    1994-12-13

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.

  15. Discrete-Event Simulation in Chemical Engineering.

    ERIC Educational Resources Information Center

    Schultheisz, Daniel; Sommerfeld, Jude T.

    1988-01-01

    Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)

  16. An Overview of Importance Splitting for Rare Event Simulation

    ERIC Educational Resources Information Center

    Morio, Jerome; Pastel, Rudy; Le Gland, Francois

    2010-01-01

    Monte Carlo simulations are a classical tool to analyse physical systems. When unlikely events are to be simulated, the importance sampling technique is often used instead of Monte Carlo. Importance sampling has some drawbacks when the problem dimensionality is high or when the optimal importance sampling density is complex to obtain. In this…

  17. Simulation of EAST vertical displacement events by tokamak simulation code

    NASA Astrophysics Data System (ADS)

    Qiu, Qinglai; Xiao, Bingjia; Guo, Yong; Liu, Lei; Xing, Zhe; Humphreys, D. A.

    2016-10-01

    Vertical instability is a potentially serious hazard for elongated plasma. In this paper, the tokamak simulation code (TSC) is used to simulate vertical displacement events (VDE) on the experimental advanced superconducting tokamak (EAST). Key parameters from simulations, including plasma current, plasma shape and position, flux contours and magnetic measurements match experimental data well. The growth rates simulated by TSC are in good agreement with TokSys results. In addition to modeling the free drift, an EAST fast vertical control model enables TSC to simulate the course of VDE recovery. The trajectories of the plasma current center and control currents on internal coils (IC) fit experimental data well.

  18. MHD simulation of the Bastille day event

    NASA Astrophysics Data System (ADS)

    Linker, Jon; Torok, Tibor; Downs, Cooper; Lionello, Roberto; Titov, Viacheslav; Caplan, Ronald M.; Mikić, Zoran; Riley, Pete

    2016-03-01

    We describe a time-dependent, thermodynamic, three-dimensional MHD simulation of the July 14, 2000 coronal mass ejection (CME) and flare. The simulation starts with a background corona developed using an MDI-derived magnetic map for the boundary condition. Flux ropes using the modified Titov-Demoulin (TDm) model are used to energize the pre-event active region, which is then destabilized by photospheric flows that cancel flux near the polarity inversion line. More than 10^33 ergs are impulsively released in the simulated eruption, driving a CME at 1500 km/s, close to the observed speed of 1700 km/s. The post-flare emission in the simulation is morphologically similar to the observed post-flare loops. The resulting flux rope that propagates to 1 AU is similar in character to the flux rope observed at 1 AU, but the simulated ICME center passes 15° north of Earth.

  19. Event-by-Event Simulation of Induced Fission

    SciTech Connect

    Vogt, Ramona; Randrup, Joergen

    2008-04-17

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either de-excite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission pre-fragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.
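
    A cartoon of the event-by-event idea (toy energies and a drastically simplified evaporation rule, nothing like the physics in the code described above) shows why correlation observables such as the full neutron multiplicity distribution fall out for free:

        import random

        random.seed(11)
        SEP_E = 8.0                              # assumed MeV cost per emitted neutron

        def fission_event(total_excitation=25.0):
            split = random.uniform(0.3, 0.7)     # energy sharing between two fragments
            neutrons = 0
            for e in (split * total_excitation, (1 - split) * total_excitation):
                while e > SEP_E:                 # evaporate while energetically allowed
                    e -= SEP_E + random.expovariate(1.0)   # ~1 MeV mean kinetic energy
                    neutrons += 1
            return neutrons

        counts = [0] * 10
        for _ in range(100_000):                 # large event samples come cheap
            counts[fission_event()] += 1
        nu_bar = sum(n * c for n, c in enumerate(counts)) / sum(counts)
        print("P(nu):", [round(c / 100_000, 3) for c in counts[:5]])
        print(f"mean multiplicity nu-bar = {nu_bar:.2f}")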

  20. Event-by-Event Simulation of Induced Fission

    SciTech Connect

    Vogt, R; Randrup, J

    2007-12-13

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  1. Optical eye simulator for laser dazzle events.

    PubMed

    Coelho, João M P; Freitas, José; Williamson, Craig A

    2016-03-20

    An optical simulator of the human eye and its application to laser dazzle events are presented. The simulator combines optical design software (ZEMAX) with a scientific programming language (MATLAB) and allows the user to implement and analyze a dazzle scenario using practical, real-world parameters. Contrary to conventional analytical glare analysis, this work uses ray tracing and the scattering model and parameters for each optical element of the eye. The theoretical background of each such element is presented in relation to the model. The overall simulator's calibration, validation, and performance analysis are achieved by comparison with a simpler model based upon CIE disability glare data. Results demonstrate that this kind of advanced optical eye simulation can be used to represent laser dazzle and has the potential to extend the range of applicability of analytical models. PMID:27140558

  2. Coupled Model Simulation of Snowfall Events Over the Black Hills

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong; Hjelmfelt, M. R.; Capehart, W. J.

    2000-01-01

    Although many long-term simulations of snow accumulation and ablation have been made using stand-alone land surface models and surface models coupled with GCMs, less research has focused on short-term event simulations. Actually, accurate event simulations of snow-related processes are the basis for successful long-term simulation. Three advantages of event simulations of snowfall and snow melting are availability of: (1) intensive observation data from field experiments for validation; (2) more physically-realistic precipitation schemes for use in atmospheric models to simulate snowfall; and (3) a more detailed analysis of the snow melting processes. In addition to the complexities of snow related processes themselves, terrain-induced effects on snowfall/snow melting make simulations of snow events more difficult. Climatological observations indicate that terrain features such as the Black Hills of South Dakota and Wyoming can exert important effects on snow accumulation and snow ablation processes. One of the primary effects is that the orography causes forced uplift of airflow and causes atmospheric waves to form both upwind and downwind of it. Airflow often splits around the obstacle, converging on the lee side. This convergence may lead to precipitation enhancement. It also provides an elevated heat and moisture source that enhances atmospheric instability. During the period of April 5-May 5, 1999, the Upper Missouri River Basin Pilot Project (UMRBPP) made intensive observations on precipitation events occurring in the Black Hills. Two moderate snowfall events were captured during the period. The resulting high temporal and spatial resolution data provides opportunities to investigate terrain effects on snowfall amount, distribution, and melting. Successful simulation of snowfall amount, distribution, and evolution using atmospheric models is important to subsequent modeling of snow melting using snow sub-models in land surface schemes. In this paper, a

  3. Distributed discrete event simulation. Final report

    SciTech Connect

    De Vries, R.C.

    1988-02-01

    The presentation given here is restricted to discrete event simulation. The complexity of and time required for many present and potential discrete simulations exceeds the reasonable capacity of most present serial computers. The desire, then, is to implement the simulations on a parallel machine. However, certain problems arise in an effort to program the simulation on a parallel machine. In one category of methods, deadlock can arise and some method is required to either detect deadlock and recover from it or to avoid deadlock through information passing. In the second category of methods, potentially incorrect simulations are allowed to proceed. If the situation is later determined to be incorrect, recovery from the error must be initiated. In either case, computation and information passing are required which would not be required in a serial implementation. The net effect is that the parallel simulation may not be much better than a serial simulation. In an effort to determine alternate approaches, important papers in the area were reviewed. As a part of that review process, each of the papers was summarized. The summary of each paper is presented in this report in the hopes that those doing future work in the area will be able to gain insight that might not otherwise be available, and to aid in deciding which papers would be most beneficial to pursue in more detail. The papers are broken down into categories and then by author. Conclusions reached after examining the papers and other material, such as direct talks with an author, are presented in the last section. Also presented there are some ideas that surfaced late in the research effort. These promise to be of some benefit in limiting information which must be passed between processes and in better understanding the structure of a distributed simulation. Pursuit of these ideas seems appropriate.

  4. Parallel discrete-event simulation of FCFS stochastic queueing networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments), which has proven itself to be effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. It is shown how lookahead can be computed for FCFS queueing network simulations; performance data are given that demonstrate the method's effectiveness under moderate to heavy loads; and performance tradeoffs between the quality of lookahead and the cost of computing lookahead are discussed.
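
    Our simplified reading of the appointment idea can be sketched as follows (the class and method names are hypothetical): because an FCFS server can pre-sample the service time of its next job, it can promise neighbouring processors a lower bound on the time stamp of its next departure before the corresponding arrival is even known.

        import random

        random.seed(2)

        class FCFSServer:
            def __init__(self, mean_service=4.0):
                self.mean = mean_service
                self.busy_until = 0.0        # departure time of the last accepted job
                self.next_service = random.expovariate(1.0 / mean_service)  # pre-sampled

            def lookahead(self, now):
                # Earliest possible future departure, valid before any new arrival
                # is known: committed work plus the pre-sampled next service time.
                return max(now, self.busy_until) + self.next_service

            def arrive(self, t):
                start = max(t, self.busy_until)
                self.busy_until = start + self.next_service     # honours the promise
                self.next_service = random.expovariate(1.0 / self.mean)  # re-sample
                return self.busy_until

        s = FCFSServer()
        promise = s.lookahead(now=0.0)
        departure = s.arrive(1.0)
        print(f"promised no departure before t={promise:.2f}; actual t={departure:.2f}")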

  5. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete event model as a real-time system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Because Petri models are simple enough to run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets, which makes discrete event systems a pragmatic tool for modelling industrial systems. To capture auxiliary transport times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulating the proposed robotic system with timed Petri nets offers the opportunity to view the robot's timing: from transport and transmission times measured on the spot, graphics are obtained showing the average time of the transport activity for individual sets of finished products.

  6. Discrete event simulation in an artificial intelligence environment: Some examples

    SciTech Connect

    Roberts, D.J.; Farish, T.

    1991-01-01

    Several Los Alamos National Laboratory (LANL) object-oriented discrete-event simulation efforts have been completed during the past three years. One of these systems has been put into production and has a growing customer base. Another (started two years earlier than the first project) was completed but has not yet been used. This paper will describe these simulation projects. Factors pertinent to the success of the one project and to the failure of the second will be discussed (success is measured as the extent to which the simulation model was used as originally intended). 5 figs.

  7. Empirical study of simulated two-planet microlensing events

    SciTech Connect

    Zhu, Wei; Gould, Andrew; Penny, Matthew; Mao, Shude; Gendron, Rieul

    2014-10-10

    We undertake the first study of two-planet microlensing models recovered from simulations of microlensing events generated by realistic multiplanet systems in which 292 planetary events, including 16 two-planet events, were detected from 6690 simulated light curves. We find that when two planets are recovered, their parameters are usually close to those of the two planets in the system most responsible for the perturbations. However, in 1 of the 16 examples, the apparent mass of both detected planets was more than doubled by the unmodeled influence of a third, massive planet. This fraction is larger than but statistically consistent with the roughly 1.5% rate of serious mass errors due to unmodeled planetary companions for the 274 cases from the same simulation in which a single planet is recovered. We conjecture that an analogous effect due to unmodeled stellar companions may occur more frequently. For 7 out of 23 cases in which two planets in the system would have been detected separately, only one planet was recovered because the perturbations due to the two planets had similar forms. This is a small fraction (7/274) of all recovered single-planet models, but almost a third of all events that might plausibly have led to two-planet models. Still, in these cases, the recovered planet tends to have parameters similar to one of the two real planets most responsible for the anomaly.

  8. The LCLS Timing Event System

    SciTech Connect

    Dusatko, John; Allison, S.; Browne, M.; Krejcik, P.; /SLAC

    2012-07-23

    The Linac Coherent Light Source requires precision timing trigger signals for various accelerator diagnostics and controls at the SLAC National Accelerator Laboratory. A new timing system has been developed that meets these requirements. This system is based on COTS hardware combined with custom-designed units. An added challenge has been the requirement that the LCLS timing system must co-exist with and 'know' about the existing SLC timing system. This paper describes the architecture, construction and performance of the LCLS timing event system.

  9. Autocharacterization feasibility system on Hunters Trophy event

    SciTech Connect

    Mills, R.A.

    1993-09-01

    An automated system to characterize cable systems at the Nevada Test Site (NTS) has been developed to test the feasibility of such a system. A rack of electronic equipment, including a fast pulse generator, a digital sampling scope, a coaxial switch matrix, and a GPIB controller, was installed downhole at NTS for the Hunters Trophy event and used to test automated characterization. Measurements of simulation and other instrument data were recorded to determine whether a full-scale automated system would be practical in full-scale underground nuclear effects tests. The benefits of such a full-scale system would be fewer personnel required downhole, more instrument control in the uphole recording room, and faster acquisition of cable parameter data.

  10. Data Systems Dynamic Simulator

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Clark, Melana; Davenport, Bill; Message, Philip

    1993-01-01

    The Data System Dynamic Simulator (DSDS) is a discrete event simulation tool. It was developed for NASA for the specific purpose of evaluating candidate architectures for data systems of the Space Station era. DSDS provides three methods for meeting this requirement. First, the user has access to a library of standard pre-programmed elements. These elements represent tailorable components of NASA data systems and can be connected in any logical manner. Secondly, DSDS supports the development of additional elements. This allows the more sophisticated DSDS user the option of extending the standard element set. Thirdly, DSDS supports the use of data streams simulation. Data streams is the name given to a technique that ignores packet boundaries, but is sensitive to rate changes. Because rate changes are rare compared to packet arrivals in a typical NASA data system, data stream simulations require a fraction of the CPU run time. Additionally, the data stream technique is considerably more accurate than another commonly-used optimization technique.
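
    A minimal sketch of the data-stream idea (hypothetical Python, not DSDS code): the simulation carries only (time, rate) change events, and packet counts are recovered by integrating the piecewise-constant rate.

        def total_packets(stream, t_end):
            # stream: time-ordered list of (time, packets_per_second) changes
            total = 0.0
            for (t0, rate), (t1, _) in zip(stream, stream[1:] + [(t_end, 0.0)]):
                total += rate * max(0.0, min(t1, t_end) - t0)
            return total

        # Three rate-change events stand in for 1500 packet arrivals:
        print(total_packets([(0.0, 100.0), (5.0, 250.0), (9.0, 0.0)], 10.0))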

  11. Calculation of Fission Observables Through Event-by-Event Simulation

    SciTech Connect

    Randrup, J; Vogt, R

    2009-06-04

    The increased interest in more exclusive fission observables has demanded more detailed models. We present here a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including arbitrary correlations. The various model assumptions are described and the potential utility of the model is illustrated by means of several novel correlation observables.

  12. Single event effects and laser simulation studies

    NASA Technical Reports Server (NTRS)

    Kim, Q.; Schwartz, H.; Mccarty, K.; Coss, J.; Barnes, C.

    1993-01-01

    The single event upset (SEU) linear energy transfer threshold (LETTH) of radiation hardened 64K Static Random Access Memories (SRAM's) was measured with a picosecond pulsed dye laser system. These results were compared with standard heavy ion accelerator (Brookhaven National Laboratory (BNL)) measurements of the same SRAM's. With heavy ions, the LETTH of the Honeywell HC6364 was 27 MeV-sq cm/mg at 125 C compared with a value of 24 MeV-sq cm/mg obtained with the laser. In the case of the second type of 64K SRAM, the IBM6401CRH, no upsets were observed at 125 C with the highest LET ions used at BNL. In contrast, the pulsed dye laser tests indicated a value of 90 MeV-sq cm/mg at room temperature for the SEU-hardened IBM SRAM. No latchups or multiple SEU's were observed on any of the SRAM's even under worst case conditions. The results of this study suggest that the laser can be used as an inexpensive laboratory SEU prescreen tool in certain cases.

  13. Synchronization of autonomous objects in discrete event simulation

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.

  14. Earthquake Clustering and Triggering of Large Events in Simulated Catalogs

    NASA Astrophysics Data System (ADS)

    Gilchrist, J. J.; Dieterich, J. H.; Richards-Dinger, K. B.; Xu, H.

    2013-12-01

    We investigate large event clusters (e.g. earthquake doublets and triplets) wherein secondary events in a cluster are triggered by stress transfer from previous events. We employ the 3D boundary element code RSQSim with a California fault model to generate synthetic catalogs spanning from tens of thousands up to a million years. The simulations incorporate rate-state fault constitutive properties, and the catalogs include foreshocks, aftershocks and occasional clusters of large events. Here we define a large event cluster as two or more M≥7 events within a few years. Most clustered events are closely grouped in space as well as time. Large event clusters show highly productive aftershock sequences where the aftershock locations of the first event in a cluster appear to correlate with the location of the next large event in the cluster. We find that the aftershock productivity of the first events in large event clusters is roughly double that of the unrelated, non-clustered events and that aftershock rate is a proxy for the stress state of the faults. The aftershocks of the first event in a large-event cluster migrate toward the point of nucleation of the next event in a large-event cluster. Furthermore, following a normal aftershock sequence, the average event rate increases prior to the second event in a large-event cluster. These increased event rates prior to the second event in a cluster follow an inverse Omori's law, which is characteristic of foreshocks. Clustering probabilities based on aftershock rates are higher than expected from Omori aftershock and Gutenberg-Richter magnitude frequency laws, which suggests that the high aftershock rates indicate near-critical stresses for failure in a large earthquake.
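
    The inverse Omori law referred to here describes a foreshock-like acceleration of the event rate n(t) as the time t_f of the second large event approaches; in its standard form (K, c and p are empirical constants, not values from this study):

        n(t) ∝ K / (c + (t_f - t))^p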

  15. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-10-17

    This quarter, we have focused on several tasks: (1) Building a high-quality catalog of earthquake source parameters for the Middle East and East Asia. In East Asia, we computed source parameters using the CAP method for a set of events studied by Herrman et al. (MRR, 2006) using a complete waveform technique. Results indicated excellent agreement with the moment magnitudes in the range 3.5-5.5. Below magnitude 3.5 the scatter increases. For events with more than 2-3 observations at different azimuths, we found good agreement of focal mechanisms. Depths were generally consistent, although differences of up to 10 km were found. These results suggest that CAP modeling provides estimates of source parameters at least as reliable as complete waveform modeling techniques. However, East Asia and the Yellow Sea Korean Paraplatform (YSKP) region studied are relatively laterally homogeneous and may not benefit from the CAP method's flexibility to shift waveform segments to account for path-dependent model errors. A more challenging region to study is the Middle East, where strong variations in sedimentary basins, crustal thickness, and crustal and mantle seismic velocities greatly impact regional wave propagation. We applied the CAP method to a set of events in and around Iran and found good agreement between estimated focal mechanisms and those reported by the Global Centroid Moment Tensor (CMT) catalog. We found a possible bias in the moment magnitudes that may be due to the thick low-velocity crust in the Iranian Plateau. (2) Testing methods on a lifetime regional data set. In particular, the recent 2/21/08 Nevada event and aftershock sequence occurred in the middle of USArray, producing over a thousand records per event. The tectonic setting is quite similar to Central Iran and thus provides an excellent testbed for CAP+ at ranges out to 10°, including extensive observations of crustal thinning and thickening and various Pnl complexities. Broadband modeling in 1D, 2D

  16. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, D; Tromp, J; Rodgers, A

    2007-07-16

    Comprehensive test ban monitoring in terms of location and discrimination has progressed significantly in recent years. However, the characterization of sources and the estimation of low yields remain a particular challenge. As the recent Korean shot demonstrated, we can probably expect to have a small set of teleseismic, far-regional and high-frequency regional data to analyze in estimating the yield of an event. Since stacking helps to bring signals out of the noise, it becomes useful to conduct comparable analyses on neighboring events, earthquakes in this case. If these auxiliary events have accurate moments and source descriptions, we have a means of directly comparing effective source strengths. Although we will rely on modeling codes, 1D, 2D, and 3D, we will also apply a broadband calibration procedure that uses longer-period (P>5 s) waveform data to calibrate short-period (P between 0.5 and 2 Hz) and high-frequency (P between 2 and 10 Hz) path-specific station corrections from well-known regional sources. We have expanded our basic Cut-and-Paste (CAP) methodology to include not only timing shifts but also amplitude (f) corrections at recording sites. The name of this method was derived from source inversions that allow timing shifts between 'waveform segments' (or cutting the seismogram up and re-assembling it) to correct for crustal variation. For convenience, we will refer to these f-dependent refinements as CAP+ (for SP) and CAP++ (for still higher frequencies). These methods allow the retrieval of source parameters using only P-waveforms, where radiation patterns are obvious, as demonstrated in this report, and are well suited for explosion P-wave data. The method is easily extended to all distances because it uses Green's functions, although some changes may be required in t* to adjust for offsets between local and teleseismic distances. In short, we use a mixture of model-dependent and empirical corrections to tackle the path effects. Although we rely on the

  17. Distribution system simulator

    NASA Technical Reports Server (NTRS)

    Bahrami, K. A.; Kirkham, H.; Rahman, S.

    1986-01-01

    In a series of tests performed under the Department of Energy auspices, power line carrier propagation was observed to be anomalous under certain circumstances. To investigate the cause, a distribution system simulator was constructed. The simulator was a physical simulator that accurately represented the distribution system from below power frequency to above 50 kHz. Effects such as phase-to-phase coupling and skin effect were modeled. Construction details of the simulator, and experimental results from its use are presented.

  18. An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation

    SciTech Connect

    Nutaro, James

    2014-11-03

    In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system that has been built within the adevs simulation package supports models that have state-events and time-events and that comprise differential-algebraic systems with high index. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.
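
    State-events of the kind such a run-time system must handle are located by finding the zero crossing of a guard function between integration steps. A generic bisection sketch (illustrative Python, unrelated to the actual adevs implementation):

        def locate_state_event(g, t0, t1, tol=1e-9):
            # Assumes g(t0) and g(t1) have opposite signs, so a crossing
            # (the state-event time) lies somewhere in [t0, t1].
            while t1 - t0 > tol:
                tm = 0.5 * (t0 + t1)
                if g(t0) * g(tm) <= 0.0:
                    t1 = tm      # crossing lies in the left half
                else:
                    t0 = tm      # crossing lies in the right half
            return 0.5 * (t0 + t1)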

  19. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    SciTech Connect

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    2008-04-15

    The recent Nevada Earthquake (M=6) produced an extraordinary set of crustal guided waves. In this study, we examine the three-component data at all the USArray stations in terms of how well existing models perform in predicting the various phases, Rayleigh waves, Love waves, and Pnl waves. To establish the source parameters, we applied the Cut and Paste Code up to distances of 5° for an average local crustal model, which produced a normal mechanism (strike=35°, dip=41°, rake=-85°) at a depth of 9 km and Mw=5.9. Assuming this mechanism, we generated synthetics at all distances for a number of 1D and 3D models. The Pnl observations fit the synthetics for the simple models well, both in timing (VPn=7.9 km/s) and waveform fits, out to a distance of about 5°. Beyond this distance a great deal of complexity can be seen to the northwest, apparently caused by shallow subducted slab material. These paths require considerable crustal thinning and higher P-velocities. Small delays and advances outline the various tectonic provinces to the south, Colorado Plateau, etc., with velocities compatible with those reported by Song et al. (1996). Five-second Rayleigh waves (Airy phase) can be observed throughout the whole array and show a great deal of variation (up to 30 s). In general, the Love waves are better behaved than the Rayleigh waves. We are presently adding higher frequencies to the source description by including source complexity. Preliminary inversions suggest rupture to the northeast with a shallow asperity. We are also inverting the aftershocks to extend the frequencies to 2 Hz and beyond, following the calibration method outlined in Tan and Helmberger (2007). This will allow accurate directivity measurements for events with magnitudes larger than 3.5. Thus, we will address the energy decay with distance as a function of frequency band for the various source types.

  20. Optimization of Operations Resources via Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
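
    As a sketch of how a genetic algorithm wraps a discrete event simulation (illustrative Python; fitness() stands in for a full simulation run that returns the cost to minimize, and all names are hypothetical):

        import random

        def optimize(fitness, n_resources, bounds, pop=30, gens=50, p_mut=0.1):
            # Each individual is a vector of integer resource levels.
            # Assumes n_resources >= 2 and bounds = (low, high).
            P = [[random.randint(*bounds) for _ in range(n_resources)]
                 for _ in range(pop)]
            for _ in range(gens):
                P.sort(key=fitness)              # rank by simulated cost
                elite = P[:pop // 2]             # selection
                while len(elite) < pop:
                    a, b = random.sample(elite[:pop // 2], 2)
                    cut = random.randrange(1, n_resources)
                    child = a[:cut] + b[cut:]    # one-point crossover
                    if random.random() < p_mut:  # mutation
                        i = random.randrange(n_resources)
                        child[i] = random.randint(*bounds)
                    elite.append(child)
                P = elite
            return min(P, key=fitness)

    Note that no continuity or differentiability is required of fitness(); the algorithm only ever compares simulated costs, which is what makes it robust in stochastic search spaces of unknown topology.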

  1. Simulating Single-Event Upsets in Bipolar RAM's

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.

    1986-01-01

    Simulation technique saves testing. Uses interactive version of SPICE (Simulation Program with Integrated Circuit Emphasis). Device and subcircuit models available in the software are used to construct a macromodel for an integrated bipolar transistor. Time-dependent current generators placed inside the transistor macromodel simulate charge collection from an ion track. A significant finding of the experiments is that the standard design practice of reducing power in an unaddressed bipolar RAM cell increases the sensitivity of the cell to single-event upsets.
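
    The time-dependent current generators mentioned above are commonly given a double-exponential shape in the SEU literature (a standard choice, not necessarily the exact form used in this work), depositing a total collected charge Q:

        I(t) = Q / (tau_f - tau_r) * ( exp(-t/tau_f) - exp(-t/tau_r) )

    where tau_r and tau_f are the rise and fall time constants of the charge-collection transient; integrating I(t) from 0 to infinity recovers Q.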

  2. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  3. Instrumented Architectural Simulation System

    NASA Technical Reports Server (NTRS)

    Delagi, B. A.; Saraiya, N.; Nishimura, S.; Byrd, G.

    1987-01-01

    Simulation of systems at an architectural level can offer an effective way to study critical design choices if (1) the performance of the simulator is adequate to examine designs executing significant code bodies, not just toy problems or small application fragments, (2) the details of the simulation include the critical details of the design, (3) the view of the design presented by the simulator instrumentation leads to useful insights on the problems with the design, and (4) there is enough flexibility in the simulation system so that the asking of unplanned questions is not suppressed by the weight of the mechanics involved in making changes either in the design or its measurement. A simulation system with these goals is described together with the approach to its implementation. Its application to the study of a particular class of multiprocessor hardware system architectures is illustrated.

  4. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    SciTech Connect

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight in device performance and aids in topology and system optimization.

  5. Event-by-event fission simulation code, generates complete fission events

    2013-04-01

    FREYA is a computer code that generates complete fission events. The output includes the energy and momentum of these final state particles: fission products, prompt neutrons and prompt photons. The version of FREYA that is to be released is a module for MCNP6.

  6. Simulations and Characteristics of Large Solar Events Propagating Throughout the Heliosphere and Beyond (Invited)

    NASA Astrophysics Data System (ADS)

    Intriligator, D. S.; Sun, W.; Detman, T. R.; Dryer, M.; Intriligator, J.; Deehr, C. S.; Webber, W. R.; Gloeckler, G.; Miller, W. D.

    2015-12-01

    Large solar events can have severe adverse global impacts at Earth. These solar events can also propagate throughout the heliosphere and into the interstellar medium. We focus on the July 2012 and Halloween 2003 solar events. We simulate these events starting from the vicinity of the Sun at 2.5 Rs. We compare our three-dimensional (3D) time-dependent simulations to available spacecraft (s/c) observations at 1 AU and beyond. Based on comparisons of the predictions from our simulations with in-situ measurements, we find that the effects of these large solar events can be observed in the outer heliosphere, the heliosheath, and even into the interstellar medium. We use two simulation models. The HAFSS (HAF Source Surface) model is a kinematic model. HHMS-PI (Hybrid Heliospheric Modeling System with Pickup protons) is a numerical magnetohydrodynamic solar wind (SW) simulation model. Both HHMS-PI and HAFSS are ideally suited for these analyses since, starting at 2.5 Rs from the Sun, they model the slowly evolving background SW and the impulsive, time-dependent events associated with solar activity. Our models naturally reproduce dynamic 3D spatially asymmetric effects observed throughout the heliosphere. Pre-existing SW background conditions have a strong influence on the propagation of shock waves from solar events. Time-dependence is a crucial aspect of interpreting s/c data. We show comparisons of our simulation results with STEREO A, ACE, Ulysses, and Voyager s/c observations.

  7. Observing System Simulation Experiments

    NASA Technical Reports Server (NTRS)

    Prive, Nikki

    2015-01-01

    This presentation gives an overview of Observing System Simulation Experiments (OSSEs). The components of an OSSE are described, along with discussion of the process for validating, calibrating, and performing experiments.

  8. The cost of conservative synchronization in parallel discrete event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approaches the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.

  9. Advanced Simulation of Coupled Earthquake and Tsunami Events (ASCETE) - Simulation Techniques for Realistic Tsunami Process Studies

    NASA Astrophysics Data System (ADS)

    Behrens, Joern; Bader, Michael; Breuer, Alexander N.; van Dinther, Ylona; Gabriel, Alice-A.; Galvez Barron, Percy E.; Rahnema, Kaveh; Vater, Stefan; Wollherr, Stephanie

    2015-04-01

    At the end of phase 1 of the ASCETE project, a simulation framework for coupled physics-based rupture generation with tsunami propagation and inundation is available. Adaptive-mesh tsunami propagation and inundation using discontinuous Galerkin Runge-Kutta methods allows for accurate and conservative inundation schemes. Combined with a tree-based refinement strategy that highly optimizes the code for high-performance computing architectures, a modeling tool for high-fidelity tsunami simulations has been constructed. Validation results demonstrate the capacity of the software. Rupture simulation is performed by an unstructured tetrahedral discontinuous Galerkin ADER discretization, which allows for accurate representation of complex geometries. The implemented code was selected as a finalist for the Gordon Bell award in high-performance computing. Highly realistic rupture events can be simulated with this modeling tool. The coupling of rupture-induced wave activity and displacement with the hydrodynamic equations still poses a major problem due to diverging time and spatial scales. Some insight could be gained from the ASCETE set-up, and the presentation will focus on the coupled behavior of the simulation system. Finally, an outlook to phase 2 of the ASCETE project will be given, in which further development of detailed physical processes as well as near-realistic scenario computations are planned. ASCETE is funded by the Volkswagen Foundation.

  10. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required to develop that model is then determined; we conclude that six hours of training are required to teach these skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
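
    The fundamental functionality referred to above is essentially a time-ordered future-event list. A minimal sketch in Python (for illustration only; the models in the paper are built in Excel):

        import heapq
        import itertools

        def simulate(initial_events, t_end):
            # initial_events: iterable of (time, action) pairs; each action
            # is called with the clock and returns new (time, action) pairs.
            tie = itertools.count()     # breaks ties between equal times
            fel = []                    # the future-event list
            for t, act in initial_events:
                heapq.heappush(fel, (t, next(tie), act))
            clock = 0.0
            while fel:
                clock, _, act = heapq.heappop(fel)
                if clock > t_end:
                    break
                for t, new_act in act(clock):
                    heapq.heappush(fel, (t, next(tie), new_act))
            return clock

    Arrivals, service completions, and inventory reviews in a supply-chain model are all just actions that reschedule themselves on this list.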

  11. Disaster Response Modeling Through Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Wang, Jeffrey; Gilmer, Graham

    2012-01-01

    Organizations today are required to plan against a rapidly changing, high-cost environment. This is especially true for first responders to disasters and other incidents, where critical decisions must be made in a timely manner to save lives and resources. Discrete-event simulations enable organizations to make better decisions by visualizing complex processes and the impact of proposed changes before they are implemented. A discrete-event simulation using Simio software has been developed to effectively analyze and quantify the imagery capabilities of domestic aviation resources conducting relief missions. This approach has helped synthesize large amounts of data to better visualize process flows, manage resources, and pinpoint capability gaps and shortfalls in disaster response scenarios. Simulation outputs and results have supported decision makers in the understanding of high risk locations, key resource placement, and the effectiveness of proposed improvements.

  12. A wireless time synchronized event control system

    NASA Astrophysics Data System (ADS)

    Klug, Robert; Williams, Jonathan; Scheffel, Peter

    2014-05-01

    McQ has developed a wireless, time-synchronized, event control system to control, monitor, and record events with precise timing over large test sites for applications such as high speed rocket sled payload testing. Events of interest may include firing rocket motors and launch sleds, initiating flares, ejecting bombs, ejecting seats, triggering high speed cameras, measuring sled velocity, and triggering events based on a velocity window or other criteria. The system consists of Event Controllers, a Launch Controller, and a wireless network. The Event Controllers can be easily deployed at areas of interest within the test site and maintain sub-microsecond timing accuracy for monitoring sensors, electronically triggering other equipment and events, and providing timing signals to other test equipment. Recorded data and status information is reported over the wireless network to a server and user interface. Over the wireless network, the user interface configures the system based on a user specified mission plan and provides real time command, control, and monitoring of the devices and data. An overview of the system, its features, performance, and potential uses is presented.

  13. SPICE: Simulation Package for Including Flavor in Collider Events

    NASA Astrophysics Data System (ADS)

    Engelhard, Guy; Feng, Jonathan L.; Galon, Iftah; Sanford, David; Yu, Felix

    2010-01-01

    We describe SPICE: Simulation Package for Including Flavor in Collider Events. SPICE takes as input two ingredients: a standard flavor-conserving supersymmetric spectrum and a set of flavor-violating slepton mass parameters, both of which are specified at some high "mediation" scale. SPICE then combines these two ingredients to form a flavor-violating model, determines the resulting low-energy spectrum and branching ratios, and outputs HERWIG and SUSY Les Houches files, which may be used to generate collider events. The flavor-conserving model may be any of the standard supersymmetric models, including minimal supergravity, minimal gauge-mediated supersymmetry breaking, and anomaly-mediated supersymmetry breaking supplemented by a universal scalar mass. The flavor-violating contributions may be specified in a number of ways, from specifying charges of fields under horizontal symmetries to completely specifying all flavor-violating parameters. SPICE is fully documented and publicly available, and is intended to be a user-friendly aid in the study of flavor at the Large Hadron Collider and other future colliders.

    Program summary

    Program title: SPICE
    Catalogue identifier: AEFL_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFL_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 8153
    No. of bytes in distributed program, including test data, etc.: 67 291
    Distribution format: tar.gz
    Programming language: C++
    Computer: Personal computer
    Operating system: Tested on Scientific Linux 4.x
    Classification: 11.1
    External routines: SOFTSUSY [1,2] and SUSYHIT [3]
    Nature of problem: Simulation programs are required to compare theoretical models in particle physics with present and future data at particle colliders. SPICE determines the masses and decay branching ratios of

  14. Towards Flexible Exascale Stream Processing System Simulation

    SciTech Connect

    Li, Cheng-Hong; Nair, Ravi; Ohba, Noboyuki; Shvadron, Uzi; Zaks, Ayal; Schenfeld, Eugen

    2012-01-01

    Stream processing is an important emerging computational model for performing complex operations on and across multi-source, high-volume, unpredictable dataflows. We present Flow, a platform for parallel and distributed stream processing system simulation that provides a flexible modeling environment for analyzing stream processing applications. The Flow stream processing system simulator is a high-performance, scalable simulator that automatically parallelizes chunks of the model space and incurs near-zero synchronization overhead for acyclic stream application graphs. We show promising parallel and distributed event rates exceeding 149 million events per second on a cluster with 512 processor cores.

  15. Top Event Matrix Analysis Code System.

    2000-06-19

    Version 00 TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves estimating the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the uncertainty in the risk estimates.
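
    For independent basic events, the probability of a top event represented by such a Boolean expression can be computed from its minimal cut sets by inclusion-exclusion. A small sketch (hypothetical Python, not TEMAC itself):

        from itertools import combinations

        def top_event_probability(cut_sets, p):
            # cut_sets: list of sets of basic-event names; p: name -> probability
            total = 0.0
            for k in range(1, len(cut_sets) + 1):
                for combo in combinations(cut_sets, k):
                    union = set().union(*combo)
                    term = 1.0
                    for event in union:
                        term *= p[event]
                    total += (-1) ** (k + 1) * term
            return total

        # Top event (A and B) or (A and C):
        print(top_event_probability([{"A", "B"}, {"A", "C"}],
                                    {"A": 0.01, "B": 0.02, "C": 0.03}))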

  16. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  17. Simulation Of Advanced Train Control Systems

    NASA Astrophysics Data System (ADS)

    Craven, Paul; Oman, Paul

    This paper describes an Advanced Train Control System (ATCS) simulation environment created using the Network Simulator 2 (ns-2) discrete event network simulation system. The ATCS model is verified using ATCS monitoring software, laboratory results and a comparison with a mathematical model of ATCS communications. The simulation results are useful in understanding ATCS communication characteristics and identifying protocol strengths, weaknesses, vulnerabilities and mitigation techniques. By setting up a suite of ns-2 scripts, an engineer can simulate hundreds of possible scenarios in the space of a few seconds to investigate failure modes and consequences.

  18. Extreme events evaluation over African cities with regional climate simulations

    NASA Astrophysics Data System (ADS)

    Bucchignani, Edoardo; Mercogliano, Paola; Simonis, Ingo; Engelbrecht, Francois

    2013-04-01

    The warming of the climate system in recent decades is evident from observations and is mainly related to the increase of anthropogenic greenhouse gas concentrations (IPCC, 2012). Given the expected climate change conditions on the African continent, as underlined in different publications, and their associated socio-economic impacts, an evaluation of the specific effects on some strategic African cities over the medium and long term is of crucial importance for the development of adaptation strategies. Assessments usually focus on average climate properties rather than on variability or extremes, yet the latter often have greater impacts on society than average values. Global Coupled Models (GCM) are generally used to simulate future climate scenarios as they guarantee physical consistency between variables; however, due to their coarse spatial resolution, their output cannot be used for impact studies on local scales, which makes it necessary to generate higher-resolution climate change data. Regional Climate Models (RCM) better describe the phenomena forced by orography or by coastlines, or that are related to convection. They can therefore provide more detailed information on climate extremes, which are hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws. The normal bias of the RCM in representing the local climatology is reduced using adequate statistical techniques based on the comparison of the simulated results with long observational time series. In the framework of the EU-FP7 CLUVA (Climate Change and Urban Vulnerability in Africa) project, regional projections of climate change at high resolution (about 8 km) have been performed for selected areas surrounding five African cities. At CMCC, the non-hydrostatic regional climate model COSMO-CLM has been employed. For each domain, two simulations have been performed, considering the RCP4.5 and RCP8.5 emission scenarios.

  19. Flash heat simulation events in the north Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Mazon, Jordi; Pino, David

    2013-04-01

    According to the definition of a flash heat event proposed by Mazon et al. at the European Meteorology Meetings (2011 and 2012), based on the case that occurred in the northeast of the Iberian Peninsula on 27th August 2010, several other flash heat events have been detected by automatic weather stations around the Mediterranean basin (southern Italy, the island of Crete, southern Greece, and the northeast of the Iberian Peninsula). A flash heat event covers those episodes in which a large increase of temperature lasts for a spatial and temporal scale between that of a heat wave (defined by the WMO as a phenomenon in which the daily maximum temperature of more than five consecutive days exceeds the average maximum temperature by 5°C, with respect to the 1961-1990 period) and that of a heat burst (defined by the AMS as a rare atmospheric event characterized by gusty winds and a rapid increase in temperature and decrease in humidity that can last some minutes). A flash heat event may thus be considered a rapid modification of the temperature that lasts several hours, fewer than 48 and usually less than 24. Two different flash heat events have been simulated with the WRF mesoscale model in the Mediterranean basin. The results show that two different mechanisms are the main causes of these flash heat events. The first, on 23rd March 2008 on the island of Crete, was due to a strong Foehn effect caused by strong south and southeast winds, during which the maximum temperature increased for some hours during the night to 32°C. The second, on 1st August 2012 in the northeast of the Iberian Peninsula, was caused by the rapid displacement of a warm ridge from North Africa and lasted around 24 hours.

  20. Anomalous event diagnosis for environmental satellite systems

    NASA Technical Reports Server (NTRS)

    Ramsay, Bruce H.

    1993-01-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS) is responsible for the operation of the NOAA geostationary and polar orbiting satellites. NESDIS provides a wide array of operational meteorological and oceanographic products and services and operates various computer and communication systems on a 24-hour, seven days per week schedule. The Anomaly Reporting System contains a database of anomalous events regarding the operations of the Geostationary Operational Environmental Satellite (GOES), communication, or computer systems that have degraded or caused the loss of GOES imagery. Data is currently entered manually via an automated query user interface. There are 21 possible symptoms (e.g., No Data), and 73 possible causes (e.g., Sectorizer - World Weather Building) of an anomalous event. The determination of an event's cause(s) is made by the on-duty computer operator, who enters the event in a paper based daily log, and by the analyst entering the data into the reporting system. The determination of the event's cause(s) impacts both the operational status of these systems, and the performance evaluation of the on-site computer and communication operations contractor.

  1. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
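
    A DEVS atomic model is specified by a time-advance function, an output function, and internal and external transition functions. The sketch below (a minimal illustrative Python rendering of the formalism, not Zeigler's code) models a device whose confirming response is due exactly service_time after a command arrives, which is precisely the time window the event-based controller watches:

        INF = float("inf")

        class Processor:
            def __init__(self, service_time):
                self.s = service_time
                self.job = None
            def ta(self):                # time advance to next internal event
                return self.s if self.job is not None else INF
            def output(self):            # emitted just before delta_int fires
                return ("done", self.job)
            def delta_int(self):         # internal transition: job completed
                self.job = None
            def delta_ext(self, e, x):   # external transition after elapsed
                if self.job is None:     # time e: accept command x if idle
                    self.job = x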

  2. Three Dimensional Simulation of the Baneberry Nuclear Event

    SciTech Connect

    Lomov, I

    2003-07-16

    Baneberry, a 10-kiloton nuclear event, was detonated at a depth of 278 m at the Nevada Test Site on December 18, 1970. Shortly after detonation, radioactive gases emanating from the cavity were released into the atmosphere through a shock-induced fissure near surface ground zero. Extensive geophysical investigations, coupled with a series of 1D and 2D computational studies were used to reconstruct the sequence of events that led to the catastrophic failure. However, the geological profile of the Baneberry site is complex and inherently three-dimensional, which meant that some geological features had to be simplified or ignored in the 2D simulations. This left open the possibility that features unaccounted for in the 2D simulations could have had an important influence on the eventual containment failure of the Baneberry event. This paper presents results from a high-fidelity 3D Baneberry simulation based on the most accurate geologic and geophysical data available. The results are compared with available data, and contrasted against the results of the previous 2D computational studies.

  3. Attribution of extreme weather and climate events overestimated by unreliable climate simulations

    NASA Astrophysics Data System (ADS)

    Bellprat, Omar; Doblas-Reyes, Francisco

    2016-03-01

    Event attribution aims to estimate the role of an external driver after the occurrence of an extreme weather and climate event by comparing the probability that the event occurs in two counterfactual worlds. These probabilities are typically computed using ensembles of climate simulations whose simulated probabilities are known to be imperfect. The implications of using imperfect models in this context are largely unknown, since the number of observed extreme events in the past is too small to conduct a robust evaluation. Using an idealized framework, this model limitation is studied by generating a large number of simulations with variable reliability in the simulated probability. The framework illustrates that unreliable climate simulations are prone to overestimate the attributable risk to climate change. Climate model ensembles tend to be overconfident in their representation of the climate variability, which leads to a systematic increase in the attributable risk to an extreme event. Our results suggest that event attribution approaches comprising a single climate model would benefit from ensemble calibration in order to account for model inadequacies, similarly to operational forecasting systems.
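
    The attributable risk in question is conventionally summarized by the fraction of attributable risk, computed from the event probabilities in the two counterfactual worlds (standard definition; the symbols are ours, not the paper's):

        FAR = 1 - p_0 / p_1

    where p_0 is the probability of the event without the external driver and p_1 the probability with it. An overconfident ensemble distorts the estimated ratio p_0 / p_1, which is how unreliable simulations bias FAR upward.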

  4. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  5. Interoperability Standards for Medical Simulation Systems

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Diallo, Saikou Y.; Padilla, Jose J.

    2012-01-01

    The Modeling and Simulation Community successfully developed and applied interoperability standards like the Distributed Interactive Simulation (DIS) protocol (IEEE 1278) and the High Level Architecture (HLA) (IEEE 1516). These standards were applied for world-wide distributed simulation events for several years. However, this paper shows that some of the assumptions and constraints underlying the philosophy of these current standards are not valid for Medical Simulation Systems. This paper describes the standards, the philosophy and the limits for medical applications and recommends necessary extensions of the standards to support medical simulation.

  6. Extended event driven molecular dynamics for simulating dense granular matter

    NASA Astrophysics Data System (ADS)

    González, S.; Risso, D.; Soto, R.

    2009-12-01

    A new numerical method is presented to efficiently simulate the inelastic hard sphere (IHS) model for granular media, when fluid and frozen regions coexist in the presence of gravity. The IHS model is extended by allowing particles to change their dynamics into either a frozen state or back to the normal collisional state, while computing the dynamics only for the particles in the normal state. Careful criteria, local in time and space, are designed such that particles become frozen only at mechanically stable positions. The homogeneous deposition over a static surface and the dynamics of a rotating drum are studied as test cases. The simulations agree with previous experimental results. The model is much more efficient than the usual event driven method and makes it possible to overcome some of the difficulties of the standard IHS model, such as the existence of a static limit.
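
    At the core of any event-driven hard-sphere code, including the extension described here, is the pairwise collision-time computation. A standard sketch (illustrative Python) for spheres with relative position r, relative velocity v and contact distance sigma:

        import math

        def collision_time(r, v, sigma):
            b = sum(ri * vi for ri, vi in zip(r, v))    # r . v
            if b >= 0.0:
                return math.inf                         # moving apart
            v2 = sum(vi * vi for vi in v)
            r2 = sum(ri * ri for ri in r)
            disc = b * b - v2 * (r2 - sigma * sigma)
            if disc < 0.0:
                return math.inf                         # glancing miss
            return (-b - math.sqrt(disc)) / v2          # root of |r + v*t| = sigma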

  7. Temperature-accelerated dynamics for simulation of infrequent events

    SciTech Connect

    Soerensen, Mads R.; Voter, Arthur F.

    2000-06-01

    We present a method for accelerating dynamic simulations of activated processes in solids. By raising the temperature, but allowing only those events that should occur at the original temperature, the time scale of a simulation is extended by orders of magnitude compared to ordinary molecular dynamics, while preserving the correct dynamics at the original temperature. The main assumption behind the method is harmonic transition state theory. Importantly, the method does not require any prior knowledge about the transition mechanisms. As an example, the method is applied to a study of surface diffusion, where concerted processes play a key role. In the example, times of hours are achieved at a temperature of 150 K.
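
    Under the harmonic transition state theory assumption, an event observed at time t_high at the raised temperature T_high is mapped back to the original temperature T_low by the standard relation (E_a is the barrier of the observed event, k_B Boltzmann's constant):

        t_low = t_high * exp[ (E_a / k_B) * (1/T_low - 1/T_high) ]

    Accepting only the event with the earliest extrapolated t_low is what restricts the high-temperature trajectory to those events that should occur at the original temperature.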

  8. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
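
    The two-part structure, stochastic demand plus a constrained resource pool, maps naturally onto a process-oriented simulator. A compact sketch using the open-source SimPy package (our illustration, not the authors' model; all parameters are made up):

        import random
        import simpy

        def request(env, servers, service_mean):
            with servers.request() as slot:      # queue for a virtual server
                yield slot
                yield env.timeout(random.expovariate(1.0 / service_mean))

        def arrivals(env, servers, rate, service_mean):
            while True:                          # Poisson stream of requests
                yield env.timeout(random.expovariate(rate))
                env.process(request(env, servers, service_mean))

        env = simpy.Environment()
        pool = simpy.Resource(env, capacity=8)   # provisioned cloud capacity
        env.process(arrivals(env, pool, rate=5.0, service_mean=1.0))
        env.run(until=1000)

    Sweeping the capacity against measured waiting times is the kind of provisioning trade-off such a model framework is built to quantify.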

  9. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    SciTech Connect

    Wilke, Jeremiah J; Kenny, Joseph P.

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme- scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.

  10. 3D Simulation of External Flooding Events for the RISMC Pathway

    SciTech Connect

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad; Smith, Curtis; Lin, Linyu

    2015-09-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated simulated-physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and provides visual understanding that helps validate the flooding analysis.
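
    In Smoothed Particle Hydrodynamics, a field A is evaluated at a point r by kernel-weighted summation over neighboring particles (the standard SPH interpolant; the symbols are the usual ones, not notation from this report):

        A(r) ≈ Σ_j m_j (A_j / ρ_j) W(|r - r_j|, h)

    where m_j and ρ_j are particle masses and densities and W is a smoothing kernel of support radius h. It is this mesh-free form that lets simulated flood water fragment and splash around plant structures.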

  11. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning, and the emergency evacuation of large commercial shopping areas, as typical service systems, is one of the hot research topics. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation from a commercial shopping mall. Pedestrian movement is based on the Cellular Automata and the event driven model. In this paper, the event driven model is adopted to simulate pedestrian movement patterns; the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For the simulation of pedestrians' movement routes, the model takes into account the purchase intention of customers and the density of pedestrians. Based on the evacuation model of Cellular Automata with the Dynamic Floor Field and the event driven model, we can reflect the behavior characteristics of customers and clerks in normal situations and during emergency evacuation. The distribution of individual evacuation times as a function of initial position and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model combining Cellular Automata with a Dynamic Floor Field and event driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
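
    In floor-field CA models of this kind, the probability that a pedestrian steps to a neighboring cell (i, j) is typically weighted by the static field S (distance to exits) and the dynamic field D (virtual traces left by other pedestrians); a common form of the rule (the standard floor-field formulation, not necessarily the exact one used here) is

        P_ij ∝ exp(k_S * S_ij) * exp(k_D * D_ij) * (1 - n_ij)

    where n_ij is the occupancy of the target cell and the coupling constants k_S and k_D set the relative strength of exit-seeking versus herding.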

  12. Cardiovascular Events in Systemic Lupus Erythematosus

    PubMed Central

    Fernández-Nebro, Antonio; Rúa-Figueroa, Íñigo; López-Longo, Francisco J.; Galindo-Izquierdo, María; Calvo-Alén, Jaime; Olivé-Marqués, Alejandro; Ordóñez-Cañizares, Carmen; Martín-Martínez, María A.; Blanco, Ricardo; Melero-González, Rafael; Ibáñez-Rúan, Jesús; Bernal-Vidal, José Antonio; Tomero-Muriel, Eva; Uriarte-Isacelaya, Esther; Horcada-Rubio, Loreto; Freire-González, Mercedes; Narváez, Javier; Boteanu, Alina L.; Santos-Soler, Gregorio; Andreu, José L.; Pego-Reigosa, José M.

    2015-01-01

    This article estimates the frequency of cardiovascular (CV) events that occurred after diagnosis in a large Spanish cohort of patients with systemic lupus erythematosus (SLE) and investigates the main risk factors for atherosclerosis. RELESSER is a nationwide multicenter, hospital-based registry of SLE patients. This is a cross-sectional study. Demographic and clinical variables, the presence of traditional risk factors, and CV events were collected. A CV event was defined as a myocardial infarction, angina, stroke, and/or peripheral artery disease. Multiple logistic regression analysis was performed to investigate the possible risk factors for atherosclerosis. From 2011 to 2012, 3658 SLE patients were enrolled. Of these, 374 (10.9%) patients suffered at least one CV event. In 269 (7.4%) patients, the CV events occurred after SLE diagnosis (86.2% women, median [interquartile range] age 54.9 years [43.2–66.1], and SLE duration of 212.0 months [120.8–289.0]). Strokes (5.7%) were the most frequent CV event, followed by ischemic heart disease (3.8%) and peripheral artery disease (2.2%). Multivariate analysis identified age (odds ratio [95% confidence interval], 1.03 [1.02–1.04]), hypertension (1.71 [1.20–2.44]), smoking (1.48 [1.06–2.07]), diabetes (2.2 [1.32–3.74]), dyslipidemia (2.18 [1.54–3.09]), neurolupus (2.42 [1.56–3.75]), valvulopathy (2.44 [1.34–4.26]), serositis (1.54 [1.09–2.18]), antiphospholipid antibodies (1.57 [1.13–2.17]), low complement (1.81 [1.12–2.93]), and azathioprine (1.47 [1.04–2.07]) as risk factors for CV events. We have confirmed that SLE patients suffer a high prevalence of premature CV disease. Both traditional and nontraditional risk factors contribute to this higher prevalence. Although it needs to be verified with future studies, our study also shows—for the first time—an association between diabetes and CV events in SLE patients. PMID:26200625

  13. Weather Climate Interactions and Extreme Events in the Climate System

    NASA Astrophysics Data System (ADS)

    Redmond, K. T.

    2014-12-01

    The most pronounced local impacts of climate change would occur in association with extreme weather events superimposed on the altered climate. Thus a major thrust of recent efforts in the climate community has been to assess how extreme regional events such as cold air outbreaks, heat waves, tropical cyclones, floods, droughts, and severe weather might change with the climate. Many of these types of events are poorly simulated in climate models because of insufficient spatial resolution and inadequate parameterization of sub-grid-scale convection and radiation processes. This talk summarizes examples, selected from those discussed below, of how weather and climate events can be interconnected so that the physics of natural climate and weather phenomena depend on each other, thereby complicating our ability to simulate extreme events. A major focus of the chapter is the Madden-Julian oscillation (MJO), which is associated with alternating eastward-moving planetary-scale regions of enhanced and suppressed moist deep convection favoring warm-pool regions in the tropics. The MJO modulates weather events around the world and influences the evolution of interannual climate variability. We first discuss how the MJO evolves together with the seasonal cycle, the El Niño/Southern Oscillation (ENSO), and the extratropical circulation, then continue with a case-study illustration of how El Niño is intrinsically coupled to intraseasonal and synoptic weather events such as the MJO and westerly wind bursts. This interconnectedness in the system implies that modeling many types of regional extreme weather events requires more than simply downscaling coarse climate model signals to nested regional models, because extreme outcomes in a region can depend on poorly simulated extreme weather in distant parts of the world. The authors hope that an improved understanding of these types of interactions between signals across scales of time and space will ultimately yield

  14. Weather Climate Interactions and Extreme Events in the Climate System

    NASA Astrophysics Data System (ADS)

    Roundy, P. E.

    2015-12-01

    The most pronounced local impacts of climate change would occur in association with extreme weather events superimposed on the altered climate. Thus a major thrust of recent efforts in the climate community has been to assess how extreme regional events such as cold air outbreaks, heat waves, tropical cyclones, floods, droughts, and severe weather might change with the climate. Many of these types of events are poorly simulated in climate models because of insufficient spatial resolution and inadequate parameterization of sub-grid-scale convection and radiation processes. This talk summarizes examples, selected from those discussed below, of how weather and climate events can be interconnected so that the physics of natural climate and weather phenomena depend on each other, thereby complicating our ability to simulate extreme events. A major focus of the chapter is the Madden-Julian oscillation (MJO), which is associated with alternating eastward-moving planetary-scale regions of enhanced and suppressed moist deep convection favoring warm-pool regions in the tropics. The MJO modulates weather events around the world and influences the evolution of interannual climate variability. We first discuss how the MJO evolves together with the seasonal cycle, the El Niño/Southern Oscillation (ENSO), and the extratropical circulation, then continue with a case-study illustration of how El Niño is intrinsically coupled to intraseasonal and synoptic weather events such as the MJO and westerly wind bursts. This interconnectedness in the system implies that modeling many types of regional extreme weather events requires more than simply downscaling coarse climate model signals to nested regional models, because extreme outcomes in a region can depend on poorly simulated extreme weather in distant parts of the world. The authors hope that an improved understanding of these types of interactions between signals across scales of time and space will ultimately yield

  15. The Flexible Rare Event Sampling Harness System (FRESHS)

    NASA Astrophysics Data System (ADS)

    Kratzer, Kai; Berryman, Joshua T.; Taudt, Aaron; Zeman, Johannes; Arnold, Axel

    2014-07-01

    We present the software package FRESHS (http://www.freshs.org) for parallel simulation of rare events using sampling techniques from the ‘splitting’ family of methods. Initially, Forward Flux Sampling (FFS) and Stochastic Process Rare Event Sampling (SPRES) have been implemented. These two methods together make rare event sampling available for both quasi-static and full non-equilibrium regimes. Our framework provides a plugin system for software implementing the underlying physics of the system of interest. At present, example plugins exist for our framework to steer the popular MD packages GROMACS, LAMMPS and ESPResSo, but due to the simple interface of our plugin system, it is also easy to attach other simulation software or self-written code. Use of our framework does not require recompilation of the simulation program. The modular structure allows the flexible implementation of further sampling methods or physics engines and creates a basis for objective comparison of different sampling algorithms. Our code is designed to make optimal use of available compute resources. System states are managed using standard database technology so as to allow checkpointing, scaling and flexible analysis. The communication within the framework uses plain TCP/IP networking and is therefore suited to high-performance parallel hardware as well as to distributed or even heterogeneous networks of inexpensive machines. For FFS we implemented an automatic interface placement that ensures optimal, nearly constant flux through the interfaces. We introduce ‘ghost’ (or ‘look-ahead’) runs that remedy the bottleneck which occurs when progressing to the next interface. FRESHS is open-source, providing a publicly available parallelized rare event sampling system.
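
    As a rough illustration of the 'splitting' idea behind FFS (not FRESHS's actual plugin API, which is documented at http://www.freshs.org), a direct-FFS driver in Python might look like the following; the user-supplied propagate function stands in for the physics engine:

        import random

        def direct_ffs(states0, propagate, interfaces, m_trials):
            """Direct forward-flux sampling (a sketch). `propagate(x, lam_next, lam_A)`
            must run the user's dynamics from state x until the order parameter
            reaches lam_next (return (True, x_new)) or falls back below lam_A
            (return (False, None))."""
            states, p_AB = states0, 1.0
            for lam_next in interfaces[1:]:
                successes = []
                for _ in range(m_trials):
                    ok, x_new = propagate(random.choice(states), lam_next, interfaces[0])
                    if ok:
                        successes.append(x_new)
                if not successes:
                    return 0.0, []        # no trajectory crossed; refine the interfaces
                p_AB *= len(successes) / m_trials
                states = successes        # crossing configurations seed the next stage
            return p_AB, states

    The rate then follows as the flux through the first interface multiplied by p_AB; the automatic interface placement mentioned above would adjust the `interfaces` list so that each crossing probability stays near a target value.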

  16. Simulating and Forecasting Flooding Events in the City of Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Ghostine, Rabih; Viswanadhapalli, Yesubabu; Hoteit, Ibrahim

    2014-05-01

    Metropolitan cities in the Kingdom of Saudi Arabia, such as Jeddah and Riyadh, are experiencing more frequent flooding events caused by strong convective storms that produce intense precipitation over a short span of time. The flooding in the city of Jeddah in November 2009 was described by civil defense officials as the worst in 27 years; as of January 2010, 150 people were reported killed and more than 350 were missing. Another flooding event, less damaging but comparably dramatic, occurred one year later (January 2011) in Jeddah. Anticipating floods before they occur could minimize human and economic losses through the implementation of appropriate protection, provision, and rescue plans. We have developed a coupled hydro-meteorological model for simulating and predicting flooding events in the city of Jeddah. We use the Weather Research and Forecasting (WRF) model, assimilating all available data in the Jeddah region, to simulate the storm events. The resulting rainfall is then supplied at 10-minute intervals to an advanced numerical shallow-water model that has been discretized on an unstructured grid using different numerical schemes based on finite-element or finite-volume techniques. The model was integrated on a high-resolution grid with cell sizes varying between 0.5 m within the streets of Jeddah and 500 m outside the city. This contribution presents the flooding simulation system and the simulation results, focusing on how the different numerical schemes affect system performance in terms of accuracy and computational efficiency.
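
    For flavour, a single finite-volume update of the 1D shallow-water equations with a Rusanov (local Lax-Friedrichs) flux can be written as below; this is a generic textbook scheme on a uniform grid, not the unstructured-mesh discretization used in the study:

        import numpy as np

        def swe_step(h, hu, dx, dt, g=9.81):
            """One Rusanov finite-volume step for the 1D shallow-water equations.
            Boundary cells are left fixed for brevity; dt must satisfy the CFL
            condition for the returned state to be stable."""
            def flux(h_, hu_):
                u = hu_ / np.maximum(h_, 1e-8)   # guard against dry cells
                return np.array([hu_, hu_ * u + 0.5 * g * h_**2])
            U = np.array([h, hu])
            F = flux(h, hu)
            a = np.max(np.abs(hu / np.maximum(h, 1e-8)) + np.sqrt(g * h))  # max wave speed
            # interface flux F_{i+1/2} = (F_i + F_{i+1})/2 - a (U_{i+1} - U_i)/2
            F_half = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
            U_new = U.copy()
            U_new[:, 1:-1] -= dt / dx * (F_half[:, 1:] - F_half[:, :-1])
            return U_new[0], U_new[1]

    Higher-order finite-element or finite-volume schemes of the kind compared in the study refine this basic conservative update while keeping the same flux-differencing structure.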

  17. LAN attack detection using Discrete Event Systems.

    PubMed

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in the ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make the IP-MAC pairing static, modify the existing ARP, or patch the operating systems of all the hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks that does not require any extra constraints such as static IP-MAC pairs or changes to the ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, which would render the same DES models for both cases; to create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic in the LAN. A DES detector is then built to determine, from observed ARP-related events, whether the LAN is operating under a normal or compromised situation. The scheme minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine/spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM), denial of service, etc. The scheme is successfully validated in a test bed. PMID:20804980
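
    The flavour of the detector's decision can be conveyed with a toy Python classifier over probe-response events; the paper itself builds formal DES models with states and transitions, so the event tuples and window logic below are purely illustrative assumptions:

        def classify_probe_window(events):
            """events: ('response', ip, mac) tuples observed after actively
            probing one IP. A single consistent responder looks genuine;
            conflicting MACs for the same IP indicate spoofing; silence
            before the timeout leaves the pair unresolved."""
            macs = {mac for kind, _ip, mac in events if kind == 'response'}
            if len(macs) == 1:
                return 'genuine'
            if len(macs) > 1:
                return 'spoofed'      # two hosts answered for the same IP
            return 'unresolved'

    In the DES formulation, sequences like these are transitions of an automaton, and the detector decides which model (normal or compromised) is consistent with the observed event trace.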

  19. Constraints on Cumulus Parameterization from Simulations of Observed MJO Events

    NASA Technical Reports Server (NTRS)

    Del Genio, Anthony; Wu, Jingbo; Wolf, Audrey B.; Chen, Yonghua; Yao, Mao-Sung; Kim, Daehyun

    2015-01-01

    Two recent activities offer an opportunity to test general circulation model (GCM) convection and its interaction with large-scale dynamics for observed Madden-Julian oscillation (MJO) events. This study evaluates the sensitivity of the Goddard Institute for Space Studies (GISS) GCM to entrainment, rain evaporation, downdrafts, and cold pools. Single Column Model versions that restrict weakly entraining convection produce the most realistic dependence of convection depth on column water vapor (CWV) during the Atmospheric Radiation Measurement MJO Investigation Experiment at Gan Island. Differences among models are primarily at intermediate CWV where the transition from shallow to deeper convection occurs. GCM 20-day hindcasts during the Year of Tropical Convection that best capture the shallow–deep transition also produce strong MJOs, with significant predictability compared to Tropical Rainfall Measuring Mission data. The dry anomaly east of the disturbance on hindcast day 1 is a good predictor of MJO onset and evolution. Initial CWV there is near the shallow–deep transition point, implicating premature onset of deep convection as a predictor of a poor MJO simulation. Convection weakly moistens the dry region in good MJO simulations in the first week; weakening of large-scale subsidence over this time may also affect MJO onset. Longwave radiation anomalies are weakest in the worst model version, consistent with previous analyses of cloud/moisture greenhouse enhancement as the primary MJO energy source. The authors’ results suggest that both cloud-/moisture-radiative interactions and convection–moisture sensitivity are required to produce a successful MJO simulation.

  20. Simulated cold events in the northern North Atlantic during the last millennium

    NASA Astrophysics Data System (ADS)

    Moreno-Chamarro, Eduardo; Zanchettin, Davide; Lohmann, Katja; Jungclaus, Johann

    2014-05-01

    Paleoceanographic data show large inter-decadal cold excursions in sea-surface temperatures (SSTs) in the western subpolar gyre region and north of Iceland throughout the last millennium. A series of such events could have contributed to the demise of the Norse settlements in Greenland during the 13th to the 15th century due to the associated deteriorating environmental conditions in the region. However, the spatial extent, attribution, and mechanism(s) of these cold events are not known. In this contribution, we use climate model simulations to clarify the role of the ocean and of coupled ocean-atmosphere dynamics in triggering these cold events, and to assess whether they can be explained by internal climate variability alone. Specifically, we investigate North Atlantic-Arctic climate variability in a 1000-year control run describing an unperturbed pre-industrial climate, and in a 3-member ensemble of full-forcing transient simulations of the last millennium. Simulations are performed with the Max Planck Institute Earth System Model for paleo-applications. In the control and transient simulations, we identified cold events of amplitude and duration similar to those in the reconstructed data. Spatial patterns and temporal evolutions of simulated cold events are similar in both simulation types. In the transient runs, furthermore, they do not robustly coincide with periods of strong external forcing (e.g., major volcanic eruptions). We therefore conclude that such events can emerge from internally generated regional climate variability alone. Local ocean-atmosphere coupled processes in the North Atlantic subpolar gyre region appear to be a key part of the mechanism of simulated cold events. In particular, the events are typically associated with the onset of prolonged positive sea-level pressure anomalies over the North Atlantic and an associated weaker and south-eastward displaced subpolar gyre. The salt transport reduction by the Irminger Current together with an intensification of the

  1. Exploring team avionics systems by simulation

    NASA Technical Reports Server (NTRS)

    Brent, G. A.; Mccalla, T. M., Jr.

    1978-01-01

    Configurations of software and hardware in a no-critical-element team architecture are under study for future general aviation aircraft avionics. The team integrated avionics system, based on microprocessors, can monitor and partially interpret all flight instrument data, engine parameters, and navigation information faster than a human pilot. Simulation programs based on an event-oriented simulation language are being used to design team architectures.

  2. Importance of Model Simulations in Cassini In-Flight Mission Events

    NASA Technical Reports Server (NTRS)

    Brown, Jay; Wang, Eric; Hernandez, Juan; Lee, Allan Y.

    2009-01-01

    Simulation environments have been an integral part of Cassini's heritage. From the time of flight software development and testing to the beginning of the spacecraft's extended mission operations, both softsim and hardware-in-the-loop testbeds have played vital roles in verifying and validating key mission events. Satellite flybys and mission-critical events have established the need to model Titan's atmospheric torque, Enceladus' plume density, and other key parametric spacecraft environments. This paper focuses on enhancements to Cassini's Flight Software Development System (FSDS) and Integrated Test Laboratory (ITL) to model key event attributes, which establish valid test environments and ensure safe spacecraft operability. Comparisons between simulated and in-flight data are presented that substantiate model validity.

  3. Transportation Analysis Simulation System

    SciTech Connect

    2004-08-23

    TRANSIMS version 3.1 is an integrated set of analytical and simulation models and supporting databases. The system is designed to create a virtual metropolitan region with a representation of each of the region's individuals, their activities, and the transportation infrastructure they use. TRANSIMS puts into practice a new, disaggregate approach to travel demand modeling using agent-based micro-simulation technology. The TRANSIMS methodology creates a virtual metropolitan region with representation of the transportation infrastructure and the population at the level of households and individual travelers. Trips are planned to satisfy the population's activity patterns at the individual traveler level. TRANSIMS then simulates the movement of travelers and vehicles across the transportation network using multiple modes, including car, transit, bike, and walk, on a second-by-second basis. Metropolitan planners must plan the growth of their cities according to the stringent transportation system planning requirements of the Intermodal Surface Transportation Efficiency Act of 1991, the Clean Air Act Amendments of 1990, and other similar laws and regulations. These require each state and its metropolitan regions to work together to develop short- and long-term transportation improvement plans. The plans must (1) estimate the future transportation needs for travelers and goods movements, (2) evaluate ways to manage and reduce congestion, (3) examine the effectiveness of building new roads and transit systems, and (4) limit the environmental impact of the various strategies. The needed consistent and accurate transportation improvement plans require an analytical capability that properly accounts for travel demand, human behavior, traffic and transit operations, major investments, and environmental effects. Other existing planning tools use aggregated information and representative behavior to predict average response and average use of transportation facilities. They do not account

  5. The neural basis of event simulation: an FMRI study.

    PubMed

    Yomogida, Yukihito; Sugiura, Motoaki; Akimoto, Yoritaka; Miyauchi, Carlos Makoto; Kawashima, Ryuta

    2014-01-01

    Event simulation (ES) is the situational inference process in which perceived event features such as objects, agents, and actions are associated in the brain to represent the whole situation. ES provides a common basis for various cognitive processes, such as perceptual prediction, situational understanding/prediction, and social cognition (such as mentalizing/trait inference). Here, functional magnetic resonance imaging was used to elucidate the neural substrates underlying important subdivisions within ES. First, the study investigated whether ES depends on different neural substrates when it is conducted explicitly and implicitly. Second, the existence of neural substrates specific to the future-prediction component of ES was assessed. Subjects were shown contextually related object pictures implying a situation and performed several picture-word-matching tasks. By varying task goals, subjects were made to infer the implied situation implicitly/explicitly or predict the future consequence of that situation. The results indicate that, whereas implicit ES activated the lateral prefrontal cortex and medial/lateral parietal cortex, explicit ES activated the medial prefrontal cortex, posterior cingulate cortex, and medial/lateral temporal cortex. Additionally, the left temporoparietal junction plays an important role in the future-prediction component of ES. These findings enrich our understanding of the neural substrates of the implicit/explicit/predictive aspects of ES-related cognitive processes. PMID:24789353

  6. Rare-event simulation methods for equilibrium and non-equilibrium events

    NASA Astrophysics Data System (ADS)

    Ziff, Robert

    2014-03-01

    Rare events are those that occur with a very low probability in experiment, or are common but difficult to sample using standard computer simulation techniques. Such processes require advanced methods in order to obtain useful results in reasonable amounts of computer time. We discuss some of those techniques here, including the "barrier" method, splitting methods, and a Forward-Flux Sampling in Time (FFST) algorithm, and apply them to measure the nucleation times of the first-order transition in the Ziff-Gulari-Barshad model of surface catalysis, including nucleation in finite equilibrium states, which are measured to occur with probabilities as low as 10^(-40). We also study the transitions in the Maier-Stein model of chemical kinetics, and use the methods to find the harmonic measure in percolation and Diffusion-Limited Aggregation (DLA) clusters. Co-authors: David Adams, Google, and Leonard Sander, University of Michigan.

  7. Simulating neural systems with Xyce.

    SciTech Connect

    Schiek, Richard Louis; Thornquist, Heidi K.; Mei, Ting; Warrender, Christina E.; Aimone, James Bradley; Teeter, Corinne; Duda, Alex M.

    2012-12-01

    Sandia's parallel circuit simulator, Xyce, can address large-scale neuron simulations in a new way, extending the range within which one can perform high-fidelity, multi-compartment neuron simulations. This report documents the implementation of neuron devices in Xyce and their use in the simulation and analysis of neuron systems.

  8. Production of Nitrogen Oxides by Laboratory Simulated Transient Luminous Events

    NASA Astrophysics Data System (ADS)

    Peterson, H.; Bailey, M.; Hallett, J.; Beasley, W.

    2007-12-01

    Restoration of the polar stratospheric ozone layer has occurred at rates below those originally expected following reductions in chlorofluorocarbon (CFC) usage. Additional reactions affecting ozone depletion must now also be considered. This research examines nitrogen oxides (NOx) produced in the middle atmosphere by transient luminous events (TLEs), with NOx production in this layer contributing to the loss of stratospheric ozone. In particular, NOx produced by sprites in the mesosphere would be transported to the polar stratosphere via the global meridional circulation and downward diffusion. A pressure-controlled vacuum chamber was used to simulate middle-atmosphere pressures, while a power supply and in-chamber electrodes were used to simulate TLEs in the pressure-controlled environment. Chemiluminescence NOx analyzers were used to sample NOx produced by the chamber discharges: originally a Monitor Labs Model 8440E, later a Thermo Environment Model 42. Total NOx production for each discharge, as well as NOx per ampere of current and NOx per joule of discharge energy, were plotted. Absolute NOx production was greatest for discharge environments with upper-tropospheric pressures (100-380 torr), while NOx/J was greatest for discharge environments with stratospheric pressures (around 10 torr). The different production efficiencies in NOx/J as a function of pressure pointed to three different production regimes, each with its own reaction mechanisms: one for tropospheric pressures, one for stratospheric pressures, and one for upper-stratospheric to mesospheric pressures (no greater than 1 torr).

  9. Aging and brain rejuvenation as systemic events

    PubMed Central

    Bouchard, Jill; Villeda, Saul A

    2015-01-01

    The effects of aging were traditionally thought to be immutable, particularly evident in the loss of plasticity and cognitive abilities occurring in the aged central nervous system (CNS). However, it is becoming increasingly apparent that extrinsic systemic manipulations such as exercise, caloric restriction, and changing blood composition by heterochronic parabiosis or young plasma administration can partially counteract this age-related loss of plasticity in the aged brain. In this review, we discuss the process of aging and rejuvenation as systemic events. We summarize genetic studies that demonstrate a surprising level of malleability in organismal lifespan, and highlight the potential for systemic manipulations to functionally reverse the effects of aging in the CNS. Based on mounting evidence, we propose that rejuvenating effects of systemic manipulations are mediated, in part, by blood-borne ‘pro-youthful’ factors. Thus, systemic manipulations promoting a younger blood composition provide effective strategies to rejuvenate the aged brain. As a consequence, we can now consider reactivating latent plasticity dormant in the aged CNS as a means to rejuvenate regenerative, synaptic, and cognitive functions late in life, with potential implications even for extending lifespan. PMID:25327899

  10. Features, Events, and Processes: System Level

    SciTech Connect

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than within supporting process-level or subsystem-level analysis and model reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations, or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  11. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    SciTech Connect

    Jaros, W.

    2005-08-30

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS, focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, "Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation" (BSC 2005 [DIRS 174995]), "Clad Degradation--FEPs Screening Arguments" (BSC 2004 [DIRS 170019]), and "Waste-Form Features, Events, and Processes" (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes have resulted from a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project (YMP) administrative procedures as they

  12. Hydrogen Event Containment Response Code System.

    1999-11-23

    Version: 00. Distribution is restricted to the United States only. HECTR 1.5 (Hydrogen Event-Containment Transient Response) is a lumped-volume containment analysis program that is most useful for performing parametric studies. Its main purpose is to analyze nuclear reactor accidents involving the transport and combustion of hydrogen, but HECTR can also function as an experiment analysis tool and can solve a limited set of other containment problems. Six gases (steam, nitrogen, oxygen, hydrogen, carbon monoxide, and carbon dioxide) are modeled, along with sumps containing liquid water. HECTR can model virtually all the containment systems of importance in ice condenser, large dry, and Mark III containments. A postprocessor, ACHILES 1.5, is included. It processes the time-dependent variable output (compartment pressures, flow junction velocities, surface temperatures, etc.) produced by HECTR. ACHILES can produce tables and graphs of these data.

  13. Sensitivity of a Simulated Derecho Event to Model Initial Conditions

    NASA Astrophysics Data System (ADS)

    Wang, Wei

    2014-05-01

    Since 2003, the MMM division at NCAR has been experimenting with cloud-permitting-scale weather forecasting using the Weather Research and Forecasting (WRF) model. Over the years, we have tested different model physics and tried different initial and boundary conditions. Not surprisingly, we found that the model's forecasts are more sensitive to the initial conditions than to the model physics. In the 2012 real-time experiment, WRF-DART (Data Assimilation Research Testbed) at 15 km was employed to produce initial conditions for a twice-a-day forecast at 3 km. On June 29, this forecast system captured one of the most destructive derecho events on record. In this presentation, we will examine forecast sensitivity to different model initial conditions and try to understand the important features that may have contributed to the success of the forecast.

  14. Numerical simulation of a continental-scale Saharan dust event

    NASA Astrophysics Data System (ADS)

    Shao, Yaping; Fink, Andreas H.; Klose, Martina

    2010-07-01

    Using an integrated dust-storm modeling system, we simulate the severe Saharan dust episode between 1 and 10 March 2004. The simulations are compared with surface synoptic data and satellite images and are found to agree well with the observations. The synoptic systems that generated the dust storms and the evolution of the dust patterns are analyzed. It is revealed that a cyclogenesis over the central Sahara, accompanied by an anticyclone over the Atlantic and a monsoon trough in the tropics, was responsible for the widespread continental-scale dust storms in North Africa. Dust first appeared in the west Sahara, then in the east Sahara, and much of the dust emitted from the east Sahara was transported to the monsoon trough, resulting in high concentrations of suspended dust over the Sahel (column dust load up to 10 g m^-2). The main dust source regions were (1) Mauritania, (2) Chad and Niger, and (3) Libya, Egypt, and Sudan. The region between 10°N and 17°N was one of net dust deposition. We estimate that 715.8 megatons (Mt) of dust were emitted from North Africa during the episode, 608.2 Mt of which were deposited to the continent, giving a net dust emission of 107.6 Mt. Of the 107.6 Mt, with respect to the model domain, 7.3 Mt were deposited to the ocean, 79.8 Mt were transported across the domain boundaries, and 20.5 Mt remained suspended in the atmosphere.

  15. Multi-timescale event-scheduling in multi-agent immune simulation models.

    PubMed

    Guo, Zaiyi; Tay, Joc Cing

    2008-01-01

    Multi-agent (MA) based design approaches have received much research attention lately for modeling immunological systems, due to their efficacy in representing non-homogeneous behaviors in a population under dynamic environmental and topological conditions. The update scheme of an MA model refers to the frequency of agent state updates and how these are related in temporal order. In contrast to verifiable agent behavioral rules at the individual level, the update scheme is a design decision made by the model developer at the systems level, subject to realism and computational efficiency issues that directly affect the credibility and usefulness of the simulation results. Previous works have mainly focused on the issue of realism with respect to synchrony of updates but have often overlooked the heterogeneity in update frequencies required by the multiple timescales of immunological phenomena. When such multi-timescale behavior is incorporated for realism, the efficiency of the update scheme becomes a nontrivial issue. An event-scheduling-based asynchronous update scheme has the advantage of allowing arbitrarily small timescales for realism while avoiding unnecessary execution and delays, for efficiency. In this paper we present the application of the event-scheduling update scheme to realistically model the B cell life cycle, and we empirically compare its simulation performance with the widely adopted uniform time-step update scheme. The simulation results show a significantly reduced execution time (40 times faster) and also reveal the conditions under which the event-scheduling update scheme is superior.
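
    A minimal event-scheduling core of the kind the paper compares against uniform time-stepping can be written with a priority queue; the class below is a generic Python sketch (names and structure assumed, not the authors' code):

        import heapq

        class EventScheduler:
            """Asynchronous update scheme: each agent fires exactly when its next
            event is due, so fast processes (e.g. receptor binding) and slow ones
            (e.g. B cell division) coexist without one small global time step."""
            def __init__(self):
                self.now = 0.0
                self._queue = []
                self._n = 0                 # tie-breaker so actions are never compared
            def schedule(self, delay, action):
                heapq.heappush(self._queue, (self.now + delay, self._n, action))
                self._n += 1
            def run(self, t_end):
                while self._queue and self._queue[0][0] <= t_end:
                    self.now, _, action = heapq.heappop(self._queue)
                    action(self)            # an action may schedule follow-up events

    With a uniform time step, every agent must be visited at the resolution of the fastest process; with event scheduling, work is proportional to the number of events that actually occur, which is the source of speedups of the magnitude reported above.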

  16. A Madden-Julian oscillation event realistically simulated by a global cloud-resolving model.

    PubMed

    Miura, Hiroaki; Satoh, Masaki; Nasuno, Tomoe; Noda, Akira T; Oouchi, Kazuyoshi

    2007-12-14

    A Madden-Julian Oscillation (MJO) is a massive weather event consisting of deep convection coupled with atmospheric circulation, moving slowly eastward over the Indian and Pacific Oceans. Despite its enormous influence on many weather and climate systems worldwide, it has proven very difficult to simulate an MJO because of assumptions about cumulus clouds in global meteorological models. Using a model that allows direct coupling of the atmospheric circulation and clouds, we successfully simulated the slow eastward migration of an MJO event. Topography, the zonal sea surface temperature gradient, and the interplay between eastward- and westward-propagating signals controlled the timing of the eastward transition of the convective center. Our results demonstrate the potential for making month-long MJO predictions when global cloud-resolving models with realistic initial conditions are used.

  17. Application of Parallel Discrete Event Simulation to the Space Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jefferson, D.; Leek, J.

    2010-09-01

    In this paper we describe how and why we chose parallel discrete event simulation (PDES) as the paradigm for modeling the Space Surveillance Network (SSN) in our modeling framework, TESSA (Testbed Environment for Space Situational Awareness). DES is a simulation paradigm appropriate for systems dominated by discontinuous state changes at times that must be calculated dynamically. It is used primarily for complex man-made systems like telecommunications, vehicular traffic, computer networks, and economic models, although it is also useful for natural systems that are not described by equations, such as particle systems, population dynamics, epidemics, and combat models. It is much less well known than simple time-stepped simulation methods, but has the great advantage of being time-scale independent, so that one can freely mix processes that operate at time scales over many orders of magnitude with no runtime performance penalty. In simulating the SSN we model in some detail: (a) the orbital dynamics of up to 10^5 objects, (b) their reflective properties, (c) the ground- and space-based sensor systems in the SSN, (d) the recognition of orbiting objects and determination of their orbits, (e) the cueing and scheduling of sensor observations, (f) the 3-d structure of satellites, and (g) the generation of collision debris. TESSA is thus a mixed continuous-discrete model. But because many different types of discrete objects are involved, with a wide variation in time scale (milliseconds for collisions, hours for orbital periods), it is suitably described using discrete events. The PDES paradigm is surprising and unusual: in any instantaneous runtime snapshot some parts may be far ahead in simulation time while others lag behind, yet the required causal relationships are always maintained and synchronized correctly, exactly as if the simulation were executed sequentially. The TESSA simulator is custom-built, conservatively synchronized, and designed to scale to
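
    The conservative synchronization mentioned here can be illustrated with a round-based "safe window" sketch in Python: if every message incurs at least `lookahead` of simulation-time delay, then all events earlier than the next pending timestamp plus the lookahead can be executed without risk of a causality violation. This is a generic textbook scheme, not TESSA's custom synchronizer:

        import heapq

        class LP:
            """Logical process: its own clock and its own future-event queue."""
            def __init__(self):
                self.clock, self.queue = 0.0, []

        def run_conservative(lps, lookahead, handler, t_end):
            while True:
                nxt = min((lp.queue[0][0] for lp in lps if lp.queue), default=t_end)
                if nxt >= t_end:
                    break
                horizon = nxt + lookahead   # no new message can land below this
                for i, lp in enumerate(lps):
                    while lp.queue and lp.queue[0][0] < horizon:
                        t, payload = heapq.heappop(lp.queue)
                        lp.clock = t
                        # handler returns (dest_index, delay, payload) messages,
                        # each with delay >= lookahead, delivered in the future
                        for dst, delay, p in handler(i, t, payload):
                            heapq.heappush(lps[dst].queue, (t + delay, p))

    Each LP's inner loop could run in parallel with the others because, by construction, no event below the horizon can still arrive; this is the sense in which parts of the simulation may run far ahead in time while causal order is preserved.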

  18. Numerical simulation diagnostics of a flash flood event in Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Samman, Ahmad

    On 26 January 2011, a severe storm hit the city of Jeddah, the second largest city in the Kingdom of Saudi Arabia. The storm resulted in heavy rainfall, which produced a flash flood in a short period of time. This event caused at least eleven fatalities and more than 114 injuries. Unfortunately, the observed rainfall data are limited to the weather station at King Abdul Aziz International Airport, which is north of the city, while the most extreme precipitation occurred over the southern part of the city. This observation was still useful for comparison with the simulation results, even though it does not reflect the severity of the event. The Regional Atmospheric Modeling System (RAMS), developed at Colorado State University, was used to study this storm event. RAMS simulations indicated that a quasi-stationary mesoscale convective system developed over the city of Jeddah, lasted for several hours, and was the source of the huge amount of rainfall. The model computed a total rainfall of more than 110 mm in the southern part of the city, where the flash flood occurred. This precipitation estimate was confirmed by observations from the weather radar. While the annual winter rainfall in Jeddah varies from 50 to 100 mm, the rainfall from this single storm event exceeded the climatological total annual rainfall. The simulation of this event showed that a warm sea surface temperature, combined with high humidity in the lower atmosphere and a large amount of convective available potential energy (CAPE), provided a favorable environment for convection. It also showed the presence of a cyclonic system over the northern and eastern parts of the Mediterranean Sea and a subtropical anticyclone over northeastern Africa that contributed to cold-air advection into the Jeddah area. In addition, an anticyclone (blocking) centered over the eastern and southeastern parts of the Arabian Peninsula and the Arabian Sea produced a low-level jet over the southern

  19. Examining Passenger Flow Choke Points at Airports Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Brown, Jeremy R.; Madhavan, Poomima

    2011-01-01

    The movement of passengers through an airport quickly, safely, and efficiently is the main function of the various checkpoints (check-in, security, etc.) found in airports. Human error combined with other breakdowns in the complex system of the airport can disrupt passenger flow, leading to lengthy waiting times, missing luggage, and missed flights. In this paper we present a model of passenger flow through an airport using discrete event simulation that provides a closer look into the possible reasons for breakdowns and their implications for passenger flow. The simulation is based on data collected at Norfolk International Airport (ORF). The primary goal of this simulation is to present ways to optimize the workforce to keep passenger flow smooth even during peak travel times and to support emergency preparedness at ORF in case of adverse events. In this simulation we ran three different scenarios: the real-world configuration, increased check-in stations, and multiple waiting lines. Increased check-in stations increased waiting time and instantaneous utilization, while multiple waiting lines decreased both waiting time and instantaneous utilization. This simulation was able to show how different changes affected the passenger flow through the airport.
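
    A scenario of this kind is straightforward to express in a process-oriented DES library. The sketch below uses the open-source SimPy package with invented arrival and service rates; it is not the tool or the ORF data used in the paper:

        import random
        import simpy

        waits = []

        def passenger(env, counters):
            arrive = env.now
            with counters.request() as req:           # join the check-in queue
                yield req                             # wait for a free station
                yield env.timeout(random.expovariate(1 / 2.0))  # ~2 min mean service
            waits.append(env.now - arrive)

        def source(env, counters, rate_per_min=1.2):
            while True:
                yield env.timeout(random.expovariate(rate_per_min))
                env.process(passenger(env, counters))

        env = simpy.Environment()
        counters = simpy.Resource(env, capacity=3)    # three check-in stations
        env.process(source(env, counters))
        env.run(until=8 * 60)                         # one 8-hour shift, in minutes
        print(f"mean wait: {sum(waits) / len(waits):.1f} min over {len(waits)} passengers")

    Re-running with a different `capacity`, or with separate Resource objects to represent multiple waiting lines, is exactly the kind of what-if comparison the abstract describes.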

  20. Simulating The SSF Information System

    NASA Technical Reports Server (NTRS)

    Deshpande, Govind K.; Kleine, Henry; Younger, Joseph C.; Sanders, Felicia A.; Smith, Jeffrey L.; Aster, Robert W.; Olivieri, Jerry M.; Paul, Lori L.

    1993-01-01

    The Freedom Operations Simulation Test (FROST) computer program simulates the operation of the SSF information system, tracking every packet of data from generation to destination, for both uplinks and downlinks. It collects various statistics concerning the operation of the system and provides reports of these statistics at intervals specified by the user. FROST also incorporates a graphical-display capability to enhance interpretation of these statistics. Written in SIMSCRIPT II.5.

  1. Monte Carlo simulation of dense polymer melts using event chain algorithms.

    PubMed

    Kampmann, Tobias A; Boltz, Horst-Holger; Kierfeld, Jan

    2015-07-28

    We propose an efficient Monte Carlo algorithm for the off-lattice simulation of dense hard sphere polymer melts using cluster moves, called event chains, which allow for a rejection-free treatment of the excluded volume. Event chains also allow for an efficient preparation of initial configurations in polymer melts. We parallelize the event chain Monte Carlo algorithm to further increase simulation speeds and suggest additional local topology-changing moves ("swap" moves) to accelerate equilibration. By comparison with other Monte Carlo and molecular dynamics simulations, we verify that the event chain algorithm reproduces the correct equilibrium behavior of polymer chains in the melt. By comparing intrapolymer diffusion time scales, we show that event chain Monte Carlo algorithms can achieve simulation speeds comparable to optimized molecular dynamics simulations. The event chain Monte Carlo algorithm exhibits Rouse dynamics on short time scales. In the absence of swap moves, we find reptation dynamics on intermediate time scales for long chains.
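
    The core of an event-chain move is easiest to see in one dimension. The Python sketch below implements a rejection-free chain for hard rods on a ring (a standard illustration of the algorithm, not the authors' polymer-melt code):

        import random

        def event_chain_move(x, sigma, L, ell):
            """One event-chain move for hard rods of length sigma on a ring of
            circumference L. x must be sorted by position; ell is the total chain
            displacement budget. Every move is accepted: when a rod touches its
            right neighbour, the remaining displacement is transferred to it."""
            n = len(x)
            i = random.randrange(n)              # chain starts at a random rod
            remaining = ell
            while remaining > 0.0:
                j = (i + 1) % n
                gap = (x[j] - x[i] - sigma) % L  # free space before contact
                step = min(gap, remaining)
                x[i] = (x[i] + step) % L
                remaining -= step
                if remaining > 0.0:              # contact event: lift to rod j
                    i = j
            return x

    In the melt simulations above, the same idea operates in three dimensions along alternating directions, with the additional "swap" moves changing chain topology to accelerate equilibration.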

  2. Regional Climate Simulation of the Anomalous Events of 1998 using a Stretched-Grid GCM with Multiple Areas of Interest

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, M. S.; Takacs, L. L.; Govindaraju, R. C.; Atlas, Robert (Technical Monitor)

    2002-01-01

    The GEOS (Goddard Earth Observing System) stretched-grid (SG) GCM, developed and thoroughly tested over the last few years, is used for simulating the major anomalous regional climate events of 1998. The anomalous regional climate events are simulated simultaneously during the 13-month-long (November 1997 - December 1998) SG-GCM simulation by using the new SG design with multiple (four) areas of interest. The following areas/regions of interest (one in each global quadrant) are implemented: U.S./Northern Mexico, the El Niño/Brazil area, India-China, and Eastern Indian Ocean/Australia.

  3. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Code of Federal Regulations, Title 10 (Energy), § 50.73 - Licensee event report system (Records, Reports, Notifications). (a) Reportable events. (1) The...

  4. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    SciTech Connect

    na

    2005-05-30

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during a volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The objective of this analysis was to develop the BDCFs for the

  5. Decision support system for managing oil spill events.

    PubMed

    Keramitsoglou, Iphigenia; Cartalis, Constantinos; Kassomenos, Pavlos

    2003-08-01

    The Mediterranean environment is exposed to various hazards, including oil spills, forest fires, and floods, making the development of a decision support system (DSS) for emergency management an objective of utmost importance. The present work presents a complete DSS for managing marine pollution events caused by oil spills. The system provides all the necessary tools for early detection of oil spills from satellite images, monitoring of their evolution, estimation of the accident's consequences, and provision of support to the responsible public authorities during clean-up operations. The heart of the system is an image-processing/geographic information system, together with individual software tools that perform oil-spill evolution simulation and all other necessary numerical calculations, as well as cartographic and reporting tasks related to the management of a specific oil-spill event. The cartographic information is derived from general maps representing detailed information on regional environmental and land-cover characteristics as well as economic activities of the application area. Early notification of the authorities with up-to-date, accurate information on the position and evolution of an oil spill, combined with the detailed coastal maps, is of paramount importance for emergency assessment and effective clean-up operations that would prevent environmental harm. An application was developed for the Region of Crete, an area particularly vulnerable to oil spills due to its location, ecological characteristics, and local economic activities.

  6. Simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Tentner, A.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe-vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles) and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes, as sketched below. The vehicles have a behavior model which governs route selection and driving behavior, and they can react to external traffic events much like real vehicles. With this approach, the simulation is scaleable to take advantage of emerging massively parallel processor (MPP) systems.
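
    The "autonomous processes exchanging messages" design can be miniaturized in Python with threads and queues; the process IDs, message formats, and advisory text below are invented for illustration and do not reflect the framework's actual protocol:

        import queue
        import threading

        def vehicle(vid, to_tmc, inbox):
            """Each vehicle runs independently: report a probe position, then react."""
            to_tmc.put((vid, 'position', (0.0, 0.0)))
            advisory = inbox.get()                    # block until the TMC answers
            print(f"vehicle {vid} received: {advisory}")

        to_tmc = queue.Queue()
        inboxes = {vid: queue.Queue() for vid in range(3)}
        threads = [threading.Thread(target=vehicle, args=(vid, to_tmc, inboxes[vid]))
                   for vid in inboxes]
        for t in threads:
            t.start()
        for _ in range(len(inboxes)):                 # the TMC tracks probe vehicles
            vid, kind, pos = to_tmc.get()
            inboxes[vid].put(('advisory', 'congestion ahead, consider rerouting'))
        for t in threads:
            t.join()

    Because vehicles interact only through messages, the same design scales from threads on one workstation to processes spread across a parallel machine.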

  7. Event Plane Resolution Simulations for The Fast Interaction Trigger Detector of ALICE at the LHC

    NASA Astrophysics Data System (ADS)

    Sulaimon, Isiaka; Harton, Austin; Garcia, Edmundo; Alice-Fit Collaboration

    2016-03-01

    CERN (the European Center for Nuclear Research) is a global laboratory that studies proton and heavy-ion collisions at the Large Hadron Collider (LHC). ALICE (A Large Ion Collider Experiment) is one of the four large experiments at the LHC. ALICE is dedicated to the study of the transition of matter to the Quark-Gluon Plasma in heavy-ion collisions. In the present ALICE detector there are two sub-detectors (the T0 and V0) that provide the minimum bias trigger, multiplicity trigger, beam-gas event rejection, collision time for other sub-detectors, online multiplicity, and event plane determination. In order to adapt these functionalities to the collision rates expected for the LHC upgrade after 2020, it is planned to replace these systems with a single detector system called the Fast Interaction Trigger (FIT). In this presentation we describe the performance parameters of the FIT upgrade, show the proposed characteristics of the T0-Plus, and present the simulations that support the conceptual design of this detector. In particular, we describe the performance simulations of the event plane resolution. This material is based upon work supported by the National Science Foundation under Grants NSF-PHY-0968903 and NSF-PHY-1305280.
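
    For context, the standard Q-vector estimate of the event-plane angle and the two-sub-event resolution (the quantity such simulations target) can be computed as follows; this is the textbook method in Python, not the collaboration's simulation code:

        import numpy as np

        def event_plane_angle(phis, n=2):
            """n-th order event-plane angle Psi_n from particle azimuths, via the
            Q-vector Q_n = sum_i exp(i n phi_i)."""
            qx = np.sum(np.cos(n * phis))
            qy = np.sum(np.sin(n * phis))
            return np.arctan2(qy, qx) / n

        def sub_event_resolution(phi_a, phi_b, n=2):
            """Sub-event resolution R = sqrt(<cos n(Psi_a - Psi_b)>), where the
            average runs over events; phi_a and phi_b hold per-event azimuth arrays
            for the two independent sub-events."""
            psi_a = np.array([event_plane_angle(p, n) for p in phi_a])
            psi_b = np.array([event_plane_angle(p, n) for p in phi_b])
            return np.sqrt(np.mean(np.cos(n * (psi_a - psi_b))))

    A detector with finer segmentation and better timing improves the per-event Q-vector estimate, which is what raises the resolution figure the FIT simulations evaluate.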

  8. Towards High Performance Discrete-Event Simulations of Smart Electric Grids

    SciTech Connect

    Perumalla, Kalyan S; Nutaro, James J; Yoginath, Srikanth B

    2011-01-01

    Future electric grid technology is envisioned around the notion of a smart grid in which responsive end-user devices play an integral part in the transmission and distribution control systems. Detailed simulation is often the primary choice in analyzing small network designs, and the only choice in analyzing large-scale electric network designs. Here, we identify and articulate the high-performance computing needs underlying high-resolution discrete event simulation of smart electric grid operation in large network scenarios such as the entire Eastern Interconnect. We focus on the simulator's most computationally intensive operation, namely the dynamic numerical solution for the electric grid state, for both time-integration and event-detection. We explore solution approaches using general-purpose dense and sparse solvers, and propose a scalable solver specialized for the sparse structures of actual electric networks. Based on experiments with an implementation in the THYME simulator, we identify performance issues and possible solution approaches for smart grid experimentation in the large.
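
    To illustrate why exploiting sparsity matters for the grid-state solve, the snippet below factors a toy nodal system once with SciPy's sparse LU and reuses the factorization at every event; the tridiagonal matrix is a stand-in for a real network admittance structure, not data from the paper:

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import splu

        n = 10_000
        # Toy grid-like system Y v = i: tridiagonal and diagonally dominant.
        Y = diags([-1.0, 2.1, -1.0], offsets=[-1, 0, 1], shape=(n, n), format='csc')
        lu = splu(Y)                     # factor once; cheap here thanks to sparsity
        i_inj = np.zeros(n)
        i_inj[0] = 1.0                   # an injection changes at some event
        v = lu.solve(i_inj)              # fast re-solve at each event or time step
        print(v[:3])

    A dense factorization of the same system would cost O(n^3) and store n^2 entries, which is why solvers specialized to actual network sparsity patterns are decisive at Eastern Interconnect scale.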

  9. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism

    PubMed Central

    van Beek, Johannes H. G. M.; Supandi, Farahaniza; Gavai, Anand K.; de Graaf, Albert A.; Binsl, Thomas W.; Hettling, Hannes

    2011-01-01

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work; the rest is converted to heat. We calculated heat transfer by conduction and blood flow inside the body, and heat transfer from the skin by radiation, convection and sweat evaporation, resulting in temperature changes in 25 body compartments. We simulated a mountain time trial to Alpe d'Huez during the Tour de France. To approach the time realized by Lance Armstrong in 2004, very high oxygen uptake must be sustained by the simulated cyclist. Temperature was predicted to reach 39°C in the brain, and 39.7°C in leg muscle. In addition to the macroscopic simulation, we analysed the buffering of bursts of high adenosine triphosphate hydrolysis by creatine kinase during cyclical muscle activity at the biochemical pathway level. To investigate the low oxygen to carbohydrate ratio for the brain, which takes up lactate during exercise, we calculated the flux distribution in cerebral energy metabolism. Computational modelling of the human body, describing heat exchange and energy metabolism, makes simulation of endurance sports events feasible. PMID:21969677
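
    The flavour of the compartmental heat calculation can be captured in a few lines of Python: a single lumped muscle compartment with metabolic heat input and a linear loss term. All numbers below are rough assumptions chosen only so the steady state lands near the ~39.7°C quoted above; they are not parameters from the paper:

        # Toy one-compartment heat balance: C dT/dt = Q_muscle - h * (T - T_blood)
        P_total = 1200.0              # W, assumed whole-body metabolic rate
        Q_heat = 0.77 * P_total       # ~23% does mechanical work, the rest is heat
        Q_muscle = 0.95 * Q_heat      # assumed share dissipated in working muscle
        C = 30.0 * 3500.0             # J/K: ~30 kg of tissue at ~3.5 kJ/(kg K)
        h = 330.0                     # W/K, assumed effective loss to blood and skin
        T_blood = 37.0
        T = 37.0
        dt = 1.0                      # s, forward-Euler step
        for _ in range(3600):         # one hour of hard climbing
            T += dt * (Q_muscle - h * (T - T_blood)) / C
        print(round(T, 1))            # approaches T_blood + Q_muscle / h, ~39.7 °C

    The full model couples 25 such compartments through conduction and blood flow and adds the nonlinear skin terms (radiation, convection, sweat evaporation), but each compartment obeys the same energy-balance form.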

  10. Connecting Macroscopic Observables and Microscopic Assembly Events in Amyloid Formation Using Coarse Grained Simulations

    PubMed Central

    Bieler, Noah S.; Knowles, Tuomas P. J.; Frenkel, Daan; Vácha, Robert

    2012-01-01

    The pre-fibrillar stages of amyloid formation have been implicated in cellular toxicity, but have proved to be challenging to study directly in experiments and simulations. Rational strategies to suppress the formation of toxic amyloid oligomers require a better understanding of the mechanisms by which they are generated. We report Dynamical Monte Carlo simulations that allow us to study the early stages of amyloid formation. We use a generic, coarse-grained model of an amyloidogenic peptide that has two internal states: the first one representing the soluble random coil structure and the second one the β-sheet conformation. We find that this system exhibits a propensity towards fibrillar self-assembly following the formation of a critical nucleus. Our calculations establish connections between the early nucleation events and the kinetic information available in the later stages of the aggregation process that are commonly probed in experiments. We analyze the kinetic behaviour in our simulations within the framework of the theory of classical nucleated polymerisation, and are able to connect the structural events at the early stages in amyloid growth with the resulting macroscopic observables such as the effective nucleus size. Furthermore, the free-energy landscapes that emerge from these simulations allow us to identify pertinent properties of the monomeric state that could be targeted to suppress oligomer formation. PMID:23071427
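
    A minimal Metropolis Monte Carlo sketch of the two-internal-state idea described above is given below: each peptide is either a soluble coil or a β-sheet, flipping states costs a conversion energy, and sheet-sheet neighbor contacts are rewarded. The chain geometry, energies, and sweep counts are invented for illustration and do not reproduce the paper's dynamical Monte Carlo model.

      # Sketch: Metropolis sampling of a two-state (coil vs. beta-sheet)
      # peptide chain (illustrative toy, not the paper's model).
      import numpy as np

      rng = np.random.default_rng(1)
      N = 200
      eps_conv = 2.0               # kT cost to convert coil -> sheet (assumed)
      eps_bond = 1.5               # kT gained per sheet-sheet neighbor (assumed)
      s = np.zeros(N, dtype=int)   # 0 = soluble coil, 1 = beta-sheet

      def delta_energy(i):
          """Energy change (in kT) of flipping the state of peptide i."""
          sheet_neighbors = s[(i - 1) % N] + s[(i + 1) % N]
          if s[i] == 0:            # coil -> sheet: pay conversion, gain bonds
              return eps_conv - eps_bond * sheet_neighbors
          return -eps_conv + eps_bond * sheet_neighbors

      for sweep in range(2000):
          for _ in range(N):
              i = int(rng.integers(N))
              dE = delta_energy(i)
              if dE <= 0 or rng.random() < np.exp(-dE):
                  s[i] ^= 1        # accept the flip

      print("sheet fraction:", s.mean())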

  11. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism.

    PubMed

    van Beek, Johannes H G M; Supandi, Farahaniza; Gavai, Anand K; de Graaf, Albert A; Binsl, Thomas W; Hettling, Hannes

    2011-11-13

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work, the rest is converted to heat. We calculated heat transfer by conduction and blood flow inside the body, and heat transfer from the skin by radiation, convection and sweat evaporation, resulting in temperature changes in 25 body compartments. We simulated a mountain time trial to Alpe d'Huez during the Tour de France. To approach the time realized by Lance Armstrong in 2004, very high oxygen uptake must be sustained by the simulated cyclist. Temperature was predicted to reach 39°C in the brain, and 39.7°C in leg muscle. In addition to the macroscopic simulation, we analysed the buffering of bursts of high adenosine triphosphate hydrolysis by creatine kinase during cyclical muscle activity at the biochemical pathway level. To investigate the low oxygen to carbohydrate ratio for the brain, which takes up lactate during exercise, we calculated the flux distribution in cerebral energy metabolism. Computational modelling of the human body, describing heat exchange and energy metabolism, makes simulation of endurance sports events feasible.

  12. Healthcare system simulation using Witness

    NASA Astrophysics Data System (ADS)

    Khakdaman, Masoud; Zeinahvazi, Milad; Zohoori, Bahareh; Nasiri, Fardokht; Yew Wong, Kuan

    2013-02-01

    Simulation techniques have a proven track record in the manufacturing industry as well as in other areas such as healthcare system improvement. In this study, a simulation model of a health center in Malaysia is developed using the WITNESS simulation software, which has shown its flexibility and capability in the manufacturing industry. The modelling procedure starts with process mapping and data collection and continues with model development, verification, validation and experimentation. Finally, the results and possible future improvements are presented.

  13. Parallelized event chain algorithm for dense hard sphere and polymer systems

    SciTech Connect

    Kampmann, Tobias A.; Boltz, Horst-Holger; Kierfeld, Jan

    2015-01-15

    We combine parallelization and cluster Monte Carlo for hard sphere systems and present a parallelized event chain algorithm for the hard disk system in two dimensions. For parallelization we use a spatial partitioning approach into simulation cells. We find that it is crucial for correctness to ensure detailed balance on the level of Monte Carlo sweeps by drawing the starting sphere of event chains within each simulation cell with replacement. We analyze the performance gains for the parallelized event chain and find a criterion for an optimal degree of parallelization. Because of the cluster nature of event chain moves, massive parallelization will not be optimal. Finally, we discuss first applications of the event chain algorithm to dense polymer systems, i.e., bundle-forming solutions of attractive semiflexible polymers.
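
    The core of an event-chain move is easiest to see in one dimension. The sketch below implements it for hard rods on a periodic line: the starting rod is drawn with replacement, each rod advances until contact, and the right neighbor then carries the remaining displacement (the lifting step). This is a serial analogue for illustration; the paper's spatial cell decomposition and parallel execution are not shown.

      # Sketch: event-chain Monte Carlo for hard rods on a periodic line
      # (serial 1D analogue; box size and chain length are illustrative).
      import numpy as np

      rng = np.random.default_rng(2)
      N, L, sigma = 50, 100.0, 1.0     # rods, periodic box length, rod length

      # A valid non-overlapping start: sorted random gaps plus rod offsets.
      x = np.sort(rng.random(N)) * (L - N * sigma) + np.arange(N) * sigma

      def event_chain(x, ell):
          """One chain of total displacement ell. The starting rod is drawn
          with replacement; each rod advances until contact, and its right
          neighbor then carries the remaining displacement (lifting)."""
          i = int(rng.integers(N))
          remaining = ell
          while remaining > 0.0:
              j = (i + 1) % N
              gap = (x[j] - x[i]) % L - sigma   # free space to right neighbor
              gap = max(gap, 0.0)               # guard against round-off
              step = min(gap, remaining)
              x[i] = (x[i] + step) % L
              remaining -= step
              i = j                             # lifting: transfer the move
          return x

      for _ in range(5000):
          x = event_chain(x, ell=2.0)

      xs = np.sort(x)
      gaps = np.diff(np.append(xs, xs[0] + L)) - sigma
      print("min gap (should stay >= 0):", gaps.min())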

  14. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines, and to their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  15. Simulation of an MCS event in South America using a radiative transfer model

    NASA Astrophysics Data System (ADS)

    Silveira, B. B.; Aravéquia, J. A.

    2011-12-01

    Mesoscale Convective Systems (MCS) play an important role in the total precipitation of some regions of the world. The southeast of South America is one of these regions, because its environment favors the development of MCSs. Satellite imagery is an important data source for the identification and characterization of these systems: in such images, MCSs are characterized by low values of brightness temperature (BT). One channel used to identify these systems is channel 4 (infrared) of the imager sensor on the GOES 10 satellite. With the objective of identifying an MCS in a 12 h atmospheric model forecast, a simulation of the BT in channel 4 of GOES 10 was performed using a radiative transfer model. The MCS event chosen occurred between 9 and 10 November 2008, and the system reached the north of Argentina and Paraguay; it was identified using the outputs of FORTACC (Forecast and Tracking of Active Convective Cells). The BT simulation was performed with the radiative transfer model CRTM version 2.0.2 (Community Radiative Transfer Model) from the JCSDA (Joint Center for Satellite Data Assimilation), using as input a 12-hour forecast from the ETA model, an operational model of CPTEC/INPE (Centro de Previsão de Tempo e Estudos Climáticos/Instituto Nacional de Pesquisas Espaciais) with 20x20 km horizontal resolution and 19 vertical levels. The simulation of BT values with CRTM indicates the region where the MCS occurred. However, the BT values are overestimated by the CRTM: the simulated values are quantitatively higher than those observed in channel 4 of GOES 10. The area with BT values related to the MCS is smaller than that observed in the satellite image, and the shape of the system was also not simulated satisfactorily.

  16. A multiprocessor operating system simulator

    SciTech Connect

    Johnston, G.M.; Campbell, R.H. (Dept. of Computer Science)

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the Choices family of operating systems for loosely and tightly coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  17. A Multiprocessor Operating System Simulator

    NASA Technical Reports Server (NTRS)

    Johnston, Gary M.; Campbell, Roy H.

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall semester of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the 'Choices' family of operating systems for loosely- and tightly-coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  18. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other, less-detailed models. The DES team continues to innovate and expand
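
    A minimal sketch of the pattern described above, combining an event queue with probabilistic task durations, is shown below. Three-point estimates of the kind a Delphi process yields are sampled from a triangular distribution; the task names and numbers are invented for illustration and are unrelated to actual KSC models.

      # Sketch: event-queue simulation with three-point (Delphi-style)
      # task duration estimates (invented tasks and numbers).
      import heapq
      import random

      random.seed(0)
      tasks = [                    # (name, optimistic, likely, pessimistic) days
          ("stack vehicle", 3, 5, 10),
          ("integrate payload", 2, 4, 9),
          ("pad rollout", 1, 2, 4),
      ]

      def campaign():
          """One campaign: each task completion is scheduled on the event
          list with a duration drawn from a triangular distribution."""
          clock, events = 0.0, []
          for name, lo, mode, hi in tasks:
              heapq.heappush(events,
                             (clock + random.triangular(lo, hi, mode), name))
              clock, _ = heapq.heappop(events)  # advance to the completion event
          return clock

      runs = sorted(campaign() for _ in range(10000))
      print("median days:", round(runs[5000], 1),
            "90th percentile:", round(runs[9000], 1))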

  19. GPS system simulation methodology

    NASA Technical Reports Server (NTRS)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  20. Discrete-event simulation of nuclear-waste transport in geologic sites subject to disruptive events. Final report

    SciTech Connect

    Aggarwal, S.; Ryland, S.; Peck, R.

    1980-06-19

    This report outlines a methodology to study the effects of disruptive events on nuclear waste material in stable geologic sites. The methodology is based upon developing a discrete event model that can be simulated on the computer. This methodology allows a natural development of simulation models that use computer resources in an efficient manner. Accurate modeling in this area depends in large part upon accurate modeling of ion transport behavior in the storage media. Unfortunately, developments in this area are not at a stage where there is any consensus on proper models for such transport. Consequently, our work is directed primarily towards showing how disruptive events can be properly incorporated in such a model, rather than towards providing a predictive tool at this stage. When and if proper geologic parameters can be determined, it would be possible to use this as a predictive model. Assumptions and their bases are discussed, and the mathematical and computer model are described.

  1. Joint modeling and simulation system

    NASA Astrophysics Data System (ADS)

    Boyer, Richard T.; McQuay, William K.

    1993-08-01

    The defense budget is shrinking. Weapon systems are getting more complex. Test requirements are increasing. The training and war gaming scenarios are getting more demanding as fielded systems and training simulators are integrated to support combined arms training. To cope with these requirements and still stay within the budget, the Department of Defense is relying on modeling and simulation. The state of the modeling and simulation (M&S) art has advanced to the point where a user can now create incredibly realistic, extremely detailed models which can augment test and evaluation, support the acquisition process, enhance training and war gaming, facilitate intelligence gathering, and support detailed engineering.

  2. Assessing mid-latitude dynamics in extreme event attribution systems

    NASA Astrophysics Data System (ADS)

    Mitchell, Daniel; Davini, Paolo; Harvey, Ben; Massey, Neil; Haustein, Karsten; Woollings, Tim; Jones, Richard; Otto, Fredi; Guillod, Benoit; Sparrow, Sarah; Wallom, David; Allen, Myles

    2016-08-01

    Atmospheric modes of variability relevant for extreme temperature and precipitation events are evaluated in models currently being used for extreme event attribution. A 100 member initial condition ensemble of the global circulation model HadAM3P is compared with both the multi-model ensemble from the Coupled Model Inter-comparison Project, Phase 5 (CMIP5) and the CMIP5 atmosphere-only counterparts (AMIP5). The use of HadAM3P allows for huge ensembles to be computed relatively fast, thereby providing unique insights into the dynamics of extremes. The analysis focuses on the mid-northern latitudes (primarily Europe) during winter, and is compared with ERA-Interim reanalysis. The tri-modal Atlantic eddy-driven jet distribution is remarkably well captured in HadAM3P, but not in the CMIP5 or AMIP5 multi-model mean, although individual models fare better. The well-known underestimation of blocking in the Atlantic region is apparent in CMIP5 and AMIP5, and also, to a lesser extent, in HadAM3P. Pacific blocking features are well reproduced in all modeling initiatives. Blocking duration is biased in all three modelling systems, which reproduce too many short-lived events. Associated storm tracks are too zonal over the Atlantic in the CMIP5 and AMIP5 ensembles, but better simulated in HadAM3P, with the exception of being too weak over Western Europe. In all cases, the CMIP5 and AMIP5 performances were almost identical, suggesting that the biases in atmospheric modes considered here are not strongly coupled to SSTs, and that perhaps other model characteristics such as resolution are more important. For event attribution studies, it is recommended that rather than taking statistics over the entire set of available CMIP5 or AMIP5 models, only models capable of producing the relevant dynamical phenomena be employed.

  3. Modeling extreme "Carrington-type" space weather events using three-dimensional global MHD simulations

    NASA Astrophysics Data System (ADS)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Kuznetsova, Maria M.; Glocer, Alex

    2014-06-01

    There is a growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure. In the last two decades, significant progress has been made toward the first-principles modeling of space weather events, and three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, thereby playing a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for modern global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a Dst footprint comparable to the Carrington superstorm of September 1859, based on the estimate by Tsurutani et al. (2003). Results are presented for a simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field on the ground to such extreme driving conditions. The model setup is further tested using input data for an observed space weather event, the Halloween storm of October 2003, to verify the MHD model's consistency and to draw additional guidance for future work. This extreme space weather MHD model setup is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in ground-based conductor systems such as power transmission grids. Therefore, our ultimate goal is to explore the level of geoelectric fields that can be induced from an assumed storm of the reported magnitude, i.e., Dst ≈ -1600 nT.

  4. AP1000 Design Basis Event Simulation at the APEX-1000 Test Facility

    SciTech Connect

    Wright, Richard F.; Groome, John

    2004-07-01

    The AP1000 is a 1000 MWe advanced nuclear power plant that uses passive safety features to enhance plant safety and to provide significant and measurable improvements in plant simplification, reliability, investment protection and plant costs. The AP1000 relies heavily on the 600 MWe AP600, which received design certification in 1999. A critical part of the AP600 design certification process involved the testing of the passive safety systems. A one-fourth height, one-fourth pressure test facility, APEX-600, was constructed at Oregon State University to study design basis events and to provide a body of data to be used to validate the computer models used to analyze the AP600. This facility was extensively modified to reflect the design changes for the AP1000, including higher power in the electrically heated rods representing the reactor core, and changes in the size of the pressurizer, core makeup tanks and automatic depressurization system. Several design basis events are being simulated at APEX-1000, including a double-ended direct vessel injection (DEDVI) line break and a 2-inch cold leg break. These tests show that the core remains covered with ample margin until gravity injection is established, regardless of the initiating event. The tests also show that liquid entrainment from the upper plenum, which is proportional to the reactor power, does not impact the ability of the passive core cooling system to keep the core covered. (authors)

  5. An integrated system for hydrological analysis of flood events

    NASA Astrophysics Data System (ADS)

    Katsafados, Petros; Chalkias, Christos; Karymbalis, Efthymios; Gaki-Papanastassiou, Kalliopi; Mavromatidis, Elias; Papadopoulos, Anastasios

    2010-05-01

    The significant increase of extreme flood events during recent decades has led to an urgent social and economic demand for improved prediction and sustainable prevention. Remedial actions require accurate estimation of the spatiotemporal variability of runoff volume and local peaks, which can be analyzed through integrated simulation tools. Such advanced modeling systems not only allow the investigation of the dynamics controlling the behavior of these complex processes but can also be used as early warning systems. Moreover, simulation is assumed to be the appropriate method for deriving quantitative estimates of various atmospheric and hydrologic parameters, especially where reliable and accurate measurements of precipitation and flow rates are absent. Such sophisticated techniques enable flood risk assessment and improve decision-making support for protection actions. This study presents an integrated system for the simulation of the essential atmospheric and soil parameters in the context of hydrological flood modeling. The system consists of two main cores: a numerical weather prediction model coupled with a geographical information system for the accurate simulation of groundwater advection and rainfall runoff estimation. Synoptic and mesoscale atmospheric motions are simulated with a non-hydrostatic limited area model on a very high resolution domain of integration. The model includes advanced schemes for the microphysics and the surface layer physics description as well as the longwave and shortwave radiation budget estimation. It is also fully coupled with a land-surface model in order to resolve the surface heat fluxes and to simulate the air-land energy exchange processes. Detailed atmospheric and soil parameters derived from the atmospheric model are used as input data for the GIS-based runoff modeling. Geographical information system (GIS) technology is used for further hydrological analysis and estimation of direct

  6. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    PubMed Central

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-01-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE, using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary, and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. PMID:26256624

  7. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model.

    PubMed

    Sanzari, Jenine K; Diffenderfer, Eric S; Hagan, Sarah; Billings, Paul C; Gridley, Daila S; Seykora, John T; Kennedy, Ann R; Cengel, Keith A

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE, using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary, and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed.

  8. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model.

    PubMed

    Sanzari, Jenine K; Diffenderfer, Eric S; Hagan, Sarah; Billings, Paul C; Gridley, Daila S; Seykora, John T; Kennedy, Ann R; Cengel, Keith A

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE, using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary, and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. PMID:26256624

  9. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    NASA Astrophysics Data System (ADS)

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE, using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary, and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed.

  10. Stochastic Event Counter for Discrete-Event Systems Under Unreliable Observations

    SciTech Connect

    Tae-Sic Yoo; Humberto E. Garcia

    2008-06-01

    This paper addresses the issues of counting the occurrence of special events in the framework of partially observed discrete-event dynamical systems (DEDS). First, we develop a novel recursive procedure that updates the active counter information state sequentially with available observations. In general, the cardinality of the active counter information state is unbounded, which makes the exact recursion computationally infeasible. To overcome this difficulty, we develop an approximated recursive procedure that regulates and bounds the size of the active counter information state. Using the approximated active counting information state, we give an approximated minimum mean square error (MMSE) counter. The developed algorithms are then applied to count special routing events in a material flow system.
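
    The recursion described above can be illustrated with a much-simplified Bayesian count filter: maintain a distribution over the number of occurrences of the special event, fold in each unreliable observation, and truncate the support to keep the recursion bounded. The probabilities below are invented, and the sketch omits the paper's general DEDS information-state machinery.

      # Sketch: truncated Bayesian filter over an event count observed through
      # an unreliable sensor (illustrative; MMSE = posterior mean).
      import numpy as np

      p_event = 0.3    # chance the special event occurs at each step (assumed)
      p_detect = 0.9   # sensor fires given the event occurred (assumed)
      p_false = 0.1    # sensor fires given no event (assumed)
      MAX_N = 200      # bound on the counter support: the approximation

      rng = np.random.default_rng(3)
      post = np.zeros(MAX_N + 1)
      post[0] = 1.0                    # start certain that the count is 0
      true_count = 0

      for step in range(100):
          occurred = rng.random() < p_event
          true_count += occurred
          fired = rng.random() < (p_detect if occurred else p_false)

          # Joint predict + update: the count increments with probability
          # p_event, weighted by the observation likelihood per hypothesis.
          lik_inc = p_detect if fired else 1.0 - p_detect   # event occurred
          lik_stay = p_false if fired else 1.0 - p_false    # no event
          new = (1.0 - p_event) * lik_stay * post
          new[1:] += p_event * lik_inc * post[:-1]
          post = new / new.sum()

      mmse = np.arange(MAX_N + 1) @ post
      print("true count:", true_count, " MMSE estimate:", round(float(mmse), 2))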

  11. The influence of spectral nudging in simulating Vb-events with COSMO-CLM

    NASA Astrophysics Data System (ADS)

    Paumann, Manuela; Anders, Ivonne; Hofstätter, Michael; Chimani, Barbara

    2015-04-01

    In previous studies certain European cyclones have been investigated in terms of related extreme precipitation events in Austria. Those systems passing the Mediterranean are of special interest, as the atmospheric moisture content is increased. It has been shown in recent investigations that state-of-the-art RCMs can approximately reproduce observed heavy precipitation characteristics. This provides a basic confidence in the models' ability to capture future changes of such events under increased greenhouse gas conditions as well. In this contribution we focus on high spatial and temporal scales and assess the currently achievable accuracy in the simulation of Vb-events. The state-of-the-art regional climate model CCLM is applied in hindcast mode to the case of individual Vb-events in August 2002 and May/June 2013. Besides the conventional forcing of the regional climate model at its lateral boundaries, a spectral nudging technique is applied. This means that inside the model area the regional model is forced to accept the analysis at large scales, whereas the nudging has no effect on the small scales. The simulations for the Vb-events mentioned above, covering the European domain, have been varied systematically by changing the nudging factor, the number of nudged waves, the nudged variables, and other parameters. The resulting precipitation amounts have been compared to the E-OBS gridded European precipitation data set and a recent high-spatial-resolution precipitation data set for Austria (GPARD-6). Varying the spectral nudging setup in the short-term Vb-cases helps us, on the one hand, to learn something about 3D processes during Vb-events (e.g., vorticity and formation) and, on the other hand, to identify the model deficiencies. The results show that increasing the number of nudged waves from 1 to 7, as well as the choice of the variables used in the nudging process, has a large influence on the development of the low pressure system and the related precipitation patterns. On the contrary, the nudging
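
    The essence of the spectral nudging technique varied in these experiments can be sketched in a few lines: transform the model field to spectral space, relax only the lowest wavenumbers toward the driving analysis, and transform back. The fields, nudging factor, and wave cutoff below are illustrative stand-ins, not the COSMO-CLM implementation.

      # Sketch: spectral nudging of a 2D field toward a driving analysis at
      # the lowest wavenumbers only (fields and parameters are stand-ins).
      import numpy as np

      rng = np.random.default_rng(6)
      ny, nx = 128, 128
      model = rng.normal(size=(ny, nx))      # regional model field (stand-in)
      analysis = rng.normal(size=(ny, nx))   # large-scale driving analysis

      def spectral_nudge(model, analysis, n_waves=7, alpha=0.05):
          """Relax only wavenumbers <= n_waves of `model` toward `analysis`."""
          fm = np.fft.fft2(model)
          fa = np.fft.fft2(analysis)
          kx = np.abs(np.fft.fftfreq(nx) * nx)   # integer wavenumbers
          ky = np.abs(np.fft.fftfreq(ny) * ny)
          mask = (ky[:, None] <= n_waves) & (kx[None, :] <= n_waves)
          fm[mask] += alpha * (fa[mask] - fm[mask])   # nudge large scales only
          return np.real(np.fft.ifft2(fm))

      model = spectral_nudge(model, analysis)   # applied once per time step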

  12. MCNP6. Simulating Correlated Data in Fission Events

    SciTech Connect

    Rising, Michael Evan; Sood, Avneet

    2015-12-03

    This report is a series of slides discussing the MCNP6 code and its status in simulating fission. Applications of interest include global security and nuclear nonproliferation, detection of special nuclear material (SNM), passive and active interrogation techniques, and coincident neutron and photon leakage.

  13. Repetition-Related Reductions in Neural Activity during Emotional Simulations of Future Events

    PubMed Central

    2015-01-01

    Simulations of future experiences are often emotionally arousing, and the tendency to repeatedly simulate negative future outcomes has been identified as a predictor of the onset of symptoms of anxiety. Nonetheless, next to nothing is known about how the healthy human brain processes repeated simulations of emotional future events. In this study, we present a paradigm that can be used to study repeated simulations of the emotional future in a manner that overcomes phenomenological confounds between positive and negative events. The results show that pulvinar nucleus and orbitofrontal cortex respectively demonstrate selective reductions in neural activity in response to frequently as compared to infrequently repeated simulations of negative and positive future events. Implications for research on repeated simulations of the emotional future in both non-clinical and clinical populations are discussed. PMID:26390294

  14. Systems Engineering Simulator (SES) Simulator Planning Guide

    NASA Technical Reports Server (NTRS)

    McFarlane, Michael

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the SES. The Simulator Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  15. Mesoscale Simulations of a Wind Ramping Event for Wind Energy Prediction

    SciTech Connect

    Rhodes, M; Lundquist, J K

    2011-09-21

    Ramping events, or rapid changes of wind speed and wind direction over a short period of time, present challenges to power grid operators in regions with significant penetrations of wind energy in the power grid portfolio. Improved predictions of wind power availability require adequate predictions of the timing of ramping events. For the ramping event investigated here, the Weather Research and Forecasting (WRF) model was run at three horizontal resolutions in 'mesoscale' mode: 8100m, 2700m, and 900m. Two Planetary Boundary Layer (PBL) schemes, the Yonsei University (YSU) and Mellor-Yamada-Janjic (MYJ) schemes, were run at each resolution as well. Simulations were not 'tuned' with nuanced choices of vertical resolution or tuning parameters so that these simulations may be considered 'out-of-the-box' tests of a numerical weather prediction code. Simulations are compared with sodar observations during a wind ramping event at a 'West Coast North America' wind farm. Despite differences in the boundary-layer schemes, no significant differences were observed in the abilities of the schemes to capture the timing of the ramping event. As collaborators have identified, the boundary conditions of these simulations probably dominate the physics of the simulations. They suggest that future investigations into characterization of ramping events employ ensembles of simulations, and that the ensembles include variations of boundary conditions. Furthermore, the failure of these simulations to capture not only the timing of the ramping event but the shape of the wind profile during the ramping event (regardless of its timing) indicates that the set-up and execution of such simulations for wind power forecasting requires skill and tuning of the simulations for a specific site.

  16. LCG MCDB—a knowledgebase of Monte-Carlo simulated events

    NASA Astrophysics Data System (ADS)

    Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.

    2008-02-01

    In this paper we report on the LCG Monte-Carlo Data Base (MCDB) and the software which has been developed to operate MCDB. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC Collaborations by experts. In many cases, the modern Monte-Carlo simulation of physical processes requires expert knowledge of Monte-Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make the sophisticated MC event samples available for various physics groups. All the data from MCDB is accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.

    Program summary
    Program title: LCG Monte-Carlo Data Base
    Catalogue identifier: ADZX_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public Licence
    No. of lines in distributed program, including test data, etc.: 30 129
    No. of bytes in distributed program, including test data, etc.: 216 943
    Distribution format: tar.gz
    Programming language: Perl
    Computer: CPU: Intel Pentium 4, RAM: 1 Gb, HDD: 100 Gb
    Operating system: Scientific Linux CERN 3/4
    RAM: 1 073 741 824 bytes (1 Gb)
    Classification: 9
    External routines: perl >= 5.8.5; Perl modules DBD-mysql >= 2.9004, File::Basename, GD::SecurityImage, GD::SecurityImage::AC, Linux::Statistics, XML::LibXML > 1.6, XML::SAX, XML::NamespaceSupport; Apache HTTP Server >= 2.0.59; mod auth external >= 2.2.9; edg-utils-system RPM package; gd >= 2.0.28; rpm package CASTOR-client >= 2.1.2-4; arc-server (optional)
    Nature of problem: Often, different groups of experimentalists prepare similar samples of particle collision events or turn to the same group of authors of Monte-Carlo (MC

  17. Argonne simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Canfield, T.; Brown-VanHoozer, A.; Tentner, A.

    1996-04-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that include human-factors studies to support safety and operational research. Realistic modeling of variations of the posted driving speed is based on human factors studies that take into consideration weather, road conditions, the driver's personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  18. Simulation of LHC events on a million threads

    NASA Astrophysics Data System (ADS)

    Childers, J. T.; Uram, T. D.; LeCompte, T. J.; Papka, M. E.; Benjamin, D. P.

    2015-12-01

    Demand for Grid resources is expected to double during LHC Run II as compared to Run I; the capacity of the Grid, however, will not double. The HEP community must consider how to bridge this computing gap by targeting larger compute resources and using the available compute resources as efficiently as possible. Argonne's Mira, the fifth fastest supercomputer in the world, can run roughly five times the number of parallel processes that the ATLAS experiment typically uses on the Grid. We ported Alpgen, a serial x86 code, to run as a parallel application under MPI on the Blue Gene/Q architecture. By analysis of the Alpgen code, we reduced the memory footprint to allow running 64 threads per node, utilizing the four hardware threads available per core on the PowerPC A2 processor. Event generation and unweighting, typically run as independent serial phases, are coupled together in a single job in this scenario, reducing intermediate writes to the filesystem. By these optimizations, we have successfully run LHC proton-proton physics event generation at the scale of a million threads, filling two-thirds of Mira.
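
    The scaling pattern described above, independent event-generation streams per worker with no coordination beyond startup, can be sketched generically with mpi4py as below. This is the pattern only; the actual port is Alpgen (Fortran) under MPI on the Blue Gene/Q, and the file naming and toy unweighting cut here are invented.

      # Sketch: rank-parallel event generation with mpi4py (generic pattern
      # only; the actual port is Alpgen, a Fortran code, under MPI).
      from mpi4py import MPI
      import random

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      rng = random.Random(12345 + rank)     # decorrelate streams by rank
      n_trials = 1000                       # per-rank workload (illustrative)

      with open(f"events.{rank:06d}.txt", "w") as out:
          for i in range(n_trials):
              weight = rng.random()         # stand-in for generating one event
              if weight > 0.5:              # toy unweighting cut, in-process,
                  out.write(f"{i} {weight:.6f}\n")   # no intermediate files

      comm.Barrier()
      if rank == 0:
          print(f"{size} ranks wrote their event files")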

  19. Analysis of Extreme Events in Regional Climate Model Simulations for the Pacific Northwest using weatherathome

    NASA Astrophysics Data System (ADS)

    Mera, R. J.; Mote, P.; Weber, J.

    2011-12-01

    One of the most prominent impacts of climate change over the Pacific Northwest is the potential for an elevated number of extreme precipitation events over the region. Planning for natural hazards, such as an increasing number of floods related to high-precipitation events, has in general focused on avoiding development in floodplains and conditioning development to withstand inundation with a minimum of losses. Nationwide, the Federal Emergency Management Agency (FEMA) estimates that about one quarter of its payments cover damage that has occurred outside mapped floodplains. It is clear that traditional flood-based planning will not be sufficient to predict and avoid future losses resulting from climate-related hazards such as high-precipitation events. In order to address this problem, the present study employs regional climate model output for future climate change scenarios to aid with the development of a map-based inventory of future hazard risks that can contribute to the development of a "planning-scale" decision support system for the Oregon Department of Land Conservation and Development (DLCD). Climate model output is derived from the climateprediction.net (CPDN) weatherathome project, an innovative climate science experiment that utilizes volunteer computers from users worldwide to produce hundreds of thousands of regional climate simulations (superensembles) of the western United States climate from 1950 to 2050. The spatial and temporal distributions of extreme weather events are analyzed for the Pacific Northwest to diagnose the model's capabilities as an input for map products such as impacts on hydrology. Special attention is given to the intensity and frequency of Atmospheric River events in historical and future climate contexts.

  20. Using Wavelet Analysis To Assist in Identification of Significant Events in Molecular Dynamics Simulations.

    PubMed

    Heidari, Zahra; Roe, Daniel R; Galindo-Murillo, Rodrigo; Ghasemi, Jahan B; Cheatham, Thomas E

    2016-07-25

    Long time scale molecular dynamics (MD) simulations of biological systems are becoming increasingly commonplace due to the availability of both large-scale computational resources and significant advances in the underlying simulation methodologies. Therefore, it is useful to investigate and develop data mining and analysis techniques to quickly and efficiently extract the biologically relevant information from the incredible amount of generated data. Wavelet analysis (WA) is a technique that can quickly reveal significant motions during an MD simulation. Here, the application of WA on well-converged long time scale (tens of μs) simulations of a DNA helix is described. We show how WA combined with a simple clustering method can be used to identify both the physical and temporal locations of events with significant motion in MD trajectories. We also show that WA can not only distinguish and quantify the locations and time scales of significant motions, but by changing the maximum time scale of WA a more complete characterization of these motions can be obtained. This allows motions of different time scales to be identified or ignored as desired. PMID:27286268
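
    The sketch below shows the kind of analysis described above on a synthetic stand-in signal: a continuous wavelet transform (here with PyWavelets) localizes a transient event in both time and time scale. The signal, scales, and wavelet choice are illustrative, not the paper's trajectory data or protocol.

      # Sketch: continuous wavelet transform of a synthetic trajectory series
      # (PyWavelets; the signal stands in for e.g. an RMSD time series).
      import numpy as np
      import pywt

      t = np.linspace(0.0, 10.0, 2000)          # microseconds (illustrative)
      signal = np.sin(2 * np.pi * 5 * t)        # fast baseline motion
      signal[1000:1200] += 2.0                  # transient "event" near t = 5
      signal += 0.3 * np.random.default_rng(4).normal(size=t.size)

      scales = np.arange(1, 128)
      coeffs, freqs = pywt.cwt(signal, scales, "morl",
                               sampling_period=t[1] - t[0])

      # A large |coefficient| at a (scale, time) cell flags significant motion
      # there; clustering such cells localizes events in time and time scale.
      power = np.abs(coeffs)
      s_idx, t_idx = np.unravel_index(power.argmax(), power.shape)
      print(f"strongest motion near t = {t[t_idx]:.2f} at scale {scales[s_idx]}")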

  1. Real time digital propulsion system simulation for manned flight simulators

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.; Hart, C. E.

    1978-01-01

    A real time digital simulation of a STOL propulsion system was developed which generates significant dynamics and internal variables needed to evaluate system performance and aircraft interactions using manned flight simulators. The simulation ran at a real-to-execution time ratio of 8.8. The model was used in a piloted NASA flight simulator program to evaluate the simulation technique and the propulsion system digital control. The simulation is described and results shown. Limited results of the flight simulation program are also presented.

  2. Power system extreme event screening using graph partitioning

    SciTech Connect

    Lesieutre, Bernard C.; Roy, Sandip; Donde, Vaibhav; Pinar, Ali

    2006-09-06

    We propose a partitioning problem in a power system context that weighs the two objectives of minimizing cuts between partitions and maximizing the power imbalance between partitions. We then pose the problem in a purely graph theoretic sense. We offer an approximate solution through relaxation of the integer problem and suggest refinement using stochastic methods. Results are presented for the IEEE 30-bus and 118-bus electric power systems.
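
    A standard instance of the relaxation idea mentioned above is spectral partitioning: relax the integer cut problem and split vertices by the sign of the Fiedler vector (the second eigenvector of the graph Laplacian). The small test graph below is invented; the paper's formulation additionally weighs power imbalance, which this sketch omits.

      # Sketch: spectral (Fiedler-vector) partition of a small invented graph.
      import numpy as np

      edges = [(0, 1), (1, 2), (2, 0),    # cluster one
               (3, 4), (4, 5), (5, 3),    # cluster two
               (2, 3)]                    # single weak tie between them
      n = 6
      A = np.zeros((n, n))
      for i, j in edges:
          A[i, j] = A[j, i] = 1.0
      Lap = np.diag(A.sum(axis=1)) - A    # graph Laplacian

      vals, vecs = np.linalg.eigh(Lap)    # eigenvalues in ascending order
      fiedler = vecs[:, 1]                # second eigenvector = Fiedler vector
      side_a = np.where(fiedler > 0)[0]   # split vertices by sign
      side_b = np.where(fiedler <= 0)[0]
      print("partition:", side_a, "|", side_b)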

  3. An event generator for simulations of complex β-decay experiments

    NASA Astrophysics Data System (ADS)

    Jordan, D.; Algora, A.; Tain, J. L.

    2016-08-01

    This article describes a Monte Carlo event generator for the design, optimization and performance characterization of beta decay spectroscopy experimental set-ups. The event generator has been developed within the Geant4 simulation architecture and provides new features and greater flexibility in comparison with the currently available decay generator.

  4. Calculation of 239Pu fission observables in an event-by-event simulation

    SciTech Connect

    Vogt, R; Randrup, J; Pruet, J; Younes, W

    2010-03-31

    The increased interest in more exclusive fission observables has demanded more detailed models. We describe a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including any interesting correlations. The various model assumptions are described and the potential utility of the model is illustrated. As a concrete example, we use formal statistical methods, experimental data on neutron production in neutron-induced fission of 239Pu, along with FREYA, to develop quantitative insights into the relation between reaction observables and detailed microscopic aspects of fission. Current measurements of the mean number of prompt neutrons emitted in fission, taken together with less accurate current measurements of the prompt post-fission neutron energy spectrum, up to the threshold for multi-chance fission, place remarkably fine constraints on microscopic theories.
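
    The value of complete-event sampling described above can be illustrated with a toy generator: draw whole "fission events" and then extract any observable, including correlations, from the same sample. The distributions below are invented stand-ins and carry none of FREYA's physics.

      # Sketch: toy complete-event fission generator; any observable, including
      # correlations, can be extracted from the same event sample.
      import numpy as np

      rng = np.random.default_rng(5)

      def fission_event():
          E_exc = rng.gamma(shape=4.0, scale=5.0)   # fragment excitation (MeV)
          nu = rng.poisson(E_exc / 7.0)             # prompt neutron multiplicity
          E_n = rng.exponential(1.3, size=nu)       # neutron energies (MeV)
          return nu, E_n

      events = [fission_event() for _ in range(100000)]
      nus = np.array([nu for nu, _ in events])
      print("mean multiplicity:", nus.mean())

      # Correlation example from the very same sample: mean neutron energy
      # conditioned on multiplicity.
      for k in range(1, 5):
          means = [E_n.mean() for nu, E_n in events if nu == k]
          if means:
              print(f"nu = {k}: <E_n> = {np.mean(means):.2f} MeV")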

  5. Simulation of a precipitation event in the western United States

    SciTech Connect

    Kim, Jinwon; Soong, S.T.

    1993-09-01

    Wintertime precipitation is the major water resource in the western United States. Thus, correct assessment of wintertime precipitation is important for planning the summertime water supply and the development of future water resources. Precipitation forecasts are also important for flood early warning. Regional precipitation depends chiefly on the large scale flow and local topography. The large scale moisture influx determines the amount of moisture available for precipitation. Topography affects two major factors in local precipitation: low-level vertical motion and local water vapor transport. Topography also affects the partitioning of total precipitation into rain and snow, since the snow line is usually lower than the peaks of major mountain ranges during wintertime. Modeling the features of regional precipitation has been intensely studied. Previous studies show that limited area models nested within large scale models or analyses can capture realistic features of storms. As limited area models usually have much finer spatial resolutions than large scale models, parameterized precipitation processes developed for large scale models may not be adequate for limited area models. We present a simulation of twelve days of precipitation over California from Feb. 11 to Feb. 23, 1986. We focus on features of precipitation such as the local distribution, the total amount, and the occurrence of snowfall and rainfall. The simulation is carried out using a primitive-equation limited-area model.

  6. Rare event molecular dynamics simulations of plasma induced surface ablation

    SciTech Connect

    Sharia, Onise; Holzgrafe, Jeffrey; Park, Nayoung; Henkelman, Graeme

    2014-08-21

    The interaction of thermal Ar plasma particles with Si and W surfaces is modeled using classical molecular dynamics (MD) simulations. At plasma energies above the threshold for ablation, the ablation yield can be calculated directly from MD. For plasma energies below threshold, the ablation yield becomes exponentially low, and direct MD simulations are inefficient. Instead, we propose an integration method where the yield is calculated as a function of the Ar incident kinetic energy. Subsequent integration with a Boltzmann distribution at the temperature of interest gives the thermal ablation yield. At low plasma temperatures, the ablation yield follows an Arrhenius form in which the activation energy is shown to be the threshold energy for ablation. Interestingly, equilibrium material properties, including the surface and bulk cohesive energy, are not good predictors of the threshold energy for ablation. The surface vacancy formation energy is better, but is still not a quantitative predictor. An analysis of the trajectories near threshold shows that ablation occurs by different mechanisms on different material surfaces, and both the mechanism and the binding of surface atoms determine the threshold energy.
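
    The integration method described above can be sketched numerically: take a yield curve measured by direct MD above threshold, weight it by the Boltzmann energy distribution, and integrate. The linear-above-threshold yield model and all parameters below are assumptions for illustration.

      # Sketch: thermal ablation yield by integrating an assumed yield curve
      # against the Boltzmann kinetic-energy distribution (trapezoid rule).
      import numpy as np

      E_th = 20.0                       # eV, ablation threshold (assumed)

      def yield_curve(E):
          """Assumed linear-above-threshold sputter yield per particle."""
          return np.where(E > E_th, 0.01 * (E - E_th), 0.0)

      def boltzmann_energy_pdf(E, kT):
          """Maxwell-Boltzmann kinetic-energy distribution, normalized."""
          return 2.0 * np.sqrt(E / np.pi) * kT ** -1.5 * np.exp(-E / kT)

      E = np.linspace(0.0, 200.0, 20001)
      for kT in (1.0, 2.0, 4.0):        # eV
          f = yield_curve(E) * boltzmann_energy_pdf(E, kT)
          Y = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(E))
          print(f"kT = {kT:.1f} eV: thermal yield = {Y:.3e}")
      # At low kT the yield falls off roughly as exp(-E_th / kT): the Arrhenius
      # form with the threshold energy as activation energy, as noted above.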

  7. Rare event molecular dynamics simulations of plasma induced surface ablation.

    PubMed

    Sharia, Onise; Holzgrafe, Jeffrey; Park, Nayoung; Henkelman, Graeme

    2014-08-21

    The interaction of thermal Ar plasma particles with Si and W surfaces is modeled using classical molecular dynamics (MD) simulations. At plasma energies above the threshold for ablation, the ablation yield can be calculated directly from MD. For plasma energies below threshold, the ablation yield becomes exponentially low, and direct MD simulations are inefficient. Instead, we propose an integration method where the yield is calculated as a function of the Ar incident kinetic energy. Subsequent integration with a Boltzmann distribution at the temperature of interest gives the thermal ablation yield. At low plasma temperatures, the ablation yield follows an Arrhenius form in which the activation energy is shown to be the threshold energy for ablation. Interestingly, equilibrium material properties, including the surface and bulk cohesive energy, are not good predictors of the threshold energy for ablation. The surface vacancy formation energy is better, but is still not a quantitative predictor. An analysis of the trajectories near threshold shows that ablation occurs by different mechanisms on different material surfaces, and both the mechanism and the binding of surface atoms determine the threshold energy. PMID:25149805

  8. Assessment of extreme precipitation events over Amazon simulated by global climate models from HIGEM family

    NASA Astrophysics Data System (ADS)

    Custodio, M. D. S.; Ambrizzi, T.; Da Rocha, R.

    2015-12-01

    The increased horizontal resolution of climate models aims to improve the accuracy of simulations and the understanding of the non-linear processes during interactions between different spatial scales within the climate system. Until now, these interactions have not been well represented in low-horizontal-resolution GCMs. Variations of extreme climatic events have been described and analyzed in the scientific literature. In a global warming scenario it is necessary to understand and explain extreme events and to know whether global models can represent these events. The purpose of this study was to understand the impact of horizontal resolution in the high resolution coupled and atmospheric global models of the HiGEM project in simulating atmospheric patterns and processes of interaction between spatial scales, and to evaluate the performance of the coupled and uncoupled versions of the High-Resolution Global Environmental Model in capturing the signal of interannual and intraseasonal variability of precipitation over the Amazon region. The results indicated that the grid refinement and ocean-atmosphere coupling contribute to a better representation of seasonal patterns, of both precipitation and temperature, in the Amazon region. Moreover, the climate models analyzed represent the climatic characteristics of this region better than other (regional and global) models, which indicates a breakthrough in the development of high resolution climate models. Both the coupled and uncoupled models capture the observed signal of the ENSO and MJO oscillations, although with reversed phase in some cases. The interannual variability analysis showed that the coupled simulations intensify the impact of ENSO in the Amazon. On the intraseasonal scale, although the simulations intensify this signal, the coupled models present larger similarities with observations than the atmospheric models for the extremes of precipitation. The simulation of ENSO in GCMs can be attributed to their high

  9. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again," as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Major projects everywhere present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, facing known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  10. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.

  11. Performance and system flexibility of the CDF Hardware Event Builder

    SciTech Connect

    Shaw, T.M.; Schurecht, K.; Sinervo, P.

    1991-11-01

    The CDF Hardware Event Builder [1] is a flexible system built from a combination of three different 68020-based single-width Fastbus modules. The system may contain as few as three boards or as many as fifteen, depending on the specific application. Functionally, the boards receive a command to read out the raw event data from a set of Fastbus-based data buffers ("scanners"), reformat the data, and then write the data to a Level 3 trigger/processing farm, which decides whether to discard the event or write it to tape. The data acquisition system at CDF will utilize two nine-board systems, allowing an event rate of up to 35 Hz into the Level 3 trigger. This paper presents detailed performance factors, system and individual board architecture, and possible system configurations.

  13. StratBAM: A Discrete-Event Simulation Model to Support Strategic Hospital Bed Capacity Decisions.

    PubMed

    Devapriya, Priyantha; Strömblad, Christopher T B; Bailey, Matthew D; Frazier, Seth; Bulger, John; Kemberling, Sharon T; Wood, Kenneth E

    2015-10-01

    The ability to accurately measure and assess current and potential health care system capacities is an issue of local and national significance. Recent joint statements by the Institute of Medicine and the Agency for Healthcare Research and Quality have emphasized the need to apply industrial and systems engineering principles to improving health care quality and patient safety outcomes. To address this need, a decision support tool was developed for planning and budgeting of current and future bed capacity, and evaluating potential process improvement efforts. The Strategic Bed Analysis Model (StratBAM) is a discrete-event simulation model created after a thorough analysis of patient flow and data from Geisinger Health System's (GHS) electronic health records. Key inputs include: timing, quantity and category of patient arrivals and discharges; unit-level length of care; patient paths; and projected patient volume and length of stay. Key outputs include: admission wait time by arrival source and receiving unit, and occupancy rates. Electronic health records were used to estimate parameters for probability distributions and to build empirical distributions for unit-level length of care and for patient paths. Validation of the simulation model against GHS operational data confirmed its ability to model real-world data consistently and accurately. StratBAM was successfully used to evaluate the system impact of forecasted patient volumes and length of stay in terms of patient wait times, occupancy rates, and cost. The model is generalizable and can be appropriately scaled for larger and smaller health care settings. PMID:26310949
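    To give a flavor of this class of model, here is a minimal bed-capacity discrete-event simulation written with the open-source simpy library. The bed count, Poisson arrival rate, and exponential length-of-stay below are placeholder values for illustration, not Geisinger parameters; the actual StratBAM model uses the empirical distributions described above.

```python
import random
import simpy

BEDS, ARRIVAL_MEAN_H, LOS_MEAN_H = 30, 3.0, 72.0  # placeholder parameters
waits = []

def patient(env, beds):
    arrived = env.now
    with beds.request() as req:                   # queue for a free bed
        yield req
        waits.append(env.now - arrived)           # admission wait time (hours)
        yield env.timeout(random.expovariate(1.0 / LOS_MEAN_H))  # length of stay

def arrivals(env, beds):
    while True:                                   # Poisson arrival stream
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN_H))
        env.process(patient(env, beds))

env = simpy.Environment()
beds = simpy.Resource(env, capacity=BEDS)
env.process(arrivals(env, beds))
env.run(until=24 * 90)                            # simulate 90 days
print(f"{len(waits)} admissions, mean wait {sum(waits) / len(waits):.2f} h")
```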

  15. Simulations of Solar AO Systems

    NASA Astrophysics Data System (ADS)

    Sridharan, R.; Bayanna, A. Raja; Venkatakrishnan, P.

    In this paper, we first use simulations to compare the two kinds of algorithms used in solar AO systems to sense a distorted wavefront. We then comment on various issues related to solar AO systems and describe solar features that can be studied using AO as a tool. Finally, we briefly describe the laboratory AO model being built at the Udaipur Solar Observatory (USO), India.

  16. Experience producing simulated events for the DZero experiment on the SAM-Grid

    SciTech Connect

    Garzoglio, G.; Terekhov, I.; Snow, J.; Jain, S.; Nishandar, A.; /Texas U., Arlington

    2004-12-01

    Most of the simulated events for the DZero experiment at Fermilab have historically been produced by the "remote" collaborating institutions. One of the principal challenges reported concerns the maintenance of the local software infrastructure, which is generally different from site to site. As the distributed computing community's understanding of distributively owned and shared resources progresses, the adoption of grid technologies to address the production of Monte Carlo events for high energy physics experiments becomes increasingly interesting. SAM-Grid is a software system developed at Fermilab which integrates standard grid technologies for job and information management with SAM, the data handling system of the DZero and CDF experiments. During the past few months, this grid system has been tailored for the Monte Carlo production of DZero. Since the initial phase of deployment, this experience has exposed an interesting series of requirements for the SAM-Grid services, the standard middleware, the resources and their management, and the analysis framework of the experiment. As of today, the inefficiency due to the grid infrastructure has been reduced to as little as 1%. In this paper, we present our statistics and the "lessons learned" in running large high energy physics applications on a grid infrastructure.

  17. Aided targeting system simulation evaluation

    NASA Technical Reports Server (NTRS)

    Demaio, Joe; Becker, Curtis

    1994-01-01

    Simulation research was conducted at the Crew Station Research and Development Facility on the effectiveness and ease of use of three targeting systems. A manual system required the aviator to scan a target array area with a simulated second generation forward looking infrared (FLIR) sensor, locate and categorize targets, and construct a target hand-off list. The interface between the aviator and the system was like that of an advanced scout helicopter (manual mode). Two aided systems detected and categorized targets automatically. One system used only the FLIR sensor and the second used FLIR fused with Longbow radar. The interface for both was like that of an advanced scout helicopter aided mode. Exposure time while performing the task was reduced substantially with the aided systems, with no loss of target hand-off list accuracy. The fused sensor system showed lower time to construct the target hand-off list and a slightly lower false alarm rate than the other systems. A number of issues regarding system sensitivity and criterion, and operator interface design are discussed.

  18. System for detection of hazardous events

    DOEpatents

    Kulesz, James J.; Worley, Brian A.

    2006-05-23

    A system for detecting the occurrence of anomalies includes a plurality of spaced-apart nodes, with each node having adjacent nodes, one or more sensors associated with the node and capable of detecting anomalies, and a controller connected to those sensors. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  19. System For Detection Of Hazardous Events

    DOEpatents

    Kulesz, James J. [Oak Ridge, TN]; Worley, Brian A. [Knoxville, TN]

    2005-08-16

    A system for detecting the occurrence of anomalies includes a plurality of spaced-apart nodes, with each node having adjacent nodes, one or more sensors associated with the node and capable of detecting anomalies, and a controller connected to those sensors. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  20. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  1. An intelligent simulation training system

    NASA Technical Reports Server (NTRS)

    Biegel, John E.

    1990-01-01

    The Department of Industrial Engineering at the University of Central Florida, Embry-Riddle Aeronautical University and General Electric (SCSD) have been funded by the State of Florida to build an Intelligent Simulation Training System. The objective was and is to make the system generic except for the domain expertise. Researchers accomplished this objective in their prototype. The system is modularized and therefore it is easy to make any corrections, expansions or adaptations. The funding by the state of Florida has exceeded $3 million over the past three years and through the 1990 fiscal year. UCF has expended in excess of 15 work years on the project. The project effort has been broken into three major tasks. General Electric provides the simulation. Embry-Riddle Aeronautical University provides the domain expertise. The University of Central Florida has constructed the generic part of the system which is comprised of several modules that perform the tutoring, evaluation, communication, status, etc. The generic parts of the Intelligent Simulation Training Systems (ISTS) are described.

  2. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    NASA Astrophysics Data System (ADS)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in the event-based systems can lower system efficiency both in terms of data exchange rate and performance. In this paper, an integral-based event triggering control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise and effectively reduces the flow of communication between plant and controller, and also improves output performance. Using a Lyapunov approach, stability in the mean square sense is proved. A simulated example illustrates the properties of our approach.
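    The flavor of an integral-based trigger can be conveyed in a few lines. The sketch below is not the paper's mechanism or proof apparatus: it simulates a hypothetical discrete-time plant with an illustrative stabilizing gain, and transmits the state to the controller only when the running integral of the (noisy) measurement error crosses a threshold, so isolated noise spikes do not force an update.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[1.0, 0.10], [0.0, 0.95]])  # hypothetical discrete-time plant
B = np.array([[0.0], [0.1]])
K = np.array([1.2, 2.5])                  # illustrative stabilizing feedback gain

x = np.array([1.0, 0.0])                  # plant state
x_sent = x.copy()                         # last state transmitted to the controller
integral, THRESHOLD, DT = 0.0, 0.05, 0.1
updates = 0

for _ in range(500):
    e = (x + 0.01 * rng.standard_normal(2)) - x_sent  # noisy measurement error
    integral += DT * float(e @ e)                     # integral-type trigger variable
    if integral > THRESHOLD:                          # event: transmit and reset
        x_sent, integral, updates = x.copy(), 0.0, updates + 1
    u = -K @ x_sent                                   # control from last sent state
    x = A @ x + B.ravel() * u
print(f"{updates} transmissions in 500 steps, final state {np.round(x, 4)}")
```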

  3. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    NASA Technical Reports Server (NTRS)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

    On February 1st, 2010, U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes for the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period, an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty regarding the Constellation program presented a major challenge to the DES team: continue the development of this program-of-record simulation while at the same time remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make it flexible enough to support possible future vehicles while at the same time being specific enough to support the program-of-record. The challenge was compounded by the fact that the model was being developed through the traditional DES process-orientation, which lacks the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure by identifying what was generic, finding natural logic break points, and standardizing the interlogic numbering system. The outcome of this work was a model that not only was ready to be easily modified to support any future rocket programs, but also was extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides builders of traditional process-oriented discrete

  4. The Impact of Land Cover Change on a Simulated Storm Event in the Sydney Basin

    NASA Astrophysics Data System (ADS)

    Gero, A. F.; Pitman, A. J.

    2006-02-01

    The Regional Atmospheric Modeling System (RAMS) was run at a 1-km grid spacing over the Sydney basin in Australia to assess the impact of land cover change on a simulated storm event. The storm was simulated using NCEP-NCAR reanalysis data, first with natural (i.e., pre-European settlement in 1788) land cover and then with satellite-derived land cover representing Sydney's current land use pattern. An intense convective storm develops in the model in close proximity to Sydney's dense urban central business district under current land cover. The storm is absent under natural land cover conditions. A detailed investigation of why the change in land cover generates a storm was performed using factorial analysis, which revealed the storm to be sensitive to the presence of agricultural land in the southwest of the domain. This area interacts with the sea breeze and affects the horizontal divergence and moisture convergence—the triggering mechanisms of the storm. The existence of the storm over the dense urban area of Sydney is therefore coincidental. The results herein support efforts to develop parameterization of urban surfaces in high-resolution simulations of Sydney's meteorological environment but also highlight the need to improve the parameterization of other types of land cover change at the periphery of the urban area, given that these types dominate the explanation of the results.

  5. Cascading events in linked ecological and socioeconomic systems

    USGS Publications Warehouse

    Peters, Debra P. C.; Sala, O.E.; Allen, C.D.; Covich, A.; Brunson, M.

    2007-01-01

    Cascading events that start at small spatial scales and propagate non-linearly through time to influence larger areas often have major impacts on ecosystem goods and services. Events such as wildfires and hurricanes are increasing in frequency and magnitude as systems become more connected through globalization processes. We need to improve our understanding of these events in order to predict their occurrence, minimize potential impacts, and allow for strategic recovery. Here, we synthesize information about cascading events in systems located throughout the Americas. We discuss a variety of examples of cascading events that share a common feature: they are often driven by linked ecological and human processes across scales. In this era of globalization, we recommend studies that explicitly examine connections across scales and examine the role of connectivity among non-contiguous as well as contiguous areas. © The Ecological Society of America.

  6. A Distributed Processing and Analysis System for Heliophysic Events

    NASA Astrophysics Data System (ADS)

    Hurlburt, N.; Cheung, M.; Bose, P.

    2008-12-01

    With several Virtual Observatories now under active development, the time is ripe to consider how they will interact to enable integrated studies that span the full range of Heliophysics. We present a solution that builds upon components of the Heliophysics Event Knowledgebase (HEK) being developed for the Solar Dynamics Observatory and the Heliophysics Event List Manager (HELMS), recently selected as part of the NASA VxO program. A Heliophysics Event Analysis and Processing System (HEAPS) could increase the scientific productivity of Heliophysics data by increasing the visibility of relevant events contained within them while decreasing the incremental costs of incorporating more events in research studies. Here we present the relevant precursors to such a system and show how it could operate within the Heliophysics Data Environment.

  7. Performance and efficiency of geotextile-supported erosion control measures during simulated rainfall events

    NASA Astrophysics Data System (ADS)

    Obriejetan, Michael; Rauch, Hans Peter; Florineth, Florin

    2013-04-01

    Erosion control systems consisting of technical and biological components are widely accepted and proven to work well if installed properly with regard to site-specific parameters. A wide range of implementation measures exists for this specific protection purpose, and new, in particular technical, solutions are constantly introduced to the market. Nevertheless, the vegetation aspects of erosion control measures are frequently disregarded and deserve closer consideration against the backdrop of the development and realization of adaptation strategies in an environment altered by climate-change-associated effects. Technical auxiliaries such as the geotextiles typically used for slope protection (nettings, blankets, turf reinforcement mats, etc.) address specific features, and given their structural and material diversity, differing effects on sediment yield, surface runoff, and vegetation development seem evident. Nevertheless, there is a knowledge gap concerning the mutual interaction processes between technical and biological components, and specific comparable data on the erosion-reducing effects of technical-biological erosion protection systems are insufficient. In this context, an experimental arrangement was set up to study the correlated influences of geotextiles and vegetation and to determine their (combined) effects on surface runoff and soil loss during simulated heavy rainfall events. Sowing vessels filled with topsoil serve as the testing facilities, with various organic and synthetic geotextiles applied and a reliable drought-resistant seed mixture sown. Regular vegetation monitoring as well as two rainfall simulation runs, with four repetitions of each variant, were conducted. A portable rainfall simulator with a standardized rainfall intensity of 240 mm h-1 and a three-minute rainfall duration was used to stress these systems at different stages of plant development at an inclination of 30 degrees. First results show

  8. Improving outpatient phlebotomy service efficiency and patient experience using discrete-event simulation.

    PubMed

    Yip, Kenneth; Pang, Suk-King; Chan, Kui-Tim; Chan, Chi-Kuen; Lee, Tsz-Leung

    2016-08-01

    Purpose - The purpose of this paper is to present a simulation modeling application to reconfigure the outpatient phlebotomy service of an acute regional and teaching hospital in Hong Kong, with an aim to improve service efficiency, shorten patient queuing time and enhance workforce utilization. Design/methodology/approach - The system was modeled as an inhomogeneous Poisson process and a discrete-event simulation model was developed to simulate the current setting, and to evaluate how various performance metrics would change if switched from a decentralized to a centralized model. Variations were then made to the model to test different workforce arrangements for the centralized service, so that managers could decide on the service's final configuration via an evidence-based and data-driven approach. Findings - This paper provides empirical insights about the relationship between staffing arrangement and system performance via a detailed scenario analysis. One particular staffing scenario was chosen by managers as it was considered to strike the best balance between performance and the workforce schedule. The resulting centralized phlebotomy service was successfully commissioned. Practical implications - This paper demonstrates how analytics can be used for operational planning at the hospital level. The authors show that a transparent and evidence-based scenario analysis, made available through analytics and simulation, greatly helps management and clinical stakeholders arrive at the ideal service configuration. Originality/value - The authors provide a robust method for evaluating the relationship between workforce investment, queuing reduction and workforce utilization, which is crucial for managers when deciding the delivery model for any outpatient-related service. PMID:27477930
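    The arrival model deserves a concrete note: an inhomogeneous Poisson process is straightforward to sample by Lewis-Shedler thinning, a standard building block for queue simulations of this kind. The sketch below uses a made-up morning-peak rate curve, not the paper's fitted rates.

```python
import numpy as np

def nhpp_arrivals(rate_fn, t_end, rate_max, rng=None):
    """Sample inhomogeneous-Poisson arrival times by Lewis-Shedler thinning."""
    rng = rng or np.random.default_rng()
    t, times = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)        # candidate from rate_max process
        if t > t_end:
            return np.array(times)
        if rng.uniform() < rate_fn(t) / rate_max:   # accept with probability ratio
            times.append(t)

# Hypothetical arrival intensity (patients/hour) with a morning peak, 8-hour day.
rate = lambda t: 30.0 + 25.0 * np.exp(-((t - 1.5) ** 2))
draws = nhpp_arrivals(rate, t_end=8.0, rate_max=55.0)
print(f"{draws.size} simulated arrivals; first three at {np.round(draws[:3], 2)} h")
```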

  10. Optimized Hypervisor Scheduler for Parallel Discrete Event Simulations on Virtual Machine Platforms

    SciTech Connect

    Yoginath, Srikanth B; Perumalla, Kalyan S

    2013-01-01

    With the advent of virtual machine (VM)-based platforms for parallel computing, it is now possible to execute parallel discrete event simulations (PDES) over multiple virtual machines, in contrast to executing in native mode directly over hardware as has traditionally been done over the past decades. While mature VM-based parallel systems now offer new, compelling benefits such as serviceability, dynamic reconfigurability and overall cost effectiveness, the runtime performance of parallel applications can be significantly affected. In particular, most VM-based platforms are optimized for general workloads, but PDES execution exhibits unique dynamics significantly different from other workloads. Here we first present results from experiments that highlight the gross deterioration of the runtime performance of VM-based PDES simulations when executed using traditional VM schedulers, quantitatively showing the poor scaling properties of the scheduler as the number of VMs is increased. The mismatch is fundamental in nature, in the sense that any fairness-based VM scheduler implementation would exhibit it with PDES runs. We also present a new scheduler optimized specifically for PDES applications, and describe its design and implementation. Experimental results obtained from running PDES benchmarks (PHOLD and vehicular traffic simulations) over VMs show over an order of magnitude improvement in the run time with the PDES-optimized scheduler relative to the regular VM scheduler, with over a 20× reduction in the run time of simulations using up to 64 VMs. The observations and results are timely in the context of emerging systems such as cloud platforms and VM-based high performance computing installations, highlighting to the community the need for PDES-specific support, and the feasibility of significantly reducing the runtime overhead for scalable PDES on VM platforms.

  11. Multimodal interaction in the perception of impact events displayed via a multichannel audio and simulated structure-borne vibration

    NASA Astrophysics Data System (ADS)

    Martens, William L.; Woszczyk, Wieslaw

    2005-09-01

    For multimodal display systems in which realistic reproduction of impact events is desired, presenting structure-borne vibration along with multichannel audio recordings has been observed to create a greater sense of immersion in a virtual acoustic environment. Furthermore, there is an increased proportion of reports that the impact event took place within the observer's local area (termed "presence with" the event, in contrast to "presence in" the environment in which the event occurred). While holding the audio reproduction constant, varying the intermodal arrival time and level of mechanically displayed, synthetic whole-body vibration revealed a number of other subjective attributes that depend upon multimodal interaction in the perception of a representative impact event. For example, when the structure-borne component of the displayed impact event arrived 10 to 20 ms later than the airborne component, the intermodal delay was not only tolerated, but gave rise to an increase in the proportion of reports that the impact event had greater power. These results have enabled the refinement of a multimodal simulation in which the manipulation of synthetic whole-body vibration can be used to control perceptual attributes of impact events heard within an acoustic environment reproduced via a multichannel loudspeaker array.

  12. Solar system events at high spatial resolution

    SciTech Connect

    Baines, K H; Gavel, D T; Getz, A M; Gibbartd, S G; MacIntosh, B; Max, C E; McKay, C P; Young, E F; de Pater, I

    1999-02-19

    Until relatively recent advances in technology, astronomical observations from the ground were limited in image resolution by the blurring effects of Earth's atmosphere. The blur extent, ranging typically from 0.5 to 2 seconds of arc at the best astronomical sites, precluded ground-based observations of the details of the solar system's moons, asteroids, and outermost planets. With the maturing of a high-resolution image processing technique called speckle imaging, the resolution limitation of the atmosphere can now be largely overcome. Over the past three years, the authors have used speckle imaging to observe Titan, a moon of Saturn with an atmospheric density comparable to Earth's; Io, the volcanically active innermost moon of Jupiter; and Neptune, a gas giant outer planet with continually changing planet-encircling storms. These observations were made at the world's largest telescope, the Keck telescope in Hawaii, and represent the highest resolution infrared images of these objects ever taken.

  13. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering at the Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. The propulsion system design environment is created with commercially available software called iSIGHT, which is a generic computational framework, together with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and to facilitate information transfer among the various codes. The resulting modeling and simulation (M&S) environment, in conjunction with the response surface method, provides the designer/decision-maker an analytical means to examine the entire design space from either a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.

  14. Event-triggered consensus tracking of multi-agent systems with Lur'e nonlinear dynamics

    NASA Astrophysics Data System (ADS)

    Huang, Na; Duan, Zhisheng; Wen, Guanghui; Zhao, Yu

    2016-05-01

    In this paper, the distributed consensus tracking problem for networked Lur'e systems is investigated based on event-triggered information interactions. An event-triggered control algorithm is designed with the advantages of reducing the controller update frequency and sensor energy consumption. Using the S-procedure and the Lyapunov functional method, sufficient conditions are derived to guarantee that consensus tracking is achieved under a directed communication topology. Meanwhile, it is shown that Zeno behaviour of the triggering time sequences is excluded for the proposed event-triggered rule. Finally, numerical simulations on coupled Chua's circuits are performed to illustrate the effectiveness of the theoretical algorithms.
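    Stripped of the Lur'e nonlinearity and the S-procedure machinery, the event-triggered idea reduces to agents broadcasting only when their state drifts sufficiently from the last broadcast value. The sketch below runs single-integrator consensus on a four-agent ring with a static threshold; it illustrates the communication saving, not the paper's triggering rule or stability proof, and all constants are illustrative.

```python
import numpy as np

# Graph Laplacian of a four-agent ring (illustrative topology).
L = np.array([[ 2., -1.,  0., -1.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [-1.,  0., -1.,  2.]])

x = np.array([3.0, -1.0, 0.5, 2.0])  # initial agent states
x_hat = x.copy()                     # last value each agent broadcast
DT, DELTA = 0.01, 0.05
broadcasts = 0

for _ in range(2000):
    trig = np.abs(x - x_hat) > DELTA     # event condition, checked per agent
    x_hat[trig] = x[trig]                # triggered agents broadcast fresh states
    broadcasts += int(trig.sum())
    x = x - DT * (L @ x_hat)             # consensus protocol uses broadcast values
print(f"states {np.round(x, 3)}, {broadcasts} broadcasts instead of 8000")
```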

  15. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2011-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring, WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed NP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce e realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  16. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2010-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring. WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed WP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce a realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  17. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Licensee event report system. 50.73 Section 50.73 Energy... systems, including: emergency diesel generators (EDGs); hydroelectric facilities used in lieu of EDGs at... component, if known. (F) The Energy Industry Identification System component function identifier and...

  18. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Licensee event report system. 50.73 Section 50.73 Energy... systems, including: emergency diesel generators (EDGs); hydroelectric facilities used in lieu of EDGs at... component, if known. (F) The Energy Industry Identification System component function identifier and...

  19. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Licensee event report system. 50.73 Section 50.73 Energy... systems, including: emergency diesel generators (EDGs); hydroelectric facilities used in lieu of EDGs at... component, if known. (F) The Energy Industry Identification System component function identifier and...

  20. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Licensee event report system. 50.73 Section 50.73 Energy... systems, including: emergency diesel generators (EDGs); hydroelectric facilities used in lieu of EDGs at... component, if known. (F) The Energy Industry Identification System component function identifier and...

  1. Capturing the serial nature of older drivers' responses towards challenging events: a simulator study.

    PubMed

    Bélanger, Alexandre; Gagnon, Sylvain; Yamin, Stephanie

    2010-05-01

    Older drivers' ability to trigger simultaneous responses in reaction to simulated challenging road events was examined through crash risk and local analyses of the acceleration and direction data provided by the simulator. This was achieved by segregating and averaging the simulator's primary measures according to six short time intervals, one before and five during the challenging events. Twenty healthy adults aged 25-45 years old (M=29.5+/-4.32) and 20 healthy adults aged 65 and older (M=73.4+/-5.17) were exposed to five simulated scenarios involving sudden, complex and unexpected maneuvers. Participants were also administered the Useful Field of View (UFOV), single reaction time and choice reaction time tests, a visual secondary task in the simulator, and a subjective workload evaluation (NASA-TLX). Results indicated that the challenging event that required multiple synchronized reactions led to a higher crash rate in older drivers. Acceleration and orientation data analyses confirmed that the drivers who crashed limited their reactions. The other challenging events did not generate crashes because they could be anticipated and one response (braking) was sufficient to avoid a crash. Our findings support the proposal (Hakamies-Blomqvist, L., Mynttinen, S., Backman, M., Mikkonen, V., 1999. Age-related differences in driving: are older drivers more serial? International Journal of Behavioral Development 23, 575-589) that older drivers have more difficulty activating car controls simultaneously, putting them at risk when facing challenging, time-pressured road events. PMID:20380907

  2. A CORBA event system for ALMA common software

    NASA Astrophysics Data System (ADS)

    Fugate, David W.

    2004-09-01

    The ALMA Common Software notification channel framework provides developers with an easy to use, high-performance, event-driven system supported across multiple programming languages and operating systems. It sits on top of the CORBA notification service and hides nearly all CORBA from developers. The system is based on a push event channel model where suppliers push events onto the channel and consumers process these asynchronously. This is a many-to-many publishing model whereby multiple suppliers send events to multiple consumers on the same channel. Furthermore, these event suppliers and consumers can be coded in C++, Java, or Python on any platform supported by ACS. There are only two classes developers need to be concerned with: SimpleSupplier and Consumer. SimpleSupplier was designed so that ALMA events (defined as IDL structures) could be published in the simplest manner possible without exposing any CORBA to the developer. Essentially all that needs to be known is the channel's name and the IDL structure being published. The API takes care of everything else. With the Consumer class, the developer is responsible for providing the channel's name as well as associating event types with functions that will handle them.
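    The push-channel model itself is simple enough to mimic in a few lines. The toy below is an in-process stand-in, not the ACS SimpleSupplier/Consumer API, and it omits CORBA's cross-language, asynchronous delivery: a typed event structure plays the role of the IDL struct, and the channel fans each published event out to every consumer registered for that type.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class TempEvent:          # stands in for an IDL-defined event structure
    sensor: str
    kelvin: float

class Channel:
    """Toy push channel: many suppliers, many consumers, keyed by event type."""
    def __init__(self):
        self._handlers = defaultdict(list)
    def subscribe(self, event_type, handler):
        self._handlers[event_type].append(handler)
    def publish(self, event):             # a supplier "pushes" an event
        for handler in self._handlers[type(event)]:
            handler(event)                # consumers process it (synchronously here)

chan = Channel()
chan.subscribe(TempEvent, lambda e: print(f"logger: {e.sensor} at {e.kelvin} K"))
chan.subscribe(TempEvent, lambda e: print("alarm!") if e.kelvin > 300 else None)
chan.publish(TempEvent("ANT01", 305.0))   # one supplier, two consumers notified
```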

  3. Network-based event-triggered filtering for Markovian jump systems

    NASA Astrophysics Data System (ADS)

    Wang, Huijiao; Shi, Peng; Agarwal, Ramesh K.

    2016-06-01

    The problem of event-triggered H∞ filtering for networked Markovian jump systems is studied in this paper. A dynamic discrete event-triggered scheme is designed to choose the transmitted data for different Markovian jumping modes. A time-delay modelling method is employed to cast the event-triggered scheme and the network-related behaviour, such as transmission delay, data packet dropout and disorder, into a networked Markovian time-delay jump system. Furthermore, a sufficient condition is derived to guarantee that the resulting filtering error system is stochastically stable with a prescribed performance index. A co-design method for the H∞ filter and the event-triggered scheme is then proposed. The effectiveness and potential of the theoretical results obtained are illustrated by a simulation example.

  4. Hierarchical Discrete Event Supervisory Control of Aircraft Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Yasar, Murat; Tolani, Devendra; Ray, Asok; Shah, Neerav; Litt, Jonathan S.

    2004-01-01

    This paper presents a hierarchical application of Discrete Event Supervisory (DES) control theory for intelligent decision and control of a twin-engine aircraft propulsion system. A dual layer hierarchical DES controller is designed to supervise and coordinate the operation of two engines of the propulsion system. The two engines are individually controlled to achieve enhanced performance and reliability, necessary for fulfilling the mission objectives. Each engine is operated under a continuously varying control system that maintains the specified performance and a local discrete-event supervisor for condition monitoring and life extending control. A global upper level DES controller is designed for load balancing and overall health management of the propulsion system.
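    As a structural illustration only (the paper's supervisors are synthesized with DES control theory, not hand-written rules), the hierarchy can be pictured as local supervisors that map discrete condition events to engine operating modes, plus a global layer that rebalances demand across the two engines according to those modes. All event names, modes, and weights below are hypothetical.

```python
# Hypothetical local policy: discrete condition event -> engine operating mode.
LOCAL_POLICY = {"nominal": "performance",
                "hot_section_wear": "life_extending",
                "sensor_fault": "degraded"}

class EngineSupervisor:
    """Lower-layer supervisor for one engine (toy finite-state machine)."""
    def __init__(self, name):
        self.name, self.mode = name, "performance"
    def on_event(self, event):
        self.mode = LOCAL_POLICY.get(event, self.mode)

def global_supervisor(engines, total_demand):
    """Upper-layer coordinator: shift load toward healthier engines."""
    weight = {"performance": 1.0, "life_extending": 0.7, "degraded": 0.4}
    w = [weight[e.mode] for e in engines]
    return {e.name: round(total_demand * wi / sum(w), 1) for e, wi in zip(engines, w)}

left, right = EngineSupervisor("left"), EngineSupervisor("right")
right.on_event("hot_section_wear")        # condition monitor flags the right engine
print(global_supervisor([left, right], total_demand=100.0))
# -> {'left': 58.8, 'right': 41.2}: load shifted away from the worn engine
```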

  5. Simulation System for Training in Laparoscopic Surgery

    NASA Technical Reports Server (NTRS)

    Basdogan, Cagatay; Ho, Chih-Hao

    2003-01-01

    A computer-based simulation system creates a visual and haptic virtual environment for training a medical practitioner in laparoscopic surgery. Heretofore, it has been common practice to perform training in partial laparoscopic surgical procedures by use of a laparoscopic training box that encloses a pair of laparoscopic tools, objects to be manipulated by the tools, and an endoscopic video camera. However, the surgical procedures simulated by use of a training box are usually poor imitations of the actual ones. The present computer-based system improves training by presenting a more realistic simulated environment to the trainee. The system includes a computer monitor that displays a real-time image of the affected interior region of the patient, showing laparoscopic instruments interacting with organs and tissues, as would be viewed by use of an endoscopic video camera and displayed to a surgeon during a laparoscopic operation. The system also includes laparoscopic tools that the trainee manipulates while observing the image on the computer monitor. The instrumentation on the tools consists of (1) position and orientation sensors that provide input data for the simulation and (2) actuators that provide force feedback to simulate the contact forces between the tools and tissues. The simulation software includes components that model the geometries of surgical tools, components that model the geometries and physical behaviors of soft tissues, and components that detect collisions between them. Using the measured positions and orientations of the tools, the software detects whether they are in contact with tissues. In the event of contact, the deformations of the tissues and contact forces are computed by use of the geometric and physical models. The image on the computer screen shows tissues deformed accordingly, while the actuators apply the corresponding forces to the distal ends of the tools. For the purpose of demonstration, the system has been set
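    At its core, the haptic loop described above does three things: detect tool-tissue penetration, compute a restoring force from a physical model, and command the actuators. A heavily simplified penalty-based version, with the tissue reduced to a flat plane and made-up stiffness and damping constants, might look like this:

```python
import numpy as np

def contact_force(tool_tip, surface_z=0.0, k=400.0, b=2.5, v_normal=0.0):
    """Penalty contact: spring-damper force along the surface normal.
    Constants are hypothetical; real tissue models are far more elaborate."""
    depth = surface_z - tool_tip[2]        # penetration below the tissue plane (m)
    if depth <= 0.0:
        return np.zeros(3)                 # no collision detected, no feedback
    fz = k * depth - b * v_normal          # stiffness term plus damping term
    return np.array([0.0, 0.0, max(fz, 0.0)])  # push the tool back out (N)

# Tool tip 2 mm below the surface, moving downward at 1 cm/s.
print(contact_force(np.array([0.0, 0.0, -0.002]), v_normal=-0.01))
```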

  6. Simulation of moisture in alpine rock walls during freeze-thaw events

    NASA Astrophysics Data System (ADS)

    Schnepfleitner, Harald; Rode, Matthias; Sass, Oliver

    2014-05-01

    Rock moisture distribution during freeze-thaw events is key to understanding frost weathering and subsequent rockfall. Data on moisture levels of natural rock walls are scarce and difficult to measure. An innovative and cheap way to avoid these problems is the use of simulation calculations. Although they are an abstraction of the real system, they are widely used in natural science. A novel way to simulate moisture in natural rock walls is to use the software WUFI, which was developed to understand the moisture behavior of building materials. However, the enormous know-how behind these commercial applications has not been exploited for geomorphological research to date. The necessary input data for the simulation are climate data in hourly resolution (temperature, rainfall, wind, irradiation) and material properties (porosity, sorption and diffusivity parameters) of the prevailing rock. Two different regions were analysed, the Gesäuse (Johnsbachtal: 700 m, limestone and dolomite) and the Sonnblick (3000 m, gneiss and granite). We aimed at comparing the two regions in terms of general susceptibility to frost weathering, the influence of aspect, inclination and rock parameters, and the possible impact of climate change. The calculated 1D moisture profiles and the temporal progress of rock moisture - in combination with temperature data - allow possible periods of active weathering and resulting rockfalls to be detected. These results were analyzed against two different frost weathering theories: the "classical" frost shattering theory (requiring a high number of freeze-thaw cycles and a pore saturation of 90%) and the segregation ice theory (requiring a long freezing period and a pore saturation threshold of approx. 60%). An additional critical factor considered for both theories was the frost depth, namely the duration of the "frost cracking window" (between -3 and -10°C) at each site. The results show that in both areas, north-facing rocks are
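    Given the simulated hourly series that WUFI produces, the screening logic described above (frost-cracking window plus a saturation threshold) is simple to apply. The snippet below uses synthetic series purely for illustration; the -3 to -10°C window and the ~60% segregation-ice threshold are the values quoted in the abstract.

```python
import numpy as np

def frost_cracking_hours(temp_c, saturation, t_lo=-10.0, t_hi=-3.0, sat_min=0.60):
    """Hours with rock temperature inside the frost-cracking window while
    pore saturation exceeds the segregation-ice threshold."""
    active = (temp_c >= t_lo) & (temp_c <= t_hi) & (saturation >= sat_min)
    return int(active.sum())

# Synthetic hourly winter month, for illustration only.
h = np.arange(24 * 30)
temp = -6.0 + 5.0 * np.sin(2 * np.pi * h / 24)          # daily cycle around -6 degC
sat = 0.58 + 0.10 * np.sin(2 * np.pi * h / (24 * 7))    # slow moisture variation
print(frost_cracking_hours(temp, sat), "potentially active hours")
```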

  7. Deterministic chaos in a simulated sequence of slip events on a single isolated asperity

    NASA Astrophysics Data System (ADS)

    Kato, Naoyuki

    2014-08-01

    Numerical simulation of repeated occurrences of slip events on a fault patch (asperity) is used to interpret the mechanism of irregular sequences of slip events. The fault is uniformly shear loaded at a constant rate, and the frictional stress acting on the fault is assumed to obey a rate- and state-dependent friction (RSF) law. A circular patch with velocity-weakening frictional property is embedded in the fault, which apart from this has velocity-strengthening frictional property. The numerical simulations are conducted using various characteristic slip distances L of the RSF law. For small values of L seismic slip events (earthquakes) repeatedly occur at regular intervals. With increasing L, the recurrence of slip events becomes more complex. A period doubled slip pattern, where seismic and aseismic slip events alternately occur, multiperiodic patterns and aperiodic patterns occur. At the same time, slip tends to become aseismic with increasing L. The distributions of shear stress on the fault just before slip events are variable because of variation in the residual stress of the preceding slip event and the shear stress generated by aseismic sliding during interseismic periods. These variations in shear stress cause the complex sequence of slip events seen here. An iteration map of the recurrence intervals of slip events for an aperiodic sequence of slip events is expressed by a simple curve, indicating that the timing of an event is predictable from the previous time interval, and the sequence of slip events exhibits deterministic chaos. To help interpret these results for a sequence of slip events on a velocity-weakening patch embedded in a velocity-strengthening region, a numerical simulation is conducted of slip on a velocity-weakening patch enclosed by a permanently locked region. In this case, no complex recurrence of slip events is observed. When L is less than a critical value, seismic slip events repeatedly occur at a constant interval. Stable sliding
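    The diagnostic named at the end, the iteration (return) map of recurrence intervals, is easy to reproduce for any event catalogue: plot each interval against the next; a thin single-valued curve indicates deterministic chaos, while a diffuse cloud suggests randomness. The stand-in interval data below come from a logistic map, not from a rate-and-state friction simulation.

```python
import numpy as np

def return_map(event_times):
    """Pairs (T_n, T_n+1) of successive recurrence intervals."""
    T = np.diff(np.asarray(event_times, dtype=float))
    return np.column_stack([T[:-1], T[1:]])

# Stand-in aperiodic interval sequence driven by the chaotic logistic map.
x, intervals = 0.4, []
for _ in range(200):
    x = 3.9 * x * (1.0 - x)
    intervals.append(2.0 + x)                  # interval in arbitrary time units
times = np.concatenate([[0.0], np.cumsum(intervals)])
pairs = return_map(times)
print(pairs[:3])  # scatter-plot these columns to see the single-valued curve
```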

  8. A method for accelerating the molecular dynamics simulation of infrequent events

    SciTech Connect

    Voter, A.F.

    1997-03-01

    For infrequent-event systems, transition state theory (TST) is a powerful approach for overcoming the time scale limitations of the molecular dynamics (MD) simulation method, provided one knows the locations of the potential-energy basins (states) and the TST dividing surfaces (or the saddle points) between them. Often, however, the states to which the system will evolve are not known in advance. We present a new, TST-based method for extending the MD time scale that does not require advance knowledge of the states of the system or the transition states that separate them. The potential is augmented by a bias potential, designed to raise the energy in regions other than at the dividing surfaces. State-to-state evolution on the biased potential occurs in the proper sequence, but at an accelerated rate with a nonlinear time scale. Time is no longer an independent variable, but becomes a statistically estimated property that converges to the exact result at long times. The long-time dynamical behavior is exact if there are no TST-violating correlated dynamical events, and appears to be a good approximation even when this condition is not met. We show that for strongly coupled (i.e., solid state) systems, appropriate bias potentials can be constructed from properties of the Hessian matrix. This new "hyper-MD" method is demonstrated on two model potentials and for the diffusion of a Ni atom on a Ni(100) terrace for a duration of 20 μs. © 1997 American Institute of Physics.
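    A toy one-dimensional version conveys the bookkeeping, though not the Hessian-based bias construction of the paper: overdamped Langevin dynamics on a biased double well, where the bias fills the wells but vanishes near the dividing surface at x = 0, and simulated time accrues as dt·exp(ΔV(x)/kT). All parameter values below are illustrative.

```python
import math, random

random.seed(1)
kT, DT = 0.15, 1e-3
E_CAP, ALPHA = 0.8, 0.2                    # illustrative bias parameters

def V(x):                                  # double well, barrier height 1 at x = 0
    return (x * x - 1.0) ** 2

def dV(x):                                 # bias: fills the wells, zero where V >= E_CAP
    v = V(x)
    return (E_CAP - v) ** 2 / (ALPHA + E_CAP - v) if v < E_CAP else 0.0

def force(x, h=1e-5):                      # -d/dx of the biased potential (numerical)
    return -((V(x + h) + dV(x + h)) - (V(x - h) + dV(x - h))) / (2 * h)

x, t_boost, crossings = -1.0, 0.0, 0
for _ in range(500_000):
    t_boost += DT * math.exp(dV(x) / kT)   # boosted-time accumulator
    x_new = x + DT * force(x) + math.sqrt(2 * kT * DT) * random.gauss(0.0, 1.0)
    if x * x_new < 0.0:                    # passed the dividing surface (recrossings included)
        crossings += 1
    x = x_new
print(f"crossings: {crossings}, boosted time: {t_boost:.0f} vs clock time: {500_000 * DT:.0f}")
```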

  9. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291
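    The core idea, decoupling network-level event delivery from per-neuron dynamics, can be caricatured in a few lines (a toy kernel, not NEVESIM or its Python API): spikes sit in a priority queue ordered by delivery time, and each leaky integrate-and-fire neuron is updated lazily, only when an event actually reaches it. The suprathreshold weight is chosen so one spike keeps circulating around the ring.

```python
import heapq, math

TAU, V_TH, DELAY, W = 20.0, 1.0, 1.5, 1.2   # toy constants (ms and dimensionless)

class Neuron:
    """Leaky integrate-and-fire unit, updated only at event times."""
    def __init__(self):
        self.v, self.t_last = 0.0, 0.0
    def receive(self, t, w):
        self.v *= math.exp(-(t - self.t_last) / TAU)  # exact decay since last event
        self.v, self.t_last = self.v + w, t
        if self.v >= V_TH:
            self.v = 0.0                              # reset after spiking
            return True
        return False

neurons = [Neuron() for _ in range(5)]
targets = {i: [(i + 1) % 5] for i in range(5)}        # ring connectivity
queue, spikes = [(0.0, 0)], []                        # events: (delivery time, neuron)
while queue and queue[0][0] < 50.0:
    t, i = heapq.heappop(queue)                       # next event in time order
    if neurons[i].receive(t, W):
        spikes.append((round(t, 1), i))
        for j in targets[i]:
            heapq.heappush(queue, (t + DELAY, j))     # schedule downstream delivery
print(spikes)
```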

  12. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near optimal control of uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor NNs are tuned aperiodically once every triggered instant. An adaptive event-trigger condition to decide the trigger instants is derived. Thus, a suitable number of events are generated to ensure a desired accuracy of approximation. A near optimal performance is achieved without using value and/or policy iterations. A detailed analysis of nontrivial inter-event times with an explicit formula to show the reduction in computation is also derived. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. The simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP. PMID:26285220
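
    The paper's trigger condition is derived from NN approximation bounds; as a generic illustration of event-triggered feedback, the sketch below transmits the state to the controller only when the gap between the current and last-transmitted state exceeds a threshold proportional to the state norm (the gain K and threshold sigma are hypothetical, not from the paper):

      import numpy as np

      def simulate_event_triggered(A, B, K, x0, steps, sigma=0.2):
          """Discrete-time event-triggered state feedback: the controller only
          receives the state when the gap between the current state and the
          last transmitted state exceeds a threshold proportional to ||x||."""
          x = np.array(x0, dtype=float)
          x_held = x.copy()          # last state sent to the controller
          triggers = 0
          for _ in range(steps):
              if np.linalg.norm(x - x_held) > sigma * np.linalg.norm(x):
                  x_held = x.copy()  # event: update the controller's copy
                  triggers += 1
              u = -K @ x_held        # control computed from the held state
              x = A @ x + B @ u
          return x, triggers

      A = np.array([[1.0, 0.1], [0.0, 1.0]])
      B = np.array([[0.0], [0.1]])
      K = np.array([[1.0, 1.5]])     # hypothetical stabilizing gain
      print(simulate_event_triggered(A, B, K, [1.0, 0.0], 200))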

  14. Discrete-event simulation of a wide-area health care network.

    PubMed Central

    McDaniel, J G

    1995-01-01

    OBJECTIVE: Predict the behavior and estimate the telecommunication cost of a wide-area message store-and-forward network for health care providers that uses the telephone system. DESIGN: A tool with which to perform large-scale discrete-event simulations was developed. Network models for star and mesh topologies were constructed to analyze the differences in performances and telecommunication costs. The distribution of nodes in the network models approximates the distribution of physicians, hospitals, medical labs, and insurers in the Province of Saskatchewan, Canada. Modeling parameters were based on measurements taken from a prototype telephone network and a survey conducted at two medical clinics. Simulation studies were conducted for both topologies. RESULTS: For either topology, the telecommunication cost of a network in Saskatchewan is projected to be less than $100 (Canadian) per month per node. The estimated telecommunication cost of the star topology is approximately half that of the mesh. Simulations predict that a mean end-to-end message delivery time of two hours or less is achievable at this cost. A doubling of the data volume results in an increase of less than 50% in the mean end-to-end message transfer time. CONCLUSION: The simulation models provided an estimate of network performance and telecommunication cost in a specific Canadian province. At the expected operating point, network performance appeared to be relatively insensitive to increases in data volume. Similar results might be anticipated in other rural states and provinces in North America where a telephone-based network is desired. PMID:7583646

  15. Simulation of linear mechanical systems

    NASA Technical Reports Server (NTRS)

    Sirlin, S. W.

    1993-01-01

    A dynamics and controls analyst is typically presented with a structural dynamics model and must perform various input/output tests and design control laws. The required time/frequency simulations need to be done many times as models change and control designs evolve. This paper examines some simple ways that open and closed loop frequency and time domain simulations can be done using the special structure of the system equations usually available. Routines were developed to run under Pro-Matlab in a mixture of the Pro-Matlab interpreter and FORTRAN (using the .mex facility). These routines are often orders of magnitude faster than trying the typical 'brute force' approach of using built-in Pro-Matlab routines such as bode. This makes the analyst's job easier since not only does an individual run take less time, but much larger models can be attacked, often allowing the whole model reduction step to be eliminated.
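
    The "special structure" argument can be illustrated with a sketch (not the Pro-Matlab routines themselves): when the structural model is in modal coordinates, each frequency point of the response costs only an elementwise division rather than a dense linear solve. All matrices below are hypothetical:

      import numpy as np

      def modal_freq_response(omega_n, zeta, B, C, freqs):
          """Frequency response of a structure in modal coordinates:
          q_k'' + 2 zeta_k w_k q_k' + w_k^2 q_k = (B u)_k,  y = C q.
          Because the modal equations are uncoupled, each frequency needs
          only an elementwise division -- no dense matrix solve."""
          H = []
          for w in freqs:
              # modal transfer 1 / (w_k^2 - w^2 + 2j zeta_k w_k w), vectorized
              g = 1.0 / (omega_n**2 - w**2 + 2j * zeta * omega_n * w)
              H.append(C @ (g[:, None] * B))
          return np.array(H)

      omega_n = np.array([1.0, 5.0, 12.0])   # modal frequencies (rad/s)
      zeta = np.array([0.01, 0.02, 0.02])    # modal damping ratios
      B = np.array([[1.0], [0.5], [0.2]])    # modal input matrix
      C = np.array([[1.0, 1.0, 1.0]])        # modal output matrix
      freqs = np.linspace(0.1, 20.0, 5)
      print(modal_freq_response(omega_n, zeta, B, C, freqs).squeeze())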

  16. Event-triggered sliding mode control for a class of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Behera, Abhisek K.; Bandyopadhyay, Bijnan

    2016-09-01

    Event-triggering is a real-time control implementation technique that aims to minimise resource utilisation while ensuring satisfactory performance of the closed-loop system. In this paper, we address the problem of robust stabilisation for a class of nonlinear systems subject to external disturbances using sliding mode control (SMC) with an event-triggering scheme. The event-triggering scheme is developed for SMC so that the sliding trajectory remains confined to a vicinity of the sliding manifold. The event-triggered SMC induces the sliding mode in the system, and thus the steady-state trajectories also remain bounded within a predesigned region in the presence of disturbances. The design of the event parameters is given considering the practical constraints on control execution, and we show that each triggering instant is larger than its immediate predecessor by a given positive constant. The analysis is also extended to account for delay in the control updates, and an upper bound on the delay is calculated to ensure stability of the system. It is shown that delay enlarges the steady-state bound of the system relative to the delay-free case; the system trajectories nevertheless remain bounded, so stability is ensured. The performance of this event-triggered SMC is demonstrated through a numerical simulation.

  17. A systems neurophysiology approach to voluntary event coding.

    PubMed

    Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian

    2016-07-15

    Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding" (TEC) being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding were predominantly stimulus categorization, feature unbinding and response selection, reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). At a systems-neurophysiology level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (the N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but did not modulate the other processes. The perceptual categorization stage appears central for voluntary event-file coding. PMID:27153981

  18. A View on Future Building System Modeling and Simulation

    SciTech Connect

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  19. Simulation for CZT Compton PET (Maximization of the efficiency for PET using Compton event)

    NASA Astrophysics Data System (ADS)

    Yoon, Changyeon; Lee, Wonho; Lee, Taewoong

    2011-10-01

    Multiple interactions in positron emission tomography (PET) using scintillators are generally treated as noise events because the positions and energies of the individual interactions cannot be obtained separately and the sequence of multiple scattering is not fully known. Therefore, the first interaction position, which is the crucial information for PET image reconstruction, cannot be determined correctly. However, in a pixelized semiconductor detector such as CdZnTe, the specific position and energy of each interaction can be obtained. Moreover, for the two 511 keV annihilation photons in PET, if one photon deposits all its energy in one position (photoelectric effect) and the other undergoes Compton scattering followed by the photoelectric effect, the interaction sequence can be determined using the Compton scattering formula. Hence, the correct position of the Compton scatter can be determined, and Compton events, which are discarded in conventional PET systems, can be recovered in the new system reported in this study. The PET system in this study, which was simulated using the GATE 5.0 code, was composed of 20 mm×10 mm×10 mm CdZnTe detectors consisting of 1 mm×0.5 mm×2.5 mm pixels. The angular uncertainties caused by Doppler broadening, pixelization and energy broadening were estimated and compared. The pixelization effect was the main factor increasing the angular uncertainty and was strongly dependent on the distance between the first and second interaction positions; energy broadening degraded the angular resolution less than expected, and the effect of Doppler broadening was minimal. The number of Compton events was double that of the photoelectric events, assuming full energy absorption. Therefore, the detection efficiency of this new PET system can be improved greatly because both photoelectric and Compton events are used.
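
    The sequencing idea can be illustrated with the Compton formula alone (a sketch; the full method also compares the kinematic angle with the measured hit geometry, which is omitted here):

      import math

      MEC2 = 511.0  # electron rest energy, keV

      def compton_angle(e_dep, e_in=511.0):
          """Scattering angle (deg) if a photon of energy e_in Compton-scatters
          and deposits e_dep; returns None when kinematically impossible."""
          e_out = e_in - e_dep
          if e_out <= 0.0:
              return None
          cos_t = 1.0 - MEC2 * (1.0 / e_out - 1.0 / e_in)
          if abs(cos_t) > 1.0:
              return None
          return math.degrees(math.acos(cos_t))

      # two pixel hits summing to 511 keV: test both orderings
      e1, e2 = 170.0, 341.0
      print("hit1 first:", compton_angle(e1))   # angle if hit1 is the scatter
      print("hit2 first:", compton_angle(e2))   # angle if hit2 is the scatter

    For these example energies, a 341 keV single-Compton deposit lies beyond the 511 keV Compton edge (about 341 keV), so only the ordering with the 170 keV deposit first is kinematically consistent, which fixes the sequence.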

  20. An abrupt climate event in a coupled ocean-atmosphere simulation without external forcing.

    PubMed

    Hall, A; Stouffer, R J

    2001-01-11

    Temperature reconstructions from the North Atlantic region indicate frequent abrupt and severe climate fluctuations during the last glacial and Holocene periods. The driving forces for these events are unclear and coupled atmosphere-ocean models of global circulation have only simulated such events by inserting large amounts of fresh water into the northern North Atlantic Ocean. Here we report a drastic cooling event in a 15,000-yr simulation of global circulation with present-day climate conditions without the use of such external forcing. In our simulation, the annual average surface temperature near southern Greenland spontaneously fell 6-10 standard deviations below its mean value for a period of 30-40 yr. The event was triggered by a persistent northwesterly wind that transported large amounts of buoyant cold and fresh water into the northern North Atlantic Ocean. Oceanic convection shut down in response to this flow, concentrating the entire cooling of the northern North Atlantic by the colder atmosphere in the uppermost ocean layer. Given the similarity between our simulation and observed records of rapid cooling events, our results indicate that internal atmospheric variability alone could have generated the extreme climate disruptions in this region. PMID:11196636

  1. Low-dose photons modify liver response to simulated solar particle event protons.

    PubMed

    Gridley, Daila S; Coutrakon, George B; Rizvi, Asma; Bayeta, Erben J M; Luo-Owen, Xian; Makinde, Adeola Y; Baqai, Farnaz; Koss, Peter; Slater, James M; Pecaut, Michael J

    2008-03-01

    The health consequences of exposure to low-dose radiation combined with a solar particle event during space travel remain unresolved. The goal of this study was to determine whether protracted radiation exposure alters gene expression and oxidative burst capacity in the liver, an organ vital in many biological processes. C57BL/6 mice were whole-body irradiated with 2 Gy simulated solar particle event (SPE) protons over 36 h, both with and without pre-exposure to low-dose/low-dose-rate photons ((57)Co, 0.049 Gy total at 0.024 cGy/h). Livers were excised immediately after irradiation (day 0) or on day 21 thereafter for analysis of 84 oxidative stress-related genes using RT-PCR; genes up or down-regulated by more than twofold were noted. On day 0, genes with increased expression were: photons, none; simulated SPE, Id1; photons + simulated SPE, Bax, Id1, Snrp70. Down-regulated genes at this same time were: photons, Igfbp1; simulated SPE, Arnt2, Igfbp1, Il6, Lct, Mybl2, Ptx3. By day 21, a much greater effect was noted than on day 0. Exposure to photons + simulated SPE up-regulated completely different genes than those up-regulated after either photons or the simulated SPE alone (photons, Cstb; simulated SPE, Dctn2, Khsrp, Man2b1, Snrp70; photons + simulated SPE, Casp1, Col1a1, Hspcb, Il6st, Rpl28, Spnb2). There were many down-regulated genes in all irradiated groups on day 21 (photons, 13; simulated SPE, 16; photons + simulated SPE, 16), with very little overlap among groups. Oxygen radical production by liver phagocytes was significantly enhanced by photons on day 21. The results demonstrate that whole-body irradiation with low-dose-rate photons, as well as time after exposure, had a great impact on liver response to a simulated solar particle event. PMID:18302490

  3. BEEC: An event generator for simulating the Bc meson production at an e+e- collider

    NASA Astrophysics Data System (ADS)

    Yang, Zhi; Wu, Xing-Gang; Wang, Xian-You

    2013-12-01

    The Bc meson is a doubly heavy quark-antiquark bound state that carries flavor explicitly, which provides a fruitful laboratory for testing potential models and understanding the weak decay mechanisms of heavy flavors. In view of the prospects for Bc physics at hadronic colliders such as the Tevatron and the LHC, Bc physics is attracting more and more attention. It has been shown that a high-luminosity e+e- collider running around the Z0 peak is also helpful for studying the properties of the Bc meson and has its own advantages. For this purpose, we present an event generator, named BEEC, for simulating Bc meson production through e+e- annihilation according to the relevant publications. In BEEC, the color-singlet S-wave and P-wave (cb¯)-quarkonium states together with the color-octet S-wave (cb¯)-quarkonium states can be generated. BEEC can also be adopted to generate the similar charmonium and bottomonium states via the semi-exclusive channels e++e-→|(QQ¯)[n]>+Q+Q¯ with Q=b and c, respectively. To increase the simulation efficiency, we simplify the amplitude to be as compact as possible by using improved trace technology. BEEC is a modular Fortran program written in a PYTHIA-compatible format; one may conveniently apply it to various situations or experimental environments by using GNU make. A method to improve the efficiency of generating unweighted events within the PYTHIA environment is proposed. Moreover, BEEC generates a standard Les Houches Event data file that contains useful information on the meson and its accompanying partons, which can be conveniently imported into PYTHIA for further hadronization and decay simulation. Catalogue identifier: AEQC_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQC_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in
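
    The abstract does not spell out its unweighting method; the standard accept-reject unweighting that such generators rely on looks like this (hypothetical event records and weights):

      import random

      def unweight(events, w_max):
          """Von Neumann accept-reject: keep a weighted event with probability
          w / w_max, producing a sample of unit-weight events."""
          return [e for e, w in events if random.random() < w / w_max]

      # hypothetical weighted events as (event_record, weight) pairs
      weighted = [(i, random.uniform(0.0, 2.0)) for i in range(100000)]
      w_max = max(w for _, w in weighted)
      unweighted = unweight(weighted, w_max)
      print(len(unweighted) / len(weighted))  # efficiency ~ <w> / w_max

    The unweighting efficiency is the mean weight divided by the maximum weight, which is why flattening the weight distribution before this step pays off.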

  4. Simulated seismic event release fraction data: Progress report, April 1986-April 1987

    SciTech Connect

    Langer, G.; Deitesfeld, C.A.

    1987-11-15

    The objective of this project is to obtain experimental data on the release of airborne particles during seismic events involving plutonium handling facilities. In particular, cans containing plutonium oxide powder may be involved and some of the powder may become airborne. No release fraction data for such scenarios are available, and risk assessment calculations for such events have lacked specificity in describing the physical processes involved. This study has provided initial data based on wind tunnel tests simulating the impact of debris on simulated cans of plutonium oxide powder. The release fractions are orders of magnitude smaller than previously available estimates. 8 refs., 3 figs., 2 tabs.

  5. Coupled atmosphere-ocean-wave simulations of a storm event over the Gulf of Lion and Balearic Sea

    USGS Publications Warehouse

    Renault, Lionel; Chiggiato, Jacopo; Warner, John C.; Gomez, Marta; Vizoso, Guillermo; Tintore, Joaquin

    2012-01-01

    The coastal areas of the North-Western Mediterranean Sea are among the most challenging places for ocean forecasting. This region is exposed to severe storm events of short duration, during which significant air-sea interactions, strong winds and a large sea state can have catastrophic consequences in coastal areas. To investigate these air-sea interactions and the oceanic response to such events, we implemented the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System to simulate a severe storm that occurred in the Mediterranean Sea in May 2010. During this event, wind speed reached up to 25 m/s, inducing significant sea surface cooling (up to 2°C) over the Gulf of Lion (GoL) and along the storm track, and generating surface waves with a significant wave height of 6 m. It is shown that the event, associated with a cyclogenesis between the Balearic Islands and the GoL, is relatively well reproduced by the coupled system. A surface heat budget analysis showed that ocean vertical mixing was a major contributor to the cooling tendency along the storm track and in the GoL, where turbulent heat fluxes also played an important role. Sensitivity experiments on the ocean-atmosphere coupling suggested that the coupled system is sensitive to the momentum flux parameterization as well as to air-sea and air-wave coupling. Comparisons with available atmospheric and oceanic observations showed that the fully coupled system provides the most skillful simulation, illustrating the benefit of using a fully coupled ocean-atmosphere-wave model for the assessment of these storm events.

  6. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for the complex machine ensembles that will be required for future in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under given constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be modeled hierarchically as a global model that combines the operations of the individual submachines, which are represented in the global model as local models. A local model, from the perspective of DEDS theory, is described by the following: a set of system and transition states, an event alphabet that portrays the actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for each event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models so that they can operate in parallel under the additional logistic and physical constraints due to submachine interactions; it is constructed from the states, events, event functions, and timing requirements of the local models, as sketched below. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or a feedback DEDS controller (closed-loop control).
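
    The five-part local model described above maps directly onto a small timed-automaton structure; the sketch below uses an illustrative submachine whose names are not from the paper:

      from dataclasses import dataclass

      @dataclass
      class LocalModel:
          """A DEDS local model: states, an event alphabet, an initial state,
          a partial transition function, and per-event durations."""
          states: set
          alphabet: set
          initial: str
          delta: dict       # (state, event) -> next state (partial function)
          duration: dict    # event -> time required for the event to occur

          def step(self, state, event):
              """Fire `event` in `state`; returns (next_state, elapsed_time)."""
              if (state, event) not in self.delta:
                  raise ValueError(f"event {event!r} not enabled in {state!r}")
              return self.delta[(state, event)], self.duration[event]

      # toy submachine: a robot arm that can grasp and release
      arm = LocalModel(states={'idle', 'holding'},
                       alphabet={'grasp', 'release'},
                       initial='idle',
                       delta={('idle', 'grasp'): 'holding',
                              ('holding', 'release'): 'idle'},
                       duration={'grasp': 2.0, 'release': 1.0})
      print(arm.step('idle', 'grasp'))   # -> ('holding', 2.0)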

  7. Discrete Event Simulation Models for CT Examination Queuing in West China Hospital

    PubMed Central

    Luo, Li; Liu, Hangjiang; Liao, Huchang; Tang, Shijun; Shi, Yingkang; Guo, Huili

    2016-01-01

    In CT examination, the emergency patients (EPs) have the highest priority in the queuing system, so the general patients (GPs) have to wait a long time, which leads to low satisfaction among patients as a whole. The aim of this study is to improve patient satisfaction by designing new queuing strategies for CT examination. We divide the EPs into an urgent type and an emergency type and then design two queuing strategies: in one, the urgent patients (UPs) wedge into the GPs' queue at a fixed interval (fixed priority model); in the other, patients have dynamic priorities for queuing (dynamic priority model). Based on data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies and compare the performance of the strategies on the basis of the simulation results. The dynamic priority strategy decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting time and increases their satisfaction. PMID:27547237
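
    The paper's models are calibrated to RID data and are not reproduced here; the toy sketch below shows the dynamic-priority idea, where a waiting patient's effective priority grows with waiting time so GPs cannot be starved by urgent arrivals (all class names and parameters hypothetical):

      import random

      BASE = {'UP': 100.0, 'GP': 0.0}    # hypothetical base priorities

      def ct_queue(arrivals, scan_time=10.0, aging_rate=1.0):
          """Dynamic-priority queue: effective priority = class base priority
          + waiting time * aging_rate, so general patients age upward and
          cannot wait indefinitely behind urgent patients."""
          queue = sorted(arrivals)       # (arrival_time, patient_class)
          t, i, pending, waits = 0.0, 0, [], []
          while i < len(queue) or pending:
              if not pending:            # idle scanner: jump to next arrival
                  t = max(t, queue[i][0])
              while i < len(queue) and queue[i][0] <= t:
                  pending.append(queue[i])
                  i += 1
              # serve the waiting patient with the highest effective priority
              best = max(pending,
                         key=lambda p: BASE[p[1]] + aging_rate * (t - p[0]))
              pending.remove(best)
              waits.append((best[1], t - best[0]))
              t += scan_time             # scanner busy for one examination
          return waits

      random.seed(1)
      arr = ([(random.uniform(0, 480), 'GP') for _ in range(40)]
             + [(random.uniform(0, 480), 'UP') for _ in range(10)])
      mean_gp = sum(w for c, w in ct_queue(arr) if c == 'GP') / 40
      print(f"mean GP wait: {mean_gp:.1f} min")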

  10. Simulating Heinrich events in a coupled atmosphere-ocean-ice sheet model

    NASA Astrophysics Data System (ADS)

    Mikolajewicz, Uwe; Ziemen, Florian

    2016-04-01

    Heinrich events are among the most prominent events of long-term climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet-climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under discussion, and their climatic consequences are far from being fully understood. We contribute to answering the open questions by studying Heinrich events in a framework coupling an ice sheet model (ISM) to an atmosphere-ocean-vegetation general circulation model (AOVGCM), in which this variability occurs as part of the model-generated internal variability, without the need to prescribe external perturbations as was the standard approach in almost all model studies so far. The setup consists of a northern-hemisphere configuration of the modified Parallel Ice Sheet Model (mPISM) coupled to the global coarse-resolution AOVGCM ECHAM5/MPIOM/LPJ. The simulations analysed here form an ensemble covering substantial parts of the late Glacial, forced with transient insolation and prescribed atmospheric greenhouse gas concentrations. The modeled Heinrich events show a marked influence of the ice discharge on the Atlantic circulation and heat transport, but none of the Heinrich events during the Glacial showed a complete collapse of the North Atlantic meridional overturning circulation. The main simulated consequences of the Heinrich events are a freshening and cooling over the North Atlantic and a drying over northern Europe.

  11. Decentralised consensus for multiple Lagrangian systems based on event-triggered strategy

    NASA Astrophysics Data System (ADS)

    Liu, Xiangdong; Du, Changkun; Lu, Pingli; Yang, Dapeng

    2016-06-01

    This paper considers the decentralised event-triggered consensus problem for multi-agent systems with Lagrangian dynamics under undirected graphs. First, a distributed, leaderless, event-triggered consensus control algorithm is presented based on the definition of generalised positions and velocities for all agents; a single triggering function covers both the generalised positions and velocities, and no Zeno behaviour is exhibited under the proposed consensus strategy. Second, an adaptive event-triggered consensus control algorithm is proposed for such multi-agent systems with unknown constant parameters. Third, based on the sliding-mode method, an event-triggered consensus control algorithm is considered for the case with external disturbance. Finally, simulation results are given to illustrate the theoretical results.

  12. Designing power system simulators for the smart grid: combining controls, communications, and electro-mechanical dynamics

    SciTech Connect

    Nutaro, James J

    2011-01-01

    Open source software has a leading role in research on simulation technology for electrical power systems. Research simulators demonstrate new features for which there is nascent but growing demand not yet provided for by commercial simulators. Of particular interest is the inclusion of models of software-intensive and communication-intensive controls in simulations of power system transients. This paper describes two features of the ORNL power system simulator that help it meet this need. First is its use of discrete event simulation for all aspects of the model: control, communication, and electro-mechanical dynamics. Second is an interoperability interface that enables the ORNL power system simulator to be integrated with existing, discrete event simulators of digital communication systems. The paper concludes with a brief discussion of how these aspects of the ORNL power system simulator might be inserted into production-grade simulation tools.
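
    One way continuous electro-mechanical dynamics can be put on a discrete-event footing is quantized-state integration, where the integrator emits an event whenever the state has moved by one quantum. The first-order sketch below illustrates that general idea; it is not the ORNL simulator's actual integrator:

      def qss1(f, x0, quantum, t_end):
          """First-order quantized-state integration: instead of time stepping,
          schedule an event whenever the state has drifted one quantum, which
          makes the integrator a discrete-event component.  Accuracy is on the
          order of the quantum."""
          t, x = 0.0, x0
          q = x                      # quantized state seen by the rest of the model
          trajectory = [(t, x)]
          while t < t_end:
              dx = f(q)
              if dx == 0.0:
                  break              # no further events
              dt = quantum / abs(dx) # time until x drifts one quantum from q
              t, x = t + dt, x + dx * dt
              q = x                  # event: publish the new quantized level
              trajectory.append((t, x))
          return trajectory

      # dx/dt = -x, whose exact solution is exp(-t)
      for t, x in qss1(lambda x: -x, 1.0, 0.05, 3.0)[::10]:
          print(f"t={t:.2f}  x={x:.3f}")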

  13. Model for the evolution of the time profile in optimistic parallel discrete event simulations

    NASA Astrophysics Data System (ADS)

    Ziganurova, L.; Novotny, M. A.; Shchur, L. N.

    2016-02-01

    We investigate synchronisation aspects of an optimistic algorithm for parallel discrete event simulations (PDES). We present a model for the time evolution in optimistic PDES. This model evaluates the local virtual time profile of the processing elements. We argue that the evolution of the time profile is reminiscent of the surface profile in the directed percolation problem and in unrestricted surface growth. We present results of the simulation of the model and emphasise predictive features of our approach.
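
    A toy version of such a virtual-time-profile model (an illustration in the spirit of the paper, not its exact update rule) tracks only the local virtual times and rollbacks:

      import random

      def lvt_profile(n_pe, n_steps, p_msg=0.5, seed=0):
          """Toy optimistic-PDES model: each step one processing element (PE)
          advances its local virtual time (LVT) by a random increment; with
          probability p_msg it sends a message stamped with its new LVT to a
          random neighbour, which rolls back if the stamp is in its past."""
          random.seed(seed)
          lvt = [0.0] * n_pe
          rollbacks = 0
          for _ in range(n_steps):
              i = random.randrange(n_pe)
              lvt[i] += random.expovariate(1.0)
              if random.random() < p_msg:
                  j = (i + random.choice((-1, 1))) % n_pe
                  if lvt[j] > lvt[i]:       # straggler message: j rolls back
                      lvt[j] = lvt[i]
                      rollbacks += 1
          width = max(lvt) - min(lvt)        # roughness of the LVT profile
          return width, rollbacks

      print(lvt_profile(64, 100000))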

  14. A verilog simulation of the CDF DAQ system

    SciTech Connect

    Schurecht, K.; Harris, R.; Sinervo, P.; Grindley, R.

    1991-11-01

    A behavioral simulation of the CDF data acquisition system was written in the Verilog modeling language in order to investigate the effects of various improvements to the existing system. This system is modeled as five separate components that communicate with each other via Fastbus interrupt messages. One component of the system, the CDF event builder, is modeled in substantially greater detail due to its complex structure. This simulation has been verified by comparing its performance with that of the existing DAQ system. Possible improvements to the existing systems were studied using the simulation, and the optimal upgrade path for the system was chosen on the basis of these studies. The overall throughput of the modified system is estimated to be double that of the existing setup. Details of this modeling effort will be discussed, including a comparison of the modeled and actual performance of the existing system.

  15. Simulation System Fidelity Assessment at the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Beard, Steven D.; Reardon, Scott E.; Tobias, Eric L.; Aponso, Bimal L.

    2013-01-01

    Fidelity is a word that is often used but rarely understood when talking about ground-based simulation. Assessing the cueing fidelity of a ground-based flight simulator requires a comparison to actual flight data, either directly or indirectly. Two experiments were conducted at the Vertical Motion Simulator using the GenHel UH-60A Black Hawk helicopter math model, which was directly compared to flight data. Prior to the experiments the simulator's motion and visual system frequency responses were measured, the aircraft math model was adjusted to account for the simulator motion system delays, and the motion system gains and washouts were tuned for the individual tasks. The tuned motion system fidelity was then assessed against the modified Sinacori criteria. The first experiment showed handling qualities ratings (HQRs) similar to actual flight for bob-up and sidestep maneuvers. The second experiment showed equivalent HQRs between flight and simulation for the ADS-33 slalom maneuver for the two pilot participants. The ADS-33 vertical maneuver HQRs were mixed, with one pilot rating the flight and simulation the same while the second pilot rated the simulation worse. In addition to recording HQRs in the second experiment, an experimental Simulation Fidelity Rating (SFR) scale developed by the University of Liverpool was tested for applicability to engineering simulators. A discussion of the SFR scale for use on the Vertical Motion Simulator is included in this paper.

  16. Simulation of hailstorm event using Mesoscale Model MM5 with modified cloud microphysics scheme

    NASA Astrophysics Data System (ADS)

    Chatterjee, P.; Pradhan, D.; de, U. K.

    2008-11-01

    The mesoscale model MM5 (Version 3.5), with some modifications to the cloud microphysics scheme of Schultz (1995), has been used to simulate two hailstorm events over the Gangetic Plain of West Bengal, India. The first event occurred on 12 March 2003, with hail covering four districts of West Bengal; the second struck Sriniketan (22.65° N, 87.7° E) on 10 April 2006 at 11:32 UT and lasted for 2-3 min. Both events can be simulated if the same modifications are introduced in the cloud microphysics scheme of Schultz; the original scheme cannot simulate any hail. The simulation results were compared with the corresponding products of the Doppler Weather Radar (DWR) located at Kolkata (22.57° N, 88.35° E). Model products such as reflectivity, graupel and horizontal wind are compared with the corresponding DWR products, and the simulated pattern of hail development agrees well with the DWR observations once the necessary modifications are introduced in the model. The model output of 24-h accumulated rain from 03:00 UT to 03:00 UT the next day has also been compared with the corresponding product of the TRMM satellite.

  17. Effects of a simulated agricultural runoff event on sediment toxicity in a managed backwater wetland

    Technology Transfer Automated Retrieval System (TEKTRAN)

    permethrin (both cis and trans isomers), on 10-day sediment toxicity to Hyalella azteca in a managed natural backwater wetland after a simulated agricultural runoff event. Sediment samples were collected at 10, 40, 100, 300, and 500 m from inflow 13 days prior to amendment and 1, 5, 12, 22, and 36 ...

  18. Explicit spatial scattering for load balancing in conservatively synchronized parallel discrete-event simulations

    SciTech Connect

    Thulasidasan, Sunil; Kasiviswanathan, Shiva; Eidenbenz, Stephan; Romero, Philip

    2010-01-01

    We re-examine the problem of load balancing in conservatively synchronized parallel, discrete-event simulations executed on high-performance computing clusters, focusing on simulations where computational and messaging load tend to be spatially clustered. Such domains are frequently characterized by the presence of geographic 'hot-spots' - regions that generate significantly more simulation events than others. Examples of such domains include simulation of urban regions, transportation networks and networks where interaction between entities is often constrained by physical proximity. Noting that in conservatively synchronized parallel simulations the speed of execution is determined by the slowest (i.e., most heavily loaded) simulation process, we study different partitioning strategies for achieving equitable processor-load distribution in domains with spatially clustered load. In particular, we study the effectiveness of partitioning via spatial scattering to achieve optimal load balance. In this partitioning technique, nearby entities are explicitly assigned to different processors, thereby scattering the load across the cluster. This is motivated by two observations, namely, (i) since load is spatially clustered, spatial scattering should, intuitively, spread the load across the compute cluster, and (ii) in parallel simulations, equitable distribution of CPU load is a greater determinant of execution speed than message passing overhead. Through large-scale simulation experiments - both of abstracted and real simulation models - we observe that scatter partitioning, even with its greatly increased messaging overhead, significantly outperforms more conventional spatial partitioning techniques that seek to reduce messaging overhead. Further, even if hot-spots change over the course of the simulation, if the underlying feature of spatial clustering is retained, load continues to be balanced with spatial scattering leading us to the observation that
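
    The core of scatter partitioning is a few lines: order entities spatially, then deal them out round-robin so that clustered hot-spot entities land on different ranks (the entity records below are hypothetical):

      import random

      def scatter_partition(entities, n_procs):
          """Assign spatially nearby entities to *different* processors by
          sorting on location and dealing round-robin, the opposite of the
          usual contiguous-block spatial partition."""
          ordered = sorted(entities, key=lambda e: (e['x'], e['y']))
          return {e['id']: i % n_procs for i, e in enumerate(ordered)}

      # a hot-spot: 80% of entities clustered near the origin
      random.seed(2)
      ents = [{'id': k, 'x': random.gauss(0, 1 if k < 800 else 50),
               'y': random.gauss(0, 1 if k < 800 else 50)} for k in range(1000)]
      assign = scatter_partition(ents, 8)
      load = [sum(1 for k in range(800) if assign[k] == p) for p in range(8)]
      print("hot-spot entities per rank:", load)   # near-uniform, ~100 each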

  19. A systems model of phosphorylation for inflammatory signaling events.

    PubMed

    Sadreev, Ildar I; Chen, Michael Z Q; Welsh, Gavin I; Umezawa, Yoshinori; Kotov, Nikolay V; Valeyev, Najl V

    2014-01-01

    Phosphorylation is a fundamental biochemical reaction that modulates protein activity in cells. While a single phosphorylation event is relatively easy to understand, multisite phosphorylation requires systems approaches for deeper elucidation of the underlying molecular mechanisms. In this paper we develop a mechanistic model for single- and multi-site phosphorylation. The proposed model is compared with previously reported studies. We compare the predictions of our model with experiments published in the literature in the context of inflammatory signaling events in order to provide a mechanistic description of the multisite phosphorylation-mediated regulation of Signal Transducer and Activator of Transcription 3 (STAT3) and Interferon Regulatory Factor 5 (IRF-5) proteins. The presented model makes crucial predictions for transcription factor phosphorylation events in the immune system. The model proposes potential mechanisms for T cell phenotype switching and production of cytokines. This study also provides a generic framework for the better understanding of a large number of multisite phosphorylation-regulated biochemical circuits.
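
    A minimal mass-action model of sequential two-site phosphorylation, the kind of scheme the paper generalises, can be written as a small ODE system (rate constants are hypothetical; assumes SciPy is available):

      import numpy as np
      from scipy.integrate import odeint

      def two_site(y, t, k1, k2, p1, p2):
          """Sequential two-site phosphorylation S0 -> S1 -> S2 by a kinase
          (rates k1, k2) with phosphatase back-reactions (rates p1, p2)."""
          s0, s1, s2 = y
          return [p1 * s1 - k1 * s0,
                  k1 * s0 - p1 * s1 - k2 * s1 + p2 * s2,
                  k2 * s1 - p2 * s2]

      t = np.linspace(0.0, 10.0, 101)
      sol = odeint(two_site, [1.0, 0.0, 0.0], t, args=(2.0, 1.0, 0.5, 0.5))
      print("steady state (S0, S1, S2):", sol[-1].round(3))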

  20. Formal analysis of imprecise system requirements with Event-B.

    PubMed

    Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan

    2016-01-01

    Formal analysis of functional properties of system requirements needs precise descriptions. However, stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for the specification and verification of such requirements. First, we introduce a representation of imprecise requirements in set theory. Then we make use of Event-B refinement, providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of a crane controller. PMID:27398276

  1. Dust events in Arizona: Long-term satellite and surface observations, and the National Air Quality Forecasting Capability CMAQ simulations

    NASA Astrophysics Data System (ADS)

    Huang, M.; Tong, D.; Lee, P.; Pan, L.; Tang, Y.; Stajner, I.; Pierce, R. B.; McQueen, J.

    2015-12-01

    Dust records in Arizona during 2005-2013 are developed using multiple observation datasets, including the Level 2 Deep Blue aerosol product from the Moderate Resolution Imaging Spectroradiometer (MODIS) and in-situ measurements at the surface Air Quality System (AQS) and Interagency Monitoring of Protected Visual Environments (IMPROVE) sites in Phoenix. The satellite and surface aerosol observations were anti-correlated with three drought indicators (the MODIS vegetation index, a European satellite soil moisture dataset, and the Palmer Drought Severity Index). For the dusty year of 2007, we show that the dust events were stronger and more frequent in the afternoon hours than in the morning, due to faster winds and drier soil, and that the Sonoran and Chihuahuan deserts were important dust source regions during identified dust events in Phoenix, as indicated by calculations with NOAA's Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model. Based on these findings, we suggest a potential for using satellite soil moisture and vegetation index products to interpret and predict dust activity. We also emphasize the importance of hourly observations for better capturing dust events, and expect future hourly geostationary satellite observations to complement the current surface PM and meteorological observations given their broader spatial coverage. Additionally, the performance of the National Air Quality Forecasting Capability (NAQFC) 12 km CMAQ simulation is evaluated during a recent strong dust event in the western US accompanied by stratospheric ozone intrusion. The current modeling system captured the temporal variability and the magnitude of aerosol concentrations well during this event. Directions of integrating satellite weather and vegetation observations

  2. Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu

    2013-04-01

    Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Because of the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event; however, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high-water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated within the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.
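
    GLUE itself is a general recipe that can be sketched compactly. In the sketch below, a stand-in model replaces LISFLOOD-FP, the informal likelihood is a Nash-Sutcliffe efficiency, and the prediction bounds are unweighted ensemble percentiles for brevity (GLUE typically likelihood-weights them); all data and priors are hypothetical:

      import random

      def glue(model, observed, priors, n_samples=5000, threshold=0.5):
          """GLUE: Monte Carlo parameter sampling, an informal likelihood per
          parameter set, retention of 'behavioural' sets above a threshold,
          and prediction bounds from the retained ensemble."""
          mean_obs = sum(observed) / len(observed)
          var_obs = sum((o - mean_obs) ** 2 for o in observed)
          behavioural = []
          for _ in range(n_samples):
              theta = [random.uniform(lo, hi) for lo, hi in priors]
              sim = model(theta)
              sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
              ns = 1.0 - sse / var_obs        # Nash-Sutcliffe efficiency
              if ns >= threshold:
                  behavioural.append((ns, sim))
          # 5-95% prediction limits at each time step from behavioural runs
          bounds = []
          for k in range(len(observed)):
              vals = sorted(s[k] for _, s in behavioural)
              bounds.append((vals[int(0.05 * len(vals))],
                             vals[int(0.95 * len(vals))]))
          return bounds

      # hypothetical linear-reservoir stand-in for the hydrodynamic model
      obs = [3.0, 2.2, 1.7, 1.3, 1.0]
      model = lambda th: [th[0] * (th[1] ** k) for k in range(5)]
      print(glue(model, obs, priors=[(1.0, 5.0), (0.3, 0.95)]))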

  3. Simulation of the 18 April 2002 Sawtooth Event Using the Rice Convection Model

    NASA Astrophysics Data System (ADS)

    Yang, J.; Toffoletto, F.; Wolf, R.; Sazykin, S.; Brandt, P.; Henderson, M.

    2007-12-01

    We present simulation results of the 18 April 2002 sawtooth event using the Rice Convection Model (RCM) where we treat this event as a series of recurrent substorms with a period of 2-4 hours. The simulation uses the storm time magnetic field model as well as the time dependent plasma sheet model as inputs to RCM. During the substorm expansion phase, we use an empirical substorm current wedge model in order to dipolarize the magnetic field on the nightside and we deplete flux tube content on the outer boundary over a wide range of local time. The simulated energetic proton fluxes at the geosynchronous orbit show a well-defined global sawtooth pattern and the calculated ENA fluxes show that oxygen is enhanced more significantly after the substorm onset than hydrogen, which is consistent with the IMAGE/HENA observations.

  4. The waveform correlation event detection system global prototype software design

    SciTech Connect

    Beiriger, J.I.; Moore, S.G.; Trujillo, J.R.; Young, C.J.

    1997-12-01

    The WCEDS prototype software system was developed to investigate the usefulness of waveform correlation methods for CTBT monitoring. The WCEDS prototype performs global seismic event detection and has been used in numerous experiments. This report documents the software system design, presenting an overview of the system operation, describing the system functions, tracing the information flow through the system, discussing the software structures, and describing the subsystem services and interactions. The effectiveness of the software design in meeting project objectives is considered, as well as opportunities for code reuse and lessons learned from the development process. The report concludes with recommendations for modifications and additions envisioned for a regional waveform-correlation-based detector.
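
    The kernel operation of a waveform-correlation detector, sliding a template along a continuous trace and thresholding the normalised correlation, can be sketched as follows (synthetic data; WCEDS additionally stacks such correlations over a global grid of candidate event locations):

      import numpy as np

      def correlation_detector(trace, template, threshold=0.7):
          """Slide a waveform template along a continuous trace and report
          samples where the normalised cross-correlation exceeds a threshold."""
          n = len(template)
          tpl = (template - template.mean()) / (template.std() * n)
          detections = []
          for i in range(len(trace) - n + 1):
              win = trace[i:i + n]
              std = win.std()
              if std == 0.0:
                  continue
              cc = float(np.dot(tpl, (win - win.mean()) / std))
              if cc > threshold:
                  detections.append((i, cc))
          return detections

      rng = np.random.default_rng(0)
      tpl = np.sin(np.linspace(0, 6 * np.pi, 60)) * np.hanning(60)
      trace = rng.normal(0, 0.2, 2000)
      trace[700:760] += tpl                  # buried synthetic "event"
      print(correlation_detector(trace, tpl)[:3])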

  5. Mineral dust events over the Mediterranean: Multi-platform characterization and comparison with GEOS-Chem simulation

    NASA Astrophysics Data System (ADS)

    Bartlett, K. S.; Luo, G.; Yu, F.; Aerosol Microphysics Research Group

    2011-12-01

    The Mediterranean Basin (MB) has a population of 210 million, with approximately 50% living within the eastern half of the MB, where seasonal lows tracking southeastward across Italy transport climate-impacting, visibility- and air-quality-reducing mineral dust from North African deserts. Located 1000 km northeast of the African dust sources and 400 km southeast of Greek population centers, Crete is uniquely situated at a crossroads of the seasonal storm track and a mineral dust pathway, records some of the highest average AODs within the region, and is well instrumented to observe this trend. Investigating dust transport using NCEP reanalysis data, surface-based METAR, AERONET and SMPS measurements, and space-based MODIS, MISR and CALIPSO observations, coupled with nested GEOS-Chem-APM simulations, this study exploits sensor strengths while noting weaknesses to characterize seasonal trends and individual dust events. To verify seasonal trends, METAR, AERONET and NCEP data were collected over Crete from 2003-2010. This 8-year period showed that the majority of large aerosol events occurred during late winter and early spring, during the passage of transitional low-pressure systems. During 2009, newly released SMPS particle size distribution measurements were acquired from Finokalia. These in-situ measurements, coupled with the observation collective, were key to confirming dust events and to comparing the nested GEOS-Chem particle size distribution and AOD simulations, which showed some skill in capturing dust events as well as the seasonal dust peaks early in the year. In this study we exploited the strengths of the observation collective and the model runs to demonstrate seasonal North African dust transport across Crete. We also noted the limitations of surface-based point observations, limited to localized conditions of transport and fallout, where

  6. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models: the optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation together with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.

  7. Soil organic carbon loss and selective transportation under field simulated rainfall events.

    PubMed

    Nie, Xiaodong; Li, Zhongwu; Huang, Jinquan; Huang, Bin; Zhang, Yan; Ma, Wenming; Hu, Yanbiao; Zeng, Guangming

    2014-01-01

    Studying the lateral movement of soil organic carbon (SOC) during soil erosion can improve understanding of the global carbon budget. Simulated rainfall experiments on small field plots were conducted to investigate SOC lateral movement under different rainfall intensities and tillage practices. Two rainfall intensities (High intensity (HI) and Low intensity (LI)) and two tillage practices (No tillage (NT) and Conventional tillage (CT)) were maintained on three plots (2 m width × 5 m length): HI-NT, LI-NT and LI-CT. The rainfall lasted 60 minutes after runoff was generated, and the sediment yield and runoff volume were measured and sampled at 6-min intervals. The SOC concentrations of sediment and runoff, as well as the sediment particle size distribution, were measured. The results showed that most of the eroded organic carbon (OC) was lost in the form of sediment-bound organic carbon in all events. The amount of SOC lost in the LI-NT event was 12.76 times greater than that in the LI-CT event, whereas the amount in the HI-NT event was 3.25 times greater than that in the LI-NT event. These results suggest that conventional tillage, as well as lower rainfall intensity, can reduce the amount of SOC lost during short-term soil erosion. Meanwhile, the eroded sediment in all events was enriched in OC, and a higher enrichment ratio of OC (ERoc) in sediment was observed in the LI events than in the HI event, whereas similar ERoc curves were found in the LI-CT and LI-NT events. Furthermore, significant correlations between ERoc and the different sediment particle size fractions were observed only in the HI-NT event. This indicates that the enrichment of OC depends on the erosion process, and the specific enrichment mechanisms of different erosion processes should be studied in the future.

  8. Simulating the effects of hyperpycnal events on the stratigraphy of Poverty Shelf, New Zealand

    NASA Astrophysics Data System (ADS)

    Hutton, E. W.; Kettner, A. J.; Kubo, Y.; Gomez, B.; Syvitski, J. P.

    2007-12-01

    The hydrologic-transport model HydroTrend indicates that the suspended sediment discharge of the Waipaoa River, New Zealand increased from 2.3 to 15 Mt/y over the last 3000 years. Prior to the arrival of European colonists in the nineteenth century A.D., volcanic eruptions, natural fires and severe storms controlled erosion rates within the basin. Since then, clearing of much of the indigenous forest for sheep farming has caused the suspended sediment discharge of the Waipaoa to increase by 850%. HydroTrend simulations indicate the Waipaoa was not able to generate hyperpycnal discharges before the arrival of European colonists. However, because of deforestation in the headwaters, suspended sediment concentrations of the Waipaoa are now able to exceed 40 kg/m3 during large flood events. The river water during these events is dense enough for the mode of sediment delivery to change from a surface plume to a hyperpycnal plume. Although these hyperpycnal events are rare (recurrence intervals greater than 2 years), simulations suggest they carry approximately one fifth of the total sediment load. Observational data on hyperpycnal flows are scarce, as they often occur only during extreme weather events. Given the proper boundary conditions, these events have the potential to transport large amounts of sediment over the sheltered Poverty Bay shelf and into the deep ocean. For this study, we have used HydroTrend results as input to the basin-filling model sedflux (coupled with the hyperpycnal plume model sakura) to investigate the impact of these hyperpycnal events on the stratigraphy of the Poverty Bay shelf. We note that while some flood events generate hyperpycnal flows that are able to bypass the shelf, others are unable to ignite and deposit the bulk of their sediment on the shelf.
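
    The plume-type decision described above can be sketched as a simple density comparison: a sediment-laden freshwater plume turns hyperpycnal when its bulk density exceeds that of the receiving seawater, which for typical coastal water happens near the ~40 kg/m3 concentration cited in the abstract. The arithmetic below is a simplified illustration, not HydroTrend's internal formulation.

        SEAWATER_DENSITY = 1024.0    # kg/m^3, a typical coastal value (assumed)
        FRESHWATER_DENSITY = 1000.0
        SEDIMENT_DENSITY = 2650.0    # quartz grain density

        def plume_density(conc_kg_m3: float) -> float:
            # Volume-weighted bulk density of a sediment-laden freshwater plume.
            vol_frac = conc_kg_m3 / SEDIMENT_DENSITY
            return FRESHWATER_DENSITY * (1 - vol_frac) + SEDIMENT_DENSITY * vol_frac

        def plume_type(conc_kg_m3: float) -> str:
            return "hyperpycnal" if plume_density(conc_kg_m3) > SEAWATER_DENSITY else "surface"

        for c in (10.0, 40.0, 55.0):
            print(c, "kg/m3 ->", plume_type(c))   # 40 kg/m3 is just past the threshold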

  9. A computer aided treatment event recognition system in radiation therapy

    SciTech Connect

    Xia, Junyi; Mart, Christopher; Bayouth, John

    2014-01-15

    Purpose: To develop an automated system to safeguard radiation therapy treatments by analyzing electronic treatment records and reporting treatment events. Methods: CATERS (Computer Aided Treatment Event Recognition System) was developed to detect treatment events by retrieving and analyzing electronic treatment records. CATERS is designed to make the treatment monitoring process more efficient by automating the search of the electronic record for possible deviations from the physician's intention, such as logical inconsistencies as well as aberrant treatment parameters (e.g., beam energy, dose, table position, prescription change, treatment overrides, etc.). Over a 5 month period (July 2012–November 2012), physicists were assisted by the CATERS software in conducting normal weekly chart checks with the aims of (a) determining the relative frequency of particular events in the authors' clinic and (b) incorporating these checks into CATERS. During this study period, 491 patients were treated at the University of Iowa Hospitals and Clinics for a total of 7692 fractions. Results: All treatment records from the 5 month analysis period were evaluated using all the checks incorporated into CATERS after the training period. In total, 553 events were detected as exceptions, although none of them had a significant dosimetric impact on patient treatments. These events included every known event type that was discovered during the trial period. A frequency analysis of the events showed that the top three types of detected events were couch position overrides (3.2%), extra cone beam imaging (1.85%), and significant couch position deviations (1.31%). A significant couch deviation is defined as a treatment where the couch vertical exceeded two times the standard deviation of all couch verticals, or the couch lateral/longitudinal exceeded three times the standard deviation of all couch laterals and longitudinals. On average, the application takes about 1 s per patient when
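
    The couch-deviation rule quoted above lends itself to a compact check. The sketch below uses hypothetical field names (a real system would read couch coordinates from the electronic record), and since the abstract leaves the exact reference value ambiguous, deviation from the patient's own mean is assumed:

        import statistics

        def flag_couch_deviations(fractions):
            """fractions: list of dicts with 'vert', 'lat', 'lng' couch readings."""
            flags = []
            for axis, k in (("vert", 2.0), ("lat", 3.0), ("lng", 3.0)):
                vals = [f[axis] for f in fractions]
                mu, sd = statistics.mean(vals), statistics.pstdev(vals)
                for i, v in enumerate(vals):
                    if sd > 0 and abs(v - mu) > k * sd:
                        flags.append((i, axis, v))   # fraction index, axis, reading
            return flags

        tx = [{"vert": 12.1, "lat": 0.2, "lng": 45.0} for _ in range(20)]
        tx[7] = {"vert": 14.9, "lat": 0.2, "lng": 45.0}   # outlier fraction
        print(flag_couch_deviations(tx))                  # -> [(7, 'vert', 14.9)]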

  10. Exercise-Associated Collapse in Endurance Events: A Classification System.

    ERIC Educational Resources Information Center

    Roberts, William O.

    1989-01-01

    Describes a classification system devised for exercise-associated collapse in endurance events based on casualties observed at six Twin Cities Marathons. Major diagnostic criteria are body temperature and mental status. Management protocol includes fluid and fuel replacement, temperature correction, and leg cramp treatment. (Author/SM)

  11. Designing and Securing an Event Processing System for Smart Spaces

    ERIC Educational Resources Information Center

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  12. CASINO: A Small System Simulator

    ERIC Educational Resources Information Center

    Christensen, Borge

    1978-01-01

    This article is a tutorial on writing a simulator; the example used is a casino. The nontechnical, step-by-step approach is designed to enable even non-programmers to understand the design of such a simulation. (Author)

  13. Interannual and Intraseasonal oscillations and extreme events over South America simulated by HIGEM models.

    NASA Astrophysics Data System (ADS)

    Custodio, Maria; Ambrizzi, Tercio

    2014-05-01

    The climatic system has its fluctuations determined mainly by the complex fluxes between the ocean and atmosphere. These fluxes transport energy, momentum and tracers within and between system components, and they occur over a wide range of spatial and temporal scales. Because of this, according to Shaffrey et al. (2009), the development of high-resolution global models is indispensable to simulate the energy transfer to smaller scales and to capture the nonlinear interactions between a wide range of spatial and temporal scales and between the different components of the climatic system. There are strong reasons to increase the resolution of all the atmospheric and oceanic components of coupled climate models (CGCMs) and uncoupled atmospheric models (AGCMs). The South American (SA) climate is characterized by different precipitation regimes, and its variability is strongly influenced by large-scale phenomena on the interannual (El Niño-Southern Oscillation, ENSO) and intraseasonal (Madden-Julian Oscillation, MJO) timescales. Typically, AGCMs and CGCMs use low horizontal resolution and have difficulty representing these low-frequency variability phenomena. The goal of this work is to evaluate the performance of coupled and uncoupled versions of the High-Resolution Global Environmental Model, denoted NUGEM (~60 km), HiGEM (~90 km) and HadGEM (~135 km), and NUGAM (~60 km), HiGAM (~90 km) and HadGAM (~135 km), respectively, in capturing the signal of interannual and intraseasonal variability of precipitation and temperature over SA. In particular, we discuss the impact of sea surface temperature on the annual cycle of atmospheric variables. The simulations were compared with precipitation data from the Climate Prediction Center - Merged Analysis of Precipitation (CMAP) and with temperature data from ERA-Interim, both for the period 1979 to 2008. The precipitation and temperature time series were filtered on the interannual (period > 365 days) and intraseasonal (30
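
    A sketch of the kind of time-scale separation described in the last sentence, assuming a standard Butterworth band-pass filter rather than the authors' (unspecified) filter: a daily series is split into an interannual (period > 365 days) band and an intraseasonal (30-90 day) band.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 1.0  # one sample per day

        def band(series, low_period, high_period=None, order=4):
            nyq = 0.5 * fs
            if high_period is None:            # low-pass: keep periods > low_period
                b, a = butter(order, (1.0 / low_period) / nyq, btype="low")
            else:                              # band-pass between the two periods
                b, a = butter(order, [(1.0 / high_period) / nyq,
                                      (1.0 / low_period) / nyq], btype="band")
            return filtfilt(b, a, series)

        days = np.arange(30 * 365)
        x = (np.sin(2 * np.pi * days / 1400)        # ENSO-like interannual signal
             + 0.5 * np.sin(2 * np.pi * days / 45)  # MJO-like intraseasonal signal
             + 0.3 * np.random.randn(days.size))

        interannual = band(x, low_period=365.0)                     # period > 365 d
        intraseasonal = band(x, low_period=30.0, high_period=90.0)  # 30-90 d band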

  14. DDS: The Dental Diagnostic Simulation System.

    ERIC Educational Resources Information Center

    Tira, Daniel E.

    The Dental Diagnostic Simulation (DDS) System provides an alternative to simulation systems which represent diagnostic case studies of relatively limited scope. It may be used to generate simulated case studies in all of the dental specialty areas with case materials progressing through the gamut of the diagnostic process. The generation of a…

  15. Recurrence time statistics of landslide events simulated by a cellular automaton model

    NASA Astrophysics Data System (ADS)

    Piegari, Ester; Di Maio, Rosa; Avella, Adolfo

    2014-05-01

    The recurrence time statistics of a cellular automaton modelling landslide events are analyzed by performing a numerical analysis in the parameter space and estimating Fano factor behaviors. The model is an extended version of the OFC model, which is a paradigm for SOC in non-conserved systems, but it works differently from the original OFC model in that a finite value of the driving rate is applied. By driving the system to instability at different rates, the model exhibits a smooth transition from a correlated to an uncorrelated regime as the predominant mechanism for propagating instability changes. If the rate at which instability is approached is small, chain processes dominate the landslide dynamics, and power laws govern the probability distributions. However, the power-law regime typical of SOC-like systems is found in a range of return intervals that becomes shorter and shorter as the driving rate increases. Indeed, if the rates at which instability is approached are large, domino processes are no longer active in propagating instability, and large events simply occur because a large number of cells reach instability simultaneously. Such a gradual loss of effectiveness of the chain propagation mechanism causes the system to enter gradually an uncorrelated regime, where recurrence time distributions are characterized by Weibull behaviors. Simulation results are qualitatively compared with those from a recent analysis performed by Witt et al. (Earth Surf. Process. Landforms, 35, 1138, 2010) for the first complete databases of landslide occurrences over a period as long as fifty years. From the comparison with the extensive landslide data set, the numerical analysis suggests that the statistics of such landslide data seem to be described by a crossover region between a correlated regime and an uncorrelated regime, where recurrence time distributions are characterized by power-law and Weibull behaviors for short and long return times
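
    The Fano-factor diagnostic mentioned above is simple to compute: F(T) = var(N_T)/mean(N_T) for event counts N_T in windows of length T, with F close to 1 at all scales for a Poisson (uncorrelated) process and departures from 1 indicating correlation or clustering. A minimal sketch on synthetic event times:

        import numpy as np

        def fano_factor(event_times, window):
            edges = np.arange(0.0, event_times.max() + window, window)
            counts, _ = np.histogram(event_times, bins=edges)
            return counts.var() / counts.mean()

        rng = np.random.default_rng(0)
        poisson_times = np.cumsum(rng.exponential(1.0, 5000))  # uncorrelated reference
        for T in (1.0, 10.0, 100.0):
            print(T, fano_factor(poisson_times, T))            # stays near 1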

  16. Space shuttle visual simulation system design study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A recommendation and a specification for the visual simulation system design for the space shuttle mission simulator are presented. A recommended visual system is described which most nearly meets the visual design requirements. The cost analysis of the recommended system covering design, development, manufacturing, and installation is reported. Four alternate systems are analyzed.

  17. Modelling and real-time simulation of continuous-discrete systems in mechatronics

    SciTech Connect

    Lindow, H.

    1996-12-31

    This work presents a methodology for the simulation and modelling of systems with continuous-discrete dynamics. It derives hybrid discrete event models from Lagrange's equations of motion. The method combines continuous mechanical, electrical and thermodynamical submodels on one hand with discrete event models on the other into a hybrid discrete event model. This straightforward software development avoids numeric overhead.
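
    A generic sketch of the continuous-discrete coupling this abstract describes, illustrative only and not the paper's method: a mass under gravity (for which Lagrange's equations reduce to constant acceleration) is integrated continuously, while impacts are handled as discrete events that instantaneously reset the state. A real hybrid discrete event simulator would locate each impact instant exactly rather than polling every step.

        def simulate_bouncing_mass(h0=1.0, e=0.8, g=9.81, dt=1e-4, t_end=3.0):
            t, h, v = 0.0, h0, 0.0
            impacts = []
            while t < t_end:
                h += v * dt          # continuous phase: h' = v
                v += -g * dt         #                    v' = -g
                t += dt
                if h <= 0.0:         # discrete event: impact detected
                    h = 0.0
                    v = -e * v       # restitution law applied instantaneously
                    impacts.append(t)
            return impacts

        print(simulate_bouncing_mass()[:5])   # first few impact times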

  18. The Template of Events for Applied and Critical Healthcare Simulation (TEACH Sim): a tool for systematic simulation scenario design.

    PubMed

    Benishek, Lauren E; Lazzara, Elizabeth H; Gaught, William L; Arcaro, Lygia L; Okuda, Yasuharu; Salas, Eduardo

    2015-02-01

    Simulation-based training (SBT) affords practice opportunities for improving the quality of clinicians' technical and nontechnical skills. However, the development of practice scenarios is a process plagued by a set of challenges that must be addressed for the full learning potential of SBT to be realized. Scenario templates are useful tools for assisting with SBT and navigating its inherent challenges. This article describes existing SBT templates, explores considerations in choosing an appropriate template, and introduces the Template of Events for Applied and Critical Healthcare Simulation (TEACH Sim) as a tool for facilitating the formation of practice scenarios in accordance with an established evidence-based simulation design methodology. TEACH Sim's unique contributions are situated within the landscape of previously existing templates, and each of its component sections is explained in detail.

  19. Simulation of Anomalous Regional Climate Events with a Variable Resolution Stretched Grid GCM

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.

    1999-01-01

    The stretched-grid approach provides efficient downscaling and consistent interactions between global and regional scales by using a single variable-resolution model for the integrations. It is a workable alternative to the widely used nested-grid approach, introduced over a decade ago as a pioneering step in regional climate modeling. A variable-resolution General Circulation Model (GCM) employing a stretched grid, with enhanced resolution over the US as the area of interest, is used for simulating two anomalous regional climate events, the US summer drought of 1988 and the flood of 1993. A special mode of integration using a stretched-grid GCM and data assimilation system is developed that imitates the nested-grid framework; the mode is useful for intercomparison purposes and for underlining the differences between the two approaches. The 1988 and 1993 integrations are performed for the two-month period starting in mid May. The regional resolution used in most of the experiments is 60 km. The major goal and result of the study is efficient downscaling over the area of interest. The monthly mean prognostic regional fields for the stretched-grid integrations are remarkably close to those of the verifying analyses. Simulated precipitation patterns are successfully verified against gauge precipitation observations. The impact of a finer 40 km regional resolution is investigated for the 1993 integration, and an example of recovering subregional precipitation is presented. The results show that the global variable-resolution stretched-grid approach is a viable candidate for regional and subregional climate studies and applications.

  20. Using Discrete Event Computer Simulation to Improve Patient Flow in a Ghanaian Acute Care Hospital

    PubMed Central

    Best, Allyson M.; Dixon, Cinnamon A.; Kelton, W. David; Lindsell, Christopher J.

    2014-01-01

    Objectives Crowding and limited resources have increased the strain on acute care facilities and emergency departments (EDs) worldwide. These problems are particularly prevalent in developing countries. Discrete event simulation (DES) is a computer-based tool that can be used to estimate how changes to complex healthcare delivery systems, such as EDs, will affect operational performance. Using this modality, our objective was to identify operational interventions that could potentially improve the patient throughput of one acute care setting in a developing country. Methods We developed a simulation model of acute care at a district-level hospital in Ghana to test the effects of resource-neutral (e.g., modified staff start times and roles) and resource-additional (e.g., increased staff) operational interventions on patient throughput. Previously captured, de-identified time-and-motion data from 487 acute care patients were used to develop and test the model. The primary outcome was the modeled effect of interventions on patient length of stay (LOS). Results The base-case (no change) scenario had a mean LOS of 292 minutes (95% CI 291, 293). In isolation, adding staff, changing staff roles, or varying shift times had only modest effects on overall patient LOS; for example, adding two registration workers, history takers, and physicians resulted in a 23.8 (95% CI 22.3, 25.3) minute LOS decrease. However, when shift start times were coordinated with patient arrival patterns, potential mean LOS decreased by 96 minutes (95% CI 94, 98); and with the simultaneous combination of staff roles (registration and history-taking), there was an overall mean LOS reduction of 152 minutes (95% CI 150, 154). Conclusions Resource-neutral interventions identified through DES modeling have the potential to improve acute care throughput in this Ghanaian municipal hospital. DES offers another approach to identifying potentially effective interventions to improve patient flow in emergency and acute
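
    A toy model in the spirit of the study, assuming the SimPy discrete-event library; all rates and staffing numbers below are illustrative stand-ins, not the paper's data. Patients pass through registration and then a clinician, and mean LOS is compared across staffing scenarios:

        import random
        import simpy

        def patient(env, registration, clinician, los):
            arrive = env.now
            with registration.request() as req:
                yield req
                yield env.timeout(random.expovariate(1 / 10))  # ~10 min registration
            with clinician.request() as req:
                yield req
                yield env.timeout(random.expovariate(1 / 25))  # ~25 min consultation
            los.append(env.now - arrive)

        def run(n_reg, n_clin, n_patients=487):
            env = simpy.Environment()
            registration = simpy.Resource(env, capacity=n_reg)
            clinician = simpy.Resource(env, capacity=n_clin)
            los = []

            def arrivals():
                for _ in range(n_patients):
                    env.process(patient(env, registration, clinician, los))
                    yield env.timeout(random.expovariate(1 / 12))  # ~12 min headway

            env.process(arrivals())
            env.run()
            return sum(los) / len(los)

        print("base-case mean LOS (min):", round(run(n_reg=1, n_clin=2)))
        print("added-staff mean LOS (min):", round(run(n_reg=2, n_clin=3)))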

  1. Numerical Simulation and Analysis of the Localized Heavy Precipitation Event in South Korea based on diagnostic variables

    NASA Astrophysics Data System (ADS)

    Roh, Joon-Woo; Choi, Young-Jean

    2016-04-01

    Accurate prediction of precipitation is one of the most difficult and significant tasks in weather forecasting. Heavy precipitation events over the Korean Peninsula arise from various physical mechanisms, including short-wave troughs, quasi-stationary moisture convergence zones among varying air masses, and the direct/indirect effects of tropical cyclones. Many previous studies have used observations, numerical modeling, and statistics to investigate the potential causes of warm-season heavy precipitation in South Korea. Notably, the frequency of warm-season torrential rainfall events exceeding 30 mm/h has increased threefold in Seoul, a metropolitan city in South Korea, over the past 30 years. Localized heavy rainfall events in South Korea generally arise from mesoscale convective systems embedded in synoptic-scale disturbances along the Changma front, or from convective instabilities resulting from unstable air masses. In order to investigate a localized heavy precipitation system in the Seoul metropolitan area, analysis and a numerical experiment were performed for a typical event on 20 June 2014. This case featured a baroclinic-instability structure associated with a short-wave trough from the northwest and moist, warm air supplied by a thermal low from the southwest of the Korean Peninsula. We investigated the localized heavy precipitation in a narrow zone of the Seoul urban area using convective-scale numerical simulations based on the Weather Research and Forecasting (WRF) model. Revised U.S. Geological Survey (USGS) topography and land-use data and an appropriate set of physics scheme options were chosen for the WRF simulations. The simulation experiments showed patches of the primary physical structures related to the localized heavy precipitation in the diagnostic fields, namely storm-relative helicity (SRH), updraft helicity (UH), and instantaneous contraction rates (ICON). SRH and UH are dominantly related to
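
    Of the diagnostics named above, SRH is the most self-contained to illustrate: it is commonly evaluated as a discrete layer sum over the 0-3 km storm-relative hodograph. The wind profile and storm motion below are invented for illustration.

        import numpy as np

        def srh(u, v, storm_u, storm_v):
            """u, v: wind components (m/s) at levels from the surface up to ~3 km."""
            ur, vr = np.asarray(u) - storm_u, np.asarray(v) - storm_v
            # layer-by-layer sum of (u_{k+1} v_k - u_k v_{k+1}) in the storm frame
            return float(np.sum(ur[1:] * vr[:-1] - ur[:-1] * vr[1:]))

        u = [2.0, 6.0, 10.0, 13.0, 15.0]   # m/s, illustrative hodograph
        v = [1.0, 4.0, 6.0, 7.0, 7.5]
        print(srh(u, v, storm_u=8.0, storm_v=2.0), "m2/s2")   # -> 43.5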

  2. Numerical simulations of the jetted tidal disruption event Swift J1644+57

    NASA Astrophysics Data System (ADS)

    Mimica, Petar; Aloy, Miguel A.; Giannios, Dimitrios; Metzger, Brian D.

    2016-05-01

    In this work we focus on the technical details of the numerical simulations of the non-thermal transient Swift J1644+57, whose emission is probably produced by a two-component jet powered by a tidal disruption event. In this context we provide details of the coupling between the relativistic hydrodynamic simulations and the radiative transfer code. First, we consider the technical demands of one-dimensional simulations of a fast relativistic jet, and show to what extent (for the same physical parameters of the model) the computed light curves depend on the numerical parameters of the different codes employed. In the second part we explain the difficulties of computing light curves from axisymmetric two-dimensional simulations and discuss a procedure that yields an acceptable tradeoff between the computational cost and the quality of the results.

  3. Atomistic-scale simulations of the initial chemical events in triacetonetriperoxide (TATP) detonation

    NASA Astrophysics Data System (ADS)

    van Duin, Adri; Zeiri, Yehuda; Goddard, William

    2005-03-01

    To study the initial chemical events related to the detonation of triacetonetriperoxide (TATP) we have performed a series of molecular dynamics (MD) simulations using the ReaxFF reactive force field [1,2], extended to reproduce the quantum-chemical (QM)-derived relative energies of the reactants, products, intermediates and transition states related to TATP unimolecular decomposition. We find excellent agreement between the reaction products predicted from QM and those observed in ReaxFF unimolecular cookoff simulations. Furthermore, the primary reaction products observed in the unimolecular cookoff simulations match closely with those observed in a TATP condensed-phase cookoff simulation, indicating that unimolecular decomposition dominates TATP condensed-phase initiation. [1] A.C.T. van Duin, S. Dasgupta, F. Lorant and W.A. Goddard (2001), J. Phys. Chem. A 105, 9396-9409. [2] A. Strachan, A.C.T. van Duin, D. Chakraborty, S. Dasgupta and W.A. Goddard III (2003), Phys. Rev. Lett. 91, 098301.

  4. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability of a system's design, operating procedures, and control software to system accidents. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates static stresses in flow systems. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations; the distribution of static stresses depends on the specific history of operations performed on a system. We discuss the use of this type of information in the hazard analysis of system designs.

  5. Evaluating the aerosol indirect effect in WRF-Chem simulations of the January 2013 Beijing air pollution event.

    NASA Astrophysics Data System (ADS)

    Peckham, Steven; Grell, Georg; Xie, Ying; Wu, Jian-Bin

    2015-04-01

    In January 2013, an unusual weather pattern over Northern China produced unusually cool, moist conditions for the region. Recent peer-reviewed scientific manuscripts report that during this time period Beijing experienced a historically severe haze and smog event, with observed monthly average fine particulate matter (PM2.5) concentrations exceeding 225 micrograms per cubic meter. MODIS satellite observations produced AOD values of approximately 1.5 to 2 for the same time. In addition, over eastern and northern China record-breaking hourly average PM2.5 concentrations of more than 700 μg m-3 were observed. Clearly, the severity and persistence of this air pollution episode have raised the interest of the scientific community as well as widespread public attention. Despite the significance of this and similar air pollution events, several questions regarding the ability of numerical weather prediction models to forecast such events remain, among them:
    • What is the importance of including aerosols in the weather prediction models?
    • What is the current capability of weather prediction models to simulate aerosol impacts upon the weather?
    • How important is it to include the aerosol feedbacks (direct and indirect effect) in the numerical model forecasts?
    In an attempt to address these and other questions, a Joint Working Group of the Commission for Atmospheric Sciences and the World Climate Research Programme has been convened. This Working Group on Numerical Experimentation (WGNE) has set aside several events of interest and has asked its members to generate numerical simulations of the events and examine the results. As part of this project, weather and pollution simulations were produced at the NOAA Earth System Research Laboratory using the Weather Research and Forecasting (WRF) chemistry model. These particular simulations include the aerosol indirect effect and are being done in collaboration with a group in China that will produce

  6. Detection and Attribution of Simulated Climatic Extreme Events and Impacts: High Sensitivity to Bias Correction

    NASA Astrophysics Data System (ADS)

    Sippel, S.; Otto, F. E. L.; Forkel, M.; Allen, M. R.; Guillod, B. P.; Heimann, M.; Reichstein, M.; Seneviratne, S. I.; Kirsten, T.; Mahecha, M. D.

    2015-12-01

    Understanding, quantifying and attributing the impacts of climatic extreme events and variability is crucial for societal adaptation in a changing climate. However, climate model simulations generated for this purpose typically exhibit pronounced biases in their output that hinder any straightforward assessment of impacts. To overcome this issue, various bias correction strategies are routinely used to alleviate climate model deficiencies, most of which have been criticized for physical inconsistency and non-preservation of the multivariate correlation structure. We assess how biases and their correction affect the quantification and attribution of simulated extremes and variability in (i) climatological variables and (ii) impacts on ecosystem functioning as simulated by a terrestrial biosphere model. Our study demonstrates that assessments of simulated climatic extreme events and impacts in the terrestrial biosphere are highly sensitive to bias correction schemes, with major implications for the detection and attribution of these events. We introduce a novel ensemble-based resampling scheme, based on a large regional climate model ensemble generated by the distributed weather@home setup [1], which fully preserves the physical consistency and multivariate correlation structure of the model output. We use extreme value statistics to show that this procedure considerably improves the representation of climatic extremes and variability. Subsequently, biosphere-atmosphere carbon fluxes are simulated using a terrestrial ecosystem model (LPJ-GSI) to further demonstrate the sensitivity of ecosystem impacts to the methodology of bias-correcting climate model output. We find that uncertainties arising from bias correction schemes are comparable in magnitude to model structural and parameter uncertainties. The present study constitutes a first attempt to alleviate climate model biases in a physically consistent way and demonstrates that this yields improved simulations of
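
    For context, below is a minimal empirical quantile-mapping sketch of the kind of univariate bias correction critiqued above; as the abstract notes, such a correction adjusts each variable's marginal distribution but cannot preserve the multivariate correlation structure.

        import numpy as np

        def quantile_map(model_hist, obs, model_new):
            """Map model values onto the observed distribution via empirical CDFs."""
            ranks = np.searchsorted(np.sort(model_hist), model_new, side="right")
            q = np.clip(ranks / float(len(model_hist)), 0.0, 1.0)
            return np.quantile(obs, q)

        rng = np.random.default_rng(1)
        obs = rng.gamma(2.0, 2.0, 3000)          # "observed" daily values
        model = rng.gamma(2.0, 3.0, 3000) + 1.0  # biased model climate
        corrected = quantile_map(model, obs, model)
        print(model.mean(), obs.mean(), corrected.mean())  # corrected ~ observed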

  7. Safety monitoring in the Vaccine Adverse Event Reporting System (VAERS)

    PubMed Central

    Shimabukuro, Tom T.; Nguyen, Michael; Martin, David; DeStefano, Frank

    2015-01-01

    The Centers for Disease Control and Prevention (CDC) and the U.S. Food and Drug Administration (FDA) conduct post-licensure vaccine safety monitoring using the Vaccine Adverse Event Reporting System (VAERS), a spontaneous (or passive) reporting system. This means that after a vaccine is approved, CDC and FDA continue to monitor safety while it is distributed in the marketplace, by collecting and analyzing spontaneous reports of adverse events that occur in persons following vaccination. Various methods and statistical techniques are used to analyze VAERS data, which CDC and FDA use to guide further safety evaluations and to inform decisions around vaccine recommendations and regulatory action. VAERS data must be interpreted with caution due to the inherent limitations of passive surveillance. VAERS is primarily a safety signal detection and hypothesis-generating system. Generally, VAERS data cannot be used to determine whether a vaccine caused an adverse event. VAERS data interpreted alone or out of context can lead to erroneous conclusions about cause and effect, as well as about the risk of adverse events occurring following vaccination. CDC makes VAERS data available to the public and readily accessible online. We describe fundamental vaccine safety concepts, provide an overview of VAERS for healthcare professionals who provide vaccinations and might want to report or better understand a vaccine adverse event, and explain how CDC and FDA analyze VAERS data. We also describe strengths and limitations, and address common misconceptions about VAERS. The information in this review will be helpful for healthcare professionals counseling patients, parents, and others on vaccine safety and the benefit-risk balance of vaccination. PMID:26209838

  8. Simulation system of airborne FLIR searcher

    NASA Astrophysics Data System (ADS)

    Sun, Kefeng; Li, Yu; Gao, Jiaobo; Wang, Jun; Wang, Jilong; Xie, Junhu; Ding, Na; Sun, Dandan

    2014-11-01

    The airborne forward-looking infrared (FLIR) searcher simulation system can provide a multi-mode simulated test environment that closely approximates the actual field environment, and can simulate the integrated performance and external interfaces of the airborne FLIR system. Furthermore, the airborne FLIR searcher simulation system can support the optimization of image processing algorithms, the test and evaluation of electro-optical systems, the in-line testing of software, and the performance evaluation of the avionics system. The detailed design structure and the information cross-linking relationships of each component are given in this paper. The simulation system is composed of the simulation center, the FLIR actuator, the FLIR emulator, and the display control terminal. The simulation center generates the simulated target and aircraft flight data corresponding to the operational state of the airborne FLIR searcher. The FLIR actuator provides the simulation scene: it generates the infrared target and terrain-based scanning scene and responds to commands from the simulation center and the operation control unit. The infrared image generated by the FLIR actuator is processed by the FLIR emulator, which uses a PowerPC hardware framework and processing software based on the VxWorks system. It can detect multiple targets and output DVI video together with the multi-target detection information corresponding to the working state of the FLIR searcher. The display control terminal displays the multi-target detection information in a two-dimensional situation format and provides the human-computer interaction function.

  9. SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeff S.

    1992-01-01

    Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.
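
    A schematic, single-process illustration of the Breathing Time Buckets cycle described above (not SPEEDES code): events are processed optimistically with their output messages held back, the earliest generated timestamp defines the event horizon, and only events at or before the horizon are committed; the rest are rolled back into the pending set.

        import heapq

        def breathing_time_buckets(pending, handler):
            """pending: heapified list of (timestamp, event); handler(t, ev) returns
            a list of new (timestamp, event) pairs. Returns committed events."""
            committed = []
            while pending:
                horizon = float("inf")   # earliest timestamp among generated events
                batch = []               # processed but uncommitted events
                # optimistic phase: process events with timestamps below the horizon
                while pending and pending[0][0] < horizon:
                    t, ev = heapq.heappop(pending)
                    out = handler(t, ev)           # output messages held back
                    batch.append((t, ev, out))
                    for t_new, _ in out:
                        horizon = min(horizon, t_new)
                # commit phase: release messages of events at/before the horizon;
                # later events are rolled back (their held-back output is discarded)
                for t, ev, out in batch:
                    if t <= horizon:
                        committed.append((t, ev))
                        for msg in out:
                            heapq.heappush(pending, msg)
                    else:
                        heapq.heappush(pending, (t, ev))
            return committed

        def handler(t, ev):
            return [(t + 1.0, ev + "'")] if t < 3.0 else []

        print(breathing_time_buckets([(0.0, "a"), (0.5, "b")], handler))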

  10. Developing clinical competency in crisis event management: an integrated simulation problem-based learning activity.

    PubMed

    Liaw, S Y; Chen, F G; Klainin, P; Brammer, J; O'Brien, A; Samarasekera, D D

    2010-08-01

    This study aimed to evaluate the integration of a simulation-based learning activity on nursing students' clinical crisis management performance in a problem-based learning (PBL) curriculum. It was hypothesized that the clinical performance of first-year nursing students who participated in a simulated learning activity during the PBL session would be superior to that of those who completed the conventional problem-based session. The students were allocated to either simulation with problem-based discussion (SPBD) or problem-based discussion (PBD) for scenarios on respiratory and cardiac distress. Following completion of each scenario, students from both groups were invited to sit an optional individual test involving a systematic assessment and immediate management of a simulated patient facing a crisis event. A total of thirty students participated in the first post-test, related to a respiratory scenario, and thirty-three participated in the second post-test, related to a cardiac scenario. Their clinical performances were scored using a checklist. Mean test scores for students completing the SPBD were significantly higher than those for students completing the PBD in both the first post-test (SPBD 20.08, PBD 18.19) and the second post-test (SPBD 27.56, PBD 23.07). Incorporating simulation learning activities into problem-based discussion appears to be an effective educational strategy for teaching nursing students to assess and manage crisis events.

  11. An Event-Driven Hybrid Molecular Dynamics and Direct Simulation Monte Carlo Algorithm

    SciTech Connect

    Donev, A; Garcia, A L; Alder, B J

    2007-07-30

    A novel algorithm is developed for the simulation of polymer chains suspended in a solvent. The polymers are represented as chains of hard spheres tethered by square wells, and they interact with the solvent particles through hard-core potentials. The algorithm uses event-driven molecular dynamics (MD) for the simulation of the polymer chain and the interactions between the chain beads and the surrounding solvent particles. The interactions between the solvent particles themselves are not treated deterministically as in event-driven algorithms; rather, the momentum and energy exchange in the solvent is determined stochastically using the Direct Simulation Monte Carlo (DSMC) method. The coupling between the solvent and the solute is consistently represented at the particle level; however, unlike full MD simulations of both the solvent and the solute, the spatial structure of the solvent is ignored. The algorithm is described in detail and applied to the study of the dynamics of a polymer chain tethered to a hard wall subjected to uniform shear. The algorithm closely reproduces full MD simulations with two orders of magnitude greater efficiency. Results do not confirm the existence of periodic (cycling) motion of the polymer chain.
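
    The core computation in event-driven hard-sphere MD of this kind is the exact collision time of a sphere pair, obtained from straightforward kinematics (textbook material, not the authors' code):

        import numpy as np

        def collision_time(r1, r2, v1, v2, sigma):
            """Time until spheres at r1, r2 with velocities v1, v2 touch at
            separation sigma, or None if they never collide."""
            dr = np.asarray(r2, float) - np.asarray(r1, float)
            dv = np.asarray(v2, float) - np.asarray(v1, float)
            b = float(dr @ dv)
            if b >= 0.0:                        # moving apart: no future collision
                return None
            dv2 = float(dv @ dv)
            disc = b * b - dv2 * (float(dr @ dr) - sigma ** 2)
            if disc < 0.0:                      # the spheres miss each other
                return None
            return (-b - np.sqrt(disc)) / dv2   # earlier root = moment of contact

        # Head-on approach: a gap of 2 closes at relative speed 2 -> t = 1.0
        print(collision_time([0, 0, 0], [3, 0, 0], [1, 0, 0], [-1, 0, 0], sigma=1.0))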

  12. Computer simulation of initial events in the biochemical mechanisms of DNA damage

    NASA Technical Reports Server (NTRS)

    Chatterjee, A.; Holley, W. R.

    1993-01-01

    Understanding the systematic and quantitative correlation between the physical events of energy deposition by ionizing radiation and the ensuing chemical and biochemical processes leading to DNA damage is one of the goals in radiation research. Significant progress has been made toward achieving the stated goal by using theoretical modeling techniques. These techniques are strongly dependent on computer simulation procedures. A review of such techniques with details of various stages of simulation development, including a comparison with available experimental data, is presented in this article.

  13. Computer simulation of initial events in the biochemical mechanisms of DNA damage.

    PubMed

    Chatterjee, A; Holley, W R

    1993-01-01

    Understanding the systematic and quantitative correlation between the physical events of energy deposition by ionizing radiation and the ensuing chemical and biochemical processes leading to DNA damage is one of the goals in radiation research. Significant progress has been made toward achieving the stated goal by using theoretical modeling techniques. These techniques are strongly dependent on computer simulation procedures. A review of such techniques with details of various stages of simulation development, including a comparison with available experimental data, is presented in this article. PMID:11537895

  14. Synthetic immunosurveillance systems: nanodevices to monitor physiological events.

    PubMed

    Woappi, Yvon L; Jangiti, Rahul; Singh, Om V

    2014-11-15

    The field of nanotechnology has recently seen vast advancements in its applications for therapeutic strategy. This technological revolution has led the way to nanomedicine, which has spurred the development of clever drug delivery designs and ingenious nanovehicles for the monitoring of cellular events in vivo. The clinical implementations of this technology are innumerable and have demonstrated utility as diagnostic tools and fortifying machinery for the mammalian immune system. Recently engineered viral vectors and multi-subunit packaging RNAs have proven stable enough for long-term existence in the physiological environment and therefore reveal unique potential as artificial immunosurveillance devices. Physiological and pathological events recorded by nanodevices could help develop "biocatalogs" of patients' infection history, frequency of disease, and much more. In this article, we introduce a novel design concept for a multilayer synthetic immune network parallel to the natural immune system: an artificial network of continuously patrolling nanodevices incorporated into the blood and lymphatic systems, adapted for molecular event recording, anomaly detection, drug delivery, and gene silencing. We also discuss the approaches and advances recently reported in nanomedicine, especially as they pertain to promising viral and RNA-based nanovehicles and their prospective applications for the development of a synthetic immunosurveillance system (SIS). Alternative suggestions and limitations of these technologies are also discussed.

  15. Validating numerical simulations of snow avalanches using dendrochronology: the Cerro Ventana event in Northern Patagonia, Argentina

    NASA Astrophysics Data System (ADS)

    Casteller, A.; Christen, M.; Villalba, R.; Martínez, H.; Stöckli, V.; Leiva, J. C.; Bartelt, P.

    2008-05-01

    The damage caused by snow avalanches to property and human lives is underestimated in many regions around the world, especially where this natural hazard remains poorly documented. One such region is the Argentinean Andes, where numerous settlements are threatened almost every winter by large snow avalanches. On 1 September 2002, the largest tragedy in the history of Argentinean mountaineering took place at Cerro Ventana, Northern Patagonia: nine persons were killed and seven others injured by a snow avalanche. In this paper, we combine numerical modeling and dendrochronological investigations to reconstruct this event. Using information released by local governmental authorities and compiled in the field, the avalanche event was numerically simulated using the avalanche dynamics programs AVAL-1D and RAMMS. Avalanche characteristics, such as extent and date, were determined using dendrochronological techniques. Model simulation results were compared with documentary and tree-ring evidence for the 2002 event. Our results show good agreement between the simulated projection of the avalanche and its extent reconstructed from tree-ring records. Differences between the observed and the simulated avalanche, principally related to the snow deposition height in the run-out zone, are mostly attributed to the low resolution of the digital elevation model used to represent the valley topography. The main contributions of this study are (1) to provide the first calibration of numerical avalanche models for the Patagonian Andes and (2) to highlight the potential of Nothofagus pumilio tree-ring records to reconstruct past snow-avalanche events in time and space. Future research should focus on testing this combined approach in other forested regions of the Andes.

  16. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

    The work performed by personnel in augmenting and improving the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, the data processing loads imposed on these configurations, and the executive software that controls system operations. Through these simulations, NASA has an extremely cost-effective capability for the design and analysis of computer-based data management systems.

  17. Selective Attention in Multi-Chip Address-Event Systems

    PubMed Central

    Bartolozzi, Chiara; Indiveri, Giacomo

    2009-01-01

    Selective attention is the strategy used by biological systems to cope with the inherent limits in their available computational resources, in order to efficiently process sensory information. The same strategy can be used in artificial systems that have to process vast amounts of sensory data with limited resources. In this paper we present a neuromorphic VLSI device, the “Selective Attention Chip” (SAC), which can be used to implement these models in multi-chip address-event systems. We also describe a real-time sensory-motor system, which integrates the SAC with a dynamic vision sensor and a robotic actuator. We present experimental results from each component in the system, and demonstrate how the complete system implements a real-time stimulus-driven selective attention model. PMID:22346689

  18. Wire chamber requirements and tracking simulation studies for tracking systems at the superconducting super collider

    SciTech Connect

    Hanson, G.G.; Niczyporuk, B.B.; Palounek, A.P.T.

    1989-02-01

    Limitations placed on wire chambers by radiation damage and rate requirements in the SSC environment are reviewed. Possible conceptual designs for wire chamber tracking systems which meet these requirements are discussed. Computer simulation studies of tracking in such systems are presented. Simulations of events from interesting physics at the SSC, including hits from minimum bias background events, are examined. Results of some preliminary pattern recognition studies are given. Such computer simulation studies are necessary to determine the feasibility of wire chamber tracking systems for complex events in a high-rate environment such as the SSC. 11 refs., 9 figs., 1 tab.

  19. Human visual system-based smoking event detection

    NASA Astrophysics Data System (ADS)

    Odetallah, Amjad D.; Agaian, Sos S.

    2012-06-01

    Human action (e.g., smoking, eating, and phoning) analysis is an important task in various application domains such as video surveillance, video retrieval, and human-computer interaction systems. Smoke detection is a crucial task in many video surveillance applications and could greatly raise the level of safety in urban areas, public parks, airplanes, hospitals, schools and elsewhere. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features change under different lighting and weather conditions. This paper presents a new scheme for a system that detects human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. The developed method is also capable of detecting small smoking events and uncertain actions with various cigarette sizes, colors, and shapes.

  20. Relativistic positioning systems: Numerical simulations

    NASA Astrophysics Data System (ADS)

    Puchades Colmenero, Neus

    The position of users located on the Earth's surface or near it may be found with classic positioning systems (CPS). Certain information broadcast by the satellites of global navigation systems, such as GPS and GALILEO, may be used for positioning. The CPS are based on the Newtonian formalism, although relativistic post-Newtonian corrections are applied when necessary. This thesis contributes to the development of a different positioning approach, which is fully relativistic from the beginning. In relativistic positioning systems (RPS), the space-time position of any user (ship, spacecraft, and so on) can be calculated with the help of four satellites, which broadcast their proper times by means of codified electromagnetic signals. In this thesis, we have simulated satellite 4-tuples of the GPS and GALILEO constellations. If a user receives the signals from four satellites simultaneously, the emission proper times read (after decoding) are the user's "emission coordinates". In order to find the user's "positioning coordinates" in an appropriate almost-inertial reference system, there are two possibilities: (a) the explicit relation between positioning and emission coordinates (broadcast by the satellites) is found analytically, or (b) numerical codes are designed to calculate the positioning coordinates from the emission ones. Method (a) is only viable in simple ideal cases, whereas (b) allows us to consider realistic situations. In this thesis, we have designed numerical codes with the essential aim of studying two appropriate RPS, which may be generalized. Sometimes, there are two real users placed in different positions who receive the same proper times from the same satellites; then we say that there is bifurcation, and additional data are needed to choose the real user position. In this thesis, bifurcation is studied in detail. We have analyzed in depth two RPS models; in both, it is considered that the satellites move in Schwarzschild's space

  1. A comparison of active adverse event surveillance systems worldwide.

    PubMed

    Huang, Yu-Lin; Moon, Jinhee; Segal, Jodi B

    2014-08-01

    Post-marketing drug surveillance for adverse drug events (ADEs) has typically relied on spontaneous reporting. Recently, regulatory agencies have turned their attention to more preemptive approaches that use existing data for surveillance. We conducted an environmental scan to identify active surveillance systems worldwide that use existing data for the detection of ADEs. We extracted data about the systems' structures, data, and functions, and synthesized the information across systems to identify their common features. We identified nine active surveillance systems. Two systems are US based: the FDA Sentinel Initiative (including both the Mini-Sentinel Initiative and the Federal Partner Collaboration) and the Vaccine Safety Datalink (VSD). Two are Canadian: the Canadian Network for Observational Drug Effect Studies (CNODES) and the Vaccine and Immunization Surveillance in Ontario (VISION). Two are European: the Exploring and Understanding Adverse Drug Reactions by Integrative Mining of Clinical Records and Biomedical Knowledge (EU-ADR) Alliance and the Vaccine Adverse Event Surveillance and Communication (VAESCO). Additionally, there is the Asian Pharmacoepidemiology Network (AsPEN) and the Shanghai Drug Monitoring and Evaluative System (SDMES). We identified two systems in the UK: the Vigilance and Risk Management of Medicines (VRMM) Division and the Drug Safety Research Unit (DSRU), an independent academic unit. These surveillance systems mostly use administrative claims or electronic medical records; most conduct pharmacovigilance on behalf of a regulatory agency. Either a common data model or a centralized model is used to access existing data. The systems have been built using national data alone or via partnership with other countries. However, active surveillance systems using existing data remain rare. North America and Europe have the most population coverage, with Asian countries making good advances. PMID:25022829

  2. Event-triggered nonlinear consensus in directed multi-agent systems with combinational state measurements

    NASA Astrophysics Data System (ADS)

    Li, Huaqing; Chen, Guo; Xiao, Li

    2016-10-01

    Event-triggered sampling control is motivated by applications in which agents are equipped with embedded microprocessors with limited computation and storage resources. This paper studies global consensus in multi-agent systems with inherent nonlinear dynamics on general directed networks using a decentralised event-triggered strategy. For each agent, the controller updates are event-based and are triggered only at its own event times, utilising only the locally available current sampling data. A high-performance triggering condition that only needs local neighbours' states at their own discrete time instants is presented. Furthermore, we introduce two kinds of general algebraic connectivity, for strongly connected networks and for strongly connected components of a directed network containing a spanning tree, to describe the system's ability to reach consensus. A detailed theoretical analysis of consensus is performed and two criteria are derived by virtue of algebraic graph theory, matrix theory and the Lyapunov control approach. It is shown that Zeno behaviour of the triggering time sequence is excluded during the system's whole working process. A numerical simulation is given to show the effectiveness of the theoretical results.
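
    A toy numerical sketch in the spirit of the abstract, with single-integrator (linear) dynamics standing in for the paper's general nonlinear ones: each agent broadcasts its state only when its measurement error exceeds a threshold proportional to the local neighbour disagreement. All parameters are illustrative.

        import numpy as np

        A = np.array([[0, 1, 0, 1],    # a_ij = 1: agent i hears agent j
                      [1, 0, 1, 0],    # (directed, strongly connected)
                      [0, 1, 0, 1],
                      [1, 0, 0, 0]], float)
        x = np.array([4.0, -2.0, 1.0, 7.0])   # initial states
        x_hat = x.copy()                      # states at each agent's last event
        dt, c, sigma = 0.01, 0.5, 0.5
        events = np.zeros(4, dtype=int)

        for _ in range(3000):
            z = A @ x_hat - A.sum(axis=1) * x_hat  # neighbour disagreement per agent
            x += dt * c * z                        # single-integrator dynamics
            trig = np.abs(x - x_hat) > sigma * np.abs(z)  # decentralised trigger
            x_hat[trig] = x[trig]                  # event: broadcast current state
            events += trig

        print("final states:", np.round(x, 3), "events per agent:", events)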

  3. Comparison of NO2 long-range transport events in GOME-2 observations and CTM simulations

    NASA Astrophysics Data System (ADS)

    Zien, A.; Hilboll, A.; Richter, A.; Burrows, J. P.

    2012-04-01

    Atmospheric long-range transport (LRT) events relocate trace gases from emission to downwind regions on an intercontinental scale, drastically altering the atmospheric chemistry of remote regions. Tropospheric NO2 is a very short-lived, mainly anthropogenic trace gas with a strong impact on ozone chemistry. Its emissions are very localized, which allows individual LRT events to be identified. Here, the phenomenon of NO2 LRT is investigated by satellite remote sensing observations and global chemical transport modelling, both of which provide good spatial and temporal coverage as well as sufficient resolution for the identification of large-scale, multi-day events. This allows seasonal, regional and global LRT statistics to be estimated from both model and observation. We use a non-cloud-filtered GOME-2 NO2 observational data set and model data from global GEOS-Chem simulations. A dedicated algorithm is used to identify and verify LRT events in the observational and model data. We present a comparison of these results concerning the occurrence of NO2 LRT events. We discuss seasonalities in the frequency and typical routes of LRT events and compare estimates of the transported mass from observations with results from the model.

  4. Real-Time Multimission Event Notification System for Mars Relay

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.

    2013-01-01

    As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued when the user is not online for later viewing. The software does not do away with the email notifications, but augments them with in-line notifications. Further, this software expands the events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" with other e-mail that comes in. This software allows for an expanded set (including user-generated) of notifications displayed in-line of the program. By separating notifications, this can improve a user's workflow.

  5. Power System Extreme Event Detection: The Vulnerability Frontier

    SciTech Connect

    Lesieutre, Bernard C.; Pinar, Ali; Roy, Sandip

    2007-10-17

    In this work we apply graph-theoretic tools to provide a close bound on a frontier relating the number of line outages in a grid to the power disrupted by the outages. This frontier describes the boundary of a space relating the possible severity of a disturbance, in terms of power disruption from zero to some maximum on the boundary, to the number of line outages involved in the event. We demonstrate the usefulness of this analysis with a complete analysis of a 30-bus system, and present results for larger systems.

  6. DeMO: An Ontology for Discrete-event Modeling and Simulation

    PubMed Central

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-01-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community. PMID:22919114

  7. State-space supervision of reconfigurable discrete event systems

    SciTech Connect

    Garcia, H.E.; Ray, A.

    1995-12-31

    The Discrete Event Systems (DES) theory of supervisory and state feedback control offers many advantages for implementing supervisory systems. Algorithmic concepts have been introduced to assure that the supervising algorithms are correct and meet the specifications. It is often assumed that the supervisory specifications are invariant, or at least remain so until a given supervisory task is completed. However, there are many practical applications where the supervisory specifications are updated in real time. For example, in a Reconfigurable Discrete Event System (RDES) architecture, a bank of supervisors is defined to accommodate each identified operational condition or different supervisory specifications. This adaptive supervisory control system changes the supervisory configuration to accept coordinating commands or to adjust for changes in the controlled process. This paper addresses reconfiguration at the supervisory level of hybrid systems along with an underlying RDES architecture. It reviews the state-based supervisory control theory and extends it to the RDES paradigm in view of process control applications. The paper addresses theoretical issues with a limited number of practical examples. This control approach is particularly suitable for hierarchical reconfigurable hybrid implementations.

  8. Can discrete event simulation be of use in modelling major depression?

    PubMed Central

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-01-01

    Background Depression is among the major contributors to worldwide disease burden and adequate modelling requires a framework designed to depict real world disease progression as well as its economic implications as closely as possible. Objectives In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. Methods We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. Results The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model; to do so would again require defining multiple health states, rendering the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitudes towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). Conclusion DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes.
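
    The DES mechanics the authors describe can be sketched with a time-stamped event queue in which each simulated patient carries an individual history that modulates future event rates; the event types, rates, and attributes below are invented for illustration:

      # Minimal patient-level DES sketch: relapse risk grows with history.
      import heapq, random

      random.seed(1)
      events = []  # min-heap of (time_days, patient_id, event_type)
      patients = {i: {"age": random.randint(20, 70), "episodes": 0}
                  for i in range(3)}
      for pid in patients:
          heapq.heappush(events, (random.expovariate(1 / 30.0), pid, "relapse"))

      t_end = 365.0
      while events:
          t, pid, kind = heapq.heappop(events)
          if t > t_end:
              break
          p = patients[pid]
          p["episodes"] += 1
          # The next relapse arrives sooner as the episode count grows.
          rate = (1 / 30.0) * (1 + 0.2 * p["episodes"])
          heapq.heappush(events, (t + random.expovariate(rate), pid, "relapse"))

      print({pid: p["episodes"] for pid, p in patients.items()})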

  9. Method for simulating discontinuous physical systems

    DOEpatents

    Baty, Roy S.; Vaughn, Mark R.

    2001-01-01

    The mathematical foundations of conventional numerical simulation of physical systems provide no consistent description of the behavior of such systems when subjected to discontinuous physical influences. As a result, the numerical simulation of such problems requires ad hoc encoding of specific experimental results in order to address the behavior of such discontinuous physical systems. In the present invention, these foundations are replaced by a new combination of generalized function theory and nonstandard analysis. The result is a class of new approaches to the numerical simulation of physical systems which allows the accurate and well-behaved simulation of discontinuous and other difficult physical systems, as well as simpler physical systems. Applications of this new class of numerical simulation techniques to process control, robotics, and apparatus design are outlined.

  10. Evaluating resilience of DNP3-controlled SCADA systems against event buffer flooding

    SciTech Connect

    Yan, Guanhua; Nicol, David M; Jin, Dong

    2010-12-16

    The DNP3 protocol is widely used in SCADA systems (particularly electrical power) as a means of communicating observed sensor state information back to a control center. Typical architectures using DNP3 have a two-level hierarchy, where a specialized data aggregator device receives observed state from devices within a local region, and the control center collects the aggregated state from the data aggregator. The DNP3 communication between control center and data aggregator is asynchronous with the DNP3 communication between data aggregator and relays; this leads to the possibility of completely filling a data aggregator's buffer of pending events when a relay is compromised or spoofed and sends an excessive number of (false) events to the data aggregator. This paper investigates how a real-world SCADA device responds to event buffer flooding. A Discrete-Time Markov Chain (DTMC) model is developed to understand this behavior. The DTMC model is validated by a Moebius simulation model and data collected on a real SCADA testbed.
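
    The buffer-flooding dynamic can be sketched as a simple discrete-time chain in which a flooding relay inserts events faster than the aggregator drains them; this illustrates the modeling idea only, not the paper's validated DTMC, and all parameters are invented:

      # Sketch: event-buffer occupancy under flooding, one step per tick.
      import random

      def simulate_buffer(p_in=0.9, p_out=0.5, capacity=100,
                          steps=10_000, seed=0):
          rng = random.Random(seed)
          level, overflows = 0, 0
          for _ in range(steps):
              if rng.random() < p_in:       # flooded relay offers an event
                  if level < capacity:
                      level += 1
                  else:
                      overflows += 1        # buffer full: event lost
              if level > 0 and rng.random() < p_out:
                  level -= 1                # aggregator drains one event
          return level, overflows

      print(simulate_buffer())  # (final occupancy, lost events)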

  11. DKIST Adaptive Optics System: Simulation Results

    NASA Astrophysics Data System (ADS)

    Marino, Jose; Schmidt, Dirk

    2016-05-01

    The 4 m class Daniel K. Inouye Solar Telescope (DKIST), currently under construction, will be equipped with an ultra high order solar adaptive optics (AO) system. The requirements and capabilities of such a solar AO system are beyond those of any other solar AO system currently in operation. We must rely on solar AO simulations to estimate and quantify its performance. We present performance estimation results of the DKIST AO system obtained with a new solar AO simulation tool. This simulation tool is a flexible and fast end-to-end solar AO simulator which produces accurate solar AO simulations while taking advantage of current multi-core computer technology. It relies on full imaging simulations of the extended field Shack-Hartmann wavefront sensor (WFS), which directly includes important secondary effects such as field dependent distortions and varying contrast of the WFS sub-aperture images.

  12. Exercise-Associated Collapse in Endurance Events: A Classification System.

    PubMed

    Roberts, W O

    1989-05-01

    In brief: Athletes who compete in endurance sports may sustain exercise-associated collapse (EAC) during or after an event. A classification system was devised for EAC that can be used by physicians who cover endurance events. Symptoms and signs of EAC include exhaustion, nausea, cramps, abnormally high or low core body temperature, muscle spasms, and inability to walk unassisted. The three classes of EAC are hyperthermic, normothermic, and hypothermic; each class is subclassified as mild, moderate, or severe. Treatment of warm runners includes applying ice bags wrapped with wet towels to the major areas of heat loss (neck, axilla, groin) to lower the core body temperature. Treatment of cool runners involves removing wet clothing, drying the skin, and insulating with wool blankets.

  13. Mutual Events in the Uranian satellite system in 2007

    NASA Astrophysics Data System (ADS)

    Arlot, J. E.

    2008-09-01

    The equinox time on the giant planets: when the Sun crosses the equatorial plane of a giant planet, it is the equinox time, occurring every half orbit of the planet, i.e. every 6 years for Jupiter, 14 years for Saturn, 42 years for Uranus and 82 years for Neptune. Except for Neptune, each planet has several major satellites orbiting in the equatorial plane; then, during the equinox time, the satellites will eclipse each other mutually. Since the Earth follows the Sun, during the equinox time a terrestrial observer will also see the satellites occulting each other during the same period. These events may be observed with photometric receivers since the light from the satellites decreases during the events. The light curve provides information on the geometric configuration of the satellites at the time of the event with an accuracy of a few kilometers, independent of the distance of the satellite system. We are thus able to get an astrometric observation with an accuracy several times better than using direct imaging for positions. Equinox on Uranus in 2007: in 2007 it was equinox time on Uranus. The Sun crossed the equatorial plane of Uranus on December 6, 2007. Since the opposition Uranus-Sun was at the end of August 2007, observations were performed from May to December 2007. Since the declination of Uranus was between -5 and -6 degrees, observations were best made from the southern hemisphere. However, some difficulties had to be solved: the faintness of the satellites (magnitude between 14 and 16) and the brightness of the planet (magnitude 5) made photometric observation of the satellites difficult. The use of a K' filter with a large telescope increases the number of observable events. Dynamics of the Uranian satellites: one of the goals of the observations was to evaluate the accuracy of the current dynamical models of the motion of the satellites. This knowledge is important for several reasons: most of the time the Uranian system is

  14. ERS-1 system simulation and calibration

    NASA Astrophysics Data System (ADS)

    Hans, P.; Braun, H. M.; Groebke, H.

    1984-08-01

    The ERS-1 radar systems cannot be fully tested on the ground, so comprehensive system simulators are proposed that take test results from system elements and simulate the entire system chain to derive the end-to-end measurement performance. After launch and stabilization of the spacecraft and the orbit, the in-flight calibration is performed by comparing ERS-1 measurements with ground-truth information and by tuning the target models, supported by simulations that identify whether an error is caused by the system or by a model uncertainty.

  15. Systemic chemokine levels, coronary heart disease, and ischemic stroke events

    PubMed Central

    Canouï-Poitrine, F.; Luc, G.; Mallat, Z.; Machez, E.; Bingham, A.; Ferrieres, J.; Ruidavets, J.-B.; Montaye, M.; Yarnell, J.; Haas, B.; Arveiler, D.; Morange, P.; Kee, F.; Evans, A.; Amouyel, P.; Ducimetiere, P.

    2011-01-01

    Objectives: To quantify the association between systemic levels of the chemokine regulated on activation normal T-cell expressed and secreted (RANTES/CCL5), interferon-γ-inducible protein-10 (IP-10/CXCL10), monocyte chemoattractant protein-1 (MCP-1/CCL2), and eotaxin-1 (CCL11) with future coronary heart disease (CHD) and ischemic stroke events and to assess their usefulness for CHD and ischemic stroke risk prediction in the PRIME Study. Methods: After 10 years of follow-up of 9,771 men, 2 nested case-control studies were built including 621 first CHD events and 1,242 matched controls and 95 first ischemic stroke events and 190 matched controls. Standardized hazard ratios (HRs) for each log-transformed chemokine were estimated by conditional logistic regression. Results: None of the 4 chemokines were independent predictors of CHD, either with respect to stable angina or to acute coronary syndrome. Conversely, RANTES (HR = 1.70; 95% confidence interval [CI] 1.05–2.74), IP-10 (HR = 1.53; 95% CI 1.06–2.20), and eotaxin-1 (HR = 1.59; 95% CI 1.02–2.46), but not MCP-1 (HR = 0.99; 95% CI 0.68–1.46), were associated with ischemic stroke independently of traditional cardiovascular risk factors, hs-CRP, and fibrinogen. When the first 3 chemokines were included in the same multivariate model, RANTES and IP-10 remained predictive of ischemic stroke. Their addition to a traditional risk factor model predicting ischemic stroke substantially improved the C-statistic from 0.6756 to 0.7425 (p = 0.004). Conclusions: In asymptomatic men, higher systemic levels of RANTES and IP-10 are independent predictors of ischemic stroke but not of CHD events. RANTES and IP-10 may improve the accuracy of ischemic stroke risk prediction over traditional risk factors. PMID:21849651

  16. Disaster triage systems for large-scale catastrophic events.

    PubMed

    Bostick, Nathan A; Subbarao, Italo; Burkle, Frederick M; Hsu, Edbert B; Armstrong, John H; James, James J

    2008-09-01

    Large-scale catastrophic events typically result in a scarcity of essential medical resources and accordingly necessitate the implementation of triage management policies to minimize preventable morbidity and mortality. Accomplishing this goal requires a reconceptualization of triage as a population-based systemic process that integrates care at all points of interaction between patients and the health care system. This system identifies at minimum 4 orders of contact: first order, the community; second order, prehospital; third order, facility; and fourth order, regional level. Adopting this approach will ensure that disaster response activities will occur in a comprehensive fashion that minimizes the patient care burden at each subsequent order of intervention and reduces the overall need to ration care. The seamless integration of all orders of intervention within this systems-based model of disaster-specific triage, coordinated through health emergency operations centers, can ensure that disaster response measures are undertaken in a manner that is effective, just, and equitable. PMID:18769264

  17. Integrating Existing Simulation Components into a Cohesive Simulation System

    NASA Technical Reports Server (NTRS)

    McLaughlin, Brian J.; Barrett, Larry K.

    2012-01-01

    A tradition of leveraging the re-use of components to help manage costs has evolved in the development of complex systems. This tradition continues in the Joint Polar Satellite System (JPSS) Program with the cloning of the Suomi National Polar-orbiting Partnership (NPP) satellite for the JPSS-1 mission, including the instrument complement. One benefit of re-use on a mission is the availability of existing simulation assets from the systems that were previously built. An issue arises in the continual shift of technology over a long mission, or multi-mission, lifecycle. As the missions mature, the requirements for the observatory simulations evolve. The challenge in this environment becomes re-using the existing components in that ever-changing landscape. To meet this challenge, the system must: establish an operational architecture that minimizes impacts on the implementation of individual components, consolidate the satisfaction of new high-impact requirements into system-level infrastructure, and build in a long-term view of system adaptation that spans the full lifecycle of the simulation system. The Flight Vehicle Test Suite (FVTS) within the JPSS Program is defining and executing this approach to ensure a robust simulation capability for the JPSS multi-mission environment.

  18. Sensitivity of a simulated extreme precipitation event to spatial resolution, parametrisations and assimilation

    NASA Astrophysics Data System (ADS)

    Ferreira, J.; Carvalho, A.; Carvalheiro, L.; Rocha, A.; Castanheira, J.

    2010-09-01

    The first part of this study evaluates the sensitivity of the model to horizontal resolution and physical parametrisations in the prediction of the selected precipitation extreme events. Additionally, two other sensitivity tests were performed with the OP1 configuration: one regarding the cumulus physics parametrisation, which was switched off (i.e., explicit calculation of convective eddies) to compare the results with the operational configuration, and the other with assimilation of surface and upper air data. Physical processes of the precipitation in this period have been revealed through the analysis of the precipitation fields associated with the microphysics and the cumulus parametrisations. During the early morning microphysics plays an important role, whereas late-morning precipitation is due to a squall line convective system. As expected, results show that model resolution affects the amount of predicted precipitation, while the parametrisations affect the location and time of the extreme precipitation. For this particular event, assimilation seems to degrade the simulation, particularly the maximum of precipitation.

  19. An Automated Visual Event Detection System for Cabled Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Mariette, J.

    2007-12-01

    The permanent presence of underwater cameras on oceanic cabled observatories, such as the Victoria Experimental Network Under the Sea (VENUS) and Eye-In-The-Sea (EITS) on the Monterey Accelerated Research System (MARS), will generate valuable data that can move forward the boundaries of understanding the underwater world. However, sightings of underwater animal activities are rare, resulting in the recording of many hours of video with relatively few events of interest. The burden of video management and analysis often requires reducing the amount of video recorded and later analyzed. Sometimes there are not enough human resources to analyze the video, and the strain that video analysis places on human attention demands an automated way to assist. Towards this end, an Automated Visual Event Detection System (AVED) is in development at the Monterey Bay Aquarium Research Institute (MBARI) to address the problem of analyzing cabled observatory video. Here we describe the overall design of the system to process video data and enable science users to analyze the results. We present our results analyzing video from the VENUS observatory and test data from EITS deployments. This automated system for detecting visual events includes a collection of custom and open source software that can be run three ways: through a Web Service, through a Condor-managed pool of AVED-enabled compute servers, or locally on a single computer. The collection of software also includes a graphical user interface to preview or edit detected results and to set up processing options. To optimize the compute-intensive AVED algorithms, a parallel program has been implemented for high-data-rate applications like the EITS instrument on MARS.

  20. Consistent simulations of multiple proxy responses to an abrupt climate change event

    PubMed Central

    LeGrande, A. N.; Schmidt, G. A.; Shindell, D. T.; Field, C. V.; Miller, R. L.; Koch, D. M.; Faluvegi, G.; Hoffmann, G.

    2006-01-01

    Isotope, aerosol, and methane records document an abrupt cooling event across the Northern Hemisphere at 8.2 kiloyears before present (kyr), while separate geologic lines of evidence document the catastrophic drainage of the glacial Lakes Agassiz and Ojibway into the Hudson Bay at approximately the same time. This melt water pulse may have been the catalyst for a decrease in North Atlantic Deep Water formation and subsequent cooling around the Northern Hemisphere. However, lack of direct evidence for ocean cooling has led to speculation that this abrupt event was purely local to Greenland and called into question this proposed mechanism. We simulate the response to this melt water pulse using a coupled general circulation model that explicitly tracks water isotopes and with atmosphere-only experiments that calculate changes in atmospheric aerosol deposition (specifically 10Be and dust) and wetland methane emissions. The simulations produce a short period of significantly diminished North Atlantic Deep Water and are able to quantitatively match paleoclimate observations, including the lack of isotopic signal in the North Atlantic. This direct comparison with multiple proxy records provides compelling evidence that changes in ocean circulation played a major role in this abrupt climate change event. PMID:16415159

  1. Consistent simulations of multiple proxy responses to an abrupt climate change event.

    PubMed

    LeGrande, A N; Schmidt, G A; Shindell, D T; Field, C V; Miller, R L; Koch, D M; Faluvegi, G; Hoffmann, G

    2006-01-24

    Isotope, aerosol, and methane records document an abrupt cooling event across the Northern Hemisphere at 8.2 kiloyears before present (kyr), while separate geologic lines of evidence document the catastrophic drainage of the glacial Lakes Agassiz and Ojibway into the Hudson Bay at approximately the same time. This melt water pulse may have been the catalyst for a decrease in North Atlantic Deep Water formation and subsequent cooling around the Northern Hemisphere. However, lack of direct evidence for ocean cooling has led to speculation that this abrupt event was purely local to Greenland and called into question this proposed mechanism. We simulate the response to this melt water pulse using a coupled general circulation model that explicitly tracks water isotopes and with atmosphere-only experiments that calculate changes in atmospheric aerosol deposition (specifically (10)Be and dust) and wetland methane emissions. The simulations produce a short period of significantly diminished North Atlantic Deep Water and are able to quantitatively match paleoclimate observations, including the lack of isotopic signal in the North Atlantic. This direct comparison with multiple proxy records provides compelling evidence that changes in ocean circulation played a major role in this abrupt climate change event. PMID:16415159

  2. System of systems engineering and risk management of extreme events: concepts and case study.

    PubMed

    Bristow, Michele; Fang, Liping; Hipel, Keith W

    2012-11-01

    The domain of risk analysis is expanded to consider strategic interactions among multiple participants in the management of extreme risk in a system of systems. These risks are fraught with complexity, ambiguity, and uncertainty, which pose challenges in how participants perceive, understand, and manage risk of extreme events. In the case of extreme events affecting a system of systems, cause-and-effect relationships among initiating events and losses may be difficult to ascertain due to interactions of multiple systems and participants (complexity). Moreover, selection of threats, hazards, and consequences on which to focus may be unclear or contentious to participants within multiple interacting systems (ambiguity). Finally, all types of risk, by definition, involve potential losses due to uncertain events (uncertainty). Therefore, risk analysis of extreme events affecting a system of systems should address complex, ambiguous, and uncertain aspects of extreme risk. To accomplish this, a system of systems engineering methodology for risk analysis is proposed as a general approach to address extreme risk in a system of systems. Our contribution is an integrative and adaptive systems methodology to analyze risk such that strategic interactions among multiple participants are considered. A practical application of the system of systems engineering methodology is demonstrated in part by a case study of a maritime infrastructure system of systems interface, namely, the Straits of Malacca and Singapore. PMID:22804565

  3. Did the Solar system form in a sequential triggered star formation event?

    NASA Astrophysics Data System (ADS)

    Parker, Richard J.; Dale, James E.

    2016-02-01

    The presence and abundance of the short-lived radioisotopes (SLRs) 26Al and 60Fe during the formation of the Solar system is difficult to explain unless the Sun formed in the vicinity of one or more massive star(s) that exploded as supernovae. Two different scenarios have been proposed to explain the delivery of SLRs to the protosolar nebula: (i) direct pollution of the protosolar disc by supernova ejecta, and (ii) the formation of the Sun in a sequential star formation event in which supernovae shockwaves trigger further star formation which is enriched in SLRs. The sequentially triggered model has been suggested as being more astrophysically likely than the direct pollution scenario. In this paper, we investigate this claim by analysing a combination of N-body and smoothed particle hydrodynamics simulations of star formation. First, we find that sequential star formation would result in large age spreads (or even bi-modal age distributions for spatially coincident events) due to the dynamical relaxation of the first star formation event(s). Second, we discuss the probability of triggering spatially and temporally discrete populations of stars and find this to be possible only in very contrived situations. Taken together, these results suggest that the formation of the Solar system in a triggered star formation event is as improbable as, if not more so than, the direct pollution of the protosolar disc by a supernova.

  4. Primary single event effect studies on Xilinx 28-nm System-on-Chip (SoC)

    NASA Astrophysics Data System (ADS)

    Zhang, Yao; Liu, Shuhuan; Du, Xuecheng; Yuan, Yuan; He, Chaohui; Ren, Xiaotang; Du, Xiaozhi; Li, Yonghong

    2016-09-01

    Single Event Effect (SEE) on a Xilinx 28-nm System-on-Chip (SoC) was investigated by both simulation and experiment in this study. In the simulation process, typical NAND gate and flip-flop structures in the SoC were designed using the Cadence tool. Various kinds of radiation were simulated as pulsed current sources, taking into consideration the multilayer wiring and the energy loss before reaching the sensitive area. The circuit modules were simulated as experiencing an SEE and malfunctioning when a pulsed current source was present. The changes in the circuit modules' output were observed when pulsed current signals were placed at different sensitive nodes or when the circuit operated under different conditions. The sensitive nodes in typical modules and the possible reasons for test program malfunction were the primary subjects of study. In the experimental process, the SoC chip was irradiated with α particles, protons and laser light, respectively. The irradiation test results showed that Single Event Upset (SEU) occurred in typical modules of the SoC, in accordance with the simulation results.

  5. Using simulation to evaluate warhead monitoring system effectiveness

    SciTech Connect

    Perkins, Casey J.; Brigantic, Robert T.; Keating, Douglas H.; Liles, Karina R.; Meyer, Nicholas J.; Oster, Matthew R.; Waterworth, Angela M.

    2015-07-12

    There is a need to develop and demonstrate technical approaches for verifying potential future agreements to limit and reduce total warhead stockpiles. To facilitate this aim, warhead monitoring systems employ both concepts of operations (CONOPS) and technologies. A systems evaluation approach can be used to assess the relative performance of CONOPS and technologies in their ability to achieve monitoring system objectives, which include: 1) confidence that a treaty accountable item (TAI) initialized by the monitoring system is as declared; 2) confidence that there is no undetected diversion from the monitoring system; and 3) confidence that a TAI is dismantled as declared. Although there are many quantitative methods that can be used to assess system performance for the above objectives, this paper focuses on a simulation perspective, primarily for the ability to support analysis of the probabilities that are used to define operating characteristics of CONOPS and technologies. This paper describes a discrete event simulation (DES) model composed of three major sub-models: TAI lifecycle flow, monitoring activities, and declaration behavior. The DES model seeks to capture all processes and decision points associated with the progression of virtual TAIs, with notional characteristics, through the monitoring system from initialization through dismantlement. The simulation updates TAI progression (i.e., whether the generated test objects are accepted or rejected at the appropriate points) all the way through dismantlement. Evaluation of TAI lifecycles primarily serves to assess how the order, frequency, and combination of functions in the CONOPS affect system performance as a whole. It is important, however, to note that discrete event simulation is also capable (at a basic level) of addressing vulnerabilities in the CONOPS and interdependencies between individual functions as well. This approach is beneficial because it does not rely on complex mathematical
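
    The TAI lifecycle flow sub-model can be caricatured as a staged random walk with detection checkpoints; the stages, probabilities, and outcome labels below are invented and do not reproduce the authors' model:

      # Toy sketch of a TAI lifecycle with a diversion-detection checkpoint.
      import random

      random.seed(7)
      STAGES = ["initialization", "storage", "transport", "dismantlement"]

      def run_tai(p_detect=0.95, p_divert=0.02):
          for stage in STAGES:
              if random.random() < p_divert:       # diversion attempted here
                  if random.random() < p_detect:
                      return stage, "diversion detected"
                  return stage, "undetected diversion"
          return "complete", "dismantled as declared"

      outcomes = [run_tai() for _ in range(10_000)]
      missed = sum(1 for _, o in outcomes if o == "undetected diversion")
      print(f"undetected diversions: {missed} / {len(outcomes)}")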

  6. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department.

    PubMed

    Kittipittayakorn, Cholada; Ying, Kuo-Ching

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes long patient waiting times in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows far more detail to be considered, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department. PMID:27195606
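
    The DES/ABS coupling can be sketched by letting an agent-level decision rule (here, walk-in patients balking at long queues) feed back into a discrete event queue; all rules and parameters below are invented and are far simpler than the authors' model:

      # Sketch: single-server clinic DES with an agent-based balking rule.
      import heapq, random

      random.seed(42)
      events = []  # (time_min, kind, patient_id)
      for pid in range(200):
          arrival = pid * 4.0 + random.expovariate(1 / 4.0)
          heapq.heappush(events, (arrival, "arrive", pid))

      queue, busy, served, balked = [], False, 0, 0
      while events:
          t, kind, pid = heapq.heappop(events)
          if kind == "arrive":
              walk_in = random.random() < 0.5
              if walk_in and len(queue) > 10:
                  balked += 1   # agent decision: leave if the queue is long
                  continue
              queue.append(pid)
          else:  # "done": service finished
              busy, served = False, served + 1
          if queue and not busy:
              busy = True
              nxt = queue.pop(0)
              heapq.heappush(events, (t + random.uniform(5, 15), "done", nxt))

      print(f"served={served}, balked={balked}")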

  7. Numerical simulation of a rare winter hailstorm event over Delhi, India on 17 January 2013

    NASA Astrophysics Data System (ADS)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-12-01

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. In the absence of increased surface temperature or low-level moisture incursion during winter, the deep convection required to sustain a hailstorm cannot develop. Consequently, NCR shows very few cases of hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00-18:00 UTC) occurring over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options: hail and graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR through a comparative analysis of the two GCE microphysics options. Upon evaluating the model simulations, it is observed that the hail option shows precipitation intensity more similar to the Tropical Rainfall Measuring Mission (TRMM) observation than the graupel option does, and it is able to simulate hail precipitation. Using the model-simulated output with the hail option, a detailed investigation of the dynamics of the hailstorm is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones as the cloud formation reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form, because low-level available potential energy and moisture incursion must coincide with upper-level baroclinic instability due to the presence of a western disturbance (WD). This rare configuration is found to lower the tropopause and increase the temperature gradient, leading to winter hailstorm formation.

  8. Computer simulation of engine systems

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1980-01-01

    The use of computerized simulations of the steady state and transient performance of jet engines throughout the flight regime is discussed. In addition, installation effects on thrust and specific fuel consumption are accounted for, as well as engine weight, dimensions, and cost. The availability throughout government and industry of analytical methods for calculating these quantities is pointed out.

  9. An investigation into pilot and system response to critical in-flight events, volume 2

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.

    1981-01-01

    Critical in-flight events are studied using mission simulation and written tests of pilot responses. Materials and procedures used in knowledge tests, written tests, and mission simulations are included.

  10. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention, as it can ensure safety and provide products with consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch-process fault diagnosis.

  11. Spatial and Temporal Signatures of Flux Transfer Events in Global Simulations of Magnetopause Dynamics

    NASA Technical Reports Server (NTRS)

    Kuznetsova, Maria M.; Sibeck, David Gary; Hesse, Michael; Berrios, David; Rastaetter, Lutz; Toth, Gabor; Gombosi, Tamas I.

    2011-01-01

    Flux transfer events (FTEs) were originally identified by transient bipolar variations of the magnetic field component normal to the nominal magnetopause centered on enhancements in the total magnetic field strength. Recent Cluster and THEMIS multi-point measurements provided a wide range of signatures that are interpreted as evidence for FTE passage (e.g., crater FTE's, traveling magnetic erosion regions). We use the global magnetohydrodynamic (MHD) code BATS-R-US developed at the University of Michigan to model the global three-dimensional structure and temporal evolution of FTEs during multi-spacecraft magnetopause crossing events. Comparison of observed and simulated signatures and sensitivity analysis of the results to the probe location will be presented. We will demonstrate a variety of observable signatures in the magnetic field profile that depend on the space probe location with respect to the FTE passage. The global structure of FTEs will be illustrated using advanced visualization tools developed at the Community Coordinated Modeling Center.

  12. Simulating single-event burnout of n-channel power MOSFET's

    SciTech Connect

    Johnson, G.H.; Hohl, J.H.; Schrimpf, R.D.; Galloway, K.F.

    1993-05-01

    Heavy ions are ubiquitous in a space environment. Single-event burnout of power MOSFET's is a sudden catastrophic failure mechanism that is initiated by the passage of a heavy ion through the device structure. The passage of the heavy ion generates a current filament that locally turns on a parasitic n-p-n transistor inherent to the power MOSFET. Subsequent high currents and high voltage in the device induce second breakdown of the parasitic bipolar transistor and hence meltdown of the device. This paper presents a model that can be used for simulating the burnout mechanism in order to gain insight into the significant device parameters that most influence the single-event burnout susceptibility of n-channel power MOSFET's.

  13. Control of discrete event systems modeled as hierarchical state machines

    NASA Technical Reports Server (NTRS)

    Brave, Y.; Heymann, M.

    1991-01-01

    The authors examine a class of discrete event systems (DESs) modeled as asynchronous hierarchical state machines (AHSMs). For this class of DESs, they provide an efficient method for testing reachability, which is an essential step in many control synthesis procedures. This method utilizes the asynchronous nature and hierarchical structure of AHSMs, thereby illustrating the advantage of the AHSM representation as compared with its equivalent (flat) state machine representation. An application of the method is presented where an online minimally restrictive solution is proposed for the problem of maintaining a controlled AHSM within prescribed legal bounds.

  14. Description of the grout system dynamic simulation

    SciTech Connect

    Zimmerman, B.D.

    1993-07-01

    The grout system dynamic computer simulation was created to allow investigation of the ability of the grouting system to meet established milestones for various assumed system configurations and parameters. The simulation models the movement of tank waste through the system versus time, from initial storage tanks, through feed tanks and the grout plant, and finally to a grout vault. The simulation properly accounts for: (1) the time required to perform various actions or processes, (2) delays involved in gaining regulatory approval, (3) random system component failures, (4) limitations on equipment capacities, (5) available parallel components, and (6) different possible strategies for vault filling. The user is allowed to set a variety of system parameters for each simulation run. Currently, the output of a run primarily consists of a plot of projected grouting campaigns completed versus time, for comparison with milestones. Other outputs involving any model component can also be quickly created or deleted as desired. In particular, sensitivity runs that examine the effect of varying a model parameter (flow rates, delay times, number of feed tanks available, etc.) on the ability of the system to meet milestones can be made easily. The grout system simulation was implemented using the ITHINK* simulation language for Macintosh** computers.

  15. Simulations of The Extreme Precipitation Event Enhanced by Sea Surface Temperature Anomaly over the Black Sea

    NASA Astrophysics Data System (ADS)

    Hakan Doǧan, Onur; Önol, Barış

    2016-04-01

    In this study, we examined an extreme precipitation case over the Eastern Black Sea region of Turkey by using the regional climate model RegCM4. The flood caused by excessive rain on August 26, 2010 killed 12 people, and the landslides in Rize province damaged many buildings. The station-based two-day total precipitation exceeded 200 mm. One of the usual suspects for this extreme event is a positive sea surface temperature (SST) anomaly over the Black Sea, where a significant warming trend is clear in the last three decades. In August 2010, the monthly mean SST was more than 3 °C above the 1981-2010 mean. We designed three sensitivity simulations with RegCM4 to define the effects of the Black Sea as a moisture source. The simulation domain, with 10-km horizontal resolution, covers all the countries bordering the Black Sea, and the simulation period covers the entire month of August 2010. It is also noted that the spatial variability of the precipitation produced by the reference simulation (Sim-0) is consistent with the TRMM data. To analyze the sensitivity to SST, we forced the simulations by subtracting 1 °C (Sim-1), 2 °C (Sim-2) and 3 °C (Sim-3) from the ERA-Interim 6-hourly SST data (considering only the Black Sea). The sensitivity simulations indicate that daily total precipitation gradually decreased relative to the reference simulation (Sim-0). The 3-hourly maximum precipitation rates for Sim-0, Sim-1, Sim-2 and Sim-3 are 32, 25, 13 and 10.5 mm, respectively, over the hotspot region. Although all the simulations point in the same direction, the degradation of precipitation intensity does not have the same magnitude in all simulations. It is revealed that the 2 °C threshold (Sim-2) is critical for SST sensitivity. We also calculated the humidity differences from the simulation and these

  16. Role of land state in a high resolution mesoscale model for simulating the Uttarakhand heavy rainfall event over India

    NASA Astrophysics Data System (ADS)

    Rajesh, P. V.; Pattnaik, S.; Rai, D.; Osuri, K. K.; Mohanty, U. C.; Tripathy, S.

    2016-04-01

    In 2013, the Indian summer monsoon witnessed a very heavy rainfall event (>30 cm/day) over Uttarakhand in north India, claiming more than 5000 lives and causing property damage worth approximately 40 billion USD. This event was associated with the interaction of two synoptic systems: an intensified subtropical westerly trough over north India and a north-westward moving monsoon depression formed over the Bay of Bengal. The event occurred over highly variable terrain and land surface characteristics. Although global models predicted the large-scale event, they failed to predict the realistic location, timing, amount, intensity and distribution of rainfall over the region. The goal of this study is to assess the impact of land state conditions in simulating this severe event using a high resolution mesoscale model. The land conditions, such as multi-layer soil moisture and soil temperature fields, were generated from the High Resolution Land Data Assimilation System (HRLDAS). Two experiments were conducted, namely (1) CNTL (control, without land data assimilation) and (2) LDAS, with land data assimilation (i.e., with HRLDAS-based soil moisture and temperature fields), using the Weather Research and Forecasting (WRF) modelling system. The initial soil moisture correlation and root mean square error for LDAS are 0.73 and 0.05, whereas for CNTL they are 0.63 and 0.053, respectively, with a stronger heat low in LDAS. The differences in wind and moisture transport in LDAS favoured increased moisture transport from the Arabian Sea through a convectively unstable region embedded between two low pressure centers over the Arabian Sea and the Bay of Bengal. The improvement in rainfall is significantly correlated with the persistent generation of potential vorticity (PV) in LDAS. Further, PV tendency analysis confirmed that the increased generation of PV is due to the enhanced horizontal PV advection component rather than the diabatic heating terms, owing to the modified flow fields. These results suggest that two

  17. Hydrocode simulation of the Chicxulub impact event and the production of climatically active gases

    NASA Astrophysics Data System (ADS)

    Pierazzo, Elisabetta; Kring, David A.; Melosh, H. Jay

    1998-12-01

    We constructed a numerical model of the Chicxulub impact event using the Chart-D Squared (CSQ) code coupled with the ANalytic Equation Of State (ANEOS) package. In the simulations we utilized a target stratigraphy based on borehole data and employed newly developed equations of state for the materials that are believed to play a crucial role in the impact-related extinction hypothesis: carbonates (calcite) and evaporites (anhydrite). Simulations explored the effects of different projectile sizes (10 to 30 km in diameter) and porosity (0 to 50%). The effect of impact speed is addressed by doing simulations of asteroid impacts (v_i = 20 km/s) and comet impacts (v_i = 50 km/s). The masses of climatically important species injected into the upper atmosphere by the impact increase with the energy of the impact event, ranging from 350 to 3500 Gt for CO2, from 40 to 560 Gt for S, and from 200 to 1400 Gt for water vapor. While our results are in good agreement with those of Ivanov et al. [1996], our estimated CO2 production is 1 to 2 orders of magnitude lower than the results of Takata and Ahrens [1994], indicating that the impact event enhanced the end-Cretaceous atmospheric CO2 inventory by, at most, 40%. Consequently, sulfur may have been the most important climatically active gas injected into the stratosphere. The amount of S released by the impact is several orders of magnitude higher than any known volcanic eruption and, with H2O, is high enough to produce a sudden and significant perturbation of Earth's climate.

  18. INTEGRATED SYSTEM SIMULATION IN X-RAY RADIOGRAPHY

    SciTech Connect

    T. KWAN; ET AL

    2001-01-01

    An integrated simulation capability is being developed to examine the fidelity of a dynamic radiographic system. This capability consists of a suite of simulation codes which individually model electromagnetic and particle transport phenomena and are chained together to model an entire radiographic event. Our study showed that the electron beam spot size at the converter target plays the key role in determining material edge locations. The angular spectrum is a relatively insensitive factor in radiographic fidelity. We also found that the full energy spectrum of the imaging photons must be modeled to obtain an accurate analysis of material densities.

  19. Global Positioning System Simulator Field Operational Procedures

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Quinn, David A.; Day, John H. (Technical Monitor)

    2002-01-01

    Global Positioning System (GPS) simulation is an important activity in the development or qualification of GPS signal receivers for space flight. Because a GPS simulator is a critical resource, it is highly desirable to develop a set of field operational procedures to supplement the basic procedures provided by most simulator vendors. Validated field procedures allow better utilization of the GPS simulator in the development of new test scenarios and simulation operations. These procedures expedite simulation scenario development while resulting in scenarios that are more representative of the true design, as well as enabling construction of more complex simulations than previously possible, for example, spacecraft maneuvers. One difficulty in the development of a simulation scenario is specifying various modes of test vehicle motion and associated maneuvers, which requires that a user specify some (but not all) of a few closely related simulation parameters. Currently this can only be done by trial and error. A stand-alone procedure that implements the simulator maneuver motion equations and solves for the motion profile transient times, jerk and acceleration would be of considerable value. Another procedure would permit the specification of some configuration parameters that determine the simulated GPS signal composition. The resulting signal navigation message, for example, would force the receiver under test to use only the intended C-code component of the simulated GPS signal. A representative class of GPS simulation-related field operational procedures is described in this paper. These procedures were developed and used in support of GPS integration and testing for many successful spacecraft missions such as SAC-A, EO-1, AMSAT, VCL, SeaStar, and sounding rockets, using the industry standard Spirent Global Simulation Systems Incorporated (GSSI) STR series simulators.
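
    The maneuver-profile helper called for above can be sketched with the standard jerk-limited (trapezoidal-acceleration) relations: ramping at jerk j up to acceleration a takes t_j = a/j, and the two ramps together gain velocity a*t_j. The function below solves the transient times for a commanded velocity change; it is a generic kinematic sketch under those assumptions, not tied to any particular simulator's motion model:

      # Sketch: transient times of a jerk-limited velocity-change maneuver.
      def motion_profile(dv, a_max, j_max):
          """Return (t_jerk, t_const_accel) for a velocity change dv."""
          t_j = a_max / j_max        # time to ramp acceleration up (or down)
          if a_max * t_j >= dv:
              # Acceleration limit never reached: triangular profile.
              return (dv / j_max) ** 0.5, 0.0
          t_a = (dv - a_max * t_j) / a_max  # constant-acceleration segment
          return t_j, t_a

      t_j, t_a = motion_profile(dv=100.0, a_max=5.0, j_max=2.5)
      print(f"jerk phase: {t_j:.2f} s, constant-accel phase: {t_a:.2f} s")
      # Total maneuver time for the full profile is 2*t_j + t_a.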

  1. Electronic notebook for physical system simulation

    SciTech Connect

    Kelsey, R. L.

    2003-01-01

    A scientist who sets up and runs experiments typically keeps notes of this process in a lab notebook. A scientist who runs computer simulations should be no different. Experiments and simulations both require a set-up process which should be documented along with the results of the experiment or simulation. The documentation is important for knowing and understanding what was attempted, what took place, and how to reproduce it in the future. Modern simulations of physical systems have become more complex due in part to larger computational resources and increased understanding of physical systems. These simulations may be performed by combining the results from multiple computer codes. The machines that these simulations are executed on are often massively parallel/distributed systems. The output result of one of these simulations can be a terabyte of data and can require months of computing. All of these things contribute to the difficulty of keeping a useful record of the process of setting up and executing a simulation for a physical system. An electronic notebook for physical system simulations has been designed to help document the set up and execution process. Much of the documenting is done automatically by the simulation rather than the scientist running the simulation. The simulation knows what codes, data, software libraries, and versions thereof it is drawing together. All of these pieces of information become documented in the electronic notebook. The electronic notebook is designed with and uses the eXtensible Markup Language (XML). XML facilitates the representation, storage, interchange, and further use of the documented information.

  2. An electronic notebook for physical system simulation

    NASA Astrophysics Data System (ADS)

    Kelsey, Robert L.

    2003-09-01

    A scientist who sets up and runs experiments typically keeps notes of this process in a lab notebook. A scientist who runs computer simulations should be no different. Experiments and simulations both require a set-up process which should be documented along with the results of the experiment or simulation. The documentation is important for knowing and understanding what was attempted, what took place, and how to reproduce it in the future. Modern simulations of physical systems have become more complex due in part to larger computational resources and increased understanding of physical systems. These simulations may be performed by combining the results from multiple computer codes. The machines that these simulations are executed on are often massively parallel/distributed systems. The output result of one of these simulations can be a terabyte of data and can require months of computing. All of these things contribute to the difficulty of keeping a useful record of the process of setting up and executing a simulation for a physical system. An electronic notebook for physical system simulations has been designed to help document the set up and execution process. Much of the documenting is done automatically by the simulation rather than the scientist running the simulation. The simulation knows what codes, data, software libraries, and versions thereof it is drawing together. All of these pieces of information become documented in the electronic notebook. The electronic notebook is designed with and uses the eXtensible Markup Language (XML). XML facilitates the representation, storage, interchange, and further use of the documented information.
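
    The self-documenting behavior described in these records can be sketched by having the simulation emit its own provenance as XML; the element names and captured fields below are invented for illustration:

      # Sketch: a simulation run auto-documents itself as an XML entry.
      import platform, sys
      import xml.etree.ElementTree as ET
      from datetime import datetime, timezone

      run = ET.Element("simulation-run")
      ET.SubElement(run, "timestamp").text = datetime.now(timezone.utc).isoformat()
      ET.SubElement(run, "python-version").text = sys.version.split()[0]
      ET.SubElement(run, "host").text = platform.node()

      codes = ET.SubElement(run, "codes")
      for name, version in [("hydro-solver", "2.3.1"), ("mesh-gen", "0.9")]:
          ET.SubElement(codes, "code", name=name, version=version)

      ET.ElementTree(run).write("notebook-entry.xml",
                                encoding="utf-8", xml_declaration=True)
      print(ET.tostring(run, encoding="unicode"))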

  3. Extended temperature-accelerated dynamics: Enabling long-time full-scale modeling of large rare-event systems

    SciTech Connect

    Bochenkov, Vladimir; Suetin, Nikolay; Shankar, Sadasivan

    2014-09-07

    A new method, the Extended Temperature-Accelerated Dynamics (XTAD), is introduced for modeling the long-timescale evolution of large rare-event systems. The method is based on the Temperature-Accelerated Dynamics approach [M. Sørensen and A. Voter, J. Chem. Phys. 112, 9599 (2000)], but uses full-scale parallel molecular dynamics simulations to probe the potential energy surface of an entire system, combined with adaptive on-the-fly system decomposition for analyzing the energetics of rare events. The method removes limitations on feasible system size and makes it possible to handle simultaneous diffusion events, including both large-scale concerted and local transitions. Due to its intrinsically parallel algorithm, XTAD not only allows studies of various diffusion mechanisms in solid state physics, but also opens the avenue for atomistic simulations of a range of technologically relevant processes in materials science, such as thin film growth on nano- and microstructured surfaces.
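
    The time extrapolation at the heart of TAD-style methods follows the Arrhenius relation: an event with barrier E_a observed at time t_high at temperature T_high maps to t_low = t_high * exp((E_a/k_B)(1/T_low - 1/T_high)) at the lower temperature. A sketch with a made-up barrier and temperatures (XTAD's parallel probing and on-the-fly decomposition are not shown):

      # Sketch: Arrhenius extrapolation of an event time between temperatures.
      import math

      K_B = 8.617333e-5  # Boltzmann constant, eV/K

      def extrapolate_time(t_high, e_a, t_low_k, t_high_k):
          """Map an event time from T_high (K) down to T_low (K)."""
          return t_high * math.exp((e_a / K_B) * (1 / t_low_k - 1 / t_high_k))

      # A 0.5 eV event seen after 1 ns at 900 K, replayed at 300 K:
      print(f"{extrapolate_time(1e-9, 0.5, 300.0, 900.0):.3e} s")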

  4. Cosmic rays Monte Carlo simulations for the Extreme Energy Events Project

    NASA Astrophysics Data System (ADS)

    Abbrescia, M.; Agocs, A.; Aiola, S.; Antolini, R.; Avanzini, C.; Baldini Ferroli, R.; Bencivenni, G.; Bossini, E.; Bressan, E.; Chiavassa, A.; Cicalò, C.; Cifarelli, L.; Coccia, E.; De Gruttola, D.; De Pasquale, S.; Di Giovanni, A.; D'Incecco, M.; Dreucci, M.; Fabbri, F. L.; Frolov, V.; Garbini, M.; Gemme, G.; Gnesi, I.; Gustavino, C.; Hatzifotiadou, D.; La Rocca, P.; Li, S.; Librizzi, F.; Maggiora, A.; Massai, M.; Miozzi, S.; Panareo, M.; Paoletti, R.; Perasso, L.; Pilo, F.; Piragino, G.; Regano, A.; Riggi, F.; Righini, G. C.; Sartorelli, G.; Scapparone, E.; Scribano, A.; Selvi, M.; Serci, S.; Siddi, E.; Spandre, G.; Squarcia, S.; Taiuti, M.; Tosello, F.; Votano, L.; Williams, M. C. S.; Yánez, G.; Zichichi, A.; Zuyeuski, R.

    2014-08-01

    The Extreme Energy Events Project (EEE Project) is an innovative experiment to study very high energy cosmic rays by means of the detection of the associated air shower muon component. It consists of a network of tracking detectors installed inside Italian high schools. Each tracking detector, called an EEE telescope, is composed of three Multigap Resistive Plate Chambers (MRPCs). At present, 43 telescopes are installed and taking data, opening the way for the detection of far-away coincidences over a total area of about 3 × 10^5 km^2. In this paper we present the Monte Carlo simulations that have been performed to predict the expected coincidence rate between distant EEE telescopes.

  5. Using Discrete Event Simulation to predict KPI's at a Projected Emergency Room.

    PubMed

    Concha, Pablo; Neriz, Liliana; Parada, Danilo; Ramis, Francisco

    2015-01-01

    Discrete Event Simulation (DES) is a powerful tool in the design of clinical facilities. DES enables facilities to be built or adapted to achieve the expected Key Performance Indicators (KPI's), such as average waiting times according to acuity, average stay times, and others. Our computational model was built and validated using expert judgment and supporting statistical data. One scenario studied resulted in a 50% decrease in the average cycle time of patients compared to the original model, mainly by modifying the patient attention model. PMID:26262262

  6. Simulations of Wave Propagation in the Jovian Atmosphere after SL9 Impact Events

    NASA Astrophysics Data System (ADS)

    Pond, Jarrad W.; Palotai, C.; Korycansky, D.; Harrington, J.

    2013-10-01

    Our previous numerical investigations into Jovian impacts, including the Shoemaker-Levy 9 (SL9) event (Korycansky et al. 2006, ApJ 646, 642; Palotai et al. 2011, ApJ 731, 3), the 2009 bolide (Pond et al. 2012, ApJ 745, 113), and the ephemeral flashes caused by smaller impactors in 2010 and 2012 (Hueso et al. 2013; submitted to A&A), have covered only up to approximately 3 to 30 seconds after impact. Here, we present further SL9 impacts extending to minutes after collision with Jupiter’s atmosphere, with a focus on the propagation of shock waves generated as a result of the impact events. Using a similar yet more efficient remapping method than previously presented (Pond et al. 2012; DPS 2012), we move our simulation results onto a larger computational grid, conserving quantities with minimal error. The Jovian atmosphere is extended as needed to accommodate the evolution of the features of the impact event. We restart the simulation, allowing the impact event to continue to progress to greater spatial extents and for longer times, but at lower resolutions. This remap-restart process can be implemented multiple times to achieve the spatial and temporal scales needed to investigate the observable effects of waves generated by the deposition of energy and momentum into the Jovian atmosphere by an SL9-like impactor. As before, we use the three-dimensional, parallel hydrodynamics code ZEUS-MP 2 (Hayes et al. 2006, ApJS 165, 188) to conduct our simulations. Wave characteristics are tracked throughout these simulations. Of particular interest are the wave speeds and wave positions in the atmosphere as a function of time. These properties are compared to the characteristics of the HST rings to see if shock wave behavior within one hour of impact is consistent with waves observed at one hour post-impact and beyond (Hammel et al. 1995, Science 267, 1288). This research was supported by National Science Foundation Grant AST-1109729 and NASA Planetary Atmospheres Program Grant

  7. Simulation, Design Abstraction, and SystemC

    ERIC Educational Resources Information Center

    Harcourt, Ed

    2007-01-01

    SystemC is a system-level design and simulation language based on C++. We've been using SystemC for computer organization and design projects for the past several years. Because SystemC is embedded in C++ it contains the powerful abstraction mechanisms of C++ not found in traditional hardware description languages, such as support for…

  8. Simulating Rain Fade In A Communication System

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Kurt A.; Nagy, Lawrence A.; Svoboda, James K.

    1994-01-01

    Automated, computer-controlled assembly of electronic equipment developed for use in simulation testing of the downlink portion of an Earth/satellite microwave digital communication system. Designed to show the effects on system performance of rain-induced fading in the received signal and of increases in transmitted power meant to compensate for that fading. Design of the communication system improved iteratively in response to results of simulations, leading eventually to a design ensuring clear, uninterrupted transmission of digital signals.

  9. Characteristics of flight simulator visual systems

    NASA Technical Reports Server (NTRS)

    Statler, I. C. (Editor)

    1981-01-01

    The physical parameters of the flight simulator visual system that characterize the system and determine its fidelity are identified and defined. The characteristics of visual simulation systems are discussed in terms of the basic categories of spatial, energy, and temporal properties corresponding to the three fundamental quantities of length, mass, and time. Each of these parameters is further addressed in relation to its effect, its appropriate units or descriptors, methods of measurement, and its use or importance to image quality.

  10. NUMERICAL SIMULATION OF A TORRENTIAL RAINFALL EVENT OCCURRED AT ITABASHI ON JULY 5, 2010

    NASA Astrophysics Data System (ADS)

    Ushiyama, Tomoki; Yorozuya, Atsuhiro; Kanno, Yuya; Fukami, Kazuhiko

    Torrential rainfall events often bring sudden floods and damage, especially in urban areas. However, it is still hard to forecast the occurrence of torrential rainfall. In this study we analyzed the evolution process of a torrential rainfall event that occurred at Itabashi, Tokyo, on July 5, 2010, to accumulate knowledge of the development mechanism and the possibility of forecasting this type of rainfall. The regional meteorological model WRF (Weather Research and Forecasting) reproduced the torrential rainfall event fairly well when JMA-MSM (Japan Meteorological Agency-Mesoscale Model) data were used as initial and boundary conditions. For the development and maintenance of the rainfall, three sea-breeze streams from Sagami Bay, Tokyo Bay, and Kashima Nada played a key role in accumulating moisture and supplying it to the precipitating system. The model reproduced the sea breezes well, which is why it could reproduce the rainfall well.

  11. Quantum Simulation for Open-System Dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Dong-Sheng; de Oliveira, Marcos Cesar; Berry, Dominic; Sanders, Barry

    2013-03-01

    Simulations are essential for predicting and explaining properties of physical and mathematical systems, yet so far have been restricted to classical and closed quantum systems. Although forays have been made into open-system quantum simulation, the strict algorithmic aspect has not been explored, yet it is necessary to account fully for resource consumption to deliver bounded-error answers to computational questions. An open-system quantum simulator would encompass classical and closed-system simulation and also solve outstanding problems concerning, e.g., dynamical phase transitions in non-equilibrium systems, establishing long-range order via dissipation, and verifying the simulatability of open-system dynamics on a quantum Turing machine. We construct an efficient autonomous algorithm for designing an efficient quantum circuit to simulate many-body open-system dynamics described by a local Hamiltonian plus decoherence due to separate baths for each particle. The execution time and number of gates for the quantum simulator both scale polynomially with the system size. DSW funded by USARO. MCO funded by AITF and Brazilian agencies CNPq and FAPESP through Instituto Nacional de Ciencia e Tecnologia-Informacao Quantica (INCT-IQ). DWB funded by ARC Future Fellowship (FT100100761). BCS funded by AITF, CIFAR, NSERC and USARO.

  12. Assessment of extreme precipitation events over Amazon simulated by global climate models from HIGEM family.

    NASA Astrophysics Data System (ADS)

    Custodio, Maria; Ambrizzi, Tercio; da Rocha, Rosmeri

    2015-04-01

    coupled and uncoupled models capture the observed signal of the ENSO and MJO oscillations, although with reversed phase in some cases. The austral summer and winter composites of interannual and intraseasonal anomalies showed the same spatial distribution of wet and dry extreme events in models and reanalyses. The interannual variability analysis showed that coupled simulations intensify the impact of the El Niño Southern Oscillation (ENSO) in the Amazon. On the intraseasonal scale, although the simulations intensify this signal, the coupled models show larger similarities with observations than the atmospheric models for the precipitation extremes. Note that there are differences between simulated and observed intraseasonal anomalies, indicating that the models have problems correctly representing the intensity of low-frequency phenomena on this scale. The simulation of ENSO in the GCMs can be attributed to their high resolution, mainly in the oceanic component, which contributes to better resolution of the small-scale vortices in the ocean. This implies improvements in the forecasting of sea surface temperature (SST) and, as a consequence, in the ability of the atmosphere to respond to this feature.

  13. The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model

    PubMed Central

    2012-01-01

    Background In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants, as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma, but in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. Methods We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Results Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. Conclusions We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. Our model simulates the effect of environmental factors (combustion pollutants and allergens), medication compliance, seasonality
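
    The chain the Methods section describes (exposure concentrations feeding FEV1%, which feeds outcome risk equations) can be sketched as follows. All coefficients and distributions here are hypothetical placeholders, not the published model parameters.

        import random

        def fev1_percent(pm25, no2, cockroach, dampness):
            # Hypothetical exposure-response link: each exposure lowers FEV1%.
            return max(40.0, 95.0 - 0.2 * pm25 - 0.1 * no2
                              - 3.0 * cockroach - 2.0 * dampness)

        def weekly_exacerbation_prob(fev1):
            # Hypothetical risk equation: risk rises as FEV1% falls below 80.
            return min(1.0, 0.01 + 0.004 * max(0.0, 80.0 - fev1))

        def simulate_child(weeks=520, seed=0):
            random.seed(seed)
            events = 0
            for _ in range(weeks):
                pm25 = random.lognormvariate(2.3, 0.5)   # weekly indoor PM2.5 draw
                fev1 = fev1_percent(pm25, no2=20.0, cockroach=1.0, dampness=1.0)
                if random.random() < weekly_exacerbation_prob(fev1):
                    events += 1
            return events

        print("exacerbations over 10 years:", simulate_child())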

  14. Computer simulation of breathing systems for divers

    SciTech Connect

    Sexton, P.G.; Nuckols, M.L.

    1983-02-01

    A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on performance of the breathing system are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.

  15. Sentinel Event Notification System for Occupational Risks (SENSOR): the concept.

    PubMed

    Baker, E L

    1989-12-01

    Although many states have laws that require health providers to report cases of occupational illness and injury, most states do not maintain a comprehensive system that actively identifies and targets potential sources of case reports and then responds to such reports. NIOSH has developed a Sentinel Event Notification System for Occupational Risks (SENSOR) that uses targeted sources of sentinel providers to recognize and report selected occupational disorders to a state surveillance center. SENSOR is a cooperative state-federal effort designed to develop local capability for preventing selected occupational disorders. To demonstrate the feasibility of this approach, NIOSH initially funded seven SENSOR projects in 1987 and three additional projects in early 1988 (Table 1). Currently, these projects are in the preliminary stages of organization and start-up, with some having begun to receive case reports. As funds become available, NIOSH intends to gradually expand the scope of the program to include additional states over the next several years.

  16. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    SciTech Connect

    D.L. McGregor

    2000-12-20

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  17. Solar simulator for concentrator photovoltaic systems.

    PubMed

    Domínguez, César; Antón, Ignacio; Sala, Gabriel

    2008-09-15

    A solar simulator for measuring performance of large area concentrator photovoltaic (CPV) modules is presented. Its illumination system is based on a Xenon flash light and a large area collimator mirror, which simulates natural sun light. Quality requirements imposed by the CPV systems have been characterized: irradiance level and uniformity at the receiver, light collimation and spectral distribution. The simulator allows indoor fast and cost-effective performance characterization and classification of CPV systems at the production line as well as module rating carried out by laboratories. PMID:18795026

  18. ROBOSIM, a simulator for robotic systems

    NASA Technical Reports Server (NTRS)

    Hinman, Elaine M.; Fernandez, Ken; Cook, George E.

    1991-01-01

    ROBOSIM, a simulator for robotic systems, was developed by NASA to aid in the rapid prototyping of automation. ROBOSIM has allowed the development of improved robotic systems concepts for both earth-based and proposed on-orbit applications while significantly reducing development costs. In a cooperative effort with an area university, ROBOSIM was further developed for use in the classroom as a safe and cost-effective way of allowing students to study robotic systems. Students have used ROBOSIM to study existing robotic systems and systems which they have designed in the classroom. Since an advanced simulator/trainer of this type is beneficial not only to NASA projects and programs but industry and academia as well, NASA is in the process of developing this technology for wider public use. An update on the simulator's new application areas, the improvements made to the simulator's design, and current efforts to ensure the timely transfer of this technology are presented.

  19. High-Fidelity Full System Simulations

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2004-01-01

    High-fidelity full system simulations have the potential for revolutionizing the way complex systems, such as propulsion systems for aerospace vehicles, are designed, developed, manufactured, and operated. Significant time and cost savings will result from simulations that resolve deleterious component interactions early in the design process. In addition, innovative new system configurations will result from the use of new tools that enable designers to challenge traditional rules and practices. The major challenges to developing and implementing high-fidelity system simulations are in reducing the time and effort required to build, execute, and analyze data for these highly complex simulations. In addition, large-scale testing with unique instrumentation is required to validate the simulations. The solution to these problems resides in the application of advanced information technologies to assist the user in effectively managing, processing, and synthesizing the vast amount of data. The following presentation describes in more detail the benefits of high-fidelity full system simulations, the challenges to developing and implementing large-scale simulations, and one approach that is being followed by the NASA Glenn Research Center to overcome these challenges. In addition, topics for discussion by the panel and audience are suggested.

  1. Simulation of large systems with neural networks

    SciTech Connect

    Paez, T.L.

    1994-09-01

    Artificial neural networks (ANNs) have been shown capable of simulating the behavior of complex, nonlinear, systems, including structural systems. Under certain circumstances, it is desirable to simulate structures that are analyzed with the finite element method. For example, when we perform a probabilistic analysis with the Monte Carlo method, we usually perform numerous (hundreds or thousands of) repetitions of a response simulation with different input and system parameters to estimate the chance of specific response behaviors. In such applications, efficiency in computation of response is critical, and response simulation with ANNs can be valuable. However, finite element analyses of complex systems involve the use of models with tens or hundreds of thousands of degrees of freedom, and ANNs are practically limited to simulations that involve far fewer variables. This paper develops a technique for reducing the amount of information required to characterize the response of a general structure. We show how the reduced information can be used to train a recurrent ANN. Then the trained ANN can be used to simulate the reduced behavior of the original system, and the reduction transformation can be inverted to provide a simulation of the original system. A numerical example is presented.
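
    The reduction step can be illustrated with a POD-style (proper orthogonal decomposition) projection on synthetic snapshot data; this is a minimal sketch under assumed low-rank test data, and the recurrent ANN that would be trained on the reduced coordinates is omitted.

        import numpy as np

        rng = np.random.default_rng(0)
        # Surrogate snapshot matrix: 10,000 DOF sampled at 200 time steps,
        # built with low-rank structure as a finite element response might have.
        X = rng.standard_normal((10000, 5)) @ rng.standard_normal((5, 200))

        # SVD gives an orthonormal basis; keep the k modes that capture
        # 99.9% of the response energy.
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.999)) + 1
        Phi = U[:, :k]                 # reduction transformation (10000 x k)

        q = Phi.T @ X                  # reduced coordinates: train the ANN on these
        X_rec = Phi @ q                # invert the reduction to the full response
        print(k, np.linalg.norm(X - X_rec) / np.linalg.norm(X))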

  2. A community-based event delivery protocol in publish/subscribe systems for delay tolerant sensor networks.

    PubMed

    Liu, Nianbo; Liu, Ming; Zhu, Jinqi; Gong, Haigang

    2009-01-01

    The basic operation of a Delay Tolerant Sensor Network (DTSN) is to finish pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, extension of Pub/Sub systems in DTSNs has become a promising research topic. However, due to the unique frequent partitioning characteristic of DTSNs, extension of a Pub/Sub system in a DTSN is a considerably difficult and challenging problem, and there are no good solutions to this problem in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, for improving the data delivery ratio. The queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped for minimizing the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.
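
    A minimal sketch of the deliver-or-drop rule in CED's queue management might look as follows; the field names and the exact comparison are assumptions based on the description above, not the protocol's published specification.

        from dataclasses import dataclass

        @dataclass
        class Event:
            created_at: float
            survival_time: float        # how long the event stays useful
            est_delivery_time: float    # expected time to reach a subscriber

        def manage_queue(queue, now):
            """Keep an event only if it can still reach a subscriber before
            it expires; otherwise drop it to save transmission overhead."""
            return [e for e in queue
                    if e.created_at + e.survival_time - now > e.est_delivery_time]

        queue = [Event(0.0, 60.0, 10.0), Event(0.0, 15.0, 30.0)]
        print(len(manage_queue(queue, now=20.0)))   # -> 1: second event dropped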

  3. A Spiking Neural Simulator Integrating Event-Driven and Time-Driven Computation Schemes Using Parallel CPU-GPU Co-Processing: A Case Study.

    PubMed

    Naveros, Francisco; Luque, Niceto R; Garrido, Jesús A; Carrillo, Richard R; Anguita, Mancia; Ros, Eduardo

    2015-07-01

    Time-driven simulation methods in traditional CPU architectures perform well and precisely when simulating small-scale spiking neural networks. Nevertheless, they still have drawbacks when simulating large-scale systems. Conversely, event-driven simulation methods in CPUs and time-driven simulation methods in graphic processing units (GPUs) can outperform CPU time-driven methods under certain conditions. With this performance improvement in mind, we have developed an event-and-time-driven spiking neural network simulator suitable for a hybrid CPU-GPU platform. Our neural simulator is able to efficiently simulate bio-inspired spiking neural networks consisting of different neural models, which can be distributed heterogeneously in both small layers and large layers or subsystems. For the sake of efficiency, the low-activity parts of the neural network can be simulated in CPU using event-driven methods while the high-activity subsystems can be simulated in either CPU (a few neurons) or GPU (thousands or millions of neurons) using time-driven methods. In this brief, we have undertaken a comparative study of these different simulation methods. For benchmarking the different simulation methods and platforms, we have used a cerebellar-inspired neural-network model consisting of a very dense granular layer and a Purkinje layer with a smaller number of cells (according to biological ratios). Thus, this cerebellar-like network includes a dense diverging neural layer (increasing the dimensionality of its internal representation and sparse coding) and a converging neural layer (integration) similar to many other biologically inspired and also artificial neural networks.
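
    The scheduling idea, event-driven updates for low-activity neurons interleaved with fixed-step updates for dense populations, can be sketched as follows; the neuron state updates themselves are elided and the event format is an assumption of this sketch.

        import heapq

        def hybrid_run(t_end=1.0, dt=1e-3, spike_events=()):
            """Skeleton of an event-and-time-driven scheduler: spikes drive
            the low-activity population only when they occur, while a fixed
            step advances the dense, high-activity population."""
            events = sorted(spike_events)       # (time, neuron_id) pairs
            t, event_updates, grid_steps = 0.0, 0, 0
            while t < t_end:
                # Event-driven part: handle spikes falling inside this step.
                while events and events[0][0] <= t + dt:
                    _, nid = heapq.heappop(events)
                    event_updates += 1          # update only neuron `nid` here
                # Time-driven part: integrate the whole dense layer one step.
                grid_steps += 1
                t += dt
            return event_updates, grid_steps

        print(hybrid_run(spike_events=[(0.1, 3), (0.4, 7)]))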

  4. Particle simulation of plasmas and stellar systems

    SciTech Connect

    Tajima, T.; Clark, A.; Craddock, G.G.; Gilden, D.L.; Leung, W.K.; Li, Y.M.; Robertson, J.A.; Saltzman, B.J.

    1985-04-01

    A computational technique is introduced which allows the student and researcher an opportunity to observe the physical behavior of a class of many-body systems. A series of examples is offered which illustrates the diversity of problems that may be studied using particle simulation. These simulations were in fact assigned as homework in a course on computational physics.

  5. Using Expert Systems To Build Cognitive Simulations.

    ERIC Educational Resources Information Center

    Jonassen, David H.; Wang, Sherwood

    2003-01-01

    Cognitive simulations are runnable computer programs for modeling human cognitive activities. A case study is reported where expert systems were used as a formalism for modeling metacognitive processes in a seminar. Building cognitive simulations engages intensive introspection, ownership and meaning making in learners who build them. (Author/AEF)

  6. The Canadian Hospital Executive Simulation System (CHESS).

    PubMed

    Pink, G H; Knotts, U A; Parrish, L G; Shields, C A

    1991-01-01

    The Canadian Hospital Executive Simulation System (CHESS) is a computer-based management decision-making game designed specifically for Canadian hospital managers. The paper begins with an introduction on the development of business and health services industry-specific simulation games. An overview of CHESS is provided, along with a description of its development and a discussion of its educational benefits. PMID:10109530

  7. Assessing and Optimizing Microarchitectural Performance of Event Processing Systems

    NASA Astrophysics Data System (ADS)

    Mendes, Marcelo R. N.; Bizarro, Pedro; Marques, Paulo

    Event Processing (EP) systems are being progressively used in business-critical applications in domains such as algorithmic trading, supply chain management, production monitoring, or fraud detection. To deal with high throughput and low response time requirements, these EP systems mainly use the CPU-RAM sub-system for data processing. However, as we show here, collected statistics on CPU usage and on CPU-RAM communication reveal that available systems are poorly optimized and grossly waste resources. In this paper we quantify some of these inefficiencies and propose cache-aware algorithms and changes to internal data structures to overcome them. We test the before and after systems both at the microarchitecture and application level and show that: i) the changes improve microarchitecture metrics such as clocks-per-instruction, cache misses or TLB misses; and ii) some of these improvements result in very large application-level gains, such as a 44% improvement on stream-to-table joins with a 6-fold reduction in memory consumption, and an order-of-magnitude increase in throughput for moving aggregation operations.
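
    Python cannot expose cache misses directly, but the flavor of the data-structure change, keeping a field contiguous in memory and computing moving aggregations incrementally, can be sketched with a columnar layout. This is an illustration of the general technique only, not the authors' algorithms.

        import numpy as np

        # Columnar (struct-of-arrays) layout: each field is one contiguous
        # array, so a scan over one field touches consecutive cache lines.
        n, w = 1_000_000, 1000
        prices = np.random.default_rng(3).random(n)    # one contiguous column

        # Moving aggregation (windowed mean) over the contiguous column using
        # a prefix sum: O(1) work per window slide instead of re-scanning.
        csum = np.concatenate(([0.0], np.cumsum(prices)))
        moving_mean = (csum[w:] - csum[:-w]) / w
        print(moving_mean[:3])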

  8. Simulation of rainfall-runoff for major flash flood events in Karachi

    NASA Astrophysics Data System (ADS)

    Zafar, Sumaira

    2016-07-01

    The metropolitan city of Karachi has strategic importance for Pakistan. With each passing decade the city has faced urban sprawl and rapid population growth. These rapid changes directly affect the natural resources of the city, including its drainage pattern. Karachi has major rivers, including the Malir River, with a catchment area of 2252 sq km, and the Lyari River, with a catchment area of about 470.4 sq km. These are non-perennial rivers, active only during storms. The conversion of natural surfaces into hard pavement is increasing the rainfall-runoff response: the Curve Number has increased, which is now causing flash floods in the urban localities of Karachi. There is only one gauge installed upstream on the river, and there is no discharge record; a single gauge located upstream is not sufficient for discharge measurements. To simulate the maximum discharge of the Malir River, rainfall data (1985 to 2014) were collected from the Pakistan Meteorological Department, and major rainfall events were used to simulate rainfall-runoff. The maximum rainfall-runoff response was recorded during 1994, 2007 and 2013. This runoff causes damage and inundation in floodplain areas of Karachi. These flash flooding events not only damage property but also cause loss of life.

  9. High Frequency Mechanical Pyroshock Simulations for Payload Systems

    SciTech Connect

    BATEMAN,VESTA I.; BROWN,FREDERICK A.; CAP,JEROME S.; NUSSER,MICHAEL A.

    1999-12-15

    Sandia National Laboratories (SNL) designs mechanical systems with components that must survive high frequency shock environments including pyrotechnic shock. These environments have not been simulated very well in the past at the payload system level because of weight limitations of traditional pyroshock mechanical simulations using resonant beams and plates. A new concept utilizing tuned resonators attached to the payload system and driven with the impact of an airgun projectile allows these simulations to be performed in the laboratory with high precision and repeatability without the use of explosives. A tuned resonator has been designed and constructed for a particular payload system. Comparison of laboratory responses with measurements made at the component locations during actual pyrotechnic events shows excellent agreement for a bandwidth of DC to 4 kHz. The bases of comparison are shock spectra. This simple concept applies the mechanical pyroshock simulation simultaneously to all components with the correct boundary conditions in the payload system and is a considerable improvement over previous experimental techniques and simulations.

  10. An event-driven model simulating fundamental seismic characteristics with the use of cellular automata

    NASA Astrophysics Data System (ADS)

    Pavlou, L.; Georgoudas, I. G.; Sirakoulis, G. Ch.; Scordilis, E. M.; Andreadis, I.

    This paper presents an extensive simulation tool based on a Cellular Automata (CA) system that models fundamental seismic characteristics of a region. The CA-based dynamic model consists of cells-charges and is used for the simulation of the earthquake process. The simulation tool has remarkably accelerated the response of the model by incorporating principles of High Performance Computing (HPC). Extensive parallel computing features have been applied, improving its processing effectiveness. The tool implements an enhanced (or hyper-) 2-dimensional version of the proposed CA model. Regional characteristics that depend on the seismic background of the area under study are assigned to the model through a user-friendly software environment. The model is evaluated with real data that correspond to a circular region around Skyros Island, Greece, for different time periods, for example one of 45 years (1901-1945). The enhanced 2-dimensional version of the model incorporates all principal characteristics of the 2-dimensional one, also including groups of CA cells that interact with others located at a considerable distance, in an attempt to simulate long-range interaction. The advanced simulation tool has been thoroughly evaluated. Several measurements have been made for different critical states, as well as for various cascade (earthquake) sizes, cell activities and different neighbourhood sizes. Simulation results qualitatively approach the Gutenberg-Richter (GR) scaling law and reveal fundamental characteristics of the system.
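
    A generic cells-charges earthquake automaton in the Olami-Feder-Christensen style conveys the core loop of such models: slow loading, threshold toppling, and cascades whose sizes follow a Gutenberg-Richter-like power law. The parameters and the 4-cell neighbourhood below are assumptions of this sketch, not the authors' enhanced model.

        import numpy as np

        def ca_step(charge, alpha=0.2, threshold=1.0):
            """One load-and-relax cycle of a generic cells-charges CA.
            Returns the cascade (event) size."""
            charge += threshold - charge.max()       # slow tectonic loading
            size = 0
            while (unstable := np.argwhere(charge >= threshold)).size:
                for i, j in unstable:
                    size += 1
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < charge.shape[0] and 0 <= nj < charge.shape[1]:
                            charge[ni, nj] += alpha * charge[i, j]
                    charge[i, j] = 0.0               # cell relaxes; edges dissipate
            return size

        grid = np.random.default_rng(0).random((50, 50))
        sizes = [ca_step(grid) for _ in range(2000)]
        # Cascade sizes from such models approach a GR-like scaling law.
        print(max(sizes), sum(sizes) / len(sizes))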

  11. Weightlessness simulation system and process

    NASA Technical Reports Server (NTRS)

    Vykukal, Hubert C. (Inventor)

    1987-01-01

    A weightlessness simulator has a chamber and a suit in the chamber. O-rings and valves hermetically seal the chamber. A vacuum pump connected to the chamber establishes a pressure in the chamber less than atmospheric pressure. A water supply tank and water supply line supply a body of water to the chamber as a result of partial vacuum created in the chamber. In use, an astronaut enters the pressure suit through a port, which remains open to ambient atmosphere, thus supplying air to the astronaut during use. The pressure less than atmospheric pressure in the chamber is chosen so that the pressure differential from the inside to the outside of the suit corresponds to the pressure differential with the suit in outer space.

  12. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGES

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  13. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    SciTech Connect

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  14. Teaching Environmental Systems Modelling Using Computer Simulation.

    ERIC Educational Resources Information Center

    Moffatt, Ian

    1986-01-01

    A computer modeling course in environmental systems and dynamics is presented. The course teaches senior undergraduates to analyze a system of interest, construct a system flow chart, and write computer programs to simulate real world environmental processes. An example is presented along with a course evaluation, figures, tables, and references.…

  15. Colorimetric calibration of coupled infrared simulation system

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Fei, Jindong; Gao, Yang; Du, Jian

    2015-10-01

    In order to test two-color infrared sensors, a coupled infrared simulation system can generate radiometric outputs with wavelengths that range from less than 3 microns to more than 12 microns. There are two channels in the coupled simulation system, optically combined by a dichroic beam combiner. Each channel has an infrared blackbody, a filter, a diaphragm, and diaphragm motors. The system is projected to the sensor under test by a collimator. This makes it difficult to calibrate the system with only a single-band thermal imager: errors will be caused in the radiance levels measured by a narrow-band thermal imager. This paper describes colorimetric temperature measurement techniques that have been developed to perform radiometric calibrations of the infrared simulation systems described above. The calibration system consists of two infrared thermal imagers; one operates in the MW-IR wavelength range, and the other in the LW-IR range.

  16. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1981-01-01

    A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used: the algorithm was developed in an ALGOL-like pseudo-language and then encoded in FORTRAN IV. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all the possible combinations of types and modes of calls through the use of five communication scenarios: single hop system; double hop, single gateway system; double hop, double gateway system; mobile to wireline system; and wireline to mobile system. The transmitter, fading channel, and interference source simulation are also discussed.

  17. IRIS observations and MHD simulations of explosive events in the transition region of the Sun

    NASA Astrophysics Data System (ADS)

    Guo, Lijia; Innes, Davina; Huang, Yi-Min; Bhattacharjee, Amitava

    2016-05-01

    Small-scale explosive events on the Sun are thought to be related to magnetic reconnection. While Petschek reconnection has been considered as a reconnection mechanism for explosive events on the Sun for quite a long time, the fragmentation of a current sheet in the high-Lundquist-number regime caused by the plasmoid instability has recently been proposed as a possible mechanism for fast reconnection. The actual reconnection sites are too small to be resolved with images, but these reconnection mechanisms, Petschek and the plasmoid instability, have very different density and velocity structures and so can be distinguished by high-resolution line profile observations. We use high-resolution sit-and-stare spectral observations of the Si IV line, obtained by the IRIS spectrometer, to identify sites of reconnection and follow the development of line profiles. The aim is to obtain a survey of typical line profiles produced by small-scale reconnection events in the transition region and compare them with synthetic line profiles from numerical simulations of a reconnecting current sheet, to determine whether reconnection occurs via the plasmoid instability or the Petschek mechanism. Direct comparison between IRIS observations and numerical results suggests that the observed Si IV profiles can be reproduced with a fragmented current layer subject to plasmoid instability, but not by the bi-directional jets that characterise the Petschek mechanism. This result suggests that if these small-scale events are reconnection sites, then fast reconnection proceeds via the plasmoid instability, rather than the Petschek mechanism, during small-scale reconnection on the Sun.

  18. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government, together with a consortium of universities and research institutions, has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the Australian earth systems science community with the research infrastructure required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc.) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic

  19. The analyses of extreme climate events over China based on CMIP5 historical and future simulations

    NASA Astrophysics Data System (ADS)

    Yang, S.; Dong, W.; Feng, J.; Chou, J.

    2013-12-01

    Extreme climate events have a serious influence on human society. Based on observations and 12 simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5), climatic extremes and their changes over China in the historical period and under future scenarios of three Representative Concentration Pathways (RCPs) are analyzed. Against the background of global warming, in observations the frost days (FD) and low-temperature threshold days (TN10P) show a decreasing trend, while summer days (SU), high-temperature threshold days (TX90P), heavy precipitation days (R20) and the contribution of heavy precipitation days (P95T) show an increasing trend. Most coupled models can basically simulate the main characteristics of most extreme indexes. The models reproduce the mean FD and TX90P values best and can capture the basic trends of FD, TN10P, SU and TX90P. High correlation coefficients between simulated results and observations are found for FD, SU and P95T. For the FD and SU indexes, most of the models capture the spatial differences between the mean states of the 1986-2005 and 1961-1980 periods well, but for the other indexes most models' ability to simulate the spatial differences is less satisfactory and needs to be improved. Under the high-emission scenario RCP8.5, the century-scale linear changes of the Multi-Model Ensemble (MME) for FD, SU, TN10P, TX90P, R20 and P95T are -46.9, 46.0, -27.1, 175.4, 2.9 days and 9.9%, respectively. Due to the complexities of physical process parameterizations and the limitations of the forcing data, a large uncertainty still exists in the simulations of climatic extremes. [Fig. 1: Observed and modeled multi-year average for each index (dotted line: observation). Table 1: Extreme index definitions.]

  20. Numerical Propulsion System Simulation for Space Transportation

    NASA Technical Reports Server (NTRS)

    Owen, Karl

    2000-01-01

    Current system simulations are mature, difficult to modify, and poorly documented. Probabilistic life prediction techniques for space applications are in their early application stage. Many parts of the full system, variable fidelity simulation, have been demonstrated individually or technology is available from aeronautical applications. A 20% reduction in time to design with improvements in performance and risk reduction is anticipated. GRC software development will proceed with similar development efforts in aeronautical simulations. Where appropriate, parallel efforts will be encouraged/tracked in high risk areas until success is assured.

  1. Another Program Simulates A Modular Manufacturing System

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Wang, Jian

    1996-01-01

    SSE5 computer program provides simulation environment for modeling manufacturing systems containing relatively small numbers of stations and operators. Designed to simulate manufacturing of apparel, also used in other manufacturing domains. Valuable for small or medium-size firms, including those lacking expertise to develop detailed mathematical models or having only minimal knowledge of describing manufacturing systems and analyzing results of simulations on mathematical models. Two other programs available bundled together as SSE (MFS-26245). Each program models slightly different manufacturing scenario. Written in Turbo C v2.0 for IBM PC-series and compatible computers running MS-DOS and successfully compiled using Turbo C++ v3.0.

  2. Intelligent fuzzy controller for event-driven real time systems

    NASA Technical Reports Server (NTRS)

    Grantner, Janos; Patyra, Marek; Stachowicz, Marian S.

    1992-01-01

    Most of the known linguistic models are essentially static, that is, time is not a parameter in describing the behavior of the object's model. In this paper we show a model for synchronous finite state machines based on fuzzy logic. Such finite state machines can be used to build both event-driven, time-varying, rule-based systems and the control unit section of a fuzzy logic computer. The architecture of a pipelined intelligent fuzzy controller is presented, and the linguistic model is represented by an overall fuzzy relation stored in a single rule memory. A VLSI integrated circuit implementation of the fuzzy controller is suggested. At a clock rate of 30 MHz, the controller can perform 3 MFLIPS on multi-dimensional fuzzy data.
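
    The max-min composition that drives one synchronous step of such a fuzzy state machine can be sketched in a few lines; the fuzzy relation R standing in for the rule memory is random here purely for illustration.

        import numpy as np

        def fuzzy_fsm_step(state, inp, R):
            """One synchronous step of a fuzzy state machine: the next fuzzy
            state is the max-min composition of (state, input) with the
            overall fuzzy relation R[s, x, s']."""
            firing = np.minimum(state[:, None], inp[None, :])   # rule activation
            return np.minimum(firing[:, :, None], R).max(axis=(0, 1))

        n_states, n_inputs = 3, 2
        rng = np.random.default_rng(1)
        R = rng.random((n_states, n_inputs, n_states))  # stand-in rule memory
        state = np.array([1.0, 0.0, 0.0])               # crisp initial state
        inp = np.array([0.7, 0.3])                      # fuzzified input
        print(fuzzy_fsm_step(state, inp, R))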

  3. Electric-Power System Simulator

    NASA Technical Reports Server (NTRS)

    Caldwell, R. W.; Grumm, R. L.; Biedebach, B. L.

    1984-01-01

    Shows different combinations of generation, storage, and load components: display, video monitor with keyboard input to microprocessor, and video monitor for display of load curves and power generation. Planning tool for electric utilities, regulatory agencies, and laymen in understanding basics of electric-power systems operation.

  4. Effects of long range transboundary pollutants on air quality in Japan - numerical simulation of a yellow sand event

    SciTech Connect

    Ueda, Hiromasa; Kang, Seuk Jea

    1996-12-31

    Air quality in East Asia may worsen drastically as a consequence of accelerated development of fossil fuel systems and some of the highest economic and population growth rates in the world. The expansion of these energy systems, combined with a major fuel shift to indigenous coal, will result in significant acid deposition and photochemical oxidant pollution in this region. Frequently, during clear spring days, large-scale wind systems develop that transport pollutants from the East Asian mainland towards the Pacific Ocean. Therefore, in order to evaluate the air quality of the western Pacific Ocean and Japan, the effects of emissions from the adjacent continent must be taken into consideration. The present paper reports on a series of numerical simulations for clear springtime episodes using an Eulerian transport/chemistry/deposition model to obtain the concentration changes of air pollutants over this area. The simulation ran from 9:00 JST on 1 April to midnight on 3 April 1993. During this period a yellow sand event occurred, showing good evidence of long-range transport from the continent toward the western Pacific Ocean. First, the simulation results show fair agreement with the observed values. Second, the numerical simulation clearly showed the formation of a high air pollution belt in East Asia, connecting the eastern area of China, the southern area of Korea and the western area of Japan. In the case of NOx, the formation of an air pollution belt is weak, but it is well displayed for sulfate, nitrate and ozone. Especially in the region covered by the air pollution belt (western Pacific Ocean, Japan Sea and western Japan), emissions are small, but the concentrations of ozone, sulfate and nitrate are high. The ozone concentration in Japan due to long-range transport from the continent is already near the environmental standard value of 60 ppb. In this area tropospheric ozone and acid deposition are suggested to become a serious problem in the future.

  5. Using discrete-event simulation in strategic capacity planning for an outpatient physical therapy service.

    PubMed

    Rau, Chi-Lun; Tsai, Pei-Fang Jennifer; Liang, Sheau-Farn Max; Tan, Jhih-Cian; Syu, Hong-Cheng; Jheng, Yue-Ling; Ciou, Ting-Syuan; Jaw, Fu-Shan

    2013-12-01

    This study uses a simulation model as a tool for strategic capacity planning for an outpatient physical therapy clinic in Taipei, Taiwan. The clinic provides a wide range of physical treatments, with 6 full-time therapists in each session. We constructed a discrete-event simulation model to study the dynamics of patient mixes with realistic treatment plans, and to estimate the practical capacity of the physical therapy room. The changes in time-related and space-related performance measurements were used to evaluate the impact of various strategies on the capacity of the clinic. The simulation results confirmed that the clinic is extremely patient-oriented, with a bottleneck occurring at the traction units for Intermittent Pelvic Traction (IPT), with usage at 58.9%. Sensitivity analysis showed that attending to more patients would significantly increase the number of patients staying for overtime sessions. We found that pooling the therapists produced beneficial results: the average waiting time per patient could be reduced by 45% when we pooled 2 therapists. We found that treating up to 12 new patients per session had no significant negative impact on returning patients. Moreover, the average waiting time for new patients decreased if they were given priority over returning patients when called by the therapists.
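
    The pooling comparison can be reproduced in miniature with a multi-server discrete-event queue; the arrival and treatment rates below are invented, but the qualitative effect (pooled servers cut waiting at equal utilization) matches the finding above.

        import heapq, random

        def avg_wait(n_servers, arrival_rate, service_rate, n_jobs=20000, seed=2):
            """Average wait in a multi-server queue, simulated with a heap
            of server-free times (a minimal multi-server DES)."""
            random.seed(seed)
            free_at = [0.0] * n_servers
            t = total_wait = 0.0
            for _ in range(n_jobs):
                t += random.expovariate(arrival_rate)
                s = heapq.heappop(free_at)          # earliest-free therapist
                start = max(t, s)
                total_wait += start - t
                heapq.heappush(free_at, start + random.expovariate(service_rate))
            return total_wait / n_jobs

        # One dedicated stream (M/M/1) versus a pooled M/M/2 at equal utilization:
        print("dedicated:", avg_wait(1, arrival_rate=0.4, service_rate=0.5))
        print("pooled:   ", avg_wait(2, arrival_rate=0.8, service_rate=0.5))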

  6. Observing System Simulations for the GOLD and ICON Missions

    NASA Astrophysics Data System (ADS)

    Solomon, S. C.

    2015-12-01

    We present results from a comprehensive observing simulation system for remote sensing observations by the Global-scale Observations of the Limb and Disk (GOLD) and Ionospheric Connection Explorer (ICON) missions. These NASA missions to explore the terrestrial thermosphere and ionosphere are planned for launch in 2017. The GOLD instrument is a far-ultraviolet spectrograph to be deployed on a commercial communications satellite at geostationary orbit. It will measure thermospheric temperature and composition during the day, and electron density at night, to understand the global effects of solar and geomagnetic events on the thermosphere-ionosphere system. ICON is an Explorer-class satellite to be launched into low-Earth orbit at 27 degrees inclination, and will measure thermosphere and ionosphere parameters with multiple remote-sensing and in-situ instruments, to discover the connections between lower atmosphere weather and changes in the low-latitude ionosphere. Instrument development, algorithm development, and ultimately data analysis, depend on a robust capability for simulating what we expect the instruments to observe. Therefore, we are constructing a flexible software system to serve both missions. The system uses simulations from general circulation models of the atmosphere-ionosphere system as input to an airglow model, performs spectral synthesis calculations, and applies observational parameters to predict what the instruments may observe from a generalized viewing geometry that can be applied to other space-based remote sensing measurements as well. In this presentation, we describe the system architecture and methodology, and present preliminary simulation results.

  7. Final Technical Report "Multiscale Simulation Algorithms for Biochemical Systems"

    SciTech Connect

    Petzold, Linda R.

    2012-10-25

    Biochemical systems are inherently multiscale and stochastic. In microscopic systems formed by living cells, the small numbers of reactant molecules can result in dynamical behavior that is discrete and stochastic rather than continuous and deterministic. An analysis tool that respects these dynamical characteristics is the stochastic simulation algorithm (SSA, Gillespie, 1976), a numerical simulation procedure that is essentially exact for chemical systems that are spatially homogeneous or well stirred. Despite recent improvements, as a procedure that simulates every reaction event, the SSA is necessarily inefficient for most realistic problems. There are two main reasons for this, both arising from the multiscale nature of the underlying problem: (1) stiffness, i.e. the presence of multiple timescales, the fastest of which are stable; and (2) the need to include in the simulation both species that are present in relatively small quantities and should be modeled by a discrete stochastic process, and species that are present in larger quantities and are more efficiently modeled by a deterministic differential equation (or at some scale in between). This project has focused on the development of fast and adaptive algorithms, and the fundamental theory upon which they must be based, for the multiscale simulation of biochemical systems. Areas addressed by this project include: (1) Theoretical and practical foundations for accelerated discrete stochastic simulation (tau-leaping); (2) Dealing with stiffness (fast reactions) in an efficient and well-justified manner in discrete stochastic simulation; (3) Development of adaptive multiscale algorithms for spatially homogeneous discrete stochastic simulation; (4) Development of high-performance SSA algorithms.
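
    For reference, the SSA named above admits a compact implementation (Gillespie's direct method); the birth-death system used to exercise it is a toy example, not one of the project's benchmark problems.

        import random

        def ssa(x, reactions, t_end, seed=0):
            """Gillespie's direct-method SSA: simulates every reaction event
            exactly for a well-stirred system."""
            random.seed(seed)
            t = 0.0
            while t < t_end:
                props = [a(x) for a, _ in reactions]
                a0 = sum(props)
                if a0 == 0.0:
                    break
                t += random.expovariate(a0)          # time to the next event
                r = random.uniform(0.0, a0)          # pick which reaction fires
                for (_, change), a in zip(reactions, props):
                    r -= a
                    if r <= 0.0:
                        x = [xi + d for xi, d in zip(x, change)]
                        break
            return t, x

        # Toy birth-death system: 0 -> S at rate 10; S -> 0 at rate 0.1*S.
        reactions = [(lambda x: 10.0, [+1]), (lambda x: 0.1 * x[0], [-1])]
        print(ssa([0], reactions, t_end=100.0))      # steady state near S = 100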

  8. Physics Detector Simulation Facility Phase II system software description

    SciTech Connect

    Scipioni, B.; Allen, J.; Chang, C.; Huang, J.; Liu, J.; Mestad, S.; Pan, J.; Marquez, M.; Estep, P.

    1993-05-01

    This paper presents the Physics Detector Simulation Facility (PDSF) Phase II system software. A key element in the design of a distributed computing environment for the PDSF has been the separation and distribution of the major functions. The facility has been designed to support batch and interactive processing, and to incorporate the file and tape storage systems. By distributing these functions, it is often possible to provide higher throughput and resource availability. Similarly, the design is intended to exploit event-level parallelism in an open distributed environment.
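
    Event-level parallelism of the kind mentioned above exploits the independence of detector events; a minimal sketch (with a stand-in reconstruction function, not PDSF code) is:

        from multiprocessing import Pool

        def process_event(event):
            """Each detector event is independent, so events can be farmed
            out to workers with no shared state (event-level parallelism)."""
            hits = event["hits"]
            return sum(hits) / len(hits)    # stand-in for real reconstruction

        if __name__ == "__main__":
            events = [{"hits": list(range(1, n))} for n in range(2, 1000)]
            with Pool() as pool:
                results = pool.map(process_event, events)
            print(len(results))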

  9. Rice Convection Model simulation of the 18 April 2002 sawtooth event and evidence for interchange instability

    NASA Astrophysics Data System (ADS)

    Yang, J.; Toffoletto, F. R.; Wolf, R. A.; Sazykin, S.; Spiro, R. W.; Brandt, P. C.; Henderson, M. G.; Frey, H. U.

    2008-11-01

    We present the results of a Rice Convection Model (RCM) simulation of the 18 April 2002 sawtooth event. This event occurred as a series of quasi-periodic substorms during fairly stable solar wind conditions. It is modeled by (1) prescribing a solar-wind-driven magnetic field model (T01_s) augmented by additional current loops representing the magnetic effects of the substorm current wedge and (2) by carefully specifying a substorm-phase-dependent plasma distribution at the RCM outer boundary at 8 Re such that a hot and attenuated plasma distribution is used after every substorm onset. The set of input parameters was adjusted to make the simulation results agree with the primary signatures of the sawtooth event, specifically the sequence of magnetic field stretching and dipolarization observed by the GOES spacecraft and the associated sharp increases and gradual decreases in the flux of energetic protons measured by the LANL/Synchronous Orbit Plasma Analyzer (SOPA) instruments on other geosynchronous spacecraft. The results suggest the important role that higher temperature and lower density plasma-sheet plasma plays in producing flux enhancements at geosynchronous orbit. The results also confirm that induction electric fields associated with magnetic field collapse after substorm onsets can serve as a likely mechanism for the energization of particles up to 25 keV. Synthetic high-energy neutral atom images are compared with IMAGE/HENA measurements for 10-60 keV hydrogen atoms. Magnetic field dipolarization over a large range of local time resulted in a dramatic reduction in the plasma entropy parameter PV^(5/3) on the boundary. The simulation indicates that the ring current intensified 10-20 minutes after every onset, associated with the injection of low PV^(5/3) flux tubes through the boundary. The low PV^(5/3) plasma also produced an interchange convection in the inner magnetosphere, which drives Birkeland currents in a quasi-periodic upward-downward pattern with a

  10. Evaluation of the southern California seismic velocity models through simulation of recorded events

    NASA Astrophysics Data System (ADS)

    Taborda, Ricardo; Azizzadeh-Roodpish, Shima; Khoshnevis, Naeem; Cheng, Keli

    2016-06-01

    Significant effort has been devoted over the last two decades to the development of various seismic velocity models for the region of southern California, United States. These models are mostly used in forward wave propagation simulation studies, but also as base models for tomographic and source inversions. Two of these models, the community velocity models CVM-S and CVM-H, are among the most commonly used for this region. This includes two alternative variations to the original models: the recently released CVM-S4.26, which incorporates results from a sequence of tomographic inversions into CVM-S, and the user-controlled option of CVM-H to replace the near-surface profiles with a VS30-based geotechnical model. Although either one of these models is regarded as acceptable by the modeling community, it is known that they differ in their representation of the crustal structure and sedimentary deposits in the region, and thus can lead to different results in forward and inverse problems. In this paper, we evaluate the accuracy of these models when used to predict the ground motion in the greater Los Angeles region by means of an assessment of a collection of simulations of recent events. In total, we consider 30 moderate-magnitude earthquakes (3.5 < Mw < 5.5) between 1998 and 2014, and compare synthetics with data recorded by seismic networks during these events. The simulations are done using a finite-element parallel code, with numerical models that satisfy a maximum frequency of 1 Hz and a minimum shear wave velocity of 200 m s^-1. The comparisons between data and synthetics are ranked quantitatively by means of a goodness-of-fit (GOF) criterion. We analyse the regional distribution of the GOF results for all events and all models, and draw conclusions from the results and how these correlate to the models. We find that, in light of our comparisons, the model CVM-S4.26 consistently yields better results.

  11. Explicit simulation of a midlatitude Mesoscale Convective System

    SciTech Connect

    Alexander, G.D.; Cotton, W.R.

    1996-04-01

    We have explicitly simulated the mesoscale convective system (MCS) observed on 23-24 June 1985 during PRE-STORM, the Preliminary Regional Experiment for the Stormscale Operational and Research Meteorology Program. Stensrud and Maddox (1988), Johnson and Bartels (1992), and Bernstein and Johnson (1994) are among the researchers who have investigated various aspects of this MCS event. We have performed this MCS simulation (and a similar one of a tropical MCS; Alexander and Cotton 1994) in the spirit of the Global Energy and Water Cycle Experiment Cloud Systems Study (GCSS), in which cloud-resolving models are used to assist in the formulation and testing of cloud parameterization schemes for larger-scale models. In this paper, we describe (1) the nature of our 23-24 June MCS simulation and (2) our efforts to date in using our explicit MCS simulations to assist in the development of a GCM parameterization for mesoscale flow branches. The paper is organized as follows. First, we discuss the synoptic situation surrounding the 23-24 June PRE-STORM MCS, followed by a discussion of the model setup and the results of our simulation. We then discuss the use of our MCS simulations in developing a GCM parameterization for mesoscale flow branches and summarize our results.

  12. Behavioral and Physiological Responses of Calves to Marshalling and Roping in a Simulated Rodeo Event

    PubMed Central

    Sinclair, Michelle; Keeley, Tamara; Lefebvre, Anne-Cecile; Phillips, Clive J. C.

    2016-01-01

    Simple Summary Rodeos often include a calf roping event, in which a calf is first lassoed by a rider on a horse, who then dismounts, ties the calf's legs, lifts it from the ground and releases it back to the ground. We tested whether calves that were familiar with roping experience stress during the roping event, and found increased concentrations of stress hormones in their blood after the roping. We also found increased concentrations of stress hormones in the blood of calves that had never been roped before but were just marshalled across the arena by the horse and rider. We conclude that the roping event in rodeos is stressful for both experienced and naïve calves. Abstract Rodeos are public events at which stockpeople face tests of their ability to manage cattle and horses, some of which relate directly to rangeland cattle husbandry. One of these is calf roping, in which a calf released from a chute is pursued by a horse and rider, who lassoes, lifts and drops the calf to the ground and finally ties its legs. Measurements were made of behavior and stress responses of ten rodeo-naïve calves marshalled by a horse and rider, and ten rodeo-experienced calves that were roped. Naïve calves marshalled by a horse and rider traversed the arena slowly, whereas rodeo-experienced calves ran rapidly until roped. Each activity was repeated once after two hours. Blood samples taken before and after each activity demonstrated increased cortisol, epinephrine and nor-epinephrine in both groups. However, there was no evidence of a continued increase in stress hormones in either group by the start of the repeated activity, suggesting that the elevated stress hormones were not a response to a prolonged effect of the initial blood sampling. It is concluded that both the marshalling of calves naïve to the roping chute by stockpeople and the roping and dropping of experienced calves are stressful in a simulated rodeo calf roping event. PMID:27136590

  13. Communication Simulations for Power System Applications

    SciTech Connect

    Fuller, Jason C.; Ciraci, Selim; Daily, Jeffrey A.; Fisher, Andrew R.; Hauer, Matthew L.

    2013-05-29

    New smart grid technologies and concepts, such as dynamic pricing, demand response, dynamic state estimation, and wide area monitoring, protection, and control, are expected to require considerable communication resources. As the cost of retrofit can be high, future power grids will require the integration of high-speed, secure connections with legacy communication systems, while still providing adequate system control and security. While considerable work has been performed to create co-simulators for the power domain with load models and market operations, limited work has been performed in integrating communications directly into a power domain solver. The simulation of communication and power systems will become more important as the two systems become more inter-related. This paper will discuss ongoing work at Pacific Northwest National Laboratory to create a flexible, high-speed power and communication system co-simulator for smart grid applications. The framework for the software will be described, including architecture considerations for modular, high-performance computing and large-scale scalability (serialization, load balancing, partitioning, cross-platform support, etc.). The current simulator supports the ns-3 (telecommunications) and GridLAB-D (distribution systems) simulators. Ongoing and future work will be described, including planned future expansions for a traditional transmission solver. A test case using the co-simulator, based on a transactive demand response system created for the Olympic Peninsula and AEP gridSMART demonstrations and requiring two-way communication between distributed and centralized market devices, will be used to demonstrate the value and intended purpose of the co-simulation environment.
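
    To make the co-simulation concept concrete, the sketch below steps a toy power solver and a toy communication model in lock step, with metering data crossing the network before a controller reacts. The PowerSim and CommSim classes are hypothetical stand-ins, not the PNNL framework's API; real couplings of GridLAB-D and ns-3 use a broker and more careful time synchronization.

    # Lock-step co-simulation sketch (hypothetical stand-in classes).
    class PowerSim:
        def __init__(self):
            self.load_kw = 100.0
        def step(self, setpoints):
            # Apply control setpoints, advance one step, return meter data.
            self.load_kw = max(0.0, self.load_kw - 0.1 * setpoints.get("curtail_kw", 0.0))
            return {"load_kw": self.load_kw}

    class CommSim:
        def __init__(self):
            self.in_flight = None
        def step(self, message):
            # Toy store-and-forward network: one time step of delay.
            delivered, self.in_flight = self.in_flight, message
            return delivered  # None until the first message arrives

    power, network = PowerSim(), CommSim()
    setpoints = {}
    for t in range(10):                     # ten one-second steps
        meters = power.step(setpoints)      # power domain advances first
        delivered = network.step(meters)    # meter data traverses the network
        # A market/controller acts only on data that has actually arrived.
        setpoints = ({"curtail_kw": 5.0}
                     if delivered and delivered["load_kw"] > 95.0 else {})
    print(f"final load: {power.load_kw:.1f} kW")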

  14. High Resolution Simulation of a Colorado Rockies Extreme Snow and Rain Event in both a Current and Future Climate

    NASA Astrophysics Data System (ADS)

    Rasmussen, Roy; Ikeda, Kyoko; Liu, Changhai; Gutmann, Ethan; Gochis, David

    2016-04-01

    Modeling of extreme weather events often requires very finely resolved treatment of atmospheric circulation structures in order to produce and localize the large moisture fluxes that result in extreme precipitation. This is particularly true for cool season orographic precipitation processes, where the representation of the landform can significantly impact vertical velocity profiles and cloud moisture entrainment rates. This study presents results from a high-resolution regional climate modeling study of the Colorado Headwaters region using an updated version of the Weather Research and Forecasting (WRF) model run at 4 km horizontal resolution and a hydrological extension package called WRF-Hydro. Previous work has shown that the WRF modeling system can produce credible depictions of winter orographic precipitation over the Colorado Rockies if run at horizontal resolutions < 6 km. Here we present results from a detailed study of an extreme springtime snowfall event that occurred along the Colorado Front Range in March 2003. Results from the impact of warming on total precipitation, snow-rain partitioning and surface hydrological fluxes (evapotranspiration and runoff) will be discussed in the context of how potential changes in temperature impact the amount of precipitation, the phase of precipitation (rain vs. snow) and the timing and amplitude of streamflow responses. Using the Pseudo Global Warming technique, the results show that intense precipitation rates increase significantly during the event and that a significant fraction of the snowfall converts to rain, amplifying the runoff response from one in which runoff is produced gradually to one in which it is rapidly translated into streamflow approaching significant flooding risk. Results from a new, CONUS-scale high resolution climate simulation of extreme events in a current and future climate will be presented as time permits.
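
    The core of the Pseudo Global Warming technique is a simple perturbation of the event's boundary conditions by a mean climate-change signal, as sketched below. The arrays and the uniform +2 K delta are illustrative assumptions; in practice the deltas come from monthly-mean GCM differences interpolated to the regional model grid.

    import numpy as np

    # Pseudo Global Warming sketch: re-run a historical event with the
    # reanalysis boundary conditions shifted by a GCM climate-change delta.
    rng = np.random.default_rng(0)
    reanalysis_T = rng.normal(280.0, 5.0, (10, 20, 20))  # K, (level, lat, lon)

    gcm_current_T = reanalysis_T              # stand-in: GCM current-climate mean
    gcm_future_T = gcm_current_T + 2.0        # stand-in: GCM future-climate mean

    delta_T = gcm_future_T - gcm_current_T    # the climate-change signal
    pgw_boundary_T = reanalysis_T + delta_T   # perturbed boundary condition

    print(f"imposed warming: {pgw_boundary_T.mean() - reanalysis_T.mean():.1f} K")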

  15. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs to a network simulator, to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or to determine modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interactions. The paper reviews explanations for these results and implications for modeling and simulation.
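
    The sketch below illustrates the t-way idea for t = 2: a small greedy search that builds a test suite covering every pair of parameter values at least once. The configuration factors are hypothetical, and production tools (e.g., NIST's ACTS/IPOG) use far more efficient constructions than this brute-force greedy loop.

    from itertools import combinations, product

    params = {                    # hypothetical network-configuration factors
        "topology": ["ring", "star", "mesh"],
        "queue":    ["fifo", "priority"],
        "nodes":    [8, 16, 32],
    }
    names = list(params)

    def pairs_of(test):
        """All 2-way (name, value) pairs covered by one full test case."""
        return {((a, test[a]), (b, test[b])) for a, b in combinations(names, 2)}

    # Enumerate every 2-way combination that must be covered.
    uncovered = {((a, va), (b, vb))
                 for a, b in combinations(names, 2)
                 for va in params[a] for vb in params[b]}

    suite = []
    while uncovered:
        # Greedily add the full test case covering the most uncovered pairs.
        best = max(product(*params.values()),
                   key=lambda vs: len(pairs_of(dict(zip(names, vs))) & uncovered))
        test = dict(zip(names, best))
        suite.append(test)
        uncovered -= pairs_of(test)

    print(f"{len(suite)} tests cover all 2-way interactions")
    for case in suite:
        print(case)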

  16. Simulation and intelligent vehicle highway systems

    SciTech Connect

    Rathi, A.K.; Santiago, A.J.

    1992-01-01

    Intelligent Vehicle Highway Systems (IVHS) is based on the premise of using advanced technologies in telecommunication, electronics, and computers to improve the nature and quality of highway travel while making it safer and more efficient. The safety benefits of IVHS are unquestioned; however, there are different levels of optimism about the operational benefits of these systems. While there is a broad consensus that IVHS can improve the flow of traffic, and thus mobility, there is currently very limited empirical evidence or analytical basis to support this optimism. The lack of an analytical framework for the design, analysis, and evaluation of IVHS concepts will continue to fuel the debate between the skeptics and the advocates of IVHS. Computer simulation is likely to play a major role in the analysis and assessment of IVHS technologies. In this paper, we attempt to identify the simulation modelling needs to support the IVHS functional areas dealing with traffic flow on highway networks. The paper outlines the envisioned IVHS operational environment. Functional requirements for the simulation modelling system that could be used to support the development and testing of IVHS concepts, namely Advanced Traffic Management Systems (ATMS) and Advanced Traveller Information Systems (ATIS), are defined. Simulation modelling research and development needs to support the design and evaluation of IVHS concepts are described. The paper concludes by presenting on-going work on the traffic simulation models at the Oak Ridge National Laboratory.

  17. Simulation and intelligent vehicle highway systems

    SciTech Connect

    Rathi, A.K.; Santiago, A.J.

    1992-09-01

    Intelligent Vehicle Highway Systems (IVHS) is based on the premise of using advanced technologies in telecommunication, electronics, and computers to improve the nature and quality of highway travel while making it safer and more efficient. The safety benefits of IVHS are unquestioned; however, there are different levels of optimism about the operational benefits of these systems. While there is a broad consensus that IVHS can improve the flow of traffic, and thus mobility, there is currently very limited empirical evidence or analytical basis to support this optimism. The lack of an analytical framework for the design, analysis, and evaluation of IVHS concepts will continue to fuel the debate between the skeptics and the advocates of IVHS. Computer simulation is likely to play a major role in the analysis and assessment of IVHS technologies. In this paper, we attempt to identify the simulation modelling needs to support the IVHS functional areas dealing with traffic flow on highway networks. The paper outlines the envisioned IVHS operational environment. Functional requirements for the simulation modelling system that could be used to support the development and testing of IVHS concepts, namely Advanced Traffic Management Systems (ATMS) and Advanced Traveller Information Systems (ATIS), are defined. Simulation modelling research and development needs to support the design and evaluation of IVHS concepts are described. The paper concludes by presenting on-going work on the traffic simulation models at the Oak Ridge National Laboratory.

  18. Numerical simulations of Asian dust events: A Lagrangian Dust Model and its applications

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-Hee; Lee, Hyo-Jung

    2013-11-01

    A uni-modal Lagrangian Dust Model (LDM) was developed to simulate the dust concentrations and source-receptor (S-R) relationships for recent Asian dust events that occurred over the Korean Peninsula. The following dust sources were used for the S-R calculation in this study: S-I) Gurbantunggut desert, S-II) Taklamakan desert, S-III) Tibetan Plateau, S-IV) Mu Us Desert, S-V) Manchuria, and S-VI) Nei Mongol and Gobi Desert. The following two 8-day dust simulation periods were selected as case studies: (Period A) March 15-22, 2011, and (Period B) April 27-May 4, 2011. During both periods, highly dense dust onsets were observed over a wide area of Korea. Meteorological fields were generated using the WRF (Weather Research and Forecasting) meteorological model, and Lagrangian turbulent properties and dust emission were estimated using the FLEXPART model and ADAM2 (Asian Dust Aerosol Model 2), respectively. The simulated dust concentrations are compared with point measurements and Eulerian model outputs. Statistical techniques were also employed to determine the accuracy and uncertainty associated with the model results. The results showed that the LDM compared favorably with observations for some sites; however, for most sites the model overestimated the observations. Analysis of S-R relationships showed that 38-50% of dust particles originated from Nei Mongol and the Gobi Desert, and 16-25% originated from Manchuria, together accounting for most of the dust particles in Korea. Because there is no nudging or other artificial forcing included in the LDM, higher error indicators (e.g., root mean square error, absolute gross error) were found for some sites. However, the LDM was able to satisfactorily simulate the maximum timing and starting time of dust events for most sites. Compared with the Eulerian model, ADAM2, the LDM produced pattern correlations (PCs) of 0.78-0.83 and indices of agreement (IOAs) greater than 0.6, suggesting that
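
    The two skill scores quoted at the end of the abstract have simple closed forms; the sketch below computes Pearson pattern correlation and Willmott's index of agreement for a hypothetical pair of observed and simulated concentration series.

    import numpy as np

    def pattern_correlation(obs, sim):
        # Pearson correlation between observed and simulated series.
        return np.corrcoef(obs, sim)[0, 1]

    def index_of_agreement(obs, sim):
        # Willmott's IOA: 1 means perfect agreement, 0 means none.
        o_bar = obs.mean()
        num = np.sum((sim - obs) ** 2)
        den = np.sum((np.abs(sim - o_bar) + np.abs(obs - o_bar)) ** 2)
        return 1.0 - num / den

    obs = np.array([30.0, 80.0, 250.0, 400.0, 120.0])  # hypothetical PM10, ug/m3
    sim = np.array([25.0, 95.0, 210.0, 430.0, 100.0])
    print(f"PC  = {pattern_correlation(obs, sim):.2f}")
    print(f"IOA = {index_of_agreement(obs, sim):.2f}")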

  19. Simulation Of A Photofission-Based Cargo Interrogation System

    SciTech Connect

    King, Michael; Gozani, Tsahi; Stevenson, John; Shaw, Timothy

    2011-06-01

    A comprehensive model has been developed to characterize and optimize the detection of Bremsstrahlung x-ray induced fission signatures from nuclear materials hidden in cargo containers. An effective active interrogation system should not only induce a large number of fission events but also efficiently detect their signatures. The proposed scanning system utilizes a 9-MV commercially available linear accelerator and the detection of strong fission signals, i.e., delayed gamma rays and prompt neutrons. Because the scanning system is complex and the cargo containers are large and often highly attenuating, the simulation method segments the model into several physical steps, each representing a change of radiation particle. Each step is carried out separately, resulting in a major reduction in computational time and a significant improvement in tally statistics. The model investigates the effect on the fission rate and detection rate of various cargo types, densities and distributions. Hydrogenous and metallic cargos, homogeneous and heterogeneous, as well as various locations of the nuclear material inside the cargo container were studied. We will show that for the photofission-based interrogation system simulation, the final results are not only in good agreement with a full, single-step simulation but also with experimental results, further validating the full-system simulation.

  20. An Open Source Simulation System

    NASA Technical Reports Server (NTRS)

    Slack, Thomas

    2005-01-01

    An investigation into the current state of the art of open source real-time programming practices. This document covers what technologies are available; how easy it is to obtain, configure, and use them; and some performance measures taken on the different systems. A matrix of vendors and their products is included as part of this investigation, but this is not an exhaustive list, and represents only a snapshot in time in a field that is changing rapidly. Specifically, three approaches are investigated: 1. Completely open source software on generic hardware, downloaded from the net. 2. Open source software packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded, source-available proprietary software provided by the vendor for our evaluation.

  1. Single-event response of the SiGe HBT in TCAD simulations and laser microbeam experiment

    NASA Astrophysics Data System (ADS)

    Li, Pei; Guo, Hong-Xia; Guo, Qi; Zhang, Jin-Xin; Xiao, Yao; Wei, Ying; Cui, Jiang-Wei; Wen, Lin; Liu, Mo-Han; Wang, Xin

    2015-08-01

    In this paper the single-event responses of silicon-germanium heterojunction bipolar transistors (SiGe HBTs) are investigated by TCAD simulations and a laser microbeam experiment. A three-dimensional (3D) simulation model is established and a single event effect (SEE) simulation is carried out on the basis of SiGe HBT devices; then, together with the laser microbeam test, the charge collection behaviors are analyzed, including the single event transient (SET) induced transient terminal currents and the sensitive area of SEE charge collection. The simulation and experimental results are discussed in detail, and it is demonstrated that the nature of the current transient is controlled by the behavior of the collector-substrate (C/S) junction and by charge collection at sensitive electrodes, thereby identifying the sensitive area and electrode of the SiGe HBT in SEE. Project supported by the National Natural Science Foundation of China (Grant No. 61274106).

  2. Observing System Simulation Experiments: An Overview

    NASA Technical Reports Server (NTRS)

    Prive, Nikki C.; Errico, Ronald M.

    2016-01-01

    An overview of Observing System Simulation Experiments (OSSEs) will be given, with focus on calibration and validation of OSSE frameworks. Pitfalls and practices will be discussed, including observation error characteristics, incestuousness, and experimental design. The potential use of OSSEs for investigation of the behaviour of data assimilation systems will be explored, including some results from experiments using the NASA GMAO OSSE.

  3. Space shuttle visual simulation system design study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The current and near-future state-of-the-art in visual simulation equipment technology is related to the requirements of the space shuttle visual system. Image source, image sensing, and displays are analyzed on a subsystem basis, and the principal conclusions are used in the formulation of a recommended baseline visual system. Perceptibility and visibility are also analyzed.

  4. Reduced salinity increases susceptibility of zooxanthellate jellyfish to herbicide toxicity during a simulated rainfall event.

    PubMed

    Klein, Shannon G; Pitt, Kylie A; Carroll, Anthony R

    2016-02-01

    Accurately predicting how marine biota are likely to respond to changing ocean conditions requires accurate simulation of interacting stressors, exposure regimes and recovery periods. Jellyfish populations have increased in some parts of the world and, despite few direct empirical tests, are hypothesised to be increasing because they are robust to a range of environmental stressors. Here, we investigated the effects of contaminated runoff on a zooxanthellate jellyfish by exposing juvenile Cassiopea sp. medusae to a photosystem II (PSII) herbicide, atrazine, and the reduced salinity conditions that occur following rainfall. Four levels of atrazine (0 ng L⁻¹, 10 ng L⁻¹, 2 μg L⁻¹, 20 μg L⁻¹) and three levels of salinity (35 ppt, 25 ppt, 17 ppt) were varied, mimicking the timeline of light, moderate and heavy rainfall events. Normal conditions were then slowly re-established over four days to mimic the recovery of the ecosystem post-rain, and the experiment continued for a further 7 days to observe potential recovery of the medusae. Pulse-amplitude modulated (PAM) chlorophyll fluorescence, growth and bell contraction rates of medusae were measured. Medusae exposed to the combination of high atrazine and the lowest salinity died. After 3 days of exposure, bell contraction rates were reduced by 88% and medusae were 16% smaller in the lowest salinity treatments. By Day 5 of the experiment, all medusae that survived the initial pulse event began to recover quickly. Although atrazine decreased the effective PSII quantum yield, Y(II), under normal salinity conditions, Y(II) was further reduced when medusae were exposed to both low salinity and atrazine simultaneously. Atrazine breakdown products were more concentrated in jellyfish tissues than atrazine at the end of the experiment, suggesting that although bioaccumulation occurred, atrazine was metabolised. Our results suggest that reduced salinity may increase the susceptibility of medusae to herbicide exposure during heavy rainfall events. PMID:26647170

  5. Rare-event Simulation for Stochastic Korteweg-de Vries Equation

    SciTech Connect

    Xu, Gongjun; Lin, Guang; Liu, Jingchen

    2014-01-01

    An asymptotic analysis of the tail probabilities for the dynamics of a soliton wave $U(x,t)$ under a stochastic time-dependent force is developed. The dynamics of the soliton wave $U(x,t)$ is described by the Korteweg-de Vries equation with homogeneous Dirichlet boundary conditions under a stochastic time-dependent force, which is modeled as a time-dependent Gaussian noise with amplitude $\epsilon$. The tail probability considered is $w(b) := P(\sup_{t\in [0,T]} U(x,t) > b)$ as $b\rightarrow \infty$, for some constant $T>0$ and a fixed $x$, which can be interpreted as the tail probability of the amplitude of a water wave on the shallow surface of a fluid or of a long internal wave in a density-stratified ocean. Our goal is to characterize the asymptotic behavior of $w(b)$ and to evaluate the tail probability of the event that the soliton wave exceeds a certain threshold value under a random force term. Such rare-event calculation of $w(b)$ is very useful for fast estimation of the risk of the potential damage that could be caused by the water wave in a density-stratified ocean modeled by the stochastic KdV equation. In this work, the asymptotic approximation of the probability that the soliton wave exceeds a high level $b$ is derived. In addition, we develop a provably efficient rare-event simulation algorithm to compute $w(b)$. The efficiency of the algorithm requires only mild conditions, and therefore it is applicable to a general class of Gaussian processes and many diverse applications.
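
    The idea behind efficient rare-event simulation can be illustrated on a much simpler problem: estimating a Gaussian tail probability P(X > b) by exponential tilting. The sketch below is a generic importance-sampling example, not the paper's KdV algorithm; it shows why a tilted sampler with likelihood-ratio reweighting succeeds where naive Monte Carlo sees almost no hits.

    import math
    import numpy as np

    rng = np.random.default_rng(1)
    b, n = 5.0, 100_000

    # Naive Monte Carlo: essentially no samples land beyond b = 5.
    naive = float((rng.standard_normal(n) > b).mean())

    # Importance sampling: draw from N(b, 1), centered on the rare event, and
    # reweight by the likelihood ratio phi(x) / phi(x - b) = exp(-b*x + b^2/2).
    x = rng.standard_normal(n) + b
    weights = np.exp(-b * x + 0.5 * b ** 2)
    tilted = float(np.mean(weights * (x > b)))

    exact = 0.5 * math.erfc(b / math.sqrt(2.0))   # true Gaussian tail
    print(f"exact  {exact:.3e}")
    print(f"naive  {naive:.3e}")
    print(f"tilted {tilted:.3e}")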

  6. Reduced salinity increases susceptibility of zooxanthellate jellyfish to herbicide toxicity during a simulated rainfall event.

    PubMed

    Klein, Shannon G; Pitt, Kylie A; Carroll, Anthony R

    2016-02-01

    Accurately predicting how marine biota are likely to respond to changing ocean conditions requires accurate simulation of interacting stressors, exposure regimes and recovery periods. Jellyfish populations have increased in some parts of the world and, despite few direct empirical tests, are hypothesised to be increasing because they are robust to a range of environmental stressors. Here, we investigated the effects of contaminated runoff on a zooxanthellate jellyfish by exposing juvenile Cassiopea sp. medusae to a photosystem II (PSII) herbicide, atrazine, and the reduced salinity conditions that occur following rainfall. Four levels of atrazine (0 ng L⁻¹, 10 ng L⁻¹, 2 μg L⁻¹, 20 μg L⁻¹) and three levels of salinity (35 ppt, 25 ppt, 17 ppt) were varied, mimicking the timeline of light, moderate and heavy rainfall events. Normal conditions were then slowly re-established over four days to mimic the recovery of the ecosystem post-rain, and the experiment continued for a further 7 days to observe potential recovery of the medusae. Pulse-amplitude modulated (PAM) chlorophyll fluorescence, growth and bell contraction rates of medusae were measured. Medusae exposed to the combination of high atrazine and the lowest salinity died. After 3 days of exposure, bell contraction rates were reduced by 88% and medusae were 16% smaller in the lowest salinity treatments. By Day 5 of the experiment, all medusae that survived the initial pulse event began to recover quickly. Although atrazine decreased the effective PSII quantum yield, Y(II), under normal salinity conditions, Y(II) was further reduced when medusae were exposed to both low salinity and atrazine simultaneously. Atrazine breakdown products were more concentrated in jellyfish tissues than atrazine at the end of the experiment, suggesting that although bioaccumulation occurred, atrazine was metabolised. Our results suggest that reduced salinity may increase the susceptibility of medusae to herbicide exposure during heavy rainfall events.

  7. Applications Of Monte Carlo Radiation Transport Simulation Techniques For Predicting Single Event Effects In Microelectronics

    SciTech Connect

    Warren, Kevin; Reed, Robert; Weller, Robert; Mendenhall, Marcus; Sierawski, Brian; Schrimpf, Ronald

    2011-06-01

    MRED (Monte Carlo Radiative Energy Deposition) is Vanderbilt University's Geant4 application for simulating radiation events in semiconductors. Geant4 comprises the best available computational physics models for the transport of radiation through matter. In addition to the basic radiation transport physics contained in the Geant4 core, MRED has the capability to track energy loss in tetrahedral geometric objects, includes a cross section biasing and track weighting technique for variance reduction, and provides additional features relevant to semiconductor device applications. The crucial element of predicting Single Event Upset (SEU) parameters using radiation transport software is the creation of a dosimetry model that accurately approximates the net collected charge at transistor contacts as a function of deposited energy. The dosimetry technique described here is the multiple sensitive volume (MSV) model. It is shown to be a reasonable approximation of the charge collection process, and its parameters can be calibrated to experimental measurements of SEU cross sections. The MSV model, within the framework of MRED, is examined for heavy ion and high-energy proton SEU measurements of a static random access memory.
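
    A minimal sketch of the multiple-sensitive-volume idea is shown below: collected charge is approximated as an efficiency-weighted sum of the energy deposited in a set of nested volumes, compared against a critical charge. All coefficients, energies, and the critical charge here are hypothetical; in MRED they are calibrated against measured SEU cross sections.

    import numpy as np

    # ~3.6 eV per electron-hole pair in silicon gives roughly 22.5 keV per fC.
    FC_PER_KEV = 1.0 / 22.5

    alphas = np.array([1.0, 0.6, 0.3, 0.1])           # collection efficiency per volume
    e_dep_kev = np.array([120.0, 300.0, 80.0, 40.0])  # deposited energy per volume (keV)

    q_collected = FC_PER_KEV * np.sum(alphas * e_dep_kev)  # weighted-sum charge (fC)
    q_critical = 10.0                                      # hypothetical critical charge (fC)
    print(f"collected {q_collected:.1f} fC -> upset: {q_collected > q_critical}")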

  8. Thermal enclosure system functional simulation user's manual

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    1994-01-01

    A form and function simulation of the thermal enclosure system (TES) for a microgravity protein crystal growth experiment has been developed as part of an investigation of the benefits and limitations of intravehicular telerobotics to aid in microgravity science and production. A user can specify the time, temperature, and sample rate profile for a given experiment, and menu options and status are presented on an LCD display. This report describes the features and operational procedures for the functional simulation.

  9. AMCOM RDEC ladar HWIL simulation system development

    NASA Astrophysics Data System (ADS)

    Kim, Hajin J.; Mobley, Scottie B.; Buford, James A., Jr.

    2003-09-01

    Hardware-in-the-loop (HWIL) testing has, for many years, been an integral part of the modeling and simulation efforts at the U.S. Army Aviation and Missile Command's (AMCOM) Aviation and Missile Research, Engineering, and Development Center (AMRDEC). AMCOM's history includes the development, characterization, and implementation of several unique technologies for the creation of synthetic environments in the visible, infrared, and radio frequency spectral regions, and AMCOM has continued significant efforts in these areas. This paper describes recent advancements at AMCOM's Advanced Simulation Center (ASC) and concentrates on Ladar HWIL simulation system development.

  10. Lunar Rocks: Available for Year of the Solar System Events

    NASA Astrophysics Data System (ADS)

    Allen, J. S.

    2010-12-01

    sections may be requested for college and university courses where petrographic microscopes are available for viewing. Requestors should contact Ms. Mary Luckey, Education Sample Curator. Email address: mary.k.luckey@nasa.gov NASA also loans sets of Moon rocks for use in classrooms, libraries, museums, and planetariums through the Lunar Sample Education Program. Lunar samples (three soils and three rocks) are encapsulated in a six-inch diameter clear plastic disk. A CD with PowerPoint presentations, analogue samples from Earth, a classroom activity guide, and additional printed material accompany the disks. Educators may qualify for the use of these disks by attending a content and security certification workshop sponsored by NASA's Aerospace Education Services Program (AESP). Contact Ms. Margaret Maher, AESP Director. Email address: mjm67@psu.edu NASA makes these precious samples available for the public and encourages the use of lunar rocks to highlight Year of the Solar System events. Surely these interesting specimens of another world will enhance the experience of all YSS participants, so please take advantage of these lunar samples and borrow them for events and classes.

  11. River system environmental modeling and simulation methodology

    SciTech Connect

    Rao, N.B.

    1981-01-01

    Several computer models have been built to examine pollution in rivers. However, the current state of the art in this field emphasizes problem solving using specific programs. A general methodology for building and simulating models of river systems is lacking. Thus, the purpose of this research was to develop a methodology which can be used to conceptualize, visualize, construct, and analyze, by means of simulation, models of pollution in river systems. The conceptualization and visualization of these models were facilitated through a network representation. The implementation of the models was accomplished using the capabilities of an existing simulation language, GASP V. The methodology also provides data management facilities for model outputs through the use of the Simulation Data Language (SDL), and high quality plotting facilities through the use of the graphics package DISSPLA (Display Integrated Software System and Plotting Language). Using this methodology, a river system is modeled as consisting of certain elements, namely reaches, junctions, dams, reservoirs, withdrawals and pollutant sources. All these elements of the river system are described in a standard form which has been implemented on a computer. The model, when executed, produces spatial and temporal distributions of the pollutants in the river system. Furthermore, these outputs can be stored in a database and used to produce high quality plots. The result of this research is a methodology for building, implementing and examining the results of models of pollution in river systems.

  12. Multipurpose simulation systems for regional development forecasting

    SciTech Connect

    Kostina, N.I.

    1995-09-01

    We examine the development of automaton-modeling multipurpose simulation systems as an efficient form of simulation software for MIS. Such systems constitute a single problem-oriented package of applications based on a general simulation model, which is equipped with a task source language, interaction tools, file management tools, and an output document editor. The simulation models are described by the method of probabilistic-automaton modeling, which ensures standard representation of models and standardization of the modeling algorithm. Examples of such systems include the demographic forecasting system DEPROG, the VOKON system for assessing the quality of consumer services in terms of free time, and the SONET system for servicing partially accessible customers. The development of computer-aided systems for production and economic control is now moving to the second stage, namely the operationalization of optimization and forecasting problems, whose solution may account for the main economic effect of MIS. Computation and information problems, which were the main focus of the first stage of MIS development, are thus acquiring the role of a source of information for optimization and forecasting problems, in addition to their direct contribution to the preparation and analysis of current production and economic information.

  13. 2000 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2001-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to

  14. 2001 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2002-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to

  15. Spatial Aspects in Biological System Simulations

    PubMed Central

    Resat, Haluk; Costa, Michelle N.; Shankaran, Harish

    2012-01-01

    Mathematical models of the dynamical properties of biological systems aim to improve our understanding of the studied system with the ultimate goal of being able to predict system responses in the absence of experimentation. Despite the enormous advances that have been made in biological modeling and simulation, the inherently multiscale character of biological systems and the stochasticity of biological processes continue to present significant computational and conceptual challenges. Biological systems often consist of well-organized structural hierarchies, which inevitably lead to multiscale problems. This chapter introduces and discusses the advantages and shortcomings of several simulation methods that are being used by the scientific community to investigate the spatiotemporal properties of model biological systems. We first describe the foundations of the methods and then describe their relevance and possible application areas with illustrative examples from our own research. Possible ways to address the encountered computational difficulties are also discussed. PMID:21187236

  16. Classification of single-trial auditory events using dry-wireless EEG during real and motion simulated flight.

    PubMed

    Callan, Daniel E; Durantin, Gautier; Terzibas, Cengiz

    2015-01-01

    Application of neuro-augmentation technology based on dry-wireless EEG may be considerably beneficial for aviation and space operations because of the inherent dangers involved. In this study we evaluate classification performance of perceptual events using a dry-wireless EEG system during motion platform based flight simulation and actual flight in an open cockpit biplane, to determine if the system can be used in the presence of considerable environmental and physiological artifacts. A passive task involving 200 random auditory presentations of a chirp sound was used for evaluation. The advantage of this auditory task is that it does not interfere with the perceptual motor processes involved with piloting the plane. Classification was based on identifying the presentation of a chirp sound vs. silent periods. The use of independent component analysis (ICA) and Kalman filtering to enhance classification performance, by separating brain activity related to the auditory event from other non-task related brain activity and artifacts, was assessed. The results of permutation testing revealed that single trial classification of the presence or absence of an auditory event was significantly above chance for all conditions on a novel test set. The best performance could be achieved with both ICA and Kalman filtering relative to no processing: Platform Off (83.4% vs. 78.3%), Platform On (73.1% vs. 71.6%), Biplane Engine Off (81.1% vs. 77.4%), and Biplane Engine On (79.2% vs. 66.1%). This experiment demonstrates that dry-wireless EEG can be used in environments with considerable vibration, wind, acoustic noise, and physiological artifacts and achieve the good single trial classification performance that is necessary for future successful application of neuro-augmentation technology based on brain-machine interfaces. PMID:25741249
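
    As a simplified illustration of single-trial classification, the sketch below labels synthetic epochs as event vs. silence from per-channel log power using linear discriminant analysis. The data generation is entirely hypothetical, and the study's actual pipeline adds ICA and Kalman filtering ahead of the classifier.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_channels, n_samples = 200, 8, 128

    # Synthetic "EEG": noise plus a weak evoked response in event trials.
    X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
    y = rng.integers(0, 2, n_trials)          # 1 = auditory event present
    X_raw[y == 1, :3, 30:60] += 0.4           # evoked response on 3 channels

    # One feature per channel: log signal power over the epoch.
    features = np.log((X_raw ** 2).mean(axis=2))
    scores = cross_val_score(LinearDiscriminantAnalysis(), features, y, cv=5)
    print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")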

  17. Classification of single-trial auditory events using dry-wireless EEG during real and motion simulated flight

    PubMed Central

    Callan, Daniel E.; Durantin, Gautier; Terzibas, Cengiz

    2015-01-01

    Application of neuro-augmentation technology based on dry-wireless EEG may be considerably beneficial for aviation and space operations because of the inherent dangers involved. In this study we evaluate classification performance of perceptual events using a dry-wireless EEG system during motion platform based flight simulation and actual flight in an open cockpit biplane, to determine if the system can be used in the presence of considerable environmental and physiological artifacts. A passive task involving 200 random auditory presentations of a chirp sound was used for evaluation. The advantage of this auditory task is that it does not interfere with the perceptual motor processes involved with piloting the plane. Classification was based on identifying the presentation of a chirp sound vs. silent periods. The use of independent component analysis (ICA) and Kalman filtering to enhance classification performance, by separating brain activity related to the auditory event from other non-task related brain activity and artifacts, was assessed. The results of permutation testing revealed that single trial classification of the presence or absence of an auditory event was significantly above chance for all conditions on a novel test set. The best performance could be achieved with both ICA and Kalman filtering relative to no processing: Platform Off (83.4% vs. 78.3%), Platform On (73.1% vs. 71.6%), Biplane Engine Off (81.1% vs. 77.4%), and Biplane Engine On (79.2% vs. 66.1%). This experiment demonstrates that dry-wireless EEG can be used in environments with considerable vibration, wind, acoustic noise, and physiological artifacts and achieve the good single trial classification performance that is necessary for future successful application of neuro-augmentation technology based on brain-machine interfaces. PMID:25741249

  18. Extreme events in a vortex gas simulation of a turbulent half-jet

    NASA Astrophysics Data System (ADS)

    Suryanarayanan, Saikishan; Pathikonda, Gokul; Narasimha, Roddam

    2012-11-01

    Extensive simulations [arXiv:1008.2876v1 [physics.flu-dyn], BAPS.2010.DFD.LE.4] have shown that the temporally evolving vortex gas mixing layer has 3 regimes, including one which has a universal spreading rate. The present study explores the development of spatially evolving mixing layers, using a vortex gas model based on Basu et al. (1995 Appl. Math. Modelling). The effects of the velocity ratio (r) are analyzed via the most extensive simulations of this kind to date, involving up to 10000 vortices and averaging over up to 1000 convective times. While the temporal limit is approached as r approaches unity, striking features such as extreme events involving coherent structures, bending, deviation of the convection velocity from the mean velocity, spatial feedback and greater sensitivity to downstream and free stream boundary conditions are observed in the half-jet (r = 0) limit. A detailed statistical analysis reveals possible causes for the large scatter across experiments, as opposed to the commonly adopted explanation of asymptotic dependence on initial conditions. Supported in part by contract no. Intel/RN/4288.

  19. SIMULATIONS OF THE SPATIAL AND TEMPORAL INVARIANCE IN THE SPECTRA OF GRADUAL SOLAR ENERGETIC PARTICLE EVENTS

    SciTech Connect

    Wang, Yang; Qin, Gang E-mail: gqin@spaceweather.ac.cn

    2015-06-20

    The spatial and temporal invariance in the spectra of energetic particles in gradual solar events is reproduced in simulations. Based on a numerical solution of the focused transport equation, we obtain the intensity time profiles of solar energetic particles (SEPs) accelerated by an interplanetary shock in three-dimensional interplanetary space. The shock is treated as a moving source of energetic particles with a distribution function. The time profiles of particle fluxes with different energies are calculated in the ecliptic at 1 AU. According to our model, we find that shock acceleration strength, parallel diffusion, and adiabatic cooling are the main factors in forming the spatial invariance in SEP spectra, and perpendicular diffusion is a secondary factor. In addition, the temporal invariance in SEP spectra is mainly due to the effects of adiabatic cooling. Furthermore, a spectral invariance region, which agrees with observations but differs from the one suggested by Reames et al., is proposed based on our simulations.

  20. Simulation study on single event burnout in linear doping buffer layer engineered power VDMOSFET

    NASA Astrophysics Data System (ADS)

    Yunpeng, Jia; Hongyuan, Su; Rui, Jin; Dongqing, Hu; Yu, Wu

    2016-02-01

    The addition of a buffer layer can improve a device's secondary breakdown voltage and, thus, its single event burnout (SEB) threshold voltage. In this paper, an N-type linear doping buffer layer is proposed. Quasi-stationary avalanche simulations and heavy ion beam simulations show that an optimized linear doping buffer layer is critical. When SEB is induced by impacting heavy ions, the electric field of a device with an optimized linear doping buffer is much lower than that of a device with an optimized constant doping buffer at a given buffer layer thickness and the same biasing voltages. The secondary breakdown voltage and the parasitic bipolar turn-on current are much higher than those with the optimized constant doping buffer layer. The linear buffer layer is therefore more advantageous for improving the device's SEB performance. Project supported by the National Natural Science Foundation of China (No. 61176071), the Doctoral Fund of Ministry of Education of China (No. 20111103120016), and the Science and Technology Program of State Grid Corporation of China (No. SGRI-WD-71-13-006).

  1. Simulation of a rapid dropout event for highly relativistic electrons with the RBE model

    NASA Astrophysics Data System (ADS)

    Kang, S.-B.; Fok, M.-C.; Glocer, A.; Min, K.-W.; Choi, C.-R.; Choi, E.; Hwang, J.

    2016-05-01

    A flux dropout is a sudden and sizable decrease in the energetic electron population of the outer radiation belt on a time scale of a few hours. We simulated a flux dropout of highly relativistic >2.5 MeV electrons using the Radiation Belt Environment model, incorporating the pitch angle diffusion coefficients caused by electromagnetic ion cyclotron (EMIC) waves, for the geomagnetic storm event of 23-26 October 2002. This simulation showed a remarkable decrease in the >2.5 MeV electron flux during the main phase of the storm, compared to simulations without EMIC waves. This decrease was independent of magnetopause shadowing or drift loss to the magnetopause. We suggest that the flux decrease was primarily due to pitch angle scattering into the loss cone by EMIC waves. Furthermore, the >2.5 MeV electron flux calculated with EMIC waves corresponds very well with that observed from the Solar Anomalous and Magnetospheric Particle EXplorer spacecraft. EMIC wave scattering is therefore likely one of the key mechanisms for understanding flux dropouts. We modeled EMIC wave intensities by the Kp index. However, the calculated dropout occurs several hours earlier than the observed one. We propose that Kp is not the best parameter to predict EMIC waves.
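
    For reference, pitch angle scattering of this kind is commonly modeled with a bounce-averaged diffusion term of the standard form below; this is an illustrative textbook form, not necessarily the exact equation implemented in the RBE model.

    \[
    \left.\frac{\partial f}{\partial t}\right|_{\mathrm{EMIC}}
      = \frac{1}{T(\alpha_0)\sin 2\alpha_0}
        \frac{\partial}{\partial \alpha_0}
        \left( T(\alpha_0)\sin 2\alpha_0 \,
               \langle D_{\alpha_0\alpha_0}\rangle
               \frac{\partial f}{\partial \alpha_0} \right),
    \]

    where $f$ is the phase-space density, $\alpha_0$ the equatorial pitch angle, $T(\alpha_0)$ the normalized bounce period, and $\langle D_{\alpha_0\alpha_0}\rangle$ the bounce-averaged pitch angle diffusion coefficient, here attributed to EMIC waves.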

  2. Multidisciplinary propulsion simulation using the numerical propulsion system simulator (NPSS)

    NASA Technical Reports Server (NTRS)

    Claus, Russel W.

    1994-01-01

    Implementing new technology in aerospace propulsion systems is becoming prohibitively expensive. One of the major contributors to the high cost is the need to perform many large scale system tests. The traditional design analysis procedure decomposes the engine into isolated components and focuses attention on each single physical discipline (e.g., fluid dynamics or structural dynamics). Consequently, the interactions that naturally occur between components and disciplines can be masked by the limited interactions that occur between the individuals or teams doing the design, and must be uncovered during expensive engine testing. This overview will discuss a cooperative effort of NASA, industry, and universities to integrate disciplines, components, and high performance computing into a Numerical Propulsion System Simulator (NPSS).

  3. Aid For Simulating Digital Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hartman, Richard M.

    1991-01-01

    DIVERS translator is computer program to convert descriptions of digital flight-control systems (DFCS) into computer program. Language developed to represent design charts of DFCS. Translator converts DIVERS source code into easily transportable language, while minimizing probability that results are affected by interpretation of programmer. Final translated program used as standard of comparison to verify operation of actual flight-control systems. Applicable to simulation of other control systems; for example, electrical circuits and logic processes. Written in C.

  4. Simulating failures on large-scale systems.

    SciTech Connect

    Desai, N.; Lusk, E.; Buettner, D.; Cherry, A.; Voran, T.; Univ. of Colorado

    2008-09-01

    Developing fault management mechanisms is a difficult task because of the unpredictable nature of failures. In this paper, we present a fault simulation framework for Blue Gene/P systems implemented as a part of the Cobalt resource manager. The primary goal of this framework is to support system software development. We also present a hardware diagnostic system that we have implemented using this framework.

  5. System Equivalent for Real Time Digital Simulator

    NASA Astrophysics Data System (ADS)

    Lin, Xi

    2011-07-01

    The purpose of this research is to develop a method of making system equivalents for the Real Time Digital Simulator (RTDS), which should enhance its capability of simulating large power systems. The proposed equivalent combines a Frequency Dependent Network Equivalent (FDNE) for the high frequency electromagnetic transients and a Transient Stability Analysis (TSA) type simulation block for the electromechanical transients. The frequency dependent characteristic for the FDNE is obtained by curve-fitting frequency domain admittance characteristics using the Vector Fitting method. An approach for approximating the frequency dependent characteristic of large power networks from readily available typical power-flow data is also introduced. A new scheme for incorporating the TSA solution in the RTDS is proposed. This report shows how the TSA algorithm can be adapted to a real time platform. The validity of this method is confirmed with examples, including the study of a network with multiple HVDC in-feeds.

  6. Expert system for scheduling simulation lab sessions

    NASA Technical Reports Server (NTRS)

    Lund, Chet

    1990-01-01

    Implementation and results of an expert system used for scheduling session requests for the Systems Engineering Simulator (SES) laboratory at the NASA Johnson Space Center (JSC) are discussed. Weekly session requests are received from astronaut crew trainers, procedures developers, engineering assessment personnel, software developers, and various others who wish to access the computers, scene generators, and other simulation equipment available to them in the SES lab. The expert system under discussion comprises a data acquisition portion - two Pascal programs run on a personal computer - and a CLIPS program installed on a minicomputer. A brief introduction to the SES lab and its scheduling background is given. A general overview of the system is provided, followed by a detailed description of the constraint-reduction process and of the scheduler itself. Results from a ten-week trial period using this approach are discussed. Finally, a summary of the expert system's strengths and shortcomings is provided.

  7. Event-Driven Random-Access-Windowing CCD Imaging System

    NASA Technical Reports Server (NTRS)

    Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William

    2004-01-01

    A charge-coupled-device (CCD) based high-speed imaging system, called a real-time, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in High-Frame-Rate CCD Camera Having Subwindow Capability (NPO-30564), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during the pixel readout, the present design reduces ROI-readout times to attain higher frame rates. This camera (see figure) includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor/transistor-logic (TTL)-level signals from a field programmable gate array (FPGA) controller card. These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable

  8. Shuttle Propulsion System Major Events and the Final 22 Flights

    NASA Technical Reports Server (NTRS)

    Owen, James W.

    2011-01-01

    Numerous lessons have been documented from the Space Shuttle Propulsion elements. Major events include loss of the Solid Rocket Boosters (SRBs) on STS-4 and shutdown of a Space Shuttle Main Engine (SSME) during ascent on STS-51F. On STS-112 only half the pyrotechnics fired during release of the vehicle from the launch pad, a testament to redundancy. STS-91 exhibited freezing of a main combustion chamber pressure measurement, and on STS-93 nozzle tube ruptures necessitated a low liquid-level oxygen cutoff of the main engines. A number of on-pad aborts were experienced during the early program, resulting in delays. And the two accidents, STS-51L and STS-107, had unique heritage in early program decisions and vehicle configuration. Following STS-51L, significant resources were invested in developing fundamental physical understanding of solid rocket motor environments and material system behavior. And following STS-107, the risk of ascent debris was better characterized and controlled. Situational awareness during all mission phases improved, and the management team instituted effective risk assessment practices. The last 22 flights of the Space Shuttle, following the Columbia accident, were characterized by remarkable improvement in safety and reliability. Numerous problems were solved in addition to reduction of the ascent debris hazard. The Shuttle system, though not as operable as envisioned in the 1970's, successfully assembled the International Space Station (ISS). By the end of the program, the remarkable Space Shuttle Propulsion system achieved very high performance, was largely reusable, exhibited high reliability, and was a heavy lift earth-to-orbit propulsion system. During the program a number of project management and engineering processes were implemented and improved. Technical performance, schedule accountability, cost control, and risk management were effectively managed and implemented. Award fee contracting was implemented to provide

  9. Adaptive periodic event-triggered consensus for multi-agent systems subject to input saturation

    NASA Astrophysics Data System (ADS)

    Yin, Xiuxia; Yue, Dong; Hu, Songlin

    2016-04-01

    This paper investigates distributed adaptive event-triggered consensus control for a class of nonlinear agents, each of which is subject to input saturation. Two kinds of distributed event-triggered control scheme are introduced: one is a continuous-time-based event-triggered scheme and the other is a sampled-data-based event-triggered scheme. Compared with the traditional event-triggered schemes in the existing literature, the parameters of the event-triggered schemes in this paper are adjusted adaptively by means of event-error-dependent adaptive laws. The problem of simultaneously deriving the controller gain matrix and the event-triggering parameter matrix, while handling the saturation nonlinearity, is cast as a standard linear matrix inequality problem. A convincing simulation example is given to demonstrate the theoretical results.
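
    The sketch below shows the basic event-triggered consensus mechanism for single-integrator agents with a fixed (non-adaptive) triggering threshold and saturated inputs; the paper's contribution of adapting the triggering parameters online and certifying stability via LMIs is not reproduced here, and all gains are illustrative.

    import numpy as np

    rng = np.random.default_rng(2)
    A = np.array([[0, 1, 0, 1],      # adjacency matrix of a 4-agent ring graph
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    x = rng.uniform(-5.0, 5.0, 4)    # agent states
    x_hat = x.copy()                 # last broadcast states
    sigma, u_max, dt = 0.05, 1.0, 0.01
    events = 0

    for _ in range(3000):
        # Trigger rule: re-broadcast only when the local measurement error
        # exceeds a fraction of the disagreement with the group average.
        err = np.abs(x - x_hat)
        trig = err > sigma * np.abs(x - x.mean())
        x_hat[trig] = x[trig]
        events += int(trig.sum())
        # Saturated consensus input u = sat(-L x_hat), built from broadcasts only.
        u = np.clip(-(A.sum(axis=1) * x_hat - A @ x_hat), -u_max, u_max)
        x = x + dt * u

    print(f"state spread {np.ptp(x):.4f} after {events} broadcast events")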

  10. Simulating Astronomical Adaptive Optics Systems Using Yao

    NASA Astrophysics Data System (ADS)

    Rigaut, François; Van Dam, Marcos

    2013-12-01

    Adaptive Optics (AO) systems are at the heart of the coming generation of Extremely Large Telescopes (ELTs). Given the importance, complexity, and required advances of these systems, being able to simulate them faithfully is key to their success, and thus to the success of the ELTs. The types of systems envisioned for the ELTs cover most of the AO breeds, from NGS AO to multiple-guide-star Ground Layer, Laser Tomography, and Multi-Conjugate AO systems, with typically a few thousand actuators. This represents a large step up from the current generation of AO systems, and accordingly a challenge for existing AO simulation packages. This is especially true as, in the past years, computer power has not been following Moore's law in its most common understanding; CPU clocks are hovering at about 3 GHz. Although the use of supercomputers is a possible solution for running these simulations, being able to use smaller machines has obvious advantages: cost, access, and environmental issues. By using optimised code in an already proven AO simulation platform, we were able to run complex ELT AO simulations on very modest machines, including laptops. The platform is YAO. In this paper, we describe YAO, its architecture, its capabilities, the ELT-specific challenges and optimisations, and finally its performance. As an example, execution speed ranges from 5 iterations per second for a six-LGS, 60x60-subaperture Shack-Hartmann wavefront sensor Laser Tomography AO system (including full physical image formation and detector characteristics) up to over 30 iterations per second for a single-NGS AO system.

  11. Participatory ergonomics simulation of hospital work systems: The influence of simulation media on simulation outcome.

    PubMed

    Andersen, Simone Nyholm; Broberg, Ole

    2015-11-01

    Current application of work system simulation in participatory ergonomics (PE) design includes a variety of simulation media. However, the actual influence of the media attributes on the simulation outcome has received less attention. This study investigates two simulation media: full-scale mock-ups and table-top models. The aim is to compare how the media attributes of fidelity and affordance influence ergonomics identification and evaluation in PE design of hospital work systems. The results illustrate how the full-scale mock-ups' high fidelity of room layout and affordance of tool operation support ergonomics identification and evaluation related to the work system entities space and technologies & tools. The table-top models' high fidelity of function relations and affordance of a helicopter view support ergonomics identification and evaluation related to the entity organization. Furthermore, the study addresses the form of the identified and evaluated conditions, which are either identified challenges or tangible design criteria.

  12. Hybrid system modeling, simulation, and visualization: a crane system

    NASA Astrophysics Data System (ADS)

    Hiniduma Udugama Gamage, Sahan S.; Palmer, Patrick R.

    2003-08-01

    Modeling and visualization of a complex hybrid system with different domains of energy flow and signal flow are described in this paper. The system is a crane situated on a barge, complete with the load, electrical power, drive, and control systems. A dynamically and functionally accurate model of the crane was developed. The implementation uses the freely available software suite of the Virtual Test Bed (VTB) for simulation and the Visual Extension Engine (VXE) for visualization. The bidirectional interaction of simulator and visualizer is fully utilized in this application. Further challenges confronted in implementing this particular system, and any other complex system, are discussed and possible solutions are suggested.

  13. High Level Architecture Distributed Space System Simulation for Simulation Interoperability Standards Organization Simulation Smackdown

    NASA Technical Reports Server (NTRS)

    Li, Zuqun

    2011-01-01

    Modeling and simulation play a very important role in mission design. They not only reduce design cost, but also prepare astronauts for their mission tasks. The SISO Smackdown is a simulation event that facilitates modeling and simulation in academia. The scenario of this year's Smackdown was to simulate a lunar base supply mission. The mission objective was to transfer Earth supply cargo to a lunar base supply depot and retrieve He-3 to take back to Earth. Federates for this scenario included the environment federate, Earth-Moon transfer vehicle, lunar shuttle, lunar rover, supply depot, mobile ISRU plant, exploratory hopper, and communication satellite. These federates were built by teams from around the world, including teams from MIT, JSC, the University of Alabama in Huntsville, the University of Bordeaux in France, and the University of Genoa in Italy. This paper focuses on the lunar shuttle federate, which was programmed by the USRP intern team from NASA JSC. The shuttle was responsible for providing transportation between lunar orbit and the lunar surface. The lunar shuttle federate was built using the NASA standard simulation package called Trick, and it was extended with HLA functions using TrickHLA. HLA functions of the lunar shuttle federate include sending and receiving interactions, publishing and subscribing attributes, and packing and unpacking fixed-record data. The dynamics model of the lunar shuttle had three degrees of freedom, and the state propagation obeyed two-body dynamics. The descending trajectory of the lunar shuttle was designed by first defining a unique descending orbit in 2D space, and then defining a unique orbit in 3D space under the assumption of a non-rotating Moon. Finally, this assumption was removed to define the initial position of the lunar shuttle so that it would start descending one second after joining the execution. VPN software from SonicWall was used to connect federates with the RTI during testing.
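
    The two-body, three-degree-of-freedom propagation described for the shuttle federate amounts to integrating point-mass gravity; a generic sketch is shown below (standard two-body dynamics, not the actual Trick/TrickHLA code; the initial state is illustrative).

```python
import numpy as np
from scipy.integrate import solve_ivp

MU_MOON = 4.9028e12   # lunar gravitational parameter, m^3/s^2

def two_body(t, state):
    r, v = state[:3], state[3:]
    a = -MU_MOON * r / np.linalg.norm(r) ** 3    # point-mass gravity
    return np.concatenate([v, a])

# propagate one step, e.g., between HLA attribute updates
state0 = np.array([1.9e6, 0.0, 0.0, 0.0, 1.6e3, 0.0])   # ~160 km lunar orbit
sol = solve_ivp(two_body, (0.0, 60.0), state0, rtol=1e-9)
print(sol.y[:, -1])
```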

  14. Electric System Intra-hour Operation Simulator

    2014-03-07

    ESIOS is a software program developed at Pacific Northwest National Laboratory (PNNL) that performs intra-hour dispatch and automatic generation control (AGC) simulations for electric power system frequency regulation and load/variable-generation following. The program dispatches generation resources at one-minute intervals to meet control performance requirements, while incorporating stochastic models of forecast errors and of variability in generation, load, interchange, and market behaviors. The simulator also contains an operator model that mimics manual actions to adjust resource dispatch and maintain system reserves. Besides simulating generation-fleet intra-hour dispatch, ESIOS can also be used as a test platform for the design and verification of energy storage, demand response, and other technologies helping to accommodate variable generation.
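
    The flavor of a minute-interval dispatch loop with a stochastic forecast-error model is sketched below; this is purely illustrative and is not ESIOS code or any of its actual models.

```python
import numpy as np

rng = np.random.default_rng(0)
load_forecast = 1000.0 + 50.0 * np.sin(np.linspace(0, np.pi, 60))  # MW, one hour

ace = np.zeros(60)                      # area control error to be regulated
for t in range(60):                     # one dispatch decision per minute
    error = rng.normal(0.0, 10.0)       # stochastic load-forecast error (MW)
    actual = load_forecast[t] + error
    dispatch = load_forecast[t]         # scheduled generation at this interval
    ace[t] = dispatch - actual          # shortfall the AGC/regulation must cover
print("max regulation need (MW):", np.abs(ace).max())
```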

  15. Theory and Simulations of Solar System Plasmas

    NASA Technical Reports Server (NTRS)

    Goldstein, Melvyn L.

    2011-01-01

    "Theory and simulations of solar system plasmas" aims to highlight results from microscopic to global scales, achieved by theoretical investigations and numerical simulations of the plasma dynamics in the solar system. The theoretical approach must allow evidencing the universality of the phenomena being considered, whatever the region is where their role is studied; at the Sun, in the solar corona, in the interplanetary space or in planetary magnetospheres. All possible theoretical issues concerning plasma dynamics are welcome, especially those using numerical models and simulations, since these tools are mandatory whenever analytical treatments fail, in particular when complex nonlinear phenomena are at work. Comparative studies for ongoing missions like Cassini, Cluster, Demeter, Stereo, Wind, SDO, Hinode, as well as those preparing future missions and proposals, like, e.g., MMS and Solar Orbiter, are especially encouraged.

  16. LHC RF System Time-Domain Simulation

    SciTech Connect

    Mastorides, T.; Rivetta, C.; /SLAC

    2010-09-14

    Non-linear time-domain simulations have been developed for the Positron-Electron Project (PEP-II) and the Large Hadron Collider (LHC). These simulations capture the dynamic behavior of the RF station-beam interaction and are structured to reproduce the technical characteristics of the system (noise contributions, non-linear elements, and more). As such, they provide useful results and insight for the development and design of future LLRF feedback systems. They are also a valuable tool for the study of diverse longitudinal beam dynamics effects such as coupled-bunch impedance driven instabilities and single bunch longitudinal emittance growth. Results from these studies and related measurements from PEP-II and LHC have been presented in multiple places. This report presents an example of the time-domain simulation implementation for the LHC.

  17. Mobilization of PAHs and PCBs from In-Place Contaminated Marine Sediments During Simulated Resuspension Events

    NASA Astrophysics Data System (ADS)

    Latimer, J. S.; Davis, W. R.; Keith, D. J.

    1999-10-01

    A particle entrainment simulator was used to experimentally produce representative estuarine resuspension conditions to investigate the resulting transport of polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs) to the overlying water column. Contaminants were evaluated in bulk sediments, size fractionated sediments, resuspended particulate material and in some cases, dissolved phases during the experiments. The two types of sediments used in the experiments, dredged material and bedded estuarine sediment, represented gradients in contaminant loadings and sediment textural characteristics. For the bedded sediment, resuspension tended to winnow the sediments of finer particles. However, in the case of the more highly contaminated dredge material, non-selective resuspension was most common. Resuspension resulted in up to orders of magnitude higher particle-bound organic contaminant concentrations in the overlying water column. Dissolved phase PAH changes during resuspension were variable and at most, increased by a factor of three. The sifting process resulted in the partitioning of fine and coarse particle contaminant loading. For bedded sediments, accurate predictions of PAH and PCB loadings on resuspended particles were made using the mass of resuspended particles of different sizes and the concentrations of contaminants in the particle pools of the bulk sediment. However, due possibly to contributions from other unmeasured particles (e.g. colloids), predictions were not possible for the dredge material. Thus, knowledge of the redistribution and fate of colloids may be important. The partitioning of PAHs between the dissolved and particulate phases during resuspension events was predicted to within a factor of two from the amount of organic carbon in each of the resuspended samples. These experiments show that contaminant transport is a function of the chemistry and textural characteristics of the bulk sediment and the winnowing action
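
    The organic-carbon-based prediction mentioned above is conventionally expressed through organic-carbon-normalized partitioning; the abstract does not give the authors' exact formulation, but the standard form is:

```latex
\[
  K_d \;=\; \frac{C_{\mathrm{part}}}{C_{\mathrm{diss}}} \;=\; f_{oc}\, K_{oc},
\]
```

    where $C_{\mathrm{part}}$ and $C_{\mathrm{diss}}$ are the particle-bound and dissolved concentrations, $f_{oc}$ is the mass fraction of organic carbon on the resuspended particles, and $K_{oc}$ is the compound-specific organic-carbon partition coefficient.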

  18. Simulation of ITER ELM transient heat events on tungsten grades using long pulse laser beams

    NASA Astrophysics Data System (ADS)

    Suslova, Anastassiya

    Tungsten has been chosen as the main candidate for plasma-facing components (PFCs) in magnetic confinement nuclear fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) and beyond, due to its superior properties under the extreme operating conditions expected in fusion reactors. One of the serious issues for plasma-facing components is the heat load during transient events such as edge localized modes (ELMs) and disruptions in the reactor. High temperature gradients and high thermal stresses developed during transients can lead to material recrystallization and grain growth, formation of a melt layer, material erosion, and crack formation, which can limit the power-handling capacity of PFCs, decrease lifetime, and contribute to plasma contamination that affects subsequent operations. Mechanical and surface properties of different tungsten grades and their behavior under ITER-like conditions are the main focus of current research efforts in the fusion research community. The current work focused primarily on detailed investigation of the effect of ELM-like transient heat events on pristine samples of two different grades of deformed tungsten with ultrafine and nanocrystalline grains. Significant efforts were made to understand the mechanisms behind recrystallization, grain growth, crack formation, surface nano-structuring, melting, and other phenomena observed under repeated transient heat loads, simulated by the use of long-pulse laser beams. It was observed that cold-rolled tungsten overall demonstrated better power-handling capabilities and higher thermal-stress fatigue resistance. It had higher recrystallization and melting threshold parameters, slower grain growth at similar irradiation conditions, a lower degree of surface roughening, and less material loss. The difference in behavior of the two grades of tungsten under similar heat load conditions was attributed to the initial tensile properties of the samples, initial impurities

  19. Towards real-time regional earthquake simulation I: real-time moment tensor monitoring (RMT) for regional events in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi

    2014-01-01

    We have developed a real-time moment tensor monitoring system (RMT) which takes advantage of a grid-based moment tensor inversion technique and real-time broad-band seismic recordings to automatically monitor earthquake activity in the vicinity of Taiwan. The centroid moment tensor (CMT) inversion technique and a grid search scheme are applied to obtain the earthquake source parameters, including the event origin time, hypocentral location, moment magnitude, and focal mechanism. All of these source parameters can be determined simultaneously within 117 s of the occurrence of an earthquake. The monitoring area covers the entire island of Taiwan and the offshore region, from 119.3°E to 123.0°E and 21.0°N to 26.0°N, with depths from 6 to 136 km. A 3-D grid system is implemented in the monitoring area with a uniform horizontal interval of 0.1° and a vertical interval of 10 km. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (fk) method. We compare our results with the Central Weather Bureau (CWB) catalogue data for earthquakes that occurred between 2010 and 2012. The average differences in event origin time and hypocentral location are less than 2 s and 10 km, respectively. The focal mechanisms determined by RMT are also comparable with the Broadband Array in Taiwan for Seismology (BATS) CMT solutions. These results indicate that the RMT system is realizable and efficient for monitoring local seismic activity. In addition, the time needed to obtain all the point-source parameters is reduced substantially compared to routine earthquake reports. By connecting RMT with a real-time online earthquake simulation (ROS) system, all the source parameters will be forwarded to the ROS to make real-time earthquake simulation feasible. The RMT has operated offline (2010-2011) and online (from January 2012 to the present) at the Institute of Earth Sciences (IES), Academia Sinica.
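
    Schematically, the grid search amounts to scanning the 3-D source grid for the node whose best-fitting moment tensor minimizes the waveform misfit, as in the sketch below; the misfit function here is a random stand-in for the real inversion against the fk-based Green's function database.

```python
import numpy as np

lons = np.arange(119.3, 123.01, 0.1)       # monitoring area, deg E
lats = np.arange(21.0, 26.01, 0.1)         # deg N
depths = np.arange(6.0, 137.0, 10.0)       # km

def misfit(lon, lat, depth, waveforms):
    """Waveform misfit of the best moment tensor at this grid node
    (placeholder for the linear MT inversion at each node)."""
    return np.random.rand()

waveforms = None                           # real-time broadband recordings
best = min((misfit(lo, la, d, waveforms), lo, la, d)
           for lo in lons for la in lats for d in depths)
print("best-fit node (misfit, lon, lat, depth):", best)
```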

  20. Numerical simulation of imaging laser radar system

    NASA Astrophysics Data System (ADS)

    Han, Shaokun; Lu, Bo; Jiang, Ming; Liu, Xunliang

    2008-03-01

    Rational and effective design is central to imaging laser radar system research, and the design must fully consider the interrelationships among the various parameters; suitable lasers, detectors, and other components are then chosen according to those parameters. Mathematical modeling and computer simulation are effective methods for imaging laser radar system design. Starting from the range equation and using statistical detection methods, this paper builds mathematical models of the laser radar system in terms of range coverage, detection probability, false-alarm rate, and SNR. In setting up the mathematical models, the effects of the laser, the atmosphere, the detector, and other factors on performance are fully considered, so that the models respond accurately to the real situation. On this basis, simulation software was designed using C# and Matlab.
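
    The chain from range equation to detection statistics can be sketched as below. One common monostatic form of the range equation for an extended Lambertian target and a simple Gaussian-noise threshold model are assumed here, not the authors' exact models (which were implemented in C# and Matlab).

```python
import numpy as np
from scipy.special import erfc

def received_power(P_t, rho, D_rx, R, T_atm, eta):
    """Monostatic range equation, extended Lambertian target filling the beam:
    P_r = P_t * rho * D_rx^2 / (4 R^2) * T_atm^2 * eta."""
    return P_t * rho * D_rx ** 2 / (4.0 * R ** 2) * T_atm ** 2 * eta

def pd_pfa(snr, threshold_sigma):
    """Detection and false-alarm probabilities for a Gaussian noise channel,
    with the threshold expressed in units of the noise standard deviation."""
    p_fa = 0.5 * erfc(threshold_sigma / np.sqrt(2.0))
    p_d = 0.5 * erfc((threshold_sigma - snr) / np.sqrt(2.0))
    return p_d, p_fa

print(pd_pfa(snr=8.0, threshold_sigma=4.5))   # high Pd at low Pfa
```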

  1. Aviation spectral camera infinity target simulation system

    NASA Astrophysics Data System (ADS)

    Liu, Xinyue; Ming, Xing; Liu, Jiu; Guo, Wenji; Lv, Gunbo

    2014-11-01

    With the development of science and technology, aviation spectral cameras are being applied more widely, making the development of a dynamic-target test system increasingly important. An aviation spectral camera infinity target simulation system can be used to test the resolution and the modulation transfer function of a camera. The construction and working principle of the infinity target simulation system are introduced in detail. A dynamic target generator based on a digital micromirror device (DMD) and the required performance of the collimation system are analyzed and reported. The DMD-based dynamic target generator has the advantages of convenient image replacement, small size, and flexibility. According to the requirements of the camera under test, a full-field infinity dynamic-target test plan was completed by rotating and moving the mirror.

  2. BOLIVAR-tool for analysis and simulation of metocean extreme events

    NASA Astrophysics Data System (ADS)

    Lopatoukhin, Leonid; Boukhanovsky, Alexander

    2015-04-01

    Metocean extreme events are caused by the combination of multivariate and multiscale processes which depend on each other on different scales (short-term, synoptic, annual, and year-to-year variability). There is no simple method for their estimation with controllable tolerance. Thus, in practice, extreme analysis is sometimes reduced to the exploration of various methods and models with respect to decreasing the uncertainty of the estimates. A researcher therefore needs multifaceted computational tools which cover the various branches of extreme analysis. BOLIVAR is multi-functional computational software for researchers and engineers who explore extreme environmental conditions to design and build offshore structures and floating objects. It contains a set of computational modules implementing various methods for extreme analysis, and a set of modules for the stochastic and hydrodynamic simulation of metocean processes. In this sense BOLIVAR is a Problem Solving Environment (PSE). BOLIVAR is designed for extreme event analysis and contains computational modules for the IDM, AMS, POT, MENU, and SINTEF methods, and modules for stochastic simulation of metocean processes on various scales. BOLIVAR is a tool that simplifies the resource-consuming computational experiments needed to explore metocean extremes in univariate and multivariate cases. There are field ARMA models for short-term variability, a spatial-temporal random pulse model for synoptic variability (alternation of storms and calms), and a cyclostationary model of annual and year-to-year variability. The combination of the above-mentioned modules and data sources allows estimation of: omnidirectional and directional extremes (with T-year return periods); multivariate extremes (sets of parameters) and evaluation of their impacts on marine structures and floating objects; and extremes of spatial-temporal fields (including the trajectories of T-year storms). An employment of concurrent methods for
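
    As an illustration of the POT branch of such an analysis, the sketch below fits a generalized Pareto distribution to threshold exceedances of synthetic wave-height data and computes a T-year return level; it is a generic example, not BOLIVAR itself.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
hs = rng.weibull(1.5, size=20_000) * 2.0     # synthetic significant wave height (m)

u = np.quantile(hs, 0.98)                    # POT threshold
exceed = hs[hs > u] - u
c, _, scale = genpareto.fit(exceed, floc=0.0)

years = 20.0                                 # assumed record length
lam = exceed.size / years                    # exceedance rate per year
T = 100.0                                    # return period, years
x_T = u + genpareto.ppf(1.0 - 1.0 / (lam * T), c, loc=0.0, scale=scale)
print(f"{T:.0f}-year return level: {x_T:.2f} m")
```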

  3. Introduction to Observing System Simulation Experiments (OSSEs)

    NASA Technical Reports Server (NTRS)

    Prive, Nikki C.

    2014-01-01

    This presentation gives a brief overview of Observing System Simulation Experiments (OSSEs), including what OSSEs are, and how and why they are performed. The intent is to educate the audience in light of the OSSE-related sections of the Forecast Improvement Act (H.R. 2413).

  4. The systems biology simulation core algorithm

    PubMed Central

    2013-01-01

    Background With the increasing availability of high dimensional time course data for metabolites, genes, and fluxes, the mathematical description of dynamical systems has become an essential aspect of research in systems biology. Models are often encoded in formats such as SBML, whose structure is very complex and difficult to evaluate due to many special cases. Results This article describes an efficient algorithm to solve SBML models that are interpreted in terms of ordinary differential equations. We begin our consideration with a formal representation of the mathematical form of the models and explain all parts of the algorithm in detail, including several preprocessing steps. We provide a flexible reference implementation as part of the Systems Biology Simulation Core Library, a community-driven project providing a large collection of numerical solvers and a sophisticated interface hierarchy for the definition of custom differential equation systems. To demonstrate the capabilities of the new algorithm, it has been tested with the entire SBML Test Suite and all models of BioModels Database. Conclusions The formal description of the mathematics behind the SBML format facilitates the implementation of the algorithm within specifically tailored programs. The reference implementation can be used as a simulation backend for Java™-based programs. Source code, binaries, and documentation can be freely obtained under the terms of the LGPL version 3 from http://simulation-core.sourceforge.net. Feature requests, bug reports, contributions, or any further discussion can be directed to the mailing list simulation-core-development@lists.sourceforge.net. PMID:23826941
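
    The Simulation Core Library itself is Java-based; the toy Python sketch below illustrates only the underlying idea of interpreting a reaction model as an ODE system and handing it to a numerical solver (reactions S1 -> S2 with rate k1 and S2 -> S1 with rate k2, mass-action kinetics).

```python
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 0.3, 0.1
stoich = np.array([[-1.0,  1.0],    # d[S1]/dt columns for the two reactions
                   [ 1.0, -1.0]])   # d[S2]/dt

def rhs(t, s):
    v = np.array([k1 * s[0], k2 * s[1]])   # reaction velocities
    return stoich @ v

sol = solve_ivp(rhs, (0.0, 50.0), [10.0, 0.0], rtol=1e-8)
print(sol.y[:, -1])   # approaches the steady state [2.5, 7.5]
```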

  5. Re-awakening Magmatic Systems: The Mechanics of an Open-system Event

    NASA Astrophysics Data System (ADS)

    Bergantz, George; Burgisser, Alain; Schleicher, Jillian

    2016-04-01

    The re-awakening of magmatic systems requires new magma input, which often induces mixing with a resident magma existing as a crystal-rich mush. This is expressed by complex phenocryst populations, many of which preserve evidence of multiple episodes of recycling. The unlocking and mobilization of these resident mushes condition the progress of re-awakening; however, these processes are poorly understood. Crystal-rich but mobile systems, dominated by their granular mechanics, are not satisfactorily explained by either fluid-like or solid-like models. We will present a generalizing framework for describing the mechanics of crystal-rich mushes based on the notion of force chains. Force chains arise from crystal-crystal contacts and describe the highly non-uniform way that stress is transmitted in a crystal-rich mush. Using CFD-DEM simulations that resolve crystal-scale mechanics, we will show how the populations of crystal-mush force chains and their spatial fabric change during an open-system event. We will show how the various forms of dissipation, such as fluid drag, particle-fluid drag, particle normal and shear lubrication, and contact friction, jointly contribute to the processes of magma mush unlocking, mobilization, and fabric formation. We will also describe non-intuitive constitutive behavior such as non-local and non-affine deformation, as well as complex rheological transitions from continuous to discontinuous shear thickening as a function of the dimensionless shear rate. One implication is that many commonly invoked postulates about magma behavior, such as lock-up at a critical crystallinity and suspension rheology, are better understood from a micro-physical (crystal-scale) perspective as a combination of far-field geometrical controls, local frictional thickening, and shear jamming, each with distinct time scales. This kind of crystal-based unifying framework can simultaneously recover diverse processes such as strain-localization, shear

  6. Simulations of noise in disordered systems

    SciTech Connect

    Reichhardt, C.; Reichhardt, C. J.

    2003-01-01

    We use particle dynamics simulations to probe the correlations between noise and dynamics in a variety of disordered systems, including superconducting vortices, 2D electron liquid crystals, colloids, domain walls, and granular media. The noise measurements offer an experimentally accessible link to the microscopic dynamics, such as plastic versus elastic flow during transport, and can provide a signature of dynamical reordering transitions in the system. We consider broad- and narrow-band noise in transport systems, as well as fluctuations of the dislocation density in a system near the melting transition.

  7. Plans for wind energy system simulation

    NASA Technical Reports Server (NTRS)

    Dreier, M. E.

    1978-01-01

    A digital computer code and a special-purpose hybrid computer are introduced. The digital computer program, the Root Perturbation Method (RPM), is an implementation of the classic Floquet procedure that circumvents numerical problems associated with the extraction of Floquet roots. The hybrid computer, the Wind Energy System Time-domain simulator (WEST), yields real-time loads and deformation information essential to design and system-stability investigations.

  8. A simulation of data acquisition system for SSC experiments

    SciTech Connect

    Watase, Y.; Ikeda, H.

    1989-04-01

    A simulation of some parts of the data acquisition system was performed using the general-purpose simulation language GPSS. Several results of the simulation of the data acquisition system for the SSC experiment are discussed.

  9. An investigation into pilot and system response to critical in-flight events. Volume 2: Appendix

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Griffin, W. C.

    1981-01-01

    Materials relating to the study of pilot and system response to critical in-flight events (CIFE) are given. An annotated bibliography and a trip summary outline are presented, as are knowledge surveys with accompanying answer keys. Performance profiles of pilots and performance data from the simulations of CIFE's are given. The paper and pencil testing materials are reproduced. Conditions for the use of the additive model are discussed. A master summary of data for the destination diversion scenario is given. An interview with an aircraft mechanic demonstrates the feasibility of system problem diagnosis from a verbal description of symptoms and shows the information seeking and problem solving logic used by an expert to narrow the list of probable causes of aircraft failure.

  10. Distributed estimation in networked systems under periodic and event-based communication policies

    NASA Astrophysics Data System (ADS)

    Millán, Pablo; Orihuela, Luis; Jurado, Isabel; Vivas, Carlos; Rubio, Francisco R.

    2015-01-01

    This paper presents a novel design technique for distributed estimation in networked systems. The problem assumes a network of interconnected agents, each having partial access to measurements from a linear plant and broadcasting its estimates to its neighbours. The objective is to reach a reliable estimate of the plant state at every agent location. The observer structure implemented in each agent is based on local Luenberger-like observers in combination with consensus strategies. The paper focuses on the following network-related issues: delays, packet dropouts, and communication policy (time- and event-driven). The design problem is solved via linear matrix inequalities, and stability proofs are provided. The technique is applicable to sensor networks and large-scale systems where centralized estimation schemes are not advisable and energy-aware implementations are of interest. Simulation examples are provided to show the performance of the proposed methodologies.
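
    The agent-level structure can be sketched as a local Luenberger correction plus a consensus term over neighbours' estimates, as below; the gains Li and gamma are placeholders for the LMI-designed values, and delays and dropouts are ignored in this sketch.

```python
import numpy as np

def observer_step(A, B, Ci, Li, gamma, x_hat_i, u, y_i, neighbor_estimates, dt):
    """One Euler step of agent i's distributed observer."""
    innovation = y_i - Ci @ x_hat_i                       # local output error
    consensus = sum(x_j - x_hat_i for x_j in neighbor_estimates)
    dx = A @ x_hat_i + B @ u + Li @ innovation + gamma * consensus
    return x_hat_i + dt * dx
```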

  11. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  12. Assessment of WRF microphysics schemes to simulate extreme precipitation events from the perspective of GMI radiative signatures

    NASA Astrophysics Data System (ADS)

    Choi, Y.; Shin, D. B.; Joh, M.

    2015-12-01

    Numerical simulations of precipitation depend to a large degree on the assumed cloud microphysics schemes representing the formation, growth and fallout of cloud droplets and ice crystals. Recent studies show that assumed cloud microphysics play a major role not only in forecasting precipitation, especially in cases of extreme precipitation events, but also in the quality of the passive microwave rainfall estimation. Evaluations of the various Weather Research Forecasting (WRF) model microphysics schemes in this study are based on a method that was originally developed to construct the a-priori databases of precipitation profiles and associated brightness temperatures (TBs) for precipitation retrievals. This methodology generates three-dimensional (3D) precipitation fields by matching the GPM dual frequency radar (DPR) reflectivity profiles with those calculated from cloud resolving model (CRM)-derived hydrometeor profiles. The method eventually provides 3D simulated precipitation fields over the DPR scan swaths. That is, atmospheric and hydrometeor profiles can be generated at each DPR pixel based on CRM and DPR reflectivity profiles. The generated raining systems over DPR observation fields can be applied to any radiometers that are unaccompanied with a radar for microwave radiative calculation with consideration of each sensor's channel and field of view. Assessment of the WRF model microphysics schemes for several typhoon cases in terms of emission and scattering signals of GMI will be discussed.

  13. Numerical simulation of the 12 May 1997 CME Event: The role of magnetic reconnection

    NASA Astrophysics Data System (ADS)

    Cohen, O.; Attrill, G. D. R.; Schwadron, N. A.; Crooker, N. U.; Owens, M. J.; Downs, C.; Gombosi, T. I.

    2010-10-01

    We perform a numerical study of the evolution of a Coronal Mass Ejection (CME) and its interaction with the coronal magnetic field, based on the 12 May 1997 CME event, using a global MagnetoHydroDynamic (MHD) model for the solar corona. The ambient solar wind steady-state solution is driven by photospheric magnetic field data, while the solar eruption is obtained by superimposing an unstable flux rope onto the steady-state solution. During the initial stage of CME expansion, the core flux rope reconnects with the neighboring field, which facilitates lateral expansion of the CME footprint in the low corona. The flux rope field also reconnects with the oppositely orientated overlying magnetic field in the manner of the breakout model. During this stage of the eruption, the simulated CME rotates counter-clockwise to achieve an orientation that is in agreement with the interplanetary flux rope observed at 1 AU. A significant component of the CME that expands into interplanetary space comprises one of the side lobes created mainly as a result of reconnection with the overlying field. Within 3 hours, reconnection effectively modifies the CME connectivity from the initial condition, where both footpoints are rooted in the active region, to a situation where one footpoint is displaced into the quiet Sun, at a significant distance (≈1 R⊙) from the original source region. The expansion and rotation due to interaction with the overlying magnetic field stop when the CME reaches the outer edge of the helmet streamer belt, where the field is organized on a global scale. The simulation thus offers a new view of the role reconnection plays in rotating a CME flux rope and transporting its footpoints while preserving its core structure.

  14. EMERGENCY BRAKING IN ADULTS VERSUS NOVICE TEEN DRIVERS: RESPONSE TO SIMULATED SUDDEN DRIVING EVENTS

    PubMed Central

    Kandadai, Venk; McDonald, Catherine C.; Winston, Flaura K.

    2015-01-01

    Motor vehicle crashes remain the leading cause of death among teens in the United States. Newly licensed drivers are the group most at risk for crashes. Their driving skills are new and often untested, so their ability to react properly in an emergency remains a research question. Since it is impossible to expose human subjects to critical, life-threatening driving scenarios, researchers have been increasingly using driving simulators to assess driving skills. This paper summarizes the results of a driving scenario in a study comparing the driving performance of novice teen drivers (n=21; 16–17-year-olds with 90 days of provisional licensure) with that of experienced adult drivers (n=17; 25–50-year-olds with at least 5 years of PA licensure, at least 100 miles driven per week, and no self-reported collisions in the previous 3 years). As part of a 30- to 35-minute simulated drive that encompassed the most common scenarios resulting in serious crashes, participants were exposed to a sudden car event. As the participant drove on a suburban road, a car surged from a driveway hidden by a fence on the right side of the road. To avoid the crash, participants had to brake hard, exhibiting dynamic control over both attentional and motor resources. The results showed strong differences between the experienced adult and novice teen drivers in the brake pressure applied. When placed in the same situation, the novice teens decelerated on average 50% less than the experienced adults (p<0.01). PMID:26709330

  15. Java simulations of embedded control systems.

    PubMed

    Farias, Gonzalo; Cervin, Anton; Arzén, Karl-Erik; Dormido, Sebastián; Esquembre, Francisco

    2010-01-01

    This paper introduces a new open-source Java library suited for the simulation of embedded control systems. The library is based on the ideas and architecture of TrueTime, a Matlab toolbox devoted to this topic, and allows Java programmers to simulate the performance of control processes that run in a real-time environment. Such simulations can considerably improve the learning and design of multitasking real-time systems. The choice of Java considerably increases the usability of our library, not only because many educators already program in this language, but also because the library can be easily used by Easy Java Simulations (EJS), a popular modeling and authoring tool that is increasingly used in the field of Control Education. EJS allows instructors, students, and researchers with less programming experience to create advanced interactive simulations in Java. The paper describes the ideas, implementation, and sample use of the new library both for pure Java programmers and for EJS users. The JTT library and some examples are available online at http://lab.dia.uned.es/jtt.

  16. Modular Aero-Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Guo, Ten-Huei

    2006-01-01

    The Modular Aero-Propulsion System Simulation (MAPSS) is a graphical simulation environment designed for the development of advanced control algorithms and rapid testing of these algorithms on a generic computational model of a turbofan engine and its control system. MAPSS is a nonlinear, non-real-time simulation comprising a Component Level Model (CLM) module and a Controller-and-Actuator Dynamics (CAD) module. The CLM module simulates the dynamics of engine components at a sampling rate of 2,500 Hz. The controller submodule of the CAD module simulates a digital controller, which has a typical update rate of 50 Hz. The sampling rate for the actuators in the CAD module is the same as that of the CLM. MAPSS provides a graphical user interface that affords easy access to engine-operation, engine-health, and control parameters; is used to enter such input model parameters as power lever angle (PLA), Mach number, and altitude; and can be used to change controller and engine parameters. Output variables are selectable by the user. Output data as well as any changes to constants and other parameters can be saved and reloaded into the GUI later.
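
    The two-rate structure (2,500 Hz plant and actuators, 50 Hz digital controller) can be sketched as below; the dynamics and control law are placeholders, not the MAPSS engine model.

```python
DT_PLANT = 1.0 / 2500.0          # CLM and actuator sampling period
STEPS_PER_CTRL = 2500 // 50      # controller updates once per 50 plant steps

x, u, setpoint = 0.0, 0.0, 1.0
for k in range(2500):            # one second of simulated time
    if k % STEPS_PER_CTRL == 0:
        u = 0.5 * (setpoint - x)         # 50 Hz digital control law (placeholder)
    x += DT_PLANT * (-x + u)             # 2,500 Hz plant integration (placeholder)
print(x)
```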

  17. Java Simulations of Embedded Control Systems

    PubMed Central

    Farias, Gonzalo; Cervin, Anton; Årzén, Karl-Erik; Dormido, Sebastián; Esquembre, Francisco

    2010-01-01

    This paper introduces a new open-source Java library suited for the simulation of embedded control systems. The library is based on the ideas and architecture of TrueTime, a Matlab toolbox devoted to this topic, and allows Java programmers to simulate the performance of control processes that run in a real-time environment. Such simulations can considerably improve the learning and design of multitasking real-time systems. The choice of Java considerably increases the usability of our library, not only because many educators already program in this language, but also because the library can be easily used by Easy Java Simulations (EJS), a popular modeling and authoring tool that is increasingly used in the field of Control Education. EJS allows instructors, students, and researchers with less programming experience to create advanced interactive simulations in Java. The paper describes the ideas, implementation, and sample use of the new library both for pure Java programmers and for EJS users. The JTT library and some examples are available online at http://lab.dia.uned.es/jtt. PMID:22163674

  18. Cassini radar: system concept and simulation results

    NASA Astrophysics Data System (ADS)

    Melacci, P. T.; Orosei, R.; Picardi, G.; Seu, R.

    1998-10-01

    The Cassini mission is an international venture, involving NASA, the European Space Agency (ESA), and the Italian Space Agency (ASI), for the investigation of the Saturn system and, in particular, Titan. The Cassini radar will be able to see through Titan's thick, optically opaque atmosphere, allowing us to better understand the composition and the morphology of its surface; but the interpretation of the results, due to the complex interplay of many different factors determining the radar echo, will not be possible without extensive modeling of the functioning of the radar system and of the surface reflectivity. In this paper, a simulator of the multimode Cassini radar is described, after a brief review of our current knowledge of Titan and a discussion of the contribution of the Cassini radar in answering currently open questions. Finally, the results of the simulator are discussed. The simulator has been implemented on a RISC 6000 computer, considering only the active modes of operation, that is, altimeter and synthetic aperture radar. In the instrument simulation, strict reference has been made to the presently planned sequence of observations and to the radar settings, including burst and single-pulse duration, pulse bandwidth, pulse repetition frequency, and all other parameters which may be changed, and possibly optimized, according to the operative mode. The observed surfaces are simulated by a facet model, allowing the generation of surfaces with Gaussian or non-Gaussian roughness statistics, together with the possibility of assigning to the surface an average behaviour which can represent, for instance, a flat surface or a crater. The results of the simulation are discussed in order to check the analytical evaluations of the models of the average received echoes and of the attainable performance. In conclusion, the simulation results should allow the validation of the theoretical evaluations of the capabilities of microwave instruments, when

  19. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    PubMed Central

    Nguyen, Vinh Hao; Suh, Young Soo

    2009-01-01

    This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD), sensor data are transmitted to the estimator node if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than time-triggered sampling in some situations, especially in improving network bandwidth usage. However, it cannot detect packet dropouts, because data transmission and reception do not use a periodic time-stamp mechanism as found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme, called modified SOD, in which sensor data are sent when either the change in sensor output exceeds a given threshold or more than a given interval has elapsed since the last transmission. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen. PMID:22574063
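
    The modified SOD rule is straightforward to state in code: transmit when the output change exceeds a threshold delta, or when more than t_max has elapsed since the last transmission (so the estimator can distinguish "no event" from a dropped packet). Parameter names below are illustrative.

```python
def should_send(y, y_last_sent, t, t_last_sent, delta, t_max):
    return abs(y - y_last_sent) > delta or (t - t_last_sent) > t_max

# usage over a stream of (time, sample) pairs
y_last, t_last = 0.0, 0.0
for t, y in [(0.1, 0.02), (0.2, 0.31), (0.3, 0.33), (1.5, 0.35)]:
    if should_send(y, y_last, t, t_last, delta=0.25, t_max=1.0):
        y_last, t_last = y, t            # transmit and remember
        print(f"send at t={t}: y={y}")   # fires at t=0.2 (delta) and t=1.5 (timeout)
```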

  1. Parachute system design, analysis, and simulation tool

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-01-01

    For over twenty years, designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash, and line sail. In addition to these codes, material property databases have been acquired. Recently we initiated a project to integrate these codes and databases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS, describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials database including high-rate-of-strain data.

  2. Simulation of Flywheel Energy Storage System Controls

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Wolff, Frederick J.; Dravid, Narayan

    2001-01-01

    This paper presents the progress made in the controller design and operation of a flywheel energy storage system. The switching logic for the converter bridge circuit has been redefined to reduce line-current harmonics, even at the highest operating speed of the permanent-magnet motor-generator. An electromechanical machine model is utilized to simulate the charging and discharging of the inertial energy stored in the flywheel. Controlling the magnitude of the phase currents regulates the rate of charge and discharge. The resulting improvements are demonstrated by simulation.

  3. Event Generators for Simulating Heavy Ion Interactions of Interest in Evaluating Risks in Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.; Pinsky, Lawrence; Andersen, Victor; Empl, Anton; Lee, Kerry; Smirmov, Georgi; Zapp, Neal; Ferrari, Alfredo; Tsoulou, Katerina; Roesler, Stefan; Vlachoudis, Vasilis

    2005-01-01

    Simulating the space radiation environment with Monte Carlo codes, such as FLUKA, requires the ability to model the interactions of heavy ions as they penetrate spacecraft and crew members' bodies. Monte-Carlo-type transport codes use total interaction cross sections to determine probabilistically when a particular type of interaction has occurred. Then, at that point, a distinct event generator is employed to determine separately the results of that interaction. The space radiation environment contains a full spectrum of radiation types, including relativistic nuclei, which are the most important component for the evaluation of crew doses. Interactions between incident protons and target nuclei in the spacecraft materials and crew members' bodies are well understood. However, the situation is substantially less satisfactory for incident heavier nuclei (heavy ions). We have been engaged in developing several related heavy ion interaction models based on a Quantum Molecular Dynamics-type approach for energies up through about 5 GeV per nucleon (GeV/A), as part of a NASA Consortium that includes a parallel program of cross section measurements to guide and verify this code development.
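
    The probabilistic use of a total cross section can be sketched as sampling an exponential free path with mean 1/(n·sigma); the numbers below are order-of-magnitude illustrations, not FLUKA data.

```python
import numpy as np

rng = np.random.default_rng(2)
n_target = 6.0e22        # target nuclei per cm^3 (order of a light metal)
sigma_tot = 1.5e-24      # total interaction cross section, cm^2 (~1.5 barn)
mfp = 1.0 / (n_target * sigma_tot)          # mean free path, cm

paths = rng.exponential(mfp, size=100_000)  # sampled distances to interaction
print(f"mean free path {mfp:.1f} cm; sampled mean {paths.mean():.1f} cm")
# at each sampled point, a distinct event generator (e.g., a QMD-type model
# for heavy ions) would then determine the outcome of the interaction
```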

  4. On the use of Paleo DEMS for Simulation of historical Tsunami Events

    NASA Astrophysics Data System (ADS)

    Wronna, Martin; Baptista, Maria Ana; Götz, Joachim

    2016-04-01

    In this study, we present a methodology for reconstructing a Paleo Digital Elevation Model (PDEM) to account for geomorphological changes between the present and the desired paleo period. We aim to simulate historical tsunami propagation in the geomorphological context of the time of the event. The methodology combines historical data and GPS measurements with more recent LIDAR data to build PDEMs. Antique maps are georeferenced to obtain the locations of landform and building features; analysis and interpretation of historical accounts, descriptions, and old pictures serve to attribute approximate elevations to these features and to estimate the original outline of a given site. The outlines of river mouths and watercourses can be rebuilt from the boundaries given in the antique maps, and analysis of present-day river mouths with similar characteristics permits the reconstruction of the antique watercourses. GPS-RTK measurements along chosen river mouths in similar geomorphological environments are used to derive their inclination. We applied this methodology to the 1 November 1755 flooding of Cascais, Portugal. Our results show that, using the PDEM, we can reproduce the inundation described in most of the historical accounts. This study received funding from project ASTARTE (Assessment, Strategy and Risk Reduction for Tsunamis in Europe), a collaborative project, Grant 603839, FP7-ENV2013 6.4-3.

  5. Topological events in two-dimensional grain growth: Experiments and simulations

    SciTech Connect

    Fradkov, V.E.; Glicksman, M.E.; Palmer, M.; Rajan, K. . Materials Engineering Dept.)

    1994-08-01

    Grain growth in polycrystals is a process that occurs as a result of the vanishing of small grains. The mean topological class of vanishing two-dimensional (2-D) grains was found experimentally to be about 4.5. This result suggests that most vanishing grains are either 4- or 5-sided. A recent theory of 2-D grain growth is explicitly based on this fact, treating switching as random events. The shrinking of 4- and 5-sided two-dimensional grains was observed experimentally in polycrystalline films of transparent, pure succinonitrile (SCN). Grain shrinking was studied theoretically and simulated by computer (both dynamic and Monte Carlo simulations). It was found that most shrinking grains are topologically stable and remain within their topological class until they are much smaller than their neighbors. Differences found among the behavior of 2-D polycrystals, a 2-D ideal soap froth, and a 2-D section of a 3-D grain structure are discussed.

  6. Hybrid stochastic simulations of intracellular reaction-diffusion systems

    PubMed Central

    Kalantzis, Georgios

    2009-01-01

    With the observation that stochasticity is important in biological systems, chemical kinetics has begun to receive wider interest. While Monte Carlo discrete-event simulations most accurately capture the variability of molecular species, they become computationally costly for complex reaction-diffusion systems with large populations of molecules. On the other hand, continuous-time models are computationally efficient but fail to capture any variability in the molecular species. In this study a novel hybrid stochastic approach is introduced for simulating reaction-diffusion systems. We developed a dynamic partitioning strategy using fractional propensities. In this way, processes with high frequency are simulated mostly with deterministic rate-based equations, and those with low frequency mostly with the exact stochastic algorithm of Gillespie. Thus we preserve the stochastic behavior of cellular pathways while being able to apply the method to large populations of molecules. In this article we describe this hybrid algorithmic approach and demonstrate its accuracy and efficiency compared with the Gillespie algorithm for two different systems: first, a model of intracellular viral kinetics with two steady states, and second, a compartmental model of the postsynaptic spine head for studying the dynamics of Ca2+ and NMDA receptors. PMID:19414282
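
    The partitioning idea can be sketched as below: reactions whose propensity exceeds a threshold are advanced with rate equations, the rest with an SSA-style draw. This coarse toy ignores the paper's fractional-propensity refinement and dynamic repartitioning details.

```python
import numpy as np

rng = np.random.default_rng(3)

def hybrid_step(x, propensity_fns, stoich, threshold, dt):
    a = np.array([f(x) for f in propensity_fns])
    fast = a > threshold
    x = x + dt * (stoich[:, fast] @ a[fast])      # fast subset: rate equations
    a_slow = np.where(fast, 0.0, a)               # slow subset: stochastic
    if a_slow.sum() > 0 and rng.random() < 1.0 - np.exp(-a_slow.sum() * dt):
        j = rng.choice(len(a), p=a_slow / a_slow.sum())
        x = x + stoich[:, j]                      # fire one slow reaction
    return x

# toy usage: one species with constant production and linear decay
fns = [lambda x: 50.0, lambda x: 0.05 * x[0]]
S = np.array([[1.0, -1.0]])
x = np.array([0.0])
for _ in range(1000):
    x = hybrid_step(x, fns, S, threshold=10.0, dt=0.01)
print(x)   # relaxing toward the mean-field steady state 50/0.05 = 1000
```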

  7. Interactive simulation of digital communication systems

    NASA Astrophysics Data System (ADS)

    Modestino, J. W.; Matis, K. R.

    1984-01-01

    In this paper, efforts to develop a comprehensive tool for the digital simulation of a wide variety of point-to-point digital communication systems are described. These efforts have resulted in the interactive communications simulator (ICS), a flexible, graphics-oriented, and highly interactive hardware/software system consisting of a typical minicomputer acting as host to a fast peripheral array processor. This system is presently being employed both to evaluate existing modem performance and to explore new modulation/coding concepts appropriate for military, commercial, and space applications. A detailed functional description of the ICS is provided together with pertinent software considerations. An outline of existing ICS capabilities is presented and illustrated through typical graphical output. A discussion of channel-modeling considerations is provided. The use of the ICS in the overall design of receiver structures for impulsive noise channels is also illustrated.

  8. Simulating Complex Window Systems using BSDF Data

    SciTech Connect

    Konstantoglou, Maria; Jonsson, Jacob; Lee, Eleanor

    2009-06-22

    Nowadays, virtual models are commonly used to evaluate the performance of conventional window systems. Complex fenestration systems can be difficult to simulate accurately, not only because of their geometry but also because of their optical properties, which scatter light in an unpredictable manner. Bi-directional Scattering Distribution Functions (BSDFs) have recently been developed, based on a mixture of measurements and modelling, to characterize the optics of such systems. This paper describes the workflow needed to create and then use these BSDF datasets in the Radiance lighting simulation software. Limited comparisons are made between visualizations produced using the standard ray-tracing method, those produced using the BSDF method, and images taken in a full-scale outdoor mockup.

  9. Common modeling system for digital simulation

    NASA Technical Reports Server (NTRS)

    Painter, Rick

    1994-01-01

    The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems architecture and an object-based/oriented, standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis, Cost and Operational Effectiveness Analysis (COEA) tradeoffs, etc. Unfortunately, there have been no modeling and simulation (M&S) standards, guidelines, or efforts toward commonality in DOD M&S. In the typical scenario, an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result that unique maintenance was required. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g., user interface, database/database management system, data journalizing/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.

  10. Runway Incursion Prevention System Simulation Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.

    2002-01-01

    A Runway Incursion Prevention System (RIPS) was evaluated in a full-mission simulation study at the NASA Langley Research Center in March 2002. RIPS integrates airborne and ground-based technologies to provide (1) enhanced surface situational awareness to avoid blunders and (2) alerts of runway conflicts in order to prevent runway incidents, while also improving operational capability. A series of test runs was conducted in a high-fidelity simulator. The purpose of the study was to evaluate the RIPS airborne incursion detection algorithms and associated alerting and airport surface display concepts. Eight commercial airline crews participated as test subjects, completing 467 test runs. This paper gives an overview of the RIPS, the simulation study, and the test results.

  11. Events Management Education through CD-ROM Simulation at Victoria University of Technology.

    ERIC Educational Resources Information Center

    Perry, Marcia; And Others

    There has been rapid growth in the events industry in Victoria and Australia over the past five years, with an increase in large-scale events resulting in substantive economic impact. The growth in events in Australia is projected to continue beyond 2001. The Department of Management at Victoria University of Technology (VU) received a…

  12. Looking Back: Events That Have Shaped Our Current Child Care Delivery System.

    ERIC Educational Resources Information Center

    Neugebauer, Roger

    2000-01-01

    Reports findings of an unscientific survey of early childhood professionals asked to reflect upon the history, landmark events, and significant trends in the child care delivery system. Three events viewed as most influential are highlighted: (1) World War II; (2) the women's movement; and (3) Head Start. Eleven other events also cited are discussed.…

  13. Efficient multiscale simulation of simple metallic systems

    NASA Astrophysics Data System (ADS)

    Choly, Nicholas Isaac

    2004-12-01

    The steady increase in computational resources and numerical sophistication has brought about a new approach in physical simulation. The methods that comprise this approach are known as multiscale methods, and have the defining characteristic of combining several simulation methods together, rendering tractable physical problems that no single simulation method can resolve. We have developed an approach for coupling quantum-mechanical and classical methods for the efficient simulation of multiscale problems in simple metals. The present multiscale method employs orbital-free density functional theory, in which fictitious orbitals are never introduced. We review the theory, and describe the state-of-the-art functionals associated with it. We have developed an efficient simulation code for performing orbital-free density functional theory calculations, and we describe the methods developed to treat the functional minimization problem. One of the biggest barriers hindering the widespread use of orbital-free methods is that only local pseudopotentials can be used, and hence the powerful machinery of norm-conserving pseudopotentials is inapplicable. We develop a similar machinery for local pseudopotentials, and we report on the application of these methods. We solve several problems associated with the efficient use of orbital-free density functional methods. Certain orbital-free methods are formulated in reciprocal space and are applicable to periodic systems. Incorporation of these methods in a multiscale setting requires that the effects of periodicity be absent. A direct translation of the methods to real space is extremely inefficient. Motivated by these considerations, we have developed an efficient method for applying orbital-free methods to non-periodic systems. We also overcome an algorithmic problem with the calculation of ionic forces in grid-based electronic structure methods in general. We develop and test an efficient method for computing ionic forces that
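
    For orientation, the defining feature of orbital-free density functional theory is that the kinetic energy is written as an explicit functional of the density rather than of orbitals. A minimal sketch of the simplest such functional, the Thomas-Fermi kinetic energy evaluated on a uniform grid, is below; the grid, density, and box size are toy values, not the thesis's state-of-the-art functionals.

    ```python
    import numpy as np

    # Thomas-Fermi constant in Hartree atomic units: (3/10) * (3*pi^2)^(2/3).
    C_F = (3.0 / 10.0) * (3.0 * np.pi ** 2) ** (2.0 / 3.0)

    def thomas_fermi_energy(density, voxel_volume):
        """Thomas-Fermi kinetic energy C_F * integral of n^(5/3) on a grid."""
        return C_F * np.sum(density ** (5.0 / 3.0)) * voxel_volume

    # Toy example: uniform electron gas, density 0.01 a.u., in a 10^3 bohr^3 box.
    n = np.full((20, 20, 20), 0.01)
    dV = (10.0 / 20) ** 3
    print(thomas_fermi_energy(n, dV))  # kinetic energy in Hartree
    ```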

  14. Holodeck: Telepresence Dome Visualization System Simulations

    NASA Technical Reports Server (NTRS)

    Hite, Nicolas

    2012-01-01

    This paper explores the simulation and evaluation of different image-projection strategies for the Holodeck, a dome that will be used for highly immersive telepresence operations in future endeavors of the National Aeronautics and Space Administration (NASA). Its visualization system will include full 360-degree projection onto the dome's interior walls in order to display video streams from both simulations and recorded video. Because humans innately trust their vision to precisely report their surroundings, the Holodeck's visualization system is crucial to its realism. This system will be rigged with an integrated hardware and software infrastructure, namely a system of projectors that relay with a Graphics Processing Unit (GPU) and computer to both project images onto the dome and correct warping in those projections in real time. Using both Computer-Aided Design (CAD) and ray-tracing software, virtual models of various dome/projector geometries were created and simulated via tracking and analysis of virtual light sources, leading to the selection of two possible configurations for installation. Research into image warping and the generation of dome-ready video content was also conducted, including the generation of fisheye images, distortion correction, and the development of a reliable content-generation pipeline.
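
    As a point of reference for the fisheye-generation step, the sketch below maps a pixel of an angular (equidistant) fisheye image to its 3D view ray, the basic transform behind rendering dome-ready fisheye content. The projection model and field of view are common conventions assumed here, not details taken from the paper.

    ```python
    import numpy as np

    def fisheye_ray(u, v, fov_deg=180.0):
        """Map normalized fisheye image coords (u, v in [-1, 1]) to a unit
        view ray, using the equidistant ('angular') model r proportional
        to theta."""
        r = np.hypot(u, v)
        if r > 1.0:
            return None                        # outside the fisheye circle
        theta = r * np.radians(fov_deg) / 2.0  # polar angle from view axis
        phi = np.arctan2(v, u)                 # azimuth around the axis
        return np.array([np.sin(theta) * np.cos(phi),
                         np.sin(theta) * np.sin(phi),
                         np.cos(theta)])

    print(fisheye_ray(0.0, 0.0))  # center pixel looks straight along the axis
    print(fisheye_ray(1.0, 0.0))  # rim pixel looks along the horizon for 180 deg
    ```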

  15. Improving Customer Waiting Time at a DMV Center Using Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Arnaout, Georges M.; Bowling, Shannon

    2010-01-01

    Virginia's Department of Motor Vehicles (DMV) serves a customer base of approximately 5.6 million licensed drivers and ID card holders and 7 million registered vehicle owners. The DMV has more daily face-to-face contact with Virginia's citizens than any other state agency [1]. The DMV faces a major difficulty in keeping up with an excessively large customer arrival rate. The consequences are queues building up, stretching to the entrance doors (and sometimes even outside), and customers complaining. While the DMV state employees serve at their fastest pace, the remarkably long queues indicate a serious problem with the DMV's services that must be dealt with rapidly. Simulation is considered one of the best tools for evaluating and improving complex systems. In this paper, we use it to model one of the DMV centers, located in Norfolk, VA. The system is modeled in Arena 10.0 from Rockwell. The data used were collected from experts at the DMV Virginia headquarters in Richmond. The model created was verified and validated. The intent of this study is to identify the key problems causing delays at the DMV centers and to suggest possible solutions to minimize customers' waiting time. In addition, two tentative hypotheses aiming to improve the model's design are tested and validated.
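
    The paper's model is built in Arena, but the mechanics of a discrete-event queueing simulation can be illustrated in a few lines. The sketch below is a minimal multi-server queue driven by an event list, assuming Poisson arrivals and exponential service; the rates and server count are illustrative, not values from the DMV study.

    ```python
    import heapq, random

    def simulate_center(n_servers=5, arrival_rate=1.0, service_rate=0.25,
                        n_customers=2000):
        """Minimal discrete-event M/M/c sketch: returns mean customer wait."""
        random.seed(1)
        free, queue, waits = n_servers, [], []
        events = [(random.expovariate(arrival_rate), 'arrive')]
        arrived = 0
        while events:
            t, kind = heapq.heappop(events)   # advance to the earliest event
            if kind == 'arrive':
                arrived += 1
                if arrived < n_customers:     # schedule the next arrival
                    heapq.heappush(events, (t + random.expovariate(arrival_rate), 'arrive'))
                if free:                      # start service immediately
                    free -= 1
                    waits.append(0.0)
                    heapq.heappush(events, (t + random.expovariate(service_rate), 'depart'))
                else:                         # otherwise join the queue
                    queue.append(t)
            else:                             # a departure frees a server
                if queue:
                    waits.append(t - queue.pop(0))
                    heapq.heappush(events, (t + random.expovariate(service_rate), 'depart'))
                else:
                    free += 1
        return sum(waits) / len(waits)

    print(simulate_center())
    ```

    State changes (queue length, busy servers) happen only at event instants popped from the list, which is the defining feature of discrete-event simulation.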

  16. Simulation of a Production Facility with an Automated Transport System

    SciTech Connect

    ABRAMCZYK, GLENN

    2004-04-07

    A model was needed to assess material throughput and validate the conceptual design of a production facility, including equipment lists and layout. The initial desire was to use a commercially available discrete event simulation package. However, the available software was found to be too limited in capability. Database interface software was used to develop autonomous intelligent manufacturing workstations and material transporters. The initial Extend model used to assess material throughput and develop equipment lists for the preconceptual design effort was upgraded with software add-ons from Simulation Dynamics, Inc. (SDI). Use of the SDI database interface allowed the upgraded model to include: 1. a material mass balance at any level of detail required by the user, and 2. a transport system model that includes all transport system movements, time delays, and transfers between systems. This model will assist in evaluating transport system capacity, sensitive time delays in the system, and optimal operating strategies. An additional benefit of using the SDI database interface is dramatically improved run time performance. This allows significantly more runs to be completed to provide better statistics for overall plant performance. The model has all system and process parameters entered into sub-component accessible tables. All information for the manufactured items and process data is automatically generated and written to the database. The standard software is used for the movement of manufactured items between workstations, and for sequence and timing functions. Use of the database permits almost unlimited process control and data collection with an insignificant effect on run time.

  17. Simulation of the infrared signature of transient luminous events in the middle atmosphere for a limb line of sight

    NASA Astrophysics Data System (ADS)

    Romand, Frédéric; Croizé, Laurence; Payan, Sébastien; Huret, Nathalie

    2016-04-01

    Transient Luminous Events (TLE) are electrical and optical events which occur above thunderstorms. Visual signatures have been reported since the beginning of the 20th century, but the first picture was accidentally recorded by a television camera in 1989. Their occurrence is closely linked with the lightning activity below thunderstorms. TLEs are observed from the base of the stratosphere to the thermosphere (15-110 km). They are very brief phenomena, lasting from 1 to 300 milliseconds. At a worldwide scale, four TLEs occur each minute. The energy deposition, of about a few tens of megajoules, is able to ionize, dissociate, and excite the molecules of the atmosphere. Atmospheric discharges in the troposphere are important sources of NO and NO2, and TLEs might have the same effects at higher altitudes, in the stratosphere. NOx can then affect the concentrations of O3 and OH. Consequently, TLEs could be locally important contributors to the chemical budget of the middle atmosphere. The perturbation of atmospheric chemistry induced by a TLE locally modifies the infrared radiation during the minutes following the event. The interest of studying the infrared signature of a TLE is twofold. For the atmospheric sciences, it makes it possible to link the perturbed composition to the resulting infrared spectrum. In addition, some defense systems, such as detection and guidance devices, are equipped with airborne infrared sensors that a TLE infrared signature might disturb. We want to obtain a quantitative and kinetic evaluation of the infrared signature of the atmosphere locally perturbed by a TLE. In order to do so, we must model three phenomena. 1) The plasma/chemistry coupling, which describes how the different energy levels of atmospheric molecules are populated by the energy deposition of the TLE; this step lasts the duration of the lightning itself. 2) The chemical kinetics, which describes how these populations will evolve in the following minutes. 3) The

  18. Dynamic triggering of creep events in the Salton Trough, Southern California by regional M ≥ 5.4 earthquakes constrained by geodetic observations and numerical simulations

    NASA Astrophysics Data System (ADS)

    Wei, Meng; Liu, Yajing; Kaneko, Yoshihiro; McGuire, Jeffrey J.; Bilham, Roger

    2015-10-01

    Since a regional earthquake in 1951, shallow creep events on strike-slip faults within the Salton Trough, Southern California have been triggered at least 10 times by M ≥ 5.4 earthquakes within 200 km. The high earthquake and creep activity and the long history of digital recording within the Salton Trough region provide a unique opportunity to study the mechanism of creep event triggering by nearby earthquakes. Here, we document the history of fault creep events on the Superstition Hills Fault based on data from creepmeters, InSAR, and field surveys since 1988. We focus on a subset of these creep events that were triggered by significant nearby earthquakes. We model these events by adding realistic static and dynamic perturbations to a theoretical fault model based on rate- and state-dependent friction. We find that the static stress changes from the causal earthquakes are less than 0.1 MPa and too small to instantaneously trigger creep events. In contrast, we can reproduce the characteristics of triggered slip with dynamic perturbations alone. The instantaneous triggering of creep events depends on the peak and the time-integrated amplitudes of the dynamic Coulomb stress change. Based on observations and simulations, the stress change amplitude required to trigger a creep event of a 0.01-mm surface slip is about 0.6 MPa. This threshold is at least an order of magnitude larger than the reported triggering threshold of non-volcanic tremors (2-60 kPa) and earthquakes in geothermal fields (5 kPa) and near shale gas production sites (0.2-0.4 kPa), which may result from differences in effective normal stress, fault friction, the density of nucleation sites in these systems, or triggering mechanisms. We conclude that shallow frictional heterogeneity can explain both the spontaneous and dynamically triggered creep events on the Superstition Hills Fault.
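
    For orientation, the static perturbation the authors rule out is conventionally quantified by the change in Coulomb failure stress. A minimal sketch of that bookkeeping is below, with an assumed effective friction coefficient and toy stress values rather than numbers from the study.

    ```python
    def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_eff=0.4):
        """Static Coulomb failure stress change dCFS = d_tau + mu' * d_sigma_n,
        with the normal stress change positive when it unclamps the fault."""
        return d_shear_mpa + mu_eff * d_normal_mpa

    # Toy numbers (not from the paper): 0.05 MPa of shear loading plus
    # 0.02 MPa of unclamping on the receiver fault.
    print(coulomb_stress_change(0.05, 0.02))  # ~0.058 MPa, well below the
                                              # ~0.6 MPa triggering threshold
    ```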

  19. Exupery volcano fast response system - The event detection and waveform classification system

    NASA Astrophysics Data System (ADS)

    Hammer, Conny; Ohrnberger, Matthias

    2010-05-01

    Volcanic eruptions are often preceded by seismic activity, which can be used to quantify volcanic unrest since the number and size of certain types of seismic events usually increase before periods of volcanic crisis. The implementation of an automatic detection and classification system for seismic signals of volcanic origin not only allows large amounts of data to be processed in a short time, but also provides consistent and time-invariant results. Here, we have developed a system based upon a combination of different methods. To enable a first robust event detection in the continuous data stream, different modules are implemented in the real-time system Earthworm, which is widely used in active volcano monitoring observatories worldwide. Among those software modules are classical trigger algorithms such as STA/LTA and cross-correlation master-event matching, which is also used to detect different classes of signals. Furthermore, an additional module is implemented in the real-time system to compute continuous activity parameters, which are also used to quantify the volcanic activity. Most automatic classification systems need a sufficiently large pre-classified data set for training. However, in case of a volcanic crisis we are often confronted with a lack of training data: prior data acquisition might have been carried out with different equipment at a small number of sites, and due to the imminent crisis there might be no time for the time-consuming and tedious process of preparing a training data set. For this reason we have developed a novel seismic event spotting technique in order to be less dependent on the existence of previously acquired databases of event classes. One main goal is therefore to provide observatory staff with a robust event classification based on a minimum number of reference waveforms. By using a "learning-while-recording" approach we are allowing for the fast build-up of a
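
    The STA/LTA trigger named above is the ratio of a short-term to a long-term moving average of signal energy; a detection is declared when the ratio exceeds a threshold. Below is a minimal self-contained sketch on synthetic data; the window lengths and threshold are generic choices, not the Earthworm module's settings.

    ```python
    import numpy as np

    def sta_lta(signal, n_sta, n_lta):
        """Classic STA/LTA detector on signal energy, via cumulative sums."""
        energy = signal.astype(float) ** 2
        csum = np.cumsum(energy)
        sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta  # short-term average
        lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta  # long-term average
        m = min(len(sta), len(lta))                   # align window end points
        return sta[-m:] / (lta[-m:] + 1e-12)

    rng = np.random.default_rng(0)
    trace = rng.normal(0, 1, 5000)
    trace[3000:3200] += rng.normal(0, 8, 200)  # synthetic "event" burst
    ratio = sta_lta(trace, n_sta=50, n_lta=1000)
    print(ratio.max() > 4.0)                   # True: the burst triggers
    ```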

  20. Hidden Conformation Events in DNA Base Extrusions: A Generalized Ensemble Path Optimization and Equilibrium Simulation Study.

    PubMed

    Cao, Liaoran; Lv, Chao; Yang, Wei

    2013-08-13

    DNA base extrusion is a crucial component of many biomolecular processes. Elucidating how bases are selectively extruded from the interior of double-stranded DNA is pivotal to accurately understanding, and efficiently sampling, this general type of conformational transition. In this work, the on-the-path random walk (OTPRW) method, the first generalized-ensemble sampling scheme designed for finite-temperature string path optimizations, was improved and applied to obtain the minimum free energy path (MFEP) and the free energy profile of a classical B-DNA major-groove base extrusion pathway. Along the MFEP, an intermediate state and the corresponding transition state were located and characterized. The MFEP result suggests that a base-plane-elongation event, rather than the commonly considered base-flipping event, is dominant in the transition-state formation portion of the pathway, and that the energetic penalty at the transition state is mainly introduced by stretching of the Watson-Crick base pair. Moreover, to facilitate the essential base-plane-elongation dynamics, the surrounding environment of the flipped base needs to be intimately involved. Further taking advantage of the extended-dynamics nature of the OTPRW Hamiltonian, an equilibrium generalized-ensemble simulation was performed along the optimized path, and based on the collected samples, several base-flipping (opening) angle collective variables were evaluated. Consistent with the MFEP result, the collective variable analysis reveals that none of these commonly employed flipping (opening) angles alone can adequately represent the base extrusion pathway, especially in the pre-transition-state portion. As further revealed by the analysis, the base-pairing partner of the extrusion target undergoes a series of in-plane rotations to facilitate the base-plane-elongation dynamics. A base-plane rotation angle is identified to be a possible reaction coordinate to represent

  1. Workshop on data acquisition and trigger system simulations for high energy physics

    SciTech Connect

    1992-12-31

    This report discusses the following topics: DAQSIM: A Data Acquisition System Simulation Tool; Front End and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit and the Design of a Queue for this Circuit; Fast Data Compression and Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in MODSIM; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ, and Online Processing at the SSC; Planned Enhancements to MODSIM II and SIMOBJECT: An Overview; DAGAR: A Synthesis System; Proposed Silicon Compiler for Physics Applications; Timed-LOTOS in a PROLOG Environment: An Algebraic Language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies for Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; and A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.

  2. The development of an incident event reporting system for nursing students.

    PubMed

    Chiou, Shwu-Fen; Huang, Ean-Wen; Chuang, Jen-Hsiang

    2009-01-01

    Incident events may occur when nursing students are present in the clinical setting. Their inexperience and unfamiliarity with clinical practice put them at risk of making mistakes that could potentially harm patients and themselves. However, existing incident event reporting systems have deficiencies, including incomplete data and delayed reports. The purpose of this study was to develop an incident event reporting system for nursing students in clinical settings and to evaluate its effectiveness. The study was undertaken in three phases. In the first phase, a literature review and focus groups were used to develop the architecture of the reporting system. In the second phase, the reporting system was implemented, and data on incident events involving nursing students were collected for a 12-month period. In the third phase, a pre-post trial was undertaken to evaluate the performance of the reporting system. ASP.NET and Microsoft Access 2003 were used to create an interactive web-based interface and to design the database for the reporting system. Email notifications alerted the nursing student's teacher when an incident event was reported. One year after installing the reporting system, the number of reported incident events had increased tenfold, while the time to report an incident event and the time required to complete the reporting procedures were shorter than before implementation. The incident event reporting system thus appeared to be effective in reporting incident events more comprehensively and in shortening the time required to report them, compared with traditional written reports.

  3. An event-based model of braided river system aquifers heterogeneity based on Multiple Points Statistics

    NASA Astrophysics Data System (ADS)

    Renard, P.; Pirot, G.

    2012-12-01

    Braided rivers are common in mountainous regions like the Swiss Alps. These dynamic systems generate highly heterogeneous deposits and form an important part of the alluvial aquifers that are tapped for agriculture and drinking water supply. In this presentation, we propose to integrate large-scale, high-resolution LIDAR data into a pseudo-genetic approach embedding Multiple Point Statistics (MPS) to model the heterogeneity of such aquifers. One way to build 3D sedimentary models is to use descriptive methods, which translate data into conceptual facies models but do not offer uncertainty quantification. Another possibility is the use of stochastic models, but most of them do not include strong geological knowledge and their degree of realism can be rather weak. A third approach is to use process-based methods. In this work, we imitate the processes occurring during flood events by building successive topographies with the Direct Sampling (DS) multiple-point statistics algorithm. Each successive topography is conditioned by the previous one. All these steps are constrained by a series of LIDAR data sets used to train the algorithm. This differs from classical MPS models: we do not use MPS to model the lithofacies directly, but instead to simulate the processes that lead to the heterogeneity, in order to ensure that the higher-order statistics that can be inferred from field data are accurately reproduced. The use of DS is motivated by the fact that it is an MPS technique that allows continuous variables to be co-simulated. It is easy to condition to field data and offers a high degree of realism in the simulations. The underlying erosion-deposition process leaves records of each event in the form of remaining layers, which are populated with facies of different granulometry according to predefined rules that are functions of the geobody's shape and dimensions. Input parameters allow the aggradation/degradation intensity to be controlled.
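
    As a rough illustration of the Direct Sampling idea (not the authors' implementation), the toy 1D sketch below fills unknown grid nodes by scanning a training image for a location whose neighborhood is close enough to the conditioning data, then copying its value. The distance measure, scan budget, and threshold are all illustrative choices.

    ```python
    import numpy as np

    def direct_sampling_1d(training, sim, threshold=0.1, n_neighbors=3,
                           max_scan=500, rng=None):
        """Toy 1D Direct Sampling: fill NaNs in `sim` by scanning `training`
        for a spot whose neighborhood pattern is within `threshold` (mean
        absolute difference) of the conditioning data around the node."""
        rng = rng or np.random.default_rng(0)
        sim = sim.copy()
        for i in np.where(np.isnan(sim))[0]:
            # offsets of informed neighbors around node i
            offs = [o for o in range(-n_neighbors, n_neighbors + 1)
                    if o != 0 and 0 <= i + o < len(sim) and not np.isnan(sim[i + o])]
            best = training[rng.integers(len(training))]  # fallback: random draw
            for j in rng.integers(n_neighbors, len(training) - n_neighbors, max_scan):
                d = np.mean([abs(training[j + o] - sim[i + o]) for o in offs]) if offs else 0.0
                if d <= threshold:            # accept the first close-enough match
                    best = training[j]
                    break
            sim[i] = best
        return sim

    train = np.sin(np.linspace(0, 20, 400))        # stand-in training image
    grid = np.full(50, np.nan)
    grid[::10] = train[:50:10]                     # sparse conditioning data
    print(direct_sampling_1d(train, grid)[:12])
    ```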

  4. Simulation of a data archival and distribution system at GSFC

    NASA Technical Reports Server (NTRS)

    Bedet, Jean-Jacques; Bodden, Lee; Dwyer, AL; Hariharan, P. C.; Berbert, John; Kobler, Ben; Pease, Phil

    1993-01-01

    A version-0 Data Archive and Distribution System (DADS) is being developed at GSFC to support existing and pre-EOS Earth science datasets and to test Earth Observing System Data and Information System (EOSDIS) concepts. The performance of the DADS is predicted using a discrete event simulation model. The goals of the simulation were to estimate the amount of disk space needed and the time required to fulfill the DADS requirements for ingestion (14 GB/day) and distribution (48 GB/day). The model demonstrated that 4 mm and 8 mm stackers can play a critical role in improving the performance of the DADS, since it takes, on average, 3 minutes to manually mount/dismount a tape, compared to less than a minute with stackers. With two 4 mm stackers, two 8 mm stackers, and a single operator per shift, the DADS requirements can be met within 16 hours using a total of 9 GB of disk space. When the DADS has no stackers and depends entirely on operators to handle the distribution tapes, the simulation showed that the requirements can still be met within 16 hours, but a minimum of 4 operators per shift is required. The compression/decompression of data sets is very CPU intensive and relatively slow when performed in software, thereby contributing to an increase in the amount of disk space needed.

  5. HSI Prototypes for Human Systems Simulation Laboratory

    SciTech Connect

    Jokstad, Håkon; McDonald, Rob

    2015-09-01

    This report describes in detail the design and features of three Human System Interface (HSI) prototypes developed by the Institutt for Energiteknikk (IFE) in support of the U.S. Department of Energy's Light Water Reactor Sustainability Program under Contract 128420 through Idaho National Laboratory (INL). The prototypes are implemented for the Generic Pressurized Water Reactor simulator and installed in the Human Systems Simulation Laboratory at INL. The three prototypes are: 1) a power ramp display, 2) an RCS heat-up and cool-down display, and 3) an estimated-time-to-limit display. The power ramp display and the RCS heat-up/cool-down display are designed to give operators clear visual indications of how well they are performing their task relative to the target ramp, heat-up, or cool-down rate. The estimated-time-to-limit display is designed to help operators restore levels or pressures before automatic or required manual actions are activated.

  6. Effects of Solar Particle Event-Like Proton Radiation and/or Simulated Microgravity on Circulating Mouse Blood Cells

    PubMed Central

    Romero-Weaver, Ana L.; Lin, Liyong; Carabe-Fernandez, Alejandro; Kennedy, Ann R.

    2014-01-01

    Astronauts traveling in space missions outside of low Earth orbit will be exposed for longer times to a microgravity environment. In addition, the increased travel time involved in exploration class missions will result in an increased risk of exposure to significant doses of solar particle event (SPE) radiation. Both conditions could significantly affect the number of circulating blood cells. Therefore, it is critical to determine the combined effects of exposure to both microgravity and SPE radiation. The purpose of the present study was to assess these risks by evaluating the effects of SPE-like proton radiation and/or microgravity, as simulated with the hindlimb unloading (HU) system, on circulating blood cells using mouse as a model system. The results indicate that exposure to HU alone caused minimal or no significant changes in mouse circulating blood cell numbers. The exposure of mice to SPE-like proton radiation with or without HU treatment caused a significant decrease in the number of circulating lymphocytes, granulocytes and platelets. The reduced numbers of circulating lymphocytes, granulocytes, and platelets, resulting from the SPE-like proton radiation exposure, with or without HU treatment, in mice suggest that astronauts participating in exploration class missions may be at greater risk of developing infections and thrombotic diseases; thus, countermeasures may be necessary for these biological endpoints. PMID:25360441

  7. Effects of Solar Particle Event-Like Proton Radiation and/or Simulated Microgravity on Circulating Mouse Blood Cells.

    PubMed

    Romero-Weaver, Ana L; Lin, Liyong; Carabe-Fernandez, Alejandro; Kennedy, Ann R

    2014-08-01

    Astronauts traveling in space missions outside of low Earth orbit will be exposed for longer times to a microgravity environment. In addition, the increased travel time involved in exploration class missions will result in an increased risk of exposure to significant doses of solar particle event (SPE) radiation. Both conditions could significantly affect the number of circulating blood cells. Therefore, it is critical to determine the combined effects of exposure to both microgravity and SPE radiation. The purpose of the present study was to assess these risks by evaluating the effects of SPE-like proton radiation and/or microgravity, as simulated with the hindlimb unloading (HU) system, on circulating blood cells using mouse as a model system. The results indicate that exposure to HU alone caused minimal or no significant changes in mouse circulating blood cell numbers. The exposure of mice to SPE-like proton radiation with or without HU treatment caused a significant decrease in the number of circulating lymphocytes, granulocytes and platelets. The reduced numbers of circulating lymphocytes, granulocytes, and platelets, resulting from the SPE-like proton radiation exposure, with or without HU treatment, in mice suggest that astronauts participating in exploration class missions may be at greater risk of developing infections and thrombotic diseases; thus, countermeasures may be necessary for these biological endpoints.

  9. Data Systems Dynamic Simulation - A total system for data system design assessments and trade studies

    NASA Technical Reports Server (NTRS)

    Hooper, J. W.; Rowe, D. W.

    1978-01-01

    Data Systems Dynamic Simulation is a simulation system designed to reduce the cost and time of data system simulation while increasing its confidence and comprehensiveness. It is designed to simulate large data processing and communications systems end-to-end or by subsystem. Those features relevant to system timing, control, sizing, personnel support activities, cost, and external influences are modeled. Emphasis is placed on ease of use, comprehensive system performance measures, and extensive post-simulation analysis capability. The system has been used to support trade studies of NASA data system needs in the 1985 to 1990 time frame.

  10. A simple method for simulating gasdynamic systems

    NASA Technical Reports Server (NTRS)

    Hartley, Tom T.

    1991-01-01

    A simple method for performing digital simulation of gasdynamic systems is presented. The approach is somewhat intuitive and requires some knowledge of the physics of the problem as well as an understanding of finite-difference theory. The method is shown explicitly in Appendix A, which is taken from the book by P. J. Roache, 'Computational Fluid Dynamics,' Hermosa Publishers, 1982. The resulting method is relatively fast, though it sacrifices some accuracy.
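
    In the spirit of the simple finite-difference methods the report describes, here is a minimal sketch of the most basic such scheme: first-order upwind differencing applied to linear advection. It is a generic textbook building block under a CFL stability condition, not the report's specific method.

    ```python
    import numpy as np

    def upwind_step(u, a, dx, dt):
        """One first-order upwind step for u_t + a u_x = 0 with a > 0,
        using a backward difference and periodic boundaries."""
        c = a * dt / dx                     # CFL number, must satisfy |c| <= 1
        return u - c * (u - np.roll(u, 1))

    x = np.linspace(0, 1, 100, endpoint=False)
    u = np.exp(-200 * (x - 0.3) ** 2)       # Gaussian pulse initial condition
    for _ in range(50):
        u = upwind_step(u, a=1.0, dx=x[1] - x[0], dt=0.005)
    print(u.argmax() / 100)                 # pulse has advected to about x = 0.55
    ```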

  11. Optical simulations for Ambilight TV systems

    NASA Astrophysics Data System (ADS)

    Bruyneel, Filip; Lanoye, Lieve

    2012-06-01

    Ambilight is a unique Philips feature in which RGB LEDs are used to create a dynamic light halo around the television. This extends the screen and hence enhances the viewing experience, as it draws the viewer more into the action on the screen. The feature receives very positive consumer feedback. However, implementing Ambilight within the increasingly stringent design constraints of a slim and thin TV set is a challenging task. Optical simulations play a vital role in each step of Ambilight development. From prototype to final product, we use simulations alongside prototyping to guide the choice of LEDs, optical materials, and optical systems during the different phases of the design process. At each step, the impact of the optical system on the mechanical design and TV set dimensions needs to be taken into account. Moreover, optical simulations are essential to guarantee the required optical performance given a large spread in LED performance, mechanical tolerances, and material properties. Next to performance, optical efficiency is an important parameter for evaluating an optical design, as it establishes the required number of LEDs and the total LED power. As such, optical efficiency determines the thermal power that needs to be dissipated by the LED system. The innovation roadmap does not stop here. For future systems we see a miniaturization trend, in which smaller LED packages and smaller dies are used. This evolution makes the impact of mechanical tolerances on the optical design more severe. Consequently, it poses a whole new challenge to the way we use optical simulations in our design process.

  12. Simulation-based disassembly systems design

    NASA Astrophysics Data System (ADS)

    Ohlendorf, Martin; Herrmann, Christoph; Hesselbach, Juergen

    2004-02-01

    Recycling of Waste Electrical and Electronic Equipment (WEEE) is a matter of current concern, driven by economic, ecological, and legislative factors. Here, disassembly, as the first step of the treatment process, plays a key role. To achieve sustainable progress in WEEE disassembly, the key is not to limit analysis and planning to disassembly processes in a narrow sense, but to consider entire disassembly plants, including additional aspects such as internal logistics, storage, and sorting. In this regard, the paper presents ways of designing, dimensioning, structuring, and modeling different disassembly systems. The goal is to achieve efficient and economic disassembly systems that enable recycling processes complying with legal requirements. Moreover, the advantages of applying simulation software tools that are widespread and successfully utilized in conventional industry sectors are addressed. Such tools support systematic disassembly planning by means of simulation experiments with subsequent efficiency evaluation. Consequently, anticipatory recycling planning considering various scenarios is enabled, and decisions about which types of disassembly systems are appropriate for specific circumstances, such as product spectrum, throughput, and disassembly depth, are supported. Furthermore, the integration of simulation-based disassembly planning into a holistic concept, with configuration of interfaces and data utilization including cost aspects, is described.

  13. Digital system for structural dynamics simulation

    SciTech Connect

    Krauter, A.I.; Lagace, L.J.; Wojnar, M.K.; Glor, C.

    1982-11-01

    State-of-the-art digital hardware and software were incorporated in a system for the simulation of complex structural dynamic interactions, such as those which occur in rotating structures (engine systems). The system was designed to use an array of processors in which the computation for each physical subelement or functional subsystem would be assigned to a single specific processor in the simulator. These node processors are microprogrammed bit-slice microcomputers which function autonomously and can communicate with each other and with a central control minicomputer over parallel digital lines. Inter-processor nearest-neighbor communication buses pass the constants which represent physical constraints and boundary conditions. Each node processor is connected to its six nearest-neighbor node processors to simulate the actual physical interfaces of real substructures. Computer-generated finite element mesh and force models can be developed with the aid of the central control minicomputer. The control computer also oversees the animation of a graphics display system and disk-based mass storage, along with the individual processing elements.

  14. Digital system for structural dynamics simulation

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Lagace, L. J.; Wojnar, M. K.; Glor, C.

    1982-01-01

    State-of-the-art digital hardware and software were incorporated in a system for the simulation of complex structural dynamic interactions, such as those which occur in rotating structures (engine systems). The system was designed to use an array of processors in which the computation for each physical subelement or functional subsystem would be assigned to a single specific processor in the simulator. These node processors are microprogrammed bit-slice microcomputers which function autonomously and can communicate with each other and with a central control minicomputer over parallel digital lines. Inter-processor nearest-neighbor communication buses pass the constants which represent physical constraints and boundary conditions. Each node processor is connected to its six nearest-neighbor node processors to simulate the actual physical interfaces of real substructures. Computer-generated finite element mesh and force models can be developed with the aid of the central control minicomputer. The control computer also oversees the animation of a graphics display system and disk-based mass storage, along with the individual processing elements.

  15. Evolution of Storm-time Subauroral Electric Fields: RCM Event Simulations

    NASA Astrophysics Data System (ADS)

    Sazykin, S.; Spiro, R. W.; Wolf, R. A.; Toffoletto, F.; Baker, J.; Ruohoniemi, J. M.

    2012-12-01

    Subauroral polarization streams (SAPS) are regions of strongly enhanced westward ExB plasma drift (poleward-directed electric fields) located just equatorward of the evening auroral oval. Several recently installed HF (coherent scatter) radars in the SuperDARN chain at mid-latitudes present a novel opportunity for obtaining two-dimensional maps of ionospheric ExB flows at F-region altitudes that span several hours of the evening and nighttime subauroral ionosphere. These new observations of SAPS provide both an opportunity and a challenge for coupled magnetosphere-ionosphere models. In this paper, we use the Rice Convection Model (RCM) to simulate several events in which SAPS were observed by the mid-latitude SuperDARN chain. The RCM frequently predicts the occurrence of SAPS in the subauroral evening MLT sector; the mechanism is essentially current closure on the dusk side, where downward Birkeland currents (associated with the ion plasma sheet inner edge) map to a region of reduced ionospheric conductance just equatorward of the diffuse auroral precipitation (associated with the electron plasma sheet inner edge). We present detailed comparisons of model-computed ionospheric convection patterns with observations, with two goals in mind: (1) to analyze to what extent the observed appearance and time evolution of SAPS structures are driven by time variations of the cross-polar-cap potential drop (or, equivalently, the z-component of the interplanetary magnetic field), and (2) to evaluate the ability of the model to reproduce the spatial extent and magnitude of SAPS structures.

  16. A new method to estimate location and slip of simulated rock failure events

    NASA Astrophysics Data System (ADS)

    Heinze, Thomas; Galvan, Boris; Miller, Stephen Andrew

    2015-05-01

    At the laboratory scale, identifying and locating acoustic emissions (AEs) is a common method for short-term prediction of failure in geomaterials. Above-average AE activity typically precedes the failure process and is easily measured. At larger scales, an increase in micro-seismic activity sometimes precedes large earthquakes (e.g., Tohoku, L'Aquila, oceanic transforms) and can be used to assess seismic risk. The goal of this work is to develop a methodology and numerical algorithms for extracting, from the solution of the equations governing rock deformation, a measurable quantity analogous to AE. Since no physical property quantifying AE is derivable from the governing equations, an appropriate rock-mechanical analog needs to be found. In this work, we identify a general behavior of the AE generation process preceding rock failure. This behavior includes arbitrary localization of low-magnitude events during the pre-failure stage, followed by an increase in number and amplitude, and finally localization around the incipient failure plane during macroscopic failure. We propose the deviatoric strain rate as a numerical analog that mimics this behavior, and develop two different algorithms designed to detect rapid increases in deviatoric strain using moving averages. The numerical model solves a fully poro-elasto-plastic continuum model and is coupled to a two-phase flow model. We test our model by comparing simulation results with experimental data from drained compression and fluid injection experiments. We find in both cases that the occurrence and amplitude of our AE analog mimic the observed general behavior of the AE generation process. Our technique can be extended to modeling at the field scale, possibly providing a mechanistic basis for seismic hazard assessment from the seismicity that occasionally precedes large earthquakes.
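
    As a concrete reference for the proposed analog, the sketch below computes the scalar magnitude of the deviatoric part of a strain-rate tensor, the quantity whose rapid increase (flagged, for instance, by comparing short and long moving averages) would serve as the AE proxy. The tensor values are toy numbers, and this is not the authors' code.

    ```python
    import numpy as np

    def deviatoric_strain_rate_magnitude(L):
        """Second-invariant magnitude of the deviatoric strain rate:
        e = sym(L), dev = e - tr(e)/3 * I, returns sqrt(0.5 * dev:dev)."""
        e = 0.5 * (L + L.T)                       # symmetric strain-rate tensor
        dev = e - np.trace(e) / 3.0 * np.eye(3)   # remove the volumetric part
        return np.sqrt(0.5 * np.tensordot(dev, dev))

    L = np.array([[1.0e-5, 2.0e-6, 0.0],
                  [0.0, -4.0e-6, 0.0],
                  [0.0, 0.0, -6.0e-6]])           # toy velocity-gradient tensor
    print(deviatoric_strain_rate_magnitude(L))    # 1/s
    ```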

  17. An evidence-based laparoscopic simulation curriculum shortens the clinical learning curve and reduces surgical adverse events

    PubMed Central

    De Win, Gunter; Van Bruwaene, Siska; Kulkarni, Jyotsna; Van Calster, Ben; Aggarwal, Rajesh; Allen, Christopher; Lissens, Ann; De Ridder, Dirk; Miserez, Marc

    2016-01-01

    Background Surgical simulation is becoming increasingly important in surgical education. However, the method of simulation to be incorporated into a surgical curriculum is unclear. We compared the effectiveness of a proficiency-based preclinical simulation training in laparoscopy with conventional surgical training and conventional surgical training interspersed with standard simulation sessions. Materials and methods In this prospective single-blinded trial, 30 final-year medical students were randomized into three groups, which differed in the way they were exposed to laparoscopic simulation training. The control group received only clinical training during residency, whereas the interval group received clinical training in combination with simulation training. The Center for Surgical Technologies Preclinical Training Program (CST PTP) group received a proficiency-based preclinical simulation course during the final year of medical school but was not exposed to any extra simulation training during surgical residency. After 6 months of surgical residency, the influence on the learning curve while performing five consecutive human laparoscopic cholecystectomies was evaluated with motion tracking, time, Global Operative Assessment of Laparoscopic Skills, and number of adverse events (perforation of gall bladder, bleeding, and damage to liver tissue). Results The odds of adverse events were 4.5 (95% confidence interval 1.3–15.3) and 3.9 (95% confidence interval 1.5–9.7) times lower for the CST PTP group compared with the control and interval groups. For raw time, corrected time, movements, path length, and Global Operative Assessment of Laparoscopic Skills, the CST PTP trainees nearly always started at a better level and were never outperformed by the other trainees. Conclusion Proficiency-based preclinical training has a positive impact on the learning curve of a laparoscopic cholecystectomy and diminishes adverse events. PMID:27512343

  18. Investigating reaction pathways in rare events simulations of antibiotics diffusion through protein channels.

    PubMed

    Hajjar, Eric; Kumar, Amit; Ruggerone, Paolo; Ceccarelli, Matteo

    2010-11-01

    In Gram-negative bacteria, outer-membrane protein channels, such as OmpF of Escherichia coli, constitute the entry point of various classes of antibiotics. While antibacterial research and development is declining, bacterial resistance to antibiotics is rising, and there is an urgent call for new ways to develop potent antibacterial agents and to bring them to market faster and at reduced cost. An emerging strategy is to follow a bottom-up approach based on microscopically founded computational screening, but such a strategy needs better-tuned methods. Here we propose to use molecular dynamics (MD) simulations combined with the metadynamics algorithm to study antibiotic translocation through OmpF at the molecular scale. This recently designed algorithm overcomes the time-scale problem of classical MD by accelerating some reaction coordinates. The initial choice of reaction coordinates is expected to be a key determinant of the efficiency and accuracy of the simulations. Previous studies using different computational schemes for a similar process used only one reaction coordinate, the directionality of translocation. Here we go further and show how to include more informative reaction coordinates, accounting explicitly for (i) the antibiotic's flexibility and (ii) its interactions with the channel. As model systems, we select two compounds covering main classes of antibiotics, ampicillin and moxifloxacin. We decipher the molecular mechanism of translocation of each antibiotic and highlight the important parameters that should be taken into account for improving further simulations. This will benefit the screening and design of antibiotics with better permeation properties.
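
    For readers unfamiliar with the method, metadynamics discourages revisiting already-sampled states by accumulating a history-dependent bias along the chosen reaction coordinates, typically a sum of Gaussians deposited at visited values. The one-dimensional sketch below illustrates only that bookkeeping; the Gaussian height and width are arbitrary and unrelated to the paper's setup.

    ```python
    import numpy as np

    class MetadynamicsBias:
        """History-dependent bias along one collective variable s: Gaussians
        of height h and width sigma are deposited at visited values, pushing
        the system out of regions it has already sampled."""
        def __init__(self, height=0.5, sigma=0.1):
            self.h, self.sigma, self.centers = height, sigma, []

        def deposit(self, s):
            self.centers.append(s)  # drop a new Gaussian at the current value

        def potential(self, s):
            c = np.asarray(self.centers)
            if c.size == 0:
                return 0.0
            return float(np.sum(self.h * np.exp(-(s - c) ** 2 / (2 * self.sigma ** 2))))

    bias = MetadynamicsBias()
    for s in [0.0, 0.05, 0.1]:   # pretend the CV lingered near s = 0
        bias.deposit(s)
    print(bias.potential(0.0), bias.potential(1.0))  # large near 0, ~0 far away
    ```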

  19. Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Implementing new technology in aerospace propulsion systems is becoming prohibitively expensive and time-consuming. The main contributors to the high cost and long schedule are the need to perform many large-scale hardware tests and the inability to integrate all appropriate subsystems early in the design process. The NASA Glenn Research Center is developing the technologies required to enable simulations of full aerospace propulsion systems in sufficient detail to resolve critical design issues early in the design process, before hardware is built. This concept, called the Numerical Propulsion System Simulation (NPSS), is focused on the integration of multiple disciplines, such as aerodynamics, structures, and heat transfer, with computing and communication technologies to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to be a "numerical test cell" that enables full engine simulation overnight on cost-effective computing platforms. There are several key elements within NPSS that are required to achieve this capability: 1) clear data interfaces through the development and/or use of data exchange standards; 2) modular and flexible program construction through the use of object-oriented programming; 3) integrated multiple-fidelity analysis (zooming) techniques that capture the appropriate physics at the appropriate fidelity for the engine systems; 4) multidisciplinary coupling techniques; and finally 5) high-performance parallel and distributed computing. The current state of development in these five areas focuses on air-breathing gas turbine engines and is reported in this paper. However, many of the technologies are generic and can be readily applied to rocket-based systems and the combined cycles currently being considered for low-cost access-to-space applications. Recent accomplishments include: (1) the development of an industry-standard engine cycle analysis program and plug 'n play

  20. Event mean concentration and first flush effect from different drainage systems and functional areas during storms.

    PubMed

    Peng, Hai-Qin; Liu, Yan; Wang, Hong-Wu; Gao, Xue-Long; Ma, Lu-Ming

    2016-03-01

    This study aimed to investigate the characteristics of the event mean concentration (EMC) and the first flush effect (FFE) during typical rainfall events in outfalls from different drainage systems and functional areas. Stormwater outfall quality data were collected from five outfalls throughout Fuzhou City (China) during 2011-2012. Samples were analyzed for water quality parameters such as COD, NH3-N, TP, and SS. Analysis of the values indicated that the order of the EMCs at the outfalls was: intercepting combined system > direct-emission combined system > separated system. Most rainfall events showed an FFE at all outfalls. The order of FFE strength was: residential area of the direct-emission combined system > commercial area of the separated system > residential area of the intercepting combined system > office area of the separated system > residential area of the separated system. The results will serve as a guide for managing water quality to reduce pollution from drainage systems. PMID:26564194
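
    For reference, the EMC is conventionally the flow-weighted mean concentration over a storm, EMC = ∫C(t)Q(t)dt / ∫Q(t)dt. The sketch below evaluates it by trapezoidal integration for a made-up hydrograph and pollutograph; the numbers are illustrative, not data from the Fuzhou monitoring.

    ```python
    import numpy as np

    def event_mean_concentration(t, conc, flow):
        """Flow-weighted EMC = (integral of C*Q dt) / (integral of Q dt),
        computed with the trapezoidal rule over the storm event."""
        dt = np.diff(t)
        mass = np.sum(0.5 * (conc[1:] * flow[1:] + conc[:-1] * flow[:-1]) * dt)
        volume = np.sum(0.5 * (flow[1:] + flow[:-1]) * dt)
        return mass / volume

    t = np.array([0.0, 10, 20, 30, 40, 50, 60])        # minutes since runoff start
    q = np.array([0.0, 0.8, 1.5, 1.2, 0.6, 0.3, 0.1])  # flow, m^3/s
    c = np.array([0.0, 220, 150, 90, 60, 40, 30])      # COD, mg/L (first-flush shape)
    print(event_mean_concentration(t, c, q))           # mg/L, weighted to early flow
    ```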

  1. Was the nineteenth century giant eruption of Eta Carinae a merger event in a triple system?

    NASA Astrophysics Data System (ADS)

    Portegies Zwart, S. F.; van den Heuvel, E. P. J.

    2016-03-01

    We discuss the events that led to the giant eruption of Eta Carinae and find that the mid-nineteenth-century (1838-1843) giant mass-loss outburst has the characteristics of being produced by the merger of a massive close binary, triggered by the gravitational interaction with a massive third companion star, which is the current binary companion in the Eta Carinae system. We come to this conclusion by a combination of theoretical arguments supported by computer simulations using the Astrophysical Multipurpose Software Environment. According to this model, the ˜90 M⊙ present primary star of the highly eccentric Eta Carinae binary system is the product of this merger, and its ˜30 M⊙ companion originally was the third star in the system. In our model, the Homunculus nebula was produced by an extremely enhanced stellar wind, energized by tidal energy dissipation prior to the merger, which enormously boosted the radiation-driven wind mass loss. The current orbital plane is then aligned with the equatorial plane of the Homunculus, and the symmetric lobes are roughly aligned with the argument of periastron of the current Eta Carinae binary. The merger itself then occurred in 1838, resulting in a massive asymmetric outflow in the equatorial plane of the Homunculus. The 1843 outburst can, in our model, be attributed to the subsequent encounter in which the companion star (once the outermost star in the triple system) plunged through the bloated envelope of the merger product as it passed periastron again. We predict that the system has an excess space velocity of order 50 km s-1 in the equatorial plane of the Homunculus. Our triple model gives a viable explanation for the high runaway velocities typically observed in LBVs.

  2. Impact of including or excluding both-armed zero-event studies on using standard meta-analysis methods for rare event outcome: a simulation study

    PubMed Central

    Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana

    2016-01-01

    Objectives: There is no consensus on whether studies with no observed events in either the treatment or control arm, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handle them differently depending on the choice of effect measure and the authors' discretion. Our objective was to evaluate, through a simulation study, the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analyses of RCTs with rare outcome events. Method: We simulated 2500 data sets for different scenarios, varying the parameters of baseline event rate, treatment effect, number of patients in each trial, and between-study variance. We evaluated the performance of pooling methods commonly used in classical meta-analysis, namely Peto, Mantel-Haenszel with fixed-effects and random-effects models, and the inverse variance method with fixed-effects and random-effects models, using bias, root mean square error, length of the 95% CI, and coverage. Results: The overall performance of including or excluding BA0E studies in meta-analysis varied with the magnitude of the true treatment effect. Including BA0E studies introduced very little bias, decreased the mean square error, narrowed the 95% CI, and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, excluding BA0E studies led to smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions: We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. Providing results both including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment
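
    For reference, the Peto one-step method pools (O - E) terms and hypergeometric variances across trials, with pooled log odds ratio equal to the ratio of the two sums. The toy sketch below shows that mechanics; the trial counts are made up, and note that a BA0E trial contributes zero to both sums here (implementations differ in how they handle such trials, e.g., via continuity corrections, which is part of what the paper examines).

    ```python
    import numpy as np

    def peto_components(events_t, n_t, events_c, n_c):
        """Peto one-step components (O - E, V) for one 2x2 trial table."""
        N = n_t + n_c
        m = events_t + events_c                       # total events
        O, E = events_t, n_t * m / N                  # observed and expected
        V = m * (N - m) * n_t * n_c / (N ** 2 * (N - 1))  # hypergeometric variance
        return O - E, V

    # Toy trials: (treatment events, n_t, control events, n_c);
    # the last one is a both-armed zero-event (BA0E) study.
    trials = [(2, 100, 6, 100), (1, 200, 4, 200), (0, 150, 0, 150)]
    num = sum(peto_components(*tr)[0] for tr in trials)
    den = sum(peto_components(*tr)[1] for tr in trials)
    print(np.exp(num / den))   # pooled Peto OR; the BA0E trial adds 0 to both sums
    ```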

  3. Simulator for an Accelerator-Driven Subcritical Fissile Solution System

    SciTech Connect

    Klein, Steven Karl; Day, Christy M.; Determan, John C.

    2015-09-14

    LANL has developed a process for generating a progressive family of system models for a fissile solution system. This family includes a dynamic system simulation (DSS) composed of coupled nonlinear differential equations describing the time evolution of the system. Neutron kinetics, radiolytic gas generation and transport, and core thermal hydraulics are included in the DSS. Extensions covering explicit operation of the cooling loops and radiolytic gas handling are embedded in these systems, as is a stability model. The DSS may then be converted to an implementation in Visual Studio to give a design team the ability to rapidly estimate the system performance impacts of a variety of design decisions, providing a method to assist in optimizing the system design. Once the design has been generated in some detail, the C++ version of the system model may be implemented in a LabVIEW user interface to evaluate operator controls and instrumentation and operator recognition of, and response to, off-normal events. Taken as a set of system models, the DSS, Visual Studio, and LabVIEW progression provides a comprehensive set of design support tools.
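
    As a flavor of the coupled ODEs such a simulator integrates, the sketch below steps the textbook one-delayed-group point reactor kinetics equations with forward Euler. The kinetics parameters and reactivity step are generic illustrative values, not those of the LANL solution system, and the real DSS couples many more equations (gas transport, thermal hydraulics).

    ```python
    def point_kinetics(rho, t_end=10.0, dt=1e-4,
                       beta=0.0065, Lambda=1e-4, lam=0.08):
        """One-delayed-group point kinetics, forward-Euler sketch:
           dn/dt = ((rho - beta) / Lambda) * n + lam * C
           dC/dt = (beta / Lambda) * n - lam * C"""
        n = 1.0                            # relative neutron population
        C = beta * n / (Lambda * lam)      # precursors in equilibrium at t = 0
        for _ in range(int(t_end / dt)):
            dn = ((rho - beta) / Lambda * n + lam * C) * dt
            dC = (beta / Lambda * n - lam * C) * dt
            n, C = n + dn, C + dC
        return n

    # Small positive reactivity step: prompt jump, then a slow power rise.
    print(point_kinetics(rho=0.001))
    ```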

  4. GTOSS: Generalized Tethered Object Simulation System

    NASA Technical Reports Server (NTRS)

    Lang, David D.

    1987-01-01

    GTOSS is a tether analysis complex, described here by addressing its family of modules. TOSS is a portable software subsystem specifically designed to be introduced into the environment of any existing vehicle dynamics simulation to add the capability of simulating multiple interacting objects (via multiple tethers). These objects may interact with each other as well as with the vehicle into whose environment TOSS is introduced. GTOSS is a stand-alone tethered system analysis program, an example of TOSS having been married to a host simulation. RTOSS is the Results Data Base (RDB) subsystem, designed to archive TOSS simulation results for later display processing. DTOSS is a display post-processor designed to utilize the RDB; it extracts data from the RDB for multi-page printed time-history displays. CTOSS is similar to DTOSS but is designed to create ASCII plot files, so the same time-history data formats provided for printing via DTOSS are available via CTOSS for plotting. How these and other modules interact with each other is discussed.

  5. Mathematical simulation of Earth system dynamics

    NASA Astrophysics Data System (ADS)

    Dymnikov, V. P.; Lykosov, V. N.; Volodin, E. M.

    2015-05-01

    The mathematical simulation of the Earth system, the dynamics of which depends on physical, chemical, biological, and other processes and which requires interdisciplinary approaches to studying this problem, is considered. The term "the Earth system" extends the concept "the climatic system," since additional geospheres (lithosphere, heliosphere, etc.) are taken into account and a wider range of physical, chemical, biological, and social interactions is described. The present-day level of climate modeling is discussed, and some data obtained at the Institute of Numerical Mathematics, Russian Academy of Sciences (INM RAS), are presented for this purpose. The prospects for further development of climate models toward the creation of the Earth system models based on a seamless approach, according to which a unified model is used to make short-term (several days) and long-term (climatic) prediction, are considered.

  6. Forest biomass supply logistics for a power plant using the discrete-event simulation approach

    SciTech Connect

    Mobini, Mahdi; Sowlati, T.; Sokhansanj, Shahabaddine

    2011-04-01

    This study investigates the logistics of supplying forest biomass to a potential power plant. Because of the complexities of such a supply logistics system, a simulation model based on the framework of the Integrated Biomass Supply Analysis and Logistics (IBSAL) model is developed in this study to evaluate the cost of delivered forest biomass, its equilibrium moisture content, and the carbon emissions from the logistics operations. The model is applied to a proposed 300 MW power plant in Quesnel, BC, Canada. The results show that the biomass demand of the power plant would not be met every year. The weighted average cost of delivered biomass to the gate of the power plant is about C$90 per dry tonne. Estimates of the equilibrium moisture content of delivered biomass and the CO2 emissions resulting from the processes are also provided.

  7. Multiprocessor data acquisition system for high event rates at the Heidelberg/Darmstadt crystal ball

    SciTech Connect

    Ender, C.; Manner, R.; Bauer, P. (Physikalisches Inst.)

    1989-10-01

    The Heidelberg/Darmstadt crystal ball detector uses a distributed data acquisition system consisting of a Fastbus/CAMAC front end, the Heidelberg Polyp multiprocessor system with 30 processor modules, and an online VAX. For this heterogeneous multicomputer system, a distributed real-time operating system was developed. The distributed computer system allows for event rates of up to 2 × 10^4 events/s and is managed in a user-transparent, fault-tolerant manner.
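
    As a toy illustration of farming events out to parallel workers, the following sketch pushes fake "events" through a pool of processes via queues; it is purely illustrative and bears no relation to the actual Polyp hardware or its operating system.

    ```python
    import multiprocessing as mp

    # Toy sketch of distributing detector events across parallel workers,
    # in the spirit of a distributed data-acquisition system. Illustrative
    # only; unrelated to the actual Polyp software.

    def worker(in_queue, out_queue):
        """Each worker independently 'builds' events until it sees None."""
        while True:
            event = in_queue.get()
            if event is None:
                break
            out_queue.put(sum(event))  # stand-in for real event building

    if __name__ == "__main__":
        in_q, out_q = mp.Queue(), mp.Queue()
        workers = [mp.Process(target=worker, args=(in_q, out_q)) for _ in range(4)]
        for w in workers:
            w.start()
        for i in range(1000):              # 1000 fake "events"
            in_q.put([i, i + 1, i + 2])
        for _ in workers:
            in_q.put(None)                 # one poison pill per worker
        results = [out_q.get() for _ in range(1000)]
        for w in workers:
            w.join()
        print(f"processed {len(results)} events")
    ```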

  8. Modeling Rare Events in High Dimensional Stochastic Systems

    SciTech Connect

    Darve, Eric

    2015-06-22

    The main issue addressed in this note is the study of an algorithm to accelerate the computation of kinetic rates in the context of molecular dynamics (MD). It is based on parallel simulations of short-time trajectories, and its main components are: a decomposition of phase space into macro-states or cells; a resampling procedure that ensures that the number of parallel replicas (MD simulations) in each macro-state remains constant; and the use of multiple populations (colored replicas) to compute multiple rates (e.g., forward and backward rates) in one simulation. The method enhances the sampling of macro-states associated with the transition states, since in traditional MD these are likely to be depleted even after short-time simulations. By allowing parallel replicas to carry different probabilistic weights, the number of replicas within each macro-state can be maintained constant without introducing any bias, and the empirical variance of the estimated reaction rate, defined as a probability flux, is expected to be diminished. This note is a first attempt toward a better mathematical and numerical understanding of this method. It first provides a mathematical formalization of the notion of colors. It then discusses the link between this algorithm and a set of closely related methods proposed in the literature within the past few years. Lastly, numerical results are provided that illustrate the efficiency of the method.
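
    A minimal sketch of the resampling step described above: replicas are grouped by macro-state, and each group is resampled back to a fixed count, with survivors sharing the group's total probabilistic weight so the expected weight per state is preserved. The function names and the multinomial splitting rule are illustrative assumptions, not the authors' exact scheme.

    ```python
    import random

    # Minimal weighted-ensemble-style resampling: keep a fixed number of
    # replicas per cell without biasing cell weights. Illustrative only.

    def resample_cell(replicas, target):
        """replicas: list of (state, weight). Returns `target` replicas
        whose weights sum to the same total as the input."""
        total = sum(w for _, w in replicas)
        if total == 0.0 or not replicas:
            return list(replicas)
        states = [s for s, _ in replicas]
        weights = [w for _, w in replicas]
        # Draw survivors with probability proportional to weight, then
        # give each survivor an equal share of the cell's total weight.
        chosen = random.choices(states, weights=weights, k=target)
        share = total / target
        return [(s, share) for s in chosen]

    def resample(population, target_per_cell, cell_of):
        """Group replicas by macro-state and resample each group."""
        cells = {}
        for state, weight in population:
            cells.setdefault(cell_of(state), []).append((state, weight))
        new_pop = []
        for reps in cells.values():
            new_pop.extend(resample_cell(reps, target_per_cell))
        return new_pop
    ```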

  9. Communication Systems Simulation Laboratory (CSSL): Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Schlesinger, Adam

    2012-01-01

    The simulation process, milestones, and inputs are unknown to first-time users of the CSSL. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers, and it is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, the roles and responsibilities of the facility and the user, major milestones, facility capabilities, and the inputs required by the facility. Samples of deliverables, facility interfaces, and the inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  10. Adaptive Neural Network-Based Event-Triggered Control of Single-Input Single-Output Nonlinear Discrete-Time Systems.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-01-01

    This paper presents a novel adaptive neural network (NN) control scheme for single-input single-output uncertain nonlinear discrete-time systems under event-sampled NN inputs. In this control scheme, the feedback signals are transmitted, and the NN weights are tuned in an aperiodic manner, at the event-sampled instants. After reviewing the NN approximation property with event-sampled inputs, an adaptive state estimator (SE), consisting of linearly parameterized NNs, is utilized to approximate the unknown system dynamics in an event-sampled context. The SE is viewed as a model, and its approximated dynamics and state vector between any two events are utilized for the event-triggered controller design. An adaptive event-trigger condition is derived by using both the estimated NN weights and a dead-zone operator to determine the event sampling instants. This condition both facilitates the NN approximation and reduces the transmission of feedback signals. The ultimate boundedness of both the NN weight estimation error and the system state vector is demonstrated through the Lyapunov approach. As expected, during an initial online learning phase, events are observed more frequently. Over time, with the convergence of the NN weights, the inter-event times increase, thereby lowering the number of triggered events. These claims are illustrated through the simulation results.
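
    The sketch below shows the basic shape of such an event-triggered loop for a scalar plant: the controller acts on the last transmitted state, and a new transmission occurs only when a trigger condition fires. A fixed relative threshold stands in for the paper's adaptive, NN-weight-dependent condition, and all numbers are illustrative.

    ```python
    # Toy event-triggered feedback loop for a scalar discrete-time system.
    # A fixed relative threshold stands in for the paper's adaptive,
    # NN-weight-dependent trigger; everything here is illustrative.

    def simulate(steps=200, threshold=0.5, gain=0.6):
        x = 1.0                 # system state
        x_held = x              # last state transmitted to the controller
        events = 0
        for _ in range(steps):
            # Event trigger: transmit only when the held state is stale.
            if abs(x - x_held) > threshold * abs(x_held) + 1e-6:
                x_held = x
                events += 1
            u = -gain * x_held                  # controller uses held state
            x = 1.2 * x + u                     # open-loop unstable plant
        return x, events

    if __name__ == "__main__":
        x_final, events = simulate()
        print(f"final state {x_final:.4f} after {events} transmissions")
    ```

    With these numbers the state still converges while only a fraction of the steps trigger a transmission, which is the saving such schemes aim for.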

  11. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the available infrastructure and the technology readiness needed to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure is available in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, all fundamental to developing such methods. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and of systems in general. Benefits and facets needing early attention in the development are outlined.

  12. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the available infrastructure and the technology readiness needed to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure is available in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, all fundamental to developing such methods. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and of systems in general. Benefits and issues needing early attention in the development are outlined.

  13. Sequential Window Diagnoser for Discrete-Event Systems Under Unreliable Observations

    SciTech Connect

    Wen-Chiao Lin; Humberto E. Garcia; David Thorsley; Tae-Sic Yoo

    2009-09-01

    This paper addresses the issue of counting the occurrence of special events in the framework of partially observed discrete-event dynamical systems (DEDS). The developed diagnosers, referred to as sequential window diagnosers (SWDs), utilize the stochastic diagnoser probability transition matrices developed in [9], along with a resetting mechanism that allows on-line monitoring of special event occurrences. To illustrate their performance, the SWDs are applied to detect and count the occurrence of special events in a particular DEDS. The results show that SWDs are able to accurately track the number of times special events occur.
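
    To convey the flavour of a window-based diagnoser, the toy below propagates a belief over two hidden states with observation-conditioned transition matrices, records the probability that a special event has occurred at the end of each window, and then resets. The matrices, states, and window rule are invented for illustration and are not those of [9].

    ```python
    # Toy flavour of a window-based diagnoser: propagate a belief over two
    # hidden states with observation-conditioned matrices, accumulate the
    # probability of a special (unobservable) event, and reset each window.
    # The matrices below are invented, not those of the cited diagnoser.

    NORMAL  = [[0.9, 0.1],     # row i -> column j transition probabilities
               [0.0, 1.0]]     # given that observation "a" was seen
    SPECIAL = [[0.2, 0.8],
               [0.0, 1.0]]     # given that observation "b" was seen

    MATRICES = {"a": NORMAL, "b": SPECIAL}

    def step(belief, obs):
        """One belief update; state 1 means 'special event has occurred'."""
        m = MATRICES[obs]
        new = [sum(belief[i] * m[i][j] for i in range(2)) for j in range(2)]
        total = sum(new)
        return [p / total for p in new]

    def count_in_windows(observations, window=5):
        counts, belief = [], [1.0, 0.0]
        for k, obs in enumerate(observations, start=1):
            belief = step(belief, obs)
            if k % window == 0:               # close the window ...
                counts.append(belief[1])      # ... record P(event occurred)
                belief = [1.0, 0.0]           # ... and reset the diagnoser
        return counts

    if __name__ == "__main__":
        print(count_in_windows("aabbaababa"))
    ```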

  14. A volcano-seismic event spotting system for use in rapid response systems

    NASA Astrophysics Data System (ADS)

    Hammer, Conny; Ohrnberger, Matthias

    2010-05-01

    The classification of seismic signals of volcanic origin is an important task in monitoring active volcanoes. The number and size of certain types of seismic events usually increase before periods of volcanic crisis