Science.gov

Sample records for event system simulation

  1. Synchronous Parallel System for Emulation and Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    2001-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to the state variables of the simulation object attributable to the event object and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
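
    A minimal sketch of the event-horizon bookkeeping described in this abstract (a reading of the patent text, with all names and the data layout assumed for illustration): each node takes the earliest time stamp among its newly generated events as its local event horizon, the global event horizon is the minimum over nodes, and events stamped at or before that horizon are causally safe to commit, while later ones may still be rolled back.

        # Hypothetical sketch of the two-level event-horizon computation (Python).
        def local_event_horizon(new_event_stamps):
            """Earliest time stamp among events newly generated on one node."""
            return min(new_event_stamps) if new_event_stamps else float("inf")

        def global_event_horizon(per_node_stamps):
            """Earliest local event horizon across all parallel nodes."""
            return min(local_event_horizon(s) for s in per_node_stamps)

        # Example: time stamps of events generated on three nodes in one cycle.
        nodes = [[7.0, 9.5], [4.2, 8.1], [6.3]]
        geh = global_event_horizon(nodes)                  # -> 4.2
        committable = [t for stamps in nodes for t in stamps if t <= geh]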

  2. Synchronous parallel system for emulation and discrete event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1992-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to state variables of the simulation object attributable to the event object, and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring the events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.

  3. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in a method of performing object-oriented simulation and a system having inter-connected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.

  4. Multi-threaded, discrete event simulation of distributed computing systems

    NASA Astrophysics Data System (ADS)

    Legrand, Iosif; MONARC Collaboration

    2001-10-01

    The LHC experiments have envisaged computing systems of unprecedented complexity, for which it is necessary to provide a realistic description and modeling of data access patterns and of many jobs running concurrently on large-scale distributed systems and exchanging very large amounts of data. A process-oriented approach for discrete event simulation is well suited to describing various activities running concurrently, as well as the stochastic arrival patterns specific to this type of simulation. Threaded objects or "Active Objects" can provide a natural way to map the specific behaviour of distributed data processing into the simulation program. The simulation tool developed within MONARC is based on Java (TM) technology, which provides adequate tools for developing a flexible and distributed process-oriented simulation. Proper graphics tools, and ways to analyze data interactively, are essential in any simulation project. The design elements, status and features of the MONARC simulation tool are presented. The program allows realistic modeling of complex data access patterns by multiple concurrent users in large-scale computing systems in a wide range of possible architectures, from centralized to highly distributed. A comparison between queuing theory and realistic client-server measurements is also presented.

  5. Simulating rare events in equilibrium or nonequilibrium stochastic systems.

    PubMed

    Allen, Rosalind J; Frenkel, Daan; ten Wolde, Pieter Rein

    2006-01-14

    We present three algorithms for calculating rate constants and sampling transition paths for rare events in simulations with stochastic dynamics. The methods do not require a priori knowledge of the phase-space density and are suitable for equilibrium or nonequilibrium systems in a stationary state. All the methods use a series of interfaces in phase space, between the initial and final states, to generate transition paths as chains of connected partial paths, in a ratchetlike manner. No assumptions are made about the distribution of paths at the interfaces. The three methods differ in the way that the transition path ensemble is generated. We apply the algorithms to kinetic Monte Carlo simulations of a genetic switch and to Langevin dynamics simulations of intermittently driven polymer translocation through a pore. We find that the three methods are all of comparable efficiency, and that all the methods are much more efficient than brute-force simulation.
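
    The shared machinery of the three algorithms (a series of interfaces, with partial paths chained from one interface to the next) can be illustrated with a toy computation. The sketch below is a simplified, direct-chaining variant under assumed parameters: a one-dimensional overdamped Langevin particle in the double well V(x) = (x^2 - 1)^2 with hand-picked interfaces. It is not the paper's algorithms or test systems.

        import math, random

        # Toy interface-based rare event sketch: chain partial paths from basin A
        # (x ~ -1) toward basin B (x ~ +1) across a series of interfaces.
        random.seed(2)

        def step(x, dt=1e-3, beta=3.0):
            force = -4.0 * x * (x * x - 1.0)          # V(x) = (x^2 - 1)^2
            return x + force * dt + math.sqrt(2.0 * dt / beta) * random.gauss(0, 1)

        interfaces = [-0.8, -0.4, 0.0, 0.4, 0.8]      # lambda_0 .. lambda_n

        # Harvest starting points at lambda_0 from a trajectory kept in basin A.
        x, starts = -1.0, []
        while len(starts) < 100:
            xn = step(x)
            if x < interfaces[0] <= xn:               # forward crossing of lambda_0
                starts.append(xn)
                xn = -1.0                             # re-inject into basin A
            x = xn

        # Chain partial paths interface by interface, in a ratchet-like manner.
        p_forward = 1.0
        for lam_next in interfaces[1:]:
            successes = []
            for x0 in starts:
                x = x0
                while interfaces[0] < x < lam_next:   # run until fallback or crossing
                    x = step(x)
                if x >= lam_next:
                    successes.append(x)
            p_forward *= len(successes) / len(starts)
            if not successes:
                break
            starts = successes
        print("P(reach last interface | crossing lambda_0):", p_forward)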

  6. An event-based hydrologic simulation model for bioretention systems.

    PubMed

    Roy-Poirier, A; Filion, Y; Champagne, P

    2015-01-01

    Bioretention systems are designed to treat stormwater and provide attenuated drainage between storms. Bioretention has shown great potential at reducing the volume and improving the quality of stormwater. This study introduces the bioretention hydrologic model (BHM), a one-dimensional model that simulates the hydrologic response of a bioretention system over the duration of a storm event. BHM is based on the RECARGA model, but has been adapted for improved accuracy and integration of pollutant transport models. BHM contains four completely-mixed layers and accounts for evapotranspiration, overflow, exfiltration to native soils and underdrain discharge. Model results were evaluated against field data collected over 10 storm events. Simulated flows were particularly sensitive to antecedent water content and drainage parameters of bioretention soils, which were calibrated through an optimisation algorithm. Temporal disparity was observed between simulated and measured flows, which was attributed to preferential flow paths formed within the soil matrix of the field system. Modelling results suggest that soil water storage is the most important short-term hydrologic process in bioretention, with exfiltration having the potential to be significant in native soils with sufficient permeability.
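
    The layer accounting in a model of this kind can be pictured as a simple per-layer water balance. The sketch below is a generic illustration of one completely mixed layer, with all names, rates, and units assumed; it is not the published BHM formulation.

        # Generic water balance for one completely mixed layer over a step dt.
        # Illustrative only; not the published BHM equations. Units: mm, hours.
        def layer_step(storage, inflow, et_rate, exfil_rate, capacity, dt):
            storage += inflow * dt                    # run-on from the layer above
            storage -= min(et_rate * dt, storage)     # evapotranspiration loss
            storage -= min(exfil_rate * dt, storage)  # exfiltration to native soil
            overflow = max(storage - capacity, 0.0)   # excess routed as overflow
            return storage - overflow, overflow

        # Example: 40 mm stored, a 5 mm/h inflow burst, one hourly step.
        s, q_over = layer_step(storage=40.0, inflow=5.0, et_rate=0.2,
                               exfil_rate=1.0, capacity=42.0, dt=1.0)
        print(s, q_over)                              # 42.0 mm retained, 1.8 mm over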

  7. Rare event simulation of the chaotic Lorenz 96 dynamical system

    NASA Astrophysics Data System (ADS)

    Wouters, Jeroen; Bouchet, Freddy

    2015-04-01

    The simulation of rare events is becoming increasingly important in the climate sciences. Several sessions are devoted to rare and extreme events at this meeting, and the IPCC has devoted a special report to risk management of extreme events (SREX). Brute-force simulation of rare events can however be very costly: to obtain satisfactory statistics on a 1-in-1000-year event, one needs to perform simulations over several thousand years. Recently, a class of rare event simulation algorithms has been introduced that could yield significant increases in performance with respect to brute-force simulations (see e.g. [1]). In these algorithms an ensemble of simulations is evolved in parallel, while at certain interaction times ensemble members are killed and cloned so as to have better statistics in the region of phase space that is relevant to the rare event of interest. We will discuss the implementation issues and performance gains for these algorithms. We also present results on a first application of a rare event simulation algorithm to a toy model for chaos in the atmosphere, the Lorenz 96 model. We demonstrate that for the estimation of the histogram tail of the energy observable, the algorithm gives a significant error reduction. We will furthermore discuss first results and an outlook on the application of rare event simulation algorithms to study blocking atmospheric circulation and heat wave events in the PlaSim climate model [2]. [1] Del Moral, P. & Garnier, J. Genealogical particle analysis of rare events. The Annals of Applied Probability 15, 2496-2534 (2005). [2] http://www.mi.uni-hamburg.de/Planet-Simul.216.0.html
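
    The kill/clone step that the abstract describes can be sketched in a few lines. The example below is a hedged toy version of a genealogical particle scheme in the spirit of reference [1], applied to a scalar AR(1) process rather than an atmospheric model; the tilting strength, dynamics, and threshold are all assumptions.

        import math, random

        # Toy clone/kill ensemble: weight each member by exp(k * increment) at
        # every interaction time, resample, and unbias the final tail estimate.
        random.seed(3)
        N, T, k = 200, 50, 0.5
        ensemble = [0.0] * N                  # N copies of a scalar AR(1) variable
        log_norm = 0.0                        # accumulates log mean weight

        for t in range(T):
            weights = []
            for i in range(N):
                old = ensemble[i]
                ensemble[i] = 0.9 * old + random.gauss(0, 1)       # dynamics step
                weights.append(math.exp(k * (ensemble[i] - old)))  # bias upward
            log_norm += math.log(sum(weights) / N)
            # Kill low-weight members and clone high-weight ones.
            ensemble = random.choices(ensemble, weights=weights, k=N)

        # Weights telescope to exp(k * x_T), so the unbiased tail estimate is
        # the mean of 1{x > a} * exp(-k * x) times the accumulated normalization.
        a = 6.0
        est = sum(math.exp(-k * x) for x in ensemble if x > a) / N * math.exp(log_norm)
        print("tail estimate P(x_T > 6):", est)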

  8. On constructing optimistic simulation algorithms for the discrete event system specification

    SciTech Connect

    Nutaro, James J

    2008-01-01

    This article describes a Time Warp simulation algorithm for discrete event models that are described in terms of the Discrete Event System Specification (DEVS). The article shows how the total state transition and total output function of a DEVS atomic model can be transformed into an event processing procedure for a logical process. A specific Time Warp algorithm is constructed around this logical process, and it is shown that the algorithm correctly simulates a DEVS coupled model that consists entirely of interacting atomic models. The simulation algorithm is presented abstractly; it is intended to provide a basis for implementing efficient and scalable parallel algorithms that correctly simulate DEVS models.
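
    The core optimistic mechanism that any such Time Warp algorithm builds on (state saving plus rollback on stragglers) can be sketched generically. The sketch below is an assumed, minimal logical process, not the article's DEVS-specific construction; anti-messages and global virtual time are omitted.

        import heapq

        # Minimal Time Warp logical process: execute optimistically, save state
        # before each event, roll back when a straggler (out-of-order) arrives.
        class LogicalProcess:
            def __init__(self):
                self.lvt = 0.0            # local virtual time
                self.state = 0
                self.pending = []         # min-heap of (timestamp, delta)
                self.history = []         # [(timestamp, delta, state_before)]

            def receive(self, ts, delta):
                if ts < self.lvt:         # straggler: undo events at/after ts
                    while self.history and self.history[-1][0] >= ts:
                        ev_ts, ev_delta, saved = self.history.pop()
                        self.state = saved                  # restore snapshot
                        heapq.heappush(self.pending, (ev_ts, ev_delta))
                    self.lvt = ts
                heapq.heappush(self.pending, (ts, delta))

            def run_available(self):
                while self.pending:
                    ts, delta = heapq.heappop(self.pending)
                    self.history.append((ts, delta, self.state))  # save state
                    self.state += delta   # stand-in for a model transition
                    self.lvt = ts

        lp = LogicalProcess()
        lp.receive(1.0, +5); lp.receive(3.0, +2)
        lp.run_available()                # optimistically reaches lvt = 3.0
        lp.receive(2.0, -1)               # straggler forces rollback to t = 2.0
        lp.run_available()
        print(lp.lvt, lp.state)           # 3.0 6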

  9. Parametric Parallel Simulation of Discrete Event Systems on SIMD Supercomputers

    DTIC Science & Technology

    1994-05-01

    [The source record preserves only a garbled OCR excerpt. Recoverable fragments: expressions for P(Arrival @ Node i), P(Accepting Departure @ Node i => Join Node j) (equations 5.20 and 5.21), and P(Null Event) for a queueing network, together with the statement that the departure rate from node j is 0 when that node is in state 0 and the service rate otherwise: Departure Rate from Node j = 0 * pi(0_j) + mu_j * (1 - pi(0_j)), where pi(0_j) is the probability that node j is empty.]

  10. The use of discrete-event simulation modeling to compare handwritten and electronic prescribing systems.

    PubMed

    Ghany, Ahmad; Vassanji, Karim; Kuziemsky, Craig; Keshavjee, Karim

    2013-01-01

    Electronic prescribing (e-prescribing) is expected to bring many benefits to Canadian healthcare, such as a reduction in errors and adverse drug reactions. As there currently is no functioning e-prescribing system in Canada that is completely electronic, we are unable to evaluate the performance of a live system. An alternative approach is to use simulation modeling for evaluation. We developed two discrete-event simulation models, one of the current handwritten prescribing system and one of a proposed e-prescribing system, to compare the performance of these two systems. We were able to compare the number of processes in each model, workflow efficiency, and the distribution of patients or prescriptions. Although we were able to compare these models to each other, using discrete-event simulation software was challenging. We were limited in the number of variables we could measure. We discovered non-linear processes and feedback loops in both models that could not be adequately represented using discrete-event simulation software. Finally, interactions between entities in both models could not be modeled using this type of software. We have come to the conclusion that a more appropriate approach to modeling both the handwritten and electronic prescribing systems would be to use a complex adaptive systems approach using agent-based modeling or systems-based modeling.

  11. An Early Warning System for Loan Risk Assessment Based on Rare Event Simulation

    NASA Astrophysics Data System (ADS)

    Zhou, Hong; Qiu, Yue; Wu, Yueqin

    System simulation is an important tool for risk assessment. In this paper, a new method is presented to deal with credit risk assessment problems for commercial banks based on rare event simulation. The failure probability of repaying loans of a listed company is taken as the criterion to measure the level of credit risk. The rare-event concept is adopted to construct the model of credit risk identification in commercial banks, and a cross-entropy scheme is designed to implement the rare event simulation, based on which the loss probability can be assessed. Numerical experiments have shown that the method has a strong capability to identify the credit risk of commercial banks and offers a good tool for early warning.
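
    The cross-entropy idea sketched in the abstract (iteratively tilt the sampling distribution toward the rare failure region, then estimate with importance weights) can be illustrated on a stand-in loss model. Everything below is assumed for illustration: a standard normal "loss", a threshold of 5, and a Gaussian tilting family; it is not the paper's credit model.

        import math, random

        # Multilevel cross-entropy scheme for P(Z > 5) under Z ~ N(0, 1).
        random.seed(1)
        threshold, N, rho = 5.0, 10000, 0.1   # true P(Z > 5) is about 2.9e-7
        mu = 0.0                              # sampling mean, adapted by CE

        for _ in range(6):
            sample = sorted(random.gauss(mu, 1) for _ in range(N))
            gamma = min(sample[int((1 - rho) * N)], threshold)       # elite level
            elite = [x for x in sample if x >= gamma]
            ws = [math.exp(-mu * x + 0.5 * mu * mu) for x in elite]  # LR weights
            mu = sum(w * x for w, x in zip(ws, elite)) / sum(ws)     # CE update
            if gamma >= threshold:
                break

        # Final importance-sampling estimate with likelihood ratio phi/phi_mu.
        total = 0.0
        for _ in range(N):
            z = random.gauss(mu, 1)
            if z > threshold:
                total += math.exp(-mu * z + 0.5 * mu * mu)
        print("estimated failure probability:", total / N)   # near 2.9e-7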

  12. Discrete event simulation as a tool in optimization of a professional complex adaptive system.

    PubMed

    Nielsen, Anders Lassen; Hilwig, Helmer; Kissoon, Niranjan; Teelucksingh, Surujpal

    2008-01-01

    Similar urgent needs for improvement of health care systems exist in the developed and developing world. The culture and the organization of an emergency department in developing countries can best be described as a professional complex adaptive system, where each agent (employee) is ignorant of the behavior of the system as a whole; no one understands the entire system. Each agent's action is based on the state of the system at the moment (e.g., lack of medicine, unavailable laboratory investigations, lack of beds, and lack of staff in certain functions). An important question is how one can improve the emergency service within the given constraints. The use of simulation is one new approach to studying issues amenable to improvement. Discrete event simulation was used to simulate part of the patient flow in an emergency department. A simple model was built using a prototyping approach. The simulation showed that a minor rotation among the nurses could reduce the mean number of visitors that had to be referred to alternative flows within the hospital from 87 to 37 per day, with a mean staff utilization between 95.8% (the nurses) and 87.4% (the doctors). We conclude that, even faced with resource constraints and a lack of accessible data, discrete event simulation is a tool that can be used successfully to study the consequences of changes in very complex and self-organizing professional complex adaptive systems.
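
    As a flavor of what such a model involves, the sketch below is a deliberately tiny event-list simulation of a single emergency department queue with diversion. The arrival and service rates, staffing level, and diversion threshold are invented for illustration; the study's actual model was calibrated to the department described above.

        import heapq, random

        # Toy DES of an emergency department queue with c nurses and diversion.
        random.seed(7)
        c, lam, mu_rate, horizon = 3, 10.0, 4.0, 24.0   # arrivals/h, services/h, h
        MAX_QUEUE = 15                                  # beyond this, refer away
        events = [(random.expovariate(lam), "arrival")]
        busy = queue = served = diverted = 0

        while events:
            t, kind = heapq.heappop(events)
            if t > horizon:
                break
            if kind == "arrival":
                heapq.heappush(events, (t + random.expovariate(lam), "arrival"))
                if busy < c:                            # a nurse is free
                    busy += 1
                    heapq.heappush(events, (t + random.expovariate(mu_rate), "departure"))
                elif queue < MAX_QUEUE:
                    queue += 1
                else:
                    diverted += 1                       # referred to another flow
            else:                                       # departure frees a nurse
                served += 1
                if queue:
                    queue -= 1
                    heapq.heappush(events, (t + random.expovariate(mu_rate), "departure"))
                else:
                    busy -= 1

        print(f"served={served} diverted={diverted}")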

  13. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  14. An Adaptive Simulation Framework for the Exploration of Extreme and Unexpected Events in Dynamic Engineered Systems.

    PubMed

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2017-01-01

    The end states reached by an engineered system during an accident scenario depend not only on the sequence of events composing the scenario, but also on their timing and magnitudes. Including these additional features within an overarching framework can render the analysis infeasible in practical cases, due to the high dimension of the system state-space and the computational effort correspondingly needed to explore the possible system evolutions in search of the interesting (and very rare) ones of failure. To tackle this hurdle, in this article we introduce a framework for efficiently probing the space of event sequences of a dynamic system by means of a guided Monte Carlo simulation. The framework is semi-automatic and allows embedding the analyst's prior knowledge about the system and his/her objectives of analysis. Specifically, the framework adaptively and intelligently allocates the simulation effort preferentially to those sequences leading to outcomes of interest for the objectives of the analysis, e.g., typically those that are more safety-critical (and/or rare). The diversification that emerges in the filling of the state-space by the preference-guided exploration also allows the retrieval of critical system features, which can be useful to analysts and designers for taking appropriate means of prevention and mitigation of dangerous and/or unexpected consequences. A dynamic system for gas transmission is considered as a case study to demonstrate the application of the method.

  15. Validation of ground-motion simulations for historical events using SDoF systems

    USGS Publications Warehouse

    Galasso, C.; Zareian, F.; Iervolino, I.; Graves, R.W.

    2012-01-01

    The study presented in this paper is among the first in a series of studies toward the engineering validation of the hybrid broadband ground‐motion simulation methodology by Graves and Pitarka (2010). This paper provides a statistical comparison between seismic demands of single degree of freedom (SDoF) systems subjected to past events using simulations and actual recordings. A number of SDoF systems are selected considering the following: (1) 16 oscillation periods between 0.1 and 6 s; (2) elastic case and four nonlinearity levels, from mildly inelastic to severely inelastic systems; and (3) two hysteretic behaviors, in particular, nondegrading–nonevolutionary and degrading–evolutionary. Demand spectra are derived in terms of peak and cyclic response, as well as their statistics for four historical earthquakes: 1979 Mw 6.5 Imperial Valley, 1989 Mw 6.8 Loma Prieta, 1992 Mw 7.2 Landers, and 1994 Mw 6.7 Northridge.
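
    The demand quantities in such a validation come from integrating each SDoF oscillator over the ground-motion record. The sketch below shows the standard Newmark average-acceleration recipe for the linear-elastic case, driven here by a synthetic pulse; the record, period, and damping are assumptions, not the study's inputs.

        import math

        # Linear-elastic SDoF peak displacement via Newmark average acceleration.
        def sdof_peak(ag, dt, period, zeta=0.05):
            wn = 2.0 * math.pi / period
            k, c = wn * wn, 2.0 * zeta * wn               # per unit mass
            beta, gamma = 0.25, 0.5                       # average acceleration
            u = v = 0.0
            a = -ag[0]                                    # u'' + c u' + k u = -ag
            keff = k + gamma * c / (beta * dt) + 1.0 / (beta * dt * dt)
            peak = 0.0
            for agi in ag[1:]:
                rhs = (-agi
                       + u / (beta * dt * dt) + v / (beta * dt)
                       + (0.5 / beta - 1.0) * a
                       + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                              + dt * (gamma / (2.0 * beta) - 1.0) * a))
                un = rhs / keff
                vn = (gamma / (beta * dt) * (un - u) + (1.0 - gamma / beta) * v
                      + dt * (1.0 - gamma / (2.0 * beta)) * a)
                an = (un - u) / (beta * dt * dt) - v / (beta * dt) \
                     - (0.5 / beta - 1.0) * a
                u, v, a = un, vn, an
                peak = max(peak, abs(u))
            return peak

        # Synthetic decaying 5 Hz pulse as a stand-in for a simulated record.
        dt = 0.005
        ag = [0.3 * 9.81 * math.sin(2 * math.pi * 5 * i * dt) * math.exp(-i * dt)
              for i in range(2000)]
        print("peak displacement demand (m):", sdof_peak(ag, dt, period=1.0))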

  16. Synchronization Of Parallel Discrete Event Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S.

    1992-01-01

    Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation in progress. Combines best of optimistic and conservative synchronization strategies while avoiding major disadvantages. Well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.

  17. Using Discrete Event Simulation to Model Attacker Interactions with Cyber and Physical Security Systems

    DOE PAGES

    Perkins, Casey; Muller, George

    2015-10-08

    The number of connections between physical and cyber security systems is rapidly increasing due to centralized control from automated and remotely connected means. As the number of interfaces between systems continues to grow, the interactions and interdependencies between them cannot be ignored. Historically, physical and cyber vulnerability assessments have been performed independently. This independent evaluation omits important aspects of the integrated system, where the impacts resulting from malicious or opportunistic attacks are not easily known or understood. Here, we describe a discrete event simulation model that uses information about integrated physical and cyber security systems, attacker characteristics and simple response rules to identify key safeguards that limit an attacker's likelihood of success. Key features of the proposed model include comprehensive data generation to support a variety of sophisticated analyses, and full parameterization of safeguard performance characteristics and attacker behaviours to evaluate a range of scenarios. Lastly, we also describe the core data requirements and the network of networks that serves as the underlying simulation structure.

  18. Using Discrete Event Simulation to Model Attacker Interactions with Cyber and Physical Security Systems

    SciTech Connect

    Perkins, Casey; Muller, George

    2015-10-08

    The number of connections between physical and cyber security systems is rapidly increasing due to centralized control from automated and remotely connected means. As the number of interfaces between systems continues to grow, the interactions and interdependencies between them cannot be ignored. Historically, physical and cyber vulnerability assessments have been performed independently. This independent evaluation omits important aspects of the integrated system, where the impacts resulting from malicious or opportunistic attacks are not easily known or understood. Here, we describe a discrete event simulation model that uses information about integrated physical and cyber security systems, attacker characteristics and simple response rules to identify key safeguards that limit an attacker's likelihood of success. Key features of the proposed model include comprehensive data generation to support a variety of sophisticated analyses, and full parameterization of safeguard performance characteristics and attacker behaviours to evaluate a range of scenarios. Lastly, we also describe the core data requirements and the network of networks that serves as the underlying simulation structure.

  19. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics, and includes the ability to compare log files.

  20. The IDES framework: A case study in development of a parallel discrete-event simulation system

    SciTech Connect

    Nicol, D.M.; Johnson, M.M.; Yoshimura, A.S.

    1997-12-31

    This tutorial describes considerations in the design and development of the IDES parallel simulation system. IDES is a Java-based parallel/distributed simulation system designed to support the study of complex large-scale enterprise systems. Using the IDES system as an example, the authors discuss how anticipated model and system constraints molded the design decisions with respect to modeling, synchronization, and communication strategies.

  1. Standardized Simulated Events for Provocative Testing of Medical Care System Rescue Capabilities

    DTIC Science & Technology

    2005-01-01

    [The source record preserves only a garbled OCR excerpt with table fragments. Recoverable fragments: ... from the available literature. Observed simulated event behavior: while an apneic event was initiated on room air and 100 percent O2, the PaO2 ... Pediatric Advanced Life Support. Hypoxia and hypotension were defined as SpO2 below a threshold percentage and systolic BP below a threshold in mm Hg, respectively, as these parameters ... A monitoring-parameter table (ETCO2, continuous auscultation, SpO2, continuous tone/beep, SpO2 alarm, SpO2 plethysmograph, SpO2 heart rate) could not be recovered.]

  2. Incorporating discrete event simulation into quality improvement efforts in health care systems.

    PubMed

    Rutberg, Matthew Harris; Wenczel, Sharon; Devaney, John; Goldlust, Eric Jonathan; Day, Theodore Eugene

    2015-01-01

    Quality improvement (QI) efforts are an indispensable aspect of health care delivery, particularly in an environment of increasing financial and regulatory pressures. The ability to test predictions of proposed changes to flow, policy, staffing, and other process-level changes using discrete event simulation (DES) has shown significant promise and is well reported in the literature. This article describes how to incorporate DES into QI departments and programs in order to support QI efforts, develop high-fidelity simulation models, conduct experiments, make recommendations, and support adoption of results. The authors describe how DES-enabled QI teams can partner with clinical services and administration to plan, conduct, and sustain QI investigations.

  3. Weighted-ensemble Brownian dynamics simulation: sampling of rare events in nonequilibrium systems.

    PubMed

    Kromer, Justus A; Schimansky-Geier, Lutz; Toral, Raul

    2013-06-01

    We provide an algorithm based on weighted-ensemble (WE) methods to accurately sample systems at steady state. Applying our method to different one- and two-dimensional models, we succeed in calculating steady-state probabilities of order 10^-300 and reproduce the Arrhenius law for rates of order 10^-280. Special attention is paid to the simulation of nonpotential systems where no detailed balance assumption exists. For this large class of stochastic systems, the stationary probability distribution density is often unknown and cannot be used as prior knowledge during the simulation. We compare the algorithm's efficiency with standard Brownian dynamics simulations and the original WE method.
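
    The bookkeeping at the heart of a weighted-ensemble step can be shown compactly. The sketch below is a simplified multinomial split/merge (true WE implementations split and merge walkers deterministically), with bin edges, walker counts, and starting distribution assumed for illustration.

        import random

        # Simplified WE resampling: bin walkers, then give each occupied bin m
        # walkers whose weights sum to the bin's probability (weight conserved).
        def we_resample(walkers, edges, m=4):
            """walkers: list of (x, weight) pairs. Returns a resampled list."""
            bins = {}
            for x, w in walkers:
                b = sum(x >= e for e in edges)            # bin index
                bins.setdefault(b, []).append((x, w))
            out = []
            for group in bins.values():
                wsum = sum(w for _, w in group)
                xs = random.choices([x for x, _ in group],
                                    weights=[w for _, w in group], k=m)
                out.extend((x, wsum / m) for x in xs)     # equal shares per bin
            return out

        random.seed(5)
        walkers = [(random.gauss(0, 1), 1.0 / 50) for _ in range(50)]
        walkers = we_resample(walkers, edges=[-2, -1, 0, 1, 2])
        print(len(walkers), sum(w for _, w in walkers))   # total weight stays 1.0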

  4. Dynamic simulation recalls condensate piping event

    SciTech Connect

    Farrell, R.J.; Reneberg, K.O. ); Moy, H.C. )

    1994-05-01

    This article describes how experience gained from simulating and reconstructing a condensate piping event will be used by Consolidated Edison to analyze control system problems. A cooperative effort by Con Edison and the Chemical Engineering Department at Polytechnic University used the Modular Modeling System (MMS) to investigate the probable cause of a Con Edison condensate piping event. Con Edison commissioned the work to serve as a case study for the more general problem of control systems analysis using dynamic simulation and MMS.

  5. Integration of scheduling and discrete event simulation systems to improve production flow planning

    NASA Astrophysics Data System (ADS)

    Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.

    2016-08-01

    The increased availability of data and of computer-aided technologies such as MRP I/II, ERP and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. Integration of production scheduling with computer modelling, simulation and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology based on a semi-automatic model generation method is proposed to eliminate problems associated with model complexity and with the labour-intensive, time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied. The approach is illustrated through examples of practical implementation of the proposed method using the KbRS scheduling system and the Enterprise Dynamics simulation system.

  6. Anticipating the Chaotic Behaviour of Industrial Systems Based on Stochastic, Event-Driven Simulations

    NASA Astrophysics Data System (ADS)

    Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra

    2004-08-01

    In logistics and industrial production, managers must deal with the impact of stochastic events to improve performance and reduce costs. In fact, production and logistics systems are generally designed treating some parameters as deterministic. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, especially for estimated parameters (e.g., market request). The proposed methodology can determine the impact of stochastic events on the system by evaluating the chaotic threshold level. Such an approach, based on the application of a new and innovative methodology, can be implemented to find the conditions under which chaos makes the system uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and contingency plan estimation. The authors illustrate the methodology with respect to a real industrial case: a production problem related to the logistics of distributed chemical processing.

  7. Forward flux sampling for rare event simulations.

    PubMed

    Allen, Rosalind J; Valeriani, Chantal; Rein Ten Wolde, Pieter

    2009-11-18

    Rare events are ubiquitous in many different fields, yet they are notoriously difficult to simulate because few, if any, events are observed in a conventional simulation run. Over the past several decades, specialized simulation methods have been developed to overcome this problem. We review one recently developed class of such methods, known as forward flux sampling. Forward flux sampling uses a series of interfaces between the initial and final states to calculate rate constants and generate transition paths for rare events in equilibrium or nonequilibrium systems with stochastic dynamics. This review draws together a number of recent advances, summarizes several applications of the method and highlights challenges that remain to be overcome.

  8. A State Event Detection Algorithm for Numerically Simulating Hybrid Systems with Model Singularities

    DTIC Science & Technology

    2007-01-01

    [The source record preserves only a garbled OCR excerpt with reference fragments. Recoverable fragments: ... introduced there as well. However, in these early works as well as in Hay and Griffin [1979], Joglekar and Reklaitis [1984], and Prestin and Berzine ... 1995. Nonlinear Control Systems. Springer, London. / Joglekar, G. and Reklaitis, G. 1984. A simulator for batch and semi-continuous processes.]

  9. Algorithmic scalability in globally constrained conservative parallel discrete event simulations of asynchronous systems.

    PubMed

    Kolakowska, A; Novotny, M A; Korniss, G

    2003-04-01

    We consider parallel simulations for asynchronous systems employing L processing elements that are arranged on a ring. Processors communicate only among the nearest neighbors and advance their local simulated time only if it is guaranteed that this does not violate causality. In simulations with no constraints, in the infinite-L limit the utilization scales [Korniss et al., Phys. Rev. Lett. 84, 1351 (2000)], but the width of the virtual time horizon diverges (i.e., the measurement phase of the algorithm does not scale). In this work, we introduce a moving Delta-window global constraint, which modifies the algorithm so that the measurement phase scales as well. We present results of systematic studies in which the system size (i.e., L and the volume load per processor) as well as the constraint are varied. The Delta constraint eliminates the extreme fluctuations in the virtual time horizon, provides a bound on its width, and controls the average progress rate. The width of the Delta window can serve as a tuning parameter that, for a given volume load per processor, could be adjusted to optimize the utilization, so as to maximize the efficiency. This result may find numerous applications in modeling the evolution of general spatially extended short-range interacting systems with asynchronous dynamics, including dynamic Monte Carlo studies.
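
    The update rule under study can be mimicked with a short random-sequential sketch: a site advances its local virtual time only if it is a local minimum among its ring neighbors (causality) and, with the moving Delta window, only while it stays within Delta of the slowest site. All sizes and rates below are assumptions for illustration, and utilization is measured here as a simple acceptance fraction rather than per parallel sweep.

        import random

        # Ring of L conservative processors with a moving Delta-window constraint.
        random.seed(0)
        L, Delta, steps = 64, 5.0, 20000
        tau = [0.0] * L                       # local virtual times
        advanced = 0

        for _ in range(steps):
            i = random.randrange(L)
            left, right = tau[(i - 1) % L], tau[(i + 1) % L]
            ok_causal = tau[i] <= min(left, right)        # local minimum rule
            ok_window = tau[i] - min(tau) < Delta         # Delta-window rule
            if ok_causal and ok_window:
                tau[i] += random.expovariate(1.0)         # Poisson time increment
                advanced += 1

        mean = sum(tau) / L
        width = (sum((t - mean) ** 2 for t in tau) / L) ** 0.5
        print(f"acceptance ~ {advanced / steps:.2f}, horizon width ~ {width:.2f}")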

  10. Algorithm and simulation development in support of response strategies for contamination events in air and water systems.

    SciTech Connect

    Waanders, Bart Van Bloemen

    2006-01-01

    Chemical/Biological/Radiological (CBR) contamination events pose a considerable threat to our nation's infrastructure, especially in large internal facilities, external flows, and water distribution systems. Because physical security can only be enforced to a limited degree, deployment of early warning systems is being considered. However, to achieve reliable and efficient functionality, several complex questions must be answered: (1) where should sensors be placed, (2) how can sparse sensor information be efficiently used to determine the location of the original intrusion, (3) what are the model and data uncertainties, (4) how should these uncertainties be handled, and (5) how can our algorithms and forward simulations be sufficiently improved to achieve real-time performance? This report presents the results of a three-year algorithmic and application development effort to support the identification, mitigation, and risk assessment of CBR contamination events. The main thrust of this investigation was to develop (1) computationally efficient algorithms for strategically placing sensors, (2) processes for identifying contamination events from sparse observations, (3) characterizations of uncertainty through accurate demand forecasts and investigation of uncertain simulation model parameters, (4) risk assessment capabilities, and (5) reduced-order modeling methods. The development effort was focused on water distribution systems, large internal facilities, and outdoor areas.

  11. Agent-based modeling to simulate contamination events and evaluate threat management strategies in water distribution systems.

    PubMed

    Zechman, Emily M

    2011-05-01

    In the event of contamination of a water distribution system, decisions must be made to mitigate the impact of the contamination and to protect public health. Making threat management decisions while a contaminant spreads through the network is a dynamic and interactive process. Response actions taken by the utility managers and water consumption choices made by the consumers will affect the hydraulics, and thus the spread of the contaminant plume, in the network. A modeling framework that allows the simulation of a contamination event under the effects of actions taken by utility managers and consumers will be a useful tool for the analysis of alternative threat mitigation and management strategies. This article presents a multiagent modeling framework that combines agent-based, mechanistic, and dynamic methods. Agents select actions based on a set of rules that represent an individual's autonomy, goal-based desires, and reaction to the environment and the actions of other agents. Consumer behaviors including ingestion, mobility, reduction of water demands, and word-of-mouth communication are simulated. Management strategies are evaluated, including opening hydrants to flush the contaminant and issuing broadcast warnings. As actions taken by consumer agents and utility operators affect demands and flows in the system, the mechanistic model is updated. Management strategies are evaluated based on the exposure of the population to the contaminant. The framework is designed to consider the typical issues involved in water distribution threat management and provides valuable analysis of threat containment strategies for water distribution system contamination events.

  12. A discrete event method for wave simulation

    SciTech Connect

    Nutaro, James J

    2006-01-01

    This article describes a discrete event interpretation of the finite difference time domain (FDTD) and digital wave guide network (DWN) wave simulation schemes. The discrete event method is formalized using the discrete event system specification (DEVS). The scheme is shown to have errors that are proportional to the resolution of the spatial grid. A numerical example demonstrates the relative efficiency of the scheme with respect to FDTD and DWN schemes. The potential for the discrete event scheme to reduce numerical dispersion and attenuation errors is discussed.

  13. Weighted next reaction method and parameter selection for efficient simulation of rare events in biochemical reaction systems.

    PubMed

    Xu, Zhouyi; Cai, Xiaodong

    2011-07-25

    The weighted stochastic simulation algorithm (wSSA) recently developed by Kuwahara and Mura and the refined wSSA proposed by Gillespie et al. based on the importance sampling technique open the door for efficient estimation of the probability of rare events in biochemical reaction systems. In this paper, we first apply the importance sampling technique to the next reaction method (NRM) of the stochastic simulation algorithm and develop a weighted NRM (wNRM). We then develop a systematic method for selecting the values of importance sampling parameters, which can be applied to both the wSSA and the wNRM. Numerical results demonstrate that our parameter selection method can substantially improve the performance of the wSSA and the wNRM in terms of simulation efficiency and accuracy.
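
    The weighting trick that both the wSSA and a weighted NRM rely on can be shown in a compact form: waiting times are drawn from the true total propensity, the next reaction is chosen from biased propensities, and the likelihood ratio is carried along as a weight. The immigration-death system and the bias factor below are assumptions for illustration, not the paper's benchmark models.

        import random

        # Toy weighted-SSA run: estimate P(X reaches 25 by t = 20) for an
        # immigration-death process with stationary mean 10 (a rare excursion).
        random.seed(4)

        def wssa_run(x0=10, target=25, t_max=20.0, k_in=1.0, k_out=0.1, gamma=1.5):
            x, t, w = x0, 0.0, 1.0
            while t < t_max:
                a = [k_in, k_out * x]               # true propensities: in, out
                b = [gamma * a[0], a[1] / gamma]    # biased: favor net growth
                a0, b0 = sum(a), sum(b)
                t += random.expovariate(a0)         # waiting time: true total rate
                j = 0 if random.random() < b[0] / b0 else 1   # biased selection
                w *= (a[j] / a0) / (b[j] / b0)      # likelihood-ratio correction
                x += 1 if j == 0 else -1
                if x >= target:
                    return w                        # rare threshold reached
            return 0.0

        n = 5000
        print("P(X reaches 25 by t=20) ~", sum(wssa_run() for _ in range(n)) / n)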

  14. Agent Frameworks for Discrete Event Social Simulations

    DTIC Science & Technology

    2010-03-01

    [The source record preserves only a garbled OCR excerpt. Recoverable fragments: ... of a general modeling approach to social simulation that embeds a multi-agent system within a DES framework, and propose several reusable agent ... agent system to simulate changes in the beliefs, values, and interests (BVIs) of large social groups (Alt, Jackson, Hudak, & Steven Lieberman, 2010) ... to events from A. 2.3 Cultural Geography Model: The Cultural Geography (CG) Model is an implementation of a DESS that uses an embedded multi[-agent system].]

  15. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    NASA Technical Reports Server (NTRS)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.

  16. Workflow in Clinical Trial Sites & Its Association with Near Miss Events for Data Quality: Ethnographic, Workflow & Systems Simulation

    PubMed Central

    Araujo de Carvalho, Elias Cesar; Batilana, Adelia Portero; Claudino, Wederson; Lima Reis, Luiz Fernando; Schmerling, Rafael A.; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    Background: With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify/address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Methodology/Principal Findings: Two sites in São Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrated a reduction of the rework problem. Conclusions/Significance: Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage

  17. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    DTIC Science & Technology

    2014-09-18

    [The source record preserves only repeated journal header lines from www.ijetae.com (ISSN 2250-2459, ISO 9001:2008 Certified Journal, Volume 4, Issue 2, February 2014, pp. 829-831) around sparse text fragments. Recoverable fragments: Towards the Modeling and Simulation of Quantum Key ... Such a simulation capability needs to address many "concerns" ... TABLE II: END USER CAPABILITY REQUIREMENTS.]

  18. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
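
    The consumption/creation cycle summarized above is the generic heart of any event-driven simulator and is easy to sketch. The handlers and the two-event model below are invented for illustration; they stand in for the causal-model predictions that EDSE derives from observed data.

        import heapq

        # Generic event-driven loop: pop the earliest event, let its handler
        # update state and create follow-on events, repeat until the queue drains.
        def simulate(initial_events, handlers, t_end):
            queue = list(initial_events)
            heapq.heapify(queue)
            state, trace = {"valve": "closed", "pressure": 0.0}, []
            while queue:
                t, kind = heapq.heappop(queue)
                if t > t_end:
                    break
                for new_ev in handlers[kind](t, state):   # consume, maybe create
                    heapq.heappush(queue, new_ev)
                trace.append((t, kind, dict(state)))
            return trace

        def open_valve(t, s):
            s["valve"] = "open"
            return [(t + 2.0, "pressure_rise")]           # predicted future event

        def pressure_rise(t, s):
            s["pressure"] += 1.0
            return [(t + 2.0, "pressure_rise")] if s["pressure"] < 3 else []

        for row in simulate([(0.0, "open_valve")],
                            {"open_valve": open_valve,
                             "pressure_rise": pressure_rise}, t_end=10.0):
            print(row)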

  19. Ocean Dynamics Simulation during an Extreme Bora Event using a Two-Way Coupled Atmosphere-Ocean Modeling System

    NASA Astrophysics Data System (ADS)

    Licer, Matjaz; Smerkol, Peter; Fettich, Anja; Ravdas, Michalis; Papapostolou, Alexandros; Mantziafou, Anneta; Cedilnik, Jure; Strajnar, Benedikt; Jeromel, Maja; Pristov, Neva; Jerman, Jure; Petan, Saso; Malacic, Vlado; Sofianos, Sarantis

    2015-04-01

    The response of the Adriatic Sea to cold north-easterly Bora wind forcing has been modelled numerous times, but usually using one-way coupling techniques. One of the most significant events of the kind took place in February 2012, when hurricane-force Bora winds blew over the Northern Adriatic almost continuously for over three weeks, causing extreme air-sea interactions that led to severe water cooling (below 4 degrees Celsius) and extensive dense water formation (with density anomalies above 30.5 kg/m3). The intensity of the atmosphere-ocean interactions during such conditions calls for a two-way atmosphere-ocean coupling approach. We compare the performance of (a) a fully two-way coupled atmosphere-ocean modelling system and (b) a one-way coupled ocean model (forced by the atmospheric model's hourly output) against the available in-situ measurements (coastal buoy, CTD). The models used were ALADIN (4.4 km resolution) on the atmospheric side and POM (1/30° × 1/30° resolution) on the ocean side. The atmosphere-ocean coupling was implemented using the OASIS3-MCT model coupling toolkit. We show that two-way atmosphere-ocean coupling significantly improves the simulated temperature and density response of the ocean, since it represents short-term transient features much better than the offline version of the ocean model.

  20. A Simbol-X Event Simulator

    NASA Astrophysics Data System (ADS)

    Puccetti, S.; Fiore, F.; Giommi, P.

    2009-05-01

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in simulating data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to ray-tracing simulators. These properties make our simulator advantageous for supporting the user in planning proposals and comparing real data with theoretical expectations, and for quick detection of unexpected features. We present here the simulator outline and a few examples of simulated data.

  1. A Simbol-X Event Simulator

    SciTech Connect

    Puccetti, S.; Giommi, P.; Fiore, F.

    2009-05-11

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in simulating data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to ray-tracing simulators. These properties make our simulator advantageous for supporting the user in planning proposals and comparing real data with theoretical expectations, and for quick detection of unexpected features. We present here the simulator outline and a few examples of simulated data.

  2. Running Parallel Discrete Event Simulators on Sierra

    SciTech Connect

    Barnes, P. D.; Jefferson, D. R.

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  3. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
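
    For contrast with the empirical findings above, the conservative rule itself is simple to state: a logical process may consume an input event only when its channel clocks guarantee that nothing earlier can still arrive, and null messages circulate those guarantees to keep the simulation from deadlocking. The two-process ring below is an assumed toy, not the paper's queueing networks.

        from collections import deque

        # Two logical processes on a ring exchanging real and null messages.
        LOOKAHEAD = 1.0

        class LP:
            def __init__(self, name):
                self.name, self.lvt, self.inbox = name, 0.0, deque()
                self.chan_clock = 0.0     # lower bound promised by the sender
                self.peer = None

            def send(self, ts, real):
                self.peer.inbox.append((ts, real))
                self.peer.chan_clock = max(self.peer.chan_clock, ts)

            def step(self):
                # Conservative rule: only consume events up to the channel clock.
                while self.inbox and self.inbox[0][0] <= self.chan_clock:
                    ts, real = self.inbox.popleft()
                    self.lvt = max(self.lvt, ts)
                    if real:                              # real work: emit onward
                        self.send(self.lvt + LOOKAHEAD, True)
                # Null message: promise to send nothing before lvt + lookahead.
                self.send(self.lvt + LOOKAHEAD, False)

        a, b = LP("A"), LP("B")
        a.peer, b.peer = b, a
        a.send(0.5, True)                 # seed one real event
        for _ in range(10):
            a.step(); b.step()
        print(a.lvt, b.lvt)               # both clocks advance; no deadlock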

  4. Assessing the Effectiveness of Biosurveillance Via Discrete Event Simulation

    DTIC Science & Technology

    2011-03-01

    [The source record preserves only report-documentation-page fragments rather than an abstract. Recoverable metadata: "Assessing the Effectiveness of Biosurveillance Via Discrete Event Simulation," Master's thesis by Jason H. Dao, March 2011; thesis advisor Ronald D. Fricker, Jr. Recoverable abstract fragment: ... the potential for disastrous outcomes is greater than it has ever been. In order to confront this threat, biosurveillance systems are utilized to ...]

  5. Seismic event classification system

    DOEpatents

    Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William

    1994-01-01

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.
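
    The preprocessing pipeline in the abstract (time-frequency distribution, binarization, then the magnitude of a 2-D FFT as a shift-invariant feature) is straightforward to reproduce in outline. The sketch below uses a synthetic tone burst and a crude windowed-FFT spectrogram; the signal, window length, and threshold are assumptions, not the patent's parameters.

        import numpy as np

        # Build the shift-invariant representation: |2-D FFT| of a binarized
        # time-frequency distribution of the detected event signal.
        fs = 100.0
        t = np.arange(0, 10, 1.0 / fs)
        signal = np.sin(2 * np.pi * 5 * t) * np.exp(-((t - 4.0) ** 2))  # toy event

        win = 64                                        # crude spectrogram frames
        frames = [signal[i:i + win] * np.hanning(win)
                  for i in range(0, len(signal) - win, win)]
        tfd = np.abs(np.fft.rfft(np.array(frames), axis=1))   # time x frequency

        binary = (tfd > 0.5 * tfd.max()).astype(float)        # binary TFD
        feature = np.abs(np.fft.fft2(binary))   # magnitude is shift-invariant
        print(feature.shape)                    # flattened, this feeds the SONN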

  6. Seismic event classification system

    DOEpatents

    Dowla, F.U.; Jarpe, S.P.; Maurer, W.

    1994-12-13

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.

  7. Discrete-Event Simulation in Chemical Engineering.

    ERIC Educational Resources Information Center

    Schultheisz, Daniel; Sommerfeld, Jude T.

    1988-01-01

    Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)

  8. An Overview of Importance Splitting for Rare Event Simulation

    ERIC Educational Resources Information Center

    Morio, Jerome; Pastel, Rudy; Le Gland, Francois

    2010-01-01

    Monte Carlo simulations are a classical tool to analyse physical systems. When unlikely events are to be simulated, the importance sampling technique is often used instead of Monte Carlo. Importance sampling has some drawbacks when the problem dimensionality is high or when the optimal importance sampling density is complex to obtain. In this…

  9. MHD simulation of the Bastille day event

    NASA Astrophysics Data System (ADS)

    Linker, Jon; Torok, Tibor; Downs, Cooper; Lionello, Roberto; Titov, Viacheslav; Caplan, Ronald M.; Mikić, Zoran; Riley, Pete

    2016-03-01

    We describe a time-dependent, thermodynamic, three-dimensional MHD simulation of the July 14, 2000 coronal mass ejection (CME) and flare. The simulation starts with a background corona developed using an MDI-derived magnetic map for the boundary condition. Flux ropes using the modified Titov-Demoulin (TDm) model are used to energize the pre-event active region, which is then destabilized by photospheric flows that cancel flux near the polarity inversion line. More than 10^33 ergs are impulsively released in the simulated eruption, driving a CME at 1500 km/s, close to the observed speed of 1700 km/s. The post-flare emission in the simulation is morphologically similar to the observed post-flare loops. The resulting flux rope that propagates to 1 AU is similar in character to the flux rope observed at 1 AU, but the simulated ICME center passes 15° north of Earth.

  10. Event-by-Event Simulation of Induced Fission

    SciTech Connect

    Vogt, R; Randrup, J

    2007-12-13

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinement have been implemented.

  11. Event-by-Event Simulation of Induced Fission

    SciTech Connect

    Vogt, Ramona; Randrup, Joergen

    2008-04-17

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either de-excite by evaporation or undergo binary fission into a large number of fission channels each with different energetics involving both energy dissipation and deformed scission pre-fragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinement have been implemented.

  12. Discrete Event Simulation of Distributed Team Communication

    DTIC Science & Technology

    2012-03-22

    Executable system architecture approach to discrete event system modeling using SysML in conjunction with colored Petri nets. In Systems Conference, 2008 2nd... operators. Mitchell found that IMPRINT predictions of communication times and frequencies correlated with recorded communications amongst a platoon of

  13. Complete event simulations of nuclear fission

    NASA Astrophysics Data System (ADS)

    Vogt, Ramona

    2015-10-01

    For many years, the state of the art for treating fission in radiation transport codes has involved sampling from average distributions. In these average fission models energy is not explicitly conserved and everything is uncorrelated because all particles are emitted independently. However, in a true fission event, the energies, momenta and multiplicities of the emitted particles are correlated. Such correlations are interesting for many modern applications. Event-by-event generation of complete fission events makes it possible to retain the kinematic information for all particles emitted: the fission products as well as prompt neutrons and photons. It is therefore possible to extract any desired correlation observables. Complete event simulations can be included in general Monte Carlo transport codes. We describe the general functionality of currently available fission event generators and compare results for several important observables. This work was performed under the auspices of the US DOE by LLNL, Contract DE-AC52-07NA27344. We acknowledge support of the Office of Defense Nuclear Nonproliferation Research and Development in DOE/NNSA.

  14. Optical eye simulator for laser dazzle events.

    PubMed

    Coelho, João M P; Freitas, José; Williamson, Craig A

    2016-03-20

    An optical simulator of the human eye and its application to laser dazzle events are presented. The simulator combines optical design software (ZEMAX) with a scientific programming language (MATLAB) and allows the user to implement and analyze a dazzle scenario using practical, real-world parameters. Contrary to conventional analytical glare analysis, this work uses ray tracing and the scattering model and parameters for each optical element of the eye. The theoretical background of each such element is presented in relation to the model. The simulator's overall calibration, validation, and performance analysis are achieved by comparison with a simpler model based upon CIE disability glare data. Results demonstrate that this kind of advanced optical eye simulation can be used to represent laser dazzle and has the potential to extend the range of applicability of analytical models.

  15. Distributed discrete event simulation. Final report

    SciTech Connect

    De Vries, R.C.

    1988-02-01

    The presentation given here is restricted to discrete event simulation. The complexity of, and time required for, many present and potential discrete simulations exceed the reasonable capacity of most present serial computers. The desire, then, is to implement the simulations on a parallel machine. However, certain problems arise in an effort to program the simulation on a parallel machine. In one category of methods, deadlocks can arise, and some method is required either to detect deadlock and recover from it or to avoid deadlock through information passing. In the second category of methods, potentially incorrect simulations are allowed to proceed. If the situation is later determined to be incorrect, recovery from the error must be initiated. In either case, computation and information passing are required which would not be required in a serial implementation. The net effect is that the parallel simulation may not be much better than a serial simulation. In an effort to determine alternate approaches, important papers in the area were reviewed. As a part of that review process, each of the papers was summarized. The summary of each paper is presented in this report in the hope that those doing future work in the area will be able to gain insight that might not otherwise be available, and to aid in deciding which papers would be most beneficial to pursue in more detail. The papers are broken down into categories and then by author. Conclusions reached after examining the papers and other material, such as direct talks with an author, are presented in the last section. Also presented there are some ideas that surfaced late in the research effort. These promise to be of some benefit in limiting the information which must be passed between processes and in better understanding the structure of a distributed simulation. Pursuit of these ideas seems appropriate.

  16. Parallel discrete-event simulation of FCFS stochastic queueing networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments) which has proven itself to be effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. It is shown how lookahead can be computed for FCFS queueing network simulations; performance data are given that demonstrate the method's effectiveness under moderate to heavy loads; and performance tradeoffs between the quality of lookahead and the cost of computing it are discussed.
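    To make the appointment idea concrete, the sketch below shows one way lookahead might be computed for an FCFS server when service times are sampled at arrival; the function and its bound are my own illustration, not the paper's implementation.

    ```python
    # Hypothetical illustration of FCFS lookahead (my sketch, not the
    # paper's code). With FCFS service and service times sampled at
    # arrival, the departure times of all queued jobs are already fixed,
    # so any job not yet received cannot depart before the queue drains
    # plus one minimum service time.

    def fcfs_lookahead(now, busy_until, queued_service_times, min_service):
        """Earliest timestamp that a not-yet-received job could produce."""
        drain_time = max(now, busy_until) + sum(queued_service_times)
        return drain_time + min_service

    # The server "makes an appointment" with its downstream neighbor: no
    # event with a smaller timestamp will ever be sent, so the neighbor
    # can safely simulate up to this bound.
    print(fcfs_lookahead(now=10.0, busy_until=12.5,
                         queued_service_times=[1.0, 2.0], min_service=0.5))
    ```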

  17. Rare event simulation in radiation transport

    SciTech Connect

    Kollman, Craig

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries, so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates that converges exponentially fast to the true solution.
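    The likelihood-ratio mechanics described above fit in a few lines. The toy estimator below is my own example, not the dissertation's algorithm: it estimates an exponential tail probability by sampling from a heavier-tailed proposal and reweighting each hit.

    ```python
    import math
    import random

    # Hedged toy version of importance sampling with likelihood-ratio
    # reweighting: estimate p = P(X > a) for X ~ Exp(1) by sampling from
    # a heavier-tailed proposal Exp(q), q < 1, and multiplying each hit
    # by f(x)/g(x). The reweighting keeps the estimator unbiased while
    # making the rare event common under the proposal distribution.

    def importance_estimate(a=20.0, q=0.05, n=100_000):
        total = 0.0
        for _ in range(n):
            x = random.expovariate(q)                        # sample from g
            if x > a:
                total += math.exp(-x) / (q * math.exp(-q * x))   # f(x)/g(x)
        return total / n

    print(importance_estimate())    # compare with the exact value below
    print(math.exp(-20.0))          # P(X > 20) = e^-20, about 2.1e-9
    ```

    With naive sampling, an event of probability 2e-9 would essentially never be seen in 100,000 runs; under the proposal, hits are frequent and the small weights carry the correction.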

  18. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple enough to run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete event systems are thus a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the timing of robotic operations; from transport and transmission times measured on the spot, graphics are obtained showing the average time for the transport activity, using the parameter sets of the finished products individually.

  19. Detecting plastic events in emulsions simulations

    NASA Astrophysics Data System (ADS)

    Lulli, Matteo; Bernaschi, Massimo; Sbragaglia, Mauro

    2016-11-01

    Emulsions are complex systems which are formed by a number of non-coalescing droplets dispersed in a solvent, leading to non-trivial effects in the overall flowing dynamics. Such systems possess a yield stress below which an elastic response to an external forcing occurs, while above the yield stress the system flows as a non-Newtonian fluid, i.e. the stress is not proportional to the shear. In the solid-like regime the network of the droplets' interfaces stores the energy coming from the work exerted by an external forcing, which can be used to move the droplets in a non-reversible way, i.e. causing plastic events. The Kinetic-Elasto-Plastic (KEP) theory is an effective theory describing some features of the flowing regime, relating the rate of plastic events to a scalar field called fluidity, f = γ̇/σ, i.e. the inverse of an effective viscosity. Boundary conditions have a non-trivial role not captured by the KEP description. In this contribution we will compare numerical results against experiments concerning the Poiseuille flow of emulsions in microchannels with complex boundary geometries. Using an efficient computational tool we can show non-trivial results on plastic events for different realizations of the rough boundaries. The research leading to these results has received funding from the European Research Council under the European Community's Seventh Framework Programme (FP7/2007-2013)/ERC Grant Agreement no. [279004].

  20. Empirical study of simulated two-planet microlensing events

    SciTech Connect

    Zhu, Wei; Gould, Andrew; Penny, Matthew; Mao, Shude; Gendron, Rieul

    2014-10-10

    We undertake the first study of two-planet microlensing models recovered from simulations of microlensing events generated by realistic multiplanet systems in which 292 planetary events, including 16 two-planet events, were detected from 6690 simulated light curves. We find that when two planets are recovered, their parameters are usually close to those of the two planets in the system most responsible for the perturbations. However, in 1 of the 16 examples, the apparent mass of both detected planets was more than doubled by the unmodeled influence of a third, massive planet. This fraction is larger than but statistically consistent with the roughly 1.5% rate of serious mass errors due to unmodeled planetary companions for the 274 cases from the same simulation in which a single planet is recovered. We conjecture that an analogous effect due to unmodeled stellar companions may occur more frequently. For 7 out of 23 cases in which two planets in the system would have been detected separately, only one planet was recovered because the perturbations due to the two planets had similar forms. This is a small fraction (7/274) of all recovered single-planet models, but almost a third of all events that might plausibly have led to two-planet models. Still, in these cases, the recovered planet tends to have parameters similar to one of the two real planets most responsible for the anomaly.

  1. Autocharacterization feasibility system on Hunters Trophy event

    SciTech Connect

    Mills, R.A.

    1993-09-01

    An automated system to characterize cable systems at NTS has been developed to test the feasibility of such a system. A rack of electronic equipment including a fast pulse generator, digital sampling scope, coaxial switch matrix and GPIB controller was installed downhole at NTS for the Hunters Trophy event. It was used to test automated characterization. Recorded measurements of simulation and other instrument data were gathered to determine if a full scale automated system would be practical in full scale underground nuclear effects tests. The benefits of such a full scale system would be fewer personnel required downhole; more instrument control in the uphole recording room; faster acquisition of cable parameter data.

  2. The LCLS Timing Event System

    SciTech Connect

    Dusatko, John; Allison, S.; Browne, M.; Krejcik, P.; /SLAC

    2012-07-23

    The Linac Coherent Light Source requires precision timing trigger signals for various accelerator diagnostics and controls at the SLAC National Accelerator Laboratory. A new timing system has been developed that meets these requirements. This system is based on COTS hardware with a mixture of custom-designed units. An added challenge has been the requirement that the LCLS timing system must co-exist with and 'know' about the existing SLC timing system. This paper describes the architecture, construction and performance of the LCLS timing event system.

  3. Sensitivity analysis of some critical factors affecting simulated intrusion volumes during a low pressure transient event in a full-scale water distribution system.

    PubMed

    Ebacher, G; Besner, M C; Clément, B; Prévost, M

    2012-09-01

    Intrusion events caused by transient low pressures may result in the contamination of a water distribution system (DS). This work aims at estimating the range of potential intrusion volumes that could result from a real downsurge event caused by a momentary pump shutdown. A model calibrated with transient low pressure recordings was used to simulate total intrusion volumes through leakage orifices and submerged air vacuum valves (AVVs). Four critical factors influencing intrusion volumes were varied: the external head of (untreated) water on leakage orifices, the external head of (untreated) water on submerged air vacuum valves, the leakage rate, and the diameter of AVVs' outlet orifice (represented by a multiplicative factor). Leakage orifices' head and AVVs' orifice head levels were assessed through fieldwork. Two sets of runs were generated as part of two statistically designed experiments. A first set of 81 runs was based on a complete factorial design in which each factor was varied over 3 levels. A second set of 40 runs was based on a Latin hypercube design, better suited for experimental runs on a computer model. The simulations were conducted using commercially available transient analysis software. Responses, measured by total intrusion volumes, ranged from 10 to 366 L. A second degree polynomial was used to analyze the total intrusion volumes. Sensitivity analyses of both designs revealed that the relationship between the total intrusion volume and the four contributing factors is not monotonic, with the AVVs' orifice head being the most influential factor. When intrusion through both pathways occurs concurrently, interactions between the intrusion flows through leakage orifices and submerged AVVs influence intrusion volumes. When only intrusion through leakage orifices is considered, the total intrusion volume is more largely influenced by the leakage rate than by the leakage orifices' head. The latter mainly impacts the extent of the area affected by

  4. The effects of parallel processing architectures on discrete event simulation

    NASA Astrophysics Data System (ADS)

    Cave, William; Slatt, Edward; Wassmer, Robert E.

    2005-05-01

    As systems become more complex, particularly those containing embedded decision algorithms, mathematical modeling presents a rigid framework that often impedes representation to a sufficient level of detail. Using discrete event simulation, one can build models that more closely represent physical reality, with actual algorithms incorporated in the simulations. Higher levels of detail increase simulation run time. Hardware designers have succeeded in producing parallel and distributed processor computers with theoretical speeds well into the teraflop range. However, the practical use of these machines on all but some very special problems is extremely limited. The inability to use this power is due to great difficulties encountered when trying to translate real world problems into software that makes effective use of highly parallel machines. This paper addresses the application of parallel processing to simulations of real world systems of varying inherent parallelism. It provides a brief background in modeling and simulation validity and describes a parameter that can be used in discrete event simulation to vary opportunities for parallel processing at the expense of absolute time synchronization and is constrained by validity. It focuses on the effects of model architecture, run-time software architecture, and parallel processor architecture on speed, while providing an environment where modelers can achieve sufficient model accuracy to produce valid simulation results. It describes an approach to simulation development that captures subject area expert knowledge to leverage inherent parallelism in systems in the following ways: * Data structures are separated from instructions to track which instruction sets share what data. This is used to determine independence and thus the potential for concurrent processing at run-time. * Model connectivity (independence) can be inspected visually to determine if the inherent parallelism of a physical system is properly

  5. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remains underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices.
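    As a concrete starting point of the kind the primer describes, the sketch below models a two-radiologist reading queue with the open-source simpy package (assumed installed); the arrival and reading rates are invented for illustration, not taken from the article.

    ```python
    import random
    import simpy

    # Minimal DES of a radiology reading workflow: studies arrive at
    # random, queue for one of two radiologists, and are read. All rates
    # here are assumptions for the sketch.
    random.seed(42)

    def study(env, radiologists, wait_times):
        """One imaging study: queue for a reader, then get read."""
        arrival = env.now
        with radiologists.request() as req:
            yield req                                        # wait for a reader
            wait_times.append(env.now - arrival)
            yield env.timeout(random.expovariate(1 / 10.0))  # ~10 min read

    def arrivals(env, radiologists, wait_times):
        while True:
            yield env.timeout(random.expovariate(1 / 6.0))   # ~1 study per 6 min
            env.process(study(env, radiologists, wait_times))

    env = simpy.Environment()
    radiologists = simpy.Resource(env, capacity=2)
    wait_times = []
    env.process(arrivals(env, radiologists, wait_times))
    env.run(until=8 * 60)                                    # one 8-hour shift
    print(f"{len(wait_times)} studies read, "
          f"mean wait {sum(wait_times) / len(wait_times):.1f} min")
    ```

    Changing the capacity, rates, or queue discipline and re-running is exactly the kind of what-if analysis the article advocates.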

  6. Terminal Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Meyers, Ronald

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED), a special type of 'man-made' system built to serve specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.

  7. Detectability of Discrete Event Systems with Dynamic Event Observation

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2009-01-01

    Our previous work considers detectability of discrete event systems, which is to determine the current state and subsequent states of a system based on event observation. We assume that event observation is static, that is, if an event is observable, then all its occurrences are observable. However, in practical systems such as sensor networks, event observation often needs to be dynamic, that is, the occurrences of the same event may or may not be observable, depending on the state of the system. In this paper, we generalize static event observation into dynamic event observation and consider the detectability problem under dynamic event observation. We define four types of detectabilities. To check detectabilities, we construct the observer with exponential complexity. To reduce computational complexity, we can also construct a detector with polynomial complexity to check strong detectabilities. Dynamic event observation can be implemented in two possible ways: passive observation and active observation. For active observation, we discuss how to find minimal event observation policies that preserve the four types of detectabilities respectively. PMID:20161618
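    The observer mentioned above is the classic subset construction. The sketch below renders the static-observation version under my own simplified encoding, in which transitions maps (state, event) pairs to sets of successor states; it is the standard construction, not the authors' dynamic-observation variant.

    ```python
    from itertools import chain

    # Classic observer (subset) construction for a partially observed
    # automaton. `transitions` maps (state, event) -> set of successors;
    # unobservable events are closed over before and after each
    # observable step, exactly as in standard observer construction.

    def unobservable_reach(states, transitions, unobservable):
        reach, frontier = set(states), list(states)
        while frontier:
            s = frontier.pop()
            for e in unobservable:
                for t in transitions.get((s, e), ()):
                    if t not in reach:
                        reach.add(t)
                        frontier.append(t)
        return frozenset(reach)

    def build_observer(initial, transitions, observable, unobservable):
        start = unobservable_reach({initial}, transitions, unobservable)
        observer, todo = {}, [start]
        while todo:
            macro = todo.pop()
            if macro in observer:
                continue
            observer[macro] = {}
            for e in observable:
                nxt = set(chain.from_iterable(
                    transitions.get((s, e), ()) for s in macro))
                if nxt:
                    succ = unobservable_reach(nxt, transitions, unobservable)
                    observer[macro][e] = succ
                    todo.append(succ)
        return observer   # exponentially many macro-states in the worst case
    ```

    Each macro-state is the set of states the system could currently be in; detectability questions reduce to asking whether long runs drive the observer into singleton macro-states.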

  8. Data Systems Dynamic Simulator

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Clark, Melana; Davenport, Bill; Message, Philip

    1993-01-01

    The Data System Dynamic Simulator (DSDS) is a discrete event simulation tool. It was developed for NASA for the specific purpose of evaluating candidate architectures for data systems of the Space Station era. DSDS provides three methods for meeting this requirement. First, the user has access to a library of standard pre-programmed elements. These elements represent tailorable components of NASA data systems and can be connected in any logical manner. Secondly, DSDS supports the development of additional elements. This allows the more sophisticated DSDS user the option of extending the standard element set. Thirdly, DSDS supports the use of data stream simulation. Data streams is the name given to a technique that ignores packet boundaries but is sensitive to rate changes. Because rate changes are rare compared to packet arrivals in a typical NASA data system, data stream simulations require a fraction of the CPU run time. Additionally, the data stream technique is considerably more accurate than another commonly-used optimization technique.
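    The data stream idea can be illustrated in a few lines: instead of one event per packet, the simulation carries one event per rate change and integrates. The toy function below is my own sketch, not DSDS code.

    ```python
    # Toy contrast (my own sketch, not DSDS code) showing why rate-change
    # events are cheap: total data moved is the integral of rate over
    # time, so only the (rare) rate changes need to be simulated as events.

    def stream_bytes(rate_changes, t_end):
        """rate_changes: time-sorted list of (time_s, bytes_per_s) pairs."""
        total, (t_prev, r_prev) = 0.0, rate_changes[0]
        for t, r in rate_changes[1:]:
            total += r_prev * (t - t_prev)
            t_prev, r_prev = t, r
        return total + r_prev * (t_end - t_prev)

    # three rate-change events stand in for millions of packet arrivals
    print(stream_bytes([(0.0, 1e6), (10.0, 5e6), (30.0, 2e6)], t_end=60.0))
    ```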

  9. High-level simulation of JWST event-driven operations

    NASA Astrophysics Data System (ADS)

    Henry, R.; Kinzel, W.

    2012-09-01

    The James Webb Space Telescope (JWST) has an event-driven architecture: an onboard Observation Plan Executive (OPE) executes an Observation Plan (OP) consisting of a sequence of observing units (visits). During normal operations, ground action to update the OP is only expected to be necessary about once a week. This architecture is designed to tolerate uncertainty in visit duration, and occasional visit failures due to inability to acquire guide stars, without creating gaps in the observing timeline. The operations concept is complicated by the need for occasional scheduling of time-critical science and engineering visits that cannot tolerate much slippage without inducing gaps, and also by onboard momentum management. A prototype Python tool called the JWST Observation Plan Execution Simulator (JOPES) has recently been developed to simulate OP execution at a high level and analyze the response of the Observatory and OPE to both nominal and contingency scenarios. Incorporating both deterministic and stochastic behavior, JOPES has the potential to be a powerful tool for several purposes: requirements analysis, system verification, systems engineering studies, and test data generation. It has already been successfully applied to a study of overhead estimation bias: whether to use conservative or average-case estimates for timing components that are inherently uncertain, such as those involving guide-star acquisition. JOPES is being enhanced to support interfaces to the operational Proposal Planning Subsystem (PPS) now being developed, with the objective of "closing the loop" between testing and simulation by feeding simulated event logs back into the PPS.

  10. Single event effects and laser simulation studies

    NASA Technical Reports Server (NTRS)

    Kim, Q.; Schwartz, H.; Mccarty, K.; Coss, J.; Barnes, C.

    1993-01-01

    The single event upset (SEU) linear energy transfer threshold (LETTH) of radiation hardened 64K Static Random Access Memories (SRAM's) was measured with a picosecond pulsed dye laser system. These results were compared with standard heavy ion accelerator (Brookhaven National Laboratory (BNL)) measurements of the same SRAM's. With heavy ions, the LETTH of the Honeywell HC6364 was 27 MeV-sq cm/mg at 125 C, compared with a value of 24 MeV-sq cm/mg obtained with the laser. In the case of the second type of 64K SRAM, the IBM6401CRH, no upsets were observed at 125 C with the highest LET ions used at BNL. In contrast, the pulsed dye laser tests indicated a value of 90 MeV-sq cm/mg at room temperature for the SEU-hardened IBM SRAM. No latchups or multiple SEU's were observed on any of the SRAM's, even under worst-case conditions. The results of this study suggest that the laser can be used as an inexpensive laboratory SEU prescreen tool in certain cases.

  11. Stochastic discrete event simulation of germinal center reactions

    NASA Astrophysics Data System (ADS)

    Figge, Marc Thilo

    2005-05-01

    We introduce a generic reaction-diffusion model for germinal center reactions and perform numerical simulations within a stochastic discrete event approach. In contrast to the frequently used deterministic continuum approach, each single reaction event is monitored in space and time in order to simulate the correct time evolution of this complex biological system. Germinal centers play an important role in the immune system by performing a reaction that aims at improving the affinity between antibodies and antigens. Our model captures experimentally observed features of this reaction, such as the development of the remarkable germinal center morphology and the maturation of antibody-antigen affinity in the course of time. We model affinity maturation within a simple affinity class picture and study it as a function of the distance between the initial antibody-antigen affinity and the highest possible affinity. The model reveals that this mutation distance may be responsible for the experimentally observed all-or-none behavior of germinal centers; i.e., they generate either mainly output cells of high affinity or no high-affinity output cells at all. Furthermore, the exact simulation of the system dynamics allows us to study the hypothesis of cell recycling in germinal centers as a mechanism for affinity optimization. A comparison of three possible recycling pathways indicates that affinity maturation is optimized by a recycling pathway that has previously not been taken into account in deterministic continuum models.
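    For readers unfamiliar with stochastic event-by-event kinetics, the well-mixed Gillespie direct method below is the simplest point of reference; it simulates a toy reversible reaction, not the spatial germinal-center model itself.

    ```python
    import math
    import random

    # Minimal well-mixed Gillespie direct method for a reversible
    # reaction A <-> B: every single reaction event is generated in time
    # order, which is the "each event is monitored" discipline the
    # abstract describes (without the spatial component).

    def gillespie(a=100, b=0, k_ab=1.0, k_ba=0.5, t_end=5.0):
        t, trajectory = 0.0, [(0.0, a, b)]
        while t < t_end:
            rate_ab, rate_ba = k_ab * a, k_ba * b
            total = rate_ab + rate_ba
            if total == 0.0:
                break
            t += -math.log(random.random()) / total   # exponential waiting time
            if random.random() * total < rate_ab:
                a, b = a - 1, b + 1                    # event: A -> B
            else:
                a, b = a + 1, b - 1                    # event: B -> A
            trajectory.append((t, a, b))
        return trajectory

    print(gillespie()[-1])   # final (time, A count, B count)
    ```

    The deterministic continuum approach would instead integrate rate equations for the mean counts, losing exactly the event-level fluctuations the stochastic treatment retains.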

  12. An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation

    DOE PAGES

    Nutaro, James

    2014-11-03

    In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system that has been built within the adevs simulation package supports models with state-events and time-events that comprise differential-algebraic systems with high index. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.
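    adevs is a C++ library, but the Discrete Event System Specification interface it implements can be schematized compactly. The Python dataclass below is a generic illustration of a DEVS atomic model (time advance, output, internal and external transitions), not the OpenModelica back-end's actual generated code.

    ```python
    from dataclasses import dataclass

    # Schematic DEVS atomic model: the four functions below are the
    # interface a generated simulation module must satisfy. This toy
    # integrates a piecewise-constant input rate on a fixed step.

    @dataclass
    class Integrator:
        state: float = 0.0
        rate: float = 0.0
        step: float = 0.1

        def time_advance(self):
            # time remaining until the next scheduled internal event
            return self.step

        def output(self):
            # emitted immediately before each internal transition
            return self.state + self.rate * self.step

        def delta_int(self):
            # internal transition: advance the integral by one step
            self.state += self.rate * self.step

        def delta_ext(self, elapsed, x):
            # external transition: integrate elapsed time, adopt new rate
            self.state += self.rate * elapsed
            self.rate = x
    ```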

  13. Multi-Transiting Systems and Exoplanet Mutual Events

    NASA Astrophysics Data System (ADS)

    Coughlin, Jared; Ragozzine, D.; Holman, M. J.

    2011-01-01

    Until recently, studies of transiting exoplanets (planets that cross in front of their host star) have focused almost exclusively upon systems where there is only one transiting planet. Those studies that have considered additional planets have mostly done so with the goal of determining the perturbing effects that additional planets would have upon the orbit, and therefore the light curve, of the transiting planet. This work considers, in detail, a specific type of event known as an exoplanet mutual event. Such events occur when one planet passes in front of another. While such events can occur whether or not these planets are transiting, predicting and understanding these events is best done in systems with multiple transiting planets. We estimate, through an ensemble simulation, how frequently exoplanet mutual events occur and which systems are most likely to undergo exoplanet mutual events. We also investigate what information can be learned about not only the planets themselves but also the orbital architecture in such systems. We conclude that while ODT (overlapping double-transit) events occur with a much lower frequency than PPO (planet-planet occultation) events, ODT mutual events are capable of producing detectable signals, that Kepler will detect a few, and recommend that candidate systems for these events, such as KOI 191, be observed in short cadence (Steffen et al. 2010, Holman et al. 2010). This work is supported in part by the NSF REU and DOD ASSURE programs under NSF grant no. 0754568 and by the Smithsonian Institution.

  14. Distribution system simulator

    NASA Technical Reports Server (NTRS)

    Bahrami, K. A.; Kirkham, H.; Rahman, S.

    1986-01-01

    In a series of tests performed under the Department of Energy auspices, power line carrier propagation was observed to be anomalous under certain circumstances. To investigate the cause, a distribution system simulator was constructed. The simulator was a physical simulator that accurately represented the distribution system from below power frequency to above 50 kHz. Effects such as phase-to-phase coupling and skin effect were modeled. Construction details of the simulator, and experimental results from its use are presented.

  15. Optimization of Operations Resources via Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.

  16. Simulating Single-Event Upsets in Bipolar RAM's

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.

    1986-01-01

    Simulation technique saves testing. Uses interactive version of SPICE (Simulation Program with Integrated Circuit Emphasis). Device and subcircuit models available in software used to construct macromodel for an integrated bipolar transistor. Time-dependent current generators placed inside transistor macromodel to simulate charge collection from ion track. Significant finding of experiments is standard design practice of reducing power in unaddressed bipolar RAM cell increases sensitivity of cell to single-event upsets.
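    Time-dependent current generators for ion strikes are commonly approximated by a double-exponential pulse. The sketch below shows that standard form, normalized so the pulse integrates to the deposited charge; the charge and time constants are illustrative assumptions, not values from the reported macromodel.

    ```python
    import math

    # Standard double-exponential approximation for the ion-strike
    # current injected by a time-dependent generator inside a transistor
    # macromodel. Parameter values are invented for the sketch.

    def seu_current(t, q_total=0.5e-12, tau_rise=5e-12, tau_fall=200e-12):
        """Strike current in amperes at time t (s); integrates to q_total."""
        norm = q_total / (tau_fall - tau_rise)
        return norm * (math.exp(-t / tau_fall) - math.exp(-t / tau_rise))

    times = [i * 1e-12 for i in range(0, 1000, 5)]
    peak = max(seu_current(t) for t in times)
    print(f"peak strike current ~ {peak * 1e3:.2f} mA")
    ```

    In a circuit simulator, sweeping q_total up from zero until the cell flips gives the critical charge, the quantity that determines upset sensitivity.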

  17. Upstream gyrating ion events: Cluster observations and simulations

    SciTech Connect

    Sauer, K.; Fraenz, M.; Dubinin, E.; Korth, A.; Mazelle, C.; Reme, H.; Dandouras, I.

    2005-08-01

    Localized events of low-frequency quasi-monochromatic waves in the 30 s range observed by Cluster in the upstream region of Earth are analyzed. They are associated with a gyro-motion of the two ion populations consisting of the incoming solar wind protons and the back-streaming ions from the shock. A coordinate system is chosen in which one axis is parallel to the ambient magnetic field B0 and the other one is in the v_sw × B0 direction. The variation of the plasma parameters is compared with the result of two-fluid Hall-MHD simulations using different beam densities and velocities. Keeping a fixed (relative) beam density (e.g. α = 0.005), non-stationary 'shock-like' structures are generated if the beam velocity exceeds a certain threshold of about ten times the Alfvén velocity. Below the threshold, the localized events represent stationary, nonlinear waves (oscillitons) in a beam-plasma system in which the Reynolds stresses of the plasma and beam ions are balanced by the magnetic field stress.

  18. Event-by-event simulation of single-neutron experiments to test uncertainty relations

    NASA Astrophysics Data System (ADS)

    De Raedt, H.; Michielsen, K.

    2014-12-01

    Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not require the knowledge of the solution of a wave equation, nor does it rely on detailed concepts of quantum theory. In particular, the data from these non-quantum simulations satisfy uncertainty relations derived in the context of quantum theory. Invited paper presented at QTAP-6.

  19. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  20. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    SciTech Connect

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight in device performance and aids in topology and system optimization.

  1. Instrumented Architectural Simulation System

    NASA Technical Reports Server (NTRS)

    Delagi, B. A.; Saraiya, N.; Nishimura, S.; Byrd, G.

    1987-01-01

    Simulation of systems at an architectural level can offer an effective way to study critical design choices if (1) the performance of the simulator is adequate to examine designs executing significant code bodies, not just toy problems or small application fragments, (2) the details of the simulation include the critical details of the design, (3) the view of the design presented by the simulator instrumentation leads to useful insights on the problems with the design, and (4) there is enough flexibility in the simulation system so that the asking of unplanned questions is not suppressed by the weight of the mechanics involved in making changes either in the design or its measurement. A simulation system with these goals is described together with the approach to its implementation. Its application to the study of a particular class of multiprocessor hardware system architectures is illustrated.

  2. Event-by-event fission simulation code, generates complete fission events

    SciTech Connect

    2013-04-01

    FREYA is a computer code that generates complete fission events. The output includes the energy and momentum of these final state particles: fission products, prompt neutrons and prompt photons. The version of FREYA that is to be released is a module for MCNP6.

  3. Rare Event Simulation for T-cell Activation

    NASA Astrophysics Data System (ADS)

    Lipsmeier, Florian; Baake, Ellen

    2009-02-01

    The problem of statistical recognition is considered, as it arises in immunobiology, namely, the discrimination of foreign antigens against a background of the body's own molecules. The precise mechanism of this foreign-self-distinction, though one of the major tasks of the immune system, continues to be a fundamental puzzle. Recent progress has been made by van den Berg, Rand, and Burroughs (J. Theor. Biol. 209:465-486, 2001), who modelled the probabilistic nature of the interaction between the relevant cell types, namely, T-cells and antigen-presenting cells (APCs). Here, the stochasticity is due to the random sample of antigens present on the surface of every APC, and to the random receptor type that characterises individual T-cells. It has been shown previously (van den Berg et al. in J. Theor. Biol. 209:465-486, 2001; Zint et al. in J. Math. Biol. 57:841-861, 2008) that this model, though highly idealised, is capable of reproducing important aspects of the recognition phenomenon, and of explaining them on the basis of stochastic rare events. These results were obtained with the help of a refined large deviation theorem and were thus asymptotic in nature. Simulations have, so far, been restricted to the straightforward simple sampling approach, which does not allow for sample sizes large enough to address more detailed questions. Building on the available large deviation results, we develop an importance sampling technique that allows for a convenient exploration of the relevant tail events by means of simulation. With its help, we investigate the mechanism of statistical recognition in some depth. In particular, we illustrate how a foreign antigen can stand out against the self background if it is present in sufficiently many copies, although no a priori difference between self and nonself is built into the model.
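    The large-deviation-based importance sampling described above rests on an exponential change of measure. The toy below (my own example, far simpler than the T-cell model) tilts i.i.d. uniforms toward large values to estimate the tail probability of their mean, unwinding the tilt with the exact likelihood ratio.

    ```python
    import math
    import random

    # Exponential tilting for a rare tail event: estimate
    # P(mean of n uniforms > a) by sampling each uniform from
    # g(x) proportional to exp(theta * x) on [0, 1].

    def tilted_uniform(theta):
        # inverse-CDF sample from the tilted density
        u = random.random()
        return math.log(1.0 + u * (math.exp(theta) - 1.0)) / theta

    def tail_probability(a=0.8, n=50, theta=6.0, samples=20_000):
        log_mgf = math.log((math.exp(theta) - 1.0) / theta)  # log E[exp(theta U)]
        total = 0.0
        for _ in range(samples):
            xs = [tilted_uniform(theta) for _ in range(n)]
            if sum(xs) / n > a:
                # exact likelihood ratio of the n i.i.d. tilted draws
                total += math.exp(-theta * sum(xs) + n * log_mgf)
        return total / samples

    print(tail_probability())   # a tiny probability, far below 1/samples
    ```

    Choosing theta so the tilted mean sits near the threshold a is what makes the rare tail event typical under the sampling distribution.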

  4. Simulation and study of small numbers of random events

    NASA Technical Reports Server (NTRS)

    Shelton, R. D.

    1986-01-01

    Random events were simulated by computer and subjected to various statistical methods to extract important parameters. Various forms of curve fitting were explored, such as least squares, least distance from a line, and maximum likelihood. Problems considered were dead time, exponential decay, and spectrum extraction from cosmic ray data using binned data and data from individual events. Computer programs, mostly of an iterative nature, were developed to do these simulations and extractions and are partially listed as appendices. The mathematical basis for the computer programs is given.
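    As one small example of the kind of extraction discussed (mine, not the report's listings): for exponential decay, the maximum-likelihood estimate of the decay constant from n observed decay times has the closed form λ = n / Σ tᵢ, with relative uncertainty roughly 1/√n.

    ```python
    import math
    import random

    # Maximum-likelihood fit of an exponential decay constant from a
    # small number of simulated events (toy illustration).

    random.seed(1)
    true_lambda = 2.0
    decay_times = [random.expovariate(true_lambda) for _ in range(10)]  # 10 events

    lam_hat = len(decay_times) / sum(decay_times)     # MLE: n / sum(t_i)
    sigma = lam_hat / math.sqrt(len(decay_times))     # approximate 1-sigma error
    print(f"lambda = {lam_hat:.2f} +/- {sigma:.2f}  (true value {true_lambda})")
    ```

    With only ten events the estimate scatters widely from run to run, which is exactly the small-sample behavior the report set out to study.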

  5. High Resolution Modeling of Tropical Cyclones Using Rare Event Simulation

    NASA Astrophysics Data System (ADS)

    Plotkin, D. A.; Abbot, D. S.; Weare, J.

    2014-12-01

    Tropical cyclones (TCs) present a challenge to modeling using general circulation models (GCMs) because they involve processes and structures that are too fine for GCMs to resolve. TCs have fine structures - e.g. the eye, eyewall, and rain bands - with length scales on the order of 10 km, while GCMs have typical resolutions on the order of 50-100 km. High resolution GCM runs that are sufficiently long to exhibit multiple TCs can be prohibitively computationally expensive. Thus, while GCMs exhibit TC-like vortices with similar spatial and temporal frequencies to observed TCs, the ability of GCMs to reproduce fine TC structures remains largely untested. In this study, we use recently developed rare event analysis and simulation methods to selectively simulate TCs under GCMs at very high resolution. These rare event simulation methods have been developed mostly in the context of computational chemistry, but are broadly applicable. They allow (either by careful manipulation of the model or by selection of trajectories) direct and detailed interrogation of the event of interest without introducing error and without the need to simulate for long periods of time to see the event. By creating targeted, high resolution GCM simulations with many TCs, we hope to determine whether or not GCMs can capture fine TC structures such as eyewalls and individual rain bands.

  6. Variability of simulants used in recreating stab events.

    PubMed

    Carr, D J; Wainwright, A

    2011-07-15

    Forensic investigators commonly use simulants/backing materials to mount fabrics and/or garments on when recreating damage due to stab events. Such work may be conducted in support of an investigation to connect a particular knife to a stabbing event by comparing the severance morphology obtained in the laboratory to that observed in the incident. There does not appear to have been a comparison of the effect of simulant type on the morphology of severances in fabrics and simulants, nor on the variability of simulants. This work investigates three simulants (pork, gelatine, expanded polystyrene), two knife blades (carving, bread), and how severances in the simulants and an apparel fabric typically used to manufacture T-shirts (single jersey) were affected by (i) simulant type and (ii) blade type. Severances were formed using a laboratory impact apparatus to ensure a consistent impact velocity and hence impact energy independently of the other variables. The impact velocity was chosen so that the force measured was similar to that measured in human performance trials. Force-time and energy-time curves were analysed and severance morphology (y, z directions) investigated. Simulant type and knife type significantly affected the critical forensic measurements of severance length (y direction) in the fabric and 'skin' (Tuftane). The use of EPS resulted in the lowest variability in data, further the severances recorded in both the fabric and Tuftane more accurately reflected the dimensions of the impacting knives.

  7. Estimating rare events in biochemical systems using conditional sampling

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.

    2017-01-01

    The paper focuses on development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most of the problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
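    Subset simulation itself is compact enough to sketch. The version below targets a toy standard-normal tail rather than a chemical-kinetics model, using a random-walk Metropolis step to regrow each conditional level; it illustrates the product-of-conditional-probabilities idea, not the paper's exact implementation.

    ```python
    import math
    import random

    # Compact subset simulation sketch: estimate P(Z > x_crit) for a
    # standard normal Z as a product of conditional probabilities, with
    # intermediate thresholds chosen so each level has probability ~p0.

    def subset_simulation(x_crit=4.0, n=1000, p0=0.1, step=1.0, max_levels=20):
        samples = [random.gauss(0, 1) for _ in range(n)]
        p_total = 1.0
        for _ in range(max_levels):
            samples.sort(reverse=True)
            n_seed = int(p0 * n)
            threshold = samples[n_seed - 1]          # next intermediate level
            if threshold >= x_crit:                  # final level reached
                exceed = sum(1 for s in samples if s > x_crit)
                return p_total * exceed / n
            p_total *= p0
            seeds, samples = samples[:n_seed], []
            for seed in seeds:                       # MCMC above the threshold
                x = seed
                for _ in range(n // n_seed):
                    cand = x + random.gauss(0, step)
                    ok = math.log(random.random() + 1e-300) \
                         < (x * x - cand * cand) / 2
                    if ok and cand > threshold:      # modified Metropolis accept
                        x = cand
                    samples.append(x)
        return p_total                               # fallback: level not reached

    print(subset_simulation())   # should be near P(Z > 4), about 3.2e-5
    ```

    Each level only needs to estimate a probability around p0 = 0.1, so a probability of order 1e-5 is reached in a handful of cheap stages instead of millions of brute-force runs.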

  8. The cost of conservative synchronization in parallel discrete event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approach the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.

  9. Fission Reaction Event Yield Algorithm, FREYA - For event-by-event simulation of fission

    NASA Astrophysics Data System (ADS)

    Verbeke, J. M.; Randrup, J.; Vogt, R.

    2015-06-01

    From nuclear materials accountability to detection of special nuclear material (SNM), the need for better modeling of fission has grown over the past decades. Current radiation transport codes compute average quantities with great accuracy and performance, but performance and averaging come at the price of limited interaction-by-interaction modeling. For fission applications, these codes often lack the capability of modeling interactions exactly: energy is not conserved, energies of emitted particles are uncorrelated, and prompt fission neutron and photon multiplicities are uncorrelated. Many modern applications require more exclusive quantities than averages, such as the fluctuations in certain observables (e.g. the neutron multiplicity) and correlations between neutrons and photons. The new computational model, FREYA (Fission Reaction Event Yield Algorithm), aims to meet this need by modeling complete fission events. Thus it automatically includes fluctuations as well as correlations resulting from conservation of energy and momentum. FREYA has been integrated into the LLNL Fission Library, and will soon be part of MCNPX2.7.0, MCNP6, TRIPOLI-4.9, and Geant4.10.

  10. Disaster Response Modeling Through Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Wang, Jeffrey; Gilmer, Graham

    2012-01-01

    Organizations today are required to plan against a rapidly changing, high-cost environment. This is especially true for first responders to disasters and other incidents, where critical decisions must be made in a timely manner to save lives and resources. Discrete-event simulations enable organizations to make better decisions by visualizing complex processes and the impact of proposed changes before they are implemented. A discrete-event simulation using Simio software has been developed to effectively analyze and quantify the imagery capabilities of domestic aviation resources conducting relief missions. This approach has helped synthesize large amounts of data to better visualize process flows, manage resources, and pinpoint capability gaps and shortfalls in disaster response scenarios. Simulation outputs and results have supported decision makers in the understanding of high risk locations, key resource placement, and the effectiveness of proposed improvements.

  11. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of desktop modeling and simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.

  12. Non-Lipschitz Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, M.; Meyers, R.

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man- made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.

  13. Simulating an Extreme Wind Event in a Topographically Complex Region

    NASA Astrophysics Data System (ADS)

    Lennard, Christopher

    2014-07-01

    Complex topography modifies local weather characteristics such as air temperature, rainfall and airflow within a larger regional extent. The Cape Peninsula around Cape Town, South Africa, is a complex topographical feature responsible for the modification of rainfall and wind fields largely downstream of the Peninsula. During the passage of a cold front on 2 October 2002, an extreme wind event associated with tornado-like damage occurred in the suburb of Manenberg; however, synoptic conditions did not indicate the convective activity typically associated with a tornado. A numerical regional climate model was operated at very high horizontal resolution (500 m) to investigate the dynamics of the event. The model simulated an interaction between the topography of the peninsula and an airflow direction change associated with the passage of the cold front. A small region of cyclonic circulation was simulated over Manenberg that was embedded in an area of negative vorticity and a leeward gravity wave. The feature lasted 14 min and moved in a north to south direction. Vertically, it was not evident above 220 m. The model assessment describes this event as a shallow but intense cyclonic vortex generated in the lee of the peninsula through an interaction between the peninsula and a change in wind direction as the cold front made landfall. The model did not simulate wind speeds associated with the observed damage, suggesting that the horizontal grid resolution ought to be at the scale of the event to more completely understand such microscale airflow phenomena.

  14. SPICE: Simulation Package for Including Flavor in Collider Events

    NASA Astrophysics Data System (ADS)

    Engelhard, Guy; Feng, Jonathan L.; Galon, Iftah; Sanford, David; Yu, Felix

    2010-01-01

    We describe SPICE: Simulation Package for Including Flavor in Collider Events. SPICE takes as input two ingredients: a standard flavor-conserving supersymmetric spectrum and a set of flavor-violating slepton mass parameters, both of which are specified at some high "mediation" scale. SPICE then combines these two ingredients to form a flavor-violating model, determines the resulting low-energy spectrum and branching ratios, and outputs HERWIG and SUSY Les Houches files, which may be used to generate collider events. The flavor-conserving model may be any of the standard supersymmetric models, including minimal supergravity, minimal gauge-mediated supersymmetry breaking, and anomaly-mediated supersymmetry breaking supplemented by a universal scalar mass. The flavor-violating contributions may be specified in a number of ways, from specifying charges of fields under horizontal symmetries to completely specifying all flavor-violating parameters. SPICE is fully documented and publicly available, and is intended to be a user-friendly aid in the study of flavor at the Large Hadron Collider and other future colliders.
    Program summary
    Program title: SPICE
    Catalogue identifier: AEFL_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFL_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 8153
    No. of bytes in distributed program, including test data, etc.: 67 291
    Distribution format: tar.gz
    Programming language: C++
    Computer: Personal computer
    Operating system: Tested on Scientific Linux 4.x
    Classification: 11.1
    External routines: SOFTSUSY [1,2] and SUSYHIT [3]
    Nature of problem: Simulation programs are required to compare theoretical models in particle physics with present and future data at particle colliders. SPICE determines the masses and decay branching ratios of

  15. Extreme events evaluation over African cities with regional climate simulations

    NASA Astrophysics Data System (ADS)

    Bucchignani, Edoardo; Mercogliano, Paola; Simonis, Ingo; Engelbrecht, Francois

    2013-04-01

    The warming of the climate system in recent decades is evident from observations and is mainly related to the increase of anthropogenic greenhouse gas concentrations (IPCC, 2012). Given the expected climate change conditions on the African continent, as underlined in different publications, and their associated socio-economic impacts, an evaluation of the specific effects on some strategic African cities in the medium and long term is of crucial importance with regard to the development of adaptation strategies. Assessments usually focus on average climate properties rather than on variability or extremes, although the latter often have a greater impact on society than average values. Global Coupled Models (GCM) are generally used to simulate future climate scenarios as they guarantee physical consistency between variables; however, due to their coarse spatial resolution, their output cannot be used for impact studies on local scales, which makes the generation of higher resolution climate change data necessary. Regional Climate Models (RCM) better describe the phenomena forced by orography or by coastal lines, or that are related to convection. Therefore they can provide more detailed information on climate extremes, which are hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws. The normal bias of the RCM in representing the local climatology is reduced using adequate statistical techniques based on the comparison of the simulated results with long observational time series. In the framework of the EU-FP7 CLUVA (Climate Change and Urban Vulnerability in Africa) project, regional projections of climate change at high resolution (about 8 km) have been performed for selected areas surrounding five African cities. At CMCC, the non-hydrostatic regional climate model COSMO-CLM has been employed. For each domain, two simulations have been performed, considering the RCP4.5 and RCP8.5 emission

  16. Flash heat simulation events in the north Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Mazon, Jordi; Pino, David

    2013-04-01

    Following the definition of a flash heat event proposed by Mazon et al. at the European Meteorology Meetings (2011 and 2012), based on the case that occurred in the northeast of the Iberian Peninsula on 27th August 2010, several other flash heat events have been detected by automatic weather stations around the Mediterranean basin (southern Italy, the island of Crete, southern Greece, and the northeast of the Iberian Peninsula). A flash heat event covers those events in which a large temperature increase lasts for a spatial and temporal scale between that of a heat wave (defined by the WMO as a phenomenon in which the daily maximum temperature of more than five consecutive days exceeds the average maximum temperature by 5°C, with respect to the 1961-1990 period) and that of a heat burst (defined by the AMS as a rare atmospheric event characterized by gusty winds and a rapid increase in temperature and decrease in humidity that can last some minutes). A flash heat event may thus be considered a rapid modification of the temperature that lasts several hours, less than 48 hours and usually less than 24 hours. Two different flash heat events have been simulated with the WRF mesoscale model in the Mediterranean basin. The results show that two different mechanisms are the main causes of these flash heat events. The first event occurred on 23rd March 2008 on the island of Crete due to a strong Foehn effect caused by strong south and southeast winds, during which the maximum temperature rose to 32°C for several hours at night. The second occurred on 1st August 2012 in the northeast of the Iberian Peninsula, caused by the rapid displacement of a warm ridge from North Africa that lasted around 24 hours.

  17. Anomalous event diagnosis for environmental satellite systems

    NASA Technical Reports Server (NTRS)

    Ramsay, Bruce H.

    1993-01-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS) is responsible for the operation of the NOAA geostationary and polar orbiting satellites. NESDIS provides a wide array of operational meteorological and oceanographic products and services and operates various computer and communication systems on a 24-hour, seven-day-per-week schedule. The Anomaly Reporting System contains a database of anomalous events regarding the operations of the Geostationary Operational Environmental Satellite (GOES), communication, or computer systems that have degraded or caused the loss of GOES imagery. Data are currently entered manually via an automated query user interface. There are 21 possible symptoms (e.g., No Data), and 73 possible causes (e.g., Sectorizer - World Weather Building) of an anomalous event. The determination of an event's cause(s) is made by the on-duty computer operator, who enters the event in a paper-based daily log, and by the analyst entering the data into the reporting system. The determination of the event's cause(s) impacts both the operational status of these systems and the performance evaluation of the on-site computer and communication operations contractor.

  18. Three Dimensional Simulation of the Baneberry Nuclear Event

    SciTech Connect

    Lomov, I

    2003-07-16

    Baneberry, a 10-kiloton nuclear event, was detonated at a depth of 278 m at the Nevada Test Site on December 18, 1970. Shortly after detonation, radioactive gases emanating from the cavity were released into the atmosphere through a shock-induced fissure near surface ground zero. Extensive geophysical investigations, coupled with a series of 1D and 2D computational studies, were used to reconstruct the sequence of events that led to the catastrophic failure. However, the geological profile of the Baneberry site is complex and inherently three-dimensional, which meant that some geological features had to be simplified or ignored in the 2D simulations. This left open the possibility that features unaccounted for in the 2D simulations could have had an important influence on the eventual containment failure of the Baneberry event. This paper presents results from a high-fidelity 3D Baneberry simulation based on the most accurate geologic and geophysical data available. The results are compared with available data and contrasted against the results of the previous 2D computational studies.

  19. Automated estimation of rare event probabilities in biochemical systems

    PubMed Central

    Daigle, Bernie J.; Roh, Min K.; Gillespie, Dan T.; Petzold, Linda R.

    2011-01-01

    In biochemical systems, the occurrence of a rare event can be accompanied by catastrophic consequences. Precise characterization of these events using Monte Carlo simulation methods is often intractable, as the number of realizations needed to witness even a single rare event can be very large. The weighted stochastic simulation algorithm (wSSA) [J. Chem. Phys. 129, 165101 (2008)] and its subsequent extension [J. Chem. Phys. 130, 174103 (2009)] alleviate this difficulty with importance sampling, which effectively biases the system toward the desired rare event. However, extensive computation coupled with substantial insight into a given system is required, as there is currently no automatic approach for choosing wSSA parameters. We present a novel modification of the wSSA—the doubly weighted SSA (dwSSA)—that makes possible a fully automated parameter selection method. Our approach uses the information-theoretic concept of cross entropy to identify parameter values yielding minimum variance rare event probability estimates. We apply the method to four examples: a pure birth process, a birth-death process, an enzymatic futile cycle, and a yeast polarization model. Our results demonstrate that the proposed method (1) enables probability estimation for a class of rare events that cannot be interrogated with the wSSA, and (2) for all examples tested, reduces the number of runs needed to achieve comparable accuracy by multiple orders of magnitude. For a particular rare event in the yeast polarization model, our method transforms a projected simulation time of 600 years to three hours. Furthermore, by incorporating information-theoretic principles, our approach provides a framework for the development of more sophisticated influencing schemes that should further improve estimation accuracy. PMID:21280690
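
    [Editor's sketch] A minimal sketch of the weighted-SSA idea this work builds on may help: simulate with biased propensities so the rare event happens more often, and carry a likelihood weight that corrects the estimate. The birth-death system, rate constants, and bias factor gamma below are hypothetical, and this is the plain wSSA in one common formulation (the paper's dwSSA additionally weights the firing times and selects the bias automatically via cross entropy).

        import random

        def wssa_run(x0, target, t_max, kb, kd, gamma, rng):
            """One weighted-SSA trajectory of a birth-death process:
            X -> X+1 at rate kb, X -> X-1 at rate kd*x. Births are biased
            upward by 'gamma' to drive the walk toward the rare threshold
            'target'; the likelihood weight w corrects the estimate.
            Returns w if the threshold was reached before t_max, else 0."""
            x, t, w = x0, 0.0, 1.0
            while t < t_max:
                a = [kb, kd * x]            # true propensities
                b = [gamma * kb, kd * x]    # biased propensities
                a0, b0 = sum(a), sum(b)
                t += rng.expovariate(a0)    # time advances with the TRUE rates
                if t >= t_max:
                    return 0.0
                j = 0 if rng.random() < b[0] / b0 else 1   # biased reaction choice
                w *= (a[j] / a0) / (b[j] / b0)             # importance-sampling weight
                x += 1 if j == 0 else -1
                if x >= target:
                    return w
            return 0.0

        rng = random.Random(1)
        runs = 20000
        est = sum(wssa_run(10, 40, 10.0, kb=1.0, kd=0.1, gamma=2.5, rng=rng)
                  for _ in range(runs)) / runs
        print(f"estimated rare-event probability ~ {est:.3e}")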

  20. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground-based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines, such as aerodynamics, structures, and heat transfer, with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. NPSS will provide improved tools to develop custom components, to zoom to higher-fidelity codes, to couple to multidiscipline codes, to transmit secure data, and to distribute simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline, system-level simulation tool for the full development life cycle.

  1. Advances in Discrete-Event Simulation for MSL Command Validation

    NASA Technical Reports Server (NTRS)

    Patrikalakis, Alexander; O'Reilly, Taifun

    2013-01-01

    In the last five years, the discrete event simulator SEQuence GENerator (SEQGEN), developed at the Jet Propulsion Laboratory to plan deep-space missions, has greatly increased uplink operations capacity to deal with increasingly complicated missions. In this paper, we describe how the Mars Science Laboratory (MSL) project makes full use of an interpreted environment to simulate changes in more than fifty thousand flight software parameters and conditional command sequences, to predict the result of executing a conditional branch in a command sequence, and to warn users whenever one or more simulated spacecraft states change in an unexpected manner. Using these new SEQGEN features, operators plan more activities in one sol than ever before.

  2. Simulated Changes in Extreme Temperature and Precipitation Events at 6 ka

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.; Bell, J. L.; Sloan, L. C.

    2003-12-01

    Paleoenvironmental archives record a range of information about past environments. Three key influences shaping paleoclimate records at a given time plane are the mean state of the climate system, interannual variability, and the frequency and seasonality of extreme climate events. We have employed a high resolution regional climate model (RCM) to test the sensitivity of extreme climate events to 6 ka orbital forcing, using western North America as a case study. Extreme precipitation and temperature events were defined by the distribution of daily precipitation and temperature values in the control simulation. Simulated anomalies (6 ka - control) in the number of extreme precipitation events per year were positive throughout the RCM domain, as were anomalies in the percent of annual precipitation delivered by extreme precipitation events. These annual-scale positive anomalies in extreme precipitation were driven by changes in the seasonality of extreme precipitation events at 6 ka, with January, October and November showing the greatest positive anomalies in percent of monthly precipitation delivered by extreme precipitation events. The frequency and length of extreme temperature events in the western United States was also sensitive to 6 ka orbital forcing. Positive anomalies in the frequency of extreme maximum daily temperature values occurred inland in the RCM domain, with peak anomalies of 24 days/year centered over the Great Basin. Likewise, the number of days/year in which the maximum daily temperature exceeded 32°C increased over land by 24%, with the average heat wave up to 12 days longer in the 6 ka simulation than in the control simulation. Finally, mean first and last freeze dates were later inland in the 6 ka simulation than in the control simulation.
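
    [Editor's sketch] The event definition used here, extremes defined by the distribution of daily values in the control simulation, is easy to make concrete. The sketch below, with entirely synthetic data, counts days per year above the control run's 95th percentile in both runs; the threshold and the perturbation are illustrative, not the study's values.

        import numpy as np

        def extreme_event_days(daily, threshold):
            """Count days per year exceeding a threshold; 'daily' is (years, 365)."""
            return (daily > threshold).sum(axis=1)

        rng = np.random.default_rng(42)
        control = rng.gamma(2.0, 2.0, size=(20, 365))   # synthetic daily values
        scenario = control * 1.1                        # hypothetical perturbed run
        # Extremes are defined by the CONTROL simulation's daily distribution
        p95 = np.percentile(control, 95)
        anomaly = (extreme_event_days(scenario, p95).mean()
                   - extreme_event_days(control, p95).mean())
        print(f"change in extreme days per year: {anomaly:+.1f}")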

  3. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  4. Generalized Detectability for Discrete Event Systems

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectability: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on an observer, whose construction is of exponential complexity, while the new algorithms are based on a new automaton called a detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory on detectability of discrete event systems becomes more applicable to solving many practical problems. PMID:21691432
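
    [Editor's sketch] To make the state-estimation idea concrete, here is a minimal current-state estimate for a nondeterministic automaton with unobservable events; along a run, detectability corresponds to the estimate shrinking to a single state. The transition table is hypothetical, and this shows the exponential observer view that the paper improves upon, not the paper's polynomial detector construction.

        from itertools import chain

        def unobservable_reach(states, trans, unobs):
            """Close a set of states under unobservable transitions."""
            stack, reach = list(states), set(states)
            while stack:
                s = stack.pop()
                for (src, ev), dsts in trans.items():
                    if src == s and ev in unobs:
                        for d in dsts:
                            if d not in reach:
                                reach.add(d)
                                stack.append(d)
            return frozenset(reach)

        def observer_step(estimate, event, trans, unobs):
            """Update the current-state estimate after observing 'event'."""
            nxt = set(chain.from_iterable(trans.get((s, event), ()) for s in estimate))
            return unobservable_reach(nxt, trans, unobs)

        # Hypothetical nondeterministic system: states 0-3, event 'u' unobservable
        trans = {(0, 'u'): {1}, (0, 'a'): {2}, (1, 'a'): {3}, (2, 'b'): {3}, (3, 'a'): {3}}
        unobs = {'u'}
        est = unobservable_reach({0}, trans, unobs)   # initial uncertainty: {0, 1}
        for ev in ['a', 'a']:
            est = observer_step(est, ev, trans, unobs)
            print(ev, sorted(est), "detected" if len(est) == 1 else "ambiguous")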

  5. Sequential Events Control System (SECS) Overview

    NASA Technical Reports Server (NTRS)

    Interbartolo, Michael

    2009-01-01

    This slide presentation will cover the Sequential Events Control System (SECS), which is the Apollo spacecraft subsystem that controls the automatically sequenced functions during the mission and during any aborts that could be performed. Included in this presentation are its general architecture, its integration into and use of the spacecraft's other systems, and details on the functions it is responsible for controlling during the mission. The objectives are to describe the system's architecture, the major components in the system, and the major system functions.

  6. Event-driven simulation of cerebellar granule cells.

    PubMed

    Carrillo, Richard R; Ros, Eduardo; Tolu, Silvia; Nieus, Thierry; D'Angelo, Egidio

    2008-01-01

    Around half of the neurons of a human brain are granule cells (approximately 10^11 granule neurons) [Kandel, E.R., Schwartz, J.H., Jessell, T.M., 2000. Principles of Neural Science. McGraw-Hill Professional Publishing, New York]. In order to study in detail the functional role of the intrinsic features of this cell, we have developed a pre-compiled behavioural model based on the simplified granule-cell model of Bezzi et al. [Bezzi, M., Nieus, T., Arleo, A., D'Angelo, E., Coenen, O.J.-M.D., 2004. Information transfer at the mossy fiber-granule cell synapse of the cerebellum. 34th Annual Meeting. Society for Neuroscience, San Diego, CA, USA]. We can use an efficient event-driven simulation scheme based on lookup tables (EDLUT) [Ros, E., Carrillo, R.R., Ortigosa, E.M., Barbour, B., Agís, R., 2006. Event-driven simulation scheme for spiking neural networks using lookup tables to characterize neuronal dynamics. Neural Computation 18 (12), 2959-2993]. For this purpose it is necessary to compile into tables the data obtained through a massive numerical calculation of the simplified cell model. This allows network simulations requiring minimal numerical calculation. There are three major features that are considered functionally relevant in the simplified granule cell model: bursting, subthreshold oscillations and resonance. In this work we describe how the cell model is compiled into tables while keeping these key properties of the neuron model.
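
    [Editor's sketch] The table-driven scheme can be sketched compactly: pre-compute the state update as a function of elapsed time, then touch the neuron only when an input event arrives. A leaky integrate-and-fire neuron stands in below for the simplified granule-cell model (whose tables would instead be filled by massive numerical integration, capturing bursting, oscillations and resonance); all constants are illustrative.

        import numpy as np

        TAU, V_TH, V_RESET = 20.0, 1.0, 0.0
        DT_MAX, N_DT, N_V = 100.0, 1000, 200

        # Pre-compiled table: membrane potential after an elapsed time dt.
        # Analytic exponential decay here; a detailed cell model would fill
        # the same table offline by numerical calculation.
        v_axis = np.linspace(0.0, V_TH, N_V)
        dt_axis = np.linspace(0.0, DT_MAX, N_DT)
        DECAY = v_axis[None, :] * np.exp(-dt_axis[:, None] / TAU)  # (N_DT, N_V)

        def on_spike(v, t_last, t_event, weight):
            """Event-driven update: no integration between events, one lookup.
            Elapsed times beyond DT_MAX are clipped (a sketch-level shortcut)."""
            i = min(int((t_event - t_last) / DT_MAX * (N_DT - 1)), N_DT - 1)
            j = min(int(v / V_TH * (N_V - 1)), N_V - 1)
            v = DECAY[i, j] + weight
            return (V_RESET, True) if v >= V_TH else (v, False)

        v, t_last = 0.0, 0.0
        for t, w in [(5.0, 0.6), (12.0, 0.55), (40.0, 0.7)]:   # input spike events
            v, fired = on_spike(v, t_last, t, w)
            t_last = t
            print(f"t={t:5.1f}  V={v:.3f}  fired={fired}")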

  7. Earthquake Simulations and Historical Patterns of Events: Forecasting the Next Great Earthquake in California

    NASA Astrophysics Data System (ADS)

    Sachs, M. K.; Rundle, J. B.; Heien, E. M.; Turcotte, D. L.; Yikilmaz, M.; Kellogg, L. H.

    2013-12-01

    The fault system in California, combined with some of the United States' most densely populated regions, is a recipe for devastation. It has been estimated that a repeat of the 1906 m=7.8 San Francisco earthquake could cause as much as $84 billion in damage. Earthquake forecasting can help alleviate the effects of these events by targeting disaster relief and preparedness in regions that will need it the most. However, accurate earthquake forecasting has proven difficult. We present a forecasting technique that uses simulated earthquake catalogs generated by Virtual California and patterns of historical events. As background, we also describe internal details of the Virtual California earthquake simulator.

  8. Interoperability Standards for Medical Simulation Systems

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Diallo, Saikou Y.; Padilla, Jose J.

    2012-01-01

    The Modeling and Simulation Community successfully developed and applied interoperability standards like the Distributed Interactive Simulation (DIS) protocol (IEEE 1278) and the High Level Architecture (HLA) (IEEE 1516). These standards were applied for world-wide distributed simulation events for several years. However, this paper shows that some of the assumptions and constraints underlying the philosophy of these current standards are not valid for Medical Simulation Systems. This paper describes the standards, the philosophy and the limits for medical applications and recommends necessary extensions of the standards to support medical simulation.

  9. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
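
    [Editor's sketch] A toy version of such a discrete event model fits in a few lines with an event heap: requests arrive according to a probability distribution, wait for a free resource, and are served. The rates and the two configurations compared at the end are hypothetical, not those analyzed in the paper.

        import heapq, random
        from collections import deque

        def simulate(n_servers, arrival_rate, service_rate, n_requests, seed=0):
            """Minimal discrete-event model of a service pool: exponential
            arrivals, exponential service, FIFO queue. Returns the mean
            response time (waiting plus service)."""
            rng = random.Random(seed)
            free, queue, done = n_servers, deque(), []
            events = [(rng.expovariate(arrival_rate), 'arrival', 0)]
            arrived = 1
            while events:
                t, kind, req = heapq.heappop(events)
                if kind == 'arrival':
                    queue.append((req, t))
                    if arrived < n_requests:   # schedule the next arrival
                        heapq.heappush(events, (t + rng.expovariate(arrival_rate),
                                                'arrival', arrived))
                        arrived += 1
                else:                          # a departure frees a server
                    free += 1
                while free and queue:          # dispatch waiting requests
                    rid, t_arr = queue.popleft()
                    free -= 1
                    t_end = t + rng.expovariate(service_rate)
                    heapq.heappush(events, (t_end, 'departure', rid))
                    done.append(t_end - t_arr)
            return sum(done) / len(done)

        # Hypothetical comparison: dedicated hosting vs. a larger shared pool
        print("4 dedicated servers:", round(simulate(4, 3.0, 1.0, 20000), 2))
        print("8 shared servers   :", round(simulate(8, 6.0, 1.0, 20000), 2))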

  10. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    SciTech Connect

    Wilke, Jeremiah J; Kenny, Joseph P.

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.

  11. The Compass Paradigm for the Systematic Evaluation of U.S. Army Command and Control Systems Using Neural Network and Discrete Event Computer Simulation

    DTIC Science & Technology

    2005-11-01

    process remains an enigma that is the subject of constant study, analysis, and debate. The new operational paradigm that is promulgating throughout the...

  12. 3D Simulation of External Flooding Events for the RISMC Pathway

    SciTech Connect

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad; Smith, Curtis; Lin, Linyu

    2015-09-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, they can be analyzed with existing and validated simulated-physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and can provide visual understanding to validate the analysis of flooding.
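
    [Editor's sketch] The core SPH idea, estimating fluid density as a kernel-weighted sum over neighboring particles, can be sketched briefly. The kernel below is the standard 2-D cubic spline; the particle set and smoothing length are arbitrary illustrations, not the validated flooding configuration described in the report.

        import numpy as np

        def w_cubic(r, h):
            """Standard 2-D cubic spline smoothing kernel (support radius 2h)."""
            q = r / h
            sigma = 10.0 / (7.0 * np.pi * h**2)
            w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
            return sigma * w

        def density(positions, masses, h):
            """SPH density estimate: rho_i = sum_j m_j W(|x_i - x_j|, h).
            Brute-force O(N^2); production codes use neighbor-search grids."""
            diff = positions[:, None, :] - positions[None, :, :]
            r = np.linalg.norm(diff, axis=-1)
            return (masses[None, :] * w_cubic(r, h)).sum(axis=1)

        rng = np.random.default_rng(3)
        pos = rng.uniform(0.0, 1.0, size=(400, 2))   # hypothetical fluid block
        m = np.full(400, 1.0 / 400)                  # unit total mass
        print("mean density:", density(pos, m, h=0.08).mean())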

  13. Thermodynamic MHD Simulation of the Bastille Day Event

    NASA Astrophysics Data System (ADS)

    Torok, Tibor; Downs, Cooper; Lionello, Roberto; Linker, Jon A.; Mikic, Zoran; Titov, Viacheslav S.; Riley, Pete

    2014-05-01

    The "Bastille Day" event on July 14, 2000 is one of the most extensively studied solar eruptions. It originated in a complex active region close to disk center and produced an X5.7 flare, a fast halo CME, and an intense geomagnetic storm. We have recently begun to model this challenging event, with the final goal of simulating its whole evolution, from the pre-eruptive state to the CME's arrival at 1 AU. To this end, we first produce a steady-state MHD solution of the background corona that incorporates realistic energy transport ("thermodynamic MHD"), photospheric magnetic field measurements, and the solar wind. In order to model the pre-eruptive magnetic field, we then insert into this solution a stable, elongated flux rope that resides above the highly curved polarity inversion line of the active region. Finally, we produce an eruption by imposing photospheric flows that slowly converge towards the polarity inversion line. In this presentation we describe our method, compare the simulation results with the observations, and discuss the challenges and limitations involved in modeling such complex and powerful eruptions.

  14. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning. The large commercial shopping area, as a typical service system, is one of the hot topics in emergency evacuation research. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation from a commercial shopping mall. Pedestrian walking is based on Cellular Automata and an event-driven model. In this paper, the event-driven model is adopted to simulate pedestrian movement patterns; the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer, and a trajectory layer. For the simulation of pedestrians' movement routes, the model takes into account the purchase intentions of customers and the density of pedestrians. Based on the evacuation model combining Cellular Automata with a Dynamic Floor Field and the event-driven model, we can reflect the behavioral characteristics of customers and clerks in normal and emergency evacuation situations. The distribution of individual evacuation times as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
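
    [Editor's sketch] A minimal sketch of the cellular-automaton side of such a model: pedestrians hop between grid cells, guided by a floor field that encodes distance to the exit. This shows only a static floor field with random conflict resolution; the paper's model adds a dynamic field, purchase intentions, and the event-driven scheduling layer. The geometry and densities below are made up.

        import numpy as np

        def ca_step(occupied, static_field, rng):
            """One CA update: each pedestrian moves to the free von Neumann
            neighbor with the lowest field value (shorter distance to exit).
            Random update order resolves conflicts over target cells."""
            rows, cols = static_field.shape
            order = list(zip(*np.nonzero(occupied)))
            rng.shuffle(order)
            for r, c in order:
                best, best_val = None, static_field[r, c]
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols \
                       and not occupied[nr, nc] and static_field[nr, nc] < best_val:
                        best, best_val = (nr, nc), static_field[nr, nc]
                if best:
                    occupied[r, c] = False
                    occupied[best] = True
            return occupied

        # Hypothetical 20x20 hall, exit at (0, 0): field = Manhattan distance
        field = np.add.outer(np.arange(20), np.arange(20)).astype(float)
        rng = np.random.default_rng(7)
        occ = rng.random((20, 20)) < 0.2
        for step in range(60):
            occ[0, 0] = False          # pedestrian at the exit leaves the hall
            ca_step(occ, field, rng)
        print("pedestrians remaining:", int(occ.sum()))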

  15. Cardiovascular Events in Systemic Lupus Erythematosus

    PubMed Central

    Fernández-Nebro, Antonio; Rúa-Figueroa, Íñigo; López-Longo, Francisco J.; Galindo-Izquierdo, María; Calvo-Alén, Jaime; Olivé-Marqués, Alejandro; Ordóñez-Cañizares, Carmen; Martín-Martínez, María A.; Blanco, Ricardo; Melero-González, Rafael; Ibáñez-Rúan, Jesús; Bernal-Vidal, José Antonio; Tomero-Muriel, Eva; Uriarte-Isacelaya, Esther; Horcada-Rubio, Loreto; Freire-González, Mercedes; Narváez, Javier; Boteanu, Alina L.; Santos-Soler, Gregorio; Andreu, José L.; Pego-Reigosa, José M.

    2015-01-01

    This article estimates the frequency of cardiovascular (CV) events that occurred after diagnosis in a large Spanish cohort of patients with systemic lupus erythematosus (SLE) and investigates the main risk factors for atherosclerosis. RELESSER is a nationwide multicenter, hospital-based registry of SLE patients. This is a cross-sectional study. Demographic and clinical variables, the presence of traditional risk factors, and CV events were collected. A CV event was defined as a myocardial infarction, angina, stroke, and/or peripheral artery disease. Multiple logistic regression analysis was performed to investigate the possible risk factors for atherosclerosis. From 2011 to 2012, 3658 SLE patients were enrolled. Of these, 374 (10.9%) patients suffered at least one CV event. In 269 (7.4%) patients, the CV events occurred after SLE diagnosis (86.2% women, median [interquartile range] age 54.9 years [43.2–66.1], and SLE duration of 212.0 months [120.8–289.0]). Strokes (5.7%) were the most frequent CV event, followed by ischemic heart disease (3.8%) and peripheral artery disease (2.2%). Multivariate analysis identified age (odds ratio [95% confidence interval], 1.03 [1.02–1.04]), hypertension (1.71 [1.20–2.44]), smoking (1.48 [1.06–2.07]), diabetes (2.2 [1.32–3.74]), dyslipidemia (2.18 [1.54–3.09]), neurolupus (2.42 [1.56–3.75]), valvulopathy (2.44 [1.34–4.26]), serositis (1.54 [1.09–2.18]), antiphospholipid antibodies (1.57 [1.13–2.17]), low complement (1.81 [1.12–2.93]), and azathioprine (1.47 [1.04–2.07]) as risk factors for CV events. We have confirmed that SLE patients suffer a high prevalence of premature CV disease. Both traditional and nontraditional risk factors contribute to this higher prevalence. Although it needs to be verified with future studies, our study also shows, for the first time, an association between diabetes and CV events in SLE patients. PMID:26200625

  16. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Licensee event report system. 50.73 Section 50.73 Energy..., Records, Reports, Notifications § 50.73 Licensee event report system. (a) Reportable events.(1) The holder... (licensee) shall submit a Licensee Event Report (LER) for any event of the type described in this...

  17. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Licensee event report system. 50.73 Section 50.73 Energy..., Records, Reports, Notifications § 50.73 Licensee event report system. (a) Reportable events.(1) The holder... (licensee) shall submit a Licensee Event Report (LER) for any event of the type described in this...

  18. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Licensee event report system. 50.73 Section 50.73 Energy..., Records, Reports, Notifications § 50.73 Licensee event report system. (a) Reportable events.(1) The holder... (licensee) shall submit a Licensee Event Report (LER) for any event of the type described in this...

  19. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Licensee event report system. 50.73 Section 50.73 Energy..., Records, Reports, Notifications § 50.73 Licensee event report system. (a) Reportable events.(1) The holder... (licensee) shall submit a Licensee Event Report (LER) for any event of the type described in this...

  20. 10 CFR 50.73 - Licensee event report system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Licensee event report system. 50.73 Section 50.73 Energy..., Records, Reports, Notifications § 50.73 Licensee event report system. (a) Reportable events.(1) The holder... (licensee) shall submit a Licensee Event Report (LER) for any event of the type described in this...

  1. Weather Climate Interactions and Extreme Events in the Climate System

    NASA Astrophysics Data System (ADS)

    Roundy, P. E.

    2015-12-01

    The most pronounced local impacts of climate change would occur in association with extreme weather events superimposed on the altered climate. Thus a major thrust of recent efforts in the climate community has been to assess how extreme regional events such as cold air outbreaks, heat waves, tropical cyclones, floods, droughts, and severe weather might change with the climate. Many of these types of events are poorly simulated in climate models because of insufficient spatial resolution and insufficiently accurate parameterization of sub-grid-scale convection and radiation processes. This talk summarizes selected examples of how weather and climate events can be interconnected so that the physics of natural climate and weather phenomena depend on each other, thereby complicating our ability to simulate extreme events. A major focus is the Madden-Julian oscillation (MJO), which is associated with alternating eastward-moving planetary-scale regions of enhanced and suppressed moist deep convection favoring warm-pool regions in the tropics. The MJO modulates weather events around the world and influences the evolution of interannual climate variability. We first discuss how the MJO evolves together with the seasonal cycle, the El Niño/Southern Oscillation (ENSO), and the extratropical circulation, then continue with a case study illustrating how El Niño is intrinsically coupled to intraseasonal and synoptic weather events such as the MJO and westerly wind bursts. This interconnectedness in the system implies that modeling many types of regional extreme weather events requires more than simply downscaling coarse climate model signals to nested regional models, because extreme outcomes in a region can depend on poorly simulated extreme weather in distant parts of the world. The authors hope that an improved understanding of these types of interactions between signals across scales of time and space will ultimately yield

  2. Simulating and Forecasting Flooding Events in the City of Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Ghostine, Rabih; Viswanadhapalli, Yesubabu; Hoteit, Ibrahim

    2014-05-01

    Metropolitan cities in the Kingdom of Saudi Arabia, such as Jeddah and Riyadh, are experiencing flooding events more frequently, caused by strong convective storms that produce intense precipitation over a short span of time. The flooding in the city of Jeddah in November 2009 was described by civil defense officials as the worst in 27 years. As of January 2010, 150 people were reported killed and more than 350 were missing. Another flooding event, less damaging but comparably spectacular, occurred one year later (January 2011) in Jeddah. Anticipating floods before they occur could minimize human and economic losses through the implementation of appropriate protection, provision and rescue plans. We have developed a coupled hydro-meteorological model for simulating and predicting flooding events in the city of Jeddah. We use the Weather Research and Forecasting (WRF) model, assimilating all available data in the Jeddah region, to simulate the storm events. The resulting rain is then used at 10-minute intervals to feed an advanced numerical shallow-water model that has been discretized on an unstructured grid using different numerical schemes based on finite element or finite volume techniques. The model was integrated on a high-resolution grid with cell sizes varying between 0.5 m within the streets of Jeddah and 500 m outside the city. This contribution will present the flooding simulation system and the simulation results, focusing on the effect of the different numerical schemes on the system's performance in terms of accuracy and computational efficiency.

  3. Constraints on Cumulus Parameterization from Simulations of Observed MJO Events

    NASA Technical Reports Server (NTRS)

    Del Genio, Anthony; Wu, Jingbo; Wolf, Audrey B.; Chen, Yonghua; Yao, Mao-Sung; Kim, Daehyun

    2015-01-01

    Two recent activities offer an opportunity to test general circulation model (GCM) convection and its interaction with large-scale dynamics for observed Madden-Julian oscillation (MJO) events. This study evaluates the sensitivity of the Goddard Institute for Space Studies (GISS) GCM to entrainment, rain evaporation, downdrafts, and cold pools. Single Column Model versions that restrict weakly entraining convection produce the most realistic dependence of convection depth on column water vapor (CWV) during the Atmospheric Radiation Measurement MJO Investigation Experiment at Gan Island. Differences among models are primarily at intermediate CWV where the transition from shallow to deeper convection occurs. GCM 20-day hindcasts during the Year of Tropical Convection that best capture the shallow–deep transition also produce strong MJOs, with significant predictability compared to Tropical Rainfall Measuring Mission data. The dry anomaly east of the disturbance on hindcast day 1 is a good predictor of MJO onset and evolution. Initial CWV there is near the shallow–deep transition point, implicating premature onset of deep convection as a predictor of a poor MJO simulation. Convection weakly moistens the dry region in good MJO simulations in the first week; weakening of large-scale subsidence over this time may also affect MJO onset. Longwave radiation anomalies are weakest in the worst model version, consistent with previous analyses of cloud/moisture greenhouse enhancement as the primary MJO energy source. The authors’ results suggest that both cloud-/moisture-radiative interactions and convection–moisture sensitivity are required to produce a successful MJO simulation.

  4. Characteristics of rainfall events in regional climate model simulations for the Czech Republic

    NASA Astrophysics Data System (ADS)

    Svoboda, Vojtěch; Hanel, Martin; Máca, Petr; Kyselý, Jan

    2017-02-01

    Characteristics of rainfall events in an ensemble of 23 regional climate model (RCM) simulations are evaluated against observed data in the Czech Republic for the period 1981-2000. Individual rainfall events are identified using the concept of minimum inter-event time (MIT) and only heavy events (15 % of events with the largest event depths) during the warm season (May-September) are considered. Inasmuch as an RCM grid box represents a spatial average, the effects of areal averaging of rainfall data on characteristics of events are investigated using the observed data. Rainfall events from the RCM simulations are then compared to those from the at-site and area-average observations. Simulated number of heavy events and seasonal total precipitation due to heavy events are on average represented relatively well despite the higher spatial variation compared to observations. RCM-simulated event depths are comparable to the area-average observations, while event durations are overestimated and other characteristics related to rainfall intensity are significantly underestimated. The differences between RCM-simulated and at-site observed rainfall event characteristics are in general dominated by the biases of the climate models rather than the areal-averaging effect. Most of the rainfall event characteristics in the majority of the RCM simulations show a similar altitude-dependence pattern as in the observed data. The number of heavy events and seasonal total precipitation due to heavy events increase with altitude, and this dependence is captured better by the RCM simulations with higher spatial resolution.
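
    [Editor's sketch] The minimum inter-event time concept is simple to state in code: any dry gap of at least MIT hours separates two events. The tick data and the MIT value below are invented for illustration; selecting the heavy events (the 15 % with the largest depths) would be a one-line sort on the resulting totals.

        def split_events(times, depths, mit):
            """Group rainfall ticks into events: a gap >= 'mit' hours starts a
            new event. 'times' are tick times in hours, 'depths' mm per tick."""
            events, current = [], [(times[0], depths[0])]
            for t, d in zip(times[1:], depths[1:]):
                if t - current[-1][0] >= mit:
                    events.append(current)
                    current = []
                current.append((t, d))
            events.append(current)
            return events

        times = [0.0, 0.5, 1.0, 9.0, 9.5, 30.0]   # hypothetical wet ticks (h)
        depths = [2.0, 5.0, 1.0, 3.0, 4.0, 6.0]   # mm per tick
        for ev in split_events(times, depths, mit=6.0):
            total = sum(d for _, d in ev)
            duration = ev[-1][0] - ev[0][0]
            print(f"event depth={total:4.1f} mm  duration={duration:4.1f} h")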

  5. WCEDS: A waveform correlation event detection system

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.; Astiz, L.; Shearer, P.M.

    1995-08-01

    We have developed a working prototype of a grid-based global event detection system based on waveform correlation. The algorithm comes from a long-period detector, but we have recast it in a full matrix formulation which can reduce the number of multiplications needed by better than two orders of magnitude for realistic monitoring scenarios. The reduction is made possible by eliminating redundant multiplications in the original formulation. All unique correlations for a given origin time are stored in a correlation matrix (C), which is formed by a full matrix product of a Master Image matrix (M) and a data matrix (D). The detector value at each grid point is calculated by following a different summation path through the correlation matrix. Master Images can be derived either empirically or synthetically. Our testing has used synthetic Master Images because their influence on the detector is easier to understand. We tested the system using the matrix formulation with continuous data from the IRIS (Incorporated Research Institutions for Seismology) broadband global network to monitor a 2-degree evenly spaced surface grid with a time discretization of 1 sps; we successfully detected the largest event in a two-hour segment from October 1993. The output at the correct gridpoint was at least 33% larger than at adjacent grid points, and the output at the correct gridpoint at the correct origin time was more than 500% larger than the output at the same gridpoint immediately before or after. Analysis of the C matrix for the origin time of the event demonstrates that there are many significant "false" correlations of observed phases with incorrect predicted phases. These false correlations dull the sensitivity of the detector and so must be dealt with if our system is to attain detection thresholds consistent with a Comprehensive Test Ban Treaty (CTBT).
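
    [Editor's sketch] The matrix formulation can be sketched in a few lines: one matrix product yields every unique phase-versus-channel correlation for a trial origin time, and each grid point's detector value is a sum along its own path through that matrix. The dimensions and the random travel-time selection below are placeholders, not the 2-degree global grid of the prototype.

        import numpy as np

        rng = np.random.default_rng(0)
        n_phases, n_window, n_chan = 6, 50, 30   # hypothetical sizes

        # M: one row per predicted phase template (the Master Image);
        # D: one column per station channel, for a single trial origin time.
        M = rng.standard_normal((n_phases, n_window))
        D = rng.standard_normal((n_window, n_chan))

        # Every unique phase-vs-channel correlation in one matrix product:
        C = M @ D                                # shape (n_phases, n_chan)

        def detector_value(C, chan_of):
            """Sum the C entries a grid point's travel-time predictions select:
            phase p is expected on channel chan_of[p]."""
            return sum(C[p, chan_of[p]] for p in range(C.shape[0]))

        chan_of = rng.integers(0, n_chan, size=n_phases)   # one grid point's path
        print("detector output:", detector_value(C, chan_of))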

  6. Coupling expert systems and simulation

    NASA Technical Reports Server (NTRS)

    Kawamura, K.; Beale, G.; Padalkar, S.; Rodriguez-Moscoso, J.; Hsieh, B. J.; Vinz, F.; Fernandez, K. R.

    1988-01-01

    A prototype coupled system called NESS (NASA Expert Simulation System) is described. NESS assists the user in running digital simulations of dynamic systems, interprets the output data to performance specifications, and recommends a suitable series compensator to be added to the simulation model.

  7. Production of Nitrogen Oxides by Laboratory Simulated Transient Luminous Events

    NASA Astrophysics Data System (ADS)

    Peterson, H.; Bailey, M.; Hallett, J.; Beasley, W.

    2007-12-01

    Restoration of the polar stratospheric ozone layer has occurred at rates below those originally expected following reductions in chlorofluorocarbon (CFC) usage. Additional reactions affecting ozone depletion now must also be considered. This research examines nitrogen oxides (NOx) produced in the middle atmosphere by transient luminous events (TLEs), with NOx production in this layer contributing to the loss of stratospheric ozone. In particular, NOx produced by sprites in the mesosphere would be transported to the polar stratosphere via the global meridional circulation and downward diffusion. A pressure-controlled vacuum chamber was used to simulate middle atmosphere pressures, while a power supply and in-chamber electrodes were used to simulate TLEs in the pressure-controlled environment. Chemiluminescence NOx analyzers were used to sample NOx produced by the chamber discharges: originally a Monitor Labs Model 8440E, and later a Thermo Environment Model 42. Total NOx production for each discharge, as well as NOx per ampere of current and NOx per Joule of discharge energy, were plotted. Absolute NOx production was greatest for discharge environments with upper tropospheric pressures (100-380 torr), while NOx/J was greatest for discharge environments with stratospheric pressures (around 10 torr). The different production efficiencies in NOx/J as a function of pressure pointed to three different production regimes, each with its own reaction mechanisms: one for tropospheric pressures, one for stratospheric pressures, and one for upper stratospheric to mesospheric pressures (no greater than 1 torr).

  8. Transportation Analysis Simulation System

    SciTech Connect

    2004-08-23

    TRANSIMS version 3.1 is an integrated set of analytical and simulation models and supporting databases. The system is designed to create a virtual metropolitan region with representation of each of the region's individuals, their activities, and the transportation infrastructure they use. TRANSIMS puts into practice a new, disaggregate approach to travel demand modeling using agent-based micro-simulation technology. The TRANSIMS methodology creates a virtual metropolitan region with representation of the transportation infrastructure and the population, at the level of households and individual travelers. Trips are planned to satisfy the population's activity patterns at the individual traveler level. TRANSIMS then simulates the movement of travelers and vehicles across the transportation network using multiple modes, including car, transit, bike and walk, on a second-by-second basis. Metropolitan planners must plan the growth of their cities according to the stringent transportation system planning requirements of the Intermodal Surface Transportation Efficiency Act of 1991, the Clean Air Act Amendments of 1990, and other similar laws and regulations. These require each state and its metropolitan regions to work together to develop short- and long-term transportation improvement plans. The plans must (1) estimate the future transportation needs for travelers and goods movements, (2) evaluate ways to manage and reduce congestion, (3) examine the effectiveness of building new roads and transit systems, and (4) limit the environmental impact of the various strategies. The needed consistent and accurate transportation improvement plans require an analytical capability that properly accounts for travel demand, human behavior, traffic and transit operations, major investments, and environmental effects. Other existing planning tools use aggregated information and representative behavior to predict average response and average use of transportation facilities. They do not account

  9. Improved transition path sampling methods for simulation of rare events.

    PubMed

    Chopra, Manan; Malshe, Rohit; Reddy, Allam S; de Pablo, J J

    2008-04-14

    The free energy surfaces of a wide variety of systems encountered in physics, chemistry, and biology are characterized by the existence of deep minima separated by numerous barriers. One of the central aims of recent research in computational chemistry and physics has been to determine how transitions occur between deep local minima on rugged free energy landscapes, and transition path sampling (TPS) Monte-Carlo methods have emerged as an effective means for numerical investigation of such transitions. Many of the shortcomings of TPS-like approaches generally stem from their high computational demands. Two new algorithms are presented in this work that improve the efficiency of TPS simulations. The first algorithm uses biased shooting moves to render the sampling of reactive trajectories more efficient. The second algorithm is shown to substantially improve the accuracy of the transition state ensemble by introducing a subset of local transition path simulations in the transition state. The system considered in this work consists of a two-dimensional rough energy surface that is representative of numerous systems encountered in applications. When taken together, these algorithms provide gains in efficiency of over two orders of magnitude when compared to traditional TPS simulations.
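
    [Editor's sketch] A compact sketch of a shooting move, the TPS ingredient both new algorithms modify, may be useful. It uses one-way (forward) shooting for overdamped Langevin dynamics on a 1-D double well rather than the paper's two-dimensional rough surface; the potential, temperature, and path length are illustrative only, and the acceptance test simply checks that the regenerated path still connects the two basins.

        import math, random

        # Double-well potential V(x) = (x^2 - 1)^2; basin A: x < -0.8, B: x > 0.8
        def force(x):
            return -4.0 * x * (x * x - 1.0)

        DT, BETA, N = 0.001, 3.0, 4000
        SIGMA = math.sqrt(2.0 * DT / BETA)   # Euler-Maruyama noise amplitude

        def propagate(x, steps, rng):
            traj = [x]
            for _ in range(steps):
                x += DT * force(x) + SIGMA * rng.gauss(0.0, 1.0)
                traj.append(x)
            return traj

        def one_way_shot(path, rng):
            """One shooting move: keep the history before a random slice,
            regenerate the future with fresh noise, accept if still reactive.
            Biased shooting schemes pick the slice non-uniformly to raise
            this acceptance rate."""
            i = rng.randrange(1, len(path) - 1)
            new = path[:i + 1] + propagate(path[i], len(path) - 1 - i, rng)[1:]
            return (new, True) if new[-1] > 0.8 else (path, False)

        rng = random.Random(11)
        # Brute-force an initial reactive path from the barrier top
        path = None
        while path is None or not (path[0] < -0.8 and path[-1] > 0.8):
            half = propagate(0.0, N // 2, rng)
            path = half[::-1] + propagate(0.0, N // 2, rng)[1:]
        acc = 0
        for _ in range(200):
            path, ok = one_way_shot(path, rng)
            acc += ok
        print("shooting-move acceptance rate:", acc / 200)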

  10. Features, Events, and Processes: system Level

    SciTech Connect

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  11. Simulating neural systems with Xyce.

    SciTech Connect

    Schiek, Richard Louis; Thornquist, Heidi K.; Mei, Ting; Warrender, Christina E.; Aimone, James Bradley; Teeter, Corinne; Duda, Alex M.

    2012-12-01

    Sandia's parallel circuit simulator, Xyce, can address large-scale neuron simulations in a new way, extending the range within which one can perform high-fidelity, multi-compartment neuron simulations. This report documents the implementation of neuron devices in Xyce, and their use in the simulation and analysis of neuron systems.

  12. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS AND PROCESSES

    SciTech Connect

    Jaros, W.

    2005-08-30

    The purpose of this report is to evaluate and document the inclusion or exclusion of engineered barrier system (EBS) features, events, and processes (FEPs) with respect to models and analyses used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for exclusion screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d, e, and f) [DIRS 173273]. The FEPs addressed in this report deal with those features, events, and processes relevant to the EBS, focusing mainly on those components and conditions exterior to the waste package and within the rock mass surrounding emplacement drifts. The components of the EBS are the drip shield, waste package, waste form, cladding, emplacement pallet, emplacement drift excavated opening (also referred to as drift opening in this report), and invert. FEPs specific to the waste package, cladding, and drip shield are addressed in separate FEP reports: for example, "Screening of Features, Events, and Processes in Drip Shield and Waste Package Degradation" (BSC 2005 [DIRS 174995]), "Clad Degradation--FEPs Screening Arguments" (BSC 2004 [DIRS 170019]), and "Waste-Form Features, Events, and Processes" (BSC 2004 [DIRS 170020]). For included FEPs, this report summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). This report also documents changes to the EBS FEPs list that have occurred since the previous versions of this report. These changes have resulted from a reevaluation of the FEPs for TSPA-LA as identified in Section 1.2 of this report and described in more detail in Section 6.1.1. This revision addresses updates in Yucca Mountain Project (YMP) administrative procedures as they

  13. Design and implementation of a distributed Complex Event Processing system

    NASA Astrophysics Data System (ADS)

    Li, Yan; Shang, Yanlei

    2017-01-01

    Making use of the massive numbers of events from sources such as sensors and bank transactions, and extracting valuable information from them, is of significant importance. Complex Event Processing (CEP), a method of detecting complex events from streams of simple events, provides a solution for processing data in real time, quickly and efficiently. However, a single-node CEP system cannot satisfy the requirements of processing massive event streams from a multitude of event sources. This article therefore designs a distributed CEP system, which combines Siddhi, a CEP engine, with Storm, a distributed real-time computation architecture. The system can construct topologies automatically based on the event streams and execution plans provided by users and can process the event streams in parallel. Compared with a single-node CEP system, the distributed system achieves better performance.
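
    [Editor's sketch] A single-node toy rule engine conveys what the CEP layer computes before any distribution concerns arise: match a temporal pattern over an event stream within a time window. The "large withdrawal followed by a new-device login" rule and the event fields below are invented; in the distributed design described in the article, rules like this would run inside Storm bolts, presumably with the stream partitioned by a key such as account ID, though that partitioning scheme is our assumption.

        from collections import deque

        class WindowPattern:
            """Toy complex-event rule: alert when an event matching 'first'
            is followed by one matching 'second' within 'window' seconds."""
            def __init__(self, first, second, window):
                self.first, self.second, self.window = first, second, window
                self.pending = deque()

            def feed(self, event):
                now = event['ts']
                # Expire partial matches that fell out of the time window
                while self.pending and now - self.pending[0]['ts'] > self.window:
                    self.pending.popleft()
                if self.first(event):
                    self.pending.append(event)
                if self.second(event) and self.pending:
                    return {'alert': 'pattern', 'cause': list(self.pending), 'at': now}
                return None

        rule = WindowPattern(
            first=lambda e: e['type'] == 'withdrawal' and e['amount'] > 1000,
            second=lambda e: e['type'] == 'login' and e.get('new_device'),
            window=60.0)

        stream = [
            {'ts': 0.0, 'type': 'withdrawal', 'amount': 5000},
            {'ts': 10.0, 'type': 'login', 'new_device': True},
        ]
        for ev in stream:
            match = rule.feed(ev)
            if match:
                print("complex event detected at t =", match['at'])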

  14. Event communication in a regional disease surveillance system.

    PubMed

    Loschen, Wayne; Coberly, Jacqueline; Sniegoski, Carol; Holtry, Rekha; Sikes, Marvin; Happel Lewis, Sheryl

    2007-10-11

    When real-time disease surveillance is practiced in neighboring states within a region, public health users may benefit from easily sharing their concerns and findings regarding potential health threats. To better understand the need for this capability, an event communications component (ECC) was added to the National Capital Region Disease Surveillance System, an operational biosurveillance system employed in the District of Columbia and in surrounding Maryland and Virginia counties. Through usage analysis and user survey methods, we assessed the value of the enhanced system in daily operational use and during two simulated exercises. Results suggest that the system has utility for regular users of the system as well as suggesting several refinements for future implementations.

  15. Event-based simulation of neutron experiments: interference, entanglement and uncertainty relations

    NASA Astrophysics Data System (ADS)

    Michielsen, Kristel; De Raedt, Hans

    2014-04-01

    We discuss a discrete-event simulation approach, which has been shown to give a unified cause-and-effect description of many quantum optics and single-neutron interferometry experiments. The event-based simulation algorithm does not require the knowledge of the solution of a wave equation of the whole system, yet reproduces the corresponding statistical distributions by generating detection events one-by-one. It is shown that single-particle interference and entanglement, two important quantum phenomena, emerge via information exchange between individual particles and devices such as beam splitters, polarizers and detectors. We demonstrate this by reproducing the results of several single-neutron interferometry experiments, including one that demonstrates interference and one that demonstrates the violation of a Bell-type inequality. We also present event-based simulation results of a single-neutron experiment designed to test the validity of Ozawa's universally valid error-disturbance relation, an uncertainty relation derived using the theory of general quantum measurements.

  16. Sediment transport in grassed swales during simulated runoff events.

    PubMed

    Bäckström, M

    2002-01-01

    Particle trapping in nine different grassed swales was measured successfully with a standardised runoff event simulation procedure. The percentage of total suspended solids removed ranged from 79 to 98%. It was found that sedimentation processes, rather than grass filtration governed the overall particle trapping efficiency. The highest particle trapping efficiency was observed in the field swales with dense, fully developed turf. A high infiltration rate was beneficial for the particle trapping and an increased swale length made it possible for smaller particles to be captured. A densely vegetated, ten metre long swale, receiving a stormwater flow of 1.0 litres per second, may capture a majority of the waterborne particles with settling velocities larger than 0.1 metres per hour. A simple model of particle trapping efficiency in grassed swales was developed and tested. It was found that mean swale residence time could be used as a design parameter for particle removal in grassed swales. The suggested exponential relationship between mean swale residence time and particle settling velocity associated with a certain trapping efficiency is so far only valid for a limited range of swale designs and residence times.

  17. Sensitivity of a Simulated Derecho Event to Model Initial Conditions

    NASA Astrophysics Data System (ADS)

    Wang, Wei

    2014-05-01

    Since 2003, the MMM division at NCAR has been experimenting with cloud-permitting-scale weather forecasting using the Weather Research and Forecasting (WRF) model. Over the years, we have tested different model physics and tried different initial and boundary conditions. Not surprisingly, we found that the model's forecasts are more sensitive to the initial conditions than to the model physics. In the 2012 real-time experiment, WRF-DART (Data Assimilation Research Testbed) at 15 km was employed to produce initial conditions for a twice-a-day forecast at 3 km. On June 29, this forecast system captured one of the most destructive derecho events on record. In this presentation, we will examine forecast sensitivity to different model initial conditions and try to understand the important features that may contribute to the success of the forecast.

  18. A Madden-Julian oscillation event realistically simulated by a global cloud-resolving model.

    PubMed

    Miura, Hiroaki; Satoh, Masaki; Nasuno, Tomoe; Noda, Akira T; Oouchi, Kazuyoshi

    2007-12-14

    A Madden-Julian Oscillation (MJO) is a massive weather event consisting of deep convection coupled with atmospheric circulation, moving slowly eastward over the Indian and Pacific Oceans. Despite its enormous influence on many weather and climate systems worldwide, it has proven very difficult to simulate an MJO because of assumptions about cumulus clouds in global meteorological models. Using a model that allows direct coupling of the atmospheric circulation and clouds, we successfully simulated the slow eastward migration of an MJO event. Topography, the zonal sea surface temperature gradient, and interplay between eastward- and westward-propagating signals controlled the timing of the eastward transition of the convective center. Our results demonstrate the potential for making month-long MJO predictions when global cloud-resolving models with realistic initial conditions are used.

  19. On the nature of medial temporal lobe contributions to the constructive simulation of future events

    PubMed Central

    Schacter, Daniel L.; Addis, Donna Rose

    2009-01-01

    A rapidly growing number of studies indicate that imagining or simulating possible future events depends on much of the same neural machinery as does remembering past events. One especially striking finding is that the medial temporal lobe (MTL), which has long been linked to memory function, appears to be similarly engaged during future event simulation. This paper focuses on the role of two MTL regions—the hippocampus and parahippocampal cortex—in thinking about the future and building mental simulations. PMID:19528005

  20. Simulation of a continuous lignite excavation system

    SciTech Connect

    Michalakopoulos, T.N.; Arvaniti, S.E.; Panagiotou, G.N.

    2005-07-01

    A discrete-event simulation model using the GPSS/H simulation language has been developed for an excavation system at a multi-level terrace mine. The continuous excavation system consists of five bucket wheel excavators and a network of 22 km of belt conveyors. Ways of dealing with the continuous material flow and frequent changes of material type are considered. The principal model output variables are the production and the arrival rate of mineral and waste at the transfer point. Animation and comparison with previous production data have been used to validate the model. 14 refs., 6 figs., 1 tab.

  1. Application of Parallel Discrete Event Simulation to the Space Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jefferson, D.; Leek, J.

    2010-09-01

    In this paper we describe how and why we chose parallel discrete event simulation (PDES) as the paradigm for modeling the Space Surveillance Network (SSN) in our modeling framework, TESSA (Testbed Environment for Space Situational Awareness). DES is a simulation paradigm appropriate for systems dominated by discontinuous state changes at times that must be calculated dynamically. It is used primarily for complex man-made systems like telecommunications, vehicular traffic, computer networks, economic models, etc., although it is also useful for natural systems that are not described by equations, such as particle systems, population dynamics, epidemics, and combat models. It is much less well known than simple time-stepped simulation methods, but has the great advantage of being time scale independent, so that one can freely mix processes that operate at time scales over many orders of magnitude with no runtime performance penalty. In simulating the SSN we model in some detail: (a) the orbital dynamics of up to 10^5 objects, (b) their reflective properties, (c) the ground- and space-based sensor systems in the SSN, (d) the recognition of orbiting objects and determination of their orbits, (e) the cueing and scheduling of sensor observations, (f) the 3-d structure of satellites, and (g) the generation of collision debris. TESSA is thus a mixed continuous-discrete model. But because many different types of discrete objects are involved with such a wide variation in time scale (milliseconds for collisions, hours for orbital periods) it is suitably described using discrete events. The PDES paradigm is surprising and unusual. In any instantaneous runtime snapshot some parts may be far ahead in simulation time while others lag behind, yet the required causal relationships are always maintained and synchronized correctly, exactly as if the simulation were executed sequentially. The TESSA simulator is custom-built, conservatively synchronized, and designed to scale to
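
    The time-scale independence claimed above is easiest to see in a minimal sequential event loop: events fire strictly in time-stamp order, so millisecond-scale and hour-scale processes coexist without a global time step. The sketch below is a generic illustration; the event names are invented and none of TESSA's models are implied.

        import heapq

        def run_des(initial_events, until):
            # Events are (time, label, handler); the heap always yields the
            # earliest pending event, regardless of the scale of its time stamp.
            queue = list(initial_events)
            heapq.heapify(queue)
            while queue:
                time, label, handler = heapq.heappop(queue)
                if time > until:
                    break
                print(f"t = {time:12.3f} s  {label}")
                for ev in handler(time):          # handlers schedule future events
                    heapq.heappush(queue, ev)

        def sensor_scan(t):                       # hour-scale periodic process
            return [(t + 3600.0, "sensor scan", sensor_scan)]

        def debris_collision(t):                  # one-off millisecond-scale event
            return []

        run_des([(0.0, "sensor scan", sensor_scan),
                 (0.001, "collision", debris_collision)], until=7200.0)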

  2. Examining Passenger Flow Choke Points at Airports Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Brown, Jeremy R.; Madhavan, Poornima

    2011-01-01

    The movement of passengers through an airport quickly, safely, and efficiently is the main function of the various checkpoints (check-in, security, etc.) found in airports. Human error combined with other breakdowns in the complex system of the airport can disrupt passenger flow through the airport, leading to lengthy waiting times, missing luggage and missed flights. In this paper we present a model of passenger flow through an airport using discrete event simulation that will provide a closer look into the possible reasons for breakdowns and their implications for passenger flow. The simulation is based on data collected at Norfolk International Airport (ORF). The primary goal of this simulation is to present ways to optimize the work force to keep passenger flow smooth even during peak travel times and for emergency preparedness at ORF in case of adverse events. In this simulation we ran three different scenarios: real world, increased check-in stations, and multiple waiting lines. Increased check-in stations increased waiting time and instantaneous utilization, while the multiple waiting lines decreased both the waiting time and instantaneous utilization. This simulation was able to show how different changes affected the passenger flow through the airport.
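
    The paper's model itself is not published with the abstract; the sketch below shows how such a checkpoint can be expressed with the open-source SimPy discrete event simulation library. The arrival rate, service time and station count are illustrative assumptions, not values measured at ORF.

        import random
        import simpy

        ARRIVAL_RATE = 1 / 30.0   # assumed: one passenger every 30 s on average
        SERVICE_TIME = 90.0       # assumed mean check-in service time (s)
        N_STATIONS = 3            # number of check-in stations in this scenario
        waits = []

        def passenger(env, counters):
            arrived = env.now
            with counters.request() as req:
                yield req                                 # queue for a station
                waits.append(env.now - arrived)           # record waiting time
                yield env.timeout(random.expovariate(1.0 / SERVICE_TIME))

        def source(env, counters):
            while True:
                yield env.timeout(random.expovariate(ARRIVAL_RATE))
                env.process(passenger(env, counters))

        random.seed(42)
        env = simpy.Environment()
        counters = simpy.Resource(env, capacity=N_STATIONS)
        env.process(source(env, counters))
        env.run(until=4 * 3600)                           # a 4-hour travel block
        print(f"mean wait: {sum(waits) / len(waits):.1f} s over {len(waits)} passengers")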

  3. Analysis and Simulations of Space Radiation Induced Single Event Transients

    NASA Astrophysics Data System (ADS)

    Perez, Reinaldo

    2016-05-01

    Spacecraft electronics are affected by the space radiation environment. Among the different types of radiation effects that can affect spacecraft electronics are single event transients. The space environment is responsible for many of the single event transients which can upset the performance of spacecraft avionics hardware. In this paper we first explore the origins of single event transients, then explore the modeling of a single event transient in digital and analog circuits. The paper also addresses the concept of crosstalk that could develop among digital circuits in the presence of an SET. The paper ends with a brief discussion of SET hardening. The goal of the paper is to provide methodologies for assessing single event transients and their effects so that spacecraft avionics engineers can develop either hardware or software countermeasures in their designs.

  4. Regional Climate Simulation of the Anomalous Events of 1998 using a Stretched-Grid GCM with Multiple Areas of Interest

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, M. S.; Takacs, L. L.; Govindaraju, R. C.; Atlas, Robert (Technical Monitor)

    2002-01-01

    The GEOS (Goddard Earth Observing System) stretched-grid (SG) GCM, developed and thoroughly tested over the last few years, is used for simulating the major anomalous regional climate events of 1998. The anomalous regional climate events are simulated simultaneously during the 13-month-long (November 1997 - December 1998) SG-GCM simulation by using the new SG design with multiple (four) areas of interest. The following areas/regions of interest (one in each global quadrant) are implemented: U.S./Northern Mexico, the El-Nino/Brazil area, India-China, and Eastern Indian Ocean/Australia.

  5. Block Oriented Simulation System (BOSS)

    NASA Technical Reports Server (NTRS)

    Ratcliffe, Jaimie

    1988-01-01

    Computer simulation is assuming greater importance as a flexible and expedient approach to modeling system and subsystem behavior. Simulation has played a key role in the growth of complex, multiple access space communications such as those used by the space shuttle and the TRW-built Tracking and Data Relay Satellites (TDRS). A powerful new simulator for use in designing and modeling the communication system of NASA's planned Space Station is being developed. Progress to date on the Block (Diagram) Oriented Simulation System (BOSS) is described.

  6. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    SciTech Connect

    na

    2005-05-30

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objective of this analysis was to develop the BDCFs for the

  7. A Study on Discrete Event Simulation (DES) in a High-Level Architecture (HLA) Networked Simulation

    DTIC Science & Technology

    2010-12-01


  8. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".

  9. Simulating The SSF Information System

    NASA Technical Reports Server (NTRS)

    Deshpande, Govind K.; Kleine, Henry; Younger, Joseph C.; Sanders, Felicia A.; Smith, Jeffrey L.; Aster, Robert W.; Olivieri, Jerry M.; Paul, Lori L.

    1993-01-01

    Freedom Operations Simulation Test (FROST) computer program simulates operation of SSF information system, tracking every packet of data from generation to destination, for both uplinks and downlinks. Collects various statistics concerning operation of system and provides reports of statistics at intervals specified by user. FROST also incorporates graphical-display capability to enhance interpretation of these statistics. Written in SIMSCRIPT II.5.

  10. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism.

    PubMed

    van Beek, Johannes H G M; Supandi, Farahaniza; Gavai, Anand K; de Graaf, Albert A; Binsl, Thomas W; Hettling, Hannes

    2011-11-13

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work; the rest is converted to heat. We calculated heat transfer by conduction and blood flow inside the body, and heat transfer from the skin by radiation, convection and sweat evaporation, resulting in temperature changes in 25 body compartments. We simulated a mountain time trial to Alpe d'Huez during the Tour de France. To approach the time realized by Lance Armstrong in 2004, very high oxygen uptake must be sustained by the simulated cyclist. Temperature was predicted to reach 39°C in the brain, and 39.7°C in leg muscle. In addition to the macroscopic simulation, we analysed the buffering of bursts of high adenosine triphosphate hydrolysis by creatine kinase during cyclical muscle activity at the biochemical pathway level. To investigate the low oxygen to carbohydrate ratio for the brain, which takes up lactate during exercise, we calculated the flux distribution in cerebral energy metabolism. Computational modelling of the human body, describing heat exchange and energy metabolism, makes simulation of endurance sports events feasible.
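
    The 23 per cent figure implies a simple partition of metabolic power into mechanical work and heat. A back-of-the-envelope version, assuming a standard energy equivalent of oxygen and an illustrative oxygen uptake (neither value is taken from the paper):

        GROSS_EFFICIENCY = 0.23     # fraction of metabolic energy used for work
        ENERGY_PER_L_O2 = 20.9e3    # J per litre O2, a standard approximate value

        def power_split(vo2_l_per_min):
            # Split whole-body metabolic power into mechanical work and heat.
            metabolic_w = vo2_l_per_min * ENERGY_PER_L_O2 / 60.0
            mechanical_w = GROSS_EFFICIENCY * metabolic_w
            return metabolic_w, mechanical_w, metabolic_w - mechanical_w

        # Assumed VO2 of 5.5 L/min for an elite climber (illustrative only)
        m, w, q = power_split(5.5)
        print(f"metabolic {m:.0f} W -> mechanical {w:.0f} W, heat {q:.0f} W")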

  11. Towards High Performance Discrete-Event Simulations of Smart Electric Grids

    SciTech Connect

    Perumalla, Kalyan S; Nutaro, James J; Yoginath, Srikanth B

    2011-01-01

    Future electric grid technology is envisioned on the notion of a smart grid in which responsive end-user devices play an integral part of the transmission and distribution control systems. Detailed simulation is often the primary choice in analyzing small network designs, and the only choice in analyzing large-scale electric network designs. Here, we identify and articulate the high-performance computing needs underlying high-resolution discrete event simulation of smart electric grid operation in large network scenarios such as the entire Eastern Interconnect. We focus on the simulator's most computationally intensive operation, namely, the dynamic numerical solution for the electric grid state, for both time-integration and event-detection. We explore solution approaches using general-purpose dense and sparse solvers, and propose a scalable solver specialized for the sparse structures of actual electric networks. Based on experiments with an implementation in the THYME simulator, we identify performance issues and possible solution approaches for smart grid experimentation in the large.

  12. Simulating the physiology of athletes during endurance sports events: modelling human energy conversion and metabolism

    PubMed Central

    van Beek, Johannes H. G. M.; Supandi, Farahaniza; Gavai, Anand K.; de Graaf, Albert A.; Binsl, Thomas W.; Hettling, Hannes

    2011-01-01

    The human physiological system is stressed to its limits during endurance sports competition events. We describe a whole body computational model for energy conversion during bicycle racing. About 23 per cent of the metabolic energy is used for muscle work; the rest is converted to heat. We calculated heat transfer by conduction and blood flow inside the body, and heat transfer from the skin by radiation, convection and sweat evaporation, resulting in temperature changes in 25 body compartments. We simulated a mountain time trial to Alpe d'Huez during the Tour de France. To approach the time realized by Lance Armstrong in 2004, very high oxygen uptake must be sustained by the simulated cyclist. Temperature was predicted to reach 39°C in the brain, and 39.7°C in leg muscle. In addition to the macroscopic simulation, we analysed the buffering of bursts of high adenosine triphosphate hydrolysis by creatine kinase during cyclical muscle activity at the biochemical pathway level. To investigate the low oxygen to carbohydrate ratio for the brain, which takes up lactate during exercise, we calculated the flux distribution in cerebral energy metabolism. Computational modelling of the human body, describing heat exchange and energy metabolism, makes simulation of endurance sports events feasible. PMID:21969677

  13. Parallelized event chain algorithm for dense hard sphere and polymer systems

    SciTech Connect

    Kampmann, Tobias A.; Boltz, Horst-Holger; Kierfeld, Jan

    2015-01-15

    We combine parallelization and cluster Monte Carlo for hard sphere systems and present a parallelized event chain algorithm for the hard disk system in two dimensions. For parallelization we use a spatial partitioning approach into simulation cells. We find that it is crucial for correctness to ensure detailed balance on the level of Monte Carlo sweeps by drawing the starting sphere of event chains within each simulation cell with replacement. We analyze the performance gains for the parallelized event chain and find a criterion for an optimal degree of parallelization. Because of the cluster nature of event chain moves, massive parallelization will not be optimal. Finally, we discuss first applications of the event chain algorithm to dense polymer systems, i.e., bundle-forming solutions of attractive semiflexible polymers.
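
    As a minimal illustration of the event chain idea (here for 1D hard rods on a ring rather than the paper's 2D disks), a straight chain carries a displacement budget forward from rod to rod until it is used up; the starting rod is drawn with replacement, echoing the detailed balance point above. All parameters are illustrative.

        import random

        def event_chain_move(x, L, sigma, ell, rng):
            # One straight event chain for 1D hard rods (length sigma) on a ring
            # of circumference L: rod i moves right until it hits its forward
            # neighbour, then the remaining budget passes to that neighbour.
            i = rng.randrange(len(x))          # start rod, drawn with replacement
            budget = ell
            while budget > 0.0:
                gaps = {j: (x[j] - x[i] - sigma) % L
                        for j in range(len(x)) if j != i}
                nxt = min(gaps, key=gaps.get)  # nearest rod ahead of rod i
                step = min(budget, gaps[nxt])
                x[i] = (x[i] + step) % L
                budget -= step
                i = nxt

        rng = random.Random(1)
        L, sigma, n = 20.0, 1.0, 8
        x = [k * L / n for k in range(n)]      # valid (non-overlapping) start
        for _ in range(1000):
            event_chain_move(x, L, sigma, ell=2.5, rng=rng)
        print([round(v, 2) for v in sorted(x)])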

  14. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  15. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    PubMed

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.

  16. Tutorial in medical decision modeling incorporating waiting lines and queues using discrete event simulation.

    PubMed

    Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter

    2010-01-01

    In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques are used to evaluate systems that include queuing or waiting, for example, discrete event simulation. To include queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers get an understanding of queue characteristics, modeling features, and their strengths. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities, to incorporate waiting lines and queues in the decision-analytic modeling example.
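
    Where a closed-form solution does exist, the standard analytical reference point is the M/M/c queue. A small helper for its classic performance measures follows; the numerical values are illustrative, not taken from the tutorial's coronary intervention example.

        import math

        def mmc_measures(lam, mu, c):
            # Analytical M/M/c queue: probability of waiting (Erlang C),
            # mean queue length Lq, and mean waiting time Wq (Little's law).
            rho = lam / (c * mu)
            assert rho < 1, "queue is unstable"
            a = lam / mu
            tail = a**c / (math.factorial(c) * (1 - rho))
            p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c)) + tail)
            erlang_c = tail * p0
            lq = erlang_c * rho / (1 - rho)
            return erlang_c, lq, lq / lam

        # Illustrative values: 4 patients/h arriving, 1.5 patients/h per lab, 3 labs
        pw, lq, wq = mmc_measures(lam=4.0, mu=1.5, c=3)
        print(f"P(wait)={pw:.2f}, Lq={lq:.2f}, Wq={wq * 60:.1f} min")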

  17. The simulation of a MCS event in the South America using a radiative transfer model

    NASA Astrophysics Data System (ADS)

    Silveira, B. B.; Aravéquia, J. A.

    2011-12-01

    Mesoscale Convective Systems (MCS) play an important role in the total precipitation of some regions of the world. The southeast of South America is one of these regions, because the environment in this area favors the development of MCSs. Satellite imagery is an important data source for identifying and characterizing these systems: in such images MCSs are characterized by low values of brightness temperature (BT), and channel 4 (infrared) of the imager on the GOES-10 satellite is commonly used to identify them. With the objective of identifying an MCS in an atmospheric model 12-h forecast, a simulation of BT for channel 4 of GOES-10 was performed using a radiative transfer model. The MCS event chosen occurred between 9 and 10 November 2008 and reached northern Argentina and Paraguay. This MCS was identified using the outputs of ForTraCC (Forecast and Tracking of Active Convective Cells). The BT simulation was performed with the radiative transfer model CRTM version 2.0.2 (Community Radiative Transfer Model) from the JCSDA (Joint Center for Satellite Data Assimilation), driven by a 12-hour forecast from the ETA model, the operational model of CPTEC/INPE (Centro de Previsão de Tempo e Estudos Climáticos/Instituto Nacional de Pesquisas Espaciais). The ETA model has a 20 x 20 km horizontal resolution and 19 vertical levels. The simulation of BT values with the CRTM indicates the region where the MCS occurred. However, the BT values are overestimated by the CRTM: the simulated values are quantitatively higher than those observed in channel 4 of GOES-10. The area with BT values related to the MCS is smaller than that observed in the satellite image, and the system's shape was also not simulated satisfactorily.

  18. Simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Tentner, A.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed are based on human factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scaleable to take advantage of emerging massively parallel processor (MPP) systems.

  19. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was used, which was developed to reuse models and model elements in other less-detailed models. The DES team continues to innovate and expand

  20. Discrete-event simulation of nuclear-waste transport in geologic sites subject to disruptive events. Final report

    SciTech Connect

    Aggarwal, S.; Ryland, S.; Peck, R.

    1980-06-19

    This report outlines a methodology to study the effects of disruptive events on nuclear waste material in stable geologic sites. The methodology is based upon developing a discrete-event model that can be simulated on the computer. This methodology allows a natural development of simulation models that use computer resources in an efficient manner. Accurate modeling in this area depends in large part upon accurate modeling of ion transport behavior in the storage media. Unfortunately, developments in this area are not at a stage where there is any consensus on proper models for such transport. Consequently, our work is directed primarily towards showing how disruptive events can be properly incorporated in such a model, rather than as a predictive tool at this stage. When and if proper geologic parameters can be determined, then it would be possible to use this as a predictive model. Assumptions and their bases are discussed, and the mathematical and computer model are described.

  1. Event Prediction for Modeling Mental Simulation in Naturalistic Decision Making

    DTIC Science & Technology

    2005-12-01

    identical except for orientation, they are mentally rotated. (adapted from Shepard & Metzler 1971). Mental simulation in the psychological domain...own minds to simulate the psychological causes of others’ behavior, typically by making decisions within a pretended context (Gordon, 2001). Ac

  2. Forward flux sampling-type schemes for simulating rare events: efficiency analysis.

    PubMed

    Allen, Rosalind J; Frenkel, Daan; ten Wolde, Pieter Rein

    2006-05-21

    We analyze the efficiency of several simulation methods which we have recently proposed for calculating rate constants for rare events in stochastic dynamical systems in or out of equilibrium. We derive analytical expressions for the computational cost of using these methods and for the statistical error in the final estimate of the rate constant for a given computational cost. These expressions can be used to determine which method to use for a given problem, to optimize the choice of parameters, and to evaluate the significance of the results obtained. We apply the expressions to the two-dimensional nonequilibrium rare event problem proposed by Maier and Stein [Phys. Rev. E 48, 931 (1993)]. For this problem, our analysis gives accurate quantitative predictions for the computational efficiency of the three methods.
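
    The mechanics of an FFS-type estimate can be shown on a toy system: the rate is the flux through the first interface multiplied by the product of conditional probabilities of reaching each successive interface. The sketch below uses a 1D random walk with drift back toward the initial state; it is a didactic stand-in, not the Maier-Stein problem analyzed in the paper.

        import random

        def ffs_rate(interfaces, n_trials=2000, drift=-0.05, sigma=0.1, seed=0):
            # Toy FFS: estimate P(reach interface i+1 | started at interface i)
            # stage by stage, then multiply. A trial fails when it falls back
            # to the initial basin at x <= 0.
            rng = random.Random(seed)

            def crossed(start, target):
                x = start
                while 0.0 < x < target:
                    x += drift + rng.gauss(0.0, sigma)
                return x >= target

            rate = 1.0                # stand-in for the measured initial flux
            for a, b in zip(interfaces, interfaces[1:]):
                p = sum(crossed(a, b) for _ in range(n_trials)) / n_trials
                print(f"P(cross {b:.1f} | {a:.1f}) = {p:.3f}")
                rate *= p
            return rate

        print(f"relative rate estimate: {ffs_rate([0.5, 1.0, 1.5, 2.0]):.2e}")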

  3. Impulsive events in the evolution of a forced nonlinear system

    SciTech Connect

    Longcope, D.W.; Sudan, R.N.

    1992-03-16

    Long-time numerical solutions of a low-dimensional model of the reduced MHD equations show that, when this system is driven quasistatically, the response is punctuated by impulsive events. The statistics of these events indicate a Poisson process; the frequency of these events scales as ΔE_M^-1, where ΔE_M is the energy released in one event.
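
    The reported statistics are easy to emulate: with Poisson statistics, inter-event waiting times are exponential with a rate proportional to ΔE_M^-1. A short check (the proportionality constant is arbitrary):

        import random

        def event_times(delta_e, t_max, c=1.0, seed=1):
            # Poisson process whose rate scales as 1/delta_e: draw exponential
            # inter-event waiting times until t_max is reached.
            rng = random.Random(seed)
            rate, t, times = c / delta_e, 0.0, []
            while True:
                t += rng.expovariate(rate)
                if t > t_max:
                    return times
                times.append(t)

        for de in (0.5, 1.0, 2.0):
            n = len(event_times(de, t_max=1000.0))
            print(f"dE_M = {de}: {n} events (expected ~{1000.0 / de:.0f})")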

  4. A multiprocessor operating system simulator

    SciTech Connect

    Johnston, G.M.; Campbell, R.H. (Dept. of Computer Science)

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the Choices family of operating systems for loosely and tightly coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  5. A Multiprocessor Operating System Simulator

    NASA Technical Reports Server (NTRS)

    Johnston, Gary M.; Campbell, Roy H.

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall semester of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the 'Choices' family of operating systems for loosely- and tightly-coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  6. Modeling extreme "Carrington-type" space weather events using three-dimensional global MHD simulations

    NASA Astrophysics Data System (ADS)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Kuznetsova, Maria M.; Glocer, Alex

    2014-06-01

    There is a growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure. In the last two decades, significant progress has been made toward the first-principles modeling of space weather events, and three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, thereby playing a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for the modern global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a Dst footprint comparable to the Carrington superstorm of September 1859 based on the estimate by Tsurutani et al. (2003). Results are presented for a simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field on the ground to such extreme driving conditions. The model setup is further tested using input data for an observed space weather event, the Halloween storm of October 2003, to verify the consistency of the MHD model and to draw additional guidance for future work. This extreme space weather MHD model setup is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in ground-based conductor systems such as power transmission grids. Therefore, our ultimate goal is to explore the level of geoelectric fields that can be induced from an assumed storm of the reported magnitude, i.e., Dst ≈ -1600 nT.

  7. Representing Dynamic Social Networks in Discrete Event Social Simulation

    DTIC Science & Technology

    2010-12-01

    applied settings in the areas of marketing and behavior modification programs (exercise adoption, smoking cessation) (Icek Ajzen 2006). The model has an...society. The action choice component of the conceptual model is based on the theory of planned behavior (TPB) (I. Ajzen 1991). The TPB states that an...information networks into military simulations. In Proceedings of the 40th Conference on Winter Simulation. pp. 133–144. Ajzen, I. 1991. The theory of

  8. [The performance of respirator alarms during simulated critical events in CMV/IPPV artificial respiration].

    PubMed

    Bender, H J; Frankenberger, H; Ryll, C; Albrecht, M D

    1993-06-01

    Alarm systems of ventilators enhance detection of possible critical events during artificial ventilation. Due to their significance, in some countries the alarm detection of ventilators is regulated by federal law. Up to now, no recommendations for the adjustment of alarm limits exist and only a few detailed investigations of the accuracy of alarm detection are available. METHODS. The responses of four commercially available ventilators (Servoventilator 900C, Siemens, Inc.; Bennett 7200a, Hoyer, Inc.; Veolar, Hamilton, Inc.; EVITA, Dräger, Inc.) to critical events during artificial ventilation of a test lung were evaluated. We measured the alarm time (the time between event creation and alarm response) for ten different simulated critical events including disconnection, different-sized leaks, failure of the gas supply, and obstruction at different places in the artificial airway. DISCUSSION. All respirators were able to recognise severe critical situations such as hose disconnection, failure of gas supply, and total airway obstruction within a short time (< 15 s). The recognition of small airway leaks was more difficult for the ventilators even when the alarm thresholds were close. The alarm detection of the EVITA (software 10.0 or less) under conditions of partial airway obstruction may be a source of risk for the patient, as the machine continued supplying inspiration with pressure-limited ventilation even when the pressure threshold was reached.

  9. Analytic Perturbation Analysis of Discrete Event Dynamic Systems

    SciTech Connect

    Uryasev, S.

    1994-09-01

    This paper considers a new Analytic Perturbation Analysis (APA) approach for Discrete Event Dynamic Systems (DEDS) with discontinuous sample-path functions with respect to control parameters. The performance functions for DEDS usually are formulated as mathematical expectations, which can be calculated only numerically. APA is based on new analytic formulas for the gradients of expectations of indicator functions; therefore, it is called an analytic perturbation analysis. The gradient of performance function may not coincide with the expectation of a gradient of sample-path function (i.e., the interchange formula for the gradient and expectation sign may not be valid). Estimates of gradients can be obtained with one simulation run of the models.

  10. Stochastic Event Counter for Discrete-Event Systems Under Unreliable Observations

    SciTech Connect

    Tae-Sic Yoo; Humberto E. Garcia

    2008-06-01

    This paper addresses the issue of counting the occurrence of special events in the framework of partially observed discrete-event dynamical systems (DEDS). First, we develop a novel recursive procedure that updates the active counter information state sequentially with available observations. In general, the cardinality of the active counter information state is unbounded, which makes the exact recursion computationally infeasible. To overcome this difficulty, we develop an approximated recursive procedure that regulates and bounds the size of the active counter information state. Using the approximated active counting information state, we give an approximated minimum mean square error (MMSE) counter. The developed algorithms are then applied to count special routing events in a material flow system.
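
    The exact recursion is not reproduced in the abstract, so the following is a simplified Bayesian sketch of the same idea: maintain a probability distribution over the event count, update it with each (possibly missed) observation, and read off the posterior mean as the MMSE estimate. The per-step event and detection probabilities are invented, and false alarms are ignored.

        def update_count_belief(belief, observed, q=0.3, p_detect=0.8):
            # One recursion of an approximate event counter under unreliable
            # observations: q is the per-step event probability, p_detect the
            # detection probability. 'belief' maps counts to probabilities.
            new = {}
            for n, p in belief.items():
                if observed:                   # a detection implies a real event
                    new[n + 1] = new.get(n + 1, 0.0) + p
                else:                          # silence: no event, or a miss
                    new[n] = new.get(n, 0.0) + p * (1 - q)
                    new[n + 1] = new.get(n + 1, 0.0) + p * q * (1 - p_detect)
            z = sum(new.values())              # renormalize the posterior
            return {n: p / z for n, p in new.items()}

        belief = {0: 1.0}
        for obs in [1, 0, 0, 1, 0]:            # a hypothetical observation record
            belief = update_count_belief(belief, obs)
        mmse = sum(n * p for n, p in belief.items())
        print(f"MMSE count estimate: {mmse:.2f}")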

  11. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    NASA Astrophysics Data System (ADS)

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed.

  12. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model.

    PubMed

    Sanzari, Jenine K; Diffenderfer, Eric S; Hagan, Sarah; Billings, Paul C; Gridley, Daila S; Seykora, John T; Kennedy, Ann R; Cengel, Keith A

    2015-07-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed.

  13. Dermatopathology effects of simulated solar particle event radiation exposure in the porcine model

    PubMed Central

    Sanzari, Jenine K.; Diffenderfer, Eric S.; Hagan, Sarah; Billings, Paul C.; Gridley, Daila S.; Seykora, John T.; Kennedy, Ann R.; Cengel, Keith A.

    2015-01-01

    The space environment exposes astronauts to risks of acute and chronic exposure to ionizing radiation. Of particular concern is possible exposure to ionizing radiation from a solar particle event (SPE). During an SPE, magnetic disturbances in specific regions of the Sun result in the release of intense bursts of ionizing radiation, primarily consisting of protons that have a highly variable energy spectrum. Thus, SPE events can lead to significant total body radiation exposures to astronauts in space vehicles and especially while performing extravehicular activities. Simulated energy profiles suggest that SPE radiation exposures are likely to be highest in the skin. In the current report, we have used our established miniature pig model system to evaluate the skin toxicity of simulated SPE radiation exposures that closely resemble the energy and fluence profile of the September 1989 SPE using either conventional radiation (electrons) or proton simulated SPE radiation. Exposure of animals to electron or proton radiation led to dose-dependent increases in epidermal pigmentation, the presence of necrotic keratinocytes at the dermal-epidermal boundary and pigment incontinence, manifested by the presence of melanophages in the dermis upon histological examination. We also observed epidermal hyperplasia and a reduction in vascular density at 30 days following exposure to electron or proton simulated SPE radiation. These results suggest that the doses of electron or proton simulated SPE radiation result in significant skin toxicity that is quantitatively and qualitatively similar. Radiation-induced skin damage is often one of the first clinical signs of both acute and non-acute radiation injury, where infection may occur if not treated. In this report, histopathology analyses of acute radiation-induced skin injury are discussed. PMID:26256624

  14. Model Learning for Probabilistic Simulation on Rare Events and Scenarios

    DTIC Science & Technology

    2015-03-06

    rarely contain rare events, applied it to a rainfall flood risk analysis of the Chikugo river, Japan, and showed that it can generate various rainfall...scenario that causes a flood. Rainfall patterns that cause a flood are generated by a Replica Exchange Monte Carlo algorithm, and the covariate shift phenomenon was corrected by placing more weight on the flood region. This work gives a general framework to cope with the problem of handling a complex and

  15. First-Principles Simulations of Violent Space-Weather Events

    DTIC Science & Technology

    2008-01-01

    charged gases (plasmas) that comprise its ionosphere and magnetosphere. These changes are driven, for the most part, by fluctuations in the flow of...morphological magnetic-field and plasma signatures. Our results also exhibit homology (repeated similar events originating from a single source), prompt...magnetic field and plasma from the Sun—the solar wind. The effects of these changes can include direct damage by energetic particles to orbiting

  16. Evaluating the Capability of Information Technology to Prevent Adverse Drug Events: A Computer Simulation Approach

    PubMed Central

    Anderson, James G.; Jay, Stephen J.; Anderson, Marilyn; Hunt, Thaddeus J.

    2002-01-01

    Background: The annual cost of morbidity and mortality due to medication errors in the U.S. has been estimated at $76.6 billion. Information technology implemented systematically has the potential to significantly reduce medication errors that result in adverse drug events (ADEs). Objective: To develop a computer simulation model that can be used to evaluate the effectiveness of information technology applications designed to detect and prevent medication errors that result in ADEs. Methods: A computer simulation model was constructed representing the medication delivery system in a hospital. STELLA, a continuous simulation software package, was used to construct the model. Parameters of the model were estimated from a study of prescription errors on two hospital medical/surgical units and used in the baseline simulation. Five prevention strategies were simulated based on information obtained from the literature. Results: The model simulates the four stages of the medication delivery system: prescribing, transcribing, dispensing, and administering drugs. We simulated interventions that have been demonstrated in prior studies to decrease error rates. The results suggest that an integrated medication delivery system can save up to 1,226 days of excess hospitalization and $1.4 million in associated costs annually in a large hospital. The results of the analyses regarding the effects of the interventions on the additional hospital costs associated with ADEs are somewhat sensitive to the distribution of errors in the hospital, more sensitive to the costs of an ADE, and most sensitive to the proportion of medication errors resulting in ADEs. Conclusions: The results suggest that clinical information systems are potentially a cost-effective means of preventing ADEs in hospitals and demonstrate the importance of viewing medication errors from a systems perspective. Prevention efforts that focus on a single stage of the process had limited impact on the
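
    The STELLA model itself is not reproduced in the abstract; the sketch below captures the underlying systems argument with invented rates: errors enter at each of the four stages, a fixed fraction become ADEs, and an IT intervention applied to a single stage leaves the other stages untouched.

        # All rates below are hypothetical placeholders, not the study's estimates.
        ERROR_RATES = {"prescribe": 0.04, "transcribe": 0.02,
                       "dispense": 0.01, "administer": 0.03}
        P_ADE = 0.10                  # assumed fraction of errors that become ADEs

        def annual_ades(orders_per_day, intervention=None, reduction=0.5):
            # Errors from each stage accumulate; an intervention halves the
            # error rate of exactly one stage.
            errors_per_day = 0.0
            for stage, rate in ERROR_RATES.items():
                eff = rate * (reduction if stage == intervention else 1.0)
                errors_per_day += orders_per_day * eff
            return errors_per_day * 365 * P_ADE

        baseline = annual_ades(1000)
        cpoe = annual_ades(1000, intervention="prescribe")
        print(f"baseline ADEs/yr: {baseline:.0f}, with prescribing IT: {cpoe:.0f}")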

  17. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    PubMed Central

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, usually only individual sensors are located at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, and it is desirable to reduce the number of commutations of the control signals from both security and economic points of view. Therefore, in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597
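
    A common way to realize such event-based control over a wireless sensor network is send-on-delta sampling: the node transmits, and the controller recomputes its action, only when the measurement deviates sufficiently from the last transmitted value. A minimal sketch with invented greenhouse data:

        def send_on_delta(samples, delta):
            # Event-based sampling: transmit only when the measurement has moved
            # more than 'delta' from the last transmitted value, reducing both
            # radio traffic and actuator commutations.
            last, events = None, []
            for t, y in samples:
                if last is None or abs(y - last) > delta:
                    events.append((t, y))    # transmit + recompute control action
                    last = y
            return events

        # Hypothetical greenhouse temperatures sampled every minute: a slow
        # drift plus an occasional disturbance spike.
        temps = [(t, 20.0 + 0.05 * t + (0.3 if t % 7 == 0 else 0.0))
                 for t in range(60)]
        events = send_on_delta(temps, delta=0.5)
        print(f"{len(events)} control events instead of {len(temps)} periodic updates")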

  18. MCNP6. Simulating Correlated Data in Fission Events

    SciTech Connect

    Rising, Michael Evan; Sood, Avneet

    2015-12-03

    This report is a series of slides discussing the MCNP6 code and its status in simulating fission. Applications of interest include global security and nuclear nonproliferation, detection of special nuclear material (SNM), passive and active interrogation techniques, and coincident neutron and photon leakage.

  19. Simulating Underbelly Blast Events using Abaqus/Explicit - CEL

    DTIC Science & Technology

    2013-01-15

    simplified hybrid elastic-plastic material model for geologic materials developed by the U.S. Army – ERDC was implemented as a VUMAT and used to describe the soil. The simulations agree favorably with the test results and produce higher fidelity solutions than traditional

  20. A Summary of Some Discrete-Event System Control Problems

    NASA Astrophysics Data System (ADS)

    Rudie, Karen

    A summary of the area of control of discrete-event systems is given. In this research area, automata and formal language theory are used as tools to model physical problems that arise in technological and industrial systems. The key ingredients of discrete-event control problems are a process that can be modeled by an automaton, events in that process that cannot be disabled or prevented from occurring, and a controlling agent that manipulates the events that can be disabled to guarantee that the process under control generates either all the strings in some prescribed language or as many strings as possible in some prescribed language. When multiple controlling agents act on a process, decentralized control problems arise. In decentralized discrete-event systems, it is presumed that the agents effecting control cannot each see all event occurrences. Partial observation leads to some problems that cannot be solved in polynomial time and some others that are not even decidable.
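
    The controllable/uncontrollable distinction is concrete enough to sketch in a few lines. Below, a toy machine automaton (states and events invented for illustration) is restricted by a supervisor that can only remove controllable events from the enabled set:

        TRANSITIONS = {                 # automaton: state -> {event: next state}
            "idle": {"start": "busy"},
            "busy": {"finish": "idle", "break": "down"},   # 'break' is uncontrollable
            "down": {"repair": "idle"},
        }
        CONTROLLABLE = {"start", "repair"}

        def enabled(state, disabled):
            # Physically possible events minus supervisor-disabled ones; the
            # supervisor can never remove an uncontrollable event.
            return {e for e in TRANSITIONS[state]
                    if e not in (disabled & CONTROLLABLE)}

        print(enabled("idle", {"start"}))   # set(): 'start' can be disabled
        print(enabled("busy", {"break"}))   # {'finish', 'break'}: 'break' cannot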

  1. Repetition-Related Reductions in Neural Activity during Emotional Simulations of Future Events

    PubMed Central

    2015-01-01

    Simulations of future experiences are often emotionally arousing, and the tendency to repeatedly simulate negative future outcomes has been identified as a predictor of the onset of symptoms of anxiety. Nonetheless, next to nothing is known about how the healthy human brain processes repeated simulations of emotional future events. In this study, we present a paradigm that can be used to study repeated simulations of the emotional future in a manner that overcomes phenomenological confounds between positive and negative events. The results show that pulvinar nucleus and orbitofrontal cortex respectively demonstrate selective reductions in neural activity in response to frequently as compared to infrequently repeated simulations of negative and positive future events. Implications for research on repeated simulations of the emotional future in both non-clinical and clinical populations are discussed. PMID:26390294

  2. Modeling and simulation of single-event effect in CMOS circuit

    NASA Astrophysics Data System (ADS)

    Suge, Yue; Xiaolin, Zhang; Yuanfu, Zhao; Lin, Liu; Hanning, Wang

    2015-11-01

    This paper reviews the status of research in modeling and simulation of single-event effects (SEE) in digital devices and integrated circuits. After a brief historical overview of SEE simulation, simulation approaches at different levels are detailed. At the material level, physical simulation of the two primary mechanisms by which ionizing radiation releases charge in a semiconductor device (direct ionization and indirect ionization) is introduced. At the device level, the focus is on the main emerging physical phenomena affecting nanometer devices (the bipolar transistor effect and the charge sharing effect) and the methods envisaged for taking them into account. At the circuit level, methods for predicting the single-event response, i.e., the production and propagation of single-event transients (SETs) in sequential and combinatorial logic, are detailed, and soft error rate trends with scaling are particularly addressed.

  3. LCG MCDB—a knowledgebase of Monte-Carlo simulated events

    NASA Astrophysics Data System (ADS)

    Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.

    2008-02-01

    In this paper we report on LCG Monte-Carlo Data Base (MCDB) and the software which has been developed to operate MCDB. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC Collaborations by experts. In many cases, the modern Monte-Carlo simulation of physical processes requires expert knowledge in Monte-Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make the sophisticated MC event samples available for various physical groups. All the data from MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.

    Program summary
    Program title: LCG Monte-Carlo Data Base
    Catalogue identifier: ADZX_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public Licence
    No. of lines in distributed program, including test data, etc.: 30 129
    No. of bytes in distributed program, including test data, etc.: 216 943
    Distribution format: tar.gz
    Programming language: Perl
    Computer: CPU: Intel Pentium 4, RAM: 1 Gb, HDD: 100 Gb
    Operating system: Scientific Linux CERN 3/4
    RAM: 1 073 741 824 bytes (1 Gb)
    Classification: 9
    External routines: perl >= 5.8.5; Perl modules DBD-mysql >= 2.9004, File::Basename, GD::SecurityImage, GD::SecurityImage::AC, Linux::Statistics, XML::LibXML > 1.6, XML::SAX, XML::NamespaceSupport; Apache HTTP Server >= 2.0.59; mod auth external >= 2.2.9; edg-utils-system RPM package; gd >= 2.0.28; rpm package CASTOR-client >= 2.1.2-4; arc-server (optional)
    Nature of problem: Often, different groups of experimentalists prepare similar samples of particle collision events or turn to the same group of authors of Monte-Carlo (MC

  4. Mesoscale Simulations of a Wind Ramping Event for Wind Energy Prediction

    SciTech Connect

    Rhodes, M; Lundquist, J K

    2011-09-21

    Ramping events, or rapid changes of wind speed and wind direction over a short period of time, present challenges to power grid operators in regions with significant penetrations of wind energy in the power grid portfolio. Improved predictions of wind power availability require adequate predictions of the timing of ramping events. For the ramping event investigated here, the Weather Research and Forecasting (WRF) model was run at three horizontal resolutions in 'mesoscale' mode: 8100m, 2700m, and 900m. Two Planetary Boundary Layer (PBL) schemes, the Yonsei University (YSU) and Mellor-Yamada-Janjic (MYJ) schemes, were run at each resolution as well. Simulations were not 'tuned' with nuanced choices of vertical resolution or tuning parameters so that these simulations may be considered 'out-of-the-box' tests of a numerical weather prediction code. Simulations are compared with sodar observations during a wind ramping event at a 'West Coast North America' wind farm. Despite differences in the boundary-layer schemes, no significant differences were observed in the abilities of the schemes to capture the timing of the ramping event. As collaborators have identified, the boundary conditions of these simulations probably dominate the physics of the simulations. They suggest that future investigations into characterization of ramping events employ ensembles of simulations, and that the ensembles include variations of boundary conditions. Furthermore, the failure of these simulations to capture not only the timing of the ramping event but the shape of the wind profile during the ramping event (regardless of its timing) indicates that the set-up and execution of such simulations for wind power forecasting requires skill and tuning of the simulations for a specific site.

  5. Efficient event-driven simulations shed new light on microtubule organization in the plant cortical array

    NASA Astrophysics Data System (ADS)

    Tindemans, Simon H.; Deinum, Eva E.; Lindeboom, Jelmer J.; Mulder, Bela M.

    2014-04-01

    The dynamics of the plant microtubule cytoskeleton is a paradigmatic example of the complex spatiotemporal processes characterising life at the cellular scale. This system is composed of large numbers of spatially extended particles, each endowed with its own intrinsic stochastic dynamics, and is capable of non-equilibrium self-organisation through collisional interactions of these particles. To elucidate the behaviour of such a complex system requires not only conceptual advances, but also the development of appropriate computational tools to simulate it. As the number of parameters involved is large and the behaviour is stochastic, it is essential that these simulations be fast enough to allow for an exploration of the phase space and the gathering of sufficient statistics to accurately pin down the average behaviour as well as the magnitude of fluctuations around it. Here we describe a simulation approach that meets this requirement by adopting an event-driven methodology that encompasses both the spontaneous stochastic changes in microtubule state and the deterministic collisions. In contrast with finite-time-step simulations, this technique is intrinsically exact, as well as several orders of magnitude faster, which enables ordinary PC hardware to simulate systems of ~10^3 microtubules roughly 10^3 times faster than real time. In addition we present new tools for the analysis of microtubule trajectories on curved surfaces. We illustrate the use of these methods by addressing a number of outstanding issues regarding the importance of various parameters on the transition from an isotropic to an aligned and oriented state.
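
    As a rough illustration of the event-driven methodology described above (not the microtubule model itself), the following minimal Python sketch keeps a priority queue of future stochastic events and jumps directly from one event to the next instead of advancing a fixed time step; the event types and rates are invented placeholders.

      import heapq, random

      # Minimal event-driven loop: jump from event to event rather than
      # stepping time on a fixed grid. Rates are illustrative only.
      RATES = {"catastrophe": 0.005, "rescue": 0.02, "nucleation": 0.01}

      def schedule(queue, now, kind):
          # Exponential waiting time for a Poisson event of this rate.
          heapq.heappush(queue, (now + random.expovariate(RATES[kind]), kind))

      random.seed(1)
      t, t_end, queue, counts = 0.0, 1000.0, [], {k: 0 for k in RATES}
      for kind in RATES:
          schedule(queue, t, kind)
      while queue:
          t, kind = heapq.heappop(queue)
          if t > t_end:
              break
          counts[kind] += 1          # update the system state here
          schedule(queue, t, kind)   # draw the next event of this type
      print(counts)

    The exactness claimed above comes from this structure: between two queued events nothing can change, so no accuracy is lost by skipping the intervening time.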

  6. Simulation of LHC events on a million threads

    NASA Astrophysics Data System (ADS)

    Childers, J. T.; Uram, T. D.; LeCompte, T. J.; Papka, M. E.; Benjamin, D. P.

    2015-12-01

    Demand for Grid resources is expected to double during LHC Run II as compared to Run I; the capacity of the Grid, however, will not double. The HEP community must consider how to bridge this computing gap by targeting larger compute resources and using the available compute resources as efficiently as possible. Argonne's Mira, the fifth fastest supercomputer in the world, can run roughly five times the number of parallel processes that the ATLAS experiment typically uses on the Grid. We ported Alpgen, a serial x86 code, to run as a parallel application under MPI on the Blue Gene/Q architecture. By analysis of the Alpgen code, we reduced the memory footprint to allow running 64 threads per node, utilizing the four hardware threads available per core on the PowerPC A2 processor. Event generation and unweighting, typically run as independent serial phases, are coupled together in a single job in this scenario, reducing intermediate writes to the filesystem. By these optimizations, we have successfully run LHC proton-proton physics event generation at the scale of a million threads, filling two-thirds of Mira.
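
    The pattern of coupling generation and unweighting inside one parallel job can be sketched with mpi4py; this is a schematic stand-in assuming independent per-rank random streams, not the actual Alpgen port described above.

      from mpi4py import MPI
      import random

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      random.seed(rank)                # decorrelate per-rank streams
      kept = []
      for _ in range(10000):           # per-rank generation budget
          w = random.random()          # stand-in for a weighted event
          if random.random() < w:      # unweighting by accept/reject
              kept.append(w)

      totals = comm.gather(len(kept), root=0)  # only counts travel
      if rank == 0:
          print(f"{size} ranks kept {sum(totals)} unweighted events")

    Launched with, e.g., mpiexec -n 64 python generate.py; in runs at the scale described above, each rank would wrap the actual generator, and keeping both phases in one job avoids the intermediate filesystem writes the abstract mentions.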

  7. Simple Movement and Detection in Discrete Event Simulation

    DTIC Science & Technology

    2005-12-01

    with a description of uniform linear motion in the following section. We will then consider the simplest kind of sensing, the "cookie-cutter." A ... cookie-cutter sensor sees everything that is within its range R, and must be notified at the precise time a target enters its range. In a time-step ... simulation, cookie-cutter detection is very easy. Simply compute the distance between the sensor and the target at each time step. If the target is
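
    The snippet breaks off here, but the event-driven alternative it is building toward is well defined: for uniform linear motion, the exact entry time into a cookie-cutter of range R is the earliest nonnegative root of a quadratic. A sketch in Python (hypothetical helper, 2-D, sensor at the origin):

      import math

      def entry_time(p, v, R):
          """Earliest t >= 0 at which a target at p with velocity v
          (relative to the sensor) enters range R; None if it never does."""
          a = v[0]**2 + v[1]**2
          b = 2.0 * (p[0]*v[0] + p[1]*v[1])
          c = p[0]**2 + p[1]**2 - R*R
          if c <= 0.0:
              return 0.0                  # already inside the cookie-cutter
          if a == 0.0:
              return None                 # not moving, stays outside
          disc = b*b - 4.0*a*c
          if disc < 0.0:
              return None                 # closest approach misses the disc
          t = (-b - math.sqrt(disc)) / (2.0*a)
          return t if t >= 0.0 else None  # root in the past: moving away

      print(entry_time((10.0, 2.0), (-1.0, 0.0), 4.0))  # ~6.54 time units

    In an event-driven simulation this time is simply scheduled as a future detection event, removing any dependence on the time-step size.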

  8. Torque Simulator for Rotating Systems

    NASA Technical Reports Server (NTRS)

    Davis, W. T.

    1982-01-01

    New torque brake simulates varying levels of friction in bearings of rotating body. Rolling-tail torque brake uses magnetic force to produce friction between rotating part and stationary part. Simulator electronics produce positive or negative feedback signal, depending on direction of rotation. New system allows for first time in-depth study of effects of tail-fin spin rates on pitch-, yaw-, and roll-control characteristics.

  9. Simulation Systems for Cognitive Psychology

    DTIC Science & Technology

    1982-08-01

    the development of SNOBOL). 5.1 The First Generation of Psychological Simulation Languages. Therefore, the first generation of specialized languages ... SIMULATION SYSTEMS FOR COGNITIVE PSYCHOLOGY. Robert Neches, University of Pittsburgh, August 1982. Technical Report No. UPITT/LRDC/ONR/APS-12. This research was sponsored by the Personnel and Training Research Programs, Psychological Sciences Division, Office of Naval Research, under Contract No. N00014

  10. A System for Interactive Behaviour Simulation.

    ERIC Educational Resources Information Center

    Bierschenk, Bernhard

    A psycho-ecological model is used as the basis for a simulation of interactive behavior strategies. The basic unit is an event, and each event has been recorded on closed circuit television videotape. The three basic paradigms of behavioral science--association, structure, and process--are used to anchor the simulations. The empirical foundation…

  11. Power system extreme event screening using graph partitioning

    SciTech Connect

    Lesieutre, Bernard C.; Roy, Sandip; Donde, Vaibhav; Pinar, Ali

    2006-09-06

    We propose a partitioning problem in a power system context that weighs the two objectives of minimizing cuts between partitions and maximizing the power imbalance between partitions. We then pose the problem in a purely graph theoretic sense. We offer an approximate solution through relaxation of the integer problem and suggest refinement using stochastic methods. Results are presented for the IEEE 30-bus and 118-bus electric power systems.
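
    One standard relaxation of such an integer partitioning problem, shown here only as a generic illustration rather than the authors' combined objective, is spectral bisection: relax the indicator vector, take the Laplacian eigenvector of the second-smallest eigenvalue (the Fiedler vector), and round by sign. A sketch on a toy 6-bus graph:

      import numpy as np

      # Toy 6-bus system: symmetric adjacency (1 = branch between buses).
      A = np.array([[0, 1, 1, 0, 0, 0],
                    [1, 0, 1, 0, 0, 0],
                    [1, 1, 0, 1, 0, 0],
                    [0, 0, 1, 0, 1, 1],
                    [0, 0, 0, 1, 0, 1],
                    [0, 0, 0, 1, 1, 0]], dtype=float)

      L = np.diag(A.sum(axis=1)) - A    # graph Laplacian
      w, v = np.linalg.eigh(L)          # eigenvalues in ascending order
      fiedler = v[:, 1]                 # eigenvector of 2nd-smallest eigenvalue
      part = fiedler > 0                # relaxed indicator, rounded by sign
      print("partition A:", np.where(part)[0], "partition B:", np.where(~part)[0])

    On this toy graph the split falls on the single bridging branch; the paper's formulation would additionally weigh the power imbalance between the two sides.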

  12. Tsunami simulations for historical and plausible mega-thrust events originating in the Eastern Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Valle, Brett; Kalligeris, Nikos; Findikakis, Angelos; Okal, Emile; Synolakis, Costas

    2013-04-01

    Tsunamis have been reported at rates of one to two per year in the Mediterranean Sea, on average, over the past 2000 years (Ambraseys and Synolakis, 2010). However, quantification of tsunami hazards in the Eastern Mediterranean Sea remains difficult, as large events are infrequent. Simulations were performed for a series of seismic events originating along the Eastern Hellenic Arc and Western Cyprian Arc. The locations and source characteristics represent plausible mega-thrust events similar to historical events along the Hellenic Arc, including the 365 AD and 1303 AD events. Sensitivity simulations were performed to address uncertainty in the location and source characteristics of the 1303 AD event, and in consideration of potential future events originating along the Eastern Hellenic Arc. Sensitivity simulations were also performed for events originating along the Western Cyprian Arc. The hydrodynamic simulations used a series of codes known as the Method of Splitting Tsunami (MOST) (Titov and Synolakis, 1998). Reported results include wave propagation in the Eastern Mediterranean and tsunami inundation near Alexandria, Egypt, and for neighboring coastlines. References: Ambraseys, N. and C.E. Synolakis (2010), Tsunami Catalogs for the Eastern Mediterranean, Revisited, Journal of Earthquake Engineering 14(3): 309-330; and Titov V.V. and C.E. Synolakis (1998), 'Numerical modeling of tidal wave runup,' J. Waterw. Port Coast. Ocean Eng. 124(4): 157-171.

  13. An event generator for simulations of complex β-decay experiments

    NASA Astrophysics Data System (ADS)

    Jordan, D.; Algora, A.; Tain, J. L.

    2016-08-01

    This article describes a Monte Carlo event generator for the design, optimization and performance characterization of beta-decay spectroscopy experimental set-ups. The event generator has been developed within the Geant4 simulation architecture and provides new features and greater flexibility in comparison with the currently available decay generator.
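
    The generator itself lives inside Geant4; as a language-neutral illustration of the event-generator idea, the Python sketch below rejection-samples electron kinetic energies from a simplified allowed beta spectrum (Fermi function omitted, endpoint energy invented), which is the kind of sampling such a generator performs per decay.

      import math, random

      Q = 2.0  # endpoint kinetic energy in units of m_e c^2 (illustrative)

      def spectrum(T):
          """Simplified allowed beta shape p*E*(Q-T)^2; Fermi function omitted."""
          E = T + 1.0                        # total energy, with m_e c^2 = 1
          p = math.sqrt(max(E*E - 1.0, 0.0))
          return p * E * (Q - T) ** 2

      random.seed(0)
      peak = max(spectrum(i * Q / 1000.0) for i in range(1001))
      events = []
      while len(events) < 10000:             # rejection sampling loop
          T = random.uniform(0.0, Q)
          if random.uniform(0.0, peak) < spectrum(T):
              events.append(T)
      print(f"mean kinetic energy: {sum(events)/len(events):.3f} m_e c^2")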

  14. Calculation of 239Pu fission observables in an event-by-event simulation

    SciTech Connect

    Vogt, R; Randrup, J; Pruet, J; Younes, W

    2010-03-31

    The increased interest in more exclusive fission observables has demanded more detailed models. We describe a new computational model, FREYA, that aims to meet this need by producing large samples of complete fission events from which any observable of interest can then be extracted consistently, including any interesting correlations. The various model assumptions are described and the potential utility of the model is illustrated. As a concrete example, we use formal statistical methods, experimental data on neutron production in neutron-induced fission of 239Pu, along with FREYA, to develop quantitative insights into the relation between reaction observables and detailed microscopic aspects of fission. Current measurements of the mean number of prompt neutrons emitted in fission, taken together with less accurate current measurements of the prompt post-fission neutron energy spectrum, up to the threshold for multi-chance fission, place remarkably fine constraints on microscopic theories.

  15. Argonne simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Canfield, T.; Brown-VanHoozer, A.; Tentner, A.

    1996-04-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that incorporate human-factors studies to support safety and operational research. Realistic modeling of variations of the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, the driver's personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  16. Systems Engineering Simulator (SES) Simulator Planning Guide

    NASA Technical Reports Server (NTRS)

    McFarlane, Michael

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the SES. The Simulator Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  17. Assessment of extreme precipitation events over Amazon simulated by global climate models from HIGEM family

    NASA Astrophysics Data System (ADS)

    Custodio, M. D. S.; Ambrizzi, T.; Da Rocha, R.

    2015-12-01

    Increasing the horizontal resolution of climate models aims to improve simulation accuracy and to capture the non-linear interactions between different spatial scales within the climate system; until now, these interactions have not been well represented in low-horizontal-resolution GCMs. Variations in extreme climatic events have been described and analyzed in the scientific literature. In a global warming scenario it is necessary to understand and explain extreme events and to know whether global models can represent them. The purpose of this study was to understand the impact of horizontal resolution in the high-resolution coupled and atmospheric global models of the HiGEM project in simulating atmospheric patterns and processes of interaction between spatial scales, and to evaluate the performance of coupled and uncoupled versions of the High-Resolution Global Environmental Model in capturing the signal of interannual and intraseasonal variability of precipitation over the Amazon region. The results indicated that grid refinement and ocean-atmosphere coupling contribute to a better representation of seasonal patterns of both precipitation and temperature over the Amazon region. Moreover, the climate models analyzed represent the climatic characteristics of this region better than other (regional and global) models, indicating a breakthrough in the development of high-resolution climate models. Both coupled and uncoupled models capture the observed signal of the ENSO and MJO oscillations, although with reversed phase in some cases. The interannual variability analysis showed that coupled simulations intensify the impact of ENSO in the Amazon. On the intraseasonal scale, although the simulations intensify this signal, the coupled models show greater similarity to observations than the atmospheric models for precipitation extremes. The simulation of ENSO in GCMs can be attributed to their high

  18. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  19. StratBAM: A Discrete-Event Simulation Model to Support Strategic Hospital Bed Capacity Decisions.

    PubMed

    Devapriya, Priyantha; Strömblad, Christopher T B; Bailey, Matthew D; Frazier, Seth; Bulger, John; Kemberling, Sharon T; Wood, Kenneth E

    2015-10-01

    The ability to accurately measure and assess current and potential health care system capacities is an issue of local and national significance. Recent joint statements by the Institute of Medicine and the Agency for Healthcare Research and Quality have emphasized the need to apply industrial and systems engineering principles to improving health care quality and patient safety outcomes. To address this need, a decision support tool was developed for planning and budgeting of current and future bed capacity, and evaluating potential process improvement efforts. The Strategic Bed Analysis Model (StratBAM) is a discrete-event simulation model created after a thorough analysis of patient flow and data from Geisinger Health System's (GHS) electronic health records. Key inputs include: timing, quantity and category of patient arrivals and discharges; unit-level length of care; patient paths; and projected patient volume and length of stay. Key outputs include: admission wait time by arrival source and receiving unit, and occupancy rates. Electronic health records were used to estimate parameters for probability distributions and to build empirical distributions for unit-level length of care and for patient paths. Validation of the simulation model against GHS operational data confirmed its ability to model real-world data consistently and accurately. StratBAM was successfully used to evaluate the system impact of forecasted patient volumes and length of stay in terms of patient wait times, occupancy rates, and cost. The model is generalizable and can be appropriately scaled for larger and smaller health care settings.
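
    The pattern StratBAM implements, patients arriving stochastically and competing for a capacitated pool of beds, can be sketched with the SimPy discrete-event library; all rates, capacities, and distributions below are invented placeholders, not GHS data.

      import random
      import simpy

      WARD_BEDS, ARRIVAL_MEAN_H, LOS_MEAN_H = 20, 4.0, 72.0
      waits = []

      def patient(env, beds):
          arrived = env.now
          with beds.request() as bed:            # queue for a free bed
              yield bed
              waits.append(env.now - arrived)    # admission wait time
              yield env.timeout(random.expovariate(1.0 / LOS_MEAN_H))

      def arrivals(env, beds):
          while True:
              yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN_H))
              env.process(patient(env, beds))

      random.seed(42)
      env = simpy.Environment()
      beds = simpy.Resource(env, capacity=WARD_BEDS)
      env.process(arrivals(env, beds))
      env.run(until=24.0 * 365)                  # one simulated year, in hours
      print(f"{len(waits)} admissions, mean wait {sum(waits)/len(waits):.1f} h")

    Swapping the exponential length-of-stay for the empirical unit-level distributions mentioned above, and splitting arrivals by source and receiving unit, is where a model like StratBAM gains its fidelity.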

  20. Experience producing simulated events for the DZero experiment on the SAM-Grid

    SciTech Connect

    Garzoglio, G.; Terekhov, I.; Snow, J.; Jain, S.; Nishandar, A.; /Texas U., Arlington

    2004-12-01

    Most of the simulated events for the DZero experiment at Fermilab have historically been produced by the "remote" collaborating institutions. One of the principal challenges reported concerns the maintenance of the local software infrastructure, which is generally different from site to site. As the understanding of the distributed computing community over distributively owned and shared resources progresses, the adoption of grid technologies to address the production of Monte Carlo events for high energy physics experiments becomes increasingly interesting. SAM-Grid is a software system developed at Fermilab, which integrates standard grid technologies for job and information management with SAM, the data handling system of the DZero and CDF experiments. During the past few months, this grid system has been tailored for the Monte Carlo production of DZero. Since the initial phase of deployment, this experience has exposed an interesting series of requirements of the SAM-Grid services, the standard middleware, the resources and their management, and the analysis framework of the experiment. As of today, the inefficiency due to the grid infrastructure has been reduced to as little as 1%. In this paper, we present our statistics and the "lessons learned" in running large high energy physics applications on a grid infrastructure.

  1. An Object Description Language for Distributed Discrete Event Simulations

    DTIC Science & Technology

    2001-05-24

    system of an object undergoing projectile motion under the influence of gravity. When considering this physical system we can ignore many of the ... part. This thesis was prepared at the Charles Stark Draper Laboratory, Inc., under projects 13025 (Multi-Agent Collaboration

  2. Extreme events in excitable systems and mechanisms of their generation.

    PubMed

    Ansmann, Gerrit; Karnatak, Rajat; Lehnertz, Klaus; Feudel, Ulrike

    2013-11-01

    We study deterministic systems, composed of excitable units of FitzHugh-Nagumo type, that are capable of self-generating and self-terminating strong deviations from their regular dynamics without the influence of noise or parameter change. These deviations are rare, short-lasting, and recurrent and can therefore be regarded as extreme events. Employing a range of methods we analyze dynamical properties of the systems, identifying features in the systems' dynamics that may qualify as precursors to extreme events. We investigate these features and elucidate mechanisms that may be responsible for the generation of the extreme events.
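
    A minimal numerical setting for such studies, sketched here with invented parameters rather than the authors' network, is a pair of diffusively coupled FitzHugh-Nagumo units integrated with the Euler method; excursions of the fast variable far beyond its regular oscillation would be the extreme-event candidates.

      import numpy as np

      # Two diffusively coupled FitzHugh-Nagumo units, Euler-integrated.
      # Parameters are illustrative, not those of the cited study.
      a, b, eps, k, dt = 0.7, 0.8, 0.08, 0.1, 0.01
      I = np.array([0.5, 0.6])                 # slightly detuned drive
      v = np.array([0.0, 0.1])                 # fast (voltage-like) variables
      w = np.zeros(2)                          # slow recovery variables
      v_max = -np.inf
      for _ in range(200000):
          coupling = k * (v[::-1] - v)         # each unit feels the other
          dv = v - v**3 / 3.0 - w + I + coupling
          dw = eps * (v + a - b * w)
          v, w = v + dt * dv, w + dt * dw
          v_max = max(v_max, v.max())
      print(f"largest excursion of v: {v_max:.2f}")

    Precursor analysis of the kind described above would then monitor quantities computed along such trajectories shortly before the rare excursions.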

  3. Accuracy of harm scores entered into an event reporting system.

    PubMed

    Abbasi, Toni; Adornetto-Garcia, Debra; Johnston, Patricia A; Segovia, Julie H; Summers, Barbara

    2015-04-01

    This quality improvement project evaluated the accuracy of harm scores entered into an event reporting system by inpatient nursing staff at a National Cancer Institute-designated comprehensive cancer center. Nurses scored 10 safety scenarios using 2 versions of the Agency for Healthcare Research and Quality scale to determine interrater reliability. Results indicated inconsistency in the way nurses scored the scenarios, suggesting that the event reporting system may not accurately portray the severity of harm in patient safety events. Nurse executives can use this information to guide the development and implementation of incident reporting systems.

  4. Real time digital propulsion system simulation for manned flight simulators

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.; Hart, C. E.

    1978-01-01

    A real time digital simulation of a STOL propulsion system was developed which generates significant dynamics and internal variables needed to evaluate system performance and aircraft interactions using manned flight simulators. The simulation ran at a real-to-execution time ratio of 8.8. The model was used in a piloted NASA flight simulator program to evaluate the simulation technique and the propulsion system digital control. The simulation is described and results shown. Limited results of the flight simulation program are also presented.

  5. Reading Sky and Seeing a Cloud: On the Relevance of Events for Perceptual Simulation

    PubMed Central

    2016-01-01

    Previous research has shown that processing words with an up/down association (e.g., bird, foot) can influence the subsequent identification of visual targets in congruent location (at the top/bottom of the screen). However, as facilitation and interference were found under similar conditions, the nature of the underlying mechanisms remained unclear. We propose that word comprehension relies on the perceptual simulation of a prototypical event involving the entity denoted by a word in order to provide a general account of the different findings. In 3 experiments, participants had to discriminate between 2 target pictures appearing at the top or the bottom of the screen by pressing the left versus right button. Immediately before the targets appeared, they saw an up/down word belonging to the target’s event, an up/down word unrelated to the target, or a spatially neutral control word. Prime words belonging to the target’s event facilitated identification of targets at a stimulus onset asynchrony (SOA) of 250 ms (Experiment 1), but only when presented in the vertical location where they are typically seen, indicating that targets were integrated in the simulations activated by the prime words. Moreover, at the same SOA, there was a robust facilitation effect for targets appearing in their typical location regardless of the prime type. However, when words were presented for 100 ms (Experiment 2) or 800 ms (Experiment 3), only a location nonspecific priming effect was found, suggesting that the visual system was not activated. Implications for theories of semantic processing are discussed. PMID:27762581

  6. Systems simulations supporting NASA telerobotics

    NASA Technical Reports Server (NTRS)

    Harrison, F. W., Jr.; Pennington, J. E.

    1987-01-01

    Two simulation and analysis environments have been developed to support telerobotics research at the Langley Research Center. One is a high-fidelity, nonreal-time, interactive model called ROBSIM, which combines user-generated models of workspace environment, robots, and loads into a working system and simulates the interaction among the system components. Models include user-specified actuator, sensor, and control parameters, as well as kinematic and dynamic characteristics. Kinematic, dynamic, and response analyses can be selected, with system configuration, task trajectories, and arm states displayed using computer graphics. The second environment is a real-time, manned Telerobotic Systems Simulation (TRSS) which uses the facilities of the Intelligent Systems Research Laboratory (ISRL). It utilizes a hierarchical structure of functionally distributed computers communicating over both parallel and high-speed serial data paths to enable studies of advanced telerobotic systems. Multiple processes perform motion planning, operator communications, forward and inverse kinematics, control/sensor fusion, and I/O processing while communicating via common memory. Both ROBSIM and TRSS, including their capability, status, and future plans are discussed. Also described is the architecture of ISRL and recent telerobotic system studies in ISRL.

  7. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    NASA Astrophysics Data System (ADS)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in event-based systems can lower system efficiency both in terms of data exchange rate and performance. In this paper, an integral-based event-triggered control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise, effectively reduces the flow of communication between plant and controller, and improves output performance. Using a Lyapunov approach, stability in the mean square sense is proved. A simulated example illustrates the properties of our approach.
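
    The integral-triggering idea can be illustrated on a scalar plant: rather than transmitting whenever the instantaneous error is large (which noise alone can cause), transmit only when the accumulated squared innovation crosses a threshold. A toy Python sketch under invented parameters, not the paper's design:

      import numpy as np

      # Scalar plant x' = a*x + u with noisy measurements; the controller
      # holds the last transmitted sample. Integral trigger: send when the
      # accumulated squared innovation exceeds delta (values illustrative).
      rng = np.random.default_rng(0)
      a, kgain, dt, delta = -1.0, 2.0, 1e-3, 5e-4
      x, x_held, integ, sends = 1.0, 1.0, 0.0, 0
      for _ in range(20000):
          y = x + 0.01 * rng.standard_normal()   # noisy measurement
          integ += (y - x_held) ** 2 * dt        # integral of innovation
          if integ > delta:                      # event: transmit and reset
              x_held, integ, sends = y, 0.0, sends + 1
          u = -kgain * x_held                    # control from held sample
          x += dt * (a * x + u)
      print(f"{sends} transmissions in 20000 steps, final x = {x:.4f}")

    Because isolated noise spikes contribute little to the integral, this trigger fires far less often than an instantaneous-error trigger at comparable regulation quality, which is the efficiency argument made above.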

  8. Automated calculation and simulation systems

    NASA Astrophysics Data System (ADS)

    Ohl, Thorsten

    2003-04-01

    I briefly summarize the parallel sessions on Automated Calculation and Simulation Systems for high-energy particle physics phenomenology at ACAT 2002 (Moscow State University, June 2002), present a short overview of the current status of the field, and try to identify the important trends.

  9. Simulation of Biomolecular Nanomechanical Systems

    DTIC Science & Technology

    2006-10-01

    Funding for this effort came from the Defense Advanced Research Projects Agency's Simulation of Biological System (SIMBIOSYS) Program. The work can ... DNA Hybridization Efficiency. Based on the discussions at a SIMBIOSYS Principal Investigator's Meeting (Sept. 2003 in Monterey, CA), experiments

  10. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    NASA Technical Reports Server (NTRS)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

    On February 1st, 2010 U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes to the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved to be controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty regarding the Constellation program presented a major challenge to the DES team: how to continue the development of this program-of-record simulation while remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make it flexible enough to support possible future vehicles while at the same time specific enough to support the program-of-record. The challenge was compounded by the fact that the model was being developed through the traditional DES process-orientation, which lacked the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure by identifying what was generic, finding natural logic break points, and standardizing the interlogic numbering system. The outcome of this work was a model that not only was ready to be easily modified to support future rocket programs, but also was extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides builders of traditional process-oriented discrete

  11. Cascading events in linked ecological and socioeconomic systems

    USGS Publications Warehouse

    Peters, Debra P. C.; Sala, O.E.; Allen, C.D.; Covich, A.; Brunson, M.

    2007-01-01

    Cascading events that start at small spatial scales and propagate non-linearly through time to influence larger areas often have major impacts on ecosystem goods and services. Events such as wildfires and hurricanes are increasing in frequency and magnitude as systems become more connected through globalization processes. We need to improve our understanding of these events in order to predict their occurrence, minimize potential impacts, and allow for strategic recovery. Here, we synthesize information about cascading events in systems located throughout the Americas. We discuss a variety of examples of cascading events that share a common feature: they are often driven by linked ecological and human processes across scales. In this era of globalization, we recommend studies that explicitly examine connections across scales and examine the role of connectivity among non-contiguous as well as contiguous areas. © The Ecological Society of America.

  12. An analysis of strong wind events simulated in a GCM near Casey in the Antarctic

    SciTech Connect

    Murphy, B.F.; Simmonds, I.

    1993-02-01

    Strong wind events occurring near Casey (Antarctica) in a long July GCM simulation have been studied to determine the relative roles played by the synoptic situation and the katabatic flow in producing these episodes. It has been found that the events are associated with strong katabatic and strong gradient flow operating together. Both components are found to increase threefold on average for these strong winds, and although the geostrophic flow is the stronger, it rarely produces strong winds without katabatic flow becoming stronger than it is in the mean. The two wind components do not flow in the same direction; indeed, there is some cancellation between them, since katabatic flow acts in a predominant downslope direction, while the geostrophic wind acts across slope. The stronger geostrophic flow is associated with higher-than-average pressures over the continent and the approach of a strong cyclonic system toward the coast and a blocking system downstream. The anomalous synoptic patterns leading up to the occasions display a strong wavenumber 4 structure. The very strong katabatic flow appears to be related to the production of a supply of cold air inland from Casey by the stronger-than-average surface temperature inversions inland a few days before the strong winds occur. The acceleration of this negatively buoyant air mass down the steep, ice-sheet escarpment results in strong katabatic flow near the coast. 24 refs., 11 figs.

  13. Performance and efficiency of geotextile-supported erosion control measures during simulated rainfall events

    NASA Astrophysics Data System (ADS)

    Obriejetan, Michael; Rauch, Hans Peter; Florineth, Florin

    2013-04-01

    Erosion control systems consisting of technical and biological components are widely accepted and proven to work well if installed properly with regard to site-specific parameters. A wide range of implementation measures for this specific protection purpose exists, and new, particularly technical, solutions are constantly introduced to the market. Nevertheless, the vegetation aspects of erosion control measures are frequently disregarded and deserve greater consideration against the backdrop of developing and implementing adaptation strategies for an environment altered by climate-change-related effects. Technical auxiliaries such as the geotextiles typically used for slope protection (nettings, blankets, turf reinforcement mats, etc.) address specific features, and owing to their structural and material diversity, differing effects on sediment yield, surface runoff and vegetation development seem evident. Nevertheless, there is a knowledge gap concerning the mutual interaction between technical and biological components, and comparable data on the erosion-reducing effects of technical-biological protection systems are insufficient. In this context, an experimental arrangement was set up to study the correlated influences of geotextiles and vegetation and to determine their (combined) effects on surface runoff and soil loss during simulated heavy rainfall events. Sowing vessels filled with topsoil serve as test facilities, covered with various organic and synthetic geotextiles and sown with a reliable drought-resistant seed mixture. Regular vegetation monitoring as well as two rainfall simulation runs with four repetitions of each variant were conducted. For the simulations, a portable rainfall simulator with a standardized rainfall intensity of 240 mm h-1 and a three-minute rainfall duration was used to stress these systems at different stages of plant development at an inclination of 30 degrees. First results show

  14. Effect of simulation on nursing knowledge and critical thinking in failure to rescue events.

    PubMed

    Schubert, Carolyn R

    2012-10-01

    Failure to rescue events are hospital deaths that result from human error and unsafe patient conditions. A failure to rescue event implies that the last and best chance to avoid tragedy is not acted on in time to avoid a disaster. Patient safety is often compromised by nurses who do not perform accurate assessments (vigilance), do not detect clinical changes (surveillance), or do not display critical thinking (recognition that something is wrong). This project used simulation as a teaching strategy to enhance nursing performance. Medical-surgical nurses took part in a simulated failure to rescue event in which the patient's clinical condition deteriorated rapidly. Nursing knowledge and critical thinking improved after the simulation and showed the effectiveness of simulation as a teaching strategy to address nursing knowledge and critical thinking skills.

  15. Canister Transfer System Event Sequence Calculation

    SciTech Connect

    Richard Morissette

    2001-08-16

    The "Department of Energy Spent Nuclear Fuel Canister, Transportation, and Monitored Geologic Repository Systems, Structures, and Components Performance Allocation Study" (CRWMS M&O 2000b) allocated performance to both the canisters received at the Monitored Geologic Repository (MGR) and the MGR Canister Transfer System (CTS). The purpose of this calculation is to evaluate an assumed range of canister and CTS performance allocation failure probabilities and determine the effect of these failure probabilities on the frequency of a radionuclide release. Five canister types are addressed in this calculation: high-level radioactive waste (HLW) canisters containing vitrified borosilicate glass, HLW canisters containing immobilized plutonium surrounded by borosilicate glass (Pu/HLW canisters), Department of Energy (DOE) spent nuclear fuel (DSNF) standard canisters (4 sizes), DSNF multi-canister overpacks (MCOs) for N-reactor fuel and other selected DSNF, and naval spent nuclear fuel (SNF) canisters (2 sizes). The quality assurance program applies to this calculation, and the work is performed in accordance with procedure AP-3.12Q, "Calculations". The work done for this calculation was evaluated according to AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities", which determined this activity to be subject to the requirements of DOE/RW-0333P, "Quality Assurance Requirements and Description" (DOE 2000a). This work was performed in accordance with the "Technical Work Plan for: Department of Energy Nuclear Fuel Work Packages" (CRWMS M&O 2000c) for this activity.

  16. Aided targeting system simulation evaluation

    NASA Technical Reports Server (NTRS)

    Demaio, Joe; Becker, Curtis

    1994-01-01

    Simulation research was conducted at the Crew Station Research and Development Facility on the effectiveness and ease of use of three targeting systems. A manual system required the aviator to scan a target array area with a simulated second generation forward looking infrared (FLIR) sensor, locate and categorize targets, and construct a target hand-off list. The interface between the aviator and the system was like that of an advanced scout helicopter (manual mode). Two aided systems detected and categorized targets automatically. One system used only the FLIR sensor and the second used FLIR fused with Longbow radar. The interface for both was like that of an advanced scout helicopter aided mode. Exposure time while performing the task was reduced substantially with the aided systems, with no loss of target hand-off list accuracy. The fused sensor system showed lower time to construct the target hand-off list and a slightly lower false alarm rate than the other systems. A number of issues regarding system sensitivity and criterion, and operator interface design are discussed.

  17. Probabilistic Language Framework for Stochastic Discrete Event Systems

    DTIC Science & Technology

    1996-01-01

    Kumar, S.I. Marcus. TR 96-18. Probabilistic Language Formalism for Stochastic Discrete Event Systems. Vijay K. Garg, Department of Electrical and ... Probability Theory and Its Applications, Vol. 1. Wiley, New York, NY, 2nd edition, 1966. [6] V. K. Garg. An algebraic approach to modeling probabilistic discrete event systems. In Proceedings of 1992 IEEE Conference on Decision and Control, pages 2348-2353, Tucson, AZ, December 1992. [7] V. K. Garg

  18. Solar system events at high spatial resolution

    SciTech Connect

    Baines, K H; Gavel, D T; Getz, A M; Gibbard, S G; MacIntosh, B; Max, C E; McKay, C P; Young, E F; de Pater, I

    1999-02-19

    Until relatively recent advances in technology, astronomical observations from the ground were limited in image resolution by the blurring effects of Earth's atmosphere. The blur extent, ranging typically from 0.5 to 2 seconds of arc at the best astronomical sites, precluded ground-based observations of the details of the solar system's moons, asteroids, and outermost planets. With the maturing of a high-resolution image processing technique called speckle imaging, the resolution limitation of the atmosphere can now be largely overcome. Over the past three years they have used speckle imaging to observe Titan, a moon of Saturn with an atmospheric density comparable to Earth's; Io, the volcanically active innermost moon of Jupiter; and Neptune, a gas giant outer planet which has continually changing planet-encircling storms. These observations were made at the world's largest telescope, the Keck telescope in Hawaii, and represent the highest resolution infrared images of these objects ever taken.

  19. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  20. Simulator verification techniques study. Integrated simulator self test system concepts

    NASA Technical Reports Server (NTRS)

    Montoya, G.; Wenglinski, T. H.

    1974-01-01

    Software and hardware requirements for implementing hardware self tests are presented in support of the development of training and procedures development simulators for the space shuttle program. Self test techniques for simulation hardware and the validation of simulation performance are stipulated. The requirements of an integrated simulator self test system are analyzed. Readiness tests, fault isolation tests, and incipient fault detection tests are covered.

  1. Rare event statistics in reaction-diffusion systems.

    PubMed

    Elgart, Vlad; Kamenev, Alex

    2004-10-01

    We present an efficient method to calculate probabilities of large deviations from the typical behavior (rare events) in reaction-diffusion systems. This method is based on a semiclassical treatment of an underlying "quantum" Hamiltonian, encoding the system's evolution. To this end, we formulate the corresponding canonical dynamical system and investigate its phase portrait. This method is presented for a number of pedagogical examples.

  2. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2011-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring, WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed NP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce e realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  3. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2010-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring. WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed WP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce a realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  4. Capturing the serial nature of older drivers' responses towards challenging events: a simulator study.

    PubMed

    Bélanger, Alexandre; Gagnon, Sylvain; Yamin, Stephanie

    2010-05-01

    Older drivers' ability to trigger simultaneous responses in reaction to simulated challenging road events was examined through crash risk and local analyses of acceleration and direction data provided by the simulator. This was achieved by segregating and averaging the simulator's primary measures according to six short time intervals, one before and five during the challenging events. Twenty healthy adults aged 25-45 years old (M=29.5+/-4.32) and 20 healthy adults aged 65 and older (M=73.4+/-5.17) were exposed to five simulated scenarios involving sudden, complex and unexpected maneuvers. Participants were also administered the Useful Field of View (UFOV), single reaction time and choice reaction time tests, a visual secondary task in the simulator, and a subjective workload evaluation (NASA-TLX). Results indicated that the challenging event that required multiple synchronized reactions led to a higher crash rate in older drivers. Acceleration and orientation data analyses confirmed that the drivers who crashed limited their reaction. The other challenging events did not generate crashes because they could be anticipated and one response (braking) was sufficient to avoid a crash. Our findings support the proposal (Hakamies-Blomqvist, L., Mynttinen, S., Backman, M., Mikkonen, V., 1999. Age-related differences in driving: are older drivers more serial? International Journal of Behavioral Development 23, 575-589) that older drivers have more difficulty activating car controls simultaneously, putting them at risk when facing challenging, time-pressured road events.

  5. A CORBA event system for ALMA common software

    NASA Astrophysics Data System (ADS)

    Fugate, David W.

    2004-09-01

    The ALMA Common Software notification channel framework provides developers with an easy to use, high-performance, event-driven system supported across multiple programming languages and operating systems. It sits on top of the CORBA notification service and hides nearly all CORBA from developers. The system is based on a push event channel model where suppliers push events onto the channel and consumers process these asynchronously. This is a many-to-many publishing model whereby multiple suppliers send events to multiple consumers on the same channel. Furthermore, these event suppliers and consumers can be coded in C++, Java, or Python on any platform supported by ACS. There are only two classes developers need to be concerned with: SimpleSupplier and Consumer. SimpleSupplier was designed so that ALMA events (defined as IDL structures) could be published in the simplest manner possible without exposing any CORBA to the developer. Essentially all that needs to be known is the channel's name and the IDL structure being published. The API takes care of everything else. With the Consumer class, the developer is responsible for providing the channel's name as well as associating event types with functions that will handle them.
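
    Without access to the ACS sources, the push model described above can be mimicked in-process; the Python sketch below borrows the SimpleSupplier and Consumer names loosely, and its method names and channel registry are invented for illustration, not the real ALMA Common Software API.

      from collections import defaultdict

      # In-process stand-in for the push channel model described above:
      # many suppliers, many consumers, handlers keyed by event type.
      _channels = defaultdict(lambda: defaultdict(list))

      class Consumer:
          def __init__(self, channel):
              self.channel = channel
          def add_subscription(self, event_type, handler):
              _channels[self.channel][event_type].append(handler)

      class SimpleSupplier:
          def __init__(self, channel):
              self.channel = channel
          def publish_event(self, event):
              # Push the event to every consumer subscribed to its type.
              for handler in _channels[self.channel][type(event).__name__]:
                  handler(event)

      class TempEvent:
          def __init__(self, kelvin):
              self.kelvin = kelvin

      c = Consumer("WEATHER")
      c.add_subscription("TempEvent", lambda e: print(f"got {e.kelvin} K"))
      SimpleSupplier("WEATHER").publish_event(TempEvent(77.0))

    In the real system the channel lives in the CORBA notification service, so suppliers and consumers in C++, Java, and Python interoperate across hosts; the sketch only mirrors the developer-facing shape of that model.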

  6. Network-based event-triggered filtering for Markovian jump systems

    NASA Astrophysics Data System (ADS)

    Wang, Huijiao; Shi, Peng; Agarwal, Ramesh K.

    2016-06-01

    The problem of event-triggered H∞ filtering for networked Markovian jump systems is studied in this paper. A dynamic discrete event-triggered scheme is designed to choose the transmitted data for different Markovian jumping modes. The time-delay modelling method is employed to describe the event-triggered scheme and the network-related behaviour, such as transmission delay, data packet dropout and disorder, into a networked Markovian time-delay jump system. Furthermore, a sufficient condition is derived to guarantee that the resulting filtering error system is stochastically stable with a prescribed performance index. A co-design method for the H∞ filter and the event-triggered scheme is then proposed. The effectiveness and potential of the theoretic results obtained are illustrated by a simulation example.

  7. Stochastic simulation in systems biology

    PubMed Central

    Székely, Tamás; Burrage, Kevin

    2014-01-01

    Natural systems are, almost by definition, heterogeneous: this can be either a boon or an obstacle to be overcome, depending on the situation. Traditionally, when constructing mathematical models of these systems, heterogeneity has typically been ignored, despite its critical role. However, in recent years, stochastic computational methods have become commonplace in science. They are able to appropriately account for heterogeneity; indeed, they are based around the premise that systems inherently contain at least one source of heterogeneity (namely, intrinsic heterogeneity). In this mini-review, we give a brief introduction to theoretical modelling and simulation in systems biology and discuss the three different sources of heterogeneity in natural systems. Our main topic is an overview of stochastic simulation methods in systems biology. There are many different types of stochastic methods. We focus on one group that has become especially popular in systems biology, biochemistry, chemistry and physics. These discrete-state stochastic methods do not follow individuals over time; rather they track only total populations. They also assume that the volume of interest is spatially homogeneous. We give an overview of these methods, with a discussion of the advantages and disadvantages of each, and suggest when each is more appropriate to use. We also include references to software implementations of them, so that beginners can quickly start using stochastic methods for practical problems of interest. PMID:25505503

  8. Stochastic simulation in systems biology.

    PubMed

    Székely, Tamás; Burrage, Kevin

    2014-11-01

    Natural systems are, almost by definition, heterogeneous: this can be either a boon or an obstacle to be overcome, depending on the situation. Traditionally, when constructing mathematical models of these systems, heterogeneity has typically been ignored, despite its critical role. However, in recent years, stochastic computational methods have become commonplace in science. They are able to appropriately account for heterogeneity; indeed, they are based around the premise that systems inherently contain at least one source of heterogeneity (namely, intrinsic heterogeneity). In this mini-review, we give a brief introduction to theoretical modelling and simulation in systems biology and discuss the three different sources of heterogeneity in natural systems. Our main topic is an overview of stochastic simulation methods in systems biology. There are many different types of stochastic methods. We focus on one group that has become especially popular in systems biology, biochemistry, chemistry and physics. These discrete-state stochastic methods do not follow individuals over time; rather they track only total populations. They also assume that the volume of interest is spatially homogeneous. We give an overview of these methods, with a discussion of the advantages and disadvantages of each, and suggest when each is more appropriate to use. We also include references to software implementations of them, so that beginners can quickly start using stochastic methods for practical problems of interest.
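
    The best-known member of the discrete-state family surveyed in these two records is Gillespie's direct method; the following minimal Python sketch simulates a birth-death process, tracking only the total population as described above, with illustrative rate constants.

      import random

      # Gillespie direct method for a birth-death process:
      #   0 -> X at rate k_birth,  X -> 0 at rate k_death * n.
      # Only the total population n is tracked, per the methods above.
      k_birth, k_death = 10.0, 0.1
      random.seed(0)
      t, n, t_end = 0.0, 0, 100.0
      while t < t_end:
          rates = [k_birth, k_death * n]
          total = sum(rates)
          t += random.expovariate(total)          # time to next reaction
          if random.random() * total < rates[0]:  # pick which reaction fires
              n += 1
          else:
              n -= 1
      print(f"population after t={t_end}: {n} "
            f"(theoretical mean {k_birth/k_death:.0f})")

    Each iteration draws an exponential waiting time from the total propensity and then selects a reaction in proportion to its rate, which is exactly the intrinsic-heterogeneity bookkeeping the review describes.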

  9. Hierarchical Discrete Event Supervisory Control of Aircraft Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Yasar, Murat; Tolani, Devendra; Ray, Asok; Shah, Neerav; Litt, Jonathan S.

    2004-01-01

    This paper presents a hierarchical application of Discrete Event Supervisory (DES) control theory for intelligent decision and control of a twin-engine aircraft propulsion system. A dual layer hierarchical DES controller is designed to supervise and coordinate the operation of two engines of the propulsion system. The two engines are individually controlled to achieve enhanced performance and reliability, necessary for fulfilling the mission objectives. Each engine is operated under a continuously varying control system that maintains the specified performance and a local discrete-event supervisor for condition monitoring and life extending control. A global upper level DES controller is designed for load balancing and overall health management of the propulsion system.

  10. Representing Ground Robotic Systems in Battlefield Simulations

    DTIC Science & Technology

    2002-08-01

    representations of intelligent system performance for its battlefield simulation tools. These simulation tools differ considerably in their level of...simulation study, 2) the overall fidelity of the target simulation tool, and 3) the elements of the robotic system that are relevant to the...simulation study. In this paper, we discuss a framework for modeling robotic system performance in the context of a battlefield simulation tool. We apply

  11. Simulation of debris flow events in Sicily by cellular automata model SCIDDICA_SS3

    NASA Astrophysics Data System (ADS)

    Cancelliere, A.; Lupiano, V.; Peres, D. J.; Stancanelli, L.; Avolio, M.; Foti, E.; Di Gregorio, S.

    2013-12-01

    Debris flow models are widely used for hazard mapping or for evaluating the effectiveness of risk mitigation measures. Several models analyze the dynamics of debris flow runout by solving partial differential equations. When using such models, difficulties arise in estimating the kinematic and geotechnical soil parameters of real phenomena. To overcome these difficulties, alternative semi-empirical approaches can be employed, such as macroscopic Cellular Automata (CA). In particular, for CA simulation purposes, the runout of debris flows emerges from local interactions in a dynamical system, subdivided into elementary parts, whose state evolves within a spatial and temporal discretum. The attributes of each cell (substates) describe its physical characteristics. For computational reasons, the natural phenomenon is split into a number of elementary processes, whose proper composition makes up the CA transition function. By simultaneously applying this function to all the cells, the evolution of the phenomenon can be simulated in terms of modifications of the substates; a minimal sketch of this update scheme is given below. In this study, we present an application of the macroscopic CA semi-empirical model SCIDDICA_SS3 to the Peloritani Mountains area on the island of Sicily, Italy. The model was applied using detailed data from the 1 October 2009 debris flow event, which was triggered by a rainfall event of about 250 mm falling in 9 hours and caused the death of 37 people. This region is characterized by river valleys with steep hillslopes (30°-60°), catchment basins of small extent (0.5-12 km2) and soil composed of easily eroded metamorphic material. CA usage implies a calibration phase, which identifies an optimal set of parameters capable of adequately reproducing the considered case, and a validation phase, which tests the model on a sufficient (and different) number of cases that are similar in terms of physical and geomorphological properties. The performance of the model can be measured in terms of a fitness
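
    To make the update scheme concrete, the following Python fragment sketches a toy macroscopic CA for a gravity-driven flow on a grid: each cell carries substates (altitude, debris thickness, erodible soil depth), and one transition step composes two elementary processes, mobilisation and outflow, applied to all cells simultaneously. All thresholds, rates and the flow rule are illustrative placeholders, not the calibrated SCIDDICA_SS3 rules.

      import numpy as np

      def mobilisation(thickness, erodible, threshold=0.5, rate=0.1):
          """Elementary process 1: where the flow is thick enough, part of
          the erodible soil depth is converted into flowing debris."""
          eroded = np.where(thickness > threshold, rate * erodible, 0.0)
          return thickness + eroded, erodible - eroded

      def outflow(altitude, thickness, damping=0.25):
          """Elementary process 2: every cell sends debris to each of its four
          von Neumann neighbours with lower total height (a crude version of
          the height-difference minimisation used in macroscopic CA).
          np.roll wraps at the grid edges, which is acceptable for a toy."""
          total = altitude + thickness
          new_thickness = thickness.copy()
          for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              nbr = np.roll(total, shift, axis=(0, 1))
              diff = np.maximum(total - nbr, 0.0)
              flow = np.minimum(damping * diff / 4.0, thickness / 4.0)
              new_thickness -= flow                                 # debris leaving
              new_thickness += np.roll(flow, tuple(-s for s in shift), axis=(0, 1))
          return new_thickness

      # One transition step composes the elementary processes over all cells at once.
      rng = np.random.default_rng(0)
      altitude = np.cumsum(rng.random((50, 50)), axis=0)[::-1] * 5  # toy sloping DEM
      thickness = np.zeros((50, 50)); thickness[0, 25] = 5.0        # detachment cell
      erodible = np.full((50, 50), 1.0)
      for _ in range(200):
          thickness, erodible = mobilisation(thickness, erodible)
          thickness = outflow(altitude, thickness)
      print(round(thickness.sum(), 2))   # debris volume in the flow after 200 steps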

  12. Characteristics and dependencies of error in satellite-based flood event simulations

    NASA Astrophysics Data System (ADS)

    Mei, Yiwen; Nikolopoulos, Efthymios I.; Anagnostou, Emmanouil N.; Zoccatelli, Davide; Borga, Marco

    2016-04-01

    The error in satellite-precipitation-driven flood simulations over complex terrain is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e. rainfall and runoff cumulative depth and time series shape). Overall, the error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and for the rain flood events with high runoff coefficient. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.

  13. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near-optimal control scheme for uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor are tuned aperiodically, once at every triggering instant. An adaptive event-trigger condition for deciding the triggering instants is derived, so that a suitable number of events is generated to ensure the desired approximation accuracy. Near-optimal performance is achieved without using value and/or policy iterations. A detailed analysis of the nontrivial inter-event times, with an explicit formula showing the reduction in computation, is also given. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. Simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP.

  14. Simulation of moisture in alpine rock walls during freeze-thaw events

    NASA Astrophysics Data System (ADS)

    Schnepfleitner, Harald; Rode, Matthias; Sass, Oliver

    2014-05-01

    Rock moisture distribution during freeze-thaw events is the key to understanding frost weathering and subsequent rockfall. Data on the moisture levels of natural rock walls are scarce and difficult to measure. An innovative and cheap way to avoid these problems is the use of simulation calculations, which, although they are an abstraction of the real system, are widely used in natural science. A novel way to simulate moisture in natural rock walls is the use of the software WUFI, which was developed to understand moisture behavior in building materials; the considerable know-how behind such commercial applications has not been exploited for geomorphological research to date. The necessary input data for the simulation are climate data at hourly resolution (temperature, rainfall, wind, irradiation) and the material properties (porosity, sorption and diffusivity parameters) of the prevailing rock. Two different regions were analysed, the Gesäuse (Johnsbachtal: 700 m, limestone and dolomite) and the Sonnblick (3000 m, gneiss and granite). We aimed at comparing the two regions in terms of general susceptibility to frost weathering, the influence of aspect, inclination and rock parameters, and the possible impact of climate change. The calculated 1D moisture profiles and the temporal progress of rock moisture, in combination with temperature data, allow the detection of possible periods of active weathering and resulting rockfalls. These results were analyzed on the basis of two different frost weathering theories, the "classical" frost shattering theory (requiring a high number of freeze-thaw cycles and a pore saturation of 90%) and the segregation ice theory (requiring a long freezing period and a pore saturation threshold of approx. 60%). A further critical factor considered for both theories was the frost depth, namely the duration of the "frost cracking window" (between -3 and -10°C) at each site. The results show that in both areas, north-facing rocks are

  15. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
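
    The decoupling the abstract describes, between a network-level event loop and the internal dynamics of individual neurons, can be illustrated with a short, self-contained sketch; the class and function names below are hypothetical and do not reflect the actual NEVESIM API. Spikes are (time, target, weight) events kept in a priority queue, and a neuron's state is advanced analytically only when an event reaches it.

      import heapq
      import math

      class LIFNeuron:
          """Toy leaky integrate-and-fire unit: its state is advanced
          analytically only when an event arrives (event-driven update)."""
          def __init__(self, tau=20.0, threshold=1.0):
              self.tau, self.threshold = tau, threshold
              self.v, self.t_last = 0.0, 0.0

          def receive(self, t, weight):
              self.v *= math.exp(-(t - self.t_last) / self.tau)  # decay since last event
              self.t_last = t
              self.v += weight
              if self.v >= self.threshold:
                  self.v = 0.0
                  return True       # the neuron emits a spike
              return False

      def simulate(neurons, synapses, initial_spikes, t_end):
          """Network-level event loop, decoupled from the neuron internals.
          synapses[i] lists (target, weight, delay) triples for neuron i."""
          queue = list(initial_spikes)          # (time, target, weight) events
          heapq.heapify(queue)
          fired = []
          while queue:
              t, target, w = heapq.heappop(queue)
              if t > t_end:
                  break
              if neurons[target].receive(t, w):
                  fired.append((t, target))
                  for post, weight, delay in synapses[target]:
                      heapq.heappush(queue, (t + delay, post, weight))
          return fired

      # Three neurons in a chain, one external spike into neuron 0.
      neurons = [LIFNeuron() for _ in range(3)]
      synapses = {0: [(1, 1.5, 1.0)], 1: [(2, 1.5, 1.0)], 2: []}
      print(simulate(neurons, synapses, [(0.0, 0, 1.5)], t_end=10.0))
      # -> [(0.0, 0), (1.0, 1), (2.0, 2)]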

  16. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  17. Event-triggered sliding mode control for a class of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Behera, Abhisek K.; Bandyopadhyay, Bijnan

    2016-09-01

    The event-triggering strategy is a real-time control implementation technique that aims at achieving minimum resource utilisation while ensuring satisfactory performance of the closed-loop system. In this paper, we address the problem of robust stabilisation for a class of nonlinear systems subject to external disturbances using sliding mode control (SMC) with an event-triggering scheme. An event-triggering scheme is developed for SMC to ensure that the sliding trajectory remains confined to the vicinity of the sliding manifold. The event-triggered SMC thus brings the system into sliding mode, and the steady-state trajectories of the system remain bounded within a predesigned region in the presence of disturbances. The design of the event parameters is also given, considering the practical constraints on control execution. We show that each triggering instant is larger than its immediate predecessor by a given positive constant. The analysis is also presented taking delay in the control updates into account. An upper bound for the delay is calculated to ensure stability of the system. It is shown that with delay the steady-state bound of the system is larger than in the delay-free case; however, the system trajectories remain bounded, so stability is ensured. The performance of this event-triggered SMC is demonstrated through a numerical simulation; a minimal sketch of the triggering mechanism is given below.
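
    The following fragment sketches the generic flavour of such a triggering rule for a toy linear plant: the control input is recomputed only when the deviation between the held state and the current state crosses a state-dependent threshold. The plant, the gain K and the threshold parameter sigma are illustrative assumptions, not the sliding-mode design of the paper.

      import numpy as np

      # Toy double-integrator plant with an assumed stabilising feedback gain.
      A = np.array([[0.0, 1.0], [0.0, 0.0]])
      B = np.array([[0.0], [1.0]])
      K = np.array([[2.0, 3.0]])        # hypothetical gain, not the paper's design
      sigma, dt, steps = 0.1, 1e-3, 20000

      x = np.array([1.0, 0.0])
      x_held = x.copy()                 # state sampled at the last triggering instant
      updates = 0
      for _ in range(steps):
          e = x_held - x                # deviation accumulated since the last trigger
          if np.linalg.norm(e) > sigma * np.linalg.norm(x):  # event-trigger rule
              x_held = x.copy()         # transmit the state and update the control
              updates += 1
          u = -K @ x_held               # control uses only the held state
          x = x + dt * (A @ x + (B @ u).ravel())
      print(f"{updates} control updates in {steps} integration steps")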

  18. Optimal switching policy for performance enhancement of distributed parameter systems based on event-driven control

    NASA Astrophysics Data System (ADS)

    Mu, Wen-Ying; Cui, Bao-Tong; Lou, Xu-Yang; Li, Wen

    2014-07-01

    This paper aims to improve the performance of a class of distributed parameter systems through the optimal switching of actuators and controllers based on event-driven control. It is assumed that, among the available multiple actuators, only one actuator can receive the control signal and be activated over an unfixed time interval, while the other actuators remain dormant. After incorporating a state observer into the event generator, the event-driven control loop is ultimately bounded and the minimum inter-event time is bounded from below. Based on the event-driven state feedback control, time intervals of unfixed length can be obtained. The optimal switching policy is based on finite-horizon linear quadratic optimal control at the beginning of each time subinterval. A simulation example demonstrates the effectiveness of the proposed policy.

  19. A systems neurophysiology approach to voluntary event coding.

    PubMed

    Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian

    2016-07-15

    Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding" (TEC) being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding predominantly comprised stimulus categorization, feature unbinding and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). On a systems-neurophysiology level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (the N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but did not modulate the other processes. The perceptual categorization stage appears central to voluntary event-file coding.

  20. Importance of long-time simulations for rare event sampling in zinc finger proteins.

    PubMed

    Godwin, Ryan; Gmeiner, William; Salsbury, Freddie R

    2016-01-01

    Molecular dynamics (MD) simulation methods have seen significant improvement since their inception in the late 1950s. Constraints on simulation size and duration that once impeded the field have lessened with the advent of better algorithms, faster processors, and parallel computing. With newer techniques and hardware available, MD simulations can now reach more biologically relevant timescales and sample a broader range of conformational and dynamical changes, including rare events. One concern in the literature has been under which circumstances it is sufficient to perform many shorter-timescale simulations and under which circumstances fewer longer simulations are necessary. Herein, our simulations of the zinc finger NEMO (2JVX), using multiple trajectories of length 15, 30, 1000, and 3000 ns, are analyzed to provide clarity on this point.

  1. Simulation of centennial-scale drought events over eastern China during the past 1500 years

    NASA Astrophysics Data System (ADS)

    Sun, Weiyi; Liu, Jian; Wang, Zhiyuan

    2017-02-01

    The characteristics and causes of centennial-scale drought events over eastern China during the past 1500 years were explored based on simulations with the Community Earth System Model (CESM). The results show that centennial-scale drought events over eastern China occurred during the periods 622-735 (Drought period 1, D1) and 1420-1516 (Drought period 2, D2), which is comparable with climate proxy data. In D1, the drought center was located in northern China and the Yangtze River valley, whereas southern China received much more precipitation than usual. In D2, decreased precipitation was found across almost the whole of eastern China. The direct cause of these two drought events was a weakened East Asian summer monsoon, and the specific process was closely linked to the air-sea interaction of the Indo-Pacific Ocean. In D1, maximum cooling was observed over the western Pacific, which may have led to anomalous subsidence, weakening the Walker circulation and reducing the northward transport of water vapor. Additionally, upward motion occurred over southern China, strengthening convection and increasing precipitation. In D2, owing to the decrease in SST, subsidence dominated the North Indian Ocean, blocking the low-level cross-equatorial flow, enhancing the tropical westerly anomalies, and reducing the northward transport of moisture. Additionally, descending motion appeared over eastern China, subsequently decreasing the precipitation over the whole region. The anomalous cooling of the Indo-Pacific SST may have been caused by the persistently low solar irradiation in D1, whereas in D2 it may have been influenced not only by persistently low solar irradiation but also by frequent volcanic eruptions.

  2. BEEC: An event generator for simulating the Bc meson production at an e+e- collider

    NASA Astrophysics Data System (ADS)

    Yang, Zhi; Wu, Xing-Gang; Wang, Xian-You

    2013-12-01

    The Bc meson is a doubly heavy quark-antiquark bound state that carries flavor explicitly, which provides a fruitful laboratory for testing potential models and understanding the weak decay mechanisms of heavy flavors. In view of the prospects for Bc physics at hadronic colliders such as the Tevatron and the LHC, Bc physics is attracting more and more attention. It has been shown that a high-luminosity e+e- collider running around the Z0 peak is also helpful for studying the properties of the Bc meson and has its own advantages. For this purpose, we have written an event generator for simulating Bc meson production through e+e- annihilation, following the relevant publications. We name it BEEC; it can generate the color-singlet S-wave and P-wave (cb¯)-quarkonium states together with the color-octet S-wave (cb¯)-quarkonium states. BEEC can also be adopted to generate the analogous charmonium and bottomonium states via the semi-exclusive channels e+ + e- → |(QQ¯)[n]> + Q + Q¯ with Q = b and c, respectively. To increase the simulation efficiency, we simplify the amplitude to be as compact as possible by using improved trace technology. BEEC is a Fortran program written in a PYTHIA-compatible format with a modular structure; one may conveniently apply it to various situations or experimental environments by building it with GNU make. A method to improve the efficiency of generating unweighted events within the PYTHIA environment is proposed. Moreover, BEEC generates a standard Les Houches Event data file that contains useful information on the meson and its accompanying partons, which can be conveniently imported into PYTHIA for further hadronization and decay simulation.
    Catalogue identifier: AEQC_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQC_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in

  3. Coupled atmosphere-ocean-wave simulations of a storm event over the Gulf of Lion and Balearic Sea

    USGS Publications Warehouse

    Renault, Lionel; Chiggiato, Jacopo; Warner, John C.; Gomez, Marta; Vizoso, Guillermo; Tintore, Joaquin

    2012-01-01

    The coastal areas of the North-Western Mediterranean Sea are among the most challenging places for ocean forecasting. This region is exposed to severe storm events of short duration. During these events, significant air-sea interactions, strong winds and large sea states can have catastrophic consequences in coastal areas. To investigate these air-sea interactions and the oceanic response to such events, we implemented the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System to simulate a severe storm that occurred in the Mediterranean Sea in May 2010. During this event, wind speed reached up to 25 m/s, inducing significant sea surface cooling (up to 2°C) over the Gulf of Lion (GoL) and along the storm track, and generating surface waves with a significant height of 6 m. It is shown that the event, associated with a cyclogenesis between the Balearic Islands and the GoL, is relatively well reproduced by the coupled system. A surface heat budget analysis showed that ocean vertical mixing was a major contributor to the cooling tendency along the storm track and in the GoL, where turbulent heat fluxes also played an important role. Sensitivity experiments on the ocean-atmosphere coupling suggested that the coupled system is sensitive to the momentum flux parameterization as well as to air-sea and air-wave coupling. Comparisons with available atmospheric and oceanic observations showed that the fully coupled system provides the most skillful simulation, illustrating the benefit of using a fully coupled ocean-atmosphere-wave model for the assessment of such storm events.

  4. Low-dose photons modify liver response to simulated solar particle event protons.

    PubMed

    Gridley, Daila S; Coutrakon, George B; Rizvi, Asma; Bayeta, Erben J M; Luo-Owen, Xian; Makinde, Adeola Y; Baqai, Farnaz; Koss, Peter; Slater, James M; Pecaut, Michael J

    2008-03-01

    The health consequences of exposure to low-dose radiation combined with a solar particle event during space travel remain unresolved. The goal of this study was to determine whether protracted radiation exposure alters gene expression and oxidative burst capacity in the liver, an organ vital in many biological processes. C57BL/6 mice were whole-body irradiated with 2 Gy simulated solar particle event (SPE) protons over 36 h, both with and without pre-exposure to low-dose/low-dose-rate photons ((57)Co, 0.049 Gy total at 0.024 cGy/h). Livers were excised immediately after irradiation (day 0) or on day 21 thereafter for analysis of 84 oxidative stress-related genes using RT-PCR; genes up or down-regulated by more than twofold were noted. On day 0, genes with increased expression were: photons, none; simulated SPE, Id1; photons + simulated SPE, Bax, Id1, Snrp70. Down-regulated genes at this same time were: photons, Igfbp1; simulated SPE, Arnt2, Igfbp1, Il6, Lct, Mybl2, Ptx3. By day 21, a much greater effect was noted than on day 0. Exposure to photons + simulated SPE up-regulated completely different genes than those up-regulated after either photons or the simulated SPE alone (photons, Cstb; simulated SPE, Dctn2, Khsrp, Man2b1, Snrp70; photons + simulated SPE, Casp1, Col1a1, Hspcb, Il6st, Rpl28, Spnb2). There were many down-regulated genes in all irradiated groups on day 21 (photons, 13; simulated SPE, 16; photons + simulated SPE, 16), with very little overlap among groups. Oxygen radical production by liver phagocytes was significantly enhanced by photons on day 21. The results demonstrate that whole-body irradiation with low-dose-rate photons, as well as time after exposure, had a great impact on liver response to a simulated solar particle event.

  5. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for the complex machine ensembles that will be required for future in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under a given set of constraints, such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. From the perspective of DEDS theory, a local model is described by the following: a set of system and transition states, an event alphabet that portrays the actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for each event to occur (a minimal encoding of such a local model is sketched below). Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models so that they can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods, such as task scheduling (open-loop control) or a feedback DEDS controller (closed-loop control).
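
    A local model of the kind just described, a finite set of states, an event alphabet, an initial state, a partial transition map and per-event durations, can be encoded compactly; the sketch below is a minimal illustration under assumed names, not an implementation from the paper.

      from dataclasses import dataclass, field

      @dataclass
      class LocalModel:
          """Timed DEDS local model: states, event alphabet, initial state,
          a partial transition map and a duration for each event."""
          states: set
          alphabet: set
          initial: str
          delta: dict      # (state, event) -> next state (partial function)
          duration: dict   # event -> time the event takes
          state: str = field(init=False)
          clock: float = field(init=False, default=0.0)

          def __post_init__(self):
              self.state = self.initial

          def step(self, event):
              key = (self.state, event)
              if key not in self.delta:          # partial map: event not enabled
                  raise ValueError(f"event {event!r} not enabled in {self.state!r}")
              self.state = self.delta[key]
              self.clock += self.duration[event]

      # A two-state submachine: idle -> busy on 'start', busy -> idle on 'done'.
      m = LocalModel(states={"idle", "busy"},
                     alphabet={"start", "done"},
                     initial="idle",
                     delta={("idle", "start"): "busy", ("busy", "done"): "idle"},
                     duration={"start": 0.5, "done": 2.0})
      m.step("start"); m.step("done")
      print(m.state, m.clock)   # idle 2.5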

  6. Simulation System for Training in Laparoscopic Surgery

    NASA Technical Reports Server (NTRS)

    Basdogan, Cagatay; Ho, Chih-Hao

    2003-01-01

    A computer-based simulation system creates a visual and haptic virtual environment for training a medical practitioner in laparoscopic surgery. Heretofore, it has been common practice to perform training in partial laparoscopic surgical procedures by use of a laparoscopic training box that encloses a pair of laparoscopic tools, objects to be manipulated by the tools, and an endoscopic video camera. However, the surgical procedures simulated by use of a training box are usually poor imitations of the actual ones. The present computer-based system improves training by presenting a more realistic simulated environment to the trainee. The system includes a computer monitor that displays a real-time image of the affected interior region of the patient, showing laparoscopic instruments interacting with organs and tissues, as would be viewed by use of an endoscopic video camera and displayed to a surgeon during a laparoscopic operation. The system also includes laparoscopic tools that the trainee manipulates while observing the image on the computer monitor (see figure). The instrumentation on the tools consists of (1) position and orientation sensors that provide input data for the simulation and (2) actuators that provide force feedback to simulate the contact forces between the tools and tissues. The simulation software includes components that model the geometries of surgical tools, components that model the geometries and physical behaviors of soft tissues, and components that detect collisions between them. Using the measured positions and orientations of the tools, the software detects whether they are in contact with tissues. In the event of contact, the deformations of the tissues and contact forces are computed by use of the geometric and physical models. The image on the computer screen shows tissues deformed accordingly, while the actuators apply the corresponding forces to the distal ends of the tools. For the purpose of demonstration, the system has been set

  7. Monte Carlo generator ELRADGEN 2.0 for simulation of radiative events in elastic ep-scattering of polarized particles

    NASA Astrophysics Data System (ADS)

    Akushevich, I.; Filoti, O. F.; Ilyichev, A.; Shumeiko, N.

    2012-07-01

    The structure and algorithms of the Monte Carlo generator ELRADGEN 2.0, designed to simulate radiative events in polarized ep-scattering, are presented. The full set of analytical expressions for the QED radiative corrections is presented and discussed in detail. Algorithmic improvements implemented to provide faster simulation of hard real photon events are described. Numerical tests show the high quality of generation of photonic variables and of the radiatively corrected cross section. The comparison of the elastic radiative tail simulated within the kinematical conditions of the BLAST experiment at MIT BATES shows a good agreement with experimental data.
    Catalogue identifier: AELO_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELO_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 1299
    No. of bytes in distributed program, including test data, etc.: 11 348
    Distribution format: tar.gz
    Programming language: FORTRAN 77
    Computer: All
    Operating system: Any
    RAM: 1 MB
    Classification: 11.2, 11.4
    Nature of problem: Simulation of radiative events in polarized ep-scattering.
    Solution method: Monte Carlo simulation according to the distributions of the real photon kinematic variables, which are calculated by the covariant method of QED radiative correction estimation. The approach provides rather fast and accurate generation.
    Running time: The simulation of 10^8 radiative events for itest:=1 takes up to 52 seconds on a Pentium(R) Dual-Core 2.00 GHz processor.

  8. Discrete Event Simulation Models for CT Examination Queuing in West China Hospital

    PubMed Central

    Luo, Li; Tang, Shijun; Shi, Yingkang; Guo, Huili

    2016-01-01

    In CT examination, emergency patients (EPs) have the highest priority in the queuing system, so general patients (GPs) have to wait for a long time, which leads to a low degree of satisfaction among the patients as a whole. The aim of this study is to improve patient satisfaction by designing new queuing strategies for CT examination. We divide the EPs into an urgent type and an emergency type and then design two queuing strategies: in the first, urgent patients (UPs) are wedged into the GPs' queue at fixed intervals (fixed priority model); in the second, patients have dynamic priorities for queuing (dynamic priority model). Based on data from the Radiology Information Database (RID) of West China Hospital (WCH), we develop discrete event simulation models for CT examination according to the designed strategies, and we compare the performance of the strategies on the basis of the simulation results. The dynamic priority strategy decreases the waiting time of GPs by 13 minutes and increases the degree of satisfaction by 40.6%. We thus design a more reasonable CT examination queuing strategy that decreases patients' waiting times and increases their satisfaction; a toy sketch of such a dynamic-priority queue follows below. PMID:27547237
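
    A toy version of the dynamic-priority idea can be sketched in a few lines of Python: each waiting patient's effective rank combines an urgency bonus with the time already spent waiting, so long-waiting GPs eventually overtake newly arrived UPs. The arrival process, urgency bonus and service time below are arbitrary placeholders, not the WCH parameters.

      import heapq
      import random

      def simulate(n_patients=500, service=5.0, urgency_bonus=50.0):
          """Toy dynamic-priority CT queue: a patient's rank is arrival time
          minus an urgency bonus, so accumulated waiting gradually lets
          general patients (GPs) overtake newly arrived urgent patients (UPs)
          instead of being starved indefinitely."""
          random.seed(1)
          arrivals = sorted((random.uniform(0, 2000), random.random() < 0.2)
                            for _ in range(n_patients))  # (time, is_urgent)
          queue, waits, t, i = [], {"UP": [], "GP": []}, 0.0, 0
          while i < len(arrivals) or queue:
              if not queue and arrivals[i][0] > t:
                  t = arrivals[i][0]                # scanner idles until next arrival
              while i < len(arrivals) and arrivals[i][0] <= t:
                  arr, urgent = arrivals[i]; i += 1
                  rank = arr - (urgency_bonus if urgent else 0.0)
                  heapq.heappush(queue, (rank, arr, urgent))
              rank, arr, urgent = heapq.heappop(queue)
              waits["UP" if urgent else "GP"].append(t - arr)
              t += service                          # scan the selected patient
          return {k: round(sum(v) / len(v), 1) for k, v in waits.items() if v}

      print(simulate())   # mean waiting time per patient class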

  9. Simulating Heinrich events in a coupled atmosphere-ocean-ice sheet model

    NASA Astrophysics Data System (ADS)

    Mikolajewicz, Uwe; Ziemen, Florian

    2016-04-01

    Heinrich events are among the most prominent events of long-term climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet - climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under discussion, and their climatic consequences are far from being fully understood. We contribute to answering the open questions by studying Heinrich events in a coupled ice sheet model (ISM) - atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model-generated internal variability, without the need to prescribe external perturbations as has been the standard approach in almost all model studies so far. The setup consists of a northern hemisphere configuration of the modified Parallel Ice Sheet Model (mPISM) coupled to the global coarse-resolution AOVGCM ECHAM5/MPIOM/LPJ. The simulations used for this analysis form an ensemble covering substantial parts of the late Glacial, forced with transient insolation and prescribed atmospheric greenhouse gas concentrations. The modeled Heinrich events show a marked influence of the ice discharge on the Atlantic circulation and heat transport, but none of the Heinrich events during the Glacial showed a complete collapse of the North Atlantic meridional overturning circulation. The main simulated consequences of the Heinrich events are a freshening and cooling over the North Atlantic and a drying over northern Europe.

  10. A View on Future Building System Modeling and Simulation

    SciTech Connect

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  11. A geostatistical extreme-value framework for fast simulation of natural hazard events.

    PubMed

    Youngman, Benjamin D; Stephenson, David B

    2016-05-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student's t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements.
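
    The two ingredients of the framework, generalized Pareto margins and Student's t dependence, can be combined in a short simulation sketch: draw correlated multivariate t variates, map them to uniforms through the t CDF (the copula step), and transform the uniforms to generalized Pareto exceedances. The correlogram, degrees of freedom and GPD parameters below are illustrative assumptions, not the fitted additive-model forms of the paper.

      import numpy as np
      from scipy import stats

      def simulate_hazard_fields(coords, n_events, df=5, corr_range=100.0,
                                 gpd_shape=0.1, gpd_scale=5.0, threshold=20.0):
          """Spatially dependent exceedances: a Student's t copula for the
          dependence, generalised Pareto margins for the intensities."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          corr = np.exp(-d / corr_range)          # toy exponential correlation model
          L = np.linalg.cholesky(corr + 1e-9 * np.eye(len(coords)))
          z = L @ np.random.standard_normal((len(coords), n_events))
          chi2 = np.random.chisquare(df, size=n_events)
          t_field = z / np.sqrt(chi2 / df)        # multivariate Student's t draws
          u = stats.t.cdf(t_field, df)            # copula step: to uniform margins
          return threshold + stats.genpareto.ppf(u, c=gpd_shape, scale=gpd_scale)

      coords = np.random.rand(30, 2) * 500        # 30 sites in a 500 km square
      gusts = simulate_hazard_fields(coords, n_events=10000)
      print(gusts.shape)                          # (30, 10000) simulated events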

  12. On the problem of discrete-event systems properties preservation

    NASA Astrophysics Data System (ADS)

    Nagul, Nadezhda; Bychkov, Igor

    2017-01-01

    The paper presents a novel approach to solving a problem that generally arises in studying dynamical systems, namely the problem of the preservation of a system's properties under some transformation. Combining algebra, logic and dynamics, the method of logical-algebraic equations (LAE-method) is developed, serving to synthesize criteria for the preservation of properties of systems connected by a special type of morphism. The LAE-method is applicable to various systems, but we focus on the case of discrete-event systems (DES), i.e. systems that evolve in time due to the occurrence of event sequences. We consider the application of the LAE-method to the reduction of supervisors for DES, and the preservation of basic DES properties, such as observability and controllability, when sensor readings provide information about the system's state that is available to a supervisor. Decentralized supervisory control is also addressed, in particular the question of whether the properties of local supervisors are inherited by a global supervisor.

  13. Modeling Large Scale Circuits Using Massively Parallel Discrete-Event Simulation

    DTIC Science & Technology

    2013-06-01

    1,966,080 cores of the Sequoia Blue Gene/Q supercomputer system. For the PHOLD benchmark model, we demonstrate the ability to process 33 trillion...events in 65 seconds, yielding a peak event rate in excess of 504 billion events/second using 120 racks of Sequoia.

  14. Simulation of linear mechanical systems

    NASA Technical Reports Server (NTRS)

    Sirlin, S. W.

    1993-01-01

    A dynamics and controls analyst is typically presented with a structural dynamics model and must perform various input/output tests and design control laws. The required time/frequency simulations need to be done many times as models change and control designs evolve. This paper examines some simple ways that open and closed loop frequency and time domain simulations can be done using the special structure of the system equations usually available. Routines were developed to run under Pro-Matlab in a mixture of the Pro-Matlab interpreter and FORTRAN (using the .mex facility). These routines are often orders of magnitude faster than trying the typical 'brute force' approach of using built-in Pro-Matlab routines such as bode. This makes the analyst's job easier since not only does an individual run take less time, but much larger models can be attacked, often allowing the whole model reduction step to be eliminated.

  15. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    SciTech Connect

    Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2016-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.

  16. The multinomial simulation algorithm for discrete stochastic simulation of reaction-diffusion systems

    NASA Astrophysics Data System (ADS)

    Lampoudi, Sotiria; Gillespie, Dan T.; Petzold, Linda R.

    2009-03-01

    The Inhomogeneous Stochastic Simulation Algorithm (ISSA) is a variant of the stochastic simulation algorithm in which the spatially inhomogeneous volume of the system is divided into homogeneous subvolumes, and the chemical reactions in those subvolumes are augmented by diffusive transfers of molecules between adjacent subvolumes. The ISSA can be prohibitively slow when the system is such that diffusive transfers occur much more frequently than chemical reactions. In this paper we present the Multinomial Simulation Algorithm (MSA), which is designed, on the one hand, to outperform the ISSA when diffusive transfer events outnumber reaction events, and, on the other, to handle small reactant populations with greater accuracy than deterministic-stochastic hybrid algorithms. The MSA treats reactions in the usual ISSA fashion, but uses appropriately conditioned binomial random variables to represent the net numbers of molecules diffusing from any given subvolume to a neighbor within a prescribed distance. Simulation results illustrate the benefits of the algorithm.
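
    A simplified version of the diffusive-transfer idea can be sketched as follows: in each coarse step the number of molecules leaving a subvolume is drawn binomially, and the movers are split between the neighbours, so many individual ISSA diffusion events are replaced by a handful of random draws. This illustrates only the binomial bookkeeping, not the conditioned multinomial scheme of the published MSA.

      import numpy as np

      def diffusion_step(counts, p_move, rng):
          """One coarse step of diffusive transfer on a 1-D chain of subvolumes:
          the number of molecules leaving each bin is a binomial draw, and the
          movers are split between the two neighbours (reflecting boundaries)."""
          movers = rng.binomial(counts, p_move)   # molecules leaving each subvolume
          left = rng.binomial(movers, 0.5)        # of those, how many move left
          right = movers - left
          new = counts - movers
          new[:-1] += left[1:]                    # arrivals from the bin to the right
          new[1:] += right[:-1]                   # arrivals from the bin to the left
          new[0] += left[0]                       # reflected at the left boundary
          new[-1] += right[-1]                    # reflected at the right boundary
          return new

      rng = np.random.default_rng(0)
      counts = np.zeros(50, dtype=np.int64); counts[25] = 10_000
      for _ in range(100):
          counts = diffusion_step(counts, p_move=0.2, rng=rng)
      print(counts.sum())                         # 10000: molecule number conserved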

  17. State-dependent doubly weighted stochastic simulation algorithm for automatic characterization of stochastic biochemical rare events

    NASA Astrophysics Data System (ADS)

    Roh, Min K.; Daigle, Bernie J.; Gillespie, Dan T.; Petzold, Linda R.

    2011-12-01

    In recent years there has been substantial growth in the development of algorithms for characterizing rare events in stochastic biochemical systems. Two such algorithms, the state-dependent weighted stochastic simulation algorithm (swSSA) and the doubly weighted SSA (dwSSA) are extensions of the weighted SSA (wSSA) by H. Kuwahara and I. Mura [J. Chem. Phys. 129, 165101 (2008); doi:10.1063/1.2987701]. The swSSA substantially reduces estimator variance by implementing system state-dependent importance sampling (IS) parameters, but lacks an automatic parameter identification strategy. In contrast, the dwSSA provides for the automatic determination of state-independent IS parameters, thus it is inefficient for systems whose states vary widely in time. We present a novel modification of the dwSSA—the state-dependent doubly weighted SSA (sdwSSA)—that combines the strengths of the swSSA and the dwSSA without inheriting their weaknesses. The sdwSSA automatically computes state-dependent IS parameters via the multilevel cross-entropy method. We apply the method to three examples: a reversible isomerization process, a yeast polarization model, and a lac operon model. Our results demonstrate that the sdwSSA offers substantial improvements over previous methods in terms of both accuracy and efficiency.

  18. High-speed event detector for embedded nanopore bio-systems.

    PubMed

    Huang, Yiyun; Magierowski, Sebastian; Ghafar-Zadeh, Ebrahim; Wang, Chengjie

    2015-08-01

    Biological measurements of microscopic phenomena often deal with discrete-event signals. The ability to automatically carry out such measurements at high-speed in a miniature embedded system is desirable but compromised by high-frequency noise along with practical constraints on filter quality and sampler resolution. This paper presents a real-time event-detection method in the context of nanopore sensing that helps to mitigate these drawbacks and allows accurate signal processing in an embedded system. Simulations show at least a 10× improvement over existing on-line detection methods.
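
    As a rough illustration of discrete-event detection on a noisy trace, the sketch below flags candidate nanopore blockades by comparing each sample to a running baseline and keeping only sufficiently long excursions; the window length, depth fraction and duration threshold are arbitrary, and this is not the detector proposed in the paper.

      import numpy as np

      def detect_events(current, baseline_win=500, drop_frac=0.7, min_len=10):
          """Flag candidate blockades: samples below a fraction of a running
          baseline, kept only if the excursion lasts at least min_len samples."""
          kernel = np.ones(baseline_win) / baseline_win
          baseline = np.convolve(current, kernel, mode="same")  # crude running mean
          below = current < drop_frac * baseline
          events, start = [], None
          for i, b in enumerate(below):
              if b and start is None:
                  start = i                                     # excursion begins
              elif not b and start is not None:
                  if i - start >= min_len:
                      events.append((start, i))
                  start = None
          return events

      # Synthetic trace: ~100 pA open-pore current with two square blockades.
      rng = np.random.default_rng(0)
      trace = 100 + rng.normal(0, 2, 20000)
      trace[5000:5200] = 40
      trace[12000:12050] = 45
      print(detect_events(trace))   # approximately [(5000, 5200), (12000, 12050)]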

  19. Designing power system simulators for the smart grid: combining controls, communications, and electro-mechanical dynamics

    SciTech Connect

    Nutaro, James J

    2011-01-01

    Open source software has a leading role in research on simulation technology for electrical power systems. Research simulators demonstrate new features for which there is nascent but growing demand not yet provided for by commercial simulators. Of particular interest is the inclusion of models of software-intensive and communication-intensive controls in simulations of power system transients. This paper describes two features of the ORNL power system simulator that help it meet this need. First is its use of discrete event simulation for all aspects of the model: control, communication, and electro-mechanical dynamics. Second is an interoperability interface that enables the ORNL power system simulator to be integrated with existing, discrete event simulators of digital communication systems. The paper concludes with a brief discussion of how these aspects of the ORNL power system simulator might be inserted into production-grade simulation tools.

  20. Event-triggered output feedback control for distributed networked systems.

    PubMed

    Mahmoud, Magdi S; Sabih, Muhammad; Elshafei, Moustafa

    2016-01-01

    This paper addresses the problem of output-feedback communication and control within an event-triggered framework in the context of distributed networked control systems. The design problem of the event-triggered output-feedback control is posed as a linear matrix inequality (LMI) feasibility problem. The scheme is developed for distributed systems where only partial states are available. In this scheme, a subsystem uses local observers and shares its information with its neighbors only when the subsystem's local error exceeds a specified threshold. The developed method is illustrated using a coupled-cart example from the literature.
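
    The flavour of an LMI feasibility formulation can be illustrated with a generic example, here a Lyapunov stability LMI solved with the cvxpy modelling package rather than the paper's specific event-triggered conditions: feasibility of the semidefinite constraints certifies the property of interest.

      import cvxpy as cp
      import numpy as np

      # Stability of dx/dt = A x as an LMI: find P = P' > 0 with A'P + PA < 0.
      A = np.array([[0.0, 1.0], [-2.0, -3.0]])
      n = A.shape[0]
      P = cp.Variable((n, n), symmetric=True)
      eps = 1e-6
      constraints = [P >> eps * np.eye(n),                 # P positive definite
                     A.T @ P + P @ A << -eps * np.eye(n)]  # Lyapunov inequality
      prob = cp.Problem(cp.Minimize(0), constraints)
      prob.solve()
      print(prob.status)   # 'optimal' here certifies that the LMI is feasible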

  1. Adaptable, high recall, event extraction system with minimal configuration

    PubMed Central

    2015-01-01

    Background: Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporated task-specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance.
    Results: We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method have been employed within the context of two different sub-tasks of the BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task-specific configuration and tuning, EventMine achieved 1st place in the PC task and 2nd in the CG task, achieving the highest recall for both tasks. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements.
    Conclusions: We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both the covariate shift and weighting methods are useful in facilitating the production of high recall systems

  2. Effects of a simulated agricultural runoff event on sediment toxicity in a managed backwater wetland

    Technology Transfer Automated Retrieval System (TEKTRAN)

    permethrin (both cis and trans isomers), on 10-day sediment toxicity to Hyalella azteca in a managed natural backwater wetland after a simulated agricultural runoff event. Sediment samples were collected at 10, 40, 100, 300, and 500 m from inflow 13 days prior to amendment and 1, 5, 12, 22, and 36 ...

  3. Simulation System Fidelity Assessment at the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Beard, Steven D.; Reardon, Scott E.; Tobias, Eric L.; Aponso, Bimal L.

    2013-01-01

    Fidelity is a word that is often used but rarely understood when talking about ground-based simulation. Assessing the cueing fidelity of a ground-based flight simulator requires a comparison to actual flight data, either directly or indirectly. Two experiments were conducted at the Vertical Motion Simulator using the GenHel UH-60A Black Hawk helicopter math model, which was directly compared to flight data. Prior to the experiments, the simulator's motion and visual system frequency responses were measured, the aircraft math model was adjusted to account for the simulator motion system delays, and the motion system gains and washouts were tuned for the individual tasks. The tuned motion system fidelity was then assessed against the modified Sinacori criteria. The first experiment showed handling qualities ratings (HQRs) similar to actual flight for bob-up and sidestep maneuvers. The second experiment showed equivalent HQRs between flight and simulation for the ADS-33 slalom maneuver for the two pilot participants. The ADS-33 vertical maneuver HQRs were mixed, with one pilot rating the flight and simulation the same while the second pilot rated the simulation worse. In addition to recording HQRs in the second experiment, an experimental Simulation Fidelity Rating (SFR) scale developed by the University of Liverpool was tested for applicability to engineering simulators. A discussion of the SFR scale for use at the Vertical Motion Simulator is included in this paper.

  4. Widespread, Very Heavy Precipitation Events in Contemporary and Scenario Summer Climates from NARCCAP Simulations

    NASA Astrophysics Data System (ADS)

    Kawazoe, S.; Gutowski, W. J., Jr.

    2015-12-01

    We analyze the ability of regional climate models (RCMs) to simulate very heavy daily precipitation and supporting processes for both contemporary and future-scenario simulations during summer (JJA). RCM output comes from North American Regional Climate Change Assessment Program (NARCCAP) simulations, which are all run at a spatial resolution of 50 km. Analysis focuses on the upper Mississippi basin for summer, between 1982-1998 for the contemporary climate, and 2052-2068 during the scenario climate. We also compare simulated precipitation and supporting processes with those obtained from observed precipitation and reanalysis atmospheric states. Precipitation observations are from the University of Washington (UW) and the Climate Prediction Center (CPC) gridded dataset. Utilizing two observational datasets helps determine if any uncertainties arise from differences in precipitation gridding schemes. Reanalysis fields come from the North American Regional Reanalysis. The NARCCAP models generally reproduce well the precipitation-vs.-intensity spectrum seen in observations, while producing overly strong precipitation at high intensity thresholds. In the future-scenario climate, there is a decrease in frequency for light to moderate precipitation intensities, while an increase in frequency is seen for the higher intensity events. Further analysis focuses on precipitation events exceeding the 99.5 percentile that occur simultaneously at several points in the region, yielding so-called "widespread events". For widespread events, we analyze local and large scale environmental parameters, such as 2-m temperature and specific humidity, 500-hPa geopotential heights, Convective Available Potential Energy (CAPE), vertically integrated moisture flux convergence, among others, to compare atmospheric states and processes leading to such events in the models and observations. The results suggest that an analysis of atmospheric states supporting very heavy precipitation events is a

  5. Large-scale Intelligent Transportation Systems simulation

    SciTech Connect

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  6. Topics in gravitation - numerical simulations of event horizons and parameter estimation for LISA

    NASA Astrophysics Data System (ADS)

    Cohen, Michael Isaac

    2011-08-01

    In Part I, we consider numerical simulations of event horizons. Event horizons are the defining physical features of black hole spacetimes, and are of considerable interest in studying black hole dynamics. Here, we reconsider three techniques to find event horizons in numerical spacetimes, and find that straightforward integration of geodesics backward in time is most robust. We apply this method to various systems, from a highly spinning Kerr hole through to an asymmetric binary black hole inspiral. We find that the exponential rate at which outgoing null geodesics diverge from the event horizon of a Kerr black hole is the surface gravity of the hole. In head-on mergers we are able to track quasi-normal ringing of the merged black hole through seven oscillations, covering a dynamic range of about 10^5. In the head-on "kick" merger, we find that computing the Landau-Lifshitz velocity of the event horizon is very useful for an improved understanding of the kick behaviour. Finally, in the inspiral simulations, we find that the topological structure of the black holes does not produce an intermediate toroidal phase, though the structure is consistent with a potential re-slicing of the spacetime in order to introduce such a phase. We further discuss the topological structure of non-axisymmetric collisions. In Part II, we consider parameter estimation of cosmic string burst gravitational waves in Mock LISA data. A network of observable, macroscopic cosmic (super-)strings may well have formed in the early Universe. If so, the cusps that generically develop on cosmic-string loops emit bursts of gravitational radiation that could be detectable by gravitational-wave interferometers, such as the ground-based LIGO/Virgo detectors and the planned, space-based LISA detector. We develop two versions of a LISA-oriented string-burst search pipeline within the context of the Mock LISA Data Challenges, which rely on the publicly available MultiNest and PyMC software packages

  7. Modelling the dependence and internal structure of storm events for continuous rainfall simulation

    NASA Astrophysics Data System (ADS)

    Gyasi-Agyei, Yeboah; Melching, Charles S.

    2012-09-01

    Pair-copula construction methodology has been explored to model the dependence structure between net storm event depth (R), maximum wet periods' depth (M), and the total wet periods' duration (L), noting that the total storm event depth is RT = R + M. Random variable R was used instead of RT in order to avoid physical boundary effects due to the condition of RT ⩾ M. The flexibility of pair-copula construction allowed the examination of 11 bivariate copulas at the three bivariate stages of the three-dimensional (3D) copula. For the 21 years of hourly rainfall data examined from Cook County, Illinois, USA, three different copulas were found suitable for the bivariate stages. For the internal storm event structure, a Geometric distribution was used to model the net event duration, defined as the difference between the total duration (D) and L. A two-parameter Poisson model was adopted for modelling the distribution of the L wet periods within D, and a first-order autoregressive Lognormal model was applied for the distribution of RT over the L wet periods. Incorporation of an inter-event (I) sub-model completed the continuous rainfall simulation scheme. The strong seasonality in the marginal and dependence model parameters was captured using first-harmonic Fourier series, thus reducing the number of parameters. Polynomial functions were fitted to the internal storm event model parameters, which did not exhibit seasonal variability. Four hundred simulation runs were carried out in order to verify the developed model. Kolmogorov-Smirnov (KS) tests showed that the hypothesis that the observed and simulated storm event quantiles come from the same distribution could not be rejected at the 5% significance level in nearly all cases. Gross statistics (dry probability, mean, variance, skewness, autocorrelations, and the intensity-duration-frequency (IDF) curves) of the continuous rainfall time series at several aggregation levels were very well preserved by the developed model.
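
    As a rough illustration of the internal-structure stage described above, the sketch below (Python) distributes a given total event depth over L wet hours using a first-order autoregressive Lognormal weighting. The parameter values, the uniform placement of the wet hours (the paper uses a two-parameter Poisson model for this step), and the function name are illustrative assumptions, not the calibrated Cook County model.

      import numpy as np

      rng = np.random.default_rng(42)

      def storm_internal_structure(R_T, D, L, sigma=0.8, rho=0.6):
          # AR(1) process in log space: z[t] = rho * z[t-1] + noise
          z = np.empty(L)
          z[0] = rng.normal(0.0, sigma)
          for t in range(1, L):
              z[t] = rho * z[t - 1] + rng.normal(0.0, sigma * np.sqrt(1.0 - rho**2))
          weights = np.exp(z)
          depths = R_T * weights / weights.sum()     # hourly depths summing to R_T
          # place the L wet hours within the D-hour event (simplified: uniform)
          hyetograph = np.zeros(D)
          wet = np.sort(rng.choice(D, size=L, replace=False))
          hyetograph[wet] = depths
          return hyetograph

      print(storm_internal_structure(R_T=25.0, D=12, L=5))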

  8. Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu

    2013-04-01

    Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high-water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.

  9. Exercise-Associated Collapse in Endurance Events: A Classification System.

    ERIC Educational Resources Information Center

    Roberts, William O.

    1989-01-01

    Describes a classification system devised for exercise-associated collapse in endurance events based on casualties observed at six Twin Cities Marathons. Major diagnostic criteria are body temperature and mental status. Management protocol includes fluid and fuel replacement, temperature correction, and leg cramp treatment. (Author/SM)

  10. Designing and Securing an Event Processing System for Smart Spaces

    ERIC Educational Resources Information Center

    Li, Zang

    2011-01-01

    Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…

  11. Event-based H2/H∞ controllers for networked control systems

    NASA Astrophysics Data System (ADS)

    Orihuela, L.; Millán, P.; Vivas, C.; Rubio, F. R.

    2014-12-01

    This paper is concerned with event-based H2/H∞ control design for networked systems with interval time-varying delays. The contributions are twofold. First, conditions for uniform ultimately bounded stability are provided in the H2/H∞ event-based context. The relation between the boundedness of the stability region and the threshold that triggers the events is studied. Second, a practical design procedure for event-based H2/H∞ control is provided. The method makes use of Lyapunov-Krasovskii functionals (LKFs) and is characterised by its generality, as only mild assumptions are imposed on the structures of the LKF and the cost functional. The robustness and performance of the proposed technique are shown through numerical simulations.
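
    To make the event-triggering idea concrete, here is a minimal Python sketch of a networked loop in which the state is transmitted only when it drifts past a threshold, with the controller holding the last received state between events. The plant matrices, gain, and threshold are invented for illustration; they have no connection to the paper's LKF-based design.

      import numpy as np

      # Hypothetical discrete-time plant x+ = A x + B u with feedback u = K x_hat.
      A = np.array([[1.0, 0.1], [0.0, 0.95]])
      B = np.array([[0.0], [0.1]])
      K = np.array([[-1.2, -2.0]])            # assumed stabilizing gain
      delta = 0.05                            # event-triggering threshold

      x = np.array([[1.0], [0.0]])
      x_last_sent = x.copy()                  # state last transmitted over the network
      events = 0
      for k in range(200):
          if np.linalg.norm(x - x_last_sent) > delta:   # trigger an event
              x_last_sent = x.copy()
              events += 1
          u = K @ x_last_sent                 # controller uses last transmitted state
          x = A @ x + B @ u
      print(f"events: {events}/200, final |x| = {np.linalg.norm(x):.4f}")

    With a nonzero threshold the state converges only to a neighbourhood of the origin rather than to zero, mirroring the uniform ultimate boundedness the abstract discusses.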

  12. TLEs and early VLF events: Simulating the important impact of transmitter-disturbance-receiver geometry

    NASA Astrophysics Data System (ADS)

    NaitAmor, S.; Ghalila, H.; Cohen, M. B.

    2017-01-01

    Early very low frequency (VLF) events are perturbations to subionospherically propagating VLF radio transmitter signals which sometimes occur when lightning activity is near the transmitter-receiver path. They are often correlated with Transient Luminous Events (TLEs). Recent analyses have focused on a new type of early event whose recovery persists for many minutes, called LOng Recovery Events (LOREs). The underlying cause of these events is still unclear. Curiously, LOREs sometimes appear on only one path, while the same event observed on a different transmitter-receiver path does not indicate a LORE. In this paper we observe and simulate two cases of early signal perturbations: the first is a typical early VLF event, and the second is a LORE. Both were recorded by two AWESOME VLF receivers in North Africa on 12 December 2009, during the EuroSprite campaign. We combine observations with theoretical modeling to infer the electron density change that most closely reproduces the observed perturbation. Our results explain the cases where LOREs are detected on only one path as resulting from the transmitter-receiver geometry, which significantly impacts the modal content and therefore the observed VLF recovery time.

  13. Event-chain Monte Carlo algorithms for hard-sphere systems.

    PubMed

    Bernard, Etienne P; Krauth, Werner; Wilson, David B

    2009-11-01

    In this paper we present the event-chain algorithms, which are fast Markov-chain Monte Carlo methods for hard spheres and related systems. In a single move of these rejection-free methods, an arbitrarily long chain of particles is displaced, and long-range coherent motion can be induced. Numerical simulations show that event-chain algorithms clearly outperform the conventional Metropolis method. Irreversible versions of the algorithms, which violate detailed balance, improve the speed of the method even further. We also compare our method with a recent implementation of the molecular-dynamics algorithm.
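
    The chain move is easiest to see in one dimension. The hedged Python sketch below implements event-chain moves for hard rods on a ring: a displacement budget is carried rightward from particle to particle through collisions, so no move is ever rejected. The paper treats hard spheres in higher dimensions; all parameters here are arbitrary.

      import numpy as np

      rng = np.random.default_rng(0)

      def event_chain_move(x, sigma, L_box, ell):
          # one rejection-free chain move for hard rods of diameter sigma on a ring
          n = len(x)
          i = int(rng.integers(n))
          remaining = ell
          while remaining > 0.0:
              order = np.argsort(x)
              pos = int(np.where(order == i)[0][0])
              j = int(order[(pos + 1) % n])          # right neighbour on the ring
              d = x[j] - x[i]
              if d <= 0.0:                           # neighbour lies across the wrap
                  d += L_box
              gap = max(d - sigma, 0.0)              # free space before contact
              step = min(gap, remaining)
              x[i] = (x[i] + step) % L_box
              remaining -= step
              if remaining > 0.0:                    # collision: lift the move to j
                  i = j
          return x

      x = np.linspace(0.0, 9.0, 10)   # 10 rods of diameter 0.5 in a box of length 10
      for _ in range(1000):
          x = event_chain_move(x, sigma=0.5, L_box=10.0, ell=1.5)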

  14. Event-triggered reliable control for fuzzy Markovian jump systems with mismatched membership functions.

    PubMed

    Hou, Liyuan; Cheng, Jun; Qi, Wenhai

    2017-01-01

    The problem of event-triggered reliable control for fuzzy Markovian jump systems (FMJSs) with mismatched membership functions (MMFs) is addressed. Based on mode-dependent reliable control and an event-triggered communication scheme, the stability conditions and control design procedure are formulated. More precisely, a general actuator-failure model is designed such that the FMJS is reliable in the sense of stochastic stability while reducing the utilization of network resources. Furthermore, improved MMFs are introduced to reduce the conservativeness of the obtained results. Finally, simulation results indicate the effectiveness of the proposed methodology.

  15. Reaction Event Counting Statistics of Biopolymer Reaction Systems with Dynamic Heterogeneity.

    PubMed

    Lim, Yu Rim; Park, Seong Jun; Park, Bo Jung; Cao, Jianshu; Silbey, Robert J; Sung, Jaeyoung

    2012-04-10

    We investigate the reaction event counting statistics (RECS) of an elementary biopolymer reaction in which the rate coefficient is dependent on states of the biopolymer and the surrounding environment, and discover a universal kinetic phase transition in the RECS of the reaction system with dynamic heterogeneity. From an exact analysis for a general model of elementary biopolymer reactions, we find that the variance in the number of reaction events depends on the square of the mean number of reaction events when the measurement time is small compared to the relaxation time scale of rate coefficient fluctuations, which does not conform to renewal statistics. On the other hand, when the measurement time interval is much greater than the relaxation time of rate coefficient fluctuations, the variance becomes linearly proportional to the mean reaction number in accordance with renewal statistics. Gillespie's stochastic simulation method is generalized for the reaction system with a fluctuating rate coefficient. The simulation results confirm the correctness of the analytic results for the time-dependent mean and variance of the reaction event number distribution. On the basis of the obtained results, we propose a method of quantitative analysis for the reaction event counting statistics of reaction systems with rate coefficient fluctuations, which enables one to extract information about the magnitude and the relaxation times of the fluctuating reaction rate coefficient, without the bias that can be introduced by assuming a particular kinetic model of conformational dynamics and conformation-dependent reactivity. An exact relationship is established between a higher moment of the reaction event number distribution and the multitime correlation of the reaction rate, for the reaction system with a nonequilibrium initial state distribution as well as for the system with the equilibrium initial state distribution.
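
    The flavor of the generalized simulation method can be sketched by letting the rate coefficient hop between two values (a two-state environment) and treating the switch as one more stochastic event channel alongside the reaction. The rates, the two-state model, and the Fano-factor printout below are illustrative assumptions, not the paper's general model.

      import numpy as np

      rng = np.random.default_rng(1)

      def count_events(T, k_states=(0.5, 2.0), gamma=0.1):
          # count reaction events in [0, T]; the rate coefficient switches
          # between k_states at rate gamma (relaxation time ~ 1/gamma)
          t, n, s = 0.0, 0, int(rng.integers(2))
          while True:
              k = k_states[s]
              t += rng.exponential(1.0 / (k + gamma))   # next event of either kind
              if t > T:
                  return n
              if rng.random() < k / (k + gamma):        # the reaction fired
                  n += 1
              else:                                     # the environment switched
                  s = 1 - s

      for name, T in [("short", 1.0), ("long", 200.0)]:
          counts = [count_events(T) for _ in range(5000)]
          m, v = np.mean(counts), np.var(counts)
          print(f"{name} window: mean={m:.2f} var={v:.2f} Fano={v / m:.2f}")

    On windows short compared with 1/gamma the variance picks up an extra contribution growing with the square of the mean, while on long windows the variance becomes proportional to the mean, the renewal-like regime the abstract describes.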

  16. Simulation of Anomalous Regional Climate Events with a Variable Resolution Stretched Grid GCM

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.

    1999-01-01

    The stretched-grid approach provides efficient down-scaling and consistent interactions between global and regional scales by using a single variable-resolution model for the integrations. It is a workable alternative to the widely used nested-grid approach, introduced over a decade ago as a pioneering step in regional climate modeling. A variable-resolution General Circulation Model (GCM) employing a stretched grid, with enhanced resolution over the US as the area of interest, is used for simulating two anomalous regional climate events: the US summer drought of 1988 and flood of 1993. A special mode of integration using a stretched-grid GCM and data assimilation system is developed that allows the nested-grid framework to be imitated. The mode is useful for inter-comparison purposes and for underlining the differences between these two approaches. The 1988 and 1993 integrations are performed for the two-month period starting from mid-May. The regional resolution used in most of the experiments is 60 km. The major goal, and the result, of the study is efficient down-scaling over the area of interest. The monthly mean prognostic regional fields for the stretched-grid integrations are remarkably close to those of the verifying analyses. Simulated precipitation patterns are successfully verified against gauge precipitation observations. The impact of a finer 40-km regional resolution is investigated for the 1993 integration, and an example of recovering subregional precipitation is presented. The obtained results show that the global variable-resolution stretched-grid approach is a viable candidate for regional and subregional climate studies and applications.

  17. Using Discrete Event Computer Simulation to Improve Patient Flow in a Ghanaian Acute Care Hospital

    PubMed Central

    Best, Allyson M.; Dixon, Cinnamon A.; Kelton, W. David; Lindsell, Christopher J.

    2014-01-01

    Objectives Crowding and limited resources have increased the strain on acute care facilities and emergency departments (EDs) worldwide. These problems are particularly prevalent in developing countries. Discrete event simulation (DES) is a computer-based tool that can be used to estimate how changes to complex healthcare delivery systems, such as EDs, will affect operational performance. Using this modality, our objective was to identify operational interventions that could potentially improve patient throughput of one acute care setting in a developing country. Methods We developed a simulation model of acute care at a district-level hospital in Ghana to test the effects of resource-neutral (e.g. modified staff start times and roles) and resource-additional (e.g. increased staff) operational interventions on patient throughput. Previously captured, de-identified time-and-motion data from 487 acute care patients were used to develop and test the model. The primary outcome was the modeled effect of interventions on patient length of stay (LOS). Results The base-case (no change) scenario had a mean LOS of 292 minutes (95% CI 291, 293). In isolation, neither adding staff, changing staff roles, nor varying shift times substantially affected overall patient LOS; adding two registration workers, history takers, and physicians resulted in only a 23.8 (95% CI 22.3, 25.3) minute LOS decrease. However, when shift start-times were coordinated with patient arrival patterns, potential mean LOS was decreased by 96 minutes (95% CI 94, 98); and with the simultaneous combination of staff roles (registration and history-taking) there was an overall mean LOS reduction of 152 minutes (95% CI 150, 154). Conclusions Resource-neutral interventions identified through DES modeling have the potential to improve acute care throughput in this Ghanaian municipal hospital. DES offers another approach to identifying potentially effective interventions to improve patient flow in emergency and acute care settings.
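
    The core mechanics of such a DES model fit in a few lines of Python. The sketch below simulates a single-stage, multi-server acute-care queue and reports mean length of stay; it is a deliberately minimal stand-in for the paper's model, whose stages, staffing, and service-time distributions were calibrated to the time-and-motion data. All numbers here are invented.

      import heapq, random

      random.seed(7)

      def simulate_clinic(n_patients=487, arrival_mean=6.0, service_mean=20.0, n_staff=3):
          # exponential interarrival and service times, in minutes
          t, arrivals = 0.0, []
          for _ in range(n_patients):
              t += random.expovariate(1.0 / arrival_mean)
              arrivals.append(t)
          free_at = [0.0] * n_staff          # times at which each staff member frees up
          heapq.heapify(free_at)
          total_los = 0.0
          for a in arrivals:                 # first-come, first-served
              start = max(a, heapq.heappop(free_at))
              finish = start + random.expovariate(1.0 / service_mean)
              heapq.heappush(free_at, finish)
              total_los += finish - a
          return total_los / n_patients

      print(f"mean LOS: {simulate_clinic():.1f} minutes")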

  18. Efficiency of endoscopy units can be improved with use of discrete event simulation modeling

    PubMed Central

    Sauer, Bryan G.; Singh, Kanwar P.; Wagner, Barry L.; Vanden Hoek, Matthew S.; Twilley, Katherine; Cohn, Steven M.; Shami, Vanessa M.; Wang, Andrew Y.

    2016-01-01

    Background and study aims: The projected increase in demand for health services obligates healthcare organizations to operate efficiently. Discrete event simulation (DES) is a modeling method that allows for optimization of systems through virtual testing of different configurations before implementation. The objective of this study was to identify strategies to improve the daily efficiency of an endoscopy center with the use of DES. Methods: We built a DES model of a five-procedure-room endoscopy unit at a tertiary-care university medical center. After validating the baseline model, we tested alternate configurations to run the endoscopy suite and evaluated the outcomes associated with each change. The main outcome measures included the adequate number of preparation and recovery rooms, blocked inflow, delay times, blocked outflow, and patient cycle time. Results: Based on a sensitivity analysis, the adequate number of preparation rooms is eight and of recovery rooms is nine for a five-procedure-room unit (3.4 preparation and recovery rooms per procedure room in total). Simple changes to procedure scheduling and patient arrival times led to a modest improvement in efficiency. Increasing the preparation/recovery rooms based on the sensitivity analysis led to significant improvements in efficiency. Conclusions: By applying tools such as DES, we can model changes in an environment with complex interactions and find ways to improve the medical care we provide. DES is applicable to any endoscopy unit and would be particularly valuable to those who are trying to improve the efficiency of care and the patient experience. PMID:27853739

  19. DDS: The Dental Diagnostic Simulation System.

    ERIC Educational Resources Information Center

    Tira, Daniel E.

    The Dental Diagnostic Simulation (DDS) System provides an alternative to simulation systems which represent diagnostic case studies of relatively limited scope. It may be used to generate simulated case studies in all of the dental specialty areas with case materials progressing through the gamut of the diagnostic process. The generation of a…

  20. An Integrated Approach To Payload System Simulation

    NASA Technical Reports Server (NTRS)

    Lee, M.; Swartz, R. L., Jr.; Teng, A.; Weidner, R. J.

    1996-01-01

    This paper describes a payload system simulation implemented at JPL as part of a comprehensive mission simulation facility. The flight software function includes communication with other process modules, instrument control, and data management. The payload system simulation software consists of a camera subsystem, a virtual world, and a mission visualization toolset.

  1. Evaluating the aerosol indirect effect in WRF-Chem simulations of the January 2013 Beijing air pollution event.

    NASA Astrophysics Data System (ADS)

    Peckham, Steven; Grell, Georg; Xie, Ying; Wu, Jian-Bin

    2015-04-01

    In January 2013, an unusual weather pattern over Northern China produced anomalously cool, moist conditions for the region. Recent peer-reviewed scientific manuscripts report that during this time period, Beijing experienced a historically severe haze and smog event, with observed monthly average fine particulate matter (PM2.5) concentrations exceeding 225 micrograms per cubic meter. MODIS satellite observations yielded AOD values of approximately 1.5 to 2 for the same time. In addition, over eastern and northern China record-breaking hourly average PM2.5 concentrations of more than 700 μg m-3 were observed. Clearly, the severity and persistence of this air pollution episode have raised the interest of the scientific community as well as widespread public attention. Despite the significance of this and similar air pollution events, several questions regarding the ability of numerical weather prediction models to forecast such events remain. Some of these questions are: • What is the importance of including aerosols in weather prediction models? • What is the current capability of weather prediction models to simulate aerosol impacts upon the weather? • How important is it to include aerosol feedbacks (direct and indirect effects) in numerical model forecasts? In an attempt to address these and other questions, a Joint Working Group of the Commission for Atmospheric Sciences and the World Climate Research Programme has been convened. This Working Group on Numerical Experimentation (WGNE) has set aside several events of interest and has asked its members to generate numerical simulations of the events and examine the results. As part of this project, weather and pollution simulations were produced at the NOAA Earth System Research Laboratory using the Weather Research and Forecasting (WRF) chemistry model. These particular simulations include the aerosol indirect effect and are being done in collaboration with a group in China that will produce

  2. Safety Discrete Event Models for Holonic Cyclic Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Ciufudean, Calin; Filote, Constantin

    In this paper the expression “holonic cyclic manufacturing systems” refers to complex assembly/disassembly systems or fork/join systems, kanban systems, and, in general, to any discrete event system that transforms raw material and/or components into products. Such a system is said to be cyclic if it provides the same sequence of products indefinitely. This paper considers the scheduling of holonic cyclic manufacturing systems and describes a new approach using the Petri net formalism. We propose an approach to frame the optimum schedule of holonic cyclic manufacturing systems in order to maximize the throughput while minimizing the work in process. We also propose an algorithm to verify the optimum schedule.
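
    The token-game semantics underlying such Petri-net models is compact enough to sketch in Python: a transition is enabled when each of its input places holds enough tokens, and firing it moves tokens from input to output places. The two-transition manufacturing cell below is a made-up example, not the paper's net.

      # places: raw parts, machine free/busy, finished parts (all arc weights 1 here)
      pre  = {"load":   {"raw": 1, "m1_free": 1},
              "unload": {"m1_busy": 1}}
      post = {"load":   {"m1_busy": 1},
              "unload": {"done": 1, "m1_free": 1}}
      marking = {"raw": 3, "m1_free": 1, "m1_busy": 0, "done": 0}

      def enabled(t):
          return all(marking[p] >= w for p, w in pre[t].items())

      def fire(t):
          for p, w in pre[t].items():
              marking[p] -= w
          for p, w in post[t].items():
              marking[p] += w

      # fire enabled transitions until the cycle stops (raw parts exhausted)
      while True:
          t = next((t for t in pre if enabled(t)), None)
          if t is None:
              break
          fire(t)
      print(marking)   # {'raw': 0, 'm1_free': 1, 'm1_busy': 0, 'done': 3}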

  3. An Event-Driven Hybrid Molecular Dynamics and Direct Simulation Monte Carlo Algorithm

    SciTech Connect

    Donev, A; Garcia, A L; Alder, B J

    2007-07-30

    A novel algorithm is developed for the simulation of polymer chains suspended in a solvent. The polymers are represented as chains of hard spheres tethered by square wells which interact with the solvent particles through hard-core potentials. The algorithm uses event-driven molecular dynamics (MD) for the simulation of the polymer chain and the interactions between the chain beads and the surrounding solvent particles. The interactions between the solvent particles themselves are not treated deterministically as in event-driven algorithms; rather, the momentum and energy exchange in the solvent is determined stochastically using the Direct Simulation Monte Carlo (DSMC) method. The coupling between the solvent and the solute is consistently represented at the particle level; however, unlike full MD simulations of both the solvent and the solute, the spatial structure of the solvent is ignored. The algorithm is described in detail and applied to the study of the dynamics of a polymer chain tethered to a hard wall and subjected to uniform shear. The algorithm closely reproduces full MD simulations with two orders of magnitude greater efficiency. The results do not confirm the existence of periodic (cycling) motion of the polymer chain.
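
    The kernel of any event-driven hard-sphere code of this kind is the pairwise collision-time computation, which solves |dr + t dv| = sigma for the earliest positive root. The Python sketch below is the standard textbook form, not code from the paper.

      import numpy as np

      def collision_time(r1, v1, r2, v2, sigma):
          # time until two hard spheres of diameter sigma collide, else None
          dr, dv = r2 - r1, v2 - v1
          b = np.dot(dr, dv)
          if b >= 0.0:                          # moving apart
              return None
          dv2 = np.dot(dv, dv)
          disc = b * b - dv2 * (np.dot(dr, dr) - sigma**2)
          if disc < 0.0:                        # glancing miss
              return None
          return (-b - np.sqrt(disc)) / dv2

      # head-on pair: separation 4, closing speed 2, diameter 1 -> contact at t = 1.5
      print(collision_time(np.zeros(3), np.array([1.0, 0.0, 0.0]),
                           np.array([4.0, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0]), 1.0))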

  4. SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeff S.

    1992-01-01

    Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.
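
    The event-horizon bookkeeping at the heart of Breathing Time Buckets can be illustrated with made-up data: each node reports the earliest timestamp among the new messages it has just generated, the minimum over all nodes is the global event horizon, and only pending events strictly below that horizon are safe to commit. The Python sketch below shows only this bookkeeping, not the full optimistic-processing machinery.

      # pending (timestamp, event) pairs per node, plus the earliest new
      # message each node generated during the current cycle (invented data)
      pending = {0: [(1.0, "a"), (4.0, "b")],
                 1: [(2.0, "c"), (3.5, "d")]}
      earliest_new_message = {0: 5.0, 1: 3.0}   # local event horizons

      global_horizon = min(earliest_new_message.values())   # = 3.0

      committed = []
      for node, events in pending.items():
          for ts, ev in sorted(events):
              if ts < global_horizon:   # cannot be invalidated by any new message
                  committed.append((ts, node, ev))
      # events at or beyond the horizon ("d" at 3.5, "b" at 4.0) wait for the next cycle
      print(sorted(committed))          # [(1.0, 0, 'a'), (2.0, 1, 'c')]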

  5. Effects of microphysics parameterization schemes on the simulation of a heavy rainfall event in Shanghai

    NASA Astrophysics Data System (ADS)

    Kan, Yu; Liu, Chaoshun; Qiao, Fengxue; Liu, Yanan; Gao, Wei; Sun, Zhibin

    2016-09-01

    A typical heavy rainfall event that occurred in Shanghai on 13 September 2009 was simulated using the Weather Research and Forecasting (WRF) model to study the impact of microphysics parameterization on heavy precipitation simulations. Sensitivity experiments were conducted using the Betts-Miller-Janjic (BMJ) cumulus parameterization scheme with three different microphysics schemes (Lin et al., the WRF Single-Moment 5-class scheme (WSM5), and the WRF Single-Moment 6-class scheme (WSM6)) on three nested domains with horizontal resolutions of 36 km, 12 km, and 4 km. The results showed that all three microphysics schemes are able to capture the general pattern of this heavy rainfall event, but differ in simulating the location, center, and intensity of the precipitation. Specifically, the Lin scheme overestimated the rainfall intensity and drifted the simulated rainfall location northeastwards. The WSM5 scheme better simulated the rainfall location but produced a stronger intensity than observed, while the WSM6 scheme better reproduced the rainfall intensity, but with an unrealistic rainfall area.

  6. Stochastic Optimal Regulation of Nonlinear Networked Control Systems by Using Event-Driven Adaptive Dynamic Programming.

    PubMed

    Sahoo, Avimanyu; Jagannathan, Sarangapani

    2017-02-01

    In this paper, an event-driven stochastic adaptive dynamic programming (ADP)-based technique is introduced for nonlinear systems with a communication network within the feedback loop. A near-optimal control policy is designed using an actor-critic framework and ADP with an event-sampled state vector. First, the system dynamics are approximated by using a novel neural network (NN) identifier with an event-sampled state vector. The optimal control policy is generated via an actor NN by using the NN identifier and a value function approximated by a critic NN through ADP. The stochastic NN identifier, actor, and critic NN weights are tuned at the event-sampled instants, leading to aperiodic weight-tuning laws. Above all, an adaptive event-sampling condition based on estimated NN weights is designed by using the Lyapunov technique to ensure ultimate boundedness of all the closed-loop signals along with the approximation accuracy. The net result is an event-driven stochastic ADP technique that can significantly reduce computation and network transmissions. Finally, the analytical design is substantiated with simulation results.

  7. Computer simulation of initial events in the biochemical mechanisms of DNA damage

    NASA Technical Reports Server (NTRS)

    Chatterjee, A.; Holley, W. R.

    1993-01-01

    Understanding the systematic and quantitative correlation between the physical events of energy deposition by ionizing radiation and the ensuing chemical and biochemical processes leading to DNA damage is one of the goals in radiation research. Significant progress has been made toward achieving the stated goal by using theoretical modeling techniques. These techniques are strongly dependent on computer simulation procedures. A review of such techniques with details of various stages of simulation development, including a comparison with available experimental data, is presented in this article.

  8. Discrete event simulation of the Defense Waste Processing Facility (DWPF) analytical laboratory

    SciTech Connect

    Shanahan, K.L.

    1992-02-01

    A discrete event simulation of the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) analytical laboratory has been constructed in the GPSS language. It was used to estimate laboratory analysis times at process analytical hold points and to study the effect of sample number on those times. Typical results are presented for three different simulations representing increasing levels of complexity, and for different sampling schemes. Example equipment utilization time plots are also included. SRS DWPF laboratory management and chemists found the simulations very useful for resource and schedule planning.

  9. Numerical simulations of solar energetic particle event timescales associated with ICMEs

    NASA Astrophysics Data System (ADS)

    Qi, Shi-Yang; Qin, Gang; Wang, Yang

    2017-03-01

    Recently, S. W. Kahler studied the timescales of solar energetic particle (SEP) events associated with coronal mass ejections (CMEs) from analysis of spacecraft data, obtaining different timescales for SEP events, such as TO, the onset time from CME launch to SEP onset; TR, the rise time from onset to half the peak intensity (0.5 Ip); and TD, the duration of the SEP intensity above 0.5 Ip. In this work, we solve the transport equation for SEPs considering interplanetary coronal mass ejection (ICME) shocks as energetic particle sources. With our modeling assumptions, our simulations show results similar to Kahler’s analysis of spacecraft data, in that the weighted average of TD increases with both CME speed and width. Moreover, from our simulation results, we suggest that TD is directly dependent on CME speed but not on CME width, a distinction that was not found in the analysis of observational data.

  10. Selective Attention in Multi-Chip Address-Event Systems

    PubMed Central

    Bartolozzi, Chiara; Indiveri, Giacomo

    2009-01-01

    Selective attention is the strategy used by biological systems to cope with the inherent limits in their available computational resources, in order to efficiently process sensory information. The same strategy can be used in artificial systems that have to process vast amounts of sensory data with limited resources. In this paper we present a neuromorphic VLSI device, the “Selective Attention Chip” (SAC), which can be used to implement these models in multi-chip address-event systems. We also describe a real-time sensory-motor system, which integrates the SAC with a dynamic vision sensor and a robotic actuator. We present experimental results from each component in the system, and demonstrate how the complete system implements a real-time stimulus-driven selective attention model. PMID:22346689

  11. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  12. Human visual system-based smoking event detection

    NASA Astrophysics Data System (ADS)

    Odetallah, Amjad D.; Agaian, Sos S.

    2012-06-01

    Human action (e.g. smoking, eating, and phoning) analysis is an important task in various application domains like video surveillance, video retrieval, human-computer interaction systems, and so on. Smoke detection is a crucial task in many video surveillance applications and could have a great impact in raising the level of safety of urban areas, public parks, airplanes, hospitals, schools and others. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features will change under different lighting and weather conditions. This paper presents a new scheme for a system that detects human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. The developed method is also capable of detecting small smoking events and uncertain actions with various cigarette sizes, colors, and shapes.
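
    The first stage of such a pipeline, separating moving regions from the background, can be sketched in Python with a median background model and a thresholded frame difference; the skin- and smoke-based segmentation stages would then operate on the resulting masks. The synthetic frames and threshold below are purely illustrative.

      import numpy as np

      def motion_masks(frames, thresh=25.0):
          # median over time as a static background estimate, then threshold
          background = np.median(frames, axis=0)
          return [np.abs(f - background) > thresh for f in frames]

      # five synthetic 8x8 grayscale frames with one bright pixel moving left to right
      frames = np.full((5, 8, 8), 100.0)
      for k in range(5):
          frames[k, 2, k] = 200.0
      masks = motion_masks(frames)
      print(masks[0][2, 0])   # True: pixel (2, 0) is moving in frame 0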

  13. A comparison of active adverse event surveillance systems worldwide.

    PubMed

    Huang, Yu-Lin; Moon, Jinhee; Segal, Jodi B

    2014-08-01

    Post-marketing drug surveillance for adverse drug events (ADEs) has typically relied on spontaneous reporting. Recently, regulatory agencies have turned their attention to more preemptive approaches that use existing data for surveillance. We conducted an environmental scan to identify active surveillance systems worldwide that use existing data for the detection of ADEs. We extracted data about the systems' structures, data, and functions. We synthesized the information across systems to identify common features of these systems. We identified nine active surveillance systems. Two systems are US-based: the FDA Sentinel Initiative (including both the Mini-Sentinel Initiative and the Federal Partner Collaboration) and the Vaccine Safety Datalink (VSD). Two are Canadian: the Canadian Network for Observational Drug Effect Studies (CNODES) and the Vaccine and Immunization Surveillance in Ontario (VISION). Two are European: the Exploring and Understanding Adverse Drug Reactions by Integrative Mining of Clinical Records and Biomedical Knowledge (EU-ADR) Alliance and the Vaccine Adverse Event Surveillance and Communication (VAESCO) project. Additionally, there are the Asian Pharmacoepidemiology Network (AsPEN) and the Shanghai Drug Monitoring and Evaluative System (SDMES). We identified two systems in the UK: the Vigilance and Risk Management of Medicines (VRMM) Division and the Drug Safety Research Unit (DSRU), an independent academic unit. These surveillance systems mostly use administrative claims or electronic medical records; most conduct pharmacovigilance on behalf of a regulatory agency. Either a common data model or a centralized model is used to access existing data. The systems have been built using national data alone or via partnership with other countries. However, active surveillance systems using existing data remain rare. North America and Europe have the most population coverage, with Asian countries making good advances.

  14. Event-triggered nonlinear consensus in directed multi-agent systems with combinational state measurements

    NASA Astrophysics Data System (ADS)

    Li, Huaqing; Chen, Guo; Xiao, Li

    2016-10-01

    Event-triggered sampling control is motivated by applications in which the agents are equipped with embedded microprocessors with limited computation and storage resources. This paper studies global consensus in multi-agent systems with inherently nonlinear dynamics on general directed networks using a decentralised event-triggered strategy. For each agent, the controller updates are event-based and only triggered at its own event times, utilising only locally available, currently sampled data. A high-performance sampling event condition that needs only local neighbours' states at their own discrete time instants is presented. Furthermore, we introduce two kinds of general algebraic connectivity, for strongly connected networks and for strongly connected components of a directed network containing a spanning tree, so as to describe the system's ability to reach consensus. A detailed theoretical analysis of consensus is performed and two criteria are derived by virtue of algebraic graph theory, matrix theory and the Lyapunov control approach. It is shown that Zeno behaviour of the triggering time sequence is excluded during the system's whole working process. A numerical simulation is given to show the effectiveness of the theoretical results.
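
    A minimal linear sketch conveys the triggering mechanism: each agent rebroadcasts its state only when it deviates from its last broadcast value by more than a threshold, and the network still converges to a neighbourhood of consensus. The directed ring, step size, and threshold below are invented, and the linear update stands in for the paper's nonlinear dynamics.

      import numpy as np

      rng = np.random.default_rng(3)

      A = np.array([[0, 1, 0, 0],        # A[i, j] = 1: agent i listens to agent j
                    [0, 0, 1, 0],
                    [0, 0, 0, 1],
                    [1, 0, 0, 0]])
      deg = A.sum(axis=1)
      x = rng.uniform(-1.0, 1.0, 4)      # agent states
      x_hat = x.copy()                   # last broadcast states
      eps, h, events = 0.02, 0.1, 0
      for k in range(500):
          trig = np.abs(x - x_hat) > eps          # decentralised trigger condition
          x_hat[trig] = x[trig]                   # broadcast only at event times
          events += int(trig.sum())
          x = x + h * (A @ x_hat - deg * x_hat)   # consensus step on broadcast values
      print(f"{events} broadcasts, states: {np.round(x, 3)}")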

  15. A Process Improvement Study on a Military System of Clinics to Manage Patient Demand and Resource Utilization Using Discrete-Event Simulation, Sensitivity Analysis, and Cost-Benefit Analysis

    DTIC Science & Technology

    2015-03-12

    It is reported that healthcare systems waste billions of dollars [1]; even President Barack Obama (2014) has expressed concerns over the inefficiencies in the healthcare system. … First, the monthly salary paid to each staff member, based on type, is accounted for. Second, the cost equivalent of wait-time reduced or …

  16. Real-Time Multimission Event Notification System for Mars Relay

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.

    2013-01-01

    As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as they are received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued when the user is not online for later viewing. The software does not do away with the email notifications, but augments them with in-line notifications. Further, this software expands the set of events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" among the other e-mail that comes in. This software allows an expanded set of notifications (including user-generated ones) to be displayed in-line in the program. This separation of notifications can improve a user's workflow.

  17. Wire chamber requirements and tracking simulation studies for tracking systems at the superconducting super collider

    SciTech Connect

    Hanson, G.G.; Niczyporuk, B.B.; Palounek, A.P.T.

    1989-02-01

    Limitations placed on wire chambers by radiation damage and rate requirements in the SSC environment are reviewed. Possible conceptual designs for wire chamber tracking systems which meet these requirements are discussed. Computer simulation studies of tracking in such systems are presented. Simulations of events from interesting physics at the SSC, including hits from minimum bias background events, are examined. Results of some preliminary pattern recognition studies are given. Such computer simulation studies are necessary to determine the feasibility of wire chamber tracking systems for complex events in a high-rate environment such as the SSC. 11 refs., 9 figs., 1 tab.

  18. Power System Extreme Event Detection: The Vulnerability Frontier

    SciTech Connect

    Lesieutre, Bernard C.; Pinar, Ali; Roy, Sandip

    2007-10-17

    In this work we apply graph-theoretic tools to provide a close bound on a frontier relating the number of line outages in a grid to the power disrupted by the outages. This frontier describes the boundary of a space relating the possible severity of a disturbance, in terms of power disruption, from zero to some maximum on the boundary, to the number of line outages involved in the event. We present the usefulness of this analysis with a complete analysis of a 30-bus system, and present results for larger systems.

  19. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

    The performance of personnel in the augmentation and improvement of the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, the data processing loads imposed on these configurations, and the executive software that controls system operations. Through these simulations, NASA has an extremely cost-effective capability for the design and analysis of computer-based data management systems.

  1. DeMO: An Ontology for Discrete-event Modeling and Simulation.

    PubMed

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-09-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community.

  2. Evaluating resilience of DNP3-controlled SCADA systems against event buffer flooding

    SciTech Connect

    Yan, Guanhua; Nicol, David M; Jin, Dong

    2010-12-16

    The DNP3 protocol is widely used in SCADA systems (particularly in electrical power) as a means of communicating observed sensor state information back to a control center. Typical architectures using DNP3 have a two-level hierarchy, where a specialized data aggregator device receives observed state from devices within a local region, and the control center collects the aggregated state from the data aggregator. The DNP3 communication between control center and data aggregator is asynchronous with the DNP3 communication between data aggregator and relays; this leads to the possibility of completely filling a data aggregator's buffer of pending events when a relay is compromised or spoofed and sends too many (false) events to the data aggregator. This paper investigates how a real-world SCADA device responds to event buffer flooding. A Discrete-Time Markov Chain (DTMC) model is developed for understanding this behavior. The DTMC model is validated by a Moebius simulation model and data collected on a real SCADA testbed.
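
    The role the DTMC plays here can be sketched as a birth-death chain over buffer occupancy with per-slot arrival and poll probabilities; the overflow probability then falls out of the stationary distribution. The capacity and probabilities below are invented, not the paper's testbed measurements.

      import numpy as np

      N, p, q = 10, 0.6, 0.5            # buffer capacity, arrival prob, poll prob
      P = np.zeros((N + 1, N + 1))
      for i in range(N + 1):
          up = p * (1 - q) if i < N else 0.0     # arrival and no poll
          down = q * (1 - p) if i > 0 else 0.0   # poll and no arrival
          P[i, min(i + 1, N)] += up
          P[i, max(i - 1, 0)] += down
          P[i, i] += 1.0 - up - down             # occupancy unchanged otherwise
      # stationary distribution: left eigenvector of P for eigenvalue 1
      w, v = np.linalg.eig(P.T)
      pi = np.real(v[:, np.argmax(np.real(w))])
      pi /= pi.sum()
      print(f"P(buffer full) = {pi[N]:.3f}")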

  3. The Evaluation of a Pulmonary Display to Detect Adverse Respiratory Events Using High Resolution Human Simulator

    PubMed Central

    Wachter, S. Blake; Johnson, Ken; Albert, Robert; Syroid, Noah; Drews, Frank; Westenskow, Dwayne

    2006-01-01

    Objective The authors developed a picture-graphics display for pulmonary function to present typical respiratory data used in perioperative and intensive care environments. The display utilizes color, shape and emergent alerting to highlight abnormal pulmonary physiology, and serves as an adjunct to traditional operating room displays and monitors. Design To evaluate the prototype, nineteen clinician volunteers each managed four adverse respiratory events and one normal event using a high-resolution patient simulator that included either the new displays (intervention subjects) or traditional displays (control subjects). Between-group comparisons included (i) time to diagnosis and treatment for each adverse respiratory event; (ii) the number of unnecessary treatments during the normal scenario; and (iii) self-reported workload estimates while managing study events. Measurements Two expert anesthesiologists reviewed video-taped transcriptions of the volunteers to determine time to treat and time to diagnosis. Time values were then compared between groups using a Mann-Whitney U test. Estimated workload for both groups was assessed using the NASA-TLX and compared between groups using an ANOVA. P-values < 0.05 were considered significant. Results Clinician volunteers detected and treated obstructed endotracheal tubes and intrinsic PEEP problems faster with graphical rather than conventional displays (p < 0.05). During the normal scenario simulation, 3 clinicians using the graphical display and 5 clinicians using the conventional display gave unnecessary treatments. Clinician volunteers reported significantly lower subjective workloads using the graphical display for the obstructed endotracheal tube scenario (p < 0.001) and the intrinsic PEEP scenario (p < 0.03). Conclusion The authors conclude that the graphical pulmonary display may serve as a useful adjunct to traditional displays in identifying adverse respiratory events. PMID:16929038

  4. Mutual Events in the Uranian satellite system in 2007

    NASA Astrophysics Data System (ADS)

    Arlot, J. E.

    2008-09-01

    The equinox time on the giant planets: When the Sun crosses the equatorial plane of a giant planet, it is equinox time, occurring every half orbit of the planet, i.e. every 6 years for Jupiter, 14 years for Saturn, 42 years for Uranus and 82 years for Neptune. Except for Neptune, each planet has several major satellites orbiting in its equatorial plane; during the equinox time, the satellites therefore eclipse each other mutually. Since the Earth follows the Sun, during the equinox time a terrestrial observer will also see the satellites occult each other during the same period. These events may be observed with photometric receivers since the light from the satellites decreases during the events. The light curve provides information on the geometric configuration of the satellites at the time of the event with an accuracy of a few kilometers, not depending on the distance of the satellite system. We are thus able to get an astrometric observation with an accuracy several times better than direct imaging provides for positions. Equinox on Uranus in 2007: In 2007 it was equinox time on Uranus. The Sun crossed the equatorial plane of Uranus on December 6, 2007. Since the Uranus-Sun opposition was at the end of August 2007, observations were performed from May to December 2007. Since the declination of Uranus was between -5 and -6 degrees, observations were better made from the southern hemisphere. However, some difficulties had to be overcome: the faintness of the satellites (magnitude between 14 and 16) and the brightness of the planet (magnitude 5), which make photometric observation of the satellites difficult. The use of a K' filter on a large telescope allowed the number of observable events to be increased. Dynamics of the Uranian satellites: One of the goals of the observations was to evaluate the accuracy of the current dynamical models of the motion of the satellites. This knowledge is important for several reasons: most of the time the Uranian system is

  5. Systemic chemokine levels, coronary heart disease, and ischemic stroke events

    PubMed Central

    Canouï-Poitrine, F.; Luc, G.; Mallat, Z.; Machez, E.; Bingham, A.; Ferrieres, J.; Ruidavets, J.-B.; Montaye, M.; Yarnell, J.; Haas, B.; Arveiler, D.; Morange, P.; Kee, F.; Evans, A.; Amouyel, P.; Ducimetiere, P.

    2011-01-01

    Objectives: To quantify the association between systemic levels of the chemokine regulated on activation normal T-cell expressed and secreted (RANTES/CCL5), interferon-γ-inducible protein-10 (IP-10/CXCL10), monocyte chemoattractant protein-1 (MCP-1/CCL2), and eotaxin-1 (CCL11) with future coronary heart disease (CHD) and ischemic stroke events and to assess their usefulness for CHD and ischemic stroke risk prediction in the PRIME Study. Methods: After 10 years of follow-up of 9,771 men, 2 nested case-control studies were built including 621 first CHD events and 1,242 matched controls and 95 first ischemic stroke events and 190 matched controls. Standardized hazard ratios (HRs) for each log-transformed chemokine were estimated by conditional logistic regression. Results: None of the 4 chemokines were independent predictors of CHD, either with respect to stable angina or to acute coronary syndrome. Conversely, RANTES (HR = 1.70; 95% confidence interval [CI] 1.05–2.74), IP-10 (HR = 1.53; 95% CI 1.06–2.20), and eotaxin-1 (HR = 1.59; 95% CI 1.02–2.46), but not MCP-1 (HR = 0.99; 95% CI 0.68–1.46), were associated with ischemic stroke independently of traditional cardiovascular risk factors, hs-CRP, and fibrinogen. When the first 3 chemokines were included in the same multivariate model, RANTES and IP-10 remained predictive of ischemic stroke. Their addition to a traditional risk factor model predicting ischemic stroke substantially improved the C-statistic from 0.6756 to 0.7425 (p = 0.004). Conclusions: In asymptomatic men, higher systemic levels of RANTES and IP-10 are independent predictors of ischemic stroke but not of CHD events. RANTES and IP-10 may improve the accuracy of ischemic stroke risk prediction over traditional risk factors. PMID:21849651

  6. Sensitivity of a simulated extreme precipitation event to spatial resolution, parametrisations and assimilation

    NASA Astrophysics Data System (ADS)

    Ferreira, J.; Carvalho, A.; Carvalheiro, L.; Rocha, A.; Castanheira, J.

    2010-09-01

    The first part of this study evaluates the sensitivity of the model to horizontal resolution and physical parametrisations in the prediction of the selected extreme precipitation events. Additionally, two other sensitivity tests were performed with the OP1 configuration: one regarding the cumulus physics parametrisation, which was switched off (i.e. convective eddies were calculated explicitly) to compare the results with the operational configuration, and the other with assimilation of surface and upper-air data. The physical processes of the precipitation in this period are revealed through the analysis of the precipitation fields associated with the microphysics and cumulus parametrisations. During the early morning, microphysics plays an important role, whereas late-morning precipitation is due to a squall-line convective system. As expected, the results show that model resolution affects the amount of predicted precipitation, and the parameterizations affect the location and time of the extreme precipitation. For this particular event, assimilation seems to degrade the simulation, particularly the precipitation maximum.

  7. An Automated Visual Event Detection System for Cabled Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Mariette, J.

    2007-12-01

    The permanent presence of underwater cameras on oceanic cabled observatories, such as the Victoria Experimental Network Under the Sea (VENUS) and Eye-In-The-Sea (EITS) on the Monterey Accelerated Research System (MARS), will generate valuable data that can push forward the boundaries of understanding the underwater world. However, sightings of underwater animal activities are rare, resulting in the recording of many hours of video with relatively few events of interest. The burden of video management and analysis often requires reducing the amount of video recorded and later analyzed. Sometimes there are not enough human resources to analyze the video, and the strain on human attention needed to analyze video demands an automated way to assist in the analysis. Towards this end, an Automated Visual Event Detection System (AVED) is in development at the Monterey Bay Aquarium Research Institute (MBARI) to address the problem of analyzing cabled observatory video. Here we describe the overall design of the system to process video data and enable science users to analyze the results. We present our results analyzing video from the VENUS observatory and test data from EITS deployments. This automated system for detecting visual events includes a collection of custom and open-source software that can be run three ways: through a Web service, through a Condor-managed pool of AVED-enabled compute servers, or locally on a single computer. The collection of software also includes a graphical user interface to preview or edit detected results and to set up processing options. To optimize the compute-intensive AVED algorithms, a parallel program has been implemented for high-data-rate applications like the EITS instrument on MARS.

  8. An adverse event capture and management system for cancer studies

    PubMed Central

    2015-01-01

    Background Comprehensive capture of Adverse Events (AEs) is crucial for monitoring the side effects of a therapy while assessing efficacy. For cancer studies, the National Cancer Institute has developed the Common Terminology Criteria for Adverse Events (CTCAE) as a required standard for recording attributes and grading AEs. AE assessments should be part of the Electronic Health Record (EHR) system; yet, due to patient-centric EHR design and implementation, many EHRs do not provide straightforward functions to assess ongoing AEs to indicate a resolution or a grade change for clinical trials. Methods At UAMS, we have implemented a standards-based Adverse Event Reporting System (AERS) that is integrated with the Epic EHR and other research systems to track new and existing AEs, including automated lab-result grading, in a regulatory-compliant manner. Within a patient's chart, providers can launch AERS, which opens the patient's ongoing AEs by default and allows providers to assess (resolution/ongoing) existing AEs. In another tab, it allows providers to create a new AE. Also, we have separated symptoms from diagnoses in the CTCAE to minimize inaccurate designation of clinical observations. Upon completion of assessments, a physician submits the AEs to the EHR via a Health Level 7 (HL7) message and then to other systems utilizing a Representational State Transfer Web service. Conclusions AERS currently supports CTCAE versions 3 and 4, with more than 65 cancer studies and 350 patients on those studies. This type of standard integrated into the EHR aids research and data sharing in a compliant, efficient, and safe manner. PMID:26424052

  9. Did the Solar system form in a sequential triggered star formation event?

    NASA Astrophysics Data System (ADS)

    Parker, Richard J.; Dale, James E.

    2016-02-01

    The presence and abundance of the short-lived radioisotopes (SLRs) 26Al and 60Fe during the formation of the Solar system is difficult to explain unless the Sun formed in the vicinity of one or more massive star(s) that exploded as supernovae. Two different scenarios have been proposed to explain the delivery of SLRs to the protosolar nebula: (i) direct pollution of the protosolar disc by supernova ejecta, and (ii) the formation of the Sun in a sequential star formation event in which supernova shockwaves trigger further star formation that is enriched in SLRs. The sequentially triggered model has been suggested as being more astrophysically likely than the direct pollution scenario. In this paper, we investigate this claim by analysing a combination of N-body and smoothed particle hydrodynamics simulations of star formation. We find that sequential star formation would result in large age spreads (or even bi-modal age distributions for spatially coincident events) due to the dynamical relaxation of the first star formation event(s). Secondly, we discuss the probability of triggering spatially and temporally discrete populations of stars and find this to be possible only in very contrived situations. Taken together, these results suggest that the formation of the Solar system in a triggered star formation event is as improbable as, if not more improbable than, the direct pollution of the protosolar disc by a supernova.

  10. Simulation of extreme rainfall event of November 2009 over Jeddah, Saudi Arabia: the explicit role of topography and surface heating

    NASA Astrophysics Data System (ADS)

    Almazroui, Mansour; Raju, P. V. S.; Yusef, A.; Hussein, M. A. A.; Omar, M.

    2017-02-01

    In this paper, a nonhydrostatic Weather Research and Forecasting (WRF) model has been used to simulate the extreme precipitation event of 25 November 2009 over Jeddah, Saudi Arabia. The model is integrated in three nested domains (27, 9, and 3 km) with initial and boundary forcing derived from the NCEP reanalysis datasets. As a control experiment, the model was integrated for 48 h starting at 0000 UTC on 24 November 2009. The simulated rainfall in the control experiment is in good agreement with Tropical Rainfall Measuring Mission rainfall estimates in terms of intensity as well as spatio-temporal distribution. Results indicate that a strong low-level (850 hPa) wind over Jeddah and surrounding regions enhanced the moisture and temperature gradient and created a conditionally unstable atmosphere that favored the development of the mesoscale system. To investigate the influence of topography and of surface-atmosphere heat exchange on the development of the extreme precipitation event, two sensitivity experiments were carried out: one without topography and another without exchange of surface heating to the atmosphere. The results show that both surface heating and topography played a crucial role in determining the spatial distribution and intensity of the extreme rainfall over Jeddah. The topography favored enhanced uplift motion that further strengthened the low-level jet and hence the rainfall over Jeddah and adjacent areas. On the other hand, the absence of surface heating reduced the simulated rainfall considerably, by 30% as compared to the observations.

  11. Consistent simulations of multiple proxy responses to an abrupt climate change event.

    PubMed

    LeGrande, A N; Schmidt, G A; Shindell, D T; Field, C V; Miller, R L; Koch, D M; Faluvegi, G; Hoffmann, G

    2006-01-24

    Isotope, aerosol, and methane records document an abrupt cooling event across the Northern Hemisphere at 8.2 kiloyears before present (kyr), while separate geologic lines of evidence document the catastrophic drainage of the glacial Lakes Agassiz and Ojibway into the Hudson Bay at approximately the same time. This melt water pulse may have been the catalyst for a decrease in North Atlantic Deep Water formation and subsequent cooling around the Northern Hemisphere. However, lack of direct evidence for ocean cooling has led to speculation that this abrupt event was purely local to Greenland and called into question this proposed mechanism. We simulate the response to this melt water pulse using a coupled general circulation model that explicitly tracks water isotopes and with atmosphere-only experiments that calculate changes in atmospheric aerosol deposition (specifically (10)Be and dust) and wetland methane emissions. The simulations produce a short period of significantly diminished North Atlantic Deep Water and are able to quantitatively match paleoclimate observations, including the lack of isotopic signal in the North Atlantic. This direct comparison with multiple proxy records provides compelling evidence that changes in ocean circulation played a major role in this abrupt climate change event.

  12. A global MHD simulation of an event with a quasi-steady northward IMF

    NASA Astrophysics Data System (ADS)

    Merkin, V. G.; Papadopoulos, D.; Lyon, J.; Anderson, B.

    2005-12-01

    We show results of a global MHD simulation, using the Lyon-Fedder-Mobarry (LFM) model, of an event previously examined using data from Iridium spacecraft observations as well as DMSP and IMAGE FUV data. The event is chosen because of the steady northward IMF sustained over a three-hour period during 16 July 2000. The Iridium observations showed very weak or absent Region 2 currents in the ionosphere, which makes the event favorable for global MHD modeling, despite the fact that it occurred on the day after the "Bastille Day" storm, and there was a significant remnant ring current in the magnetosphere as indicated by a relatively high Dst index. Here, we compare the ionospheric field-aligned current and electric potential patterns with those recovered from Iridium observations. Particular attention is paid to a comparative analysis of the Poynting flux and the energy flux of precipitating particles, and to the verification of the simulated particle flux against IMAGE FUV observations, which is used to validate the LFM precipitation model during weak driving.

  13. Study of reconnection events through Global MHD simulation and observational data

    NASA Astrophysics Data System (ADS)

    Cardoso, F. R.; Gonzalez, W. D.; Sibeck, D. G.; Kuznetsova, M. M.; Alves, M. V.

    2011-12-01

    Magnetic reconnection is the dominant mechanism for solar wind energy and momentum transfer to the magnetosphere. It can be a continuous or a transient process. Time-varying reconnection produces flux transfer events (FTEs), which can be identified by bipolar signatures in the component of the magnetic field normal to the magnetopause, deflections in the tangential component, and variations in the magnetic field magnitude. Some events exhibit the mixed magnetospheric and magnetosheath plasma populations expected for reconnection. Global magnetohydrodynamics (MHD) simulations are important tools for understanding the relevant magnetic reconnection mechanisms. We have identified magnetic reconnection events, especially FTEs, in global MHD simulations and observations. We study their spatial and temporal characteristics as a function of solar wind parameters, in particular the interplanetary magnetic field orientation. We determine the origin of FTEs as well as the properties that describe them, such as their dimension, extent, and motion as a function of time. In particular, we track the motion of FTEs in an attempt to determine their point of origin, their destination, and how fast they move.

  14. Method for simulating discontinuous physical systems

    DOEpatents

    Baty, Roy S.; Vaughn, Mark R.

    2001-01-01

    The mathematical foundations of conventional numerical simulation of physical systems provide no consistent description of the behavior of such systems when subjected to discontinuous physical influences. As a result, the numerical simulation of such problems requires ad hoc encoding of specific experimental results in order to address the behavior of such discontinuous physical systems. In the present invention, these foundations are replaced by a new combination of generalized function theory and nonstandard analysis. The result is a class of new approaches to the numerical simulation of physical systems which allows the accurate and well-behaved simulation of discontinuous and other difficult physical systems, as well as simpler physical systems. Applications of this new class of numerical simulation techniques to process control, robotics, and apparatus design are outlined.

  15. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department

    PubMed Central

    Kittipittayakorn, Cholada

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department. PMID:27195606
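
    To make the DES side of the approach concrete, here is a minimal discrete-event sketch of a registration queue using the open-source simpy library; the arrival rate, service time, and desk capacity are invented and are unrelated to the study's data, and the agent-based layer is omitted.

        # Minimal discrete-event model of a registration queue (simpy).
        import random
        import simpy

        waits = []

        def patient(env, desk):
            arrive = env.now
            with desk.request() as req:
                yield req                                       # wait in queue
                waits.append(env.now - arrive)
                yield env.timeout(random.expovariate(1 / 5.0))  # ~5 min service

        def arrivals(env, desk):
            while True:
                yield env.timeout(random.expovariate(1 / 3.0))  # ~1 per 3 min
                env.process(patient(env, desk))

        env = simpy.Environment()
        desk = simpy.Resource(env, capacity=2)              # two clerks
        env.process(arrivals(env, desk))
        env.run(until=480)                                  # one 8-hour day
        print(f"mean wait: {sum(waits) / len(waits):.1f} min "
              f"over {len(waits)} patients")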

  16. Numerical simulation of a rare winter hailstorm event over Delhi, India on 17 January 2013

    NASA Astrophysics Data System (ADS)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-12-01

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. The absence of increased surface temperature or of low-level moisture incursion during winter cannot generate the deep convection required for sustaining a hailstorm. Consequently, NCR sees very few hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00-18:00 UTC) occurring over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options: hail and graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR through a comparative analysis of the two options of GCE microphysics. Upon evaluating the model simulations, it is observed that the hail option matches the precipitation intensity of the Tropical Rainfall Measuring Mission (TRMM) observations more closely than the graupel option does, and it is able to simulate hail precipitation. Using the model-simulated output with the hail option, a detailed investigation of the dynamics of the hailstorm is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones as the cloud formation reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form, owing to low-level available potential energy and moisture incursion along with upper-level baroclinic instability due to the presence of a western disturbance (WD). Such rare positioning is found to lower the tropopause with an increased temperature gradient, leading to winter hailstorm formation.

  17. Numerical simulation of a winter hailstorm event over Delhi, India on 17 January 2013

    NASA Astrophysics Data System (ADS)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-09-01

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. The absence of increased surface temperature or of low-level moisture incursion during winter cannot generate the deep convection required for sustaining a hailstorm. Consequently, NCR sees very few hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00-18:00 UTC) occurring over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options, hail or graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR with a comparative analysis of the two options of GCE microphysics. On evaluating the model simulations, it is observed that the hail option shows precipitation intensity closer to the TRMM observations than the graupel option and is able to simulate hail precipitation. Using the model-simulated output with the hail option, a detailed investigation of the dynamics of the hailstorm is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones as the cloud formation reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form, owing to low-level available potential energy and moisture incursion along with upper-level baroclinic instability due to the presence of a western disturbance (WD). Such rare positioning is found to lower the tropopause with an increased temperature gradient, leading to winter hailstorm formation.

  18. Simulation of abrasive water jet cutting process: Part 1. Unit event approach

    NASA Astrophysics Data System (ADS)

    Lebar, Andrej; Junkar, Mihael

    2004-11-01

    Abrasive water jet (AWJ) machined surfaces exhibit the texture typical of machining with high-energy-density beam processing technologies: a superior surface quality in the upper region and a rough surface in the lower zone, with pronounced texture marks called striations. The nature of the mechanisms involved in AWJ machining is still not well understood but is essential for improving AWJ control. In this paper, the development of an AWJ machining simulation is reported. It is based on an AWJ process unit event, which in this case represents the impact of a single abrasive grain. The geometrical characteristics of the unit event are measured on a physical model of the AWJ process. The measured dependences and the proposed model relations are then implemented in the AWJ machining process simulation. The obtained results show good agreement in the engraving regime of AWJ machining. To extend the validity of the simulation further, a cellular automata approach is explored in the second part of the paper.

  19. An investigation into pilot and system response to critical in-flight events, volume 2

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.

    1981-01-01

    Critical in-flight events are studied using mission simulation and written tests of pilot responses. Materials and procedures used in knowledge tests, written tests, and mission simulations are included.

  20. A computer management system for patient simulations.

    PubMed

    Finkelsteine, M W; Johnson, L A; Lilly, G E

    1991-04-01

    A series of interactive videodisc patient simulations is being used to teach clinical problem-solving skills, including diagnosis and management, to dental students. This series is called Oral Disease Simulations for Diagnosis and Management (ODSDM). A computer management system has been developed in response to the following needs. First, the sequence in which students perform simulations is critical. Second, maintaining records of completed simulations and student performance on each simulation is a time-consuming task for faculty. Third, the simulations require ongoing evaluation to ensure high quality instruction. The primary objective of the management system is to ensure that each student masters diagnosis. Mastery must be obtained at a specific level before advancing to the next level. The management system does this by individualizing the sequence of the simulations to adapt to the needs of each student. The management system generates reports which provide information about students or the simulations. Student reports contain demographic and performance information. System reports include information about individual patient simulations and act as a quality control mechanism for the simulations.

  1. Rare switching events in non-stationary systems.

    PubMed

    Becker, Nils B; ten Wolde, Pieter Rein

    2012-05-07

    Physical systems with many degrees of freedom can often be understood in terms of transitions between a small number of metastable states. For time-homogeneous systems with short-term memory these transitions are fully characterized by a set of rate constants. We consider the question of how to extend such a coarse-grained description to non-stationary systems and to systems with finite memory. We identify the physical regimes in which time-dependent rates are meaningful, and state microscopic expressions that can be used to measure both externally time-dependent and history-dependent rates in microscopic simulations. Our description can be used to generalize Markov state models to time-dependent Markovian or non-Markovian systems.
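
    The abstract does not reproduce the paper's microscopic expressions; for orientation, one conventional way to define a time-dependent switching rate is as the hazard rate of the survival probability of remaining in a metastable state A, written in LaTeX below (a textbook definition, not necessarily the authors' exact formulation):

        k_{A \to B}(t) \;=\; -\,\frac{\mathrm{d}}{\mathrm{d}t}\,\ln S_A(t),
        \qquad
        S_A(t) \;=\; \Pr\!\left[\text{no } A \to B \text{ transition in } [0,t]\right]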

  2. Spatial and Temporal Signatures of Flux Transfer Events in Global Simulations of Magnetopause Dynamics

    NASA Technical Reports Server (NTRS)

    Kuznetsova, Maria M.; Sibeck, David Gary; Hesse, Michael; Berrios, David; Rastaetter, Lutz; Toth, Gabor; Gombosi, Tamas I.

    2011-01-01

    Flux transfer events (FTEs) were originally identified by transient bipolar variations of the magnetic field component normal to the nominal magnetopause, centered on enhancements in the total magnetic field strength. Recent Cluster and THEMIS multi-point measurements have provided a wide range of signatures that are interpreted as evidence of FTE passage (e.g., crater FTEs, traveling magnetic erosion regions). We use the global magnetohydrodynamic (MHD) code BATS-R-US, developed at the University of Michigan, to model the global three-dimensional structure and temporal evolution of FTEs during multi-spacecraft magnetopause crossing events. A comparison of observed and simulated signatures and a sensitivity analysis of the results to the probe location will be presented. We will demonstrate a variety of observable signatures in the magnetic field profile that depend on space probe location with respect to the FTE passage. The global structure of FTEs will be illustrated using advanced visualization tools developed at the Community Coordinated Modeling Center.

  3. Integrating Existing Simulation Components into a Cohesive Simulation System

    NASA Technical Reports Server (NTRS)

    McLaughlin, Brian J.; Barrett, Larry K.

    2012-01-01

    A tradition of leveraging the re-use of components to help manage costs has evolved in the development of complex systems. This tradition continues in the Joint Polar Satellite System (JPSS) Program with the cloning of the Suomi National Polar-orbiting Partnership (NPP) satellite for the JPSS-1 mission, including the instrument complement. One benefit of re-use on a mission is the availability of existing simulation assets from the systems that were previously built. An issue arises from the continual shift of technology over a long mission, or multi-mission, lifecycle. As the missions mature, the requirements for the observatory simulations evolve. The challenge in this environment becomes re-using the existing components in that ever-changing landscape. To meet this challenge, the system must: establish an operational architecture that minimizes impacts on the implementation of individual components, consolidate the satisfaction of new high-impact requirements into system-level infrastructure, and build in a long-term view of system adaptation that spans the full lifecycle of the simulation system. The Flight Vehicle Test Suite (FVTS) within the JPSS Program is defining and executing this approach to ensure a robust simulation capability for the JPSS multi-mission environment.

  4. Role of land state in a high resolution mesoscale model for simulating the Uttarakhand heavy rainfall event over India

    NASA Astrophysics Data System (ADS)

    Rajesh, P. V.; Pattnaik, S.; Rai, D.; Osuri, K. K.; Mohanty, U. C.; Tripathy, S.

    2016-04-01

    In 2013, the Indian summer monsoon witnessed a very heavy rainfall event (>30 cm/day) over Uttarakhand in north India, claiming more than 5000 lives and causing property damage worth approximately 40 billion USD. This event was associated with the interaction of two synoptic systems: an intensified subtropical westerly trough over north India and a north-westward-moving monsoon depression formed over the Bay of Bengal. The event occurred over highly variable terrain and land surface characteristics. Although global models predicted the large-scale event, they failed to predict the realistic location, timing, amount, intensity, and distribution of rainfall over the region. The goal of this study is to assess the impact of land state conditions in simulating this severe event using a high resolution mesoscale model. Land conditions such as multi-layer soil moisture and soil temperature fields were generated from the High Resolution Land Data Assimilation System (HRLDAS). Two experiments were conducted, namely (1) CNTL (control, without land data assimilation) and (2) LDAS, with land data assimilation (i.e., with HRLDAS-based soil moisture and temperature fields), using the Weather Research and Forecasting (WRF) modelling system. The initial soil moisture correlation and root mean square error are 0.73 and 0.05 for LDAS, whereas for CNTL they are 0.63 and 0.053, respectively, with a stronger heat low in LDAS. The differences in wind and moisture transport in LDAS favoured increased moisture transport from the Arabian Sea through a convectively unstable region embedded within two low pressure centers over the Arabian Sea and the Bay of Bengal. The improvement in rainfall is significantly correlated with the persistent generation of potential vorticity (PV) in LDAS. Further, PV tendency analysis confirmed that the increased generation of PV is due to the enhanced horizontal PV advection component rather than to the diabatic heating terms, owing to the modified flow fields. These results suggest that, two

  5. Patient flow improvement for an ophthalmic specialist outpatient clinic with aid of discrete event simulation and design of experiment.

    PubMed

    Pan, Chong; Zhang, Dali; Kon, Audrey Wan Mei; Wai, Charity Sue Lea; Ang, Woo Boon

    2015-06-01

    Continuous improvement in process efficiency for specialist outpatient clinic (SOC) systems is increasingly being demanded due to the growth of the patient population in Singapore. In this paper, we propose a discrete event simulation (DES) model to represent the patient and information flow in an ophthalmic SOC system in the Singapore National Eye Centre (SNEC). Different improvement strategies to reduce the turnaround time for patients in the SOC were proposed and evaluated with the aid of the DES model and the Design of Experiment (DOE). Two strategies for better patient appointment scheduling and one strategy for dilation-free examination are estimated to have a significant impact on turnaround time for patients. One of the improvement strategies has been implemented in the actual SOC system in the SNEC with promising improvement reported.

  6. Simulations of The Extreme Precipitation Event Enhanced by Sea Surface Temperature Anomaly over the Black Sea

    NASA Astrophysics Data System (ADS)

    Hakan Doǧan, Onur; Önol, Barış

    2016-04-01

    In this study, we examined an extreme precipitation case over the Eastern Black Sea region of Turkey using the regional climate model RegCM4. The flood caused by excessive rain on August 26, 2010 killed 12 people, and the associated landslides in Rize province damaged many buildings. The station-based two-day total precipitation exceeded 200 mm. One of the usual suspects for this extreme event is a positive sea surface temperature (SST) anomaly over the Black Sea, where a significant warming trend has been clear over the last three decades. In August 2010, the monthly mean SST was more than 3 °C above the 1981-2010 mean. We designed three sensitivity simulations with RegCM4 to quantify the effect of the Black Sea as a moisture source. The simulation domain, with 10-km horizontal resolution, covers all the countries bordering the Black Sea, and the simulation period spans the entire month of August 2010. The spatial variability of the precipitation produced by the reference simulation (Sim-0) is consistent with the TRMM data. To analyze the sensitivity to SST, we forced the simulations by subtracting 1 °C (Sim-1), 2 °C (Sim-2) and 3 °C (Sim-3) from the ERA-Interim 6-hourly SST data (over the Black Sea only). The sensitivity simulations indicate that daily total precipitation gradually decreases relative to the reference simulation (Sim-0). The 3-hourly maximum precipitation rates for Sim-0, Sim-1, Sim-2 and Sim-3 are 32, 25, 13 and 10.5 mm, respectively, over the hotspot region. Although all the simulations point in the same direction, the reduction in precipitation intensity is not of the same magnitude across simulations; it is revealed that the 2 °C (Sim-2) threshold is critical for SST sensitivity. We also calculated the humidity differences from the simulation and these

  7. Hydrocode simulation of the Chicxulub impact event and the production of climatically active gases

    NASA Astrophysics Data System (ADS)

    Pierazzo, Elisabetta; Kring, David A.; Melosh, H. Jay

    1998-12-01

    We constructed a numerical model of the Chicxulub impact event using the Chart-D Squared (CSQ) code coupled with the ANalytic Equation Of State (ANEOS) package. In the simulations we utilized a target stratigraphy based on borehole data and employed newly developed equations of state for the materials that are believed to play a crucial role in the impact-related extinction hypothesis: carbonates (calcite) and evaporites (anhydrite). Simulations explored the effects of different projectile sizes (10 to 30 km in diameter) and porosity (0 to 50%). The effect of impact speed is addressed by doing simulations of asteroid impacts (v_i = 20 km/s) and comet impacts (v_i = 50 km/s). The masses of climatically important species injected into the upper atmosphere by the impact increase with the energy of the impact event, ranging from 350 to 3500 Gt for CO2, from 40 to 560 Gt for S, and from 200 to 1400 Gt for water vapor. While our results are in good agreement with those of Ivanov et al. [1996], our estimated CO2 production is 1 to 2 orders of magnitude lower than the results of Takata and Ahrens [1994], indicating that the impact event enhanced the end-Cretaceous atmospheric CO2 inventory by, at most, 40%. Consequently, sulfur may have been the most important climatically active gas injected into the stratosphere. The amount of S released by the impact is several orders of magnitude higher than any known volcanic eruption and, with H2O, is high enough to produce a sudden and significant perturbation of Earth's climate.

  8. Extended temperature-accelerated dynamics: enabling long-time full-scale modeling of large rare-event systems.

    PubMed

    Bochenkov, Vladimir; Suetin, Nikolay; Shankar, Sadasivan

    2014-09-07

    A new method, Extended Temperature-Accelerated Dynamics (XTAD), is introduced for modeling the long-timescale evolution of large rare-event systems. The method is based on the Temperature-Accelerated Dynamics approach [M. Sørensen and A. Voter, J. Chem. Phys. 112, 9599 (2000)], but uses full-scale parallel molecular dynamics simulations to probe the potential energy surface of an entire system, combined with adaptive on-the-fly system decomposition for analyzing the energetics of rare events. The method removes limitations on feasible system size and makes it possible to handle simultaneous diffusion events, including both large-scale concerted and local transitions. Due to its intrinsically parallel algorithm, XTAD not only allows studies of various diffusion mechanisms in solid state physics, but also opens an avenue for atomistic simulations of a range of technologically relevant processes in materials science, such as thin film growth on nano- and microstructured surfaces.

  9. Using simulation to evaluate warhead monitoring system effectiveness

    SciTech Connect

    Perkins, Casey J.; Brigantic, Robert T.; Keating, Douglas H.; Liles, Karina R.; Meyer, Nicholas J.; Oster, Matthew R.; Waterworth, Angela M.

    2015-07-12

    There is a need to develop and demonstrate technical approaches for verifying potential future agreements to limit and reduce total warhead stockpiles. To facilitate this aim, warhead monitoring systems employ both concepts of operations (CONOPS) and technologies. A systems evaluation approach can be used to assess the relative performance of CONOPS and technologies in their ability to achieve monitoring system objectives, which include: 1) confidence that a treaty accountable item (TAI) initialized by the monitoring system is as declared; 2) confidence that there is no undetected diversion from the monitoring system; and 3) confidence that a TAI is dismantled as declared. Although there are many quantitative methods that can be used to assess system performance for the above objectives, this paper focuses on a simulation perspective, primarily for its ability to support analysis of the probabilities that define the operating characteristics of CONOPS and technologies. This paper describes a discrete event simulation (DES) model comprising three major sub-models: TAI lifecycle flow, monitoring activities, and declaration behavior. The DES model seeks to capture all processes and decision points associated with the progression of virtual TAIs, with notional characteristics, through the monitoring system from initialization through dismantlement. The simulation updates TAI progression (i.e., whether the generated test objects are accepted or rejected at the appropriate points) all the way through dismantlement. Evaluation of TAI lifecycles primarily serves to assess how the order, frequency, and combination of functions in the CONOPS affect system performance as a whole. It is important, however, to note that discrete event simulation is also capable (at a basic level) of addressing vulnerabilities in the CONOPS and interdependencies between individual functions as well. This approach is beneficial because it does not rely on complex mathematical
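
    As a toy illustration of the lifecycle sub-model, the sketch below walks virtual TAIs through a sequence of monitoring stages with a per-stage acceptance probability; the stage names and probabilities are invented, not the paper's CONOPS parameters.

        # Toy Monte Carlo of a treaty-accountable-item (TAI) lifecycle.
        import random

        STAGES = ["initialization", "storage_check",
                  "transport_check", "dismantlement"]      # invented stages

        def run_tai(p_accept=0.98):
            """Return the stage at which a TAI is rejected, or None if it
            is accepted through dismantlement."""
            for stage in STAGES:
                if random.random() > p_accept:
                    return stage
            return None

        rejected = [s for s in (run_tai() for _ in range(10000)) if s]
        print(f"{len(rejected)} of 10000 TAIs flagged; e.g. {rejected[:3]}")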

  10. Helmet mounted display systems for helicopter simulation

    NASA Technical Reports Server (NTRS)

    Haworth, Loran A.; Bucher, Nancy; Runnings, David

    1989-01-01

    Simulation scientists continually pursue improved flight simulation technology with the goal of closely replicating the 'real world' physical environment. The presentation/display of visual information for flight simulation is one such area enjoying recent technical improvements that are fundamental for conducting simulated operations close to the terrain. Detailed and appropriate visual information is especially critical for Nap-Of-the-Earth (NOE) helicopter flight simulation where the pilot maintains an 'eyes-out' orientation to avoid obstructions and terrain. This paper elaborates on the visually coupled Wide Field Of View Helmet Mounted Display (WFOVHMD) system technology as a viable visual display system for helicopter simulation. In addition the paper discusses research conducted on the NASA-Ames Vertical Motion Simulator that examined one critical research issue for helmet mounted displays.

  11. Simulations of Wave Propagation in the Jovian Atmosphere after SL9 Impact Events

    NASA Astrophysics Data System (ADS)

    Pond, Jarrad W.; Palotai, C.; Korycansky, D.; Harrington, J.

    2013-10-01

    Our previous numerical investigations into Jovian impacts, including the Shoemaker-Levy 9 (SL9) event (Korycansky et al. 2006 ApJ 646. 642; Palotai et al. 2011 ApJ 731. 3), the 2009 bolide (Pond et al. 2012 ApJ 745. 113), and the ephemeral flashes caused by smaller impactors in 2010 and 2012 (Hueso et al. 2013; Submitted to A&A), have covered only up to approximately 3 to 30 seconds after impact. Here, we present further SL9 impacts extending to minutes after collision with Jupiter’s atmosphere, with a focus on the propagation of shock waves generated as a result of the impact events. Using a similar yet more efficient remapping method than previously presented (Pond et al. 2012; DPS 2012), we move our simulation results onto a larger computational grid, conserving quantities with minimal error. The Jovian atmosphere is extended as needed to accommodate the evolution of the features of the impact event. We restart the simulation, allowing the impact event to continue to progress to greater spatial extents and for longer times, but at lower resolutions. This remap-restart process can be implemented multiple times to achieve the spatial and temporal scales needed to investigate the observable effects of waves generated by the deposition of energy and momentum into the Jovian atmosphere by an SL9-like impactor. As before, we use the three-dimensional, parallel hydrodynamics code ZEUS-MP 2 (Hayes et al. 2006 ApJ.SS. 165. 188) to conduct our simulations. Wave characteristics are tracked throughout these simulations. Of particular interest are the wave speeds and wave positions in the atmosphere as a function of time. These properties are compared to the characteristics of the HST rings to see if shock wave behavior within one hour of impact is consistent with waves observed at one hour post-impact and beyond (Hammel et al. 1995 Science 267. 1288). This research was supported by National Science Foundation Grant AST-1109729 and NASA Planetary Atmospheres Program Grant

  12. Cosmic rays Monte Carlo simulations for the Extreme Energy Events Project

    NASA Astrophysics Data System (ADS)

    Abbrescia, M.; Agocs, A.; Aiola, S.; Antolini, R.; Avanzini, C.; Baldini Ferroli, R.; Bencivenni, G.; Bossini, E.; Bressan, E.; Chiavassa, A.; Cicalò, C.; Cifarelli, L.; Coccia, E.; De Gruttola, D.; De Pasquale, S.; Di Giovanni, A.; D'Incecco, M.; Dreucci, M.; Fabbri, F. L.; Frolov, V.; Garbini, M.; Gemme, G.; Gnesi, I.; Gustavino, C.; Hatzifotiadou, D.; La Rocca, P.; Li, S.; Librizzi, F.; Maggiora, A.; Massai, M.; Miozzi, S.; Panareo, M.; Paoletti, R.; Perasso, L.; Pilo, F.; Piragino, G.; Regano, A.; Riggi, F.; Righini, G. C.; Sartorelli, G.; Scapparone, E.; Scribano, A.; Selvi, M.; Serci, S.; Siddi, E.; Spandre, G.; Squarcia, S.; Taiuti, M.; Tosello, F.; Votano, L.; Williams, M. C. S.; Yánez, G.; Zichichi, A.; Zuyeuski, R.

    2014-08-01

    The Extreme Energy Events Project (EEE Project) is an innovative experiment to study very high energy cosmic rays by means of the detection of the associated air shower muon component. It consists of a network of tracking detectors installed inside Italian High Schools. Each tracking detector, called EEE telescope, is composed of three Multigap Resistive Plate Chambers (MRPCs). At present, 43 telescopes are installed and taking data, opening the way for the detection of far away coincidences over a total area of about 3 × 10^5 km^2. In this paper we present the Monte Carlo simulations that have been performed to predict the expected coincidence rate between distant EEE telescopes.

  13. Using Discrete Event Simulation to predict KPI's at a Projected Emergency Room.

    PubMed

    Concha, Pablo; Neriz, Liliana; Parada, Danilo; Ramis, Francisco

    2015-01-01

    Discrete Event Simulation (DES) is a powerful tool in the design of clinical facilities. DES enables facilities to be built or adapted to achieve the expected Key Performance Indicators (KPIs), such as average waiting times according to acuity, average stay times, and others. Our computational model was built and validated using expert judgment and supporting statistical data. One scenario studied resulted in a 50% decrease in the average cycle time of patients compared to the original model, mainly by modifying the patient's attention model.

  14. Improving Energy Efficiency for the Vehicle Assembly Industry: A Discrete Event Simulation Approach

    NASA Astrophysics Data System (ADS)

    Oumer, Abduaziz; Mekbib Atnaw, Samson; Kie Cheng, Jack; Singh, Lakveer

    2016-11-01

    This paper presents a Discrete Event Simulation (DES) model for investigating and improving energy efficiency in a vehicle assembly line. The car manufacturing industry is one of the highest energy-consuming industries. Using the Rockwell Arena DES package, a detailed model was constructed for an actual vehicle assembly plant. The sources of energy considered in this research are electricity and fuel, the two main types of energy sources used in a typical vehicle assembly plant. The model depicts the performance measurement for process-specific energy measures of the painting, welding, and assembling processes. A sound energy efficiency model within this industry has a two-fold advantage: reducing CO2 emissions and reducing the costs associated with fuel and electricity consumption. The paper starts with an overview of challenges in energy consumption within the facilities of an automotive assembly line and highlights the parameters for energy efficiency. The results of the simulation model indicated improvements toward energy-saving objectives and reduced costs.

  15. Assessment of extreme precipitation events over Amazon simulated by global climate models from HIGEM family.

    NASA Astrophysics Data System (ADS)

    Custodio, Maria; Ambrizzi, Tercio; da Rocha, Rosmeri

    2015-04-01

    coupled and uncoupled models capture the observed signal of the ENSO and MJO oscillations, although with reversed phase in some cases. The austral summer and winter composites of interannual and intraseasonal anomalies showed, for wet and dry extreme events, the same spatial distribution in models and reanalyses. The interannual variability analysis showed that coupled simulations intensify the impact of the El Niño Southern Oscillation (ENSO) in the Amazon. On the intraseasonal scale, although the simulations intensify this signal, the coupled models show greater similarity to observations than the atmospheric models for the precipitation extremes. Note that there are differences between simulated and observed intraseasonal anomalies, indicating that the models have problems correctly representing the intensity of low-frequency phenomena on this scale. The ENSO simulation skill of the GCMs can be attributed to their high resolution, mainly in the oceanic component, which contributes to better resolving the small-scale vortices in the ocean. This implies improvements in the forecasting of sea surface temperature (SST) and, as a consequence, in the ability of the atmosphere to respond to this feature.

  16. The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model

    PubMed Central

    2012-01-01

    Background In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma but in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. Methods We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Results Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. Conclusions We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. Our model simulates the effect of environmental factors (combustion pollutants and allergens), medication compliance, seasonality
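
    As an illustration of the linkage structure described above (exposures lower FEV1%, and FEV1% feeds risk equations for outcomes), the sketch below uses a linear exposure effect and a logistic risk equation; all coefficients are invented and are not the published model parameters.

        # Hypothetical exposure -> FEV1% -> exacerbation-risk chain.
        import math

        def fev1_percent(baseline=95.0, no2_ppb=20.0, pm25_ugm3=12.0):
            # Illustrative linear decrements per unit exposure.
            return baseline - 0.10 * no2_ppb - 0.20 * pm25_ugm3

        def p_exacerbation(fev1_pct, b0=-1.0, b1=-0.05):
            """Logistic risk: lower FEV1% yields a higher probability."""
            return 1.0 / (1.0 + math.exp(-(b0 + b1 * (fev1_pct - 100.0))))

        print(f"risk = {p_exacerbation(fev1_percent()):.3f}")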

  17. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes is attracting increasing attention: it can ensure safety and provide products of consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system provides an effective platform for research on batch process fault diagnosis.

  18. Quantum law of rare events for systems with bosonic symmetry.

    PubMed

    Sokolovski, D

    2013-03-15

    In classical physics, the joint probability of a number of individually rare independent events is given by the Poisson distribution. It describes, for example, the unidirectional transfer of a population between the densely and sparsely populated states of a classical two-state system. We derive a quantum version of the law for a large number of noninteracting systems (particles) obeying Bose-Einstein statistics. The classical law is significantly modified by quantum interference, which allows, among other effects, for the counterflow of particles back into the densely populated state. The suggested observation of this classically forbidden counterflow effect can be achieved with modern laser-based techniques used for manipulating and trapping cold atoms.
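
    For reference, the classical law invoked above is the Poisson distribution: for N independent systems, each making a rare transition with probability p, the number n transferred obeys

        P(n) \;=\; \frac{\lambda^{n}\, e^{-\lambda}}{n!},
        \qquad \lambda = N p ,

    which the paper then modifies for noninteracting particles obeying Bose-Einstein statistics.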

  19. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    SciTech Connect

    D.L. McGregor

    2000-12-20

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  20. Integrating event detection system operation characteristics into sensor placement optimization.

    SciTech Connect

    Hart, William Eugene; McKenna, Sean Andrew; Phillips, Cynthia Ann; Murray, Regan Elizabeth; Hart, David Blaine

    2010-05-01

    We consider the problem of placing sensors in a municipal water network when we can choose both the location of sensors and the sensitivity and specificity of the contamination warning system. Sensor stations in a municipal water distribution network continuously send sensor output information to a centralized computing facility, and event detection systems at the control center determine when to signal an anomaly worthy of response. Although most sensor placement research has assumed perfect anomaly detection, signal analysis software has parameters that control the tradeoff between false alarms and false negatives. We describe a nonlinear sensor placement formulation, which we heuristically optimize with a linear approximation that can be solved as a mixed-integer linear program. We report the results of initial experiments on a real network and discuss tradeoffs between early detection of contamination incidents, and control of false alarms.
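
    For flavor, the sketch below states a classic expected-impact placement model (a p-median-style mixed-integer program) with the open-source PuLP modeler. The network, impact values, and sensor budget are invented, and the paper's actual formulation additionally folds in the detector's false-alarm/false-negative trade-off, which this sketch omits.

        # p-median-style sensor placement MILP (illustrative data).
        import pulp

        scenarios = ["a1", "a2"]              # contamination incidents
        locations = ["n1", "n2", "n3"]        # candidate sensor nodes
        impact = {"a1": {"n1": 5, "n2": 9, "n3": 2},   # invented impacts
                  "a2": {"n1": 4, "n2": 1, "n3": 8}}
        budget = 1                            # number of sensors allowed

        prob = pulp.LpProblem("placement", pulp.LpMinimize)
        s = pulp.LpVariable.dicts("s", locations, cat="Binary")
        x = pulp.LpVariable.dicts(
            "x", [(a, n) for a in scenarios for n in locations], cat="Binary")
        prob += pulp.lpSum(impact[a][n] * x[a, n]
                           for a in scenarios for n in locations)
        for a in scenarios:
            prob += pulp.lpSum(x[a, n] for n in locations) == 1
            for n in locations:
                prob += x[a, n] <= s[n]       # detect only at placed sensors
        prob += pulp.lpSum(s[n] for n in locations) <= budget
        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        print([n for n in locations if s[n].value() == 1])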

  1. Event-Triggered Fault Detection Filter Design for a Continuous-Time Networked Control System.

    PubMed

    Wang, Yu-Long; Shi, Peng; Lim, Cheng-Chew; Liu, Yuan

    2016-12-01

    This paper studies the problem of event-triggered fault detection filter (FDF) and controller coordinated design for a continuous-time networked control system (NCS) with biased sensor faults. By considering sensor-to-FDF network-induced delays and packet dropouts, which do not impose a constraint on the event-triggering mechanism, and by proposing an event-triggering mechanism based jointly on the network bandwidth utilization ratio and the fault occurrence probability, a new closed-loop model for the considered NCS is established. Based on the established model, the event-triggered H∞ performance analysis and the FDF and controller coordinated design are presented. A combined mutually exclusive distribution and Wirtinger-based integral inequality approach is proposed for the first time to deal with integral inequalities for products of vectors. This approach is proved to be less conservative than the existing Wirtinger-based integral inequality approach. The designed FDF and controller can guarantee the sensitivity of the residual signal to faults and the robustness of the NCS to external disturbances. The simulation results verify the effectiveness of the proposed event-triggering mechanism and of the FDF and controller coordinated design.

  2. INTEGRATED SYSTEM SIMULATION IN X-RAY RADIOGRAPHY

    SciTech Connect

    T. KWAN; ET AL

    2001-01-01

    An integrated simulation capability is being developed to examine the fidelity of a dynamic radiographic system. This capability consists of a suite of simulation codes which individually model electromagnetic and particle transport phenomena and are chained together to model an entire radiographic event. Our study showed that the electron beam spot size at the converter target plays the key role in determining material edge locations. The angular spectrum is a relatively insensitive factor in radiographic fidelity. We also found that the full energy spectrum of the imaging photons must be modeled to obtain an accurate analysis of material densities.

  3. The role of regional climate model setup in simulating two extreme precipitation events in the European Alpine region

    NASA Astrophysics Data System (ADS)

    Awan, Nauman Khurshid; Gobiet, Andreas; Suklitsch, Martin

    2014-09-01

    In this study we have investigated the role of domain settings and model physics in simulating two extreme precipitation events. Four regional climate models, all driven with a re-analysis dataset, were used to create an ensemble of 61 high-resolution simulations by varying physical parameterization schemes, domain sizes, nudging, and nesting techniques. The two discussed events are three-day time slices taken from approximately 15-month-long climate simulations. The results show that dynamical downscaling significantly improves spatial characteristics such as correlation and variability as well as the location and intensity of maximum precipitation. Spatial variability, which is underestimated by most of the simulations, can be improved by choosing a suitable vertical resolution and suitable convective and microphysics schemes. The results further suggest that for studies focusing on extreme precipitation events, relatively small domains or nudging could be advantageous. However, a final conclusion on this issue would be premature, since only two extreme precipitation events are considered.

  4. The role of regional climate model setup in simulating two extreme precipitation events in the European Alpine region

    NASA Astrophysics Data System (ADS)

    Awan, Nauman Khurshid; Gobiet, Andreas; Suklitsch, Martin

    2015-01-01

    In this study we have investigated the role of domain settings and model physics in simulating two extreme precipitation events. Four regional climate models, all driven with a re-analysis dataset, were used to create an ensemble of 61 high-resolution simulations by varying physical parameterization schemes, domain sizes, nudging, and nesting techniques. The two discussed events are three-day time slices taken from approximately 15-month-long climate simulations. The results show that dynamical downscaling significantly improves spatial characteristics such as correlation and variability as well as the location and intensity of maximum precipitation. Spatial variability, which is underestimated by most of the simulations, can be improved by choosing a suitable vertical resolution and suitable convective and microphysics schemes. The results further suggest that for studies focusing on extreme precipitation events, relatively small domains or nudging could be advantageous. However, a final conclusion on this issue would be premature, since only two extreme precipitation events are considered.

  5. An electronic notebook for physical system simulation

    NASA Astrophysics Data System (ADS)

    Kelsey, Robert L.

    2003-09-01

    A scientist who sets up and runs experiments typically keeps notes of this process in a lab notebook. A scientist who runs computer simulations should be no different. Experiments and simulations both require a set-up process which should be documented along with the results of the experiment or simulation. The documentation is important for knowing and understanding what was attempted, what took place, and how to reproduce it in the future. Modern simulations of physical systems have become more complex due in part to larger computational resources and increased understanding of physical systems. These simulations may be performed by combining the results from multiple computer codes. The machines that these simulations are executed on are often massively parallel/distributed systems. The output result of one of these simulations can be a terabyte of data and can require months of computing. All of these things contribute to the difficulty of keeping a useful record of the process of setting up and executing a simulation for a physical system. An electronic notebook for physical system simulations has been designed to help document the set up and execution process. Much of the documenting is done automatically by the simulation rather than the scientist running the simulation. The simulation knows what codes, data, software libraries, and versions thereof it is drawing together. All of these pieces of information become documented in the electronic notebook. The electronic notebook is designed with and uses the eXtensible Markup Language (XML). XML facilitates the representation, storage, interchange, and further use of the documented information.
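
    A minimal sketch of this kind of automatic, XML-based provenance capture, using only the Python standard library; the element names, run id, code name, and version below are invented rather than taken from the paper's schema.

        # Auto-record simulation provenance as XML (illustrative schema).
        import sys
        import xml.etree.ElementTree as ET
        from datetime import datetime, timezone

        run = ET.Element("simulation_run", id="run-0042")     # invented id
        ET.SubElement(run, "timestamp").text = (
            datetime.now(timezone.utc).isoformat())
        ET.SubElement(run, "interpreter").text = sys.version.split()[0]
        codes = ET.SubElement(run, "codes")
        ET.SubElement(codes, "code", name="hydro_solver", version="3.1.4")
        inputs = ET.SubElement(run, "inputs")
        ET.SubElement(inputs, "file", path="mesh.dat", sha256="...")
        ET.ElementTree(run).write("notebook_entry.xml",
                                  encoding="utf-8", xml_declaration=True)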

  6. μπ: A Scalable and Transparent System for Simulating MPI Programs

    SciTech Connect

    Perumalla, Kalyan S

    2010-01-01

    μπ is a scalable, transparent system for experimenting with the execution of parallel programs on simulated computing platforms. The level of simulated detail can be varied for application behavior as well as for machine characteristics. Unique features of μπ are repeatability of execution, scalability to millions of simulated (virtual) MPI ranks, scalability to hundreds of thousands of host (real) MPI ranks, portability of the system to a variety of host supercomputing platforms, and the ability to experiment with scientific applications whose source code is available. The set of source-code interfaces supported by μπ is being expanded to support a wider set of applications, and MPI-based scientific computing benchmarks are being ported. In proof-of-concept experiments, μπ has been successfully exercised to spawn and sustain very large-scale executions of an MPI test program given in source-code form. Low slowdowns are observed, due to its use of a purely discrete event style of execution and to the scalability and efficiency of the underlying parallel discrete event simulation engine, μsik. In the largest runs, μπ has been executed on up to 216,000 cores of a Cray XT5 supercomputer, successfully simulating over 27 million virtual MPI ranks, each virtual rank containing its own thread context, and all ranks fully synchronized by virtual time.

  7. Global Positioning System Simulator Field Operational Procedures

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Quinn, David A.; Day, John H. (Technical Monitor)

    2002-01-01

    Global Positioning System (GPS) simulation is an important activity in the development or qualification of GPS signal receivers for space flight. Because a GPS simulator is a critical resource, it is highly desirable to develop a set of field operational procedures to supplement the basic procedures provided by most simulator vendors. Validated field procedures allow better utilization of the GPS simulator in the development of new test scenarios and simulation operations. These procedures expedite simulation scenario development while resulting in scenarios that are more representative of the true design, and they enable the construction of more complex simulations than previously possible, for example, spacecraft maneuvers. One difficulty in the development of a simulation scenario is specifying the various modes of test vehicle motion and associated maneuvers, which requires the user to specify some (but not all) of a few closely related simulation parameters. Currently this can only be done by trial and error. A stand-alone procedure that implements the simulator maneuver motion equations and solves for the motion profile transient times, jerk, and acceleration would be of considerable value. Another procedure would permit the specification of configuration parameters that determine the simulated GPS signal composition. The resulting signal navigation message, for example, would force the receiver under test to use only the intended C-code component of the simulated GPS signal. A representative class of GPS simulation-related field operational procedures is described in this paper. These procedures were developed and used in support of GPS integration and testing for many successful spacecraft missions such as SAC-A, EO-1, AMSAT, VCL, SeaStar, and sounding rockets, using the industry-standard Spirent Global Simulation Systems Incorporated (GSSI) STR series simulators.

  8. Assessing and Optimizing Microarchitectural Performance of Event Processing Systems

    NASA Astrophysics Data System (ADS)

    Mendes, Marcelo R. N.; Bizarro, Pedro; Marques, Paulo

    Event Processing (EP) systems are being progressively used in business critical applications in domains such as algorithmic trading, supply chain management, production monitoring, or fraud detection. To deal with high throughput and low response time requirements, these EP systems mainly use the CPU-RAM sub-system for data processing. However, as we show here, collected statistics on CPU usage or on CPU-RAM communication reveal that available systems are poorly optimized and grossly waste resources. In this paper we quantify some of these inefficiencies and propose cache-aware algorithms and changes on internal data structures to overcome them. We test the before and after system both at the microarchitecture and application level and show that: i) the changes improve microarchitecture metrics such as clocks-per-instruction, cache misses or TLB misses; ii) and that some of these improvements result in very high application level improvements such as a 44% improvement on stream-to-table joins with 6-fold reduction on memory consumption, and order-of-magnitude increase on throughput for moving aggregation operations.
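
    As one concrete example of a cache-aware change of the kind discussed above, a moving aggregation can be backed by a preallocated, contiguous ring buffer instead of a pointer-chasing container. The Python sketch below shows the O(1)-per-event logic; in a production engine this would live in a systems language, where the contiguous layout actually governs cache behavior.

        # Moving-sum aggregation over a stream with a fixed ring buffer.
        class MovingSum:
            def __init__(self, window):
                self.buf = [0.0] * window   # contiguous, preallocated
                self.i = 0                  # next write position
                self.n = 0                  # elements currently held
                self.total = 0.0

            def push(self, value):
                if self.n == len(self.buf):
                    self.total -= self.buf[self.i]   # evict oldest, O(1)
                else:
                    self.n += 1
                self.buf[self.i] = value
                self.total += value
                self.i = (self.i + 1) % len(self.buf)
                return self.total

        ms = MovingSum(3)
        print([ms.push(v) for v in [1, 2, 3, 4]])   # [1.0, 3.0, 6.0, 9.0]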

  9. Simulation, Design Abstraction, and SystemC

    ERIC Educational Resources Information Center

    Harcourt, Ed

    2007-01-01

    SystemC is a system-level design and simulation language based on C++. We've been using SystemC for computer organization and design projects for the past several years. Because SystemC is embedded in C++ it contains the powerful abstraction mechanisms of C++ not found in traditional hardware description languages, such as support for…

  10. Characteristics of flight simulator visual systems

    NASA Technical Reports Server (NTRS)

    Statler, I. C. (Editor)

    1981-01-01

    The physical parameters of the flight simulator visual system that characterize the system and determine its fidelity are identified and defined. The characteristics of visual simulation systems are discussed in terms of the basic categories of spatial, energy, and temporal properties corresponding to the three fundamental quantities of length, mass, and time. Each of these parameters are further addressed in relation to its effect, its appropriate units or descriptors, methods of measurement, and its use or importance to image quality.

  11. Simulation of rainfall-runoff for major flash flood events in Karachi

    NASA Astrophysics Data System (ADS)

    Zafar, Sumaira

    2016-07-01

    The metropolitan city of Karachi has strategic importance for Pakistan. With each passing decade the city faces urban sprawl and rapid population growth, and these rapid changes directly affect the natural resources of the city, including its drainage pattern. Karachi's major rivers include the Malir River, with a catchment area of 2252 sq km, and the Lyari River, with a catchment area of about 470.4 sq km. These are non-perennial rivers, active only during storms. The conversion of natural surfaces into hard pavement is increasing the rainfall-runoff response: the curve number has increased, which is now causing flash floods in the urban localities of Karachi. Only one gauge is installed upstream on the river, with no record of discharge, and a single upstream gauge is not sufficient for discharge measurement. To simulate the maximum discharge of the Malir River, rainfall data (1985 to 2014) were collected from the Pakistan Meteorological Department, and major rainfall events were used to simulate rainfall-runoff. The maximum rainfall-runoff response was recorded during 1994, 2007, and 2013. This runoff causes damage and inundation in the floodplain areas of Karachi. These flash flooding events not only damage property but also cause loss of life.
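
    Since the abstract attributes the increased runoff to a higher curve number, the standard SCS curve number relation underlying such estimates can be written down directly. The sketch below implements the textbook metric form of that relation; it is illustrative only, and the study's actual model configuration is not specified in the abstract.

        def scs_runoff_mm(p_mm, cn):
            """Direct runoff depth (mm) via the SCS curve number method:
            Q = (P - 0.2*S)**2 / (P + 0.8*S), with S = 25400/CN - 254 (mm).
            Runoff is zero until rainfall exceeds the initial abstraction 0.2*S.
            """
            s = 25400.0 / cn - 254.0
            ia = 0.2 * s
            if p_mm <= ia:
                return 0.0
            return (p_mm - ia) ** 2 / (p_mm + 0.8 * s)

        # Urbanization raising CN from 75 to 90 nearly triples the runoff
        # from a 50 mm storm (values illustrative).
        print(round(scs_runoff_mm(50.0, 75), 1), round(scs_runoff_mm(50.0, 90), 1))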

  12. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGES

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; ...

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  13. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    SciTech Connect

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  14. Quantum Simulation for Open-System Dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Dong-Sheng; de Oliveira, Marcos Cesar; Berry, Dominic; Sanders, Barry

    2013-03-01

    Simulations are essential for predicting and explaining properties of physical and mathematical systems, yet so far have been restricted to classical and closed quantum systems. Although forays have been made into open-system quantum simulation, the strict algorithmic aspect has not been explored, yet it is necessary in order to account fully for resource consumption when delivering bounded-error answers to computational questions. An open-system quantum simulator would encompass classical and closed-system simulation and also solve outstanding problems concerning, e.g., dynamical phase transitions in non-equilibrium systems, establishing long-range order via dissipation, and verifying the simulatability of open-system dynamics on a quantum Turing machine. We construct an efficient autonomous algorithm for designing an efficient quantum circuit to simulate many-body open-system dynamics described by a local Hamiltonian plus decoherence due to separate baths for each particle. The execution time and number of gates for the quantum simulator both scale polynomially with the system size. DSW funded by USARO. MCO funded by AITF and Brazilian agencies CNPq and FAPESP through Instituto Nacional de Ciencia e Tecnologia-Informacao Quantica (INCT-IQ). DWB funded by ARC Future Fellowship (FT100100761). BCS funded by AITF, CIFAR, NSERC and USARO.

  15. EPICS simulation tools for control system development

    SciTech Connect

    Wright, R.M.; Kerstiens, D.M.; Vaughn, G.D.; Weiss, R.E.

    1994-09-01

    When developing control system software there are many times when the ability to simulate the response of the instrumentation can be very useful. Examples are: (i) when the operator interface is being designed and the users want an idea of what the finished system might look like; (ii) when the interface hardware is not yet available; (iii) when the reaction of the control system to an error condition must be tested, but the actual occurrence of such an error would cause undesirable side effects; (iv) when operators are being trained to use the system; (v) when an improvement or bug fix needs to be tested, but the running system cannot be shut down for long. The Experimental Physics and Industrial Control System (EPICS) provides tools for building simple simulations and for interfacing to more complex simulations of accelerator hardware. At the lowest level, an individual data channel can be switched to take its input either from a simulated data location or from the actual hardware. At a slightly higher level, sequences can be run on the real-time interface processor so that output to the hardware is intercepted and an appropriate substitute value is provided for the corresponding read-back records. At a still higher level, a program can use the Channel Access software bus facility of EPICS to control some global aspect of an accelerator, or can interface to an external accelerator simulation instead of the actual accelerator. The goal of testing control system software using simulated hardware is to minimize the changes required in shifting between the simulated system and the real system. The degree of success of the EPICS tools in meeting this minimum-change goal will be addressed, with suggestions for improvements. The implementation of simulated responses using EPICS tools will be discussed, and examples of experience using the EPICS tools to create and interface to simulations will be given.
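
    The lowest-level facility described, switching an individual data channel between simulated and real input, can be illustrated generically. The sketch below is not the EPICS record interface (EPICS implements this with simulation-mode fields on input records); it is a minimal stand-in showing the pattern of flipping one per-channel switch to test screens and logic without hardware.

        import random

        class Channel:
            """A data channel whose reads come from hardware or a simulator."""

            def __init__(self, read_hardware, read_simulated):
                self._hw = read_hardware
                self._sim = read_simulated
                self.simulate = False    # the per-channel switch

            def read(self):
                return self._sim() if self.simulate else self._hw()

        # With no hardware attached, enable simulation for this channel only.
        temp = Channel(read_hardware=lambda: 0.0,
                       read_simulated=lambda: 20.0 + random.gauss(0.0, 0.1))
        temp.simulate = True
        print(temp.read())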

  16. Computer simulation of breathing systems for divers

    SciTech Connect

    Sexton, P.G.; Nuckols, M.L.

    1983-02-01

    A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on performance of the breathing system are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.

  17. Simulation of gas hydrate dissociation caused by repeated tectonic uplift events

    NASA Astrophysics Data System (ADS)

    Goto, Shusaku; Matsubayashi, Osamu; Nagakubo, Sadao

    2016-05-01

    Gas hydrate dissociation by tectonic uplift is often used to explain geologic and geophysical phenomena, such as hydrate accumulation probably caused by hydrate recycling and the occurrence of double bottom-simulating reflectors in tectonically active areas. However, little is known about gas hydrate dissociation resulting from tectonic uplift. This study investigates gas hydrate dissociation in marine sediments caused by repeated tectonic uplift events, using a numerical model that incorporates the latent heat of gas hydrate dissociation. The simulations showed that tectonic uplift moves a depth interval of hydrate-bearing sediment immediately above the base of gas hydrate stability (BGHS) into the gas hydrate instability zone, because the sediment initially maintains its temperature. In that interval, gas hydrate dissociates while absorbing heat, and consequently the temperature of the interval decreases to that of the hydrate stability boundary at that depth. Until the next uplift event, endothermic gas hydrate dissociation proceeds at the BGHS using heat mainly supplied from the sediment around the BGHS, lowering the temperature of that sediment. The cumulative effect of these two endothermic dissociation processes over repeated uplift events lowers the sediment temperature around the BGHS, suggesting that in a marine area in which sediment with a highly concentrated hydrate-bearing layer just above the BGHS has been frequently uplifted, the endothermic gas hydrate dissociation produces a gradual decrease in thermal gradient from the seafloor to the BGHS. Sensitivity analysis for the model parameters showed that water depth, amount of uplift, gas hydrate saturation, and basal heat flow strongly influence the gas hydrate dissociation rate and the sediment temperature around the BGHS.

  18. 2007 Mutual events within the binary system of (22) Kalliope

    NASA Astrophysics Data System (ADS)

    Descamps, P.; Marchis, F.; Pollock, J.; Berthier, J.; Birlan, M.; Vachier, F.; Colas, F.

    2008-11-01

    In 2007, the asteroid Kalliope will reach one of its annual equinoxes. As a consequence, its small satellite Linus, orbiting in the equatorial plane, will undergo a season of mutual eclipses and occultations very similar to the one the Galilean satellites undergo every 6 years. This paper is aimed at preparing a campaign of observations of these mutual events occurring from February to May 2007. This opportunity arises only under favorable geometric conditions, when the Sun and/or the Earth are close to the orbital plane of the system. This is the first international campaign devoted to the observation of photometric events within an asynchronous asteroidal binary system. We took advantage of a reliable orbit solution of Linus to predict a series of 24 mutual eclipses and 12 mutual occultations observable in the spring of 2007. Thanks to the brightness of Kalliope (mv ≈ 11), these observations are easy to perform even with a small telescope. Anomalous attenuation events could be observed lasting for about 1-3 h, with amplitudes up to 0.09 mag. The attenuations are of two distinct types that can clearly be identified as primary and secondary eclipses, similar to those previously observed in other minor planet binary systems [Pravec, P., Scheirich, P., Kusnirák, P., Sarounová, L., Mottola, S., Hahn, G., Brown, P., Esquerdo, G., Kaiser, N., Krzeminski, Z., Pray, D.P., Warner, B.D., Harris, A.W., Nolan, M.C., Howell, E.S., Benner, L.A.M., Margot, J.-L., Galád, A., Holliday, W., Hicks, M.D., Krugly, Yu.N., Tholen, D., Whiteley, R., Marchis, F., Degraff, D.R., Grauer, A., Larson, S., Velichko, F.P., Cooney, W.R., Stephens, R., Zhu, J., Kirsch, K., Dyvig, R., Snyder, L., Reddy, V., Moore, S., Gajdos, S., Világi, J., Masi, G., Higgins, D., Funkhouser, G., Knight, B., Slivan, S., Behrend, R., Grenon, M., Burki, G., Roy, R., Demeautis, C., Matter, D., Waelchli, N., Revaz, Y., Klotz, A., Rieugné, M., Thierry, P., Cotrez, V., Brunetto, L., Kober, G., 2006]

  19. Channel simulation for optical communication systems

    NASA Technical Reports Server (NTRS)

    Tycz, M.; Fitzmaurice, M. W.

    1974-01-01

    A technique is reported for simulating the signal fading that will be experienced by typical optical communication systems. The desired irradiance or amplitude fading statistics can be simulated by incorporating a linearized optical modulator subsystem between the transmitter and receiver. This technique has been implemented in the design and construction of a laboratory channel simulator. The design of the processing electronics is discussed along with the results of tests performed for each mode of operation.

  20. Defining and representing events in a satellite scheduling system - The IEPS (Interactive Experimenter Planning System) approach

    NASA Technical Reports Server (NTRS)

    Mclean, David R.; Littlefield, Ronald G.; Macoughtry, William O.

    1987-01-01

    A methodology is described for defining and representing satellite events from the IEPS perspective. The task is divided into four categories: defining and representing resource windows, event parameters, event scheduling strategies, and event constraints. The description of each of these categories includes examples from the IEPS ERBS-TDRSS Contact Planning System, a system being used by the Earth Radiation Budget Satellite (ERBS) schedulers to request TDRSS contact times from the NCC. The system is written in the C programming language and uses a custom-built inference engine (TIE1) to do constraint checking and a custom-built strategies interpreter to derive the plan. The planning system runs on the IBM PC/AT or on any similar hardware that has a C development environment and 640K of memory.

  1. ROBOSIM, a simulator for robotic systems

    NASA Technical Reports Server (NTRS)

    Hinman, Elaine M.; Fernandez, Ken; Cook, George E.

    1991-01-01

    ROBOSIM, a simulator for robotic systems, was developed by NASA to aid in the rapid prototyping of automation. ROBOSIM has allowed the development of improved robotic systems concepts for both earth-based and proposed on-orbit applications while significantly reducing development costs. In a cooperative effort with an area university, ROBOSIM was further developed for use in the classroom as a safe and cost-effective way of allowing students to study robotic systems. Students have used ROBOSIM to study existing robotic systems and systems which they have designed in the classroom. Since an advanced simulator/trainer of this type is beneficial not only to NASA projects and programs but industry and academia as well, NASA is in the process of developing this technology for wider public use. An update on the simulator's new application areas, the improvements made to the simulator's design, and current efforts to ensure the timely transfer of this technology are presented.

  2. Solar simulator for concentrator photovoltaic systems.

    PubMed

    Domínguez, César; Antón, Ignacio; Sala, Gabriel

    2008-09-15

    A solar simulator for measuring the performance of large-area concentrator photovoltaic (CPV) modules is presented. Its illumination system is based on a xenon flash lamp and a large-area collimating mirror, which together simulate natural sunlight. The quality requirements imposed by CPV systems have been characterized: irradiance level and uniformity at the receiver, light collimation, and spectral distribution. The simulator allows fast, cost-effective indoor performance characterization and classification of CPV systems at the production line, as well as module rating carried out by laboratories.

  3. Simulations of Sea Level Rise Effects on Complex Coastal Systems

    NASA Astrophysics Data System (ADS)

    Niedoroda, A. W.; Ye, M.; Saha, B.; Donoghue, J. F.; Reed, C. W.

    2009-12-01

    It is now established that complex coastal systems with elements such as beaches, inlets, bays, and rivers adjust their morphologies according to time-varying balances between the processes that control the exchange of sediment. Accelerated sea level rise introduces a major perturbation into these sediment-sharing systems. A modeling framework based on the new SL-PR model, an advanced version of the aggregate-scale CST Model, combined with the event-scale CMS-2D and CMS-Wave models, has been used to simulate the recent evolution of a portion of the Florida panhandle coast. This combination of models provides a method to evaluate coefficients in the aggregate-scale model that were previously treated as fitted parameters; that is, by carrying out simulations of a complex coastal system with runs of the event-scale model representing more than a year, it is now possible to directly relate the coefficients in the large-scale SL-PR model to measurable physical parameters in the current and wave fields. This cross-scale modeling procedure has been used to simulate the shoreline evolution at Santa Rosa Island, a long barrier island on the northern Gulf Coast that houses significant military infrastructure. The model has been used to simulate 137 years of measured shoreline change and to extend these results to predictions of future rates of shoreline migration.

  4. Biological impact of low dose-rate simulated solar particle event radiation in vivo.

    PubMed

    Chang, P Y; Doppalapudi, R; Bakke, J; Wang, A; Menda, S; Davis, Z

    2010-08-01

    C57Bl6-lacZ animals were exposed to a range of low dose-rate simulated solar particle event (sSPE) radiation at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL). Peripheral blood was harvested from animals from 1 to 12 days after total body irradiation (TBI) to quantify the level of circulating reticulocytes (RET) and micronucleated reticulocytes (MN-RET) as an early indicator of radiation-induced genotoxicity. Bone marrow lymphocytes and hippocampal tissues from each animal were collected at 12 days and up to two months, to evaluate dose-dependent late effects after sSPE exposure. Early hematopoietic changes show that the % RET was reduced for up to 3 days in response to radiation exposure but recovered by 12 days postirradiation. The % MN-RET in peripheral blood was temporally regulated and dependent on the total accumulated dose. Total chromosome aberrations in lymphocytes increased linearly with dose within a week after radiation and remained significantly higher than control values at 4 weeks after exposure. The level of aberrations in the irradiated animals returned to control levels by 8 weeks postirradiation. Measurements of chromosome 2- and 8-specific aberrations indicate that, consistent with conventional Giemsa-staining methods, the level of aberrations is also not significantly higher than in control animals at 8 weeks postirradiation. The hippocampus was surveyed for differential transcriptional regulation of genes known to be associated with neurogenesis. Our results showed differential expression of neurotrophins and their associated receptor genes within 1 week after sSPE exposure. Progressive changes in the profile of expressed genes known to be involved in neurogenic signaling pathways were dependent on the sSPE dose. Our results to date suggest that radiation-induced changes in the hematopoietic system, i.e., chromosome aberrations in lymphocytes, are transient and do not persist past 4 weeks after radiation

  5. Unified Behavior Framework for Discrete Event Simulation Systems

    DTIC Science & Technology

    2015-03-26

    (Abstract not available; only fragments of the indexed full text survive. They describe a robot fusing sensor information into a world model that is used by planning algorithms to select its next actions, and a three-layered control architecture of the kind developed by Bonasso [2] and Gat [9], in which an activation signal propagates through a behavior tree to a leaf node.)

  6. Warship Combat System Selection Methodology Based on Discrete Event Simulation

    DTIC Science & Technology

    2010-09-01

    (Abstract not available; the indexed fragments are rows of a flattened table comparing candidate combat system configurations (sensors, guns, close-in weapon systems, missiles, and torpedoes) for ships such as the Independence-class LCS (US), Lekiu (Malaysia), and Baynunah (United Arab Emirates).)

  7. IRIS observations and MHD simulations of explosive events in the transition region of the Sun

    NASA Astrophysics Data System (ADS)

    Guo, Lijia; Innes, Davina; Huang, Yi-Min; Bhattacharjee, Amitava

    2016-05-01

    Small-scale explosive events on the Sun are thought to be related to magnetic reconnection. While Petschek reconnection has long been considered as a reconnection mechanism for explosive events on the Sun, the fragmentation of a current sheet in the high-Lundquist-number regime caused by the plasmoid instability has recently been proposed as a possible mechanism for fast reconnection. The actual reconnection sites are too small to be resolved with images, but these reconnection mechanisms, Petschek and the plasmoid instability, have very different density and velocity structures and so can be distinguished by high-resolution line profile observations. We use high-resolution sit-and-stare spectral observations of the Si IV line, obtained by the IRIS spectrometer, to identify sites of reconnection and follow the development of line profiles. The aim is to obtain a survey of typical line profiles produced by small-scale reconnection events in the transition region and compare them with synthetic line profiles from numerical simulations of a reconnecting current sheet, to determine whether reconnection occurs via the plasmoid instability or the Petschek mechanism. Direct comparison between IRIS observations and numerical results suggests that the observed Si IV profiles can be reproduced with a fragmented current layer subject to plasmoid instability, but not by the bi-directional jets that characterise the Petschek mechanism. This result suggests that if these small-scale events are reconnection sites, then fast reconnection proceeds via the plasmoid instability, rather than the Petschek mechanism, during small-scale reconnection on the Sun.

  8. High Frequency Mechanical Pyroshock Simulations for Payload Systems

    SciTech Connect

    BATEMAN,VESTA I.; BROWN,FREDERICK A.; CAP,JEROME S.; NUSSER,MICHAEL A.

    1999-12-15

    Sandia National Laboratories (SNL) designs mechanical systems with components that must survive high frequency shock environments, including pyrotechnic shock. These environments have not been simulated very well in the past at the payload system level because of the weight limitations of traditional pyroshock mechanical simulations using resonant beams and plates. A new concept utilizing tuned resonators attached to the payload system and driven by the impact of an airgun projectile allows these simulations to be performed in the laboratory with high precision and repeatability, without the use of explosives. A tuned resonator has been designed and constructed for a particular payload system. Comparison of laboratory responses with measurements made at the component locations during actual pyrotechnic events shows excellent agreement over a bandwidth of DC to 4 kHz. The bases of comparison are shock spectra. This simple concept applies the mechanical pyroshock simulation simultaneously to all components, with the correct boundary conditions, in the payload system, and is a considerable improvement over previous experimental techniques and simulations.

  9. Intelligent fuzzy controller for event-driven real time systems

    NASA Technical Reports Server (NTRS)

    Grantner, Janos; Patyra, Marek; Stachowicz, Marian S.

    1992-01-01

    Most of the known linguistic models are essentially static, that is, time is not a parameter in describing the behavior of the object's model. In this paper we show a model for synchronous finite state machines based on fuzzy logic. Such finite state machines can be used to build both event-driven, time-varying, rule-based systems and the control unit section of a fuzzy logic computer. The architecture of a pipelined intelligent fuzzy controller is presented, and the linguistic model is represented by an overall fuzzy relation stored in a single rule memory. A VLSI integrated circuit implementation of the fuzzy controller is suggested. At a clock rate of 30 MHz, the controller can perform 3 MFLIPS on multi-dimensional fuzzy data.
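
    The state-transition step of such a fuzzy finite state machine can be written as the max-min composition of the state membership vector with the stored fuzzy relation. The sketch below illustrates only that composition in Python; the three-state relation matrix is invented for illustration and does not model the paper's pipelined VLSI architecture.

        import numpy as np

        def maxmin_step(x, R):
            """One fuzzy FSM step x' = x o R (max-min composition):
            next membership of state j is max_i min(x_i, R_ij)."""
            return np.max(np.minimum(x[:, None], R), axis=0)

        R = np.array([[0.9, 0.3, 0.0],      # fuzzy transition relation
                      [0.1, 0.8, 0.4],      # (rule memory contents, invented)
                      [0.0, 0.2, 1.0]])
        x = np.array([1.0, 0.0, 0.0])       # crisp start in state 0
        for _ in range(3):
            x = maxmin_step(x, R)
        print(x)                            # graded state memberships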

  10. Particle simulation of plasmas and stellar systems

    SciTech Connect

    Tajima, T.; Clark, A.; Craddock, G.G.; Gilden, D.L.; Leung, W.K.; Li, Y.M.; Robertson, J.A.; Saltzman, B.J.

    1985-04-01

    A computational technique is introduced which allows the student and researcher an opportunity to observe the physical behavior of a class of many-body systems. A series of examples is offered which illustrates the diversity of problems that may be studied using particle simulation. These simulations were in fact assigned as homework in a course on computational physics.

  11. Distributed convex optimisation with event-triggered communication in networked systems

    NASA Astrophysics Data System (ADS)

    Liu, Jiayun; Chen, Weisheng

    2016-12-01

    This paper studies the distributed convex optimisation problem over directed networks. Motivated by practical considerations, we propose a novel distributed zero-gradient-sum optimisation algorithm with event-triggered communication: communication and control updates occur only at discrete instants when a predefined triggering condition is satisfied. Compared with time-driven distributed optimisation algorithms, the proposed algorithm therefore has the advantages of lower energy consumption and lower communication cost. Based on Lyapunov approaches, we show that the proposed algorithm makes the system states converge exponentially fast to the solution of the problem, and that Zeno behaviour is excluded. Finally, a simulation example is given to illustrate the effectiveness of the proposed algorithm.
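
    A minimal numerical sketch of the idea, under assumptions chosen here for illustration (scalar states, quadratic local costs f_i(x) = a_i*(x - b_i)**2 on a ring graph, and a fixed trigger threshold rather than the paper's decaying condition), is shown below. Agents integrate zero-gradient-sum dynamics but rebroadcast their state only when it drifts from the last broadcast value; with a fixed threshold the states converge to a small neighborhood of the optimum.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 5
        a = rng.uniform(1.0, 3.0, n)        # local costs f_i(x) = a_i*(x - b_i)**2
        b = rng.uniform(-5.0, 5.0, n)
        neighbors = [((i - 1) % n, (i + 1) % n) for i in range(n)]  # ring graph

        x = b.copy()                # ZGS starts each agent at its local minimizer
        xhat = x.copy()             # last-broadcast states
        dt, threshold = 0.05, 1e-3
        for _ in range(4000):
            for i in range(n):      # event: rebroadcast only on sufficient drift
                if abs(x[i] - xhat[i]) > threshold:
                    xhat[i] = x[i]
            # consensus term scaled by the inverse local Hessian (2*a_i)
            x += dt * np.array([sum(xhat[j] - xhat[i] for j in neighbors[i])
                                / (2.0 * a[i]) for i in range(n)])

        print(x)                                  # near-identical agent states
        print(np.sum(a * b) / np.sum(a))          # closed-form global minimizer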

  12. The SAO {AXAF} Simulation System

    NASA Astrophysics Data System (ADS)

    Jerius, D.; Freeman, M.; Gaetz, T.; Hughes, J. P.; Podgorski, W.

    As part of our efforts to support the AXAF program, the SAO AXAF Mission Support Team has developed a software suite to simulate the AXAF telescope. The software traces the fate of photons through the telescope, from the X-ray source through apertures, baffles, the telescope optics, and finally to the photons' ultimate interactions with the focal plane detectors. We model relevant physical processes, including geometrical reflection, scattering due to surface microroughness, distortions of the optics due to the mirror mounts, attenuation through baffles, etc. The software is composed of programs and scripts, each specialized to a given task, which communicate through UNIX pipes. Software tasks are centered about functional components of the telescope (e.g., apertures, mirrors, detectors) and provide a comfortable and flexible paradigm for performing simulations. The use of separate programs and the UNIX pipe facility allows great flexibility in building different configurations of the telescope and distilling diagnostics from the photon stream through the telescope. We are able to transparently use symmetric multi-processing (e.g., SPARCStation 10s and SGI Challenges) and can easily use sequential multi-processing (via workstation clusters). Some of the tasks are amenable to parallel processing and have been implemented using the MPI standard.
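
    The pipe-composed architecture can be pictured with a toy stage: each program reads photon records from stdin, transforms or filters them, and writes the survivors to stdout, so stages chain as, e.g., source | aperture | mirror | detector. The record format and the aperture stage below are invented for illustration and are not the SAO suite's actual interfaces.

        #!/usr/bin/env python3
        # Toy pipeline stage: read "x y weight" photon records from stdin,
        # drop photons blocked by a circular aperture, pass the rest along.
        import sys

        RADIUS = 0.6    # aperture radius (illustrative units)

        for line in sys.stdin:
            x, y, w = map(float, line.split())
            if x * x + y * y <= RADIUS * RADIUS:
                sys.stdout.write(f"{x} {y} {w}\n")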

  13. Slip velocity and stresses in granular Poiseuille flow via event-driven simulation.

    PubMed

    Chikkadi, Vijayakumar; Alam, Meheboob

    2009-08-01

    Event-driven simulations of inelastic smooth hard disks are used to probe the slip velocity and rheology in gravity-driven granular Poiseuille flow. It is shown that both the slip velocity (U(w)) and its gradient (dU(w)/dy) depend crucially on the mean density, wall roughness, and inelastic dissipation. While the gradient of the slip velocity follows a single power-law relation with Knudsen number (Kn), the variation in U(w) with Kn shows three distinct regimes. An interesting possibility of a Knudsen-number-dependent specularity coefficient emerges from a comparison of our results with a first-order transport theory for the slip velocity. Simulation results on stresses are compared with kinetic-theory predictions, with reasonable agreement in the quasielastic limit. The deviation of the simulations from theory increases with increasing dissipation, which is tied to the increasing magnitude of the first normal stress difference (N(1)), itself showing interesting nonmonotonic behavior with density. As in simple shear flow, there is a sign change of N(1) at some critical density, and its collisional component and the related collisional anisotropy are responsible for this sign reversal.
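
    The elementary computation that an event-driven method schedules, instead of stepping time uniformly, is the exact pair collision time, followed by an inelastic collision rule. A minimal sketch for smooth inelastic disks of equal mass follows; the diameters, restitution coefficient, and names are illustrative, not the paper's setup.

        import numpy as np

        def collision_time(r1, r2, v1, v2, d):
            """Smallest positive t with |(r1 - r2) + (v1 - v2)*t| = d, or None."""
            r12, v12 = r1 - r2, v1 - v2
            b = np.dot(r12, v12)
            if b >= 0.0:                         # disks are separating
                return None
            vv = np.dot(v12, v12)
            disc = b * b - vv * (np.dot(r12, r12) - d * d)
            if disc < 0.0:                       # they miss each other
                return None
            return (-b - np.sqrt(disc)) / vv

        def inelastic_collide(v1, v2, n, e):
            """Post-collision velocities for equal-mass smooth disks:
            the normal relative velocity is reversed and scaled by e."""
            dv = -(1.0 + e) * 0.5 * np.dot(v1 - v2, n)
            return v1 + dv * n, v2 - dv * n

        t = collision_time(np.zeros(2), np.array([3.0, 0.0]),
                           np.array([1.0, 0.0]), np.zeros(2), d=1.0)
        print(t)    # 2.0: centers 3 apart, contact at separation 1, unit speed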

  14. Using discrete-event simulation in strategic capacity planning for an outpatient physical therapy service.

    PubMed

    Rau, Chi-Lun; Tsai, Pei-Fang Jennifer; Liang, Sheau-Farn Max; Tan, Jhih-Cian; Syu, Hong-Cheng; Jheng, Yue-Ling; Ciou, Ting-Syuan; Jaw, Fu-Shan

    2013-12-01

    This study uses a simulation model as a tool for strategic capacity planning for an outpatient physical therapy clinic in Taipei, Taiwan. The clinic provides a wide range of physical treatments, with 6 full-time therapists in each session. We constructed a discrete-event simulation model to study the dynamics of patient mixes with realistic treatment plans, and to estimate the practical capacity of the physical therapy room. Changes in time-related and space-related performance measurements were used to evaluate the impact of various strategies on the capacity of the clinic. The simulation results confirmed that the clinic is extremely patient-oriented, with a bottleneck occurring at the traction units for Intermittent Pelvic Traction (IPT), with usage at 58.9%. Sensitivity analysis showed that attending to more patients would significantly increase the number of patients staying for overtime sessions. We found that pooling the therapists produced beneficial results: the average waiting time per patient could be reduced by 45% when 2 therapists were pooled. We found that treating up to 12 new patients per session had no significant negative impact on returning patients. Moreover, the average waiting time for new patients decreased if they were given priority over returning patients when called by the therapists.
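
    A discrete-event queueing sketch reproduces the flavor of the pooling experiment: exponential arrivals and treatment times, with FIFO assignment to the earliest-free therapist. All rates below are invented, not the clinic's calibrated data, and the model omits the treatment-plan and space constraints of the real study.

        import heapq, random

        def mean_wait(n_therapists, n_patients, mean_arrival, mean_treat, seed=1):
            """Mean wait (min) in a FIFO multi-server queue, simulated by
            tracking the time at which each therapist next becomes free."""
            rng = random.Random(seed)
            free = [0.0] * n_therapists          # therapists' next-free times
            heapq.heapify(free)
            waits, arrival = [], 0.0
            for _ in range(n_patients):
                arrival += rng.expovariate(1.0 / mean_arrival)
                start = max(arrival, heapq.heappop(free))
                waits.append(start - arrival)
                heapq.heappush(free, start + rng.expovariate(1.0 / mean_treat))
            return sum(waits) / len(waits)

        # Same utilization, pooled vs. solo: pooling sharply cuts waiting.
        print(mean_wait(6, 20000, mean_arrival=5.0, mean_treat=25.0))
        print(mean_wait(1, 20000, mean_arrival=30.0, mean_treat=25.0))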

  15. Bayesian Techniques for Comparing Time-dependent GRMHD Simulations to Variable Event Horizon Telescope Observations

    NASA Astrophysics Data System (ADS)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.
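
    The statistical point, that ignoring source variability biases inferred parameters and artificially shrinks error bars, can be shown with a toy one-parameter version of the method. Everything below (the Gaussian likelihoods, the scatter values, the parameter grid) is an illustrative stand-in, not the EHT analysis pipeline.

        import numpy as np

        rng = np.random.default_rng(42)
        sig_model, sig_meas = 0.10, 0.05   # intrinsic scatter, noise (toy values)
        y = 1.0 + rng.normal(0.0, np.hypot(sig_model, sig_meas), 50)  # mock data

        mus = np.linspace(0.8, 1.2, 401)   # grid over the model parameter

        def posterior(include_variability):
            # The variable-model likelihood adds the intrinsic scatter
            # (as estimated from simulation snapshots) in quadrature.
            sig = np.hypot(sig_meas, sig_model) if include_variability else sig_meas
            lp = np.array([-0.5 * np.sum(((y - m) / sig) ** 2) for m in mus])
            p = np.exp(lp - lp.max())
            return p / p.sum()

        for flag in (True, False):
            p = posterior(flag)
            mean = np.sum(mus * p)
            std = np.sqrt(np.sum((mus - mean) ** 2 * p))
            print(f"variability={flag}: mu = {mean:.3f} +/- {std:.3f}")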

  16. Interactive communication systems simulation model - ICSSM

    NASA Astrophysics Data System (ADS)

    Wade, W. D.; Mortara, M. E.; Leong, P. K.; Frost, V. S.

    1984-01-01

    The design of ICSSM, a nonreal time computer-aided simulation and analysis tool for communications systems, is presented, ICSSM is capable of supporting modeling, simulation, and analysis of any system representable in terms of a network of multiport functional blocks. Its applicability is limited only by the modeler's ingenuity to decompose the system to functional blocks and to represent these functional blocks algorithmically. ICSSM has been constructed modularly, consisting of five subsytems to facilitate the tasks of formulating the model, exercising the model, evaluating and showing the simulation results, and storing and maintaining a library of modeling elements, analysis, and utility subroutines. It is written exclusively in ANSI Standard Fortran IV language, and is now operational in a Honeywell DPS 7/80 M computer under the MULTICS Operating System. Description of a recent simulation using ICSSM and some generic modules of general interest developed as a result of the modeling work are also presented.

  17. Dynamical downscaling simulation and future projection of summer rainfall in Taiwan: Contributions from different types of rain events

    NASA Astrophysics Data System (ADS)

    Huang, Wan-Ru; Chang, Ya-Hui; Hsu, Huang-Hsiung; Cheng, Chao-Tzuen; Tu, Chia-Ying

    2016-12-01

    Summer rainfall in Taiwan is composed of four types of rain events: tropical cyclone (TC), frontal convection (FC), diurnal convection (DC), and other southerly convection (SC) that propagates from the nearby ocean. In this study, we accessed the present-day simulation (1979-2003) and future projection (2075-2099, the Representative Concentration Pathway 8.5 scenario) of rainfall in Taiwan by using the regional Weather Research and Forecasting model driven by the global High Resolution Atmospheric Model. The results indicated that the dynamical downscaling process adds value to the present-day simulation of summer rainfall in Taiwan and the contribution of different types of rain events. It was found that summer rainfall in Taiwan will increase in a warmer future and that this change was mainly due to an increase in SC rainfall (i.e., light rainfall event). The following trends in Taiwan rainfall were also predicted in a warmer future: (1) SC rainy days will increase because the intensified monsoonal flow facilitates the propagation of more SC toward Taiwan, (2) TC rainy days will decrease as the Western North Pacific subtropical high extends southwestward and prevents TC systems from passing over Taiwan, (3) DC rainy days will decrease in response to the increased local thermal stability, and (4) FC rainy days will show no significant changes. Moreover, all types of rainfall are projected to become more intense in the future due to the increased moisture supply in a warmer climate. These findings highlight how the rainfall characteristics in East Asia may change in response to climate change.

  18. Water system modeling for dispatcher training simulators

    SciTech Connect

    Rajagopal, S.; Sigari, P.G. ); Allen, J.E.; Assadian, M. )

    1993-08-01

    This paper addresses the existing need for training dispatchers in the operation of power systems where it involves managing large water systems. The problem formulation and implementation of water system modeling for the Dispatcher Training Simulators (DTS) are presented in this paper. The method systematically builds the water network descriptions. The model periodically calculates the water system flows, storage values, and currently available hydro generation capacities. The model is controllable by the instructor and provides the simulated telemetry of water system data to the control center functions in the DTS. The water system modeling enhances the power system modeling subsystem of the DTS. The method is validated on a large water system and power system data. The results and the benefits of water system modeling are discussed.

  19. Digital simulation of stiff linear dynamic systems.

    NASA Technical Reports Server (NTRS)

    Holland, L. D.; Walsh, J. R., Jr.; Kerr, J. H.

    1972-01-01

    A method is derived for digital computer simulation of linear time-invariant systems when the insignificant eigenvalues involved in such systems are eliminated by an ALSAP root removal technique. The method is applied to a thirteenth-order dynamic system representing a passive RLC network.

  20. Weightlessness simulation system and process

    NASA Technical Reports Server (NTRS)

    Vykukal, Hubert C. (Inventor)

    1987-01-01

    A weightlessness simulator has a chamber and a suit in the chamber. O-rings and valves hermetically seal the chamber. A vacuum pump connected to the chamber establishes a pressure in the chamber less than atmospheric pressure. A water supply tank and water supply line supply a body of water to the chamber as a result of partial vacuum created in the chamber. In use, an astronaut enters the pressure suit through a port, which remains open to ambient atmosphere, thus supplying air to the astronaut during use. The pressure less than atmospheric pressure in the chamber is chosen so that the pressure differential from the inside to the outside of the suit corresponds to the pressure differential with the suit in outer space.

  1. Ozone sensitivity to industrial ethene emissions events in regulatory air quality modeling simulations for Houston, Texas

    NASA Astrophysics Data System (ADS)

    Couzo, E.; Olatosi, A. O.; Vizuete, W.

    2010-12-01

    The Houston-Galveston-Brazoria (HGB) area has had multiple decades of persistently high ozone (O3) values. We have analyzed ten years of ground-level measurements at 25 monitors in Houston and found that peak 1-hr O3 concentrations were often associated with large hourly O3 increases. A non-typical O3 change (NTOC) - defined here as an increase of at least 40 ppb/hr or 60 ppb/2hrs - was measured 25% of the time when concentrations recorded at a monitor exceeded the 8-hr O3 standard. We found that regulatory air quality model simulations (120 total days in 2005 and 2006) used to support the 2010 State Implementation Plan for the HGB non-attainment area were limited in their ability to simulate observed NTOCs, and underpredicted the maximum observed rate of change by more than 50 ppb/hr. We show that the regulatory model, using "average" emissions in accordance with current EPA methodology, does not predict the spatially isolated, high O3 events measured at monitors. Even when day-specific emissions inventories are used, the model makes 1-hr O3 predictions nearly identical to simulations using the "average" emissions inventory, increasing hourly O3 concentrations and changes by only 8 ppb and 3 ppb/hr. Observed NTOCs have been linked to stochastic industrial releases of some volatile organic compounds, specifically ethene and propene. We also examined whether short-term ethene releases in the regulatory air quality model produce rapid hourly changes in ozone concentrations. Ethene emission events are known to have been included in a day-specific emissions inventory, but were removed for regulatory purposes to comport with EPA modeling guidance, which provides a natural sensitivity study. These results will show whether the regulatory model is able to respond to these emission events and produce the observed increases in ozone concentrations. The model's ability to replicate an important observed phenomenon is critical in the selection of effective control
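
    The NTOC definition quoted above translates directly into a screening function over an hourly O3 series. The sketch below applies that published threshold definition; the helper name and the sample series are illustrative.

        def find_ntocs(o3_hourly):
            """Indices at which a non-typical O3 change (NTOC) ends, defined
            as an increase of at least 40 ppb in 1 h or 60 ppb over 2 h."""
            hits = []
            for i in range(1, len(o3_hourly)):
                rise_1h = o3_hourly[i] - o3_hourly[i - 1]
                rise_2h = o3_hourly[i] - o3_hourly[i - 2] if i >= 2 else 0.0
                if rise_1h >= 40.0 or rise_2h >= 60.0:
                    hits.append(i)
            return hits

        print(find_ntocs([40, 45, 90, 95, 60, 125]))   # -> [2, 5]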

  2. Evaluation of the southern California seismic velocity models through simulation of recorded events

    NASA Astrophysics Data System (ADS)

    Taborda, Ricardo; Azizzadeh-Roodpish, Shima; Khoshnevis, Naeem; Cheng, Keli

    2016-06-01

    Significant effort has been devoted over the last two decades to the development of seismic velocity models for the region of southern California, United States. These models are mostly used in forward wave propagation simulation studies, but also as base models for tomographic and source inversions. Two of these models, the community velocity models CVM-S and CVM-H, are among the most commonly used for this region. This includes two alternative variations of the original models: the recently released CVM-S4.26, which incorporates results from a sequence of tomographic inversions into CVM-S, and the user-controlled option of CVM-H that replaces the near-surface profiles with a VS30-based geotechnical model. Although each of these models is regarded as acceptable by the modeling community, it is known that they differ in their representation of the crustal structure and sedimentary deposits of the region, and thus can lead to different results in forward and inverse problems. In this paper, we evaluate the accuracy of these models when used to predict ground motion in the greater Los Angeles region by assessing a collection of simulations of recent events. In total, we consider 30 moderate-magnitude earthquakes (3.5 < Mw < 5.5) between 1998 and 2014, and compare synthetics with data recorded by seismic networks during these events. The simulations are done using a parallel finite-element code, with numerical models that satisfy a maximum frequency of 1 Hz and a minimum shear wave velocity of 200 m s-1. The comparisons between data and synthetics are ranked quantitatively by means of a goodness-of-fit (GOF) criterion. We analyse the regional distribution of the GOF results for all events and all models, and draw conclusions from the results and how they correlate with the models. We find that, in light of our comparisons, the model CVM-S4.26 consistently yields better results.

  3. Human Systems Modeling and Simulation

    DTIC Science & Technology

    2005-12-01

    (Abstract not available; the indexed fragments describe modeling individuals, organizations, and other social forms as systems of practices, extending Synergia's ACCORD technology, and developing technology for the computational specification of the cognitive and social inter-dependencies that underwrite human behavior.)

  4. Numerical propulsion system simulation: An interdisciplinary approach

    NASA Technical Reports Server (NTRS)

    Nichols, Lester D.; Chamis, Christos C.

    1991-01-01

    The tremendous progress being made in computational engineering and the rapid growth in computing power that is resulting from parallel processing now make it feasible to consider the use of computer simulations to gain insights into the complex interactions in aerospace propulsion systems and to evaluate new concepts early in the design process before a commitment to hardware is made. Described here is a NASA initiative to develop a Numerical Propulsion System Simulation (NPSS) capability.

  5. Numerical propulsion system simulation - An interdisciplinary approach

    NASA Technical Reports Server (NTRS)

    Nichols, Lester D.; Chamis, Christos C.

    1991-01-01

    The tremendous progress being made in computational engineering and the rapid growth in computing power that is resulting from parallel processing now make it feasible to consider the use of computer simulations to gain insights into the complex interactions in aerospace propulsion systems and to evaluate new concepts early in the design process before a commitment to hardware is made. Described here is a NASA initiative to develop a Numerical Propulsion System Simulation (NPSS) capability.

  6. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1981-01-01

    A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used: the algorithm was developed in an ALGOL-like pseudo-language and then encoded in FORTRAN IV. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all possible combinations of types and modes of calls through the use of five communication scenarios: a single-hop system; a double-hop, single-gateway system; a double-hop, double-gateway system; a mobile-to-wireline system; and a wireline-to-mobile system. The simulation of the transmitter, fading channel, and interference source is also discussed.

  7. Colorimetric calibration of coupled infrared simulation system

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Fei, Jindong; Gao, Yang; Du, Jian

    2015-10-01

    In order to test 2-color infrared sensors, a coupled infrared simulation system can generate radiometric outputs with wavelengths ranging from less than 3 microns to more than 12 microns. There are two channels in the coupled simulation system, optically combined by a dichroic beam combiner. Each channel has an infrared blackbody, a filter, a diaphragm, and diaphragm motors. The system output is projected to the sensor under test by a collimator. This makes it difficult to calibrate the system with a single-band thermal imager, since errors arise in the radiance levels measured by a narrow-band imager. This paper describes colorimetric temperature measurement techniques that have been developed to perform radiometric calibrations of such infrared simulation systems. The calibration system consists of two infrared thermal imagers, one operating in the MW-IR band and the other in the LW-IR band.

  8. Construction of the real patient simulator system.

    PubMed

    Chan, Richard; Sun, C T

    2012-05-01

    Simulation for perfusion education has been used for at least the past 25 years. The earlier models were either electronic (computer games) or fluid dynamic models and provided invaluable adjuncts to perfusion training and education. In 2009, the North Shore-LIJ Health System at Great Neck, New York, opened an innovative "Bioskill Center" dedicated to simulated virtual-reality advanced hands-on surgical training as well as perfusion simulation. Professional cardiac surgical organizations now show great interest in using simulation for training and recertification, and simulation will continue to be the direction for future perfusion training and education. This manuscript introduces a cost-effective system developed from discarded perfusion products; it is not intended to detail the actual lengthy process of its construction.

  9. Two Types of El Niño Events Simulated in the SNU Coupled GCM

    NASA Astrophysics Data System (ADS)

    Lim, M.; Kang, I.; Kug, J.; Ham, Y.

    2010-12-01

    Recent studies have reported that there exists more than one type of El Niño. One is the cold tongue (CT) El Niño, which shows stronger sea surface temperature anomalies (SSTA) in the eastern Pacific, and the other is the warm pool (WP) El Niño, which features SSTA in the central Pacific. The WP El Niño differs from the CT El Niño not only in the location of its action center but also in its developing and transition mechanisms. In addition, the WP El Niño has occurred more frequently in recent decades, and Yeh et al. (2009) suggest that WP El Niño occurrence will increase in a future climate under global warming. Given the importance of correctly simulating and predicting the two types of El Niño events, it is necessary to better understand the mechanisms that control them in numerical models. The present study investigates the CT and WP El Niño events simulated by two different versions of the SNU coupled GCM (SNUCGCM), an ocean-atmosphere coupled model that couples the SNU Atmospheric GCM (SNUAGCM) to the Modular Ocean Model ver. 2.2 (MOM 2.2) ocean GCM developed at the Geophysical Fluid Dynamics Laboratory (GFDL). The two versions are the control version (CNTL) and a second version (CTOK) that includes a cumulus momentum transport parameterization and a minimum entrainment rate. Both versions of the SNUCGCM reasonably simulate ENSO variability, with the center of positive SSTA shifted slightly to the west compared to the observations. It is worthwhile to note that the models simulate the major observed features of the WP El Niño as distinguished from the CT El Niño. Furthermore, both versions simulate the occurrence of the CT El Niño more frequently than that of the WP El Niño, as observed. The CNTL shows weaker interannual variability of SSTA over the tropical Pacific than the CTOK, while the intensity of the WP El Niño event in the CNTL is stronger than in the CTOK. In addition, the WP El Niño frequently occurs in the CNTL compared to

  10. Another Program Simulates A Modular Manufacturing System

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Wang, Jian

    1996-01-01

    The SSE5 computer program provides a simulation environment for modeling manufacturing systems containing relatively small numbers of stations and operators. Designed to simulate manufacturing of apparel, it is also used in other manufacturing domains. It is valuable for small or medium-size firms, including those that lack the expertise to develop detailed mathematical models or that have only minimal knowledge of describing manufacturing systems and analyzing the results of simulations on mathematical models. Two other programs are available bundled together as SSE (MFS-26245); each program models a slightly different manufacturing scenario. SSE5 is written in Turbo C v2.0 for IBM PC-series and compatible computers running MS-DOS and was successfully compiled using Turbo C++ v3.0.

  11. High Resolution Simulation of a Colorado Rockies Extreme Snow and Rain Event in both a Current and Future Climate

    NASA Astrophysics Data System (ADS)

    Rasmussen, Roy; Ikeda, Kyoko; Liu, Changhai; Gutmann, Ethan; Gochis, David

    2016-04-01

    Modeling of extreme weather events often requires very finely resolved treatment of atmospheric circulation structures in order to produce and localize the large moisture fluxes that result in extreme precipitation. This is particularly true for cool-season orographic precipitation processes, where the representation of the landform can significantly impact vertical velocity profiles and cloud moisture entrainment rates. This study presents results from a high-resolution regional climate modeling study of the Colorado Headwaters region using an updated version of the Weather Research and Forecasting (WRF) model run at 4 km horizontal resolution and a hydrological extension package called WRF-Hydro. Previous work has shown that the WRF modeling system can produce credible depictions of winter orographic precipitation over the Colorado Rockies if run at horizontal resolutions < 6 km. Here we present results from a detailed study of an extreme springtime snowfall event that occurred along the Colorado Front Range in March 2003. Results on the impact of warming on total precipitation, snow-rain partitioning, and surface hydrological fluxes (evapotranspiration and runoff) will be discussed in the context of how potential changes in temperature impact the amount of precipitation, the phase of precipitation (rain vs. snow), and the timing and amplitude of streamflow responses. Using the Pseudo Global Warming technique, the results show that intense precipitation rates increased significantly during the event and that a significant fraction of the snowfall converted to rain, amplifying the runoff response from one in which runoff is produced gradually to one in which it is rapidly translated into streamflow values that approach significant flooding risk. Results from a new, CONUS-scale high-resolution climate simulation of extreme events in current and future climates will be presented as time permits.

  12. Responses of Hyalella azteca and phytoplankton to a simulated agricultural runoff event in a managed backwater wetland.

    PubMed

    Lizotte, Richard E; Shields, F Douglas; Murdock, Justin N; Knight, Scott S

    2012-05-01

    We assessed the aqueous toxicity mitigation capacity of a hydrologically managed floodplain wetland following a synthetic runoff event amended with a mixture of sediments, nutrients (nitrogen and phosphorus), and pesticides (atrazine, S-metolachlor, and permethrin), using 48-h Hyalella azteca survival and the phytoplankton pigment chlorophyll a. The runoff event simulated a 1-h, 1.27-cm rainfall event from a 16-ha agricultural field. Water (1 L) was collected every 30 min within the first 4 h, every 4 h until 48 h, and on days 5, 7, 14, 21, and 28 post-amendment, at distances of 0, 10, 40, 300, and 500 m from the amendment point, for chlorophyll a, suspended sediment, nutrient, and pesticide analyses. H. azteca 48-h laboratory survival was assessed in water collected at each site at 0, 4, 24, and 48 h and at 5 and 7 d. The greatest sediment, nutrient, and pesticide concentrations occurred within 3 h of amendment at 0 m, 10 m, 40 m, and 300 m downstream. Sediments and nutrients showed little variation at 500 m, whereas pesticides peaked within 48 h but at <15% of upstream peak concentrations. After 28 d, all mixture components were near or below pre-amendment concentrations. H. azteca survival significantly decreased within 48 h of amendment up to 300 m, in association with permethrin concentrations. Chlorophyll a decreased within the first 24 h of amendment up to 40 m, primarily in conjunction with herbicide concentrations. Variations in chlorophyll a at 300 and 500 m were associated with nutrients. Managed floodplain wetlands can rapidly and effectively trap and process agricultural runoff during moderate rainfall events, mitigating impacts to aquatic invertebrates and algae in receiving aquatic systems.

  13. Behavioral and Physiological Responses of Calves to Marshalling and Roping in a Simulated Rodeo Event

    PubMed Central

    Sinclair, Michelle; Keeley, Tamara; Lefebvre, Anne-Cecile; Phillips, Clive J. C.

    2016-01-01

    Simple Summary Rodeos often include a calf roping event, in which calves are first lassoed by a rider on a horse, who then dismounts, ties the calf's legs, lifts it from the ground, and releases it back to the ground. We tested whether calves that were familiar with roping experience stress during the roping event, and found increased concentrations of stress hormones in their blood after roping. We also found increased concentrations of stress hormones in the blood of calves that had never been roped before but were just marshalled across the arena by the horse and rider. We conclude that the roping event in rodeos is stressful for both experienced and naïve calves. Abstract Rodeos are public events at which stockpeople face tests of their ability to manage cattle and horses, some of which relate directly to rangeland cattle husbandry. One of these is calf roping, in which a calf released from a chute is pursued by a horse and rider, who lassoes, lifts, and drops the calf to the ground and finally ties its legs. Measurements were made of the behavior and stress responses of ten rodeo-naïve calves marshalled by a horse and rider, and ten rodeo-experienced calves that were roped. Naïve calves marshalled by a horse and rider traversed the arena slowly, whereas rodeo-experienced calves ran rapidly until roped. Each activity was repeated once after two hours. Blood samples taken before and after each activity demonstrated increased cortisol, epinephrine, and norepinephrine in both groups. However, there was no evidence of a continued increase in stress hormones in either group by the start of the repeated activity, suggesting that the elevated stress hormones were not a response to a prolonged effect of the initial blood sampling. It is concluded that both the marshalling of naïve calves by stockpeople and the roping and dropping of experienced calves are stressful in a simulated rodeo calf roping event. PMID:27136590

  14. Changes in Intense Rainfall Events over the Central United States in AOGCM-Driven Regional Climate Model Simulations

    NASA Astrophysics Data System (ADS)

    Daniel, A. R.; Arritt, R. W.; Groisman, P. Y.

    2014-12-01

    We have evaluated trends in extreme precipitation frequency for the central United States (Groisman et al. 2012) using atmosphere-ocean global climate model (AOGCM) driven regional climate simulations. Nested regional climate model simulations were conducted using RegCM4.4 over the CORDEX-North America domain with 50 km grid spacing. Initial and lateral boundary conditions are taken from the HadGEM2-ES and GFDL-ESM2M AOGCMs (for the RCP8.5 emissions scenario) to simulate present and future climate (1951-2098). For each run, RegCM4 uses three different convection schemes: the Emanuel scheme, the Grell scheme, and a Mixed scheme which uses the Emanuel scheme over water and the Grell scheme over land. Current findings show that the regional climate simulations match the observed average frequency to within the same order of magnitude for heavy (25.4-76.2 mm/day) and extreme (154.9+ mm/day) precipitation events, while very heavy events (76.2+ mm/day) were less frequent by an order of magnitude. For current and recent past climate (1951-2005), the frequency of precipitation events is similar in both the HadGEM2-ES and GFDL-ESM2M AOGCM-driven regional climate simulations, with most variation due to the convection scheme being used. Initial results exhibit trends in the increase of frequency for each precipitation event category similar to those seen in observations. In accordance with Groisman et al. (2012), preliminary findings also show that months during the cold season had more frequent heavy events in comparison to very heavy and extreme events, while months during the warm season had more frequent very heavy and extreme events in comparison to heavy events. Further analysis will better determine the correlation and accuracy of these regional climate simulations.
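
    For a concrete sense of the event categories quoted above, the short sketch below (an illustration on a synthetic rainfall series, not the study's analysis) counts daily events per intensity class using the same thresholds.

        # Illustrative count of precipitation events per intensity category
        # using the thresholds quoted above; the daily series is synthetic.
        import numpy as np

        rng = np.random.default_rng(0)
        daily_precip_mm = rng.gamma(shape=0.4, scale=18.0, size=365 * 50)

        heavy = np.sum((daily_precip_mm >= 25.4) & (daily_precip_mm < 76.2))
        very_heavy = np.sum(daily_precip_mm >= 76.2)   # includes extreme
        extreme = np.sum(daily_precip_mm >= 154.9)
        print(f"heavy: {heavy}, very heavy: {very_heavy}, extreme: {extreme}")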

  15. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs to a network simulator to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or for determining modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interaction. The paper reviews explanations for these results and implications for modeling and simulation.
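
    The coverage notion at issue can be made concrete with a short sketch (illustrative only, not the study's tooling): it measures what fraction of all t-way parameter-value combinations a random test suite covers, for hypothetical network-configuration parameters.

        # Measure t-way combination coverage of a random test suite.
        from itertools import combinations, product
        import random

        domains = {"topology": ["ring", "star", "mesh"],
                   "buffer":   [1, 2, 4],
                   "routing":  ["fixed", "adaptive"],
                   "load":     ["low", "high"]}

        def tway_coverage(tests, t):
            names = list(domains)
            covered, total = set(), 0
            for combo in combinations(names, t):
                total += len(list(product(*(domains[n] for n in combo))))
                for test in tests:
                    covered.add((combo, tuple(test[n] for n in combo)))
            return len(covered) / total

        random.seed(1)
        tests = [{n: random.choice(v) for n, v in domains.items()}
                 for _ in range(10)]
        for t in (2, 3, 4):
            print(f"{t}-way coverage of 10 random tests: "
                  f"{tway_coverage(tests, t):.2f}")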

  16. Dynamics of the MAP IOP 15 severe Mistral event: Observations and high-resolution numerical simulations

    NASA Astrophysics Data System (ADS)

    Guénard, V.; Drobinski, P.; Caccia, J. L.; Tedeschi, G.; Currier, P.

    2006-04-01

    This paper investigates the fundamental processes involved in a severe Mistral event that occurred during the Mesoscale Alpine Program (from 6 to 9 November 1999). The Mistral refers to a violent north/north-westerly wind blowing in south-eastern France from the Rhône valley to the French Riviera. The study is based on measurements from radiosoundings launched from Lyon and Nîmes and from two UHF wind profilers located near Marseille and Toulon allowing a good description of the flow in the complex terrain formed by the south-western Alps. Observational results are compared with RAMS non-hydrostatic numerical simulations performed with 27 km, 9 km and 3 km nested grids. The numerical simulations capture the flow complexity both upstream of the Alps and in the coastal area affected by the Mistral. They correctly reproduce horizontal wind speeds and directions, vertical velocities, virtual potential temperature and relative humidity documented by the observational network. The simulations are used to point out the main dynamical processes generating the Mistral. It is found that flow splitting around the Alps and around the isolated peaks bordering the south-eastern part of the Rhône valley (Mont Ventoux 1909 m, Massif du Lubéron 1425 m) induces the low-level jet observed near Marseille that lasts for 36 hours. The high-resolution simulation indicates that the transient low-level jet lasting for only 9 hours observed at Toulon is due to a gravity wave breaking over local topography (the Sainte Baume 1147 m) where hydraulic jumps are involved. A mountain wake with two opposite-sign potential-vorticity banners is generated. The mesoscale wake explains the westward progression of the large-scale Alpine wake.

  17. The impact of inpatient boarding on ED efficiency: a discrete-event simulation study.

    PubMed

    Bair, Aaron E; Song, Wheyming T; Chen, Yi-Chun; Morris, Beth A

    2010-10-01

    In this study, a discrete-event simulation approach was used to model Emergency Department (ED) patient flow to investigate the effect of inpatient boarding on ED efficiency in terms of the National Emergency Department Crowding Scale (NEDOCS) score and the rate of patients who leave without being seen (LWBS). The decision variable in this model was the boarder-released-ratio, defined as the ratio of admitted patients whose boarding time is zero to all admitted patients. Our analysis shows that the Overcrowded(+) (a NEDOCS score over 100) ratio decreased from 88.4% to 50.4%, and the rate of LWBS patients decreased from 10.8% to 8.4%, when the boarder-released-ratio changed from 0% to 100%. These results show that inpatient boarding significantly impacts both the NEDOCS score and the rate of LWBS patients, and this analysis provides a quantification of the impact of boarding on emergency department patient crowding.
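
    A toy version of such a model (a sketch under assumed arrival, patience, treatment, and boarding distributions, not the authors' simulator) shows how the boarder-released-ratio can be swept against the LWBS rate.

        # Toy ED discrete-event simulation: patients LWBS if no bed frees up
        # within their patience; released boarders vacate beds immediately.
        import heapq, random

        def simulate(boarder_released_ratio, n_beds=15, horizon=20_000.0,
                     seed=1):
            rng = random.Random(seed)
            events = [(0.0, "arrival")]          # (time in minutes, kind)
            free_beds, lwbs, seen = n_beds, 0, 0
            queue = []                           # abandonment deadlines, FIFO
            while events:
                t, kind = heapq.heappop(events)
                if t > horizon:
                    break
                if kind == "arrival":
                    queue.append(t + rng.expovariate(1 / 90.0))  # patience
                    heapq.heappush(events,
                                   (t + rng.expovariate(1 / 12.0), "arrival"))
                else:                            # "bed_free" event
                    free_beds += 1
                still_waiting = []
                for deadline in queue:           # impatient patients LWBS
                    if deadline <= t:
                        lwbs += 1
                    else:
                        still_waiting.append(deadline)
                queue = still_waiting
                while free_beds > 0 and queue:   # seat waiting patients
                    queue.pop(0)
                    free_beds -= 1
                    seen += 1
                    stay = rng.expovariate(1 / 180.0)      # ED treatment
                    if rng.random() >= boarder_released_ratio:
                        stay += rng.expovariate(1 / 240.0)  # boarding holds bed
                    heapq.heappush(events, (t + stay, "bed_free"))
            return lwbs / max(lwbs + seen, 1)

        for r in (0.0, 0.5, 1.0):
            print(f"boarder-released-ratio={r:.1f}  LWBS={simulate(r):.3f}")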

  18. Single-event response of the SiGe HBT in TCAD simulations and laser microbeam experiment

    NASA Astrophysics Data System (ADS)

    Li, Pei; Guo, Hong-Xia; Guo, Qi; Zhang, Jin-Xin; Xiao, Yao; Wei, Ying; Cui, Jiang-Wei; Wen, Lin; Liu, Mo-Han; Wang, Xin

    2015-08-01

    In this paper the single-event responses of silicon germanium heterojunction bipolar transistors (SiGe HBTs) are investigated by TCAD simulations and a laser microbeam experiment. A three-dimensional (3D) simulation model is established and single event effect (SEE) simulations are carried out on the basis of SiGe HBT devices; then, together with the laser microbeam test, the charge collection behaviors are analyzed, including the single event transient (SET) induced transient terminal currents and the sensitive area of SEE charge collection. The simulation and experimental results are discussed in detail, and it is demonstrated that the nature of the current transient is controlled by the behavior of the collector-substrate (C/S) junction and by charge collection at sensitive electrodes, thereby identifying the sensitive area and sensitive electrode of the SiGe HBT in SEE. Project supported by the National Natural Science Foundation of China (Grant No. 61274106).

  19. Diagnosis of repeated failures in discrete event systems.

    SciTech Connect

    Garcia, H. E.; Jiang, S.; Kumar, R.; Univ. of Kentucky; Iowa State Univ.

    2002-01-01

    We introduce the notion of repeated failure diagnosability for diagnosing the occurrence of a repeated number of failures in discrete event systems. This generalizes the earlier notion of diagnosability that was used to diagnose the occurrence of a failure, but from which the information regarding the multiplicity of the occurrence of the failure could not be obtained. It is possible that in some systems the same type of failure repeats a multiple number of times. It is desirable to have a diagnoser which not only diagnoses that such a failure has occurred but also determines the number of times the failure has occurred. To aid such analysis we introduce the notions of K-diagnosability (K failures diagnosability), [1, K]-diagnosability (1 through K failures diagnosability), and [1, ∞]-diagnosability (1 through ∞ failures diagnosability). Here the first notion is the weakest of all three, and the earlier notion of diagnosability is the same as that of K-diagnosability or that of [1, K]-diagnosability with K = 1. We give polynomial algorithms for checking these various notions of repeated failure diagnosability, and also present a procedure of polynomial complexity for the online diagnosis of repeated failures.
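
    The counting idea can be illustrated with a toy sketch (assumed dynamics, not the paper's polynomial construction): an online estimator tracks the set of possible (state, failure-count) pairs from observable events alone, capping counts at K. The two-state automaton and event labels below are hypothetical.

        # Toy online diagnoser for repeated failures (illustrative only).
        # 'f' is an unobservable failure event; counts are capped at K, so a
        # reported K means "at least K failures".
        K = 2
        TRANS = {
            "up":   {"a": "up", "f": "down"},
            "down": {"b": "up", "f": "down"},
        }

        def unobs_reach(est):
            """Close an estimate under unobservable failure transitions."""
            est, frontier = set(est), list(est)
            while frontier:
                q, k = frontier.pop()
                nxt = TRANS[q].get("f")
                if nxt is not None:
                    pair = (nxt, min(k + 1, K))
                    if pair not in est:
                        est.add(pair)
                        frontier.append(pair)
            return est

        def diagnose(observation, start="up"):
            est = unobs_reach({(start, 0)})
            for e in observation:
                est = {(TRANS[q][e], k) for q, k in est if e in TRANS[q]}
                est = unobs_reach(est)
            return sorted({k for _, k in est})   # possible failure counts

        # 'b' is only enabled after a failure, so at least one has occurred:
        print(diagnose(["a", "b"]))              # -> [1, 2]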

  20. Diagnosis of repeated/intermittent failures in discrete event systems.

    SciTech Connect

    Garcia, H. E.; Jiang, S.; Kumar, R.

    2003-04-01

    We introduce the notion of repeated failure diagnosability for diagnosing the occurrence of a repeated number of failures in discrete event systems. This generalizes the earlier notion of diagnosability that was used to diagnose the occurrence of a failure, but from which the information regarding the multiplicity of the occurrence of the failure could not be obtained. It is possible that in some systems the same type of failure repeats a multiple number of times. It is desirable to have a diagnoser which not only diagnoses that such a failure has occurred but also determines the number of times the failure has occurred. To aid such analysis we introduce the notions of K-diagnosability (K failures diagnosability), [1, K]-diagnosability (1 through K failures diagnosability), and [1, ∞]-diagnosability (1 through ∞ failures diagnosability). Here the first (resp., last) notion is the weakest (resp., strongest) of all three, and the earlier notion of diagnosability is the same as that of K-diagnosability or that of [1, K]-diagnosability with K = 1. We give polynomial algorithms for checking these various notions of repeated failure diagnosability, and also present a procedure of polynomial complexity for the on-line diagnosis of repeated failures.

  1. PRELIMINARY SYSTEMS ANALYSIS AND SIMULATION

    DTIC Science & Technology

    stability augmentation system specification; the development of the bridge concept for roll and yaw louver control; support of various hardware tests; the generation of the specification for the DeFlorez point light source visual display; furnishing consultation services during the DeFlorez display installation and testing; and developing the yaw, roll and pitch direction cosine

  2. Electric-Power System Simulator

    NASA Technical Reports Server (NTRS)

    Caldwell, R. W.; Grumm, R. L.; Biedebach, B. L.

    1984-01-01

    Shows different combinations of generation, storage, and load components: display, video monitor with keyboard input to microprocessor, and video monitor for display of load curves and power generation. Planning tool for electric utilities, regulatory agencies, and laymen in understanding basics of electric-power systems operation.

  3. Final Technical Report "Multiscale Simulation Algorithms for Biochemical Systems"

    SciTech Connect

    Petzold, Linda R.

    2012-10-25

    Biochemical systems are inherently multiscale and stochastic. In microscopic systems formed by living cells, the small numbers of reactant molecules can result in dynamical behavior that is discrete and stochastic rather than continuous and deterministic. An analysis tool that respects these dynamical characteristics is the stochastic simulation algorithm (SSA, Gillespie, 1976), a numerical simulation procedure that is essentially exact for chemical systems that are spatially homogeneous or well stirred. Despite recent improvements, as a procedure that simulates every reaction event, the SSA is necessarily inefficient for most realistic problems. There are two main reasons for this, both arising from the multiscale nature of the underlying problem: (1) stiffness, i.e. the presence of multiple timescales, the fastest of which are stable; and (2) the need to include in the simulation both species that are present in relatively small quantities and should be modeled by a discrete stochastic process, and species that are present in larger quantities and are more efficiently modeled by a deterministic differential equation (or at some scale in between). This project has focused on the development of fast and adaptive algorithms, and the fundamental theory upon which they must be based, for the multiscale simulation of biochemical systems. Areas addressed by this project include: (1) Theoretical and practical foundations for accelerated discrete stochastic simulation (tau-leaping); (2) Dealing with stiffness (fast reactions) in an efficient and well-justified manner in discrete stochastic simulation; (3) Development of adaptive multiscale algorithms for spatially homogeneous discrete stochastic simulation; (4) Development of high-performance SSA algorithms.
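
    As a point of reference for the acceleration work described above, a minimal Gillespie direct-method SSA for a hypothetical birth-death process is sketched below; this is the exact event-by-event loop that tau-leaping approximates. Rates and species are assumptions, not the project's.

        # Gillespie direct-method SSA for a birth-death process (sketch).
        import math, random

        def ssa_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0,
                            seed=42):
            rng = random.Random(seed)
            t, x, trajectory = 0.0, x0, [(0.0, x0)]
            while t < t_end:
                a1, a2 = k_birth, k_death * x     # reaction propensities
                a0 = a1 + a2
                if a0 == 0.0:
                    break
                # exponential waiting time to the next reaction event
                t += -math.log(1.0 - rng.random()) / a0
                x += 1 if rng.random() * a0 < a1 else -1
                trajectory.append((t, x))
            return trajectory

        traj = ssa_birth_death()
        print(f"{len(traj) - 1} events; final copy number = {traj[-1][1]}")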

  4. Reduced salinity increases susceptibility of zooxanthellate jellyfish to herbicide toxicity during a simulated rainfall event.

    PubMed

    Klein, Shannon G; Pitt, Kylie A; Carroll, Anthony R

    2016-02-01

    Accurately predicting how marine biota are likely to respond to changing ocean conditions requires accurate simulation of interacting stressors, exposure regimes and recovery periods. Jellyfish populations have increased in some parts of the world and, despite few direct empirical tests, are hypothesised to be increasing because they are robust to a range of environmental stressors. Here, we investigated the effects of contaminated runoff on a zooxanthellate jellyfish by exposing juvenile Cassiopea sp. medusae to a photosystem II (PSII) herbicide, atrazine, and the reduced salinity conditions that occur following rainfall. Four levels of atrazine (0 ng L⁻¹, 10 ng L⁻¹, 2 μg L⁻¹, 20 μg L⁻¹) and three levels of salinity (35 ppt, 25 ppt, 17 ppt) were varied, mimicking the timeline of light, moderate and heavy rainfall events. Normal conditions were then slowly re-established over four days to mimic the recovery of the ecosystem post-rain, and the experiment continued for a further 7 days to observe potential recovery of the medusae. Pulse-amplitude modulated (PAM) chlorophyll fluorescence, growth and bell contraction rates of medusae were measured. Medusae exposed to the combination of high atrazine and lowest salinity died. After 3 days of exposure, bell contraction rates were reduced by 88% and medusae were 16% smaller in the lowest salinity treatments. By Day 5 of the experiment, all medusae that survived the initial pulse event began to recover quickly. Although atrazine decreased the effective quantum yield Y(II) under normal salinity conditions, Y(II) was further reduced when medusae were exposed to both low salinity and atrazine simultaneously. Atrazine breakdown products were more concentrated in jellyfish tissues than atrazine at the end of the experiment, suggesting that although bioaccumulation occurred, atrazine was metabolised. Our results suggest that reduced salinity may increase the susceptibility of medusae to herbicide exposure during heavy rainfall events.

  5. Ab initio molecular dynamics simulations of low energy recoil events in MgO

    DOE PAGES

    Petersen, B. A.; Liu, B.; Weber, W. J.; ...

    2017-01-11

    In this paper, low-energy recoil events in MgO are studied using ab initio molecular dynamics simulations to reveal the dynamic displacement processes and final defect configurations. Threshold displacement energies, Ed, are obtained for Mg and O along three low-index crystallographic directions, [100], [110], and [111]. The minimum values for Ed are found along the [110] direction consisting of the same element, either Mg or O atoms. Minimum threshold values of 29.5 eV for Mg and 25.5 eV for O, respectively, are suggested from the calculations. For other directions, the threshold energies are considerably higher: 65.5 and 150.0 eV for O along [111] and [100], and 122.5 eV for Mg along both [111] and [100] directions, respectively. These results show that the recoil events in MgO are partial-charge transfer assisted processes where the charge transfer plays an important role. Finally, there is a similar trend found in other oxide materials, where the threshold displacement energy correlates linearly with the peak partial-charge transfer, suggesting this behavior might be universal in ceramic oxides.

  6. An RCM-E simulation of a steady magnetospheric convection event

    NASA Astrophysics Data System (ADS)

    Yang, J.; Toffoletto, F.; Wolf, R.; Song, Y.

    2009-12-01

    We present simulation results of an idealized steady magnetospheric convection (SMC) event using the Rice Convection Model coupled with an equilibrium magnetic field solver (RCM-E). The event is modeled by placing a plasma distribution with substantially depleted entropy parameter PV5/3 on the RCM's high latitude boundary. The calculated magnetic field shows a highly depressed configuration due to the enhanced westward current around geosynchronous orbit, where the resulting partial ring current is stronger and more symmetric than in a typical substorm growth phase. The magnitude of the BZ component in the mid plasma sheet is large compared to empirical magnetic field models. Contrary to some previous results, there is no deep BZ minimum in the near-Earth plasma sheet. This suggests that the magnetosphere could transition into a strong adiabatic earthward convection mode without significant stretching of the plasma-sheet magnetic field when there are flux tubes with depleted plasma content continuously entering the inner magnetosphere from the mid-tail. Virtual AU/AL and Dst indices are also calculated using a synthetic magnetogram code and are compared to typical features in published observations.

  7. Ab initio molecular dynamics simulations of low energy recoil events in MgO

    NASA Astrophysics Data System (ADS)

    Petersen, B. A.; Liu, B.; Weber, W. J.; Zhang, Y.

    2017-04-01

    Low-energy recoil events in MgO are studied using ab initio molecular dynamics simulations to reveal the dynamic displacement processes and final defect configurations. Threshold displacement energies, Ed, are obtained for Mg and O along three low-index crystallographic directions, [100], [110], and [111]. The minimum values for Ed are found along the [110] direction consisting of the same element, either Mg or O atoms. Minimum threshold values of 29.5 eV for Mg and 25.5 eV for O, respectively, are suggested from the calculations. For other directions, the threshold energies are considerably higher: 65.5 and 150.0 eV for O along [111] and [100], and 122.5 eV for Mg along both [111] and [100] directions, respectively. These results show that the recoil events in MgO are partial-charge transfer assisted processes where the charge transfer plays an important role. There is a similar trend found in other oxide materials, where the threshold displacement energy correlates linearly with the peak partial-charge transfer, suggesting this behavior might be universal in ceramic oxides.

  8. Applications Of Monte Carlo Radiation Transport Simulation Techniques For Predicting Single Event Effects In Microelectronics

    SciTech Connect

    Warren, Kevin; Reed, Robert; Weller, Robert; Mendenhall, Marcus; Sierawski, Brian; Schrimpf, Ronald

    2011-06-01

    MRED (Monte Carlo Radiative Energy Deposition) is Vanderbilt University's Geant4 application for simulating radiation events in semiconductors. Geant4 comprises the best available computational physics models for the transport of radiation through matter. In addition to the basic radiation transport physics contained in the Geant4 core, MRED has the capability to track energy loss in tetrahedral geometric objects, includes a cross section biasing and track weighting technique for variance reduction, and provides additional features relevant to semiconductor device applications. The crucial element of predicting Single Event Upset (SEU) parameters using radiation transport software is the creation of a dosimetry model that accurately approximates the net collected charge at transistor contacts as a function of deposited energy. The dosimetry technique described here is the multiple sensitive volume (MSV) model. It is shown to be a reasonable approximation of the charge collection process, and its parameters can be calibrated to experimental measurements of SEU cross sections. The MSV model, within the framework of MRED, is examined for heavy ion and high-energy proton SEU measurements of a static random access memory.
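
    The MSV idea can be sketched in a few lines (an illustration, not MRED): approximate collected charge as an efficiency-weighted sum of the energy deposited in nested volumes, converting with the ~3.6 eV-per-pair figure for silicon. The volumes, weights, and deposits below are assumptions.

        # Multiple-sensitive-volume (MSV) charge collection, toy version.
        E_PER_PAIR_EV = 3.6       # mean energy per e-h pair in silicon
        Q_ELECTRON_FC = 1.602e-4  # electron charge in fC (1.602e-19 C)

        def collected_charge_fC(deposits_MeV, weights):
            """deposits_MeV[i]: energy deposited in volume i; weights[i]:
            collection efficiency of volume i (calibrated to SEU data)."""
            pairs = sum(w * e * 1e6 / E_PER_PAIR_EV
                        for e, w in zip(deposits_MeV, weights))
            return pairs * Q_ELECTRON_FC

        # e.g. three nested volumes: drift region collects fully,
        # substrate only partially
        print(f"{collected_charge_fC([0.5, 0.3, 0.8], [1.0, 0.6, 0.1]):.1f} fC")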

  9. TRANSIMS: Transportation analysis and simulation system

    SciTech Connect

    Smith, L.; Beckman, R.; Baggerly, K.

    1995-07-01

    This document summarizes the TRansportation ANalysis and SIMulation System (TRANSIMS) Project, the system's major modules, and the project's near-term plans. TRANSIMS will employ advanced computational and analytical techniques to create an integrated regional transportation systems analysis environment. The simulation environment will include a regional population of individual travelers and freight loads with travel activities and plans, whose individual interactions will be simulated on the transportation system, and whose environmental impact will be determined. We will develop an interim operational capability (IOC) for each major TRANSIMS module during the five-year program. When the IOC is ready, we will complete a specific case study to confirm the IOC features, applicability, and readiness.

  10. Explicit simulation of a midlatitude Mesoscale Convective System

    SciTech Connect

    Alexander, G.D.; Cotton, W.R.

    1996-04-01

    We have explicitly simulated the mesoscale convective system (MCS) observed on 23-24 June 1985 during PRE-STORM, the Preliminary Regional Experiment for the Stormscale Operational and Research Meteorology Program. Stensrud and Maddox (1988), Johnson and Bartels (1992), and Bernstein and Johnson (1994) are among the researchers who have investigated various aspects of this MCS event. We have performed this MCS simulation (and a similar one of a tropical MCS; Alexander and Cotton 1994) in the spirit of the Global Energy and Water Cycle Experiment Cloud Systems Study (GCSS), in which cloud-resolving models are used to assist in the formulation and testing of cloud parameterization schemes for larger-scale models. In this paper, we describe (1) the nature of our 23-24 June MCS simulation and (2) our efforts to date in using our explicit MCS simulations to assist in the development of a GCM parameterization for mesoscale flow branches. The paper is organized as follows. First, we discuss the synoptic situation surrounding the 23-24 June PRE-STORM MCS, followed by a discussion of the model setup and results of our simulation. We then discuss the use of our MCS simulations in developing a GCM parameterization for mesoscale flow branches and summarize our results.

  11. Space radiator simulation system analysis

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A transient heat transfer analysis was carried out on a space radiator heat rejection system exposed to an arbitrarily prescribed combination of aerodynamic heating, solar, albedo, and planetary radiation. A rigorous analysis was carried out for the radiation panel and tubes lying in one plane and an approximate analysis was used to extend the rigorous analysis to the case of a curved panel. The analysis permits the consideration of both gaseous and liquid coolant fluids, including liquid metals, under prescribed, time dependent inlet conditions. The analysis provided a method for predicting: (1) transient and steady-state, two dimensional temperature profiles, (2) local and total heat rejection rates, (3) coolant flow pressure in the flow channel, and (4) total system weight and protection layer thickness.
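
    A toy illustration of the kind of transient radiator analysis described (not the report's method): an explicit finite-difference strip with conduction, radiation from both faces, and a fixed coolant-tube root temperature. All material properties, loads, and dimensions below are assumed.

        # 1-D explicit finite-difference radiating panel (illustrative).
        import numpy as np

        SIGMA = 5.670e-8                    # Stefan-Boltzmann, W m^-2 K^-4
        k, rho, cp = 170.0, 2700.0, 900.0   # aluminium: W/mK, kg/m^3, J/kgK
        th, eps, q_abs = 2e-3, 0.85, 150.0  # thickness m, emissivity, W/m^2

        nx, L, dt, t_end = 50, 0.5, 0.05, 600.0
        dx = L / (nx - 1)
        T = np.full(nx, 320.0)              # initial temperature, K
        T_root = 320.0                      # coolant-tube (root) temperature

        alpha = k / (rho * cp)
        for _ in range(int(t_end / dt)):
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
            # radiation from both faces plus absorbed environmental flux
            net = q_abs - 2.0 * eps * SIGMA * T**4
            T = T + dt * (alpha * lap + net / (rho * cp * th))
            T[0] = T_root                   # fixed root temperature
            T[-1] = T[-2]                   # adiabatic tip

        print(f"tip temperature after {t_end:.0f} s: {T[-1]:.1f} K")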

  12. Lunar Rocks: Available for Year of the Solar System Events

    NASA Astrophysics Data System (ADS)

    Allen, J. S.

    2010-12-01

    sections may be requested for college and university courses where petrographic microscopes are available for viewing. Requestors should contact Ms. Mary Luckey, Education Sample Curator. Email address: mary.k.luckey@nasa.gov NASA also loans sets of Moon rocks for use in classrooms, libraries, museums, and planetariums through the Lunar Sample Education Program. Lunar samples (three soils and three rocks) are encapsulated in a six-inch diameter clear plastic disk. A CD with PowerPoint presentations, analogue samples from Earth, a classroom activity guide, and additional printed material accompany the disks. Educators may qualify for the use of these disks by attending a content and security certification workshop sponsored by NASA's Aerospace Education Services Program (AESP). Contact Ms. Margaret Maher, AESP Director. Email address: mjm67@psu.edu NASA makes these precious samples available to the public and encourages the use of lunar rocks to highlight Year of the Solar System events. Surely these interesting specimens of another world will enhance the experience of all YSS participants, so please take advantage of these lunar samples and borrow them for events and classes.

  13. Numerically simulating the sandwich plate system structures

    NASA Astrophysics Data System (ADS)

    Feng, Guo-Qing; Li, Gang; Liu, Zhi-Hui; Niu, Huai-Lei; Li, Chen-Feng

    2010-09-01

    Sandwich plate systems (SPS) are advanced materials that have begun to receive extensive attention in naval architecture and ocean engineering. At present, according to the rules of classification societies, a mixture of shell and solid elements is required to simulate an SPS. Based on the principle of stiffness decomposition, a new numerical simulation method using shell elements was proposed. In accordance with the principle of stiffness decomposition, the total stiffness can be decomposed into the bending stiffness and the shear stiffness. The displacement and stress response related to bending stiffness was calculated with the laminated shell element. The displacement and stress response due to shear was calculated with a computational code written in FORTRAN. The total displacement and stress response of the SPS was then obtained by adding together these two parts. Finally, a rectangular SPS plate and a double-bottom structure were used for a simulation. The results show that the deflection simulated by the proposed elements is larger than that simulated by solid elements and the analytical solution according to Hoff theory, and close to that simulated by the mixture of shell and solid elements; the stress simulated by the proposed elements is close to that obtained with the other simulation methods. Thus, compared with calculations based on a mixture of shell and solid elements, the numerical simulation method given in the paper is more efficient and easier to use.
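
    The stiffness-decomposition principle can be illustrated with a shear-deformable strip (hypothetical geometry and stiffnesses, not the paper's test cases): total deflection is the bending contribution plus the shear contribution, computed separately and summed.

        # Stiffness decomposition: w_total = w_bending + w_shear for a
        # simply supported strip under uniform load (Timoshenko-type beam).
        def sandwich_deflection(q, L, D, S):
            """q: load [N/m]; L: span [m]; D: bending stiffness [N m^2];
            S: shear stiffness [N]. Returns midspan deflection [m]."""
            w_bending = 5.0 * q * L**4 / (384.0 * D)
            w_shear = q * L**2 / (8.0 * S)
            return w_bending + w_shear

        # steel faces + elastomer core (representative numbers, assumed)
        w = sandwich_deflection(q=10e3, L=2.0, D=1.2e6, S=4.0e7)
        print(f"midspan deflection: {w * 1e3:.2f} mm")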

  14. Classification of single-trial auditory events using dry-wireless EEG during real and motion simulated flight

    PubMed Central

    Callan, Daniel E.; Durantin, Gautier; Terzibas, Cengiz

    2015-01-01

    Application of neuro-augmentation technology based on dry-wireless EEG may be considerably beneficial for aviation and space operations because of the inherent dangers involved. In this study we evaluate classification performance of perceptual events using a dry-wireless EEG system during motion platform based flight simulation and actual flight in an open cockpit biplane, to determine if the system can be used in the presence of considerable environmental and physiological artifacts. A passive task involving 200 random auditory presentations of a chirp sound was used for evaluation. The advantage of this auditory task is that it does not interfere with the perceptual motor processes involved with piloting the plane. Classification was based on identifying the presentation of a chirp sound vs. silent periods. Independent component analysis (ICA) and Kalman filtering were assessed for their ability to enhance classification performance by extracting brain activity related to the auditory event from other non-task related brain activity and artifacts. The results of permutation testing revealed that single trial classification of the presence or absence of an auditory event was significantly above chance for all conditions on a novel test set. The best performance was achieved with both ICA and Kalman filtering relative to no processing: Platform Off (83.4% vs. 78.3%), Platform On (73.1% vs. 71.6%), Biplane Engine Off (81.1% vs. 77.4%), and Biplane Engine On (79.2% vs. 66.1%). This experiment demonstrates that dry-wireless EEG can be used in environments with considerable vibration, wind, acoustic noise, and physiological artifacts and achieve good single trial classification performance that is necessary for future successful application of neuro-augmentation technology based on brain-machine interfaces. PMID:25741249
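
    A rough sketch of the processing idea (synthetic data and assumed features, not the study's pipeline): unmix channels with ICA, project the components of each epoch onto an expected response template, and cross-validate a chirp-vs-silence classifier.

        # ICA unmixing + single-trial classification on synthetic epochs.
        import numpy as np
        from sklearn.decomposition import FastICA
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_epochs, n_ch, n_samp = 200, 8, 64
        labels = rng.integers(0, 2, n_epochs)          # 1 = chirp present
        erp = np.sin(np.linspace(0, np.pi, n_samp))    # toy auditory response
        mixing = rng.normal(size=(n_ch, 1))
        X = rng.normal(size=(n_epochs, n_ch, n_samp))  # artifacts + noise
        X += labels[:, None, None] * (mixing @ erp[None, :])[None] * 0.8

        flat = X.transpose(0, 2, 1).reshape(-1, n_ch)  # samples x channels
        ica = FastICA(n_components=n_ch, random_state=0)
        sources = ica.fit_transform(flat).reshape(n_epochs, n_samp, n_ch)

        feats = (sources * erp[None, :, None]).mean(axis=1)  # template match
        clf = LogisticRegression(max_iter=1000)
        acc = cross_val_score(clf, feats, labels, cv=5).mean()
        print(f"cross-validated chirp-detection accuracy: {acc:.2f}")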

  15. Communication Simulations for Power System Applications

    SciTech Connect

    Fuller, Jason C.; Ciraci, Selim; Daily, Jeffrey A.; Fisher, Andrew R.; Hauer, Matthew L.

    2013-05-29

    New smart grid technologies and concepts, such as dynamic pricing, demand response, dynamic state estimation, and wide area monitoring, protection, and control, are expected to require considerable communication resources. As the cost of retrofit can be high, future power grids will require the integration of high-speed, secure connections with legacy communication systems, while still providing adequate system control and security. While considerable work has been performed to create co-simulators for the power domain with load models and market operations, limited work has been performed in integrating communications directly into a power domain solver. The simulation of communication and power systems will become more important as the two systems become more inter-related. This paper will discuss ongoing work at Pacific Northwest National Laboratory to create a flexible, high-speed power and communication system co-simulator for smart grid applications. The framework for the software will be described, including architecture considerations for modular, high performance computing and large-scale scalability (serialization, load balancing, partitioning, cross-platform support, etc.). The current simulator supports the ns-3 (telecommunications) and GridLAB-D (distribution systems) simulators. Ongoing and future work will be described, including planned future expansions for a traditional transmission solver. A test case using the co-simulator, utilizing a transactive demand response system created for the Olympic Peninsula and AEP gridSMART demonstrations, requiring two-way communication between distributed and centralized market devices, will be used to demonstrate the value and intended purpose of the co-simulation environment.

  16. Extreme events in a vortex gas simulation of a turbulent half-jet

    NASA Astrophysics Data System (ADS)

    Suryanarayanan, Saikishan; Pathikonda, Gokul; Narasimha, Roddam

    2012-11-01

    Extensive simulations [arXiv:1008.2876v1 [physics.flu-dyn], BAPS.2010.DFD.LE.4] have shown that the temporally evolving vortex gas mixing layer has 3 regimes, including one which has a universal spreading rate. The present study explores the development of spatially evolving mixing layers, using a vortex gas model based on Basu et al. (1995, Appl. Math. Modelling). The effects of the velocity ratio (r) are analyzed via the most extensive simulations of this kind to date, involving up to 10000 vortices and averaging over up to 1000 convective times. While the temporal limit is approached as r approaches unity, striking features such as extreme events involving coherent structures, bending, deviation of the convection velocity from the mean velocity, spatial feedback and greater sensitivity to downstream and free stream boundary conditions are observed in the half-jet (r = 0) limit. A detailed statistical analysis reveals possible causes for the large scatter across experiments, as opposed to the commonly adopted explanation of asymptotic dependence on initial conditions. Supported in part by contract no. Intel/RN/4288.
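
    The building block of such simulations is the point-vortex (Biot-Savart) induction sum; the sketch below steps a regularized row of same-sign vortices with forward Euler. All strengths, positions, and parameters are illustrative, not the study's setup.

        # Point-vortex ("vortex gas") dynamics, bare-bones version.
        import numpy as np

        def vortex_velocities(z, gamma, delta=1e-3):
            """z: complex positions; gamma: circulations; delta: core
            regularization. Returns complex velocities u + i*v."""
            dz = z[:, None] - z[None, :]
            r2 = np.abs(dz) ** 2 + delta**2
            np.fill_diagonal(r2, np.inf)      # exclude self-induction
            # conjugate velocity: w* = sum_j gamma_j / (2*pi*i*(z_i - z_j))
            w_conj = (gamma[None, :] * dz.conj() / r2).sum(axis=1) \
                     / (2j * np.pi)
            return np.conj(w_conj)

        rng = np.random.default_rng(3)
        n = 400                               # a perturbed vortex row
        z = rng.uniform(0, 10, n) + 0.05j * rng.normal(size=n)
        gamma = np.full(n, 1.0 / n)
        dt = 0.01
        for _ in range(100):                  # forward-Euler time stepping
            z = z + dt * vortex_velocities(z, gamma)
        print(f"layer thickness (std of y): {z.imag.std():.4f}")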

  17. Simulation study on single event burnout in linear doping buffer layer engineered power VDMOSFET

    NASA Astrophysics Data System (ADS)

    Yunpeng, Jia; Hongyuan, Su; Rui, Jin; Dongqing, Hu; Yu, Wu

    2016-02-01

    The addition of a buffer layer can improve a device's secondary breakdown voltage and thus its single event burnout (SEB) threshold voltage. In this paper, an N-type linear doping buffer layer is proposed. Quasi-stationary avalanche simulations and heavy ion beam simulations show that an optimized linear doping buffer layer is critical. When SEB is induced by heavy ion impact, the electric field of the optimized linear doping buffer device is much lower than that of a device with an optimized constant doping buffer layer at a given buffer layer thickness and the same biasing voltages. The secondary breakdown voltage and the parasitic bipolar turn-on current are much higher than those with the optimized constant doping buffer layer. The linear buffer layer is therefore more advantageous for improving the device's SEB performance. Project supported by the National Natural Science Foundation of China (No. 61176071), the Doctoral Fund of Ministry of Education of China (No. 20111103120016), and the Science and Technology Program of State Grid Corporation of China (No. SGRI-WD-71-13-006).

  18. Simulation of DKIST solar adaptive optics system

    NASA Astrophysics Data System (ADS)

    Marino, Jose; Carlisle, Elizabeth; Schmidt, Dirk

    2016-07-01

    Solar adaptive optics (AO) simulations are a valuable tool to guide the design and optimization of current and future solar AO and multi-conjugate AO (MCAO) systems. Solar AO and MCAO systems rely on extended-object cross-correlating Shack-Hartmann wavefront sensors to measure the wavefront. Accurate solar AO simulations require computationally intensive operations, which have until recently presented a prohibitive computational cost. We present an update on the status of a solar AO and MCAO simulation tool being developed at the National Solar Observatory. The simulation tool is a multi-threaded application written in the C++ language that takes advantage of current large multi-core CPU computer systems and fast ethernet connections to provide accurate full simulation of solar AO and MCAO systems. It interfaces with KAOS, a state-of-the-art solar AO control software package developed by the Kiepenheuer-Institut für Sonnenphysik, which provides reliable AO control. We report on the latest results produced by the solar AO simulation tool.
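
    At the heart of such simulations is the correlating Shack-Hartmann measurement; a minimal sketch (synthetic images, not the NSO tool) recovers the shift of a subaperture image relative to a reference from the peak of an FFT-based cross-correlation.

        # Integer-pixel shift estimation via FFT cross-correlation.
        import numpy as np

        def subap_shift(img, ref):
            """Shift of img relative to ref from the correlation peak."""
            c = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
            iy, ix = np.unravel_index(np.argmax(c), c.shape)
            n, m = c.shape                     # unwrap circular indices
            return ((iy + n // 2) % n - n // 2, (ix + m // 2) % m - m // 2)

        rng = np.random.default_rng(7)
        ref = rng.normal(size=(32, 32))        # stand-in solar granulation
        img = np.roll(np.roll(ref, 3, axis=0), -2, axis=1)  # known shift
        print(subap_shift(img, ref))           # -> (3, -2)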

  19. Observing System Simulation Experiments: An Overview

    NASA Technical Reports Server (NTRS)

    Prive, Nikki C.; Errico, Ronald M.

    2016-01-01

    An overview of Observing System Simulation Experiments (OSSEs) will be given, with a focus on the calibration and validation of OSSE frameworks. Pitfalls and practices will be discussed, including observation error characteristics, incestuousness, and experimental design. The potential use of OSSEs for investigating the behaviour of data assimilation systems will be explored, including some results from experiments using the NASA GMAO OSSE.

  20. Space shuttle visual simulation system design study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The current and near-future state-of-the-art in visual simulation equipment technology is related to the requirements of the space shuttle visual system. Image source, image sensing, and displays are analyzed on a subsystem basis, and the principal conclusions are used in the formulation of a recommended baseline visual system. Perceptibility and visibility are also analyzed.

  1. An Open Source Simulation System

    NASA Technical Reports Server (NTRS)

    Slack, Thomas

    2005-01-01

    An investigation into the current state of the art of open source real time programming practices. This document covers what technologies are available, how easy it is to obtain, configure, and use them, and some performance measures taken on the different systems. A matrix of vendors and their products is included as part of this investigation, but this is not an exhaustive list, and represents only a snapshot in time in a field that is changing rapidly. Specifically, three approaches are investigated: 1. Completely open source on generic hardware, downloaded from the net. 2. Open source packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded, source-available proprietary software provided by the vendor for our evaluation.

  2. Program Simulates A Modular Manufacturing System

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Wang, Jian

    1996-01-01

    The SSE computer program provides a simulation environment for modeling manufacturing systems containing relatively small numbers of stations and operators. Designed to simulate the manufacturing of apparel, it is also used in other manufacturing domains. It is well suited to small or medium-size firms, including those lacking the expertise to develop detailed models or having only minimal knowledge of describing manufacturing systems and analyzing simulation results from mathematical models. The user does not need to know a simulation language to use SSE. It is used to design new modules and to evaluate existing modules. Originally written in Turbo C v2.0 for IBM PC-compatible computers running MS-DOS, it was also successfully implemented using Turbo C++ v3.0.

  3. The GRB Simulator: A System for Testing GOES Rebroadcast (GRB)

    NASA Astrophysics Data System (ADS)

    Gibbons, K.; Race, R.; Miller, C.; Barnes, K.; Dittberner, G.

    2012-12-01

    GOES Rebroadcast (GRB) signals in the GOES-R era will replace the current legacy GOES Variable (GVAR) signal and will have substantially different characteristics, including a change in data rate from a single 2.1 Mbps stream to two digital streams of 15.5 Mbps each. The GRB Simulator is a portable system that outputs a high-fidelity stream of Consultative Committee for Space Data Systems (CCSDS) formatted GRB packet data equivalent to live GRB data. The data is used for on-site testing of user ingest and data handling systems known as field terminal sites. The GRB Simulator is a fully self-contained system which includes all hardware units needed for operation. The operator manages configurations to edit preferences, define individual test scenarios, and manage event logs and reports. Simulations are controlled by test scenarios, which are scripts that specify the test data and provide a series of actions for the GRB Simulator to perform when generating GRB output. Scenarios allow for the insertion of errors or modification of GRB packet headers for testing purposes. The GRB Simulator provides a built-in editor for managing scenarios. Data output by the simulator is derived from either proxy data files containing Level 1b (L1b) or GLM L2+ data, test pattern images, or non-image test pattern generation commands specified from within a scenario. The GRB Simulator outputs packets containing both instrument and GRB Information data. Instrument packets contain data simulated from any instrument: the Advanced Baseline Imager (ABI), Solar Ultraviolet Imager (SUVI), Space Environment In-Situ Suite (SEISS), Extreme Ultraviolet Sensor (EUVS) and X-ray Irradiance Sensor (XRS) called EXIS, Geostationary Lightning Mapper (GLM), or the Magnetometer. The GRB Information packets contain information such as satellite schedules. The GRB Simulator will provide GRB data as either baseband (digital) or Intermediate Frequency (IF) output to the test system. GRB packet data will be sent
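
    For readers unfamiliar with the packet format, the sketch below packs the standard 6-byte CCSDS space packet primary header (per CCSDS 133.0-B); the APID and payload are made-up values, and this is not the GRB Simulator's code.

        # Build a CCSDS space packet primary header (6 bytes, big-endian).
        import struct

        def ccsds_primary_header(apid, seq_count, payload_len, sec_hdr=True):
            # version=000, type=0 (telemetry), secondary header flag, APID
            word1 = (0 << 13) | (0 << 12) | (int(sec_hdr) << 11) \
                    | (apid & 0x7FF)
            word2 = (0b11 << 14) | (seq_count & 0x3FFF)  # unsegmented packet
            word3 = payload_len - 1       # length field is (octets - 1)
            return struct.pack(">HHH", word1, word2, word3)

        packet = ccsds_primary_header(apid=0x180, seq_count=42,
                                      payload_len=256) + bytes(256)
        print(packet[:6].hex(), len(packet))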

  4. Federated Simulations for Systems of Systems Integration

    DTIC Science & Technology

    2008-12-01

    It calls for a common definition of requirements via the Military Missions and Means framework (Sheehan, Dietz, Bray, Harris, and Wong 2004). It... missions and means framework. Technical Report TR-756, Army Materiel Systems Analysis Activity. Tolk, A., T. Litwin, and R. Kewley. 2008, December. A

  5. Thermal enclosure system functional simulation user's manual

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    1994-01-01

    A form and function simulation of the thermal enclosure system (TES) for a microgravity protein crystal growth experiment has been developed as part of an investigation of the benefits and limitations of intravehicular telerobotics to aid in microgravity science and production. A user can specify the time, temperature, and sample rate profile for a given experiment, and menu options and status are presented on an LCD display. This report describes the features and operational procedures for the functional simulation.

  6. Event detection and sub-state discovery from biomolecular simulations using higher-order statistics: application to enzyme adenylate kinase.

    PubMed

    Ramanathan, Arvind; Savol, Andrej J; Agarwal, Pratul K; Chennubhotla, Chakra S

    2012-11-01

    Biomolecular simulations at millisecond and longer time-scales can provide vital insights into functional mechanisms. Because post-simulation analyses of such large trajectory datasets can be a limiting factor in obtaining biological insights, there is an emerging need to identify key dynamical events and relate these events to the biological function online, that is, as simulations are progressing. Recently, we have introduced a novel computational technique, quasi-anharmonic analysis (QAA) (Ramanathan et al., PLoS One 2011;6:e15827), for partitioning the conformational landscape into a hierarchy of functionally relevant sub-states. The unique capabilities of QAA are enabled by exploiting anharmonicity in the form of fourth-order statistics for characterizing atomic fluctuations. In this article, we extend QAA for analyzing long time-scale simulations online. In particular, we present HOST4MD, a higher-order statistical toolbox for molecular dynamics simulations, which (1) identifies key dynamical events as simulations are in progress, (2) explores potential sub-states, and (3) identifies conformational transitions that enable the protein to access those sub-states. We demonstrate HOST4MD on microsecond timescale simulations of the enzyme adenylate kinase in its apo state. HOST4MD identifies several conformational events in these simulations, revealing how the intrinsic coupling between the three subdomains (LID, CORE, and NMP) changes during the simulations. Further, it also identifies an inherent asymmetry in the opening/closing of the two binding sites. We anticipate that HOST4MD will provide a powerful and extensible framework for detecting biophysically relevant conformational coordinates from long time-scale simulations.
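
    The fourth-order-statistics idea can be seen in miniature (synthetic "trajectory", not HOST4MD): score each coordinate's fluctuations by excess kurtosis, which is near zero for a harmonic (Gaussian) mode and strongly negative for a two-state hopping coordinate.

        # Flag anharmonic coordinates via fourth-order statistics.
        import numpy as np
        from scipy.stats import kurtosis

        rng = np.random.default_rng(11)
        n_frames = 5000
        harmonic = rng.normal(size=n_frames)          # Gaussian basin
        two_state = np.where(rng.random(n_frames) < 0.5, -1.5, 1.5) \
                    + 0.3 * rng.normal(size=n_frames) # hopping coordinate

        traj = np.column_stack([harmonic, two_state])
        traj -= traj.mean(axis=0)                     # fluctuations only
        scores = kurtosis(traj, axis=0)               # excess kurtosis
        # ~0 for the harmonic mode, strongly negative for the two-state one
        print(scores)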

  7. Event-Driven Random-Access-Windowing CCD Imaging System

    NASA Technical Reports Server (NTRS)

    Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William

    2004-01-01

    A charge-coupled-device (CCD) based high-speed imaging system, called a real-time, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in "High-Frame-Rate CCD Camera Having Subwindow Capability" (NPO-30564), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between the camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during the pixel readout, the present design reduces ROI-readout times to attain higher frame rates. The camera includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor/transistor-logic (TTL)-level signals from a field programmable gate array (FPGA) controller card. These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable.

  8. Multipurpose simulation systems for regional development forecasting

    SciTech Connect

    Kostina, N.I.

    1995-09-01

    We examine the development of automaton-modeling multipurpose simulation systems as an efficient form of simulation software for MIS. Such systems constitute a single problem-oriented package of applications based on a general simulation model, which is equipped with a task source language, interaction tools, file management tools, and an output document editor. The simulation models are described by the method of probabilistic-automaton modeling, which ensures standard representation of models and standardization of the modeling algorithm. Examples of such systems include the demographic forecasting system DEPROG, the VOKON system for assessing the quality of consumer services in terms of free time, and the SONET system for servicing partially accessible customers. The development of computer-aided systems for production and economic control is now moving to the second stage, namely the operationalization of optimization and forecasting problems, whose solution may account for the main economic effect of MIS. Computation and information problems, which were the main focus of the first stage of MIS development, are thus acquiring the role of a source of information for optimization and forecasting problems, in addition to their direct contribution to the preparation and analysis of current production and economic information.

  9. Shuttle Propulsion System Major Events and the Final 22 Flights

    NASA Technical Reports Server (NTRS)

    Owen, James W.

    2011-01-01

    Numerous lessons have been documented from the Space Shuttle Propulsion elements. Major events include loss of the Solid Rocket Boosters (SRB's) on STS-4 and shutdown of a Space Shuttle Main Engine (SSME) during ascent on STS-51F. On STS-112 only half the pyrotechnics fired during release of the vehicle from the launch pad, a testament to redundancy. STS-91 exhibited freezing of a main combustion chamber pressure measurement, and on STS-93 nozzle tube ruptures necessitated a low liquid level oxygen cut off of the main engines. A number of on-pad aborts were experienced during the early program, resulting in delays. And the two accidents, STS-51L and STS-107, had unique heritage in history from early program decisions and vehicle configuration. Following STS-51L, significant resources were invested in developing fundamental physical understanding of solid rocket motor environments and material system behavior. And following STS-107, the risk of ascent debris was better characterized and controlled. Situational awareness during all mission phases improved, and the management team instituted effective risk assessment practices. The last 22 flights of the Space Shuttle, following the Columbia accident, were characterized by remarkable improvement in safety and reliability. Numerous problems were solved in addition to reduction of the ascent debris hazard. The Shuttle system, though not as operable as envisioned in the 1970's, successfully assembled the International Space Station (ISS). By the end of the program, the remarkable Space Shuttle Propulsion system achieved very high performance, was largely reusable, exhibited high reliability, and was a heavy lift earth to orbit propulsion system. During the program a number of project management and engineering processes were implemented and improved. Technical performance, schedule accountability, cost control, and risk management were effectively managed and implemented. Award fee contracting was implemented to provide

  10. Spatial Aspects in Biological System Simulations

    PubMed Central

    Resat, Haluk; Costa, Michelle N.; Shankaran, Harish

    2012-01-01

    Mathematical models of the dynamical properties of biological systems aim to improve our understanding of the studied system with the ultimate goal of being able to predict system responses in the absence of experimentation. Despite the enormous advances that have been made in biological modeling and simulation, the inherently multiscale character of biological systems and the stochasticity of biological processes continue to present significant computational and conceptual challenges. Biological systems often consist of well-organized structural hierarchies, which inevitably lead to multiscale problems. This chapter introduces and discusses the advantages and shortcomings of several simulation methods that are being used by the scientific community to investigate the spatiotemporal properties of model biological systems. We first describe the foundations of the methods and then describe their relevance and possible application areas with illustrative examples from our own research. Possible ways to address the encountered computational difficulties are also discussed. PMID:21187236

  11. 2001 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2002-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to

  12. 2000 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2001-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to

  13. System Equivalent for Real Time Digital Simulator

    NASA Astrophysics Data System (ADS)

    Lin, Xi

    2011-07-01

    The purpose of this research is to develop a method of making system equivalents for the Real Time Digital Simulator (RTDS), which should enhance its capability of simulating large power systems. The proposed equivalent combines a Frequency Dependent Network Equivalent (FDNE) for the high frequency electromagnetic transients and a Transient Stability Analysis (TSA) type simulation block for the electromechanical transients. The frequency dependent characteristic for the FDNE is obtained by curve-fitting frequency domain admittance characteristics using the Vector Fitting method. An approach for approximating the frequency dependent characteristic of large power networks from readily available typical power-flow data is also introduced. A new scheme for incorporating the TSA solution in RTDS is proposed, and this report shows how the TSA algorithm can be adapted to a real time platform. The validity of this method is confirmed with examples, including the study of a multi-infeed HVDC network.

  14. Individualized, discrete event simulations provide insight into inter- and intra-subject variability of extended-release drug products

    PubMed Central

    2012-01-01

    Objective: Develop and validate particular, concrete, and abstract yet plausible in silico mechanistic explanations for the large intra- and interindividual variability observed for eleven bioequivalence study participants. Do so in the face of considerable uncertainty about mechanisms. Methods: We constructed an object-oriented, discrete event model called subject (we use small caps to distinguish computational objects from their biological counterparts). It maps abstractly to a dissolution test system and a study subject to whom product was administered orally. A subject comprises four interconnected grid spaces and event mechanisms that map to different physiological features and processes. Drugs move within and between spaces. We followed an established Iterative Refinement Protocol. Individualized mechanisms were made sufficiently complicated to achieve prespecified Similarity Criteria, but no more so. Within subjects, the dissolution space is linked to both a product-subject Interaction Space and the GI tract. The GI tract and Interaction Space connect to plasma, from which drug is eliminated. Results: We discovered parameterizations that enabled the eleven subject simulation results to achieve the most stringent Similarity Criteria. Simulated profiles closely resembled those with normal, odd, and double peaks. We observed important subject-by-formulation interactions within subjects. Conclusion: We hypothesize that there were interactions within bioequivalence study participants corresponding to the subject-by-formulation interactions within subjects. Further progress requires methods to transition currently abstract subject mechanisms iteratively and parsimoniously to be more physiologically realistic. As that objective is achieved, the approach presented is expected to become beneficial to drug development (e.g., controlled release) and to a reduction in the number of subjects needed per study plus faster regulatory review. PMID:22938185
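
    The record above describes an object-oriented, discrete event mechanism in which drug objects move between interconnected spaces. A minimal sketch of that style of simulation, assuming a simple time-stamped event queue and exponential dwell times, is given below; the space names and rates are hypothetical, not taken from the paper.

```python
import heapq
import random

random.seed(1)

# Hypothetical space topology and mean dwell times (hours), not the paper's values.
NEXT_SPACE = {"dissolution": "gi_tract", "gi_tract": "plasma", "plasma": "eliminated"}
MEAN_DWELL = {"dissolution": 0.8, "gi_tract": 1.5, "plasma": 4.0}

def simulate(n_drug=1000, t_end=24.0):
    """Move drug objects space-to-space via time-stamped events on a heap."""
    # Each drug object starts in the dissolution space with a scheduled departure.
    events = [(random.expovariate(1.0 / MEAN_DWELL["dissolution"]), i, "dissolution")
              for i in range(n_drug)]
    heapq.heapify(events)
    counts = {"dissolution": n_drug, "gi_tract": 0, "plasma": 0, "eliminated": 0}
    while events and events[0][0] < t_end:
        t, i, space = heapq.heappop(events)
        nxt = NEXT_SPACE[space]
        counts[space] -= 1
        counts[nxt] += 1
        if nxt in MEAN_DWELL:  # schedule this drug object's next move
            heapq.heappush(events, (t + random.expovariate(1.0 / MEAN_DWELL[nxt]), i, nxt))
    return counts

print(simulate())  # occupancy of each space at 24 h
```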

  15. Towards real-time regional earthquake simulation I: real-time moment tensor monitoring (RMT) for regional events in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi

    2014-01-01

    We have developed a real-time moment tensor monitoring system (RMT) which takes advantage of a grid-based moment tensor inversion technique and real-time broad-band seismic recordings to automatically monitor earthquake activities in the vicinity of Taiwan. The centroid moment tensor (CMT) inversion technique and a grid search scheme are applied to obtain the information of earthquake source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism. All of these source parameters can be determined simultaneously within 117 s after the occurrence of an earthquake. The monitoring area involves the entire Taiwan Island and the offshore region, which covers the area of 119.3°E to 123.0°E and 21.0°N to 26.0°N, with a depth from 6 to 136 km. A 3-D grid system is implemented in the monitoring area with a uniform horizontal interval of 0.1° and a vertical interval of 10 km. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (fk) method. We compare our results with the Central Weather Bureau (CWB) catalogue data for earthquakes that occurred between 2010 and 2012. The average differences between event origin time and hypocentral location are less than 2 s and 10 km, respectively. The focal mechanisms determined by RMT are also comparable with the Broadband Array in Taiwan for Seismology (BATS) CMT solutions. These results indicate that the RMT system is practical and efficient for monitoring local seismic activity. In addition, the time needed to obtain all the point source parameters is reduced substantially compared to routine earthquake reports. By connecting RMT with a real-time online earthquake simulation (ROS) system, all the source parameters will be forwarded to the ROS to make real-time earthquake simulation feasible. The RMT has operated offline (2010-2011) and online (from January 2012 to the present) at the Institute of Earth Sciences (IES), Academia Sinica
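
    The core of the RMT scheme described above is a grid search: synthetics for every candidate source grid point are compared against the observed records and the best-fitting point is selected. The sketch below shows the search skeleton only, with random arrays standing in for the observed waveforms and the Green's-function synthetics; the real system also inverts for the moment tensor at each grid point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: observed waveforms at n_sta stations, n_samp samples each.
n_sta, n_samp, n_grid = 8, 500, 1000
observed = rng.standard_normal((n_sta, n_samp))

# Hypothetical database of precomputed synthetics, one set per grid point
# (in the RMT system these come from the 1-D fk Green's function database).
synthetics = rng.standard_normal((n_grid, n_sta, n_samp))

# Grid search: pick the grid point whose synthetics minimize the L2 misfit.
misfit = np.sum((synthetics - observed[None, :, :]) ** 2, axis=(1, 2))
best = int(np.argmin(misfit))
print("best grid point:", best, "misfit:", misfit[best])
```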

  16. Systems simulation for an airport trailing vortex warning system

    NASA Technical Reports Server (NTRS)

    Jeffreys, H. B.

    1972-01-01

    The approach, development, and limited system studies associated with a system simulation for an Airport Trailing Vortex Warning System are documented. The usefulness is shown of a systems engineering approach to the problem of developing a system, as dictated by aircraft vortices, which will increase air-traffic flow in the takeoff/landing corridors of busy airports while maintaining the required safety factor for each operation. The simulation program has been developed in a modular form which permits new, more sophisticated component models, when they become available and are required, to be incorporated into the program with a minimum of program modifications. This report documents a limited system study that has been performed using this Total System Simulation Model. The resulting preliminary system requirements, conclusions, and recommendations are given.

  17. Bernoulli-Langevin Wind Speed Model for Simulation of Storm Events

    NASA Astrophysics Data System (ADS)

    Fürstenau, Norbert; Mittendorf, Monika

    2016-12-01

    We present a simple nonlinear dynamics Langevin model for predicting the nonstationary wind speed profile during storm events typically accompanying extreme low-pressure situations. It is based on a second-degree Bernoulli equation with δ-correlated Gaussian noise and may complement stationary stochastic wind models. Transitions between increasing and decreasing wind speed and between (quasi) stationary normal wind and storm states are induced by the sign change of the controlling time-dependent rate parameter k(t). This approach corresponds to the simplified nonlinear laser dynamics for the incoherent-to-coherent transition of light emission, which can be understood by a phase transition analogy within equilibrium thermodynamics [H. Haken, Synergetics, 3rd ed., Springer, Berlin, Heidelberg, New York 1983/2004]. Evidence for the nonlinear dynamics two-state approach is generated by fitting two historical wind speed profiles (low-pressure situations "Xaver" and "Christian", 2013), taken from Meteorological Terminal Air Report weather data, with a logistic approximation (i.e. constant rate coefficients k) to the solution of our dynamical model using a sum of sigmoid functions. The analytical solution of our dynamical two-state Bernoulli equation, as obtained with a sinusoidal rate ansatz k(t) of period T (= storm duration), exhibits reasonable agreement with the logistic fit to the empirical data. Noise parameter estimates of speed fluctuations are derived from empirical fit residuals and by means of a stationary solution of the corresponding Fokker-Planck equation. Numerical simulations with the Bernoulli-Langevin equation demonstrate the potential for stochastic wind speed profile modeling and predictive filtering under extreme storm events, which is suggested for applications in anticipative air traffic management.
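
    A minimal numerical sketch of such a model is an Euler-Maruyama integration of a second-degree Bernoulli equation with additive Gaussian noise and a sinusoidal rate parameter k(t). All coefficients below are illustrative assumptions, not the fitted values from the paper.

```python
import numpy as np

# Euler-Maruyama integration of a second-degree Bernoulli-Langevin equation,
#   dv/dt = k(t) * v - b * v**2 + sigma * xi(t),
# with a sinusoidal rate k(t) of period T; the sign change of k(t) switches
# the system between wind-speed growth and decay.
T, b, k0, sigma = 24.0, 0.05, 0.4, 0.5   # storm duration (h) and toy parameters
dt, n = 0.01, 4800
t = np.arange(n) * dt
k = k0 * np.sin(2 * np.pi * t / T)

rng = np.random.default_rng(42)
v = np.empty(n)
v[0] = 5.0                               # m/s, assumed pre-storm wind speed
for i in range(n - 1):
    drift = k[i] * v[i] - b * v[i] ** 2
    v[i + 1] = max(v[i] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(), 0.0)

print("peak simulated wind speed: %.1f m/s" % v.max())
```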

  18. Mobilization of PAHs and PCBs from In-Place Contaminated Marine Sediments During Simulated Resuspension Events

    NASA Astrophysics Data System (ADS)

    Latimer, J. S.; Davis, W. R.; Keith, D. J.

    1999-10-01

    A particle entrainment simulator was used to experimentally produce representative estuarine resuspension conditions to investigate the resulting transport of polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs) to the overlying water column. Contaminants were evaluated in bulk sediments, size fractionated sediments, resuspended particulate material and, in some cases, dissolved phases during the experiments. The two types of sediments used in the experiments, dredged material and bedded estuarine sediment, represented gradients in contaminant loadings and sediment textural characteristics. For the bedded sediment, resuspension tended to winnow the sediments of finer particles. However, in the case of the more highly contaminated dredge material, non-selective resuspension was most common. Resuspension resulted in up to orders of magnitude higher particle-bound organic contaminant concentrations in the overlying water column. Dissolved phase PAH changes during resuspension were variable and, at most, increased by a factor of three. The sifting process resulted in the partitioning of fine and coarse particle contaminant loading. For bedded sediments, accurate predictions of PAH and PCB loadings on resuspended particles were made using the mass of resuspended particles of different sizes and the concentrations of contaminants in the particle pools of the bulk sediment. However, possibly due to contributions from other unmeasured particles (e.g. colloids), predictions were not possible for the dredge material. Thus, knowledge of the redistribution and fate of colloids may be important. The partitioning of PAHs between the dissolved and particulate phases during resuspension events was predicted to within a factor of two from the amount of organic carbon in each of the resuspended samples. These experiments show that contaminant transport is a function of the chemistry and textural characteristics of the bulk sediment and the winnowing action

  19. Computer simulations of learning in neural systems.

    PubMed

    Salu, Y

    1983-04-01

    Recent experiments have shown that, in some cases, the strengths of synaptic ties are modified in learning. However, the rules that control those modifications are not known, in particular what determines which synapses will be modified and which will remain unchanged during a learning episode. Two postulated rules that may solve that problem are introduced. To check their effectiveness, the rules are tested in many computer models that simulate learning in neural systems. The simulations demonstrate that, theoretically, the two postulated rules are effective in organizing the synaptic changes. If they are found to also exist in biological systems, these postulated rules may be an important element in the learning process.

  20. BOLIVAR-tool for analysis and simulation of metocean extreme events

    NASA Astrophysics Data System (ADS)

    Lopatoukhin, Leonid; Boukhanovsky, Alexander

    2015-04-01

    Metocean extreme events are caused by the combination of multivariate and multiscale processes which depend on each other at different scales (due to short-term, synoptic, annual, and year-to-year variability). There is no simple method for their estimation with controllable tolerance. Thus, in practice, extreme analysis is sometimes reduced to the exploration of various methods and models with respect to decreasing the uncertainty of the estimates. Therefore, a researcher needs multifaceted computational tools which cover the various branches of extreme analysis. BOLIVAR is multi-functional computational software for researchers and engineers who explore extreme environmental conditions to design and build offshore structures and floating objects. It contains a set of computational modules implementing various methods for extreme analysis, and a set of modules for the stochastic and hydrodynamic simulation of metocean processes. In this sense BOLIVAR is a Problem Solving Environment (PSE). BOLIVAR is designed for extreme events analysis and contains computational modules for the IDM, AMS, POT, MENU, and SINTEF methods, and a set of modules for stochastic simulation of metocean processes at various scales. BOLIVAR is a tool to simplify the resource-consuming computational experiments needed to explore metocean extremes in univariate and multivariate cases. There are field ARMA models for short-term variability, a spatial-temporal random pulse model for synoptic variability (alternation of storms and calms), and a cyclostationary model of annual and year-to-year variability. The combination of the above mentioned modules and data sources allows estimation of: omnidirectional and directional extremes (with T-year return periods); multivariate extremes (sets of parameters) and evaluation of their impacts on marine structures and floating objects; and extremes of spatial-temporal fields (including the trajectory of T-year storms). An employment of concurrent methods for
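
    As one example of the methods listed above, a peaks-over-threshold (POT) analysis fits a Generalized Pareto Distribution to threshold exceedances and extrapolates a T-year return level. The sketch below uses synthetic wave-height data and an arbitrary 99th-percentile threshold; both are stand-ins, and threshold selection is itself part of a real extreme-value study.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)

# Stand-in "observed" significant wave heights (m), hourly for ~10 years.
h = rng.gumbel(loc=2.0, scale=0.6, size=10 * 365 * 24)

# POT: fit a Generalized Pareto Distribution to threshold exceedances.
u = np.quantile(h, 0.99)
exc = h[h > u] - u
shape, _, scale = genpareto.fit(exc, floc=0.0)

# Extrapolate the T-year return level: the value exceeded on average once
# per T years, given lam exceedances per year.
lam = len(exc) / 10.0
T_ret = 100.0
h_T = u + genpareto.ppf(1.0 - 1.0 / (lam * T_ret), shape, loc=0.0, scale=scale)
print("estimated %d-year wave height: %.2f m" % (T_ret, h_T))
```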

  1. Expert system for scheduling simulation lab sessions

    NASA Technical Reports Server (NTRS)

    Lund, Chet

    1990-01-01

    Implementation and results of an expert system used for scheduling session requests for the Systems Engineering Simulator (SES) laboratory at the NASA Johnson Space Center (JSC) are discussed. Weekly session requests are received from astronaut crew trainers, procedures developers, engineering assessment personnel, software developers, and various others who wish to access the computers, scene generators, and other simulation equipment available to them in the SES lab. The expert system under discussion comprises a data acquisition portion - two Pascal programs run on a personal computer - and a CLIPS program installed on a minicomputer. A brief introduction to the SES lab and its scheduling background is given. A general overview of the system is provided, followed by a detailed description of the constraint-reduction process and of the scheduler itself. Results from a ten-week trial period using this approach are discussed. Finally, a summary of the expert system's strengths and shortcomings is provided.

  2. Re-awakening Magmatic Systems: The Mechanics of an Open-system Event

    NASA Astrophysics Data System (ADS)

    Bergantz, George; Burgisser, Alain; Schleicher, Jillian

    2016-04-01

    The re-awakening of magmatic systems requires new magma input, which often induces mixing with a resident magma existing as a crystal-rich mush. This is expressed by complex phenocryst populations, many of which preserve evidence of multiple episodes of recycling. The unlocking and mobilization of these resident mushes condition the progress of re-awakening; however, the processes involved are poorly understood. Crystal-rich but mobile systems, dominated by their granular mechanics, are not satisfactorily explained by either fluid-like or solid-like models. We will present a generalizing framework for describing the mechanics of crystal-rich mushes based on the notion of force chains. Force chains arise from crystal-crystal contacts and describe the highly non-uniform way that stress is transmitted in a crystal-rich mush. Using CFD-DEM simulations that resolve crystal-scale mechanics, we will show how the populations of crystal mush force chains and their spatial fabric change during an open-system event. We will show how the various forms of dissipation, such as fluid drag, particle-fluid drag, particle normal and shear lubrication, and contact friction, jointly contribute to the processes of magma mush unlocking, mobilization and fabric formation. We will also describe non-intuitive constitutive behavior such as non-local and non-affine deformation as well as complex rheological transitions from continuous to discontinuous shear thickening as a function of the dimensionless shear rate. One implication of this is that many of the commonly invoked postulates about magma behavior, such as lock-up at a critical crystallinity and suspension rheology, are better understood from a micro-physical (crystal-scale) perspective as a combination of far-field geometrical controls, local frictional thickening and shear jamming, each with distinct time scales. This kind of crystal-based unifying framework can simultaneously recover diverse processes such as strain-localization, shear

  3. Redesigned Predictive Event-Triggered Controller for Networked Control System With Delays.

    PubMed

    Wu, Di; Sun, Xi-Ming; Wen, Changyun; Wang, Wei

    2016-10-01

    Event-triggered control (ETC) is a control strategy which can effectively reduce communication traffic in control networks. In the case where communication resources are scarce, ETC plays an important role in updating and communicating data. When network-induced delays are involved, two unsynchronized phenomena will appear if the existing ETC strategy, designed for networked control systems (NCSs) free of delays, is adopted. This paper deals with the ETC problem for NCS with delays existing in both sensor-to-controller and controller-to-actuator channels. A new predictive ETC strategy is proposed to solve both unsynchronized problems. It is shown that the stability of the resulting closed-loop system can be guaranteed under such an ETC strategy. Finally, both simulation studies and experimental tests are carried out to illustrate the proposed technique and verify its effectiveness.

  4. An investigation into pilot and system response to critical in-flight events. Volume 2: Appendix

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Griffin, W. C.

    1981-01-01

    Materials relating to the study of pilot and system response to critical in-flight events (CIFE) are given. An annotated bibliography and a trip summary outline are presented, as are knowledge surveys with accompanying answer keys. Performance profiles of pilots and performance data from the simulations of CIFEs are given. The paper-and-pencil testing materials are reproduced. Conditions for the use of the additive model are discussed. A master summary of data for the destination diversion scenario is given. An interview with an aircraft mechanic demonstrates the feasibility of system problem diagnosis from a verbal description of symptoms and shows the information seeking and problem solving logic used by an expert to narrow the list of probable causes of aircraft failure.

  5. Impact of cloud microphysics and cumulus parameterization on simulation of heavy rainfall event during 7-9 October 2007 over Bangladesh

    NASA Astrophysics Data System (ADS)

    Mahbub Alam, M.

    2014-03-01

    In the present study, the Advanced Research WRF (ARW) version 3.2.1 has been used to simulate the heavy rainfall event that occurred between 7 and 9 October 2007 in the southern part of Bangladesh. The WRF-ARW modelling system with six different microphysics (MP) schemes and two different cumulus parameterization (CP) schemes in a nested configuration was chosen for simulating the event. The model domains consist of outer and inner domains having 9 and 3 km horizontal resolution, respectively, with 28 vertical sigma levels. The impacts of cloud microphysical processes on the precipitation, wind, reflectivity, and kinematic and thermodynamic characteristics of the event have been studied. Sensitivity experiments have been conducted with the WRF model to test the impact of the microphysical and cumulus parameterization schemes in capturing the extreme weather event. NCEP FNL data were used for the initial and boundary conditions. The model was run for 72 h using initial data at 0000 UTC of 7 October 2007. The simulated rainfall shows that the WSM6-KF combination gives the best results of all combinations, followed by the Lin-KF combination. WSM3-KF simulated the least area-averaged rainfall of all the MP schemes coupled with the KF scheme. A sharp peak of relative humidity up to 300 hPa has been simulated along the vertical line of maximum updraft for all MPs coupled with the KF and BMJ schemes. The simulated rain water and cloud water mixing ratios were maximum at the position where the vertical velocity and reflectivity were also maximum. The production of the rain water mixing ratio depends on the MP schemes as well as the CP schemes. Rainfall depends on the rain water mixing ratio between 950 and 500 hPa; the rain water mixing ratio above the 500 hPa level has no effect on surface rain.
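
    In practice, such sensitivity runs are configured by varying the mp_physics and cu_physics entries of the WRF namelist.input. The sketch below generates the physics fragment for several members of the MP x CP matrix; the option numbers are the standard ARW namelist codes for the named schemes, but the fragment is illustrative and omits the rest of the namelist (and real configurations often disable the cumulus scheme on convection-permitting 3 km domains).

```python
# Generate illustrative WRF namelist.input physics fragments for a subset of
# the MP x CP sensitivity matrix. Option numbers are the standard ARW codes.
MP = {"WSM3": 3, "Lin": 2, "WSM6": 6}   # three of the six MP schemes tested
CP = {"KF": 1, "BMJ": 2}                # Kain-Fritsch, Betts-Miller-Janjic

TEMPLATE = """&physics
 mp_physics = {mp}, {mp},
 cu_physics = {cu}, {cu},
/"""

for mp_name, mp in MP.items():
    for cu_name, cu in CP.items():
        print(f"! member {mp_name}-{cu_name}")
        print(TEMPLATE.format(mp=mp, cu=cu))
```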

  6. Assessment of WRF microphysics schemes to simulate extreme precipitation events from the perspective of GMI radiative signatures

    NASA Astrophysics Data System (ADS)

    Choi, Y.; Shin, D. B.; Joh, M.

    2015-12-01

    Numerical simulations of precipitation depend to a large degree on the assumed cloud microphysics schemes representing the formation, growth and fallout of cloud droplets and ice crystals. Recent studies show that the assumed cloud microphysics play a major role not only in forecasting precipitation, especially in cases of extreme precipitation events, but also in the quality of passive microwave rainfall estimation. Evaluations of the various Weather Research and Forecasting (WRF) model microphysics schemes in this study are based on a method that was originally developed to construct the a priori databases of precipitation profiles and associated brightness temperatures (TBs) for precipitation retrievals. This methodology generates three-dimensional (3D) precipitation fields by matching the GPM dual-frequency radar (DPR) reflectivity profiles with those calculated from cloud resolving model (CRM)-derived hydrometeor profiles. The method eventually provides 3D simulated precipitation fields over the DPR scan swaths. That is, atmospheric and hydrometeor profiles can be generated at each DPR pixel based on the CRM and DPR reflectivity profiles. The generated raining systems over the DPR observation fields can be applied to any radiometer that is unaccompanied by a radar for microwave radiative calculation, with consideration of each sensor's channels and field of view. An assessment of the WRF model microphysics schemes for several typhoon cases in terms of the emission and scattering signals of GMI will be discussed.

  7. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    PubMed Central

    Nguyen, Vinh Hao; Suh, Young Soo

    2009-01-01

    This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD), sensor data are transmitted to the estimator node if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than time-triggered sampling in some situations, especially in improving network bandwidth usage. However, it cannot detect packet dropouts because data transmission and reception do not use a periodic time-stamp mechanism as found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme, called modified SOD, in which sensor data are sent when either the change of the sensor output exceeds a given threshold or the elapsed time exceeds a given interval. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen. PMID:22574063
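
    The modified SOD rule described above is simple to state in code: transmit when the output change exceeds a threshold, or when too much time has passed since the last transmission, so that a silent channel can be recognized as a dropout. The sketch below is a toy illustration with hypothetical parameter values, not the paper's estimator design.

```python
import math

def make_modified_sod(delta, max_interval):
    """Modified send-on-delta: transmit when the output change exceeds `delta`
    OR when more than `max_interval` has elapsed since the last transmission,
    letting the estimator infer packet dropouts from missing updates."""
    state = {"last_value": None, "last_time": None}

    def should_send(t, value):
        if (state["last_value"] is None
                or abs(value - state["last_value"]) > delta
                or t - state["last_time"] > max_interval):
            state["last_value"], state["last_time"] = value, t
            return True
        return False

    return should_send

# Toy usage: a slowly drifting sensor triggers mostly on the time condition.
sod = make_modified_sod(delta=0.5, max_interval=2.0)
sent = [t for t in range(20) if sod(t, math.sin(0.05 * t))]
print("transmission times:", sent)
```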

  8. EMERGENCY BRAKING IN ADULTS VERSUS NOVICE TEEN DRIVERS: RESPONSE TO SIMULATED SUDDEN DRIVING EVENTS.

    PubMed

    Loeb, Helen S; Kandadai, Venk; McDonald, Catherine C; Winston, Flaura K

    Motor vehicle crashes remain the leading cause of death in teens in the United States. Newly licensed drivers are the group most at risk for crashes. Their driving skills are very new and still very often untested, so their ability to react properly in an emergency situation remains a research question. Since it is impossible to expose human subjects to critical, life-threatening driving scenarios, researchers have been increasingly using driving simulators to assess driving skills. This paper summarizes the results of a driving scenario in a study comparing the driving performance of novice teen drivers (n=21; 16-17 years old with 90 days of provisional licensure) with that of experienced adult drivers (n=17; 25-50 years old with at least 5 years of PA licensure, at least 100 miles driven per week and no self-reported collisions in the previous 3 years). As part of a 30- to 35-minute simulated drive that encompassed the most common scenarios that result in serious crashes, participants were exposed to a sudden car event. As the participant drove on a suburban road, a car surged from a driveway hidden by a fence on the right side of the road. To avoid the crash, participants had to brake hard, exhibiting dynamic control over both attentional and motor resources. The results showed strong differences between the experienced adult and novice teen drivers in the brake pressure applied. When placed in the same situation, the novice teens decelerated on average 50% less than the experienced adults (p<0.01).

  9. EMERGENCY BRAKING IN ADULTS VERSUS NOVICE TEEN DRIVERS: RESPONSE TO SIMULATED SUDDEN DRIVING EVENTS

    PubMed Central

    Kandadai, Venk; McDonald, Catherine C.; Winston, Flaura K.

    2015-01-01

    Motor vehicle crashes remain the leading cause of death in teens in the United States. Newly licensed drivers are the group most at risk for crashes. Their driving skills are very new and still very often untested, so their ability to react properly in an emergency situation remains a research question. Since it is impossible to expose human subjects to critical, life-threatening driving scenarios, researchers have been increasingly using driving simulators to assess driving skills. This paper summarizes the results of a driving scenario in a study comparing the driving performance of novice teen drivers (n=21; 16–17 years old with 90 days of provisional licensure) with that of experienced adult drivers (n=17; 25–50 years old with at least 5 years of PA licensure, at least 100 miles driven per week and no self-reported collisions in the previous 3 years). As part of a 30- to 35-minute simulated drive that encompassed the most common scenarios that result in serious crashes, participants were exposed to a sudden car event. As the participant drove on a suburban road, a car surged from a driveway hidden by a fence on the right side of the road. To avoid the crash, participants had to brake hard, exhibiting dynamic control over both attentional and motor resources. The results showed strong differences between the experienced adult and novice teen drivers in the brake pressure applied. When placed in the same situation, the novice teens decelerated on average 50% less than the experienced adults (p<0.01). PMID:26709330

  10. Simulating Astronomical Adaptive Optics Systems Using Yao

    NASA Astrophysics Data System (ADS)

    Rigaut, François; Van Dam, Marcos

    2013-12-01

    Adaptive Optics systems are at the heart of the coming Extremely Large Telescopes generation. Given the importance, complexity and required advances of these systems, being able to simulate them faithfully is key to their success, and thus to the success of the ELTs. The types of systems envisioned for the ELTs cover most of the AO breeds, from NGS AO to multiple guide star Ground Layer, Laser Tomography and Multi-Conjugate AO systems, with typically a few thousand actuators. This represents a large step up from the current generation of AO systems, and accordingly a challenge for existing AO simulation packages. This is especially true as, in the past years, computer power has not been following Moore's law in its most common understanding: CPU clocks are hovering at about 3 GHz. Although the use of supercomputers is a possible solution for running these simulations, being able to use smaller machines has obvious advantages: cost, access, environmental issues. By using optimised code in an already proven AO simulation platform, we were able to run complex ELT AO simulations on very modest machines, including laptops. The platform is YAO. In this paper, we describe YAO, its architecture, its capabilities, the ELT-specific challenges and optimisations, and finally its performance. As an example, execution speed ranges from 5 iterations per second for a 6 LGS 60x60 subaperture Shack-Hartmann wavefront sensor Laser Tomography AO system (including full physical image formation and detector characteristics) up to over 30 iterations/s for a single NGS AO system.

  11. Participatory ergonomics simulation of hospital work systems: The influence of simulation media on simulation outcome.

    PubMed

    Andersen, Simone Nyholm; Broberg, Ole

    2015-11-01

    Current application of work system simulation in participatory ergonomics (PE) design includes a variety of different simulation media. However, the actual influence of the media attributes on the simulation outcome has received less attention. This study investigates two simulation media: full-scale mock-ups and table-top models. The aim is to compare how the media attributes of fidelity and affordance influence ergonomics identification and evaluation in PE design of hospital work systems. The results illustrate how the full-scale mock-ups' high fidelity of room layout and affordance of tool operation support ergonomics identification and evaluation related to the work system entities space and technologies & tools. The table-top models' high fidelity of function relations and affordance of a helicopter view support ergonomics identification and evaluation related to the entity organization. Furthermore, the study addresses the form of the identified and evaluated conditions, being either identified challenges or tangible design criteria.

  12. High Level Architecture Distributed Space System Simulation for Simulation Interoperability Standards Organization Simulation Smackdown

    NASA Technical Reports Server (NTRS)

    Li, Zuqun

    2011-01-01

    Modeling and Simulation plays a very important role in mission design. It not only reduces design cost, but also prepares astronauts for their mission tasks. The SISO Smackdown is a simulation event that facilitates modeling and simulation in academia. The scenario of this year's Smackdown was to simulate a lunar base supply mission. The mission objective was to transfer Earth supply cargo to a lunar base supply depot and retrieve He-3 to take back to Earth. Federates for this scenario include the environment federate, Earth-Moon transfer vehicle, lunar shuttle, lunar rover, supply depot, mobile ISRU plant, exploratory hopper, and communication satellite. These federates were built by teams from around the world, including teams from MIT, JSC, the University of Alabama in Huntsville, the University of Bordeaux in France, and the University of Genoa in Italy. This paper focuses on the lunar shuttle federate, which was programmed by the USRP intern team from NASA JSC. The shuttle was responsible for providing transportation between lunar orbit and the lunar surface. The lunar shuttle federate was built using the NASA standard simulation package called Trick, and it was extended with HLA functions using TrickHLA. HLA functions of the lunar shuttle federate include sending and receiving interactions, publishing and subscribing to attributes, and packing and unpacking fixed record data. The dynamics of the lunar shuttle were modeled with three degrees of freedom, and the state propagation obeyed two-body dynamics. The descending trajectory of the lunar shuttle was designed by first defining a unique descending orbit in 2D space, and then defining a unique orbit in 3D space under the assumption of a non-rotating Moon. Finally, this assumption was removed to define the initial position of the lunar shuttle so that it starts descending one second after it joins the execution. VPN software from SonicWall was used to connect federates with the RTI during testing
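
    The two-body state propagation mentioned above can be sketched as a fixed-step RK4 integration of point-mass gravity about the Moon. This is a minimal standalone sketch, not the federate's Trick-based implementation, and the initial state below is a hypothetical circular 100 km lunar orbit rather than the federate's actual initial conditions.

```python
import numpy as np

MU_MOON = 4.9028e12  # m^3/s^2, lunar gravitational parameter

def two_body_deriv(state):
    """Point-mass two-body dynamics about the Moon: state = [r(3), v(3)]."""
    r, v = state[:3], state[3:]
    a = -MU_MOON * r / np.linalg.norm(r) ** 3
    return np.concatenate([v, a])

def rk4_step(state, dt):
    k1 = two_body_deriv(state)
    k2 = two_body_deriv(state + 0.5 * dt * k1)
    k3 = two_body_deriv(state + 0.5 * dt * k2)
    k4 = two_body_deriv(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Hypothetical circular low lunar orbit at 100 km altitude.
r0 = 1.7371e6 + 1.0e5                  # m, lunar radius + altitude
v0 = np.sqrt(MU_MOON / r0)             # circular-orbit speed
state = np.array([r0, 0.0, 0.0, 0.0, v0, 0.0])
for _ in range(7200):                  # propagate 2 hours at 1 s steps
    state = rk4_step(state, 1.0)
print("radius after 2 h: %.1f km" % (np.linalg.norm(state[:3]) / 1e3))
```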

  13. Rover Attitude and Pointing System Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam

    2009-01-01

    The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform used for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers, which was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It provides an environment that is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code). This improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single-step through its execution, pausing and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress-test the GNC flight algorithms under examination. The software provides facilities to do these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards), and introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality codes can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters, without human supervision.

  14. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how someone can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models. The software also provides the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: Creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior. Automatically updates/calibrates system models using the latest streaming sensor data. Creates device-specific models that capture the exact behavior of devices of the same type. Adapts to evolving systems. Can reduce computational complexity (faster simulations).

  15. Photovoltaic-electrolyzer system transient simulation results

    SciTech Connect

    Leigh, R.W.; Metz, P.D.; Michalek, K.

    1986-05-01

    Brookhaven National Laboratory has developed a Hydrogen Technology Evaluation Center to illustrate advanced hydrogen technology. The first phase of this effort investigated the use of solar energy to produce hydrogen from water via photovoltaic-powered electrolysis. A coordinated program of system testing, computer simulation, and economic analysis has been adopted to characterize and optimize the photovoltaic-electrolyzer system. This paper presents the initial transient simulation results. Innovative features of the modeling include the use of real weather data, detailed hourly modeling of thermal characteristics of the PV array and of system control strategies, and examination of systems over a wide range of power and voltage ratings. The transient simulation system TRNSYS was used, incorporating existing, modified or new component subroutines as required. For directly coupled systems, the authors found the PV array voltage which maximizes hydrogen production to be quite near the nominal electrolyzer voltage for a wide range of PV array powers. The array voltage which maximizes excess electricity production is slightly higher. The use of an ideal (100 percent efficient) maximum power tracking system provides only a six percent increase in annual hydrogen production. An examination of the effect of the PV array tilt indicates, as expected, that annual hydrogen production is insensitive to tilt angle within ±20° of latitude. Summer production greatly exceeds winter generation. Tilting the array, even to 90°, produces no significant increase in winter hydrogen production.

  16. Theory and Simulations of Solar System Plasmas

    NASA Technical Reports Server (NTRS)

    Goldstein, Melvyn L.

    2011-01-01

    "Theory and simulations of solar system plasmas" aims to highlight results from microscopic to global scales, achieved by theoretical investigations and numerical simulations of the plasma dynamics in the solar system. The theoretical approach must allow evidencing the universality of the phenomena being considered, whatever the region is where their role is studied; at the Sun, in the solar corona, in the interplanetary space or in planetary magnetospheres. All possible theoretical issues concerning plasma dynamics are welcome, especially those using numerical models and simulations, since these tools are mandatory whenever analytical treatments fail, in particular when complex nonlinear phenomena are at work. Comparative studies for ongoing missions like Cassini, Cluster, Demeter, Stereo, Wind, SDO, Hinode, as well as those preparing future missions and proposals, like, e.g., MMS and Solar Orbiter, are especially encouraged.

  17. LHC RF System Time-Domain Simulation

    SciTech Connect

    Mastorides, T.; Rivetta, C.

    2010-09-14

    Non-linear time-domain simulations have been developed for the Positron-Electron Project (PEP-II) and the Large Hadron Collider (LHC). These simulations capture the dynamic behavior of the RF station-beam interaction and are structured to reproduce the technical characteristics of the system (noise contributions, non-linear elements, and more). As such, they provide useful results and insight for the development and design of future LLRF feedback systems. They are also a valuable tool for the study of diverse longitudinal beam dynamics effects such as coupled-bunch impedance driven instabilities and single bunch longitudinal emittance growth. Results from these studies and related measurements from PEP-II and LHC have been presented in multiple places. This report presents an example of the time-domain simulation implementation for the LHC.

  18. Electric System Intra-hour Operation Simulator

    SciTech Connect

    Lu, Shuai; Meng, Da; Guillen, Zoe

    2014-03-07

    ESIOS is a software program developed at Pacific Northwest National Laboratory (PNNL) that performs intra-hour dispatch and automatic generation control (AGC) simulations for electric power system frequency regulation and load/variable generation following. The program dispatches generation resources at one-minute intervals to meet control performance requirements, while incorporating stochastic models of the forecast errors and variability of generation, load, interchange and market behaviors. The simulator also contains an operator model that mimics manual actions to adjust resource dispatch and maintain system reserves. Besides simulating intra-hour dispatch of the generation fleet, ESIOS can also be used as a test platform for the design and verification of energy storage, demand response, and other technologies that help accommodate variable generation.

  19. Simulation of Regional-scale Nucleation Events and Prediction of Aerosol Number Concentration in a Regional Air Quality Model

    NASA Astrophysics Data System (ADS)

    Jung, J.; Adams, P.; Pandis, S.

    2006-12-01

    Nanoparticles can perturb Earth's climate by growing to cloud condensation nuclei sizes and also may be harmful to human health. Accurate simulation of the nucleation, growth, and removal of multicomponent nanoparticles demands enormous computational resources. Most regional-scale three-dimensional chemical transport models do not include nanoparticles and do not conserve number concentrations. A major challenge associated with the simulation of nucleation events is the uncertainty regarding the controlling nucleation mechanism under typical atmospheric conditions. Previous work indicates that nucleation events in the Pittsburgh area are well predicted using ternary (H2O-H2SO4-NH3) nucleation theory, which was successful in predicting on which days nucleation events occurred during summer and winter, as well as the beginning and end of the events. To predict the composition and growth of nanoparticles, we have developed a computationally efficient new approach based on the Two-Moment Aerosol Sectional (TOMAS) microphysics module. This model simulates inorganic and organic components of the nanoparticles, describing both the number and the mass distribution of the particulate matter from approximately 1 nm to 10 micrometers. The model explains why nanoparticles were observed to be acidic during nucleation events that appear to involve ammonia. The simulation suggests that nanoparticles produced by ternary nucleation can be acidic due to depletion of ammonia vapor during the growth of the particles out of the nucleation sizes. The low CPU time requirements of the model using TOMAS make it suitable for incorporation in three-dimensional chemical transport models. The nucleation/coagulation/growth model has been added to the PMCAMx regional air quality model and is used for the investigation of nucleation events in the Eastern U.S. We can estimate the aerosol number budget in the Eastern U.S. and predict the frequency and size of nucleation events.

  20. On the use of Paleo DEMS for Simulation of historical Tsunami Events

    NASA Astrophysics Data System (ADS)

    Wronna, Martin; Baptista, Maria Ana; Götz, Joachim

    2016-04-01

    In this study, we present a methodology to reconstruct a Paleo Digital Elevation Model (PDEM) to account for changes in geomorphological context between the present and the desired paleo period. We aim to simulate a historical tsunami propagation in the same geomorphological context as at the time of the event. The methodology combines historical data and GPS measurements with more recent LIDAR data to build PDEMs. Antique maps are georeferenced to obtain the location of landform and building features, and analysis and interpretation of the historical accounts, descriptions and old pictures serve to attribute approximate elevations to those features and to estimate the original outline of a given site. The outlines of river mouths and watercourses can be rebuilt from the boundaries given in the antique maps, and analysis of present-day river mouths with similar characteristics permits the reconstruction of the antique watercourses. GPS-RTK measurements along chosen river mouths in similar geomorphological environments are used to derive their inclination. We applied this methodology to the 1 November 1755 flooding of Cascais, Portugal. Our results show that using the PDEM we can reproduce the inundation described in most of the historical accounts. This study received funding from project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe, a collaborative project, Grant 603839, FP7-ENV2013 6.4-3

  1. The Impact of Inpatient Boarding on ED Efficiency: A Discrete-Event Simulation Study

    PubMed Central

    Bair, Aaron E.; Chen, Yi-Chun; Morris, Beth A.

    2009-01-01

    In this study, a discrete-event simulation approach was used to model Emergency Department (ED) patient flow to investigate the effect of inpatient boarding on ED efficiency in terms of the National Emergency Department Crowding Scale (NEDOCS) score and the rate of patients who leave without being seen (LWBS). The decision variable in this model was the boarder-released-ratio, defined as the ratio of admitted patients whose boarding time is zero to all admitted patients. Our analysis shows that the Overcrowded+ ratio (a NEDOCS score over 100) decreased from 88.4% to 50.4%, and the rate of LWBS patients decreased from 10.8% to 8.4%, when the boarder-released-ratio changed from 0% to 100%. These results show that inpatient boarding significantly impacts both the NEDOCS score and the rate of LWBS patients, and this analysis provides a quantification of the impact of boarding on emergency department patient crowding. PMID:20703616
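
    The boarding mechanism studied above lends itself to a compact discrete-event sketch: admitted patients keep occupying an ED bed for a boarding period unless the boarder-released-ratio releases them immediately. The sketch below, written against the SimPy discrete-event library, uses hypothetical arrival, treatment, boarding and abandonment parameters, not the paper's calibrated model.

```python
import random
import simpy

random.seed(0)
LWBS_WAIT = 2.0  # h, hypothetical tolerance before leaving without being seen

def patient(env, ed_beds, boarder_released_ratio, stats):
    with ed_beds.request() as req:
        # Wait for a bed, but give up (LWBS) after LWBS_WAIT hours.
        result = yield req | env.timeout(LWBS_WAIT)
        if req not in result:
            stats["lwbs"] += 1
            return
        yield env.timeout(random.expovariate(1.0 / 3.0))      # treatment, mean 3 h
        if random.random() >= boarder_released_ratio:         # admitted and boarding
            yield env.timeout(random.expovariate(1.0 / 5.0))  # boarding, mean 5 h
    stats["seen"] += 1

def run(boarder_released_ratio, horizon=24 * 30):
    env = simpy.Environment()
    ed_beds = simpy.Resource(env, capacity=20)
    stats = {"seen": 0, "lwbs": 0}

    def source():
        while True:
            yield env.timeout(random.expovariate(4.0))        # ~4 arrivals/h
            env.process(patient(env, ed_beds, boarder_released_ratio, stats))

    env.process(source())
    env.run(until=horizon)
    return stats

for ratio in (0.0, 0.5, 1.0):
    print(ratio, run(ratio))  # LWBS count should fall as the ratio rises
```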

  2. Assessing polyglutamine conformation in the nucleating event by molecular dynamics simulations.

    PubMed

    Miettinen, Markus S; Knecht, Volker; Monticelli, Luca; Ignatova, Zoya

    2012-08-30

    Polyglutamine (polyQ) diseases comprise a group of dominantly inherited pathologies caused by the expansion of an unstable polyQ stretch, which is presumed to form β-sheets. Similar to other amyloid pathologies, polyQ amyloidogenesis occurs via a nucleated polymerization mechanism and proceeds through an energetically unfavorable nucleus whose existence and structure are difficult to detect. Here, we use atomistic molecular dynamics simulations in explicit solvent to assess the conformation of the polyQ stretch in the nucleus that initiates polyQ fibrillization. Comparison of the kinetic stability of various structures of a polyQ peptide with a Q-length in the pathological range (Q40) revealed that steric zipper or nanotube-like structures (β-nanotube or β-pseudohelix) are not kinetically stable enough to serve as a template to initiate polyQ fibrillization, as opposed to β-hairpin-based (β-sheet and β-sheetstack) or α-helical conformations. The selection of different structures of the polyQ stretch in the aggregation-initiating event may provide an alternative explanation for polyQ aggregate polymorphism.

  3. Topological events in two-dimensional grain growth: Experiments and simulations

    SciTech Connect

    Fradkov, V.E.; Glicksman, M.E.; Palmer, M.; Rajan, K.

    1994-08-01

    Grain growth in polycrystals is a process that occurs as a result of the vanishing of small grains. The mean topological class of vanishing two-dimensional (2-D) grains was found experimentally to be about 4.5. This result suggests that most vanishing grains are either 4- or 5-sided. A recent theory of 2-D grain growth is explicitly based on this fact, treating the switching as random events. The process of shrinking of 4- and 5-sided two-dimensional grains was observed experimentally on polycrystalline films of transparent, pure succinonitrile (SCN). Grain shrinking was studied theoretically and simulated by computer (both dynamic and Monte Carlo). It was found that most shrinking grains are topologically stable and remain within their topological class until they are much smaller than their neighbors. Differences found with respect to the behavior of 2-D polycrystals, a 2-D ideal soap froth, and a 2-D section of a 3-D grain structure are discussed.

  4. Event Generators for Simulating Heavy Ion Interactions of Interest in Evaluating Risks in Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.; Pinsky, Lawrence; Andersen, Victor; Empl, Anton; Lee, Kerry; Smirmov, Georgi; Zapp, Neal; Ferrari, Alfredo; Tsoulou, Katerina; Roesler, Stefan; Vlachoudis, Vasilis

    2005-01-01

    Simulating the space radiation environment with Monte Carlo codes, such as FLUKA, requires the ability to model the interactions of heavy ions as they penetrate spacecraft and crew members' bodies. Monte-Carlo-type transport codes use total interaction cross sections to determine probabilistically when a particular type of interaction has occurred. Then, at that point, a distinct event generator is employed to determine separately the results of that interaction. The space radiation environment contains a full spectrum of radiation types, including relativistic nuclei, which are the most important component for the evaluation of crew doses. Interactions between incident protons and target nuclei in the spacecraft materials and crew members' bodies are well understood. However, the situation is substantially less comfortable for incident heavier nuclei (heavy ions). We have been engaged in developing several related heavy ion interaction models based on a Quantum Molecular Dynamics-type approach for energies up through about 5 GeV per nucleon (GeV/A) as part of a NASA Consortium that includes a parallel program of cross section measurements to guide and verify this code development.
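
    The role of the total cross section described above can be made concrete: a transport code samples the distance to the next interaction from the exponential free-path distribution implied by the cross section, then hands control to an event generator to pick the outcome. A minimal sketch, with illustrative aluminium-like material constants and an assumed 1-barn total cross section:

```python
import math
import random

random.seed(3)

def sample_interaction_depth(number_density, sigma_total):
    """Distance to the next interaction, sampled from the exponential
    free-path distribution implied by the total cross section:
        P(no interaction over x) = exp(-n * sigma * x).
    A distinct event generator would then be invoked at that point."""
    mean_free_path = 1.0 / (number_density * sigma_total)
    return -mean_free_path * math.log(1.0 - random.random())

# Illustrative numbers only: roughly aluminium-like nuclear number density
# and a 1 barn total cross section (1 barn = 1e-24 cm^2).
n = 6.0e22        # nuclei per cm^3
sigma = 1.0e-24   # cm^2
depths = [sample_interaction_depth(n, sigma) for _ in range(100000)]
print("mean depth to first interaction: %.1f cm" % (sum(depths) / len(depths)))
```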

  5. Water contamination events in UK drinking-water supply systems.

    PubMed

    Gray, John

    2008-01-01

    Water supply companies in the UK have a duty under prime UK legislation to notify the Drinking Water Inspectorate of events affecting or potentially affecting the quality of drinking-water supplies. Under the same legislation, the Inspectorate has a duty to investigate each event. After assessing all of the information available, including companies' reports, the Inspectorate advises on the way in which the event was handled and whether any statutory requirements were contravened. If appropriate, a prosecution of the water company may be initiated. Copies of the assessment are sent to the water company, relevant local and health authorities, Ofwat (the economic regulator), the regional Consumer Council for Water and any other interested parties, including consumers who request it. Generic guidance may be issued to the industry on matters of wider concern. This paper considers the role of the Inspectorate, the powers available to it and reporting arrangements. An overview is presented of events that occurred between 1990 and 2005 and common features are identified. Causes of different types of event are discussed. The importance of well-established contacts between the various interested parties involved in protecting public health is emphasised through discussion of example incidents.

  6. Uncertainty analysis of numerical model simulations and HFR measurements during high energy events

    NASA Astrophysics Data System (ADS)

    Donncha, Fearghal O.; Ragnoli, Emanuele; Suits, Frank; Updyke, Teresa; Roarty, Hugh

    2013-04-01

    The identification and decomposition of sensor and model shortcomings is a fundamental component of any coastal monitoring and predictive system. In this research, numerical model simulations are combined with high-frequency radar (HFR) measurements to provide insights into the statistical accuracy of the remote sensing unit. A combination of classical tidal analysis and quantitative measures of correlation evaluates the performance of both across the bay. A network of high frequency radars is deployed within the Chesapeake study site, on the East coast of the United States, as a backbone component of the Integrated Ocean Observing System (IOOS). This system provides real-time synoptic measurements of surface currents in the zonal and meridional directions at hourly intervals in areas where at least two stations overlap, and radial components elsewhere. In conjunction with this, numerical simulations using EFDC (Environmental Fluid Dynamics Code), an advanced three-dimensional model, provide additional details on flows, encompassing both surface dynamics and volumetric transports, while eliminating certain fundamental errors inherent in the HFR system such as geometric dilution of precision (GDOP) and range dependencies. The aim of this research is an uncertainty estimate of both these datasets, allowing for a degree of inaccuracy in both. The analysis focuses on comparisons between both the vector and radial components of the flows returned by the HFR relative to numerical predictions. The analysis provides insight into the reported accuracy of both the raw radial data and the post-processed vector current data computed from combining the radial data. Of interest is any loss of accuracy due to this post-processing. Linear regression techniques decompose the surface currents based on the dominant flow processes (tide and wind); statistical analysis and cross-correlation techniques measure agreement between the processed signal and the dominant forcing parameters. The tidal signal

  7. Simulation of a persistent medium-term precipitation event over the Western Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Pereira, S. C.; Carvalho, A. C.; Ferreira, J.; Nunes, J. P.; Keizer, J. J.; Rocha, A.

    2013-01-01

    This study evaluates the performance of the WRF-ARW numerical weather model in simulating the spatial and temporal patterns of an extreme rainfall period over a complex orographic region in north-central Portugal. The analysis was performed for the month of December 2009, during the rainy season in Mainland Portugal. The heavy to extreme rainfall periods were caused by several low surface pressure systems associated with frontal surfaces. Three model runs, forced with the initial fields from a global domain model, were conducted to compare model performance using different approaches: (1) a reference experiment with no nudging (RunRef); (2) an experiment including observational nudging for a specific location (RunObsN); and (3) an experiment using nudging to adjust the analysis field (RunGridN). Model performance was evaluated against an observed hourly precipitation dataset of 27 rainfall stations, grouped by altitude, using several statistical parameters. The WRF model did not show skill in reproducing the precipitation intensities but reasonably simulated the periods of precipitation occurrence. The best performance was reached by the grid-nudging experiment (RunGridN). The overall model accuracy (RMSE) was similar for all altitude classes in the three experiments, being highest for lowlands and highlands. Precipitation simulated in areas of rough terrain and deep valleys tends to be less accurate.

  8. A simulation of data acquisition system for SSC experiments

    SciTech Connect

    Watase, Y.; Ikeda, H.

    1989-04-01

    A simulation of some parts of the data acquisition system was performed using the general-purpose simulation language GPSS. Several results of the simulation are discussed for the data acquisition system of the SSC experiment.

  9. Introduction to Observing System Simulation Experiments (OSSEs)

    NASA Technical Reports Server (NTRS)

    Prive, Nikki C.

    2014-01-01

    This presentation gives a brief overview of Observing System Simulation Experiments (OSSEs), including what OSSEs are, and how and why they are performed. The intent is to educate the audience in light of the OSSE-related sections of the Forecast Improvement Act (H.R. 2413).

  10. The systems biology simulation core algorithm

    PubMed Central

    2013-01-01

    Background: With the increasing availability of high dimensional time course data for metabolites, genes, and fluxes, the mathematical description of dynamical systems has become an essential aspect of research in systems biology. Models are often encoded in formats such as SBML, whose structure is very complex and difficult to evaluate due to many special cases. Results: This article describes an efficient algorithm to solve SBML models that are interpreted in terms of ordinary differential equations. We begin our consideration with a formal representation of the mathematical form of the models and explain all parts of the algorithm in detail, including several preprocessing steps. We provide a flexible reference implementation as part of the Systems Biology Simulation Core Library, a community-driven project providing a large collection of numerical solvers and a sophisticated interface hierarchy for the definition of custom differential equation systems. To demonstrate the capabilities of the new algorithm, it has been tested with the entire SBML Test Suite and all models of BioModels Database. Conclusions: The formal description of the mathematics behind the SBML format facilitates the implementation of the algorithm within specifically tailored programs. The reference implementation can be used as a simulation backend for Java™-based programs. Source code, binaries, and documentation can be freely obtained under the terms of the LGPL version 3 from http://simulation-core.sourceforge.net. Feature requests, bug reports, contributions, or any further discussion can be directed to the mailing list simulation-core-development@lists.sourceforge.net. PMID:23826941
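
    The core task the article formalizes, interpreting a reaction network as a system of ordinary differential equations, can be sketched by hand for a toy two-species model. The reversible reaction and rate constants below are invented for illustration; an SBML interpreter assembles the equivalent right-hand side automatically from the model's kinetic laws and stoichiometry.

```python
import numpy as np
from scipy.integrate import solve_ivp

# A hand-written stand-in for what an SBML interpreter assembles automatically:
# a reversible mass-action reaction S <-> P, expressed as ODEs
#   dS/dt = -k1*S + k2*P,   dP/dt = k1*S - k2*P
k1, k2 = 0.8, 0.3

def rhs(t, y):
    s, p = y
    v1, v2 = k1 * s, k2 * p          # reaction rates ("kinetic laws")
    return [-v1 + v2, v1 - v2]       # stoichiometry applied to the rates

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0])
print("steady state ~", sol.y[:, -1])   # expect about [0.273, 0.727]
```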

  11. Rotor systems research aircraft simulation mathematical model

    NASA Technical Reports Server (NTRS)

    Houck, J. A.; Moore, F. L.; Howlett, J. J.; Pollock, K. S.; Browne, M. M.

    1977-01-01

    An analytical model developed for evaluating and verifying advanced rotor concepts is discussed. The model was used in both open-loop and real-time man-in-the-loop simulation during the design of the rotor systems research aircraft. Future applications include pilot training, preflight of test programs, and the evaluation of promising concepts before their implementation on the flight vehicle.

  12. Plans for wind energy system simulation

    NASA Technical Reports Server (NTRS)

    Dreier, M. E.

    1978-01-01

    A digital computer code and a special-purpose hybrid computer are introduced. The digital computer program, the Root Perturbation Method (RPM), is an implementation of the classic Floquet procedure that circumvents numerical problems associated with the extraction of Floquet roots. The hybrid computer, the Wind Energy System Time-domain simulator (WEST), yields real-time loads and deformation information essential to design and system-stability investigations.
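
    For context on the numerical task the RPM addresses, the sketch below shows the textbook Floquet procedure in Python: integrate the fundamental matrix of a periodically forced system over one period and take the eigenvalues (Floquet multipliers) of the resulting monodromy matrix. The test equation and parameters are illustrative, not the WEST rotor model.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    d, e, T = 0.8, 0.2, 2.0 * np.pi            # illustrative Mathieu-type system, period T

    def rhs(t, y):
        x, v = y
        return [v, -(d + e * np.cos(t)) * x]   # x'' + (d + e*cos t) x = 0

    # Monodromy matrix: propagate each unit initial condition over one period.
    cols = [solve_ivp(rhs, (0.0, T), y0, rtol=1e-10, atol=1e-12).y[:, -1]
            for y0 in np.eye(2)]
    multipliers = np.linalg.eigvals(np.column_stack(cols))
    stable = np.all(np.abs(multipliers) <= 1.0 + 1e-6)
    print(multipliers, "stable" if stable else "unstable")
    ```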

  13. Fidelity Optimization of Microprocessor System Simulations.

    DTIC Science & Technology

    1981-03-01

    Fidelity Optimization of Microprocessor System Simulations. A thesis by Earnest Taylor Landrum, Jr., submitted to the Graduate Faculty of Auburn University in partial fulfillment of the requirements for a graduate degree; permission is granted to Auburn University to make copies of the thesis at its discretion.

  14. SIMEDIS: a Discrete-Event Simulation Model for Testing Responses to Mass Casualty Incidents.

    PubMed

    Debacker, Michel; Van Utterbeeck, Filip; Ullrich, Christophe; Dhondt, Erwin; Hubloue, Ives

    2016-12-01

    The study of the disaster medical response (DMR) is a relatively new field. To date, no evidence-based literature clearly defines the best medical response principles, concepts, structures, and processes in a disaster setting; much of what is known about the DMR comes from descriptive studies and expert opinion, and no experimental studies of the effects of DMR interventions on the health outcomes of disaster survivors have been carried out. Traditional analytic methods cannot fully capture the flow of disaster victims through a complex disaster medical response system (DMRS). Computer modelling and simulation make it possible to study and test operational assumptions in a virtual but controlled experimental environment. The SIMEDIS (Simulation for the assessment and optimization of medical disaster management) simulation model consists of three interacting components: the victim creation model; the victim monitoring model, where the health state of each victim is tracked and adapted to the victim's evolving clinical condition; and the medical response model, where the victims interact with the environment and the resources at the disposal of the healthcare responders. Since the main aim of the DMR is to minimize the mortality and morbidity of survivors as much as possible, we designed a victim-centred model in which the casualties pass through the different components and processes of a DMRS. A distinctive feature of SIMEDIS is that the victim entities evolve in parallel through both the victim monitoring model and the medical response model; the interaction between the two models is ensured through a time or medical-intervention trigger. At each service point, triage is performed together with a decision on the disposition of the victims regarding treatment and/or evacuation, based on a priority code assigned to the victim and on the availability of resources at the service point. The aim of the case
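
    As a loose sketch of the dispatch logic described above (priority codes deciding treatment order as resources free up), here is a minimal Python analogue. The victims, priority codes, provider count, and treatment time are all invented for illustration; this is not the SIMEDIS implementation.

    ```python
    import heapq

    victims = [("V1", 2), ("V2", 1), ("V3", 3), ("V4", 1)]   # (id, priority; 1 = most urgent)
    waiting = []                                   # heap: (priority, arrival order, id)
    for order, (vid, code) in enumerate(victims):
        heapq.heappush(waiting, (code, order, vid))

    providers, t, treat_time = 2, 0.0, 10.0
    busy = []                                      # heap: (time provider frees up, victim id)
    while waiting:
        if len(busy) < providers:                  # a provider is free: take most urgent victim
            code, _, vid = heapq.heappop(waiting)
            heapq.heappush(busy, (t + treat_time, vid))
            print(f"t={t:5.1f}  start treating {vid} (priority {code})")
        else:                                      # all providers busy: advance to next completion
            t, done = heapq.heappop(busy)
            print(f"t={t:5.1f}  finished {done}")
    ```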

  15. Looking Back: Events That Have Shaped Our Current Child Care Delivery System.

    ERIC Educational Resources Information Center

    Neugebauer, Roger

    2000-01-01

    Reports findings of an unscientific survey of early childhood professionals asked to reflect upon the history, landmark events, and significant trends in the child care delivery system. Three events viewed as most influential are highlighted: (1) World War II; (2) women's movement; and (3) Head Start. Eleven other events also cited are discussed.…

  16. Events Management Education through CD-ROM Simulation at Victoria University of Technology.

    ERIC Educational Resources Information Center

    Perry, Marcia; And Others

    There has been rapid growth in the events industry in Victoria and Australia over the past five years, with an increase in large-scale events resulting in substantive economic impact. Growth in events in Australia is projected to continue beyond 2001. The Department of Management at Victoria University of Technology (VU) received a…

  17. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  18. Modular Aero-Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Guo, Ten-Huei

    2006-01-01

    The Modular Aero-Propulsion System Simulation (MAPSS) is a graphical simulation environment designed for the development of advanced control algorithms and rapid testing of these algorithms on a generic computational model of a turbofan engine and its control system. MAPSS is a nonlinear, non-real-time simulation comprising a Component Level Model (CLM) module and a Controller-and-Actuator Dynamics (CAD) module. The CLM module simulates the dynamics of engine components at a sampling rate of 2,500 Hz. The controller submodule of the CAD module simulates a digital controller, which has a typical update rate of 50 Hz. The sampling rate for the actuators in the CAD module is the same as that of the CLM. MAPSS provides a graphical user interface that affords easy access to engine-operation, engine-health, and control parameters; is used to enter such input model parameters as power lever angle (PLA), Mach number, and altitude; and can be used to change controller and engine parameters. Output variables are selectable by the user. Output data as well as any changes to constants and other parameters can be saved and reloaded into the GUI later.
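
    The rate structure described (2,500 Hz component-level model and actuators, 50 Hz controller) implies one controller update every 50 plant steps. A minimal sketch of such a multi-rate loop, with a trivial first-order stand-in plant and proportional controller in place of the MAPSS turbofan and its control law:

    ```python
    F_PLANT, F_CTRL = 2500, 50          # plant/actuator rate and controller rate (Hz)
    RATIO = F_PLANT // F_CTRL           # 50 plant steps per controller update
    dt = 1.0 / F_PLANT

    x, u, setpoint = 0.0, 0.0, 1.0
    for k in range(F_PLANT):            # simulate one second
        if k % RATIO == 0:              # 50 Hz digital controller
            u = 2.0 * (setpoint - x)    # proportional law (illustrative)
        x += dt * (-x + u)              # first-order plant integrated at 2500 Hz
    print(f"state after 1 s: {x:.3f}")
    ```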

  19. Java simulations of embedded control systems.

    PubMed

    Farias, Gonzalo; Cervin, Anton; Arzén, Karl-Erik; Dormido, Sebastián; Esquembre, Francisco

    2010-01-01

    This paper introduces a new open-source Java library suited for the simulation of embedded control systems. The library is based on the ideas and architecture of TrueTime, a Matlab toolbox devoted to this topic, and allows Java programmers to simulate the performance of control processes that run in a real-time environment. Such simulations can considerably improve the learning and design of multitasking real-time systems. The choice of Java greatly increases the usability of our library, both because many educators already program in this language and because the library can easily be used from Easy Java Simulations (EJS), a popular modeling and authoring tool that is increasingly used in the field of Control Education. EJS allows instructors, students, and researchers with less programming experience to create advanced interactive simulations in Java. The paper describes the ideas, implementation, and sample use of the new library, both for pure Java programmers and for EJS users. The JTT library and some examples are available online at http://lab.dia.uned.es/jtt.

  20. Toward simulating complex systems with quantum effects

    NASA Astrophysics Data System (ADS)

    Kenion-Hanrath, Rachel Lynn

    Quantum effects like tunneling, coherence, and zero point energy often play a significant role in phenomena on the scales of atoms and molecules. However, the exact quantum treatment of a system scales exponentially with dimensionality, making it impractical for characterizing reaction rates and mechanisms in complex systems. An ongoing effort in the field of theoretical chemistry and physics is extending scalable, classical trajectory-based simulation methods capable of capturing quantum effects to describe dynamic processes in many-body systems; in the work presented here we explore two such techniques. First, we detail an explicit electron, path integral (PI)-based simulation protocol for predicting the rate of electron transfer in condensed-phase transition metal complex systems. Using a PI representation of the transferring electron and a classical representation of the transition metal complex and solvent atoms, we compute the outer sphere free energy barrier and dynamical recrossing factor of the electron transfer rate while accounting for quantum tunneling and zero point energy effects. We are able to achieve this employing only a single set of force field parameters to describe the system rather than parameterizing along the reaction coordinate. Following our success in describing a simple model system, we discuss our next steps in extending our protocol to technologically relevant materials systems. The latter half focuses on the Mixed Quantum-Classical Initial Value Representation (MQC-IVR) of real-time correlation functions, a semiclassical method which has demonstrated its ability to "tune" between quantum- and classical-limit correlation functions while maintaining dynamic consistency. Specifically, this is achieved through a parameter that determines the quantumness of individual degrees of freedom. Here, we derive a semiclassical correction term for the MQC-IVR to systematically characterize the error introduced by different choices of simulation

  1. An Investigation of System Identification Techniques for Simulation Model Abstraction

    DTIC Science & Technology

    2000-02-01

    This report summarizes research into the application of system identification techniques to simulation model abstraction. System identification produces simplified mathematical models that approximate the dynamic behaviors of the underlying stochastic simulations. The techniques were applied to "Mission Simulation," a simulation of a squadron of aircraft performing battlefield air interdiction. Four state-space system
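
    As a toy illustration of the general idea, deriving a simplified linear model that approximates logged simulation behavior, the sketch below fits a discrete state-transition matrix to a synthetic trajectory by least squares. It is not one of the techniques evaluated in the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A_true = np.array([[0.9, 0.1], [-0.2, 0.8]])     # "unknown" dynamics

    # Log a noisy trajectory x[k+1] = A x[k] + w[k], standing in for simulation output.
    X = np.zeros((2, 200))
    X[:, 0] = rng.standard_normal(2)
    for k in range(199):
        X[:, k + 1] = A_true @ X[:, k] + 0.05 * rng.standard_normal(2)

    # Least-squares identification: A_hat = X1 · pinv(X0).
    X0, X1 = X[:, :-1], X[:, 1:]
    A_hat = X1 @ np.linalg.pinv(X0)
    print(np.round(A_hat, 2))                        # close to A_true
    ```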

  2. Simulation of the infrared signature of transient luminous events in the middle atmosphere for a limb line of sight

    NASA Astrophysics Data System (ADS)

    Romand, Frédéric; Croizé, Laurence; Payan, Sébastien; Huret, Nathalie

    2016-04-01

    Transient Luminous Events (TLEs) are electrical and optical events which occur above thunderstorms. Visual signatures have been reported since the beginning of the 20th century, but the first picture was recorded accidentally by a television camera in 1989. Their occurrence is closely linked to the lightning activity below thunderstorms. TLEs are observed from the base of the stratosphere to the thermosphere (15-110 km). They are very brief phenomena, lasting from 1 to 300 milliseconds; at a worldwide scale, four TLEs occur each minute. The energy deposition, on the order of tens of megajoules, is able to ionize, dissociate, and excite the molecules of the atmosphere. Atmospheric discharges in the troposphere are important sources of NO and NO2, and TLEs might have the same effects at higher altitudes, in the stratosphere. NOx can then affect the concentrations of O3 and OH. Consequently, TLEs could be locally important contributors to the chemical budget of the middle atmosphere. The perturbation of atmospheric chemistry induced by TLEs locally modifies the infrared radiation during the minutes following the event. The interest of studying the infrared signature of a TLE is twofold. For the atmospheric sciences, it links the perturbed composition to the resulting infrared spectrum. In addition, some defense systems, such as detection and guidance devices, are equipped with airborne infrared sensors, so the TLE infrared signature might disturb them. We want to obtain a quantitative and kinetic evaluation of the infrared signature of the atmosphere locally perturbed by a TLE. To do so, we must model three phenomena. 1) The plasma/chemistry coupling, which describes how the different energy levels of atmospheric molecules are populated by the energy deposition of the TLE; this step lasts the duration of the lightning itself. 2) The chemical kinetics, which describes how these populations evolve in the following minutes. 3) The

  3. Improving Customer Waiting Time at a DMV Center Using Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Arnaout, Georges M.; Bowling, Shannon

    2010-01-01

    Virginia's Department of Motor Vehicles (DMV) serves a customer base of approximately 5.6 million licensed drivers and ID card holders and 7 million registered vehicle owners. The DMV has more daily face-to-face contact with Virginia's citizens than any other state agency [1]. The DMV faces a major difficulty in keeping up with an excessively large customer arrival rate; the consequences are queues that build up, stretching to the entrance doors (and sometimes even outside), and customer complaints. While the DMV state employees try to serve at their fastest pace, the remarkably long queues indicate a serious problem in the DMV's services that must be dealt with rapidly. Simulation is considered one of the best tools for evaluating and improving complex systems, and in this paper we use it to model one of the DMV centers, located in Norfolk, VA. The simulation is built in Arena 10.0 from Rockwell Software, and the data used were collected from experts at the DMV Virginia headquarters in Richmond. The model was verified and validated. The intent of this study is to identify the key problems causing delays at the DMV centers and to suggest possible solutions to minimize customers' waiting time. In addition, two tentative hypotheses aimed at improving the model's design are tested and validated.
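
    The study itself was built in Arena with DMV-supplied data; as a hedged, self-contained sketch of the underlying queueing experiment (one queue feeding several clerks, estimating mean waiting time), here is a plain-Python analogue with invented rates.

    ```python
    import heapq, random

    random.seed(42)
    ARRIVAL_RATE, SERVICE_RATE, CLERKS, N = 1.8, 0.5, 4, 10_000   # per-minute rates (assumed)

    t, waits = 0.0, []
    free_at = [0.0] * CLERKS            # heap of times at which each clerk frees up
    heapq.heapify(free_at)
    for _ in range(N):
        t += random.expovariate(ARRIVAL_RATE)       # next customer arrival
        clerk_free = heapq.heappop(free_at)         # earliest-available clerk
        start = max(t, clerk_free)                  # wait if every clerk is busy
        waits.append(start - t)
        heapq.heappush(free_at, start + random.expovariate(SERVICE_RATE))

    print(f"mean wait: {sum(waits) / len(waits):.2f} min over {N} customers")
    ```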

  4. Exupery volcano fast response system - The event detection and waveform classification system

    NASA Astrophysics Data System (ADS)

    Hammer, Conny; Ohrnberger, Matthias

    2010-05-01

    Volcanic eruptions are often preceded by seismic activity, which can be used to quantify volcanic unrest since the number and size of certain types of seismic events usually increase before periods of volcanic crisis. An automatic detection and classification system for seismic signals of volcanic origin allows not only the processing of large amounts of data in a short time but also provides consistent, time-invariant results. Here, we have developed a system based on a combination of different methods. To enable a first robust event detection in the continuous data stream, different modules are implemented in the real-time system Earthworm, which is widely used in active volcano monitoring observatories worldwide. Among these software modules are classical trigger algorithms such as STA/LTA, as well as cross-correlation master-event matching, which is also used to detect different classes of signals. Furthermore, an additional module is implemented in the real-time system to compute continuous activity parameters, which are also used to quantify the volcanic activity. Most automatic classification systems need a sufficiently large pre-classified data set for training. In case of a volcanic crisis, however, we are often confronted with a lack of training data due to insufficient prior observations: prior data acquisition might have been carried out with different equipment at a small number of sites, and during an imminent crisis there may be no time for the time-consuming and tedious process of preparing a training data set. For this reason, we have developed a novel seismic event spotting technique that is less dependent on previously acquired databases of event classes. One main goal is therefore to provide observatory staff with a robust event classification based on a minimum number of reference waveforms. By using a "learning-while-recording" approach we are allowing for the fast build-up of a
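
    Of the detection modules listed, the STA/LTA trigger is the most self-contained; a minimal numpy sketch follows, with illustrative window lengths and threshold rather than observatory-tuned values, and a synthetic trace in place of real seismograms.

    ```python
    import numpy as np

    def sta_lta(x, n_sta, n_lta):
        """Ratio of short-term to long-term average signal energy."""
        e = np.asarray(x, float) ** 2
        sta = np.convolve(e, np.ones(n_sta) / n_sta, mode="same")
        lta = np.convolve(e, np.ones(n_lta) / n_lta, mode="same")
        return sta / np.maximum(lta, 1e-12)

    fs = 100                                      # sampling rate (Hz)
    trace = np.random.default_rng(3).standard_normal(3000)
    trace[1500:1600] += 5.0                       # synthetic event buried in noise
    ratio = sta_lta(trace, n_sta=fs // 2, n_lta=10 * fs)
    triggers = np.flatnonzero(ratio > 4.0)
    print("first trigger sample:", triggers[0] if triggers.size else None)
    ```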

  5. Simulation of Flywheel Energy Storage System Controls

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Wolff, Frederick J.; Dravid, Narayan

    2001-01-01

    This paper presents the progress made in the controller design and operation of a flywheel energy storage system. The switching logic for the converter bridge circuit has been redefined to reduce line current harmonics, even at the highest operating speed of the permanent-magnet motor-generator. An electromechanical machine model is used to simulate the charging and discharging of the inertial energy stored in the flywheel; controlling the magnitude of the phase currents regulates the rate of charge and discharge. The resulting improvements are demonstrated by simulation.

  6. Parachute system design, analysis, and simulation tool

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-01-01

    For over twenty years, designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash, and line sail. In addition to these codes, material property databases have been acquired. Recently we initiated a project to integrate these codes and databases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS, describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials database that includes high-rate-of-strain data.

  7. Hidden Conformation Events in DNA Base Extrusions: A Generalized Ensemble Path Optimization and Equilibrium Simulation Study

    PubMed Central

    Cao, Liaoran; Lv, Chao; Yang, Wei

    2013-01-01

    DNA base extrusion is a crucial component of many biomolecular processes. Elucidating how bases are selectively extruded from the interiors of double-stranded DNAs is pivotal to accurately understanding, and efficiently sampling, this general type of conformational transition. In this work, the on-the-path random walk (OTPRW) method, the first generalized ensemble sampling scheme designed for finite-temperature-string path optimizations, was improved and applied to obtain the minimum free energy path (MFEP) and the free energy profile of a classical B-DNA major-groove base extrusion pathway. Along the MFEP, an intermediate state and the corresponding transition state were located and characterized. The MFEP result suggests that a base-plane-elongation event, rather than the commonly focused base-flipping event, is dominant in the transition-state formation portion of the pathway, and that the energetic penalty at the transition state is mainly introduced by the stretching of the Watson-Crick base pair. Moreover, to facilitate the essential base-plane-elongation dynamics, the surrounding environment of the flipped base needs to be intimately involved. Further taking advantage of the extended-dynamics nature of the OTPRW Hamiltonian, an equilibrium generalized ensemble simulation was performed along the optimized path; based on the collected samples, several base-flipping (opening) angle collective variables were evaluated. Consistent with the MFEP result, the collective variable analysis reveals that none of these commonly employed flipping (opening) angles alone can adequately represent the base extrusion pathway, especially in the pre-transition-state portion. As further revealed by the analysis, the base-pairing partner of the extrusion target undergoes a series of in-plane rotations to facilitate the base-plane-elongation dynamics. A base-plane rotation angle is identified to be a possible reaction coordinate to represent

  8. Simulating Complex Window Systems using BSDF Data

    SciTech Connect

    Konstantoglou, Maria; Jonsson, Jacob; Lee, Eleanor

    2009-06-22

    Nowadays, virtual models are commonly used to evaluate the performance of conventional window systems. Complex fenestration systems can be difficult to simulate accurately, not only because of their geometry but also because of optical properties that scatter light in ways that are hard to predict. Bi-directional Scattering Distribution Functions (BSDF) have recently been developed, based on a mixture of measurements and modelling, to characterize the optics of such systems. This paper describes the workflow needed to create and then use these BSDF datasets in the Radiance lighting simulation software. Limited comparisons are made between visualizations produced using the standard ray-tracing method, the BSDF method, and images taken in a full-scale outdoor mockup.

  9. A system of infrared scene simulation

    NASA Astrophysics Data System (ADS)

    Hu, Haihe; Li, Yujian; Huo, Yi; Kuang, Wenqing; Zhang, Ting

    2016-10-01

    We propose an integrated infrared scene simulation system. Based on thermophysical and optical property parameters, the system computes the radiation distribution of the scenery on the focal plane of the camera according to the geometrical parameters of the scene, the position and intensity of the light source, the location and direction of the camera, and so on. The radiation distribution is then mapped to gray levels to obtain a virtual image of the scene. The system comprises eight modules: basic data maintenance, model importing, scene saving, geometry parameter setting, infrared property parameter setting, data pre-processing, infrared scene simulation, and scene loading. All data are organized as database lookup tables that store the relevant parameters and the computed results for different states, avoiding repeated computation. Experimental results show that the system produces three-dimensional infrared images in near real time, reaching 60 frames/second for simple scenes and 20 frames/second for complex scenes, and that the simulated images represent the infrared features of the scenery to a reasonable degree.
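
    As a sketch of the final display step described above, mapping a computed radiance distribution on the focal plane to gray levels, assuming a simple min-max scaling (the record does not specify which mapping the system uses):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    radiance = rng.gamma(shape=2.0, scale=1.5, size=(64, 64))   # synthetic focal-plane radiance

    lo, hi = radiance.min(), radiance.max()
    gray = np.clip((radiance - lo) / (hi - lo) * 255.0, 0, 255).astype(np.uint8)
    print(gray.shape, gray.min(), gray.max())                   # 8-bit virtual image
    ```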

  10. System-Level Reuse of Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Hazen, Michael R.; Williams, Joseph C.

    2004-01-01

    One of the best ways to enhance space systems simulation fidelity is to leverage (reuse) existing high-fidelity simulations. But what happens when the model you would like to reuse is written in a different language, or other barriers arise that make one want to start over with a clean sheet of paper? Three diverse system-level simulation reuse case studies are described, based on experience to date in the development of NASA's Space Station Training Facility (SSTF) at the Johnson Space Center in Houston, Texas. The case studies are (a) the Boeing/Rocketdyne-provided Electrical Power Simulation (EPSIM), (b) the NASA Automation and Robotics Division-provided TRICK robotics systems model, and (c) the Russian Space Agency-provided Russian Segment Trainer. In each case, there was an initial tendency to dismiss the reuse candidate based on an apparent lack of suitability. A more careful examination, based on a more structured assessment of architectural and requirements-oriented representations of the reuse candidates, revealed significant reuse potential. The steps used to conduct the detailed assessments are discussed: 1) identifying reuse candidates; 2) assessing requirements compatibility; 3) assessing maturity; 4) determining life-cycle cost; and 5) assessing risk. Observations and conclusions are presented on the real cost of system-level simulation component reuse, and lessons learned for maximizing the benefits of space systems simulation reuse are shared. These concepts should be directly applicable to the development of future space systems simulations.

  11. Common modeling system for digital simulation

    NASA Technical Reports Server (NTRS)

    Painter, Rick

    1994-01-01

    The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common framework for the development of digital models. The basis for this framework is an X-window-based, open-systems architecture and an object-based/oriented, standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems and technologies, and typically digital representations of them. These digital representations (models) have also been developed for other purposes, such as studies and analysis and Cost and Operational Effectiveness Analysis (COEA) tradeoffs. Unfortunately, there have been no DOD Modeling and Simulation (M&S) standards, guidelines, or efforts toward commonality. In the typical scenario, an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization; even when it was procured, it ran on a unique platform, in a unique language, with unique interfaces, with the result that unique maintenance was required. Additionally, the constructors of the model expended more effort writing the "infrastructure" of the model/simulation (e.g., user interface, database/database management system, data journaling/archiving, graphical presentations, environment characteristics, other components in the simulation) than producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.

  12. Effects of Solar Particle Event-Like Proton Radiation and/or Simulated Microgravity on Circulating Mouse Blood Cells.

    PubMed

    Romero-Weaver, Ana L; Lin, Liyong; Carabe-Fernandez, Alejandro; Kennedy, Ann R

    2014-08-01

    Astronauts traveling on space missions outside of low Earth orbit will be exposed to a microgravity environment for longer times. In addition, the increased travel time involved in exploration-class missions will result in an increased risk of exposure to significant doses of solar particle event (SPE) radiation. Both conditions could significantly affect the number of circulating blood cells; it is therefore critical to determine the combined effects of exposure to both microgravity and SPE radiation. The purpose of the present study was to assess these risks by evaluating the effects of SPE-like proton radiation and/or microgravity, as simulated with the hindlimb unloading (HU) system, on circulating blood cells, using the mouse as a model system. The results indicate that exposure to HU alone caused minimal or no significant changes in mouse circulating blood cell numbers. Exposure of mice to SPE-like proton radiation, with or without HU treatment, caused a significant decrease in the numbers of circulating lymphocytes, granulocytes, and platelets. These reduced cell counts suggest that astronauts participating in exploration-class missions may be at greater risk of developing infections and thrombotic diseases; thus, countermeasures may be necessary for these biological endpoints.

  13. A simulation system for biomarker evolution in neurodegenerative disease.

    PubMed

    Young, Alexandra L; Oxtoby, Neil P; Ourselin, Sebastien; Schott, Jonathan M; Alexander, Daniel C

    2015-12-01

    We present a framework for simulating cross-sectional or longitudinal biomarker data sets from neurodegenerative disease cohorts that reflect the temporal evolution of the disease and population diversity. The simulation system provides a mechanism for evaluating the performance of data-driven models of disease progression, which bring together biomarker measurements from large cross-sectional (or short-term longitudinal) cohorts to recover the average population-wide dynamics. We demonstrate the use of the simulation framework in two ways: first, to evaluate the performance of the Event Based Model (EBM) for recovering biomarker abnormality orderings from cross-sectional datasets; second, to evaluate the performance of a differential equation model (DEM) for recovering biomarker abnormality trajectories from short-term longitudinal datasets. Results highlight several important considerations when applying data-driven models to sporadic disease datasets, as well as key areas for future work. The system reveals several important insights into the behaviour of each model. For example, the EBM is robust to noise on the underlying biomarker trajectory parameters, to under-sampling of the underlying disease time course, and to outliers who follow alternative event sequences; however, it is sensitive to accurate estimation of the distributions of normal and abnormal biomarker measurements. In contrast, we find that the DEM is sensitive to noise on the biomarker trajectory parameters, resulting in an overestimation of the time taken for biomarker trajectories to go from normal to abnormal. This overestimate is approximately twice the actual transition time of the trajectory for the expected noise level in neurodegenerative disease datasets. The simulation framework is equally applicable to a range of other models and longitudinal analysis techniques.

  14. Collaborative Project: Understanding Climate Model Biases in Tropical Atlantic and Their Impact on Simulations of Extreme Climate Events

    SciTech Connect

    Chang, Ping

    2016-01-04

    Recent studies have revealed that among all the tropical oceans, the tropical Atlantic has experienced the most pronounced warming trend over the 20th century. Many extreme climate events affecting the U.S., such as hurricanes, severe precipitation and drought events, are influenced by conditions in the Gulf of Mexico and the Atlantic Ocean. It is therefore imperative to have accurate simulations of the climatic mean and variability in the Atlantic region to be able to make credible projections of future climate change affecting the U.S. and other countries adjoining the Atlantic Ocean. Unfortunately, almost all global climate models exhibit large biases in their simulations of tropical Atlantic climate. The atmospheric convection simulation errors in the Amazon region and the associated errors in the trade wind simulations are hypothesized to be a leading cause of the tropical Atlantic biases in climate models. As global climate models have resolutions that are too coarse to resolve some of the atmospheric and oceanic processes responsible for the model biases, we propose to use a high-resolution coupled regional climate model (CRCM) framework to address the tropical bias issue. We propose to combine the expertise in tropical coupled atmosphere-ocean modeling at Texas A&M University (TAMU) and the coupled land-atmosphere modeling expertise at Pacific Northwest National Laboratory (PNNL) to develop a comprehensive CRCM for the Atlantic sector within a general and flexible modeling framework. The atmospheric component of the CRCM will be the NCAR WRF model and the oceanic component will be the Rutgers/UCLA ROMS. For the land component, we will use CLM modified at PNNL to include more detailed representations of vegetation and soil hydrology processes. The combined TAMU-PNNL CRCM model will be used to simulate the Atlantic climate, and the associated land-atmosphere-ocean interactions at a horizontal resolution of 9 km or finer. A particular focus of the model

  15. Problem reporting management system performance simulation

    NASA Technical Reports Server (NTRS)

    Vannatta, David S.

    1993-01-01

    This paper proposes the Problem Reporting Management System (PRMS) model as an effective discrete simulation tool for determining the risks involved during the development phase of a Trouble Tracking Reporting Data Base replacement system. The model considers the type of equipment and networks to be used in the replacement system, as well as varying user loads, the size of the database, and the expected operational availability. The paper discusses the dynamics, stability, and application of the PRMS and addresses suggested concepts for enhancing service performance.

  16. Runway Incursion Prevention System Simulation Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.

    2002-01-01

    A Runway Incursion Prevention System (RIPS) was evaluated in a full-mission simulation study at the NASA Langley Research Center in March 2002. RIPS integrates airborne and ground-based technologies to provide (1) enhanced surface situational awareness to avoid blunders and (2) alerts of runway conflicts, in order to prevent runway incidents while also improving operational capability. A series of test runs was conducted in a high-fidelity simulator. The purpose of the study was to evaluate the RIPS airborne incursion detection algorithms and the associated alerting and airport surface display concepts. Eight commercial airline crews participated as test subjects, completing 467 test runs. This paper gives an overview of RIPS, the simulation study, and the test results.

  17. Quantum Simulation of Tunneling in Small Systems