Sample records for event simulation model

  1. A Simulation of Alternatives for Wholesale Inventory Replenishment

    DTIC Science & Technology

    2016-03-01

    algorithmic details. The last method is a mixed-integer, linear optimization model. Comparative Inventory Simulation, a discrete event simulation model, is...Keywords: simulation; event graphs; reorder point; fill-rate; backorder; discrete event simulation; wholesale inventory optimization model...Comparative Inventory Simulation, a discrete event simulation model, is designed to find fill rates achieved for each National Item

  2. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
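
    The record describes the paradigm only at a high level; as a rough, hypothetical sketch of what an engine built around event consumption and event creation can look like (none of the names below come from EDSE or SELMON), a few lines of Python suffice:

      import heapq
      import itertools

      class EventDrivenSimulator:
          """Minimal event-driven engine: events are consumed in time order and
          handlers may create (schedule) new events."""

          def __init__(self):
              self._queue = []                    # heap of (time, tie-breaker, name, data)
              self._counter = itertools.count()
              self.now = 0.0
              self.handlers = {}                  # event name -> handler(sim, data)

          def schedule(self, delay, name, data=None):
              heapq.heappush(self._queue, (self.now + delay, next(self._counter), name, data))

          def run(self, until=float("inf")):
              while self._queue and self._queue[0][0] <= until:
                  self.now, _, name, data = heapq.heappop(self._queue)   # consume one event
                  self.handlers[name](self, data)                        # handler may create more

      # Hypothetical two-event causal model: a high sensor reading predicts a later alarm.
      def on_reading(sim, value):
          print(f"t={sim.now:4.1f}  reading={value}")
          if value > 0.8:
              sim.schedule(2.0, "alarm", value)   # predicted future behavior of the model

      def on_alarm(sim, value):
          print(f"t={sim.now:4.1f}  ALARM triggered by reading {value}")

      sim = EventDrivenSimulator()
      sim.handlers = {"reading": on_reading, "alarm": on_alarm}
      for t, v in [(1.0, 0.3), (4.0, 0.9), (7.0, 0.5)]:
          sim.schedule(t, "reading", v)
      sim.run(until=20.0)

    In EDSE the handlers would be driven by the causal model and synchronized with monitoring data from the physical system; the sketch only shows the consume-and-schedule loop itself.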

  3. USMC Inventory Control Using Optimization Modeling and Discrete Event Simulation

    DTIC Science & Technology

    2016-09-01

    Approved for public release. Distribution is unlimited. USMC Inventory Control Using Optimization Modeling and Discrete Event Simulation, by Timothy A. Curling...optimization and discrete-event simulation. This construct can potentially provide an effective means of improving order management decisions. However

  4. Synchronization Of Parallel Discrete Event Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S.

    1992-01-01

    Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation in progress. Combines best of optimistic and conservative synchronization strategies while avoiding major disadvantages. Well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.
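
    SPEEDES itself is not reproduced here; the following toy, serial Python sketch only illustrates the adaptive "event horizon" idea behind Breathing Time Buckets (rollback and inter-processor messaging are omitted, and the workload is invented):

      import heapq
      import random

      # Each cycle optimistically processes pending events up to the earliest
      # timestamp of any event generated during the cycle, so the width of the
      # time bucket adapts ("breathes") with the workload.
      random.seed(1)

      def handler(t, data):
          """Hypothetical workload: every event spawns one follow-up until t >= 30."""
          return [(t + random.uniform(1.0, 5.0), data)] if t < 30.0 else []

      pending = [(0.0, "a"), (0.5, "b"), (0.9, "c")]
      heapq.heapify(pending)
      cycle = 0
      while pending:
          cycle += 1
          horizon, generated, processed = float("inf"), [], 0
          while pending and pending[0][0] < horizon:
              t, data = heapq.heappop(pending)          # optimistic processing
              processed += 1
              for new_event in handler(t, data):
                  generated.append(new_event)
                  horizon = min(horizon, new_event[0])  # bucket shrinks if needed
          for event in generated:                       # commit the cycle's new events
              heapq.heappush(pending, event)
          print(f"cycle {cycle}: {processed} events committed, next horizon {horizon:.2f}")

    In the parallel algorithm each node processes its own events up to a globally agreed horizon; the cycle width therefore adapts to the simulation rather than being fixed in advance.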

  5. Modeling Anti-Air Warfare With Discrete Event Simulation and Analyzing Naval Convoy Operations

    DTIC Science & Technology

    2016-06-01

    Modeling Anti-Air Warfare with Discrete Event Simulation and Analyzing Naval Convoy Operations, by Ali E. Opcin, June 2016, Master's thesis, Thesis Advisor: Arnold H. Buss...In this study, a discrete event simulation (DES) was built by modeling ships, and their sensors and weapons, to simulate convoy operations under

  6. Mars Exploration Rover Terminal Descent Mission Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.

    2004-01-01

    Because of NASA's added reliance on simulation for successful interplanetary missions, the MER mission has developed a detailed EDL trajectory modeling and simulation capability. This paper summarizes how the MER EDL sequence of events is modeled, the verification of the methods used, and the inputs. This simulation is built upon a multibody parachute trajectory simulation tool that has been developed in POST II that accurately simulates the trajectory of multiple vehicles in flight with interacting forces. In this model the parachute and the suspended bodies are treated as 6 Degree-of-Freedom (6 DOF) bodies. The terminal descent phase of the mission consists of several Entry, Descent, and Landing (EDL) events, such as parachute deployment, heatshield separation, deployment of the lander from the backshell, deployment of the airbags, RAD firings, TIRS firings, etc. For an accurate, reliable simulation these events need to be modeled seamlessly and robustly so that the simulations will remain numerically stable during Monte-Carlo simulations. This paper also summarizes how the events have been modeled, the numerical issues, and modeling challenges.

  7. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue therefore that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.

  8. Evaluation of the Navy's Sea/Shore Flow Policy

    DTIC Science & Technology

    2016-06-01

    CNA developed an independent Discrete-Event Simulation model to evaluate and assess the effect of...a more steady manning level, but the variability remains, even if the system is optimized. In building a Discrete-Event Simulation model, we...steady-state model. In FY 2014, CNA developed a Discrete-Event Simulation model to evaluate the impact of sea/shore flow policy (the DES-SSF model

  9. Simulations and Characteristics of Large Solar Events Propagating Throughout the Heliosphere and Beyond (Invited)

    NASA Astrophysics Data System (ADS)

    Intriligator, D. S.; Sun, W.; Detman, T. R.; Dryer, M.; Intriligator, J.; Deehr, C. S.; Webber, W. R.; Gloeckler, G.; Miller, W. D.

    2015-12-01

    Large solar events can have severe adverse global impacts at Earth. These solar events also can propagate throughout the heliosphere and into the interstellar medium. We focus on the July 2012 and Halloween 2003 solar events. We simulate these events starting from the vicinity of the Sun at 2.5 Rs. We compare our three dimensional (3D) time-dependent simulations to available spacecraft (s/c) observations at 1 AU and beyond. Based on the comparisons of the predictions from our simulations with in-situ measurements we find that the effects of these large solar events can be observed in the outer heliosphere, the heliosheath, and even into the interstellar medium. We use two simulation models. The HAFSS (HAF Source Surface) model is a kinematic model. HHMS-PI (Hybrid Heliospheric Modeling System with Pickup protons) is a numerical magnetohydrodynamic solar wind (SW) simulation model. Both HHMS-PI and HAFSS are ideally suited for these analyses since starting at 2.5 Rs from the Sun they model the slowly evolving background SW and the impulsive, time-dependent events associated with solar activity. Our models naturally reproduce dynamic 3D spatially asymmetric effects observed throughout the heliosphere. Pre-existing SW background conditions have a strong influence on the propagation of shock waves from solar events. Time-dependence is a crucial aspect of interpreting s/c data. We show comparisons of our simulation results with STEREO A, ACE, Ulysses, and Voyager s/c observations.

  10. An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation

    DOE PAGES

    Nutaro, James

    2014-11-03

    In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system that has been built within the adevs simulation package supports models with state-events and time-events that comprise high-index differential-algebraic systems. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.

  11. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1976-01-01

    Two computer simulation models of earthquakes were studied for the dependence of the pattern of events on the model assumptions and input parameters. Both models represent the seismically active region by mechanical blocks which are connected to one another and to a driving plate. The blocks slide on a friction surface. The first model employed elastic forces and time-independent friction to simulate main shock events. The size, length, and time and place of event occurrence were influenced strongly by the magnitude and degree of homogeneity in the elastic and friction parameters of the fault region. Periodically reoccurring similar events were frequently observed in simulations with near homogeneous parameters along the fault, whereas seismic gaps were a common feature of simulations employing large variations in the fault parameters. The second model incorporated viscoelastic forces and time-dependent friction to account for aftershock sequences. The periods between aftershock events increased with time and the aftershock region was confined to that which moved in the main event.
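
    The paper's own models are not available from the abstract; the sketch below is a generic quasi-static block-and-spring fault model of the same family (elastic coupling, threshold friction, invented parameter values), included only to make the block/driving-plate picture concrete:

      import random

      # Blocks coupled to their neighbors and to a driving plate slip when the
      # force on them exceeds a near-homogeneous strength threshold, and a slip
      # can cascade into a larger event.
      random.seed(0)
      N = 50                                            # blocks along the fault
      k_plate, k_c = 1.0, 0.8                           # plate and coupling stiffness
      strength = [1.0 + 0.05 * random.random() for _ in range(N)]
      force = [random.uniform(0.0, 0.5) for _ in range(N)]

      def load_to_failure():
          """Uniform plate loading until the weakest block reaches its strength."""
          i0 = min(range(N), key=lambda i: strength[i] - force[i])
          gap = strength[i0] - force[i0]
          for i in range(N):
              force[i] += gap
          force[i0] = strength[i0]                      # avoid floating-point shortfall
          return i0

      for event in range(10):
          stack, slipped = [load_to_failure()], set()
          while stack:                                  # cascade: slipping loads neighbors
              i = stack.pop()
              if i in slipped or force[i] < strength[i]:
                  continue
              slipped.add(i)
              transfer = k_c / (k_plate + 2.0 * k_c) * force[i]
              force[i] = 0.0                            # stress drop on the slipped block
              for j in (i - 1, i + 1):
                  if 0 <= j < N:
                      force[j] += transfer
                      if force[j] >= strength[j]:
                          stack.append(j)
          print(f"event {event + 1}: {len(slipped)} blocks slipped")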

  12. Parallel discrete event simulation using shared memory

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1988-01-01

    With traditional event-list techniques, evaluating a detailed discrete-event simulation model can often require hours or even days of computation time. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared-memory experiments, using the Chandy-Misra distributed-simulation algorithm, to simulate networks of queues is presented. Parameters of the study include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.

  13. Evaluation of the Navy's Sea/Shore Flow Policy

    DTIC Science & Technology

    2016-06-01

    CNA developed an independent Discrete-Event Simulation model to evaluate and assess the effect of alternative sea/shore flow policies. In this study...remains, even if the system is optimized. In building a Discrete-Event Simulation model, we discovered key factors that should be included in the...Discrete-Event Simulation model to evaluate the impact of sea/shore flow policy (the DES-SSF model) and compared the results with the SSFM for one

  14. Simulation studies on the differences between spontaneous and triggered seismicity and on foreshock probabilities

    NASA Astrophysics Data System (ADS)

    Zhuang, J.; Vere-Jones, D.; Ogata, Y.; Christophersen, A.; Savage, M. K.; Jackson, D. D.

    2008-12-01

    In this study we investigate the foreshock probabilities calculated from earthquake catalogs from Japan, Southern California and New Zealand. Unlike conventional studies on foreshocks, we use a probability-based declustering method to separate each catalog into stochastic versions of family trees, such that each event is classified as either having been triggered by a preceding event, or being a spontaneous event. The probabilities are determined from parameters that provide the best fit of the real catalog using a space-time epidemic-type aftershock sequence (ETAS) model. The model assumes that background and triggered earthquakes have the same magnitude-dependent triggering capability. A foreshock here is defined as a spontaneous event that has one or more larger descendants, and a triggered foreshock is a triggered event that has one or more larger descendants. The proportion of foreshocks in spontaneous events of each catalog is found to be lower than the proportion of triggered foreshocks in triggered events. One possibility is that this is due to different triggering productivity in spontaneous versus triggered events, i.e., a triggered event triggers more children than a spontaneous event of the same magnitude. To understand what causes the above differences between spontaneous and triggered events, we apply the same procedures to several synthetic catalogs simulated by using different models. The first simulation is done by using the ETAS model with parameters and spontaneous rate fitted from the JMA catalog. The second synthetic catalog is simulated by using an adjusted ETAS model that takes into account the triggering effect from events below the magnitude threshold. That is, we simulated the catalog with a low magnitude threshold with the original ETAS model, and then removed the events smaller than a higher magnitude threshold. The third model for simulation assumes that different triggering behaviors exist between spontaneous events and triggered events. We repeat the fitting and reconstruction procedures for all of those simulated catalogs. The reconstruction results for the first synthetic catalog do not show the difference between spontaneous events and triggered events or the differences in foreshock probabilities. On the other hand, results from the synthetic catalogs simulated with the second and the third models clearly reconstruct such differences. In summary, our results imply that one of the causes of such differences may be neglecting the triggering effect from events smaller than the cut-off magnitude, or magnitude errors. For the objective of forecasting seismicity, we can use a clustering model in which spontaneous events trigger child events in a different way from triggered events to avoid over-predicting earthquake risks with foreshocks. To understand the physical implication of this study, we need further careful studies to compare the real seismicity and the adjusted ETAS model, which takes the triggering effect from events below the cut-off magnitude into account.
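
    As a concrete, hedged illustration of the kind of synthetic ETAS catalog used in such experiments (parameter values below are invented, not the fitted JMA values), a branching-process simulation can be written in a few lines:

      import math
      import random

      # Spontaneous events arrive as a Poisson process; every event triggers
      # offspring whose expected number grows exponentially with magnitude and
      # whose delays follow an Omori-type decay.
      random.seed(42)
      MU, T_END, M0 = 0.2, 1000.0, 4.0                  # background rate, time span, cut-off magnitude
      BETA = math.log(10.0)                             # Gutenberg-Richter with b = 1
      A, ALPHA, C, P = 0.3, 1.2, 0.01, 1.3              # productivity and Omori parameters

      def gr_magnitude():
          return M0 + random.expovariate(BETA)

      def omori_delay():
          u = random.random()                           # inverse-transform sample, p > 1
          return C * ((1.0 - u) ** (-1.0 / (P - 1.0)) - 1.0)

      def poisson(lam):
          k, p, L = 0, 1.0, math.exp(-lam)              # Knuth's method, small means
          while True:
              p *= random.random()
              if p <= L:
                  return k
              k += 1

      catalog = []                                      # entries: (time, magnitude, parent or None)
      t = 0.0
      while True:                                       # spontaneous (background) events
          t += random.expovariate(MU)
          if t > T_END:
              break
          catalog.append((t, gr_magnitude(), None))

      i = 0
      while i < len(catalog):                           # breadth-first triggering of offspring
          t_i, m_i, _ = catalog[i]
          for _ in range(poisson(A * math.exp(ALPHA * (m_i - M0)))):
              t_child = t_i + omori_delay()
              if t_child <= T_END:
                  catalog.append((t_child, gr_magnitude(), i))
          i += 1

      spontaneous = sum(1 for _, _, parent in catalog if parent is None)
      print(f"{len(catalog)} events: {spontaneous} spontaneous, {len(catalog) - spontaneous} triggered")

    The parent links recorded here are exactly the family trees that probability-based declustering attempts to reconstruct stochastically from a real catalog.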

  15. The Effects of Time Advance Mechanism on Simple Agent Behaviors in Combat Simulations

    DTIC Science & Technology

    2011-12-01

    modeling packages that illustrate the differences between discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat...DES) models, often referred to as "next-event" (Law and Kelton 2000) or discrete time simulation (DTS), commonly referred to as "time-step." DTS...discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat models use DTS as their simulation time advance mechanism

  16. A discrete event simulation tool to support and predict hospital and clinic staffing.

    PubMed

    DeRienzo, Christopher M; Shaw, Ryan J; Meanor, Phillip; Lada, Emily; Ferranti, Jeffrey; Tanaka, David

    2017-06-01

    We demonstrate how to develop a simulation tool to help healthcare managers and administrators predict and plan for staffing needs in a hospital neonatal intensive care unit using administrative data. We developed a discrete event simulation model of nursing staff needed in a neonatal intensive care unit and then validated the model against historical data. The process flow was translated into a discrete event simulation model. Results demonstrated that the model can be used to give a respectable estimate of annual admissions, transfers, and deaths based upon two different staffing levels. The discrete event simulation tool model can provide healthcare managers and administrators with (1) a valid method of modeling patient mix, patient acuity, staffing needs, and costs in the present state and (2) a forecast of how changes in a unit's staffing, referral patterns, or patient mix would affect a unit in a future state.

  17. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to insure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
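
    The shared-memory implementation itself is not shown in the abstract; the toy below is a serial Python emulation of only the conservative rule that underlies the Chandy-Misra algorithm, with invented arrival and service parameters:

      import random

      # Two source processes feed one queue process over FIFO channels; the
      # queue may only consume the earliest message once every input channel is
      # non-empty, and a final "end of stream" message plays the role of a null
      # message so the receiver is never starved.
      random.seed(3)
      T_END, SERVICE = 50.0, 1.5

      def source(rate):
          t, msgs = 0.0, []
          while True:
              t += random.expovariate(rate)
              if t > T_END:
                  break
              msgs.append(t)
          msgs.append(float("inf"))                     # null message: no more traffic
          return msgs

      channels = [source(0.3), source(0.2)]             # per-channel FIFOs of arrival times
      busy_until, served = 0.0, 0

      while any(ch[0] != float("inf") for ch in channels):
          # Both channel heads are known, so consuming the minimum cannot violate causality.
          k = min(range(len(channels)), key=lambda j: channels[j][0])
          arrival = channels[k].pop(0)
          start = max(arrival, busy_until)              # single-server FIFO queue
          busy_until = start + SERVICE
          served += 1

      print(f"served {served} customers, last departure at t = {busy_until:.2f}")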

  18. The development of a simulation model of the treatment of coronary heart disease.

    PubMed

    Cooper, Keith; Davies, Ruth; Roderick, Paul; Chase, Debbie; Raftery, James

    2002-11-01

    A discrete event simulation models the progress of patients who have had a coronary event, through their treatment pathways and subsequent coronary events. The main risk factors in the model are age, sex, history of previous events and the extent of the coronary vessel disease. The model parameters are based on data collected from epidemiological studies of incidence and prognosis, efficacy studies, national surveys and treatment audits. The simulation results were validated against different sources of data. The initial results show that increasing revascularisation has considerable implications for resource use but has little impact on patient mortality.

  19. Manual for the Jet Event and Background Simulation Library(JEBSimLib)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinz, Matthias; Soltz, Ron; Angerami, Aaron

    Jets are the collimated streams of particles resulting from hard scattering in the initial state of high-energy collisions. In heavy-ion collisions, jets interact with the quark-gluon plasma (QGP) before freezeout, providing a probe into the internal structure and properties of the QGP. In order to study jets, background must be subtracted from the measured event, potentially introducing a bias. We aim to understand and quantify this subtraction bias. PYTHIA, a library to simulate pure jet events, is used to simulate a model for a signature with one pure jet (a photon) and one quenched jet, where all quenched particle momenta are reduced by a user-defined constant fraction. Background for the event is simulated using multiplicity values generated by the TRENTO initial state model of heavy-ion collisions fed into a thermal model consisting of a 3-dimensional Boltzmann distribution for particle types and momenta. Data from the simulated events is used to train a statistical model, which computes a posterior distribution of the quench factor for a data set. The model was tested first on pure jet events and then on full events including the background. This model will allow for a quantitative determination of biases induced by various methods of background subtraction.

  20. Manual for the Jet Event and Background Simulation Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinz, M.; Soltz, R.; Angerami, A.

    Jets are the collimated streams of particles resulting from hard scattering in the initial state of high-energy collisions. In heavy-ion collisions, jets interact with the quark-gluon plasma (QGP) before freezeout, providing a probe into the internal structure and properties of the QGP. In order to study jets, background must be subtracted from the measured event, potentially introducing a bias. We aim to understand and quantify this subtraction bias. PYTHIA, a library to simulate pure jet events, is used to simulate a model for a signature with one pure jet (a photon) and one quenched jet, where all quenched particle momenta are reduced by a user-defined constant fraction. Background for the event is simulated using multiplicity values generated by the TRENTO initial state model of heavy-ion collisions fed into a thermal model consisting of a 3-dimensional Boltzmann distribution for particle types and momenta. Data from the simulated events is used to train a statistical model, which computes a posterior distribution of the quench factor for a data set. The model was tested first on pure jet events and then on full events including the background. This model will allow for a quantitative determination of biases induced by various methods of background subtraction.

  1. Modelling and Simulation as a Recognizing Method in Education

    ERIC Educational Resources Information Center

    Stoffa, Veronika

    2004-01-01

    Computer animation-simulation models of complex processes and events, used here as a method of instruction, can be an effective didactic device. Gaining deeper knowledge about the objects modelled helps to plan simulation experiments oriented on the processes and events researched. Animation experiments realized on multimedia computers can aid easier…

  2. Medicanes in an ocean-atmosphere coupled regional climate model

    NASA Astrophysics Data System (ADS)

    Akhtar, Naveed; Brauch, Jennifer; Ahrens, Bodo

    2014-05-01

    So-called medicanes (Mediterranean hurricanes) are meso-scale, marine and warm core Mediterranean cyclones which exhibit some similarities with tropical cyclones. The strong cyclonic winds associated with them are a potential threat for highly populated coastal areas around the Mediterranean basin. In this study we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (NEMO-1d) to simulate medicanes. The goal of this study is to assess the robustness of the coupled model to simulate these extreme events. For this purpose 11 historical medicane events are simulated by the atmosphere-only and the coupled models using different set-ups (horizontal grid-spacings: 0.44°, 0.22°, 0.088°; with/without spectral nudging). The results show that at high resolution the coupled model is not only able to simulate all medicane events but also improves the simulated track length, warm core, and wind speed of simulated medicanes compared to atmosphere-only simulations. In most of the cases the medicane trajectories and structures are better represented in coupled simulations compared to atmosphere-only simulations. We conclude that the coupled model is a suitable tool for systematic and detailed study of historical medicane events and also for future projections.

  3. A comparison of human cadaver and augmented reality simulator models for straight laparoscopic colorectal skills acquisition training.

    PubMed

    LeBlanc, Fabien; Champagne, Bradley J; Augestad, Knut M; Neary, Paul C; Senagore, Anthony J; Ellis, Clyde N; Delaney, Conor P

    2010-08-01

    The aim of this study was to compare the human cadaver model with an augmented reality simulator for straight laparoscopic colorectal skills acquisition. Thirty-five sigmoid colectomies were performed on a cadaver (n = 7) or an augmented reality simulator (n = 28) during a laparoscopic training course. Prior laparoscopic colorectal experience was assessed. Objective structured technical skills assessment forms were completed by trainers and trainees independently. Groups were compared according to technical skills and events scores and satisfaction with training model. Prior laparoscopic experience was similar in both groups. For trainers and trainees, technical skills scores were considerably better on the simulator than on the cadaver. For trainers, generic events score was also considerably better on the simulator than on the cadaver. The main generic event occurring on both models was errors in the use of retraction. The main specific event occurring on both models was bowel perforation. Global satisfaction was better for the cadaver than for the simulator model (p < 0.001). The human cadaver model was more difficult but better appreciated than the simulator for laparoscopic sigmoid colectomy training. Simulator training followed by cadaver training can appropriately integrate simulators into the learning curve and maintain the benefits of both training methodologies. Published by Elsevier Inc.

  4. Hand-assisted laparoscopic sigmoid colectomy skills acquisition: augmented reality simulator versus human cadaver training models.

    PubMed

    Leblanc, Fabien; Senagore, Anthony J; Ellis, Clyde N; Champagne, Bradley J; Augestad, Knut M; Neary, Paul C; Delaney, Conor P

    2010-01-01

    The aim of this study was to compare a simulator with the human cadaver model for hand-assisted laparoscopic colorectal skills acquisition training. An observational prospective comparative study was conducted to compare the laparoscopic surgery training models. The study took place during the laparoscopic colectomy training course performed at the annual scientific meeting of the American Society of Colon and Rectal Surgeons. Thirty four practicing surgeons performed hand-assisted laparoscopic sigmoid colectomy on human cadavers (n = 7) and on an augmented reality simulator (n = 27). Prior laparoscopic colorectal experience was assessed. Trainers and trainees completed independently objective structured assessment forms. Training models were compared by trainees' technical skills scores, events scores, and satisfaction. Prior laparoscopic experience was similar in both surgeon groups. Generic and specific skills scores were similar on both training models. Generic events scores were significantly better on the cadaver model. The 2 most frequent generic events occurring on the simulator were poor hand-eye coordination and inefficient use of retraction. Specific events were scored better on the simulator and reached the significance limit (p = 0.051) for trainers. The specific events occurring on the cadaver were intestinal perforation and left ureter identification difficulties. Overall satisfaction was better for the cadaver than for the simulator model (p = 0.009). With regard to skills scores, the augmented reality simulator had adequate qualities for the hand-assisted laparoscopic colectomy training. Nevertheless, events scores highlighted weaknesses of the anatomical replication on the simulator. Although improvements likely will be required to incorporate the simulator more routinely into the colorectal training, it may be useful in its current form for more junior trainees or those early on their learning curve. Copyright 2010 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  5. An agent-based stochastic Occupancy Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

    Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with a probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.

  6. An agent-based stochastic Occupancy Simulator

    DOE PAGES

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

    2017-06-01

    Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with a probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.
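
    Of the three sub-models listed above, the homogeneous Markov chain for random movement is the easiest to illustrate; the sketch below uses invented spaces and transition probabilities and is not the Occupancy Simulator's own data model:

      import random

      # An occupant's location evolves as a homogeneous Markov chain over
      # spaces at a fixed time step.
      random.seed(7)
      spaces = ["own office", "other office", "meeting room", "outside"]
      P = {                                             # current space -> next-space probabilities
          "own office":   [0.85, 0.05, 0.05, 0.05],
          "other office": [0.50, 0.40, 0.05, 0.05],
          "meeting room": [0.30, 0.05, 0.60, 0.05],
          "outside":      [0.10, 0.00, 0.00, 0.90],
      }

      def simulate_day(start="outside", steps=96):      # 96 fifteen-minute steps
          state, schedule = start, []
          for _ in range(steps):
              state = random.choices(spaces, weights=P[state])[0]
              schedule.append(state)
          return schedule

      day = simulate_day()
      in_building = sum(s != "outside" for s in day) / len(day)
      print(f"fraction of the day spent in the building: {in_building:.2f}")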

  7. Cross-Paradigm Simulation Modeling: Challenges and Successes

    DTIC Science & Technology

    2011-12-01

    is also highlighted. 2.1 Discrete-Event Simulation Discrete-event simulation (DES) is a modeling method for stochastic, dynamic models where...which almost anything can be coded; models can be incredibly detailed. Most commercial DES software has a graphical interface which allows the user to...results. Although the above definition is the commonly accepted definition of DES, there are two different worldviews that dominate DES modeling today: a

  8. Hydrologic modeling of two glaciated watersheds in Northeast Pennsylvania

    USGS Publications Warehouse

    Srinivasan, M.S.; Hamlett, J.M.; Day, R.L.; Sams, J.I.; Petersen, G.W.

    1998-01-01

    A hydrologic modeling study, using the Hydrologic Simulation Program - FORTRAN (HSPF), was conducted in two glaciated watersheds, Purdy Creek and Ariel Creek in northeastern Pennsylvania. Both watersheds have wetlands and poorly drained soils due to low hydraulic conductivity and presence of fragipans. The HSPF model was calibrated in the Purdy Creek watershed and verified in the Ariel Creek watershed for the June 1992 to December 1993 period. In Purdy Creek, the total volume of observed streamflow during the entire simulation period was 13.36 x 10^6 m^3 and the simulated streamflow volume was 13.82 x 10^6 m^3 (5 percent difference). For the verification simulation in Ariel Creek, the difference between the total observed and simulated flow volumes was 17 percent. Simulated peak flow discharges were within two hours of the observed for 30 of 46 peak flow events (discharge greater than 0.1 m^3/sec) in Purdy Creek and 27 of 53 events in Ariel Creek. For 22 of the 46 events in Purdy Creek and 24 of 53 in Ariel Creek, the differences between the observed and simulated peak discharge rates were less than 30 percent. These 22 events accounted for 63 percent of total volume of streamflow observed during the selected 46 peak flow events in Purdy Creek. In Ariel Creek, these 24 peak flow events accounted for 62 percent of the total flow observed during all peak flow events. Differences in observed and simulated peak flow rates and volumes (on a percent basis) were greater during the snowmelt runoff events and summer periods than for other times.

  9. Heinrich events simulated across the glacial

    NASA Astrophysics Data System (ADS)

    Ziemen, F. A.; Mikolajewicz, U.

    2015-12-01

    Heinrich events are among the most prominent climate change events recorded in proxies across the northern hemisphere. They are the archetype of ice sheet-climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under discussion, and their climatic consequences are far from being fully understood. We contribute to answering the open questions by studying Heinrich events in a coupled ice sheet model (ISM) and atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model-generated internal variability. The setup consists of a northern hemisphere configuration of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. We analyze simulations where the ISM is coupled asynchronously to the AOVGCM and simulations where the ISM and the ocean model are coupled synchronously and the atmosphere model is coupled asynchronously to them. The modeled Heinrich events show a marked influence of the ice discharge on the Atlantic circulation and heat transport.

  10. El Nino - La Nina events simulated with Cane and Zebiak's model and observed with satellite or in situ data. Part I: Model data comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perigaud C.; Dewitte, B.

    The Zebiak and Cane model is used in its "uncoupled mode," meaning that the oceanic model component is driven by the Florida State University (FSU) wind stress anomalies over 1980-93 to simulate sea surface temperature anomalies, and these are used in the atmospheric model component to generate wind anomalies. Simulations are compared with data derived from FSU winds, International Satellite Cloud Climatology Project cloud convection, Advanced Very High Resolution Radiometer SST, Geosat sea level, 20°C isotherm depth derived from an expendable bathythermograph, and current velocities estimated from drifters or current-meter moorings. Forced by the simulated SST, the atmospheric model is fairly successful in reproducing the observed westerlies during El Nino events. The model fails to simulate the easterlies during La Nina 1988. The simulated forcing of the atmosphere is in very poor agreement with the heating derived from cloud convection data. Similarly, the model is fairly successful in reproducing the warm anomalies during El Nino events. However, it fails to simulate the observed cold anomalies. Simulated variations of thermocline depth agree reasonably well with observations. The model simulates zonal current anomalies that are reversing at a dominant 9-month frequency. Projecting altimetric observations on Kelvin and Rossby waves provides an estimate of zonal current anomalies, which is consistent with the ones derived from drifters or from current meter moorings. Unlike the simulated ones, the observed zonal current anomalies reverse from eastward during El Nino events to westward during La Nina events. The simulated 9-month oscillations correspond to a resonant mode of the basin. They can be suppressed by cancelling the wave reflection at the boundaries, or they can be attenuated by increasing the friction in the ocean model. 58 refs., 14 figs., 6 tabs.

  11. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis module. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.

  12. Modelling approaches: the case of schizophrenia.

    PubMed

    Heeg, Bart M S; Damen, Joep; Buskens, Erik; Caleo, Sue; de Charro, Frank; van Hout, Ben A

    2008-01-01

    Schizophrenia is a chronic disease characterized by periods of relative stability interrupted by acute episodes (or relapses). The course of the disease may vary considerably between patients. Patient histories show considerable inter- and even intra-individual variability. We provide a critical assessment of the advantages and disadvantages of three modelling techniques that have been used in schizophrenia: decision trees, (cohort and micro-simulation) Markov models and discrete event simulation models. These modelling techniques are compared in terms of building time, data requirements, medico-scientific experience, simulation time, clinical representation, and their ability to deal with patient heterogeneity, the timing of events, prior events, patient interaction, interaction between co-variates and variability (first-order uncertainty). We note that, depending on the research question, the optimal modelling approach should be selected based on the expected differences between the comparators, the number of co-variates, the number of patient subgroups, the interactions between co-variates, and simulation time. Finally, it is argued that in case micro-simulation is required for the cost-effectiveness analysis of schizophrenia treatments, a discrete event simulation model is best suited to accurately capture all of the relevant interdependencies in this chronic, highly heterogeneous disease with limited long-term follow-up data.

  13. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation, in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation, together with standard statistical estimation and analyses techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle, through a simulation model.
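
    The paper's launch-vehicle model is not reproduced here; the sketch below only illustrates the chance-constraint pattern that such a framework combines with replication-based statistical estimation, using an invented terminating simulation:

      import random

      # Find the smallest resource level whose estimated probability of
      # violating a turnaround-time limit, over replications of a terminating
      # simulation, stays below a target with a simple confidence margin.
      random.seed(9)
      LIMIT, TARGET, REPS = 30.0, 0.10, 200

      def one_replication(servers):
          """Makespan of 20 jobs (Exp(mean 2.5) durations) on `servers` parallel servers."""
          busy = [0.0] * servers
          for _ in range(20):
              k = busy.index(min(busy))                 # greedy dispatch to least-loaded server
              busy[k] += random.expovariate(1.0 / 2.5)
          return max(busy)

      for servers in range(1, 10):
          violations = sum(one_replication(servers) > LIMIT for _ in range(REPS))
          p_hat = violations / REPS
          margin = 1.645 * (p_hat * (1 - p_hat) / REPS) ** 0.5   # one-sided ~95% bound
          if p_hat + margin <= TARGET:                  # chance constraint satisfied
              print(f"smallest feasible level: {servers} servers (p_hat = {p_hat:.2f})")
              break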

  14. Modeling and Simulation with INS.

    ERIC Educational Resources Information Center

    Roberts, Stephen D.; And Others

    INS, the Integrated Network Simulation language, puts simulation modeling into a network framework and automatically performs such programming activities as placing the problem into a next event structure, coding events, collecting statistics, monitoring status, and formatting reports. To do this, INS provides a set of symbols (nodes and branches)…

  15. SIGMA--A Graphical Approach to Teaching Simulation.

    ERIC Educational Resources Information Center

    Schruben, Lee W.

    1992-01-01

    SIGMA (Simulation Graphical Modeling and Analysis) is a computer graphics environment for building, testing, and experimenting with discrete event simulation models on personal computers. It uses symbolic representations (computer animation) to depict the logic of large, complex discrete event systems for easier understanding and has proven itself…

  16. Parallel Stochastic discrete event simulation of calcium dynamics in neuron.

    PubMed

    Ishlam Patoary, Mohammad Nazrul; Tropper, Carl; McDougal, Robert A; Zhongwei, Lin; Lytton, William W

    2017-09-26

    The intra-cellular calcium signaling pathways of a neuron depend on both biochemical reactions and diffusion. Some quasi-isolated compartments (e.g. spines) are so small and calcium concentrations are so low that one extra molecule diffusing in by chance can make a nontrivial difference in its concentration (percentage-wise). These rare events can affect dynamics discretely in such a way that they cannot be evaluated by a deterministic simulation. Stochastic models of such a system provide a more detailed understanding of these systems than existing deterministic models because they capture their behavior at a molecular level. Our research focuses on the development of a high performance parallel discrete event simulation environment, Neuron Time Warp (NTW), which is intended for use in the parallel simulation of stochastic reaction-diffusion systems such as intracellular calcium signaling. NTW is integrated with NEURON, a simulator which is widely used within the neuroscience community. We simulate two models, a calcium buffer and a calcium wave model. The calcium buffer model is employed in order to verify the correctness and performance of NTW by comparing it to a serial deterministic simulation in NEURON. We also derived a discrete event calcium wave model from a deterministic model using the stochastic IP3R structure.
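
    Neuron Time Warp itself is parallel and optimistically synchronized; the serial sketch below only illustrates the underlying discrete event view of a calcium buffer, using a Gillespie-style stochastic simulation with invented rate constants:

      import random

      # Stochastic simulation of Ca + B <-> CaB in one small compartment: the
      # time to the next discrete event is exponential in the total propensity,
      # and the event type is chosen in proportion to each reaction's propensity.
      random.seed(11)
      k_on, k_off = 0.01, 0.5                  # per-molecule binding / unbinding rates
      state = {"Ca": 20, "B": 50, "CaB": 0}
      t, t_end = 0.0, 50.0

      def propensities(s):
          return [k_on * s["Ca"] * s["B"],     # binding event
                  k_off * s["CaB"]]            # unbinding event

      while t < t_end:
          a = propensities(state)
          a_total = sum(a)
          if a_total == 0:
              break
          t += random.expovariate(a_total)     # time to next discrete event
          if random.random() * a_total < a[0]: # choose which event fires
              state["Ca"] -= 1; state["B"] -= 1; state["CaB"] += 1
          else:
              state["Ca"] += 1; state["B"] += 1; state["CaB"] -= 1

      print(f"t={t:.1f}  free Ca={state['Ca']}  bound CaB={state['CaB']}")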

  17. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of a national security emergency response plan. Emergency evacuation of a large commercial shopping area, a typical service system, is an active research topic. A systematic methodology based on Cellular Automata with the Dynamic Floor Field and an event driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation of a commercial shopping mall. Pedestrian walking is based on Cellular Automata and the event driven model. In this paper, the event driven model is adopted to simulate the pedestrian movement patterns; the simulation process is divided into a normal situation and emergency evacuation. The model is composed of four layers: environment layer, customer layer, clerk layer and trajectory layer. For the simulation of pedestrian movement routes, the model takes into account the purchase intentions of customers and the density of pedestrians. Based on the evacuation model combining Cellular Automata with a Dynamic Floor Field and the event driven model, we can reflect the behavior characteristics of customers and clerks in normal and emergency evacuation situations. The distribution of individual evacuation time as a function of initial positions and the dynamics of the evacuation process is studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
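
    As a minimal, hypothetical illustration of the cellular-automaton side of such a model (static floor field only; the dynamic floor field and the event-driven customer/clerk layers are not reproduced), consider:

      import random
      from collections import deque

      # Pedestrians on a grid move each step to a free neighboring cell with a
      # smaller distance-to-exit value (the static floor field).
      random.seed(5)
      W, H = 12, 8
      EXIT = (0, 4)
      walls = {(5, y) for y in range(3)} | {(5, y) for y in range(5, H)}   # wall with a doorway

      def floor_field():                        # BFS distance-to-exit for every reachable cell
          dist, q = {EXIT: 0}, deque([EXIT])
          while q:
              x, y = q.popleft()
              for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                  if 0 <= nx < W and 0 <= ny < H and (nx, ny) not in walls and (nx, ny) not in dist:
                      dist[(nx, ny)] = dist[(x, y)] + 1
                      q.append((nx, ny))
          return dist

      field = floor_field()
      peds = set(random.sample([c for c in field if c != EXIT and c[0] > 6], 15))

      step = 0
      while peds:
          step += 1
          for p in sorted(peds, key=lambda c: field[c]):        # pedestrians closest to the exit move first
              options = [n for n in ((p[0] + 1, p[1]), (p[0] - 1, p[1]), (p[0], p[1] + 1), (p[0], p[1] - 1))
                         if n in field and n not in peds]
              if options:
                  target = min(options, key=lambda n: field[n])
                  if field[target] < field[p]:
                      peds.remove(p)
                      if target != EXIT:
                          peds.add(target)       # pedestrians reaching the exit leave the grid
          if step > 500:
              break
      print(f"all pedestrians reached the exit after {step} steps")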

  18. Numerical simulations of an advection fog event over Shanghai Pudong International Airport with the WRF model

    NASA Astrophysics Data System (ADS)

    Lin, Caiyan; Zhang, Zhongfeng; Pu, Zhaoxia; Wang, Fengyun

    2017-10-01

    A series of numerical simulations is conducted to understand the formation, evolution, and dissipation of an advection fog event over Shanghai Pudong International Airport (ZSPD) with the Weather Research and Forecasting (WRF) model. Using the current operational settings at the Meteorological Center of East China Air Traffic Management Bureau, the WRF model successfully predicts the fog event at ZSPD. Additional numerical experiments are performed to examine the physical processes associated with the fog event. The results indicate that prediction of this particular fog event is sensitive to microphysical schemes for the time of fog dissipation but not for the time of fog onset. The simulated timing of the arrival and dissipation of the fog, as well as the cloud distribution, is substantially sensitive to the planetary boundary layer and radiation (both longwave and shortwave) processes. Moreover, varying forecast lead times also produces different simulation results for the fog event regarding its onset and duration, suggesting a trade-off between more accurate initial conditions and a proper forecast lead time that allows model physical processes to spin up adequately during the fog simulation. The overall outcomes from this study imply that the complexity of physical processes and their interactions within the WRF model during fog evolution and dissipation is a key area of future research.

  19. SIMULATING SUB-DECADAL CHANNEL MORPHOLOGIC CHANGE IN EPHEMERAL STREAM NETWORKS

    EPA Science Inventory

    A distributed watershed model was modified to simulate cumulative channel morphologic
    change from multiple runoff events in ephemeral stream networks. The model incorporates the general design of the event-based Kinematic Runoff and Erosion Model (KINEROS), which describes t...

  20. The cost of conservative synchronization in parallel discrete event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approach the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.

  1. Discrete event simulation: the preferred technique for health economic evaluations?

    PubMed

    Caro, Jaime J; Möller, Jörgen; Getsios, Denis

    2010-12-01

    To argue that discrete event simulation should be preferred to cohort Markov models for economic evaluations in health care. The basis for the modeling techniques is reviewed. For many health-care decisions, existing data are insufficient to fully inform them, necessitating the use of modeling to estimate the consequences that are relevant to decision-makers. These models must reflect what is known about the problem at a level of detail sufficient to inform the questions. Oversimplification will result in estimates that are not only inaccurate, but potentially misleading. Markov cohort models, though currently popular, have so many limitations and inherent assumptions that they are inadequate to inform most health-care decisions. An event-based individual simulation offers an alternative much better suited to the problem. A properly designed discrete event simulation provides more accurate, relevant estimates without being computationally prohibitive. It does require more data and may be a challenge to convey transparently, but these are necessary trade-offs to provide meaningful and valid results. In our opinion, discrete event simulation should be the preferred technique for health economic evaluations today. © 2010, International Society for Pharmacoeconomics and Outcomes Research (ISPOR).
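
    The contrast being argued can be made concrete with a deliberately tiny example (invented rate, a single alive-to-dead transition): the cohort model applies a fixed per-cycle fraction, while the discrete event simulation samples each individual's event time directly.

      import math
      import random

      random.seed(2024)
      rate, horizon, cycle = 0.10, 20.0, 1.0            # events per person-year, years, cycle length

      # Cohort Markov model: a fixed transition probability applied every cycle.
      p_cycle = 1.0 - math.exp(-rate * cycle)
      alive, life_years_markov, t = 1.0, 0.0, 0.0
      while t < horizon:
          life_years_markov += alive * cycle            # everyone alive at cycle start counts fully
          alive *= 1.0 - p_cycle
          t += cycle

      # Discrete event simulation: sample each individual's event time directly.
      n = 100_000
      life_years_des = sum(min(random.expovariate(rate), horizon) for _ in range(n)) / n

      print(f"Markov cohort estimate : {life_years_markov:.3f} life-years")
      print(f"DES individual estimate: {life_years_des:.3f} life-years")

    The gap between the two printed estimates is the familiar half-cycle bias of the cohort formulation, one of the simplifications this abstract argues against.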

  2. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    DTIC Science & Technology

    2014-09-18

    and full-scale experimental verifications towards ground-satellite quantum key distribution...Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification, DISSERTATION, Jeffrey D. Morris...QUANTUM KEY DISTRIBUTION SIMULATION FRAMEWORK USING THE DISCRETE EVENT SYSTEM SPECIFICATION, DISSERTATION, Presented to the Faculty, Department of Systems

  3. New Perspectives on Long Run-out Rock Avalanches: A Dynamic Analysis of 20 Events in the Vaigat Strait, West Greenland

    NASA Astrophysics Data System (ADS)

    Benjamin, J.; Rosser, N. J.; Dunning, S.; Hardy, R. J.; Karim, K.; Szczucinski, W.; Norman, E. C.; Strzelecki, M.; Drewniak, M.

    2014-12-01

    Risk assessments of the threat posed by rock avalanches rely upon numerical modelling of potential run-out and spreading, and are contingent upon a thorough understanding of the flow dynamics inferred from deposits left by previous events. Few records exist of multiple rock avalanches with boundary conditions sufficiently consistent to develop a set of more generalised rules for behaviour across events. A unique cluster of 20 large (3 x 106 - 94 x 106 m3) rock avalanche deposits along the Vaigat Strait, West Greenland, offers a unique opportunity to model a large sample of adjacent events sourced from a stretch of coastal mountains of relatively uniform geology and structure. Our simulations of these events were performed using VolcFlow, a geophysical mass flow code developed to simulate volcanic debris avalanches. Rheological calibration of the model was performed using a well-constrained event at Paatuut (AD 2000). The best-fit simulation assumes a constant retarding stress with a collisional stress coefficient (T0 = 250 kPa, ξ = 0.01), and simulates run-out to within ±0.3% of that observed. Despite being widely used to simulate rock avalanche propagation, other models, that assume either a Coulomb frictional or a Voellmy rheology, failed to reproduce the observed event characteristics and deposit distribution at Paatuut. We applied this calibration to 19 other events, simulating rock avalanche motion across 3D terrain of varying levels of complexity. Our findings illustrate the utility and sensitivity of modelling a single rock avalanche satisfactorily as a function of rheology, alongside the validity of applying the same parameters elsewhere, even within similar boundary conditions. VolcFlow can plausibly account for the observed morphology of a series of deposits emplaced by events of different types, although its performance is sensitive to a range of topographic and geometric factors. These exercises show encouraging results in the model's ability to simulate a series of events using a single set of parameters obtained by back-analysis of the Paatuut event alone. The results also hold important implications for our process understanding of rock avalanches in confined fjord settings, where correctly modelling material flux at the point of entry into the water is critical in tsunami generation.

  4. Simulation of the Tsunami Resulting from the M 9.2 2004 Sumatra-Andaman Earthquake - Dynamic Rupture vs. Seismic Inversion Source Model

    NASA Astrophysics Data System (ADS)

    Vater, Stefan; Behrens, Jörn

    2017-04-01

    Simulations of historic tsunami events such as the 2004 Sumatra or the 2011 Tohoku event are usually initialized using earthquake sources resulting from inversion of seismic data. Also, other data from ocean buoys etc. is sometimes included in the derivation of the source model. The associated tsunami event can often be well simulated in this way, and the results show high correlation with measured data. However, it is unclear how the derived source model compares to the particular earthquake event. In this study we use the results from dynamic rupture simulations obtained with SeisSol, a software package based on an ADER-DG discretization solving the spontaneous dynamic earthquake rupture problem with high-order accuracy in space and time. The tsunami model is based on a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme on triangular grids and features a robust wetting and drying scheme for the simulation of inundation events at the coast. Adaptive mesh refinement enables the efficient computation of large domains, while at the same time it allows for high local resolution and geometric accuracy. The results are compared to measured data and results using earthquake sources based on inversion. With the approach of using the output of actual dynamic rupture simulations, we can estimate the influence of different earthquake parameters. Furthermore, the comparison to other source models enables a thorough comparison and validation of important tsunami parameters, such as the runup at the coast. This work is part of the ASCETE (Advanced Simulation of Coupled Earthquake and Tsunami Events) project, which aims at an improved understanding of the coupling between the earthquake and the generated tsunami event.

  5. Assessment of the Weather Research and Forecasting (WRF) model for simulation of extreme rainfall events in the upper Ganga Basin

    NASA Astrophysics Data System (ADS)

    Chawla, Ila; Osuri, Krishna K.; Mujumdar, Pradeep P.; Niyogi, Dev

    2018-02-01

    Reliable estimates of extreme rainfall events are necessary for an accurate prediction of floods. Most of the global rainfall products are available at a coarse resolution, rendering them less desirable for extreme rainfall analysis. Therefore, regional mesoscale models such as the advanced research version of the Weather Research and Forecasting (WRF) model are often used to provide rainfall estimates at fine grid spacing. Modelling heavy rainfall events is an enduring challenge, as such events depend on multi-scale interactions and on model configuration choices such as grid spacing, physical parameterization and initialization. With this background, the WRF model is implemented in this study to investigate the impact of different processes on extreme rainfall simulation, by considering a representative event that occurred during 15-18 June 2013 over the Ganga Basin in India, which is located at the foothills of the Himalayas. This event is simulated with ensembles involving four different microphysics (MP), two cumulus (CU) parameterizations, two planetary boundary layers (PBLs) and two land surface physics options, as well as different resolutions (grid spacing) within the WRF model. The simulated rainfall is evaluated against the observations from 18 rain gauges and the Tropical Rainfall Measuring Mission Multi-Satellite Precipitation Analysis (TMPA) 3B42RT version 7 data. The analysis shows that the choice of MP scheme influences the spatial pattern of rainfall, while the choice of PBL and CU parameterizations influences the magnitude of rainfall in the model simulations. Further, the WRF run with Goddard MP, Mellor-Yamada-Janjic PBL and Betts-Miller-Janjic CU scheme is found to perform best in simulating this heavy rain event. The selected configuration is evaluated for several heavy to extremely heavy rainfall events that occurred across different months of the monsoon season in the region. Model performance improved through incorporation of detailed land surface processes involving prognostic soil moisture evolution in the Noah scheme, compared to the simple slab model. To analyse the effect of model grid spacing, two sets of downscaling ratios - (i) 1 : 3, global to regional (G2R) scale and (ii) 1 : 9, global to convection-permitting scale (G2C) - are employed. Results indicate that a higher downscaling ratio (G2C) causes higher variability and consequently large errors in the simulations. Therefore, G2R is adopted as a suitable choice for simulating heavy rainfall events in the present case study. Further, the WRF-simulated rainfall is found to exhibit less bias when compared with the NCEP FiNaL (FNL) reanalysis data.

  6. The use of discrete-event simulation modeling to compare handwritten and electronic prescribing systems.

    PubMed

    Ghany, Ahmad; Vassanji, Karim; Kuziemsky, Craig; Keshavjee, Karim

    2013-01-01

    Electronic prescribing (e-prescribing) is expected to bring many benefits to Canadian healthcare, such as a reduction in errors and adverse drug reactions. As there currently is no functioning e-prescribing system in Canada that is completely electronic, we are unable to evaluate the performance of a live system. An alternative approach is to use simulation modeling for evaluation. We developed two discrete-event simulation models, one of the current handwritten prescribing system and one of a proposed e-prescribing system, to compare the performance of these two systems. We were able to compare the number of processes in each model, workflow efficiency, and the distribution of patients or prescriptions. Although we were able to compare these models to each other, using discrete-event simulation software was challenging. We were limited in the number of variables we could measure. We discovered non-linear processes and feedback loops in both models that could not be adequately represented using discrete-event simulation software. Finally, interactions between entities in both models could not be modeled using this type of software. We have come to the conclusion that a more appropriate approach to modeling both the handwritten and electronic prescribing systems would be to use a complex adaptive systems approach using agent-based modeling or systems-based modeling.

  7. Discretely Integrated Condition Event (DICE) Simulation for Pharmacoeconomics.

    PubMed

    Caro, J Jaime

    2016-07-01

    Several decision-analytic modeling techniques are in use for pharmacoeconomic analyses. Discretely integrated condition event (DICE) simulation is proposed as a unifying approach that has been deliberately designed to meet the modeling requirements in a straightforward transparent way, without forcing assumptions (e.g., only one transition per cycle) or unnecessary complexity. At the core of DICE are conditions that represent aspects that persist over time. They have levels that can change and many may coexist. Events reflect instantaneous occurrences that may modify some conditions or the timing of other events. The conditions are discretely integrated with events by updating their levels at those times. Profiles of determinant values allow for differences among patients in the predictors of the disease course. Any number of valuations (e.g., utility, cost, willingness-to-pay) of conditions and events can be applied concurrently in a single run. A DICE model is conveniently specified in a series of tables that follow a consistent format and the simulation can be implemented fully in MS Excel, facilitating review and validation. DICE incorporates both state-transition (Markov) models and non-resource-constrained discrete event simulation in a single formulation; it can be executed as a cohort or a microsimulation; and deterministically or stochastically.
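
    The condition/event bookkeeping described above can be illustrated with a minimal Python sketch; the names and handler interface below are invented for illustration (the published DICE specification is table-based and typically implemented in MS Excel), not the authors' implementation:

```python
# Minimal sketch of a DICE-style run (hypothetical names; illustrative only).
import heapq
import itertools

def run_dice(conditions, initial_events, horizon):
    """Advance one simulated patient to the time horizon.

    conditions     : dict mapping condition name -> current level
    initial_events : iterable of (time, handler) pairs
    Each handler(time, conditions) updates condition levels in place and
    returns (value_accrued, list_of_new_(time, handler)_events).
    """
    counter = itertools.count()                     # tie-breaker for the heap
    queue = [(t, next(counter), h) for t, h in initial_events]
    heapq.heapify(queue)
    total_value = 0.0
    while queue:
        time, _, handler = heapq.heappop(queue)
        if time > horizon:
            break
        value, new_events = handler(time, conditions)
        total_value += value
        for t, h in new_events:
            heapq.heappush(queue, (t, next(counter), h))
    return total_value

# Illustrative use: a "progression" event worsens a condition and costs 1000.
def progression(time, conditions):
    conditions["disease_stage"] += 1
    return 1000.0, []

print(run_dice({"disease_stage": 0}, [(2.5, progression)], horizon=10.0))
```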

  8. Medicanes in an ocean-atmosphere coupled regional climate model

    NASA Astrophysics Data System (ADS)

    Akhtar, N.; Brauch, J.; Dobler, A.; Béranger, K.; Ahrens, B.

    2014-03-01

    So-called medicanes (Mediterranean hurricanes) are meso-scale, marine, and warm-core Mediterranean cyclones that exhibit some similarities to tropical cyclones. The strong cyclonic winds associated with medicanes threaten the highly populated coastal areas around the Mediterranean basin. To reduce the risk of casualties and overall negative impacts, it is important to improve the understanding of medicanes with the use of numerical models. In this study, we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (1-D NEMO-MED12) to simulate medicanes. The aim of this study is to assess the robustness of the coupled model in simulating these extreme events. For this purpose, 11 historical medicane events are simulated using the atmosphere-only model, COSMO-CLM, and coupled model, with different setups (horizontal atmospheric grid-spacings of 0.44°, 0.22°, and 0.08°; with/without spectral nudging, and an ocean grid-spacing of 1/12°). The results show that at high-resolution, the coupled model is able to not only simulate most of medicane events but also improve the track length, core temperature, and wind speed of simulated medicanes compared to the atmosphere-only simulations. The results suggest that the coupled model is more proficient for systemic and detailed studies of historical medicane events, and that this model can be an effective tool for future projections.

  9. Medicanes in an ocean-atmosphere coupled regional climate model

    NASA Astrophysics Data System (ADS)

    Akhtar, N.; Brauch, J.; Dobler, A.; Béranger, K.; Ahrens, B.

    2014-08-01

    So-called medicanes (Mediterranean hurricanes) are meso-scale, marine, and warm-core Mediterranean cyclones that exhibit some similarities to tropical cyclones. The strong cyclonic winds associated with medicanes threaten the highly populated coastal areas around the Mediterranean basin. To reduce the risk of casualties and overall negative impacts, it is important to improve the understanding of medicanes with the use of numerical models. In this study, we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (1-D NEMO-MED12) to simulate medicanes. The aim of this study is to assess the robustness of the coupled model in simulating these extreme events. For this purpose, 11 historical medicane events are simulated using the atmosphere-only model, COSMO-CLM, and coupled model, with different setups (horizontal atmospheric grid spacings of 0.44, 0.22, and 0.08°; with/without spectral nudging, and an ocean grid spacing of 1/12°). The results show that at high resolution, the coupled model is able to not only simulate most of medicane events but also improve the track length, core temperature, and wind speed of simulated medicanes compared to the atmosphere-only simulations. The results suggest that the coupled model is more proficient for systemic and detailed studies of historical medicane events, and that this model can be an effective tool for future projections.

  10. Modeling a Million-Node Slim Fly Network Using Parallel Discrete-Event Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, Noah; Carothers, Christopher; Mubarak, Misbah

    As supercomputers close in on exascale performance, the increased number of processors and processing power translates to an increased demand on the underlying network interconnect. The Slim Fly network topology, a new low-diameter and low-latency interconnection network, is gaining interest as one possible solution for next-generation supercomputing interconnect systems. In this paper, we present a high-fidelity Slim Fly flit-level model leveraging the Rensselaer Optimistic Simulation System (ROSS) and Co-Design of Exascale Storage (CODES) frameworks. We validate our Slim Fly model with the Kathareios et al. Slim Fly model results provided at moderately sized network scales. We further scale the model size up to an unprecedented 1 million compute nodes, and through visualization of network simulation metrics such as link bandwidth, packet latency, and port occupancy, we gain insight into the network behavior at the million-node scale. We also show linear strong scaling of the Slim Fly model on an Intel cluster achieving a peak event rate of 36 million events per second using 128 MPI tasks to process 7 billion events. Detailed analysis of the underlying discrete-event simulation performance shows that a million-node Slim Fly model simulation can execute in 198 seconds on the Intel cluster.

  11. A geostatistical extreme-value framework for fast simulation of natural hazard events

    PubMed Central

    Stephenson, David B.

    2016-01-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
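
    The generalized Pareto marginal ingredient of such a framework can be sketched as follows; the spatial Student's t-process dependence from the abstract is not reproduced, and all parameter values are invented placeholders:

```python
# Sketch of the marginal ingredient only: peaks-over-threshold wind gusts drawn
# from a generalized Pareto distribution (parameter values are invented).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
threshold = 25.0          # m/s, hypothetical gust threshold
shape, scale = 0.1, 5.0   # GPD shape and scale (placeholders)

# 10,000 simulated "years" with Poisson-distributed exceedance counts.
years = 10_000
exceedances_per_year = rng.poisson(lam=3.0, size=years)
annual_maxima = np.array([
    threshold + genpareto.rvs(shape, scale=scale, size=n, random_state=rng).max()
    if n > 0 else np.nan
    for n in exceedances_per_year
])
print(np.nanpercentile(annual_maxima, 99))   # rough 1-in-100-year gust estimate
```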

  12. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

    A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
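
    A Monte Carlo sweep over an event tree of this kind can be sketched in a few lines of Python; the branch probabilities below are invented placeholders and, like the paper's own results, are for demonstration only:

```python
# Toy Monte Carlo event-tree sketch (invented probabilities, not NASA estimates).
import random

def ascent_trial(rng):
    """One simulated ascent: returns 'nominal', 'abort_success', or 'loss'."""
    if rng.random() < 0.99:           # no propulsion failure during ascent
        return "nominal"
    # A failure occurred; branch into an abort mode.
    if rng.random() < 0.8:            # abort mode available and completed
        return "abort_success"
    return "loss"

rng = random.Random(42)
trials = 100_000
outcomes = [ascent_trial(rng) for _ in range(trials)]
for label in ("nominal", "abort_success", "loss"):
    print(label, outcomes.count(label) / trials)
```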

  13. A General and Efficient Method for Incorporating Precise Spike Times in Globally Time-Driven Simulations

    PubMed Central

    Hanuschkin, Alexander; Kunkel, Susanne; Helias, Moritz; Morrison, Abigail; Diesmann, Markus

    2010-01-01

    Traditionally, event-driven simulations have been limited to the very restricted class of neuronal models for which the timing of future spikes can be expressed in closed form. Recently, the class of models that is amenable to event-driven simulation has been extended by the development of techniques to accurately calculate firing times for some integrate-and-fire neuron models that do not enable the prediction of future spikes in closed form. The motivation of this development is the general perception that time-driven simulations are imprecise. Here, we demonstrate that a globally time-driven scheme can calculate firing times that cannot be discriminated from those calculated by an event-driven implementation of the same model; moreover, the time-driven scheme incurs lower computational costs. The key insight is that time-driven methods are based on identifying a threshold crossing in the recent past, which can be implemented by a much simpler algorithm than the techniques for predicting future threshold crossings that are necessary for event-driven approaches. As run time is dominated by the cost of the operations performed at each incoming spike, which includes spike prediction in the case of event-driven simulation and retrospective detection in the case of time-driven simulation, the simple time-driven algorithm outperforms the event-driven approaches. Additionally, our method is generally applicable to all commonly used integrate-and-fire neuronal models; we show that a non-linear model employing a standard adaptive solver can reproduce a reference spike train with a high degree of precision. PMID:21031031
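
    The retrospective threshold-crossing idea can be illustrated with a simplified time-driven leaky integrate-and-fire sketch (forward Euler with linear interpolation of the spike time inside the step); this is a simplification of the scheme in the paper, with invented parameters:

```python
# Time-driven LIF sketch: detect a threshold crossing in the step just taken and
# place the spike time retrospectively by linear interpolation.
import numpy as np

def lif_time_driven(I, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    """I: array of input current per step. Returns list of spike times (ms)."""
    v, spikes = 0.0, []
    for k, i_k in enumerate(I):
        v_old = v
        v = v + dt * (-v / tau + i_k)          # forward-Euler update
        if v >= v_th:                          # threshold crossed during this step
            frac = (v_th - v_old) / (v - v_old)
            spikes.append((k + frac) * dt)     # retrospective spike placement
            v = v_reset
    return spikes

print(lif_time_driven(np.full(1000, 0.15)))
```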

  14. Comparison of ground motions from hybrid simulations to nga prediction equations

    USGS Publications Warehouse

    Star, L.M.; Stewart, J.P.; Graves, R.W.

    2011-01-01

    We compare simulated motions for a Mw 7.8 rupture scenario on the San Andreas Fault known as the ShakeOut event, two permutations with different hypocenter locations, and a Mw 7.15 Puente Hills blind thrust scenario, to median and dispersion predictions from empirical NGA ground motion prediction equations. We find the simulated motions attenuate faster with distance than is predicted by the NGA models for periods less than about 5.0 s. After removing this distance attenuation bias, the average residuals of the simulated events (i.e., event terms) are generally within the scatter of empirical event terms, although the ShakeOut simulation appears to be a high static stress drop event. The intraevent dispersion in the simulations is lower than NGA values at short periods and abruptly increases at 1.0 s due to different simulation procedures at short and long periods. The simulated motions have a depth-dependent basin response similar to the NGA models, and also show complex effects in which stronger basin response occurs when the fault rupture transmits energy into a basin at low angle, which is not predicted by the NGA models. Rupture directivity effects are found to scale with the isochrone parameter. © 2011, Earthquake Engineering Research Institute.

  15. Hybrid Architectural Framework for C4ISR and Discrete-Event Simulation (DES) to Support Sensor-Driven Model Synthesis in Real-World Scenarios

    DTIC Science & Technology

    2013-09-01

    ... which utilizes FTA and then loads it into a DES engine to generate simulation results. ... While Discrete Event Simulation (DES) can provide accurate time estimation and fast simulation speed, models utilizing it often suffer ... C4ISR progress in MDW is developed in this research to demonstrate the feasibility of AEMF-DES and explore its potential. The simulation (MDSIM ...

  16. Computer simulation of the metastatic progression.

    PubMed

    Wedemann, Gero; Bethge, Anja; Haustein, Volker; Schumacher, Udo

    2014-01-01

    A novel computer model based on a discrete event simulation procedure describes quantitatively the processes underlying the metastatic cascade. Analytical functions describe the size of the primary tumor and the metastases, while a rate function models the intravasation events of the primary tumor and metastases. Events describe the behavior of the malignant cells until the formation of new metastases. The results of the computer simulations are in quantitative agreement with clinical data determined from a patient with hepatocellular carcinoma in the liver. The model provides a more detailed view on the process than a conventional mathematical model. In particular, the implications of interventions on metastasis formation can be calculated.

  17. Optimization of Operations Resources via Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
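
    The flavour of the approach, a genetic algorithm searching integer resource levels against a noisy simulated objective, can be sketched as follows; the stand-in cost function below replaces the paper's discrete event simulation and is purely illustrative:

```python
# Toy genetic algorithm over integer resource levels with a noisy objective
# (the stand-in cost below is not the paper's DES model).
import random

def simulated_cost(levels, rng):
    """Stand-in for a stochastic DES run: penalize resources and shortfalls."""
    shortfall = max(0, 30 - sum(levels))
    return sum(levels) + 10 * shortfall + rng.gauss(0, 0.5)

def genetic_search(n_vars=5, pop_size=20, generations=50, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 15) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda ind: simulated_cost(ind, rng))
        parents = scored[: pop_size // 2]                  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_vars)                 # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                         # integer mutation
                child[rng.randrange(n_vars)] = rng.randint(0, 15)
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda ind: simulated_cost(ind, rng))

print(genetic_search())
```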

  18. Simulation of earthquake ground motions in the eastern United States using deterministic physics‐based and site‐based stochastic approaches

    USGS Publications Warehouse

    Rezaeian, Sanaz; Hartzell, Stephen; Sun, Xiaodan; Mendoza, Carlos

    2017-01-01

    Earthquake ground‐motion recordings are scarce in the central and eastern United States (CEUS) for large‐magnitude events and at close distances. We use two different simulation approaches, a deterministic physics‐based method and a site‐based stochastic method, to simulate ground motions over a wide range of magnitudes. Drawing on previous results for the modeling of recordings from the 2011 Mw 5.8 Mineral, Virginia, earthquake and using the 2001 Mw 7.6 Bhuj, India, earthquake as a tectonic analog for a large magnitude CEUS event, we are able to calibrate the two simulation methods over this magnitude range. Both models show a good fit to the Mineral and Bhuj observations from 0.1 to 10 Hz. Model parameters are then adjusted to obtain simulations for Mw 6.5, 7.0, and 7.6 events in the CEUS. Our simulations are compared with the 2014 U.S. Geological Survey weighted combination of existing ground‐motion prediction equations in the CEUS. The physics‐based simulations show comparable response spectral amplitudes and a fairly similar attenuation with distance. The site‐based stochastic simulations suggest a slightly faster attenuation of the response spectral amplitudes with distance for larger magnitude events and, as a result, slightly lower amplitudes at distances greater than 200 km. Both models are plausible alternatives and, given the few available data points in the CEUS, can be used to represent the epistemic uncertainty in modeling of postulated CEUS large‐magnitude events.

  19. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and are now ported to microcomputers. Graphic and animation capabilities were added to many of these languages to help users build models and evaluate the simulation results. With all these languages and added features, the user is still plagued with learning the simulation language. Furthermore, the time to construct and then to validate the simulation model is always greater than originally anticipated. One approach to minimize the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system will be modeled using the simulation assistant and the advantages and disadvantages discussed.

  20. Modeling and simulation of count data.

    PubMed

    Plan, E L

    2014-08-13

    Count data, or number of events per time interval, are discrete data arising from repeated time to event observations. Their mean count, or piecewise constant event rate, can be evaluated by discrete probability distributions from the Poisson model family. Clinical trial data characterization often involves population count analysis. This tutorial presents the basics and diagnostics of count modeling and simulation in the context of pharmacometrics. Consideration is given to overdispersion, underdispersion, autocorrelation, and inhomogeneity.
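
    The basic simulation setting of the tutorial, counts per interval drawn from the Poisson family with and without overdispersion, can be sketched as follows (illustrative numbers only):

```python
# Simulating per-interval event counts: Poisson (variance = mean) versus a
# gamma-Poisson (negative binomial) mixture that introduces overdispersion.
import numpy as np

rng = np.random.default_rng(7)
n_subjects, rate = 1_000, 2.0          # mean events per observation interval

poisson_counts = rng.poisson(rate, size=n_subjects)

# Negative binomial via a gamma-distributed individual rate (overdispersion).
dispersion = 0.5                        # larger -> more overdispersed
individual_rates = rng.gamma(shape=1/dispersion, scale=rate*dispersion, size=n_subjects)
nb_counts = rng.poisson(individual_rates)

print(poisson_counts.mean(), poisson_counts.var())   # variance ≈ mean
print(nb_counts.mean(), nb_counts.var())             # variance > mean
```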

  1. Scavenging and recombination kinetics in a radiation spur: The successive ordered scavenging events

    NASA Astrophysics Data System (ADS)

    Al-Samra, Eyad H.; Green, Nicholas J. B.

    2018-03-01

    This study describes stochastic models to investigate the successive ordered scavenging events in a spur of four radicals, a model system based on a radiation spur. Three simulation models have been developed to obtain the probabilities of the ordered scavenging events: (i) a Monte Carlo random flight (RF) model, (ii) hybrid simulations in which the reaction rate coefficient is used to generate scavenging times for the radicals and (iii) the independent reaction times (IRT) method. The results of these simulations are found to be in agreement with one another. In addition, a detailed master equation treatment is also presented, and used to extract simulated rate coefficients of the ordered scavenging reactions from the RF simulations. These rate coefficients are transient; those obtained for subsequent reactions are effectively equal and in reasonable agreement with the simple correction for competition effects that has recently been proposed.
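
    The simplest ingredient of the hybrid/IRT treatments, drawing pseudo-first-order scavenging times for the radicals and ordering them to obtain the successive scavenging events, can be sketched as follows; recombination between the radicals, which the paper treats explicitly, is ignored here and the rate constant is a placeholder:

```python
# Ordered scavenging times for four radicals under pseudo-first-order scavenging
# (competing recombination deliberately omitted; rate constant is hypothetical).
import numpy as np

rng = np.random.default_rng(3)
k_scav = 1.0e7          # s^-1, pseudo-first-order scavenging rate (placeholder)
n_radicals, n_trials = 4, 100_000

times = rng.exponential(1.0 / k_scav, size=(n_trials, n_radicals))
ordered = np.sort(times, axis=1)        # 1st, 2nd, 3rd, 4th scavenging times

# Mean time of each ordered event; expected 1/(4k), 1/(4k)+1/(3k), ... by order statistics.
print(ordered.mean(axis=0))
```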

  2. Comparison of holstein and jersey milk production with a new stochastic animal reproduction model

    USDA-ARS?s Scientific Manuscript database

    Holsteins and Jerseys are the most popular breeds in the US dairy industry. We built a stochastic, Monte Carlo life events simulation model in Python to test if Jersey cattle’s higher conception rate offsets their lower milk production. The model simulates individual cows and their life events such ...

  3. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".

  4. Simulating Heinrich events in a coupled atmosphere-ocean-ice sheet model

    NASA Astrophysics Data System (ADS)

    Mikolajewicz, Uwe; Ziemen, Florian

    2016-04-01

    Heinrich events are among the most prominent events of long-term climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet - climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under discussion, and their climatic consequences are far from being fully understood. We contribute to answering the open questions by studying Heinrich events in a coupled ice sheet model (ISM) and atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model-generated internal variability without the need to prescribe external perturbations, as was the standard approach in almost all model studies so far. The setup consists of a northern hemisphere configuration of the modified Parallel Ice Sheet Model (mPISM) coupled to the global coarse resolution AOVGCM ECHAM5/MPIOM/LPJ. The simulations used for this analysis were an ensemble covering substantial parts of the late Glacial forced with transient insolation and prescribed atmospheric greenhouse gas concentrations. The modeled Heinrich events show a marked influence of the ice discharge on the Atlantic circulation and heat transport, but none of the Heinrich events during the Glacial showed a complete collapse of the North Atlantic meridional overturning circulation. The simulated main consequences of the Heinrich events are a freshening and cooling over the North Atlantic and a drying over northern Europe.

  5. Evaluation of cool season precipitation event characteristics over the Northeast US in a suite of downscaled climate model hindcasts

    NASA Astrophysics Data System (ADS)

    Loikith, Paul C.; Waliser, Duane E.; Kim, Jinwon; Ferraro, Robert

    2017-08-01

    Cool season precipitation event characteristics are evaluated across a suite of downscaled climate models over the northeastern US. Downscaled hindcast simulations are produced by dynamically downscaling the Modern-Era Retrospective Analysis for Research and Applications version 2 (MERRA2) using the National Aeronautics and Space Administration (NASA)-Unified Weather Research and Forecasting (WRF) regional climate model (RCM) and the Goddard Earth Observing System Model, Version 5 (GEOS-5) global climate model. NU-WRF RCM simulations are produced at 24, 12, and 4-km horizontal resolutions using a range of spectral nudging schemes while the MERRA2 global downscaled run is provided at 12.5-km. All model runs are evaluated using four metrics designed to capture key features of precipitation events: event frequency, event intensity, event total, and event duration. Overall, the downscaling approaches result in a reasonable representation of many of the key features of precipitation events over the region; however, considerable biases exist in the magnitude of each metric. Based on this evaluation there is no clear indication that higher resolution simulations produce more realistic results in general; however, many small-scale features such as orographic enhancement of precipitation are only captured at higher resolutions, suggesting some added value over coarser resolutions. While the differences between simulations produced using nudging and no nudging are small, there is some improvement in model fidelity when nudging is introduced, especially at a cutoff wavelength of 600 km compared to 2000 km. Based on the results of this evaluation, dynamical regional downscaling using NU-WRF results in a more realistic representation of precipitation event climatology than the global downscaling of MERRA2 using GEOS-5.

  6. Synchronization of autonomous objects in discrete event simulation

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.

  7. The development of a simulation model of primary prevention strategies for coronary heart disease.

    PubMed

    Babad, Hannah; Sanderson, Colin; Naidoo, Bhash; White, Ian; Wang, Duolao

    2002-11-01

    This paper describes the present state of development of a discrete-event micro-simulation model for coronary heart disease prevention. The model is intended to support health policy makers in assessing the impacts on health care resources of different primary prevention strategies. For each person, a set of times to disease events, conditional on the individual's risk factor profile, is sampled from a set of probability distributions that are derived from a new analysis of the Framingham cohort study on coronary heart disease. Methods used to model changes in behavioural and physiological risk factors are discussed and a description of the simulation logic is given. The model incorporates POST (Patient Oriented Simulation Technique) simulation routines.

  8. A Two-Step Method to Select Major Surge-Producing Extratropical Cyclones from a 10,000-Year Stochastic Catalog

    NASA Astrophysics Data System (ADS)

    Keshtpoor, M.; Carnacina, I.; Yablonsky, R. M.

    2016-12-01

    Extratropical cyclones (ETCs) are the primary driver of storm surge events along the UK and northwest mainland Europe coastlines. In an effort to evaluate the storm surge risk in coastal communities in this region, a stochastic catalog is developed by perturbing the historical storm seeds of European ETCs to account for 10,000 years of possible ETCs. Numerical simulation of the storm surge generated by the full 10,000-year stochastic catalog, however, is computationally expensive and may take several months to complete with available computational resources. A new statistical regression model is developed to select the major surge-generating events from the stochastic ETC catalog. This regression model is based on the maximum storm surge, obtained via numerical simulations using a calibrated version of the Delft3D-FM hydrodynamic model with a relatively coarse mesh, of 1750 historical ETC events that occurred over the past 38 years in Europe. These numerically simulated surge values were regressed against the local sea level pressure and the U and V components of the wind field at the location of 196 tide gauge stations near the UK and northwest mainland Europe coastal areas. The regression model suggests that storm surge values in the area of interest are highly correlated with the U- and V-components of wind speed, as well as the sea level pressure. Based on these correlations, the regression model was then used to select surge-generating storms from the 10,000-year stochastic catalog. Results suggest that roughly 105,000 events out of 480,000 stochastic storms are surge-generating events and need to be considered for numerical simulation using a hydrodynamic model. The selected stochastic storms were then simulated in Delft3D-FM, and the final refinement of the storm population was performed based on return period analysis of the 1750 historical event simulations at each of the 196 tide gauges in preparation for Delft3D-FM fine mesh simulations.
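
    The screening step can be sketched for a single tide gauge as an ordinary least-squares regression of peak surge on local wind components and pressure, followed by thresholding of predictions for the stochastic catalog; the data below are synthetic, not the 1750-event training set:

```python
# Regression-based screening of a stochastic storm catalog (synthetic data).
import numpy as np

rng = np.random.default_rng(11)
n_events = 1750
U = rng.normal(0, 10, n_events)          # eastward wind (m/s)
V = rng.normal(0, 10, n_events)          # northward wind (m/s)
P = rng.normal(1000, 15, n_events)       # sea level pressure (hPa)
surge = 0.05*U + 0.08*V - 0.01*(P - 1013) + rng.normal(0, 0.1, n_events)

X = np.column_stack([np.ones(n_events), U, V, P])
coef, *_ = np.linalg.lstsq(X, surge, rcond=None)   # ordinary least squares fit

# Apply the regression to a (synthetic) stochastic catalog and keep large events.
U2, V2, P2 = rng.normal(0, 10, 5000), rng.normal(0, 10, 5000), rng.normal(1000, 15, 5000)
predicted = np.column_stack([np.ones(5000), U2, V2, P2]) @ coef
selected = predicted > 1.0               # metres; hypothetical screening threshold
print(selected.sum(), "of 5000 stochastic events flagged for full simulation")
```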

  9. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilke, Jeremiah J; Kenny, Joseph P.

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.

  10. Modeling of extreme freshwater outflow from the north-eastern Japanese river basins to western Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Troselj, Josko; Sayama, Takahiro; Varlamov, Sergey M.; Sasaki, Toshiharu; Racault, Marie-Fanny; Takara, Kaoru; Miyazawa, Yasumasa; Kuroki, Ryusuke; Yamagata, Toshio; Yamashiki, Yosuke

    2017-12-01

    This study demonstrates the importance of accurate extreme discharge input in combined hydrological and oceanographic modeling using two extreme typhoon events. We investigated the effects of extreme freshwater outflow events from river mouths on the sea surface salinity (SSS) distribution in the coastal zone of north-eastern Japan. Previous studies have used observed discharge at the river mouth, as well as seasonally averaged inter-annual, annual, monthly or daily simulated data. Here, we reproduced the hourly peak discharge during two typhoon events for a targeted set of nine rivers and compared their impact on SSS in the coastal zone based on observed, climatological and simulated freshwater outflows in conjunction with verification of the results using satellite remote-sensing data. We created a set of hourly simulated freshwater outflow data from nine first-class Japanese river basins flowing to the western Pacific Ocean for the two targeted typhoon events (Chataan and Roke) and used it with the integrated hydrological (CDRMV3.1.1) and oceanographic (JCOPE-T) model, to compare the case using climatological mean monthly discharges as freshwater input from rivers with the case using our hydrological model simulated discharges. By using the CDRMV model optimized with the SCE-UA method, we successfully reproduced hindcasts for peak discharges of extreme typhoon events at the river mouths and could consider multiple river basin locations. Modeled SSS results were verified by comparison with the Chlorophyll-a distribution observed by satellite remote sensing. The projection of SSS in the coastal zone became more realistic than without including extreme freshwater outflow. These results suggest that our hydrological models with optimized model parameters calibrated to the Typhoon Roke and Chataan cases can be successfully used to predict runoff values from other extreme precipitation events with similar physical characteristics. Proper simulation of extreme typhoon events provides more realistic coastal SSS and may allow a different scenario analysis with various precipitation inputs for developing a nowcasting analysis in the future.

  11. Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization

    NASA Astrophysics Data System (ADS)

    Lee, Kyungbook; Song, Seok Goo

    2017-09-01

    Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture the variability in simulation-based ground motion predictions. In this study, we improved the previously developed stochastic earthquake rupture modeling method by applying the nonparametric co-regionalization, which was proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and, therefore, enables us to simulate numerous rupture scenarios, including large events ( M > 7.0). It also gives us an opportunity to check the shape of true input correlation models in stochastic modeling after being deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.

  12. Simulation modeling of route guidance concept

    DOT National Transportation Integrated Search

    1997-01-01

    The methodology of a simulation model developed at the University of New South Wales, Australia, for the evaluation of performance of Dynamic Route Guidance Systems (DRGS) is described. The microscopic simulation model adopts the event update simulat...

  13. Event-driven simulations of nonlinear integrate-and-fire neurons.

    PubMed

    Tonnelier, Arnaud; Belmabrouk, Hana; Martinez, Dominique

    2007-12-01

    Event-driven strategies have been used to simulate spiking neural networks exactly. Previous work is limited to linear integrate-and-fire neurons. In this note, we extend event-driven schemes to a class of nonlinear integrate-and-fire models. Results are presented for the quadratic integrate-and-fire model with instantaneous or exponential synaptic currents. Extensions to conductance-based currents and exponential integrate-and-fire neurons are discussed.
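
    The generic event-driven machinery, a priority queue of predicted spike times, can be sketched with a linear leaky integrate-and-fire neuron whose next spike time has a closed form; the paper's contribution, extending such schemes to quadratic and other nonlinear models, is not reproduced here:

```python
# Event-driven scheme sketch: predict each neuron's next spike time in closed
# form and process spikes strictly in event order via a priority queue.
import heapq
import math

def next_spike_time(v, I, tau=10.0, v_th=1.0):
    """Closed-form time until a LIF neuron with constant drive I fires (inf if never)."""
    v_inf = I * tau
    if v_inf <= v_th:
        return math.inf
    return tau * math.log((v_inf - v) / (v_inf - v_th))

# Two uncoupled neurons with different drives (invented parameters).
neurons = [{"v": 0.0, "I": 0.15}, {"v": 0.0, "I": 0.12}]
queue = [(next_spike_time(n["v"], n["I"]), i) for i, n in enumerate(neurons)]
heapq.heapify(queue)

t_end = 100.0
while queue:
    t_spike, i = heapq.heappop(queue)
    if t_spike > t_end:
        break
    print(f"neuron {i} spikes at t = {t_spike:.2f} ms")
    neurons[i]["v"] = 0.0                             # reset after the spike
    heapq.heappush(queue, (t_spike + next_spike_time(0.0, neurons[i]["I"]), i))
```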

  14. Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations

    NASA Astrophysics Data System (ADS)

    Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.

    2013-12-01

    There is growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events that have a ground footprint comparable to (or larger than) the Carrington superstorm. Results are presented for an initial simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated ground induced geoelectric field to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model's consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in earth conductors such as power transmission grids.

  15. Detection and Attribution of Simulated Climatic Extreme Events and Impacts: High Sensitivity to Bias Correction

    NASA Astrophysics Data System (ADS)

    Sippel, S.; Otto, F. E. L.; Forkel, M.; Allen, M. R.; Guillod, B. P.; Heimann, M.; Reichstein, M.; Seneviratne, S. I.; Kirsten, T.; Mahecha, M. D.

    2015-12-01

    Understanding, quantifying and attributing the impacts of climatic extreme events and variability is crucial for societal adaptation in a changing climate. However, climate model simulations generated for this purpose typically exhibit pronounced biases in their output that hinder any straightforward assessment of impacts. To overcome this issue, various bias correction strategies are routinely used to alleviate climate model deficiencies, most of which have been criticized for physical inconsistency and the non-preservation of the multivariate correlation structure. We assess how biases and their correction affect the quantification and attribution of simulated extremes and variability in i) climatological variables and ii) impacts on ecosystem functioning as simulated by a terrestrial biosphere model. Our study demonstrates that assessments of simulated climatic extreme events and impacts in the terrestrial biosphere are highly sensitive to bias correction schemes, with major implications for the detection and attribution of these events. We introduce a novel ensemble-based resampling scheme based on a large regional climate model ensemble generated by the distributed weather@home setup[1], which fully preserves the physical consistency and multivariate correlation structure of the model output. We use extreme value statistics to show that this procedure considerably improves the representation of climatic extremes and variability. Subsequently, biosphere-atmosphere carbon fluxes are simulated using a terrestrial ecosystem model (LPJ-GSI) to further demonstrate the sensitivity of ecosystem impacts to the methodology of bias correcting climate model output. We find that uncertainties arising from bias correction schemes are comparable in magnitude to model structural and parameter uncertainties. The present study represents a first attempt to alleviate climate model biases in a physically consistent way and demonstrates that this yields improved simulations of climate extremes and associated impacts. [1] http://www.climateprediction.net/weatherathome/
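
    For context, one widely used univariate scheme of the kind criticized above is empirical quantile mapping, which corrects each variable's distribution separately and therefore cannot preserve multivariate correlation structure; the sketch below uses synthetic data and is not the resampling scheme proposed in the abstract:

```python
# Empirical quantile mapping of a biased model variable onto observations
# (synthetic data; one common univariate bias correction scheme).
import numpy as np

rng = np.random.default_rng(5)
obs   = rng.gamma(2.0, 3.0, size=3000)        # "observed" daily precipitation
model = rng.gamma(2.0, 4.0, size=3000) + 1.0  # biased model output (too wet)

sorted_model = np.sort(model)

def quantile_map(x):
    """Map a model value to the observed value at the same empirical quantile."""
    q = np.searchsorted(sorted_model, x) / len(sorted_model)
    return np.quantile(obs, min(q, 1.0))

corrected = np.array([quantile_map(v) for v in model])
print(model.mean(), corrected.mean(), obs.mean())   # corrected mean ≈ observed mean
```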

  16. Passenger rail security, planning, and resilience: application of network, plume, and economic simulation models as decision support tools.

    PubMed

    Greenberg, Michael; Lioy, Paul; Ozbas, Birnur; Mantell, Nancy; Isukapalli, Sastry; Lahr, Michael; Altiok, Tayfur; Bober, Joseph; Lacy, Clifton; Lowrie, Karen; Mayer, Henry; Rovito, Jennifer

    2013-11-01

    We built three simulation models that can assist rail transit planners and operators to evaluate high and low probability rail-centered hazard events that could lead to serious consequences for rail-centered networks and their surrounding regions. Our key objective is to provide these models to users who, through planning with these models, can prevent events or more effectively react to them. The first of the three models is an industrial systems simulation tool that closely replicates rail passenger traffic flows between New York Penn Station and Trenton, New Jersey. Second, we built and used a line source plume model to trace chemical plumes released by a slow-moving freight train that could impact rail passengers, as well as people in surrounding areas. Third, we crafted an economic simulation model that estimates the regional economic consequences of a variety of rail-related hazard events through the year 2020. Each model can work independently of the others. However, used together they help provide a coherent story about what could happen and set the stage for planning that should make rail-centered transport systems more resistant and resilient to hazard events. We highlight the limitations and opportunities presented by using these models individually or in sequence. © 2013 Society for Risk Analysis.

  17. Passenger Rail Security, Planning, and Resilience: Application of Network, Plume, and Economic Simulation Models as Decision Support Tools

    PubMed Central

    Greenberg, Michael; Lioy, Paul; Ozbas, Birnur; Mantell, Nancy; Isukapalli, Sastry; Lahr, Michael; Altiok, Tayfur; Bober, Joseph; Lacy, Clifton; Lowrie, Karen; Mayer, Henry; Rovito, Jennifer

    2014-01-01

    We built three simulation models that can assist rail transit planners and operators to evaluate high and low probability rail-centered hazard events that could lead to serious consequences for rail-centered networks and their surrounding regions. Our key objective is to provide these models to users who, through planning with these models, can prevent events or more effectively react to them. The first of the three models is an industrial systems simulation tool that closely replicates rail passenger traffic flows between New York Penn Station and Trenton, New Jersey. Second, we built and used a line source plume model to trace chemical plumes released by a slow-moving freight train that could impact rail passengers, as well as people in surrounding areas. Third, we crafted an economic simulation model that estimates the regional economic consequences of a variety of rail-related hazard events through the year 2020. Each model can work independently of the others. However, used together they help provide a coherent story about what could happen and set the stage for planning that should make rail-centered transport systems more resistant and resilient to hazard events. We highlight the limitations and opportunities presented by using these models individually or in sequence. PMID:23718133

  18. An improved simulation of the 2015 El Niño event by optimally correcting the initial conditions and model parameters in an intermediate coupled model

    NASA Astrophysics Data System (ADS)

    Zhang, Rong-Hua; Tao, Ling-Jiang; Gao, Chuan

    2017-09-01

    Large uncertainties exist in real-time predictions of the 2015 El Niño event, which have systematic intensity biases that are strongly model-dependent. It is critically important to characterize those model biases so they can be reduced appropriately. In this study, the conditional nonlinear optimal perturbation (CNOP)-based approach was applied to an intermediate coupled model (ICM) equipped with a four-dimensional variational data assimilation technique. The CNOP-based approach was used to quantify prediction errors that can be attributed to initial conditions (ICs) and model parameters (MPs). Two key MPs were considered in the ICM: one represents the intensity of the thermocline effect, and the other represents the relative coupling intensity between the ocean and atmosphere. Two experiments were performed to illustrate the effects of error corrections, one with a standard simulation and another with an optimized simulation in which errors in the ICs and MPs derived from the CNOP-based approach were optimally corrected. The results indicate that simulations of the 2015 El Niño event can be effectively improved by using CNOP-derived error correction. In particular, the El Niño intensity in late 2015 was adequately captured when simulations were started from early 2015. Quantitatively, the Niño3.4 SST index simulated in Dec. 2015 increased to 2.8 °C in the optimized simulation, compared with only 1.5 °C in the standard simulation. The feasibility and effectiveness of using the CNOP-based technique to improve ENSO simulations are demonstrated in the context of the 2015 El Niño event. The limitations and further applications are also discussed.

  19. A parallel computational model for GATE simulations.

    PubMed

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  20. An Advanced Simulation Framework for Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.

    1994-01-01

    Discrete-event simulation (DEVS) users have long been faced with a three-way trade-off of balancing execution time, model fidelity, and number of objects simulated. Because of the limits of computer processing power, the analyst is often forced to settle for less than the desired performance in one or more of these areas.

  1. Investigation of 2‐stage meta‐analysis methods for joint longitudinal and time‐to‐event data through simulation and real data application

    PubMed Central

    Tudur Smith, Catrin; Gueyffier, François; Kolamunnage‐Dona, Ruwanthi

    2017-01-01

    Background Joint modelling of longitudinal and time‐to‐event data is often preferred over separate longitudinal or time‐to‐event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time‐to‐event outcomes. The joint modelling literature focuses mainly on the analysis of single studies with no methods currently available for the meta‐analysis of joint model estimates from multiple studies. Methods We propose a 2‐stage method for meta‐analysis of joint model estimates. These methods are applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta‐analyses of separate longitudinal or time‐to‐event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Results Using the real dataset, similar results were obtained by using the separate and joint analyses. However, the simulation study indicated a benefit of use of joint rather than separate methods in a meta‐analytic setting where association exists between the longitudinal and time‐to‐event outcomes. Conclusions Where evidence of association between longitudinal and time‐to‐event outcomes exists, results from joint models over standalone analyses should be pooled in 2‐stage meta‐analyses. PMID:29250814
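
    The second stage of such a 2-stage approach reduces to standard inverse-variance pooling of the per-study association estimates produced by the stage-one joint models; the numbers below are invented, not the INDANA results:

```python
# Fixed-effect inverse-variance pooling of per-study estimates (stage two only).
import numpy as np

# (estimate, standard error) of the longitudinal-survival association per study
study_estimates = np.array([[0.12, 0.05],
                            [0.08, 0.04],
                            [0.15, 0.07]])

est, se = study_estimates[:, 0], study_estimates[:, 1]
w = 1.0 / se**2                              # inverse-variance weights
pooled = np.sum(w * est) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
print(f"pooled estimate = {pooled:.3f} (SE {pooled_se:.3f})")
```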

  2. Simulation of seismic events induced by CO2 injection at In Salah, Algeria

    NASA Astrophysics Data System (ADS)

    Verdon, James P.; Stork, Anna L.; Bissell, Rob C.; Bond, Clare E.; Werner, Maximilian J.

    2015-09-01

    Carbon capture and storage technology has the potential to reduce anthropogenic CO2 emissions. However, the geomechanical response of the reservoir and sealing caprocks must be modelled and monitored to ensure that injected CO2 is safely stored. To ensure confidence in model results, there is a clear need to develop ways of comparing model predictions with observations from the field. In this paper we develop an approach to simulate microseismic activity induced by injection, which allows us to compare geomechanical model predictions with observed microseismic activity. We apply this method to the In Salah CCS project, Algeria. A geomechanical reconstruction is used to simulate the locations, orientations and sizes of pre-existing fractures in the In Salah reservoir. The initial stress conditions, in combination with a history matched reservoir flow model, are used to determine when and where these fractures exceed Mohr-Coulomb limits, triggering failure. The sizes and orientations of fractures, and the stress conditions thereon, are used to determine the resulting micro-earthquake focal mechanisms and magnitudes. We compare our simulated event population with observations made at In Salah, finding good agreement between model and observations in terms of event locations, rates of seismicity, and event magnitudes.
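
    The per-fracture failure test described above amounts to a Mohr-Coulomb check of resolved shear stress against frictional strength at the current effective normal stress; the sketch below uses placeholder stresses and strength parameters, not the In Salah model inputs:

```python
# Mohr-Coulomb slip check for a single pre-existing fracture (placeholder values).
def mohr_coulomb_fails(shear_stress, normal_stress, pore_pressure,
                       cohesion=0.0, friction_coeff=0.6):
    """Return True if the fracture slips (stresses in MPa, compression positive)."""
    effective_normal = normal_stress - pore_pressure      # injection raises pore pressure
    strength = cohesion + friction_coeff * effective_normal
    return shear_stress >= strength

# Injection-driven pore pressure increase pushing a fracture towards failure:
for p in (5.0, 15.0, 25.0):
    print(p, mohr_coulomb_fails(shear_stress=12.0, normal_stress=30.0, pore_pressure=p))
```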

  3. Simulating recurrent event data with hazard functions defined on a total time scale.

    PubMed

    Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald

    2015-03-08

    In medical studies with recurrent event data, a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore, we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on the distributions of the inter-event times conditional on the total time of the preceding event or study start. Closed form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals as are often observed in clinical trial data. Its application therefore allows the simulation of data that closely resemble real settings and thus can improve the use of simulation studies for designing and analysing studies.
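
    The conditional-inversion idea on a total time scale can be sketched for a Weibull-type cumulative hazard H(t) = (t/b)^a: given survival to the previous event time s, the next event time solves H(t) = H(s) + E with E ~ Exp(1). Covariates, frailties and the risk-free intervals handled in the paper are omitted, and parameters are illustrative:

```python
# Recurrent events on a total time scale via inversion of the cumulative hazard.
import numpy as np

rng = np.random.default_rng(2)
a, b = 1.5, 2.0          # Weibull shape and scale of the total-time hazard
follow_up = 5.0          # administrative censoring time

def simulate_subject():
    times, s = [], 0.0
    while True:
        e = rng.exponential(1.0)
        t_next = b * ((s / b)**a + e)**(1.0 / a)   # invert H(t) = H(s) + e
        if t_next > follow_up:
            return times
        times.append(t_next)
        s = t_next

events_per_subject = [len(simulate_subject()) for _ in range(10_000)]
print(np.mean(events_per_subject))    # ≈ H(follow_up) = (follow_up / b)**a
```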

  4. Understanding Uncertainties and Biases in Jet Quenching in High-Energy Nucleus-Nucleus Collisions

    NASA Astrophysics Data System (ADS)

    Heinz, Matthias

    2017-09-01

    Jets are the collimated streams of particles resulting from hard scattering in the initial state of high-energy collisions. In heavy-ion collisions, jets interact with the quark-gluon plasma (QGP) before freezeout, providing a probe into the internal structure and properties of the QGP. In order to study jets, background must be subtracted from the measured event, potentially introducing a bias. We aim to understand and quantify this subtraction bias. PYTHIA, a library to simulate pure jet events, is used to simulate a model for a signature with one pure jet (a photon) and one quenched jet, where all quenched particle momenta are reduced by the same fraction. Background for the event is simulated using multiplicity values generated by the TRENTO initial state model of heavy-ion collisions fed into a thermal model from which to sample particle types and a 3-dimensional Boltzmann distribution from which to sample particle momenta. Data from the simulated events are used to train a statistical model, which computes a posterior distribution of the quench factor for a data set. The model was tested first on pure jet events and later on full events including the background. This model will allow for a quantitative determination of biases induced by various methods of background subtraction. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
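
    As a rough illustration of the quenching signature described above, the toy sketch below scales every jet particle's transverse momentum by a common factor and adds a thermal (exponential) background. The spectra, temperature, and multiplicity are invented for illustration and are not PYTHIA or TRENTO output.

        import numpy as np

        rng = np.random.default_rng(1)

        def quench(jet_pt, quench_factor):
            # Reduce every jet particle's transverse momentum by the same fraction.
            return (1.0 - quench_factor) * jet_pt

        def thermal_background(n_particles, temperature=0.3):
            # Boltzmann-like exponential pT spectrum for the underlying event, in GeV.
            return rng.exponential(scale=temperature, size=n_particles)

        jet = rng.exponential(scale=5.0, size=20)      # toy jet fragment pT spectrum, GeV
        event_pt = np.concatenate([quench(jet, 0.3), thermal_background(2000)])
        print("jet pT sum before/after quenching:",
              round(jet.sum(), 1), "/", round(quench(jet, 0.3).sum(), 1), "GeV")
        print("total event pT sum with background:", round(float(event_pt.sum()), 1), "GeV")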

  5. Why continuous simulation? The role of antecedent moisture in design flood estimation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S.; Westra, S.; Sharma, A.

    2012-06-01

    Continuous simulation for design flood estimation is increasingly becoming a viable alternative to traditional event-based methods. The advantage of continuous simulation approaches is that the catchment moisture state prior to the flood-producing rainfall event is implicitly incorporated within the modeling framework, provided the model has been calibrated and validated to produce reasonable simulations. This contrasts with event-based models in which both information about the expected sequence of rainfall and evaporation preceding the flood-producing rainfall event, as well as catchment storage and infiltration properties, are commonly pooled together into a single set of "loss" parameters which require adjustment through the process of calibration. To identify the importance of accounting for antecedent moisture in flood modeling, this paper uses a continuous rainfall-runoff model calibrated to 45 catchments in the Murray-Darling Basin in Australia. Flood peaks derived using the historical daily rainfall record are compared with those derived using resampled daily rainfall, for which the sequencing of wet and dry days preceding the heavy rainfall event is removed. The analysis shows that there is a consistent underestimation of the design flood events when antecedent moisture is not properly simulated, which can be as much as 30% when only 1 or 2 days of antecedent rainfall are considered, compared to 5% when this is extended to 60 days of prior rainfall. These results show that, in general, it is necessary to consider both short-term memory in rainfall associated with synoptic scale dependence, as well as longer-term memory at seasonal or longer time scale variability in order to obtain accurate design flood estimates.

  6. Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)

    NASA Astrophysics Data System (ADS)

    Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.

    2013-12-01

    We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.
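
    The tail-dependence concept can be illustrated with a simple empirical estimate of chi(u), the probability that one series exceeds its u-quantile given that the other does. The sketch below uses synthetic data; the threshold and the toy series are our assumptions, and the paper's formal inference goes well beyond this point estimate.

        import numpy as np

        def empirical_chi(x, y, u=0.95):
            # chi(u) = P(Y > q_y(u) | X > q_x(u)), estimated empirically.
            qx, qy = np.quantile(x, u), np.quantile(y, u)
            exceed_x = x > qx
            return float(np.mean(y[exceed_x] > qy))

        rng = np.random.default_rng(0)
        obs = rng.gumbel(size=5000)                        # stand-in for observed daily extremes
        model_linked = obs + 0.5 * rng.gumbel(size=5000)   # model output tracking the observed extremes
        model_unlinked = rng.gumbel(size=5000)             # model output unrelated to observations
        print("chi, linked model  :", round(empirical_chi(obs, model_linked), 2))
        print("chi, unlinked model:", round(empirical_chi(obs, model_unlinked), 2))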

  7. Simulation of EAST vertical displacement events by tokamak simulation code

    NASA Astrophysics Data System (ADS)

    Qiu, Qinglai; Xiao, Bingjia; Guo, Yong; Liu, Lei; Xing, Zhe; Humphreys, D. A.

    2016-10-01

    Vertical instability is a potentially serious hazard for elongated plasma. In this paper, the tokamak simulation code (TSC) is used to simulate vertical displacement events (VDE) on the experimental advanced superconducting tokamak (EAST). Key parameters from simulations, including plasma current, plasma shape and position, flux contours and magnetic measurements match experimental data well. The growth rates simulated by TSC are in good agreement with TokSys results. In addition to modeling the free drift, an EAST fast vertical control model enables TSC to simulate the course of VDE recovery. The trajectories of the plasma current center and control currents on internal coils (IC) fit experimental data well.

  8. A Systems Approach to Scalable Transportation Network Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S

    2006-01-01

    Emerging needs in transportation network modeling and simulation are raising new challenges with respect to scalability of network size and vehicular traffic intensity, speed of simulation for simulation-based optimization, and fidelity of vehicular behavior for accurate capture of event phenomena. Parallel execution is warranted to sustain the required detail, size and speed. However, few parallel simulators exist for such applications, partly due to the challenges underlying their development. Moreover, many simulators are based on time-stepped models, which can be computationally inefficient for the purposes of modeling evacuation traffic. Here an approach is presented to designing a simulator with memory and speed efficiency as the goals from the outset, and, specifically, scalability via parallel execution. The design makes use of discrete event modeling techniques as well as parallel simulation methods. Our simulator, called SCATTER, is being developed, incorporating such design considerations. Preliminary performance results are presented on benchmark road networks, showing scalability to one million vehicles simulated on one processor.

  9. LISP based simulation generators for modeling complex space processes

    NASA Technical Reports Server (NTRS)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  10. Using observed postconstruction peak discharges to evaluate a hydrologic and hydraulic design model, Boneyard Creek, Champaign and Urbana, Illinois

    USGS Publications Warehouse

    Over, Thomas M.; Soong, David T.; Holmes, Robert R.

    2011-01-01

    Boneyard Creek—which drains an urbanized watershed in the cities of Champaign and Urbana, Illinois, including part of the University of Illinois at Urbana-Champaign (UIUC) campus—has historically been prone to flooding. Using the Stormwater Management Model (SWMM), a hydrologic and hydraulic model of Boneyard Creek was developed for the design of the projects making up the first phase of a long-term plan for flood control on Boneyard Creek, and the construction of the projects was completed in May 2003. The U.S. Geological Survey, in cooperation with the Cities of Champaign and Urbana and UIUC, installed and operated stream and rain gages in order to obtain data for evaluation of the design-model simulations. In this study, design-model simulations were evaluated by using observed postconstruction precipitation and peak-discharge data. Between May 2003 and September 2008, five high-flow events on Boneyard Creek satisfied the study criterion. The five events were simulated with the design model by using observed precipitation. The simulations were run with two different values of the parameter controlling the soil moisture at the beginning of the storms and two different ways of spatially distributing the precipitation, making a total of four simulation scenarios. The simulated and observed peak discharges and stages were compared at gaged locations along the Creek. The discharge at one of these locations was deemed to be critical for evaluating the design model. The uncertainty of the measured peak discharge was also estimated at the critical location with a method based on linear regression of the stage and discharge relation, an estimate of the uncertainty of the acoustic Doppler velocity meter measurements, and the uncertainty of the stage measurements. For four of the five events, the simulated peak discharges lie within the 95-percent confidence interval of the observed peak discharges at the critical location; the fifth was just outside the upper end of this interval. For two of the four simulation scenarios, the simulation results for one event at the critical location were numerically unstable in the vicinity of the discharge peak. For the remaining scenarios, the simulated peak discharges over the five events at the critical location differ from the observed peak discharges (simulated minus observed) by an average of 7.7 and -1.5 percent, respectively. The simulated peak discharges over the four events for which all scenarios have numerically stable results at the critical location differs from the observed peak discharges (simulated minus observed) by an average of -6.8, 4.0, -5.4, and 1.5 percent, for the four scenarios, respectively. Overall, the discharge peaks simulated for this study at the critical location are approximately balanced between overprediction and underprediction and do not indicate significant model bias or inaccuracy. Additional comparisons were made by using peak stages at the critical location and two additional sites and using peak discharges at one additional site. These comparisons showed the same pattern of differences between observed and simulated values across events but varying biases depending on streamgage and measurement type (discharge or stage). 
Altogether, the results from this study show no clear evidence that the design model is significantly inaccurate or biased and, therefore, no clear evidence that the modeled flood-control projects in Champaign and on the University of Illinois campus have increased flood stages or discharges downstream in Urbana.

  11. A Madden-Julian oscillation event realistically simulated by a global cloud-resolving model.

    PubMed

    Miura, Hiroaki; Satoh, Masaki; Nasuno, Tomoe; Noda, Akira T; Oouchi, Kazuyoshi

    2007-12-14

    A Madden-Julian Oscillation (MJO) is a massive weather event consisting of deep convection coupled with atmospheric circulation, moving slowly eastward over the Indian and Pacific Oceans. Despite its enormous influence on many weather and climate systems worldwide, it has proven very difficult to simulate an MJO because of assumptions about cumulus clouds in global meteorological models. Using a model that allows direct coupling of the atmospheric circulation and clouds, we successfully simulated the slow eastward migration of an MJO event. Topography, the zonal sea surface temperature gradient, and interplay between eastward- and westward-propagating signals controlled the timing of the eastward transition of the convective center. Our results demonstrate the potential for making month-long MJO predictions when global cloud-resolving models with realistic initial conditions are used.

  12. Predictability of the 1997 and 1998 South Asian Summer Monsoons on the Intraseasonal Time Scale Based on 10 AMIP2 Model Runs

    NASA Technical Reports Server (NTRS)

    Wu, Man Li C.; Schubert, Siegfried; Einaudi, Franco (Technical Monitor)

    2000-01-01

    Predictability of the 1997 and 1998 South Asian summer monsoons is examined using National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalyses, and 100 two-year simulations with ten different Atmospheric General Circulation Models (AGCMs) with prescribed sea surface temperature (SST). We focus on the intraseasonal variations of the south Asian summer monsoon associated with the Madden-Julian Oscillation (MJO). The NCEP/NCAR reanalysis shows a clear coupling between SST anomalies and upper level velocity potential anomalies associated with the MJO. We analyze several MJO events that developed during 1997 and 1998, focusing on the coupling with the SST. The same analysis is carried out for the model simulations. Remarkably, the ensemble mean of the two-year AGCM simulations shows a signature of the observed MJO events. The ensemble mean simulated MJO events are approximately in phase with the observed events, although they are weaker, the period of oscillation is somewhat longer, and their onset is delayed by about ten days compared with the observations. Details of the analysis and comparisons among the ten AMIP2 (Atmospheric Model Intercomparison Project) models will be presented at the conference.

  13. Capturing flood-to-drought transitions in regional climate model simulations

    NASA Astrophysics Data System (ADS)

    Anders, Ivonne; Haslinger, Klaus; Hofstätter, Michael; Salzmann, Manuela; Resch, Gernot

    2017-04-01

    In previous studies atmospheric cyclones have been investigated in terms of related precipitation extremes in Central Europe. Mediterranean (Vb-like) cyclones are of special relevance as they are frequently related to high atmospheric moisture fluxes leading to floods and landslides in the Alpine region. Another focus in this area is on droughts, affecting soil moisture and surface and sub-surface runoff as well. Such events develop differently depending on available pre-saturation of water in the soil. In a first step we investigated two time periods which encompass a flood event and a subsequent drought on very different time scales, one long-lasting transition (2002/2003) and a rather short one between May and August 2013. In a second step we extended the investigation to the long time period 1950-2016. We focused on high spatial and temporal scales and assessed the currently achievable accuracy in the simulation of the Vb-events on the one hand and the following drought events on the other. The state-of-the-art regional climate model CCLM is applied in hindcast mode, simulating the single events described above as well as the period from 1948 to 2016, in order to check that the results from the short runs are valid for the long time period. Besides the conventional forcing of the regional climate model at its lateral boundaries, a spectral nudging technique is applied. In the simulations covering the European domain, different model parameters have been varied systematically. The resulting precipitation amounts have been compared to the E-OBS gridded European precipitation data set and a recent, spatially highly resolved precipitation data set for Austria (GPARD-6). For the drought events, the Standardized Precipitation Evapotranspiration Index (SPEI), soil moisture and runoff have been investigated. Varying the spectral nudging setup helps us to understand the 3D-processes during these events, but also to identify model deficiencies. Improving the simulation of such events in the past also improves the ability to assess a climate change signal in the near and far future.

  14. DeMO: An Ontology for Discrete-event Modeling and Simulation.

    PubMed

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-09-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community.

  15. DeMO: An Ontology for Discrete-event Modeling and Simulation

    PubMed Central

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-01-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community. PMID:22919114

  16. Simulation of a master-slave event set processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Comfort, J.C.

    1984-03-01

    Event set manipulation may consume a considerable amount of the computation time spent in performing a discrete-event simulation. One way of minimizing this time is to allow event set processing to proceed in parallel with the remainder of the simulation computation. The paper describes a multiprocessor simulation computer, in which all non-event set processing is performed by the principal processor (called the host). Event set processing is coordinated by a front end processor (the master) and actually performed by several other functionally identical processors (the slaves). A trace-driven simulation program modeling this system was constructed, and was run with trace output taken from two different simulation programs. Output from this simulation suggests that a significant reduction in run time may be realized by this approach. Sensitivity analysis was performed on the significant parameters to the system (number of slave processors, relative processor speeds, and interprocessor communication times). A comparison between actual and simulation run times for a one-processor system was used to assist in the validation of the simulation. 7 references.
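
    The event set operations being parallelized above can be illustrated with a minimal sequential pending-event set built on a binary heap. This sketch is only a single-processor stand-in for the host/master/slave design described in the record; the class and method names are ours.

        import heapq
        import itertools

        class EventSet:
            # Pending-event set: schedule future events and pop the earliest one.
            # The counter breaks ties so simultaneous events keep insertion order;
            # a full event-set processor would also support cancellation.
            def __init__(self):
                self._heap = []
                self._counter = itertools.count()

            def schedule(self, time, action):
                heapq.heappush(self._heap, (time, next(self._counter), action))

            def pop_next(self):
                time, _, action = heapq.heappop(self._heap)
                return time, action

            def __len__(self):
                return len(self._heap)

        es = EventSet()
        es.schedule(5.0, "arrival")
        es.schedule(2.5, "departure")
        es.schedule(2.5, "measurement")
        while es:
            print(es.pop_next())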

  17. Running Parallel Discrete Event Simulators on Sierra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, P. D.; Jefferson, D. R.

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  18. Tutorial in medical decision modeling incorporating waiting lines and queues using discrete event simulation.

    PubMed

    Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter

    2010-01-01

    In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques are used to evaluate systems that include queuing or waiting, for example, discrete event simulation. To include queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers get an understanding of queue characteristics, modeling features, and its strength. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities, to incorporate waiting lines and queues in the decision-analytic modeling example.
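
    As a toy counterpart to the waiting-line discussion above, the sketch below simulates a single-server clinic with exponential arrivals and service (an M/M/1-style system) and compares the simulated mean wait with the textbook value rho/(mu - lambda). The arrival and service rates are invented for illustration; this is not the tutorial's stent-placement model, and it uses a simple recursion rather than a full discrete event engine.

        import random

        def simulate_clinic(n_patients=200000, arrival_rate=0.9, service_rate=1.0, seed=7):
            # Single-server waiting line with exponential inter-arrival and service times.
            # Lindley-style recursion: a patient starts treatment once both the patient
            # has arrived and the server has finished with the previous patient.
            random.seed(seed)
            t, server_free_at, total_wait = 0.0, 0.0, 0.0
            for _ in range(n_patients):
                t += random.expovariate(arrival_rate)        # next arrival time
                start = max(t, server_free_at)
                total_wait += start - t                      # time spent waiting in the queue
                server_free_at = start + random.expovariate(service_rate)
            return total_wait / n_patients

        print("simulated mean wait:", round(simulate_clinic(), 2))
        print("M/M/1 theory rho/(mu - lambda):", round(0.9 / (1.0 - 0.9), 2))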

  19. Arcus end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Wilms, Joern; Guenther, H. Moritz; Dauser, Thomas; Huenemoerder, David P.; Ptak, Andrew; Smith, Randall; Arcus Team

    2018-01-01

    We present an overview of the end-to-end simulation environment that we are implementing as part of the Arcus phase A Study. With the Arcus simulator, we aim to model the imaging, detection, and event reconstruction properties of the spectrometer. The simulator uses a Monte Carlo ray-trace approach, projecting photons onto the Arcus focal plane from the silicon pore optic mirrors and critical-angle transmission gratings. We simulate the detection and read-out of the photons in the focal plane CCDs with software originally written for the eROSITA and Athena-WFI detectors; we include all relevant detector physics, such as charge splitting, and effects of the detector read-out, such as out of time events. The output of the simulation chain is an event list that closely resembles the data expected during flight. This event list is processed using a prototype event reconstruction chain for the order separation, wavelength calibration, and effective area calibration. The output is compatible with standard X-ray astronomical analysis software. During phase A, the end-to-end simulation approach is used to demonstrate the overall performance of the mission, including a full simulation of the calibration effort. Continued development during later phases of the mission will ensure that the simulator remains a faithful representation of the true mission capabilities, and will ultimately be used as the Arcus calibration model.

  20. Modeling the Historical Flood Events in France

    NASA Astrophysics Data System (ADS)

    Ali, Hani; Blaquière, Simon

    2017-04-01

    We will present the simulation results for different scenarios based on the flood model developed by the AXA Global P&C CAT Modeling team. The model uses a Digital Elevation Model (DEM) with 75 m resolution, a hydrographic system (DB Carthage), daily rainfall data from "Météo France", water levels from "HYDRO Banque", the French Hydrological Database (www.hydro.eaufrance.fr), for more than 1500 stations, a hydrological model from IRSTEA and an in-house hydraulic tool. In particular, the model re-simulates the most important and costly flood events that occurred during the past decade in France: we will present the re-simulated meteorological conditions since 1964 and estimate the insurance losses incurred on the current AXA portfolio of individual risks.

  1. Investigation of 2-stage meta-analysis methods for joint longitudinal and time-to-event data through simulation and real data application.

    PubMed

    Sudell, Maria; Tudur Smith, Catrin; Gueyffier, François; Kolamunnage-Dona, Ruwanthi

    2018-04-15

    Joint modelling of longitudinal and time-to-event data is often preferred over separate longitudinal or time-to-event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time-to-event outcomes. The joint modelling literature focuses mainly on the analysis of single studies with no methods currently available for the meta-analysis of joint model estimates from multiple studies. We propose a 2-stage method for meta-analysis of joint model estimates. These methods are applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta-analyses of separate longitudinal or time-to-event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Using the real dataset, similar results were obtained by using the separate and joint analyses. However, the simulation study indicated a benefit of use of joint rather than separate methods in a meta-analytic setting where association exists between the longitudinal and time-to-event outcomes. Where evidence of association between longitudinal and time-to-event outcomes exists, results from joint models over standalone analyses should be pooled in 2-stage meta-analyses. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
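
    The second stage of a 2-stage meta-analysis can be sketched as inverse-variance pooling of the per-study association estimates produced in stage one. The snippet below shows a fixed-effect version with made-up log hazard ratios; a random-effects variant would additionally estimate between-study variance, and nothing here reproduces the INDANA analysis.

        import numpy as np

        def fixed_effect_pool(estimates, std_errors):
            # Stage 2: inverse-variance (fixed-effect) pooling of per-study estimates.
            estimates = np.asarray(estimates, dtype=float)
            weights = 1.0 / np.asarray(std_errors, dtype=float) ** 2
            pooled = np.sum(weights * estimates) / np.sum(weights)
            pooled_se = np.sqrt(1.0 / np.sum(weights))
            return pooled, pooled_se

        # Hypothetical per-study association estimates (log hazard ratios) from stage-1 joint models.
        beta, se = fixed_effect_pool([0.21, 0.35, 0.28], [0.10, 0.08, 0.12])
        print(f"pooled log HR = {beta:.3f} (SE {se:.3f}), "
              f"95% CI [{beta - 1.96 * se:.3f}, {beta + 1.96 * se:.3f}]")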

  2. Optical eye simulator for laser dazzle events.

    PubMed

    Coelho, João M P; Freitas, José; Williamson, Craig A

    2016-03-20

    An optical simulator of the human eye and its application to laser dazzle events are presented. The simulator combines optical design software (ZEMAX) with a scientific programming language (MATLAB) and allows the user to implement and analyze a dazzle scenario using practical, real-world parameters. Contrary to conventional analytical glare analysis, this work uses ray tracing and the scattering model and parameters for each optical element of the eye. The theoretical background of each such element is presented in relation to the model. The overall simulator's calibration, validation, and performance analysis are achieved by comparison with a simpler model based upon CIE disability glare data. Results demonstrate that this kind of advanced optical eye simulation can be used to represent laser dazzle and has the potential to extend the range of applicability of analytical models.

  3. Response of the Antarctic Stratosphere to Warm Pool El Niño Events in the GEOS CCM

    NASA Technical Reports Server (NTRS)

    Hurwitz, Margaret M.; Song, In-Sun; Oman, Luke D.; Newman, Paul A.; Molod, Andrea M.; Frith, Stacey M.; Nielsen, J. Eric

    2011-01-01

    A new type of El Niño event has been identified in the last decade. During "warm pool" El Niño (WPEN) events, sea surface temperatures (SSTs) in the central equatorial Pacific are warmer than average. The El Niño signal propagates poleward and upward as large-scale atmospheric waves, causing unusual weather patterns and warming the polar stratosphere. In austral summer, observations show that the Antarctic lower stratosphere is several degrees (K) warmer during WPEN events than during the neutral phase of El Niño/Southern Oscillation (ENSO). Furthermore, the stratospheric response to WPEN events depends on the direction of tropical stratospheric winds: the Antarctic warming is largest when WPEN events are coincident with westward winds in the tropical lower and middle stratosphere, i.e., the westward phase of the quasi-biennial oscillation (QBO). Westward winds are associated with enhanced convection in the subtropics, and with increased poleward wave activity. In this paper, a new formulation of the Goddard Earth Observing System Chemistry-Climate Model, Version 2 (GEOS V2 CCM) is used to substantiate the observed stratospheric response to WPEN events. One simulation is driven by SSTs typical of a WPEN event, while another simulation is driven by ENSO neutral SSTs; both represent a present-day climate. Differences between the two simulations can be directly attributed to the anomalous WPEN SSTs. During WPEN events, relative to ENSO neutral, the model simulates the observed increase in poleward planetary wave activity in the South Pacific during austral spring, as well as the relative warming of the Antarctic lower stratosphere in austral summer. However, the modeled response to WPEN does not depend on the phase of the QBO. The modeled tropical wind oscillation does not extend far enough into the lower stratosphere and upper troposphere, likely explaining the model's insensitivity to the phase of the QBO during WPEN events.

  4. Multithreaded Stochastic PDES for Reactions and Diffusions in Neurons.

    PubMed

    Lin, Zhongwei; Tropper, Carl; Mcdougal, Robert A; Patoary, Mohammand Nazrul Ishlam; Lytton, William W; Yao, Yiping; Hines, Michael L

    2017-07-01

    Cells exhibit stochastic behavior when the number of molecules is small. Hence a stochastic reaction-diffusion simulator capable of working at scale can provide a more accurate view of molecular dynamics within the cell. This paper describes a parallel discrete event simulator, Neuron Time Warp-Multi Thread (NTW-MT), developed for the simulation of reaction diffusion models of neurons. To the best of our knowledge, this is the first parallel discrete event simulator oriented towards stochastic simulation of chemical reactions in a neuron. The simulator was developed as part of the NEURON project. NTW-MT is optimistic and thread-based, which attempts to capitalize on multi-core architectures used in high performance machines. It makes use of a multi-level queue for the pending event set and a single roll-back message in place of individual anti-messages to disperse contention and decrease the overhead of processing rollbacks. Global Virtual Time is computed asynchronously both within and among processes to get rid of the overhead for synchronizing threads. Memory usage is managed in order to avoid locking and unlocking when allocating and de-allocating memory and to maximize cache locality. We verified our simulator on a calcium buffer model. We examined its performance on a calcium wave model, comparing it to the performance of a process based optimistic simulator and a threaded simulator which uses a single priority queue for each thread. Our multi-threaded simulator is shown to achieve superior performance to these simulators. Finally, we demonstrated the scalability of our simulator on a larger CICR model and a more detailed CICR model.
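
    A minimal example of the stochastic chemistry such a simulator must handle is Gillespie's direct method applied to reversible calcium buffering, Ca + B <-> CaB. The sketch below is single-voxel and sequential, so it ignores diffusion and all of the parallel machinery described above; the rate constants and copy numbers are arbitrary.

        import numpy as np

        def gillespie_binding(ca=100, buf=80, cab=0, kon=0.005, koff=0.1, t_end=50.0, seed=3):
            # Gillespie direct method for reversible buffering, Ca + B <-> CaB,
            # in a single well-mixed voxel (no diffusion, no parallelism).
            rng = np.random.default_rng(seed)
            t = 0.0
            while t < t_end:
                a_bind = kon * ca * buf          # propensity of Ca + B -> CaB
                a_unbind = koff * cab            # propensity of CaB -> Ca + B
                a_total = a_bind + a_unbind
                if a_total == 0.0:
                    break
                t += rng.exponential(1.0 / a_total)
                if rng.uniform() * a_total < a_bind:
                    ca, buf, cab = ca - 1, buf - 1, cab + 1
                else:
                    ca, buf, cab = ca + 1, buf + 1, cab - 1
            return t, ca, buf, cab

        print("final (t, Ca, B, CaB):", gillespie_binding())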

  5. Heinrich events modeled in transient glacial simulations

    NASA Astrophysics Data System (ADS)

    Ziemen, Florian; Kapsch, Marie; Mikolajewicz, Uwe

    2017-04-01

    Heinrich events are among the most prominent events of climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet — climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under debate, and their climatic consequences are far from being fully understood. We address open questions by studying Heinrich events in a coupled ice sheet model (ISM) atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model generated internal variability. The framework consists of a northern hemisphere setup of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. To make these long simulations feasible, the atmosphere is accelerated by a factor of 10 relative to the other model components using a periodical-synchronous coupling technique. To disentangle effects of the Heinrich events and the deglaciation, we focus on the events occurring before the deglaciation. The modeled Heinrich events show a peak ice discharge of about 0.05 Sv and raise the sea level by 2.3 m on average. The resulting surface water freshening reduces the Atlantic meridional overturning circulation and ocean heat release. The reduction in ocean heat release causes a sub-surface warming and decreases the air temperature and precipitation regionally and downstream into Eurasia. The surface elevation decrease of the ice sheet enhances moisture transport onto the ice sheet and thus increases precipitation over the Hudson Bay area, thereby accelerating the recovery after an event.

  6. Software engineering and simulation

    NASA Technical Reports Server (NTRS)

    Zhang, Shou X.; Schroer, Bernard J.; Messimer, Sherri L.; Tseng, Fan T.

    1990-01-01

    This paper summarizes the development of several automatic programming systems for discrete event simulation. Emphasis is given on the model development, or problem definition, and the model writing phases of the modeling life cycle.

  7. Weather and extremes in the last Millennium - a challenge for climate modelling

    NASA Astrophysics Data System (ADS)

    Raible, Christoph C.; Blumer, Sandro R.; Gomez-Navarro, Juan J.; Lehner, Flavio

    2015-04-01

    Changes in the climate mean state are expected to influence society, but the socio-economic sensitivity to extreme events might be even more severe. Whether or not the current frequency and severity of extreme events is a unique characteristic of anthropogenic-driven climate change can be assessed by putting the observed changes in a long-term perspective. In doing so, early instrumental series and proxy archives are a rich source for investigating extreme events, in particular during the last millennium, yet they suffer from spatial and temporal scarcity. Therefore, simulations with coupled general circulation models (GCMs) could fill such gaps and help in deepening our process understanding. In this study, an overview of past and current efforts as well as challenges in modelling paleo weather and extreme events is presented. Using simulations of the last millennium we investigate extreme midlatitude cyclone characteristics, precipitation, and their connection to large-scale atmospheric patterns in the North Atlantic European region. In cold climate states such as the Maunder Minimum, the North Atlantic Oscillation (NAO) is found to be predominantly in its negative phase. In this sense, simulations of different models agree with proxy findings for this period. However, some proxy data available for this period suggest an increase in storminess, which could be interpreted as a positive phase of the NAO - a superficial contradiction. The simulated cyclones are partly reduced over Europe, which is consistent with the aforementioned negative phase of the NAO. However, as the meridional temperature gradient is increased during this period - which constitutes a source of low-level baroclinicity - they also intensify. This example illustrates how model simulations could be used to improve our proxy interpretation and to gain additional process understanding. Nevertheless, there are also limitations associated with climate modeling efforts to simulate the last millennium. In particular, these models still struggle to properly simulate atmospheric blocking events, an important dynamical feature for dry conditions during summer. Finally, new and promising ways of improving past climate modelling are briefly introduced. In particular, the use of dynamical downscaling is a powerful tool to bridge the gap between the coarsely resolved GCMs and characteristics of the regional climate, which is potentially recorded in proxy archives. In particular, the representation of extreme events could be improved by dynamical downscaling as processes are better resolved than in GCMs.

  8. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  9. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially the optimization of signal flow in a security company. Simul8 software was selected as the modeling tool; it supports process modeling based on discrete event simulation and enables the creation of a visual model of production and distribution processes.

  10. How Unusual were Hurricane Harvey's Rains?

    NASA Astrophysics Data System (ADS)

    Emanuel, K.

    2017-12-01

    We apply an advanced technique for hurricane risk assessment to evaluate the probability of hurricane rainfall of Harvey's magnitude. The technique embeds a detailed computational hurricane model in the large-scale conditions represented by climate reanalyses and by climate models. We simulate 3700 hurricane events affecting the state of Texas, from each of three climate reanalyses spanning the period 1980-2016, and 2000 events from each of six climate models for each of two periods: the period 1981-2000 from historical simulations, and the period 2081-2100 from future simulations under Representative Concentration Pathway (RCP) 8.5. On the basis of these simulations, we estimate that hurricane rain of Harvey's magnitude in the state of Texas would have had an annual probability of 0.01 in the late twentieth century, and will have an annual probability of 0.18 by the end of this century, with remarkably small scatter among the six climate models downscaled. If the event frequency is changing linearly over time, this would yield an annual probability of 0.06 in 2017.
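
    The quoted 2017 value follows from the stated linearity assumption; anchoring the two probabilities at the period midpoints (our assumption, taken here as 1990 and 2090) gives:

        # Linear change in annual event probability between the two simulated periods,
        # anchored (our assumption) at the period midpoints 1990 and 2090.
        p_1990, p_2090 = 0.01, 0.18
        p_2017 = p_1990 + (p_2090 - p_1990) * (2017 - 1990) / (2090 - 1990)
        print(round(p_2017, 2))   # 0.06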

  11. Diagnosing added value of convection-permitting regional models using precipitation event identification and tracking

    NASA Astrophysics Data System (ADS)

    Chang, W.; Wang, J.; Marohnic, J.; Kotamarthi, V. R.; Moyer, E. J.

    2017-12-01

    We use a novel rainstorm identification and tracking algorithm (Chang et al 2016) to evaluate the effects of using resolved convection on improving how faithfully high-resolution regional simulations capture precipitation characteristics. The identification and tracking algorithm allocates all precipitation to individual rainstorms, including low-intensity events with complicated features, and allows us to decompose changes or biases in total mean precipitation into their causes: event size, intensity, number, and duration. It allows a lower threshold for tracking, so it captures nearly all rainfall and improves tracking, so that events that are clearly meteorologically related are tracked across lifespans of up to days. We evaluate a series of dynamically downscaled simulations of the summertime United States at 12 and 4 km under different model configurations, and find that resolved convection offers the largest gains in reducing biases in precipitation characteristics, especially in event size. Simulations with parametrized convection produce event sizes 80-220% too large in extent; with resolved convection the bias is reduced to 30%. The identification and tracking algorithm also allows us to demonstrate that the diurnal cycle in rainfall stems not from temporal variation in the production of new events but from diurnal fluctuations in rainfall from existing events. We show further that model errors in the diurnal cycle are best represented as additive offsets that differ by time of day, and again that convection-permitting simulations are most efficient in reducing these additive biases.
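
    A generic version of the identification step can be sketched with connected-component labelling of a precipitation field; this is not the Chang et al. (2016) algorithm itself, and the threshold and synthetic field below are ours. Tracking would additionally link overlapping labels between consecutive time steps.

        import numpy as np
        from scipy import ndimage

        def identify_rainstorms(precip, threshold=0.1):
            # Label contiguous wet regions; each labelled patch is treated as one event.
            wet = precip > threshold
            labels, n_events = ndimage.label(wet)
            index = np.arange(1, n_events + 1)
            sizes = ndimage.sum(wet, labels, index=index)             # event size in grid cells
            intensities = ndimage.mean(precip, labels, index=index)   # mean intensity per event
            return n_events, sizes, intensities

        rng = np.random.default_rng(0)
        field = rng.gamma(shape=0.3, scale=2.0, size=(100, 100))      # toy precipitation field, mm
        n_events, sizes, intensities = identify_rainstorms(field)
        print(n_events, "events; largest covers", int(max(sizes)), "cells")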

  12. Simulating and Forecasting Flooding Events in the City of Jeddah, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Ghostine, Rabih; Viswanadhapalli, Yesubabu; Hoteit, Ibrahim

    2014-05-01

    Metropolitan cities in the Kingdom of Saudi Arabia, such as Jeddah and Riyadh, are more frequently experiencing flooding events caused by strong convective storms that produce intense precipitation over a short span of time. The flooding in the city of Jeddah in November 2009 was described by civil defense officials as the worst in 27 years. As of January 2010, 150 people were reported killed and more than 350 were missing. Another flooding event, less damaging but comparably spectacular, occurred one year later (Jan 2011) in Jeddah. Anticipating floods before they occur could minimize human and economic losses through the implementation of appropriate protection, provision and rescue plans. We have developed a coupled hydro-meteorological model for simulating and predicting flooding events in the city of Jeddah. We use the Weather Research and Forecasting (WRF) model, assimilating all available data in the Jeddah region, for simulating the storm events in Jeddah. The resulting rainfall is then used at 10-minute intervals to feed an advanced numerical shallow water model that has been discretized on an unstructured grid using different numerical schemes based on finite element or finite volume techniques. The model was integrated on a high-resolution grid with cell sizes varying between 0.5 m within the streets of Jeddah and 500 m outside the city. This contribution will present the flooding simulation system and the simulation results, focusing on the comparison of the different numerical schemes in terms of accuracy and computational efficiency.

  13. Core discrete event simulation model for the evaluation of health care technologies in major depressive disorder.

    PubMed

    Vataire, Anne-Lise; Aballéa, Samuel; Antonanzas, Fernando; Roijen, Leona Hakkaart-van; Lam, Raymond W; McCrone, Paul; Persson, Ulf; Toumi, Mondher

    2014-03-01

    A review of existing economic models in major depressive disorder (MDD) highlighted the need for models with longer time horizons that also account for heterogeneity in treatment pathways between patients. A core discrete event simulation model was developed to estimate health and cost outcomes associated with alternative treatment strategies. This model simulated short- and long-term clinical events (partial response, remission, relapse, recovery, and recurrence), adverse events, and treatment changes (titration, switch, addition, and discontinuation) over up to 5 years. Several treatment pathways were defined on the basis of fictitious antidepressants with three levels of efficacy, tolerability, and price (low, medium, and high) from first line to third line. The model was populated with input data from the literature for the UK setting. Model outputs include time in different health states, quality-adjusted life-years (QALYs), and costs from National Health Service and societal perspectives. The codes are open source. Predicted costs and QALYs from this model are within the range of results from previous economic evaluations. The largest cost components from the payer perspective were physician visits and hospitalizations. Key parameters driving the predicted costs and QALYs were utility values, effectiveness, and frequency of physician visits. Differences in QALYs and costs between two strategies with different effectiveness increased approximately twofold when the time horizon increased from 1 to 5 years. The discrete event simulation model can provide a more comprehensive evaluation of different therapeutic options in MDD, compared with existing Markov models, and can be used to compare a wide range of health care technologies in various groups of patients with MDD. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  14. A model of nitrous oxide evolution from soil driven by rainfall events. I - Model structure and sensitivity. II - Model applications

    NASA Technical Reports Server (NTRS)

    Changsheng, LI; Frolking, Steve; Frolking, Tod A.

    1992-01-01

    Simulations of N2O and CO2 emissions from soils were conducted with a rain-event driven, process-oriented model (DNDC) of nitrogen and carbon cycling processes in soils. The magnitude and trends of simulated N2O (or N2O + N2) and CO2 emissions were consistent with the results obtained in field experiments. The successful simulation of these emissions from the range of soil types examined demonstrates that the DNDC will be a useful tool for the study of linkages among climate, soil-atmosphere interactions, land use, and trace gas fluxes.

  15. A New ’Availability-Payment’ Model for Pricing Performance-Based Logistics Contracts

    DTIC Science & Technology

    2014-04-30

    maintenance network connected to the inventory and Original Equipment Manufacturer (OEM) used in this paper. The input to the Petri net in Figure 2 is the...contract structures. The model developed in this paper uses an affine controller to drive a discrete event simulator ( Petri net ) that produces...discrete event simulator ( Petri net ) that produces availability and cost measures. The model is used to explore the optimum availability assessment

  16. Modeling and Simulation of Metallurgical Process Based on Hybrid Petri Net

    NASA Astrophysics Data System (ADS)

    Ren, Yujuan; Bao, Hong

    2016-11-01

    In order to achieve the goals of energy saving and emission reduction of iron and steel enterprises, an increasing number of modeling and simulation technologies are used to research and analyse the metallurgical production process. In this paper, the basic principle of the Hybrid Petri net is used to model and analyse the Metallurgical Process. Firstly, the definition of the Hybrid Petri Net System of Metallurgical Process (MPHPNS) and its modeling theory are proposed. Secondly, the model of MPHPNS based on material flow is constructed. The dynamic flow of materials and the real-time change of each technological state in the metallurgical process are simulated vividly using this model. The simulation process can implement interaction between the continuous event dynamic system and the discrete event dynamic system at the same level, and plays a positive role in production decision-making.
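
    The discrete core of a Petri net, places holding tokens and transitions that fire when their input places are sufficiently marked, can be sketched compactly; hybrid nets such as the one above add continuous places and flow rates on top of this. The toy steelmaking fragment below, including the place and transition names, is our own illustration and not the MPHPNS model.

        class PetriNet:
            # Minimal discrete Petri net: places hold token counts; a transition fires
            # only when every input place holds at least the required number of tokens.
            def __init__(self, marking):
                self.marking = dict(marking)
                self.transitions = {}

            def add_transition(self, name, inputs, outputs):
                self.transitions[name] = (inputs, outputs)

            def enabled(self, name):
                inputs, _ = self.transitions[name]
                return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

            def fire(self, name):
                inputs, outputs = self.transitions[name]
                assert self.enabled(name), f"{name} is not enabled"
                for p, w in inputs.items():
                    self.marking[p] -= w
                for p, w in outputs.items():
                    self.marking[p] = self.marking.get(p, 0) + w

        # Toy fragment of a steelmaking flow: charge a converter, then tap liquid steel.
        net = PetriNet({"hot_metal": 2, "converter_free": 1, "steel": 0})
        net.add_transition("charge", {"hot_metal": 1, "converter_free": 1}, {"converter_busy": 1})
        net.add_transition("tap", {"converter_busy": 1}, {"steel": 1, "converter_free": 1})
        net.fire("charge"); net.fire("tap")
        print(net.marking)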

  17. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.

  18. Producing physically consistent and bias free extreme precipitation events over the Switzerland: Bridging gaps between meteorology and impact models

    NASA Astrophysics Data System (ADS)

    José Gómez-Navarro, Juan; Raible, Christoph C.; Blumer, Sandro; Martius, Olivia; Felder, Guido

    2016-04-01

    Extreme precipitation episodes, although rare, are natural phenomena that can threaten human activities, especially in areas densely populated such as Switzerland. Their relevance demands the design of public policies that protect public assets and private property. Therefore, increasing the current understanding of such exceptional situations is required, i.e. the climatic characterisation of their triggering circumstances, severity, frequency, and spatial distribution. Such increased knowledge shall eventually lead us to produce more reliable projections about the behaviour of these events under ongoing climate change. Unfortunately, the study of extreme situations is hampered by the short instrumental record, which precludes a proper characterization of events with return period exceeding few decades. This study proposes a new approach that allows studying storms based on a synthetic, but physically consistent database of weather situations obtained from a long climate simulation. Our starting point is a 500-yr control simulation carried out with the Community Earth System Model (CESM). In a second step, this dataset is dynamically downscaled with the Weather Research and Forecasting model (WRF) to a final resolution of 2 km over the Alpine area. However, downscaling the full CESM simulation at such high resolution is infeasible nowadays. Hence, a number of case studies are selected beforehand. This selection is carried out by examining the precipitation averaged in an area encompassing Switzerland in the ESM. Using a hydrological criterion, precipitation is accumulated in several temporal windows: 1 day, 2 days, 3 days, 5 days and 10 days. The 4 most extreme events in each category and season are selected, leading to a total of 336 days to be simulated. The simulated events are affected by systematic biases that have to be accounted for before this data set can be used as input in hydrological models. Thus, quantile mapping is used to remove such biases. For this task, a 20-yr high-resolution control simulation is carried out. The extreme events belong to this distribution, and can be mapped onto the distribution of precipitation obtained from a gridded product of precipitation provided by MeteoSwiss. This procedure yields bias-free extreme precipitation events which serve as input for hydrological models that eventually produce a simulated, yet physically consistent flooding event. Thereby, the proposed methodology guarantees consistency with the underlying physics of extreme events, and reproduces plausible impacts of up to one-in-five-centuries situations.
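
    The quantile mapping step can be sketched as an empirical quantile-to-quantile transfer between a model reference sample and an observed reference sample. The snippet below is illustrative only: the gamma-distributed toy data and the dry-bias factor are ours, and values beyond the calibration range are simply clipped to the highest observed quantile, whereas extreme-event work typically adds a parametric tail fit.

        import numpy as np

        def quantile_map(values, model_ref, obs_ref):
            # Empirical quantile mapping: find each value's quantile in the model
            # reference sample and return the observed value at the same quantile.
            probs = np.interp(values, np.sort(model_ref), np.linspace(0.0, 1.0, len(model_ref)))
            return np.quantile(obs_ref, probs)

        rng = np.random.default_rng(0)
        obs_ref = rng.gamma(shape=2.0, scale=5.0, size=5000)            # "observed" daily precipitation, mm
        model_ref = 0.7 * rng.gamma(shape=2.0, scale=5.0, size=5000)    # model reference with a dry bias
        event_totals = np.array([40.0, 55.0, 70.0])                     # simulated extreme-event totals, mm
        print(np.round(quantile_map(event_totals, model_ref, obs_ref), 1))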

  19. Evaluation of uncertainty in capturing the spatial variability and magnitudes of extreme hydrological events for the uMngeni catchment, South Africa

    NASA Astrophysics Data System (ADS)

    Kusangaya, Samuel; Warburton Toucher, Michele L.; van Garderen, Emma Archer

    2018-02-01

    Downscaled General Circulation Model (GCM) output is used to forecast climate change and provide information used as input for hydrological modelling. Given that our understanding of climate change points towards an increasing frequency, timing and intensity of extreme hydrological events, there is a need to assess the ability of downscaled GCMs to capture these extreme hydrological events. Extreme hydrological events play a significant role in regulating the structure and function of rivers and associated ecosystems. In this study, the Indicators of Hydrologic Alteration (IHA) method was adapted to assess the ability of simulated streamflow (using downscaled GCMs (dGCMs)) to capture extreme river dynamics (high and low flows), as compared to streamflow simulated using historical climate data from 1960 to 2000. The ACRU hydrological model was used for simulating streamflow for the 13 water management units of the uMngeni Catchment, South Africa. Statistically downscaled climate models obtained from the Climate System Analysis Group at the University of Cape Town were used as input for the ACRU Model. Results indicated that high flows and extreme high flows (one in ten year high flows/large flood events) were poorly represented both in terms of timing, frequency and magnitude. Simulated streamflow using dGCM data also captures more low flows and extreme low flows (one in ten year lowest flows) than that captured in streamflow simulated using historical climate data. The overall conclusion was that although dGCM output can reasonably be used to simulate overall streamflow, it performs poorly when simulating extreme high and low flows. Streamflow simulation from dGCMs must thus be used with caution in hydrological applications, particularly for design hydrology, as extreme high and low flows are still poorly represented. This arguably calls for the further improvement of downscaling techniques in order to generate climate data more relevant and useful for hydrological applications such as design hydrology. Nevertheless, the availability of downscaled climatic output provides the potential for exploring climate model uncertainties in different hydro-climatic regions at local scales, where forcing data is often less accessible but more accurate at finer spatial scales and with adequate spatial detail.
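
    One simple reading of the "one in ten year" high and low flows used above is an empirical quantile of the annual maxima and minima, sketched below on synthetic daily flows. This is a simplification for illustration under our own assumptions and is not the IHA configuration used in the study.

        import numpy as np

        def one_in_ten_year_flows(daily_flow, days_per_year=365):
            # Split the record into whole years, then take the 90th percentile of the
            # annual maxima (high flow) and the 10th percentile of the annual minima (low flow).
            n_years = len(daily_flow) // days_per_year
            years = np.asarray(daily_flow[:n_years * days_per_year]).reshape(n_years, days_per_year)
            high = np.quantile(years.max(axis=1), 0.9)
            low = np.quantile(years.min(axis=1), 0.1)
            return high, low

        rng = np.random.default_rng(0)
        flow = rng.lognormal(mean=2.0, sigma=0.8, size=40 * 365)   # 40 years of synthetic daily flow
        high, low = one_in_ten_year_flows(flow)
        print(f"1-in-10-year high flow ~ {high:.1f}, low flow ~ {low:.2f}")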

  20. The added value of convection permitting simulations of extreme precipitation events over the eastern Mediterranean

    NASA Astrophysics Data System (ADS)

    Zittis, G.; Bruggeman, A.; Camera, C.; Hadjinicolaou, P.; Lelieveld, J.

    2017-07-01

    Climate change is expected to substantially influence precipitation amounts and distribution. To improve simulations of extreme rainfall events, we analyzed the performance of different convection and microphysics parameterizations of the WRF (Weather Research and Forecasting) model at very high horizontal resolutions (12, 4 and 1 km). Our study focused on the eastern Mediterranean climate change hot-spot. Five extreme rainfall events over Cyprus were identified from observations and were dynamically downscaled from the ERA-Interim (EI) dataset with WRF. We applied an objective ranking scheme, using a 1-km gridded observational dataset over Cyprus and six different performance metrics, to investigate the skill of the WRF configurations. We evaluated the rainfall timing and amounts for the different resolutions, and discussed the observational uncertainty over the particular extreme events by comparing three gridded precipitation datasets (E-OBS, APHRODITE and CHIRPS). Simulations with WRF capture rainfall over the eastern Mediterranean reasonably well for three of the five selected extreme events. For these three cases, the WRF simulations improved the ERA-Interim data, which strongly underestimate the rainfall extremes over Cyprus. The best model performance is obtained for the January 1989 event, simulated with an average bias of 4% and a modified Nash-Sutcliffe of 0.72 for the 5-member ensemble of the 1-km simulations. We found overall added value for the convection-permitting simulations, especially over regions of high elevation. Interestingly, for some cases the intermediate 4-km nest was found to outperform the 1-km simulations for low-elevation coastal parts of Cyprus. Finally, we identified significant and inconsistent discrepancies between the three state-of-the-art gridded precipitation datasets for the tested events, highlighting the observational uncertainty in the region.
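
    The bias and (unmodified) Nash-Sutcliffe metrics mentioned above can be computed in a few lines; the gauge and model totals below are invented, and the study's six-metric ranking scheme and modified Nash-Sutcliffe are not reproduced here.

        import numpy as np

        def percent_bias(sim, obs):
            # Total simulated rainfall relative to total observed rainfall, in percent.
            return 100.0 * (np.sum(sim) - np.sum(obs)) / np.sum(obs)

        def nash_sutcliffe(sim, obs):
            # Standard Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than the observed mean.
            sim, obs = np.asarray(sim, float), np.asarray(obs, float)
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        obs = np.array([0.0, 2.0, 15.0, 40.0, 22.0, 5.0])    # hypothetical gauge totals, mm
        sim = np.array([0.5, 3.0, 12.0, 44.0, 20.0, 6.0])    # hypothetical model totals, mm
        print("bias:", round(percent_bias(sim, obs), 1), "%  NSE:", round(nash_sutcliffe(sim, obs), 2))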

  1. Toward Improving Predictability of Extreme Hydrometeorological Events: the Use of Multi-scale Climate Modeling in the Northern High Plains

    NASA Astrophysics Data System (ADS)

    Munoz-Arriola, F.; Torres-Alavez, J.; Mohamad Abadi, A.; Walko, R. L.

    2014-12-01

    Our goal is to investigate possible sources of predictability of hydrometeorological extreme events in the Northern High Plains (NHP). Hydrometeorological extreme events are considered the most costly natural phenomena. Water deficits and surpluses highlight how the water-climate interdependence becomes crucial in areas where single activities, such as agriculture in the NHP, drive economies. Although we recognize the water-climate interdependence and the regulatory role that human activities play, we still grapple with identifying what sources of predictability could be added to flood and drought forecasts. To identify the benefit of multi-scale climate modeling and the role of initial conditions in flood and drought predictability in the NHP, we use the Ocean Land Atmospheric Model (OLAM). OLAM is characterized by a dynamic core with a global geodesic grid with hexagonal (and variably refined) mesh cells, a finite volume discretization of the fully compressible Navier-Stokes equations, and a cut-grid cell method for topography that reduces error in gradient computation and anomalous vertical dispersion. Our hypothesis is that wet conditions will drive OLAM's simulations of precipitation towards wetter conditions, affecting both flood and drought forecasts. To test this hypothesis we simulate precipitation during identified historical flood events followed by drought events in the NHP (i.e., the years 2011-2012). We initialized OLAM with CFS data 1-10 days prior to a flooding event (as initial conditions) to explore (1) short-term, high-resolution and (2) long-term, coarse-resolution simulations of flood and drought events, respectively. While floods are assessed during a maximum of 15-day refined-mesh simulations, drought is evaluated during the following 15 months. Simulated precipitation will be compared with the Sub-continental Observation Dataset, a gridded 1/16th-degree resolution dataset obtained from climatological stations in Canada, the US, and Mexico. This in-progress research will ultimately contribute to integrating the OLAM and VIC models and improving predictability of extreme hydrometeorological events.

  2. Generating extreme weather event sets from very large ensembles of regional climate models

    NASA Astrophysics Data System (ADS)

    Massey, Neil; Guillod, Benoit; Otto, Friederike; Allen, Myles; Jones, Richard; Hall, Jim

    2015-04-01

    Extreme events can have large impacts on societies and are therefore being increasingly studied. In particular, climate change is expected to impact the frequency and intensity of these events. However, a major limitation when investigating extreme weather events is that, by definition, only few events are present in observations. A way to overcome this issue is to use large ensembles of model simulations. Using the volunteer distributed computing (VDC) infrastructure of weather@home [1], we run a very large number (10'000s) of RCM simulations over the European domain at a resolution of 25 km, with an improved land-surface scheme, nested within a free-running GCM. Using VDC allows many thousands of climate model runs to be computed. Using observations for the GCM boundary forcings, we can run historical "hindcast" simulations over the past 100 to 150 years. This allows us, due to the chaotic variability of the atmosphere, to ascertain how likely an extreme event was, given the boundary forcings, and to derive synthetic event sets. The events in these sets did not actually occur in the observed record but could have occurred given the boundary forcings, with an associated probability. The event sets contain time series of fields of meteorological variables that allow impact modellers to assess the loss the event would incur. Projections of events into the future are achieved by modelling projections of the sea-surface temperature (SST) and sea-ice boundary forcings, combining the variability of the SST in the observed record with a range of warming signals derived from the varying responses of SSTs in the CMIP5 ensemble to elevated greenhouse gas (GHG) emissions in three RCP scenarios. Simulating the future with a range of SST responses, as well as a range of RCP scenarios, allows us to assess the uncertainty in the response to elevated GHG emissions that occurs in the CMIP5 ensemble. Numerous extreme weather events can be studied. Firstly, we analyse droughts in Europe with a focus on the UK in the context of the project MaRIUS (Managing the Risks, Impacts and Uncertainties of droughts and water Scarcity). We analyse the characteristics of the simulated droughts and the underlying physical mechanisms, and assess droughts observed in the recent past. Secondly, we analyse windstorms by applying an objective storm-identification and tracking algorithm to the ensemble output, isolating those storms that cause high loss and building a probabilistic storm catalogue, which can be used by impact modellers, insurance loss modellers, etc. Finally, we combine the model output with a heat-stress index to determine the detrimental effect of heat waves in Europe on health. [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.

  3. Design of virtual simulation experiment based on key events

    NASA Astrophysics Data System (ADS)

    Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu

    2018-06-01

    Considering the complex content and lack of guidance in virtual simulation experiments, the key event technology from VR narrative theory was introduced into virtual simulation experiments to enhance the fidelity and vividness of the process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging," based on biological morphology, was taken as an example, and many objects, behaviors and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure that the experimental process is complete and effective.

  4. Evaluation of estimation methods and power of tests of discrete covariates in repeated time-to-event parametric models: application to Gaucher patients treated by imiglucerase.

    PubMed

    Vigan, Marie; Stirnemann, Jérôme; Mentré, France

    2014-05-01

    Analysis of repeated time-to-event data is increasingly performed in pharmacometrics using parametric frailty models. The aims of this simulation study were (1) to assess the estimation performance of the Stochastic Approximation Expectation Maximization (SAEM) algorithm in MONOLIX and the Adaptive Gaussian Quadrature (AGQ) and Laplace algorithms in PROC NLMIXED of SAS, and (2) to evaluate the properties of tests of a dichotomous covariate on the occurrence of events. The simulation setting is inspired by an analysis of the occurrence of bone events after the initiation of treatment with imiglucerase in patients with Gaucher Disease (GD). We simulated repeated events with an exponential model and various dropout rates: none, low, or high. Several values of the baseline hazard model, variability, number of subjects, and effect of covariate were studied. For each scenario, 100 datasets were simulated for estimation performance and 500 for test performance. We evaluated estimation performance through relative bias and relative root mean square error (RRMSE). We studied the properties of the Wald and likelihood ratio tests (LRT). We used these methods to analyze the occurrence of bone events in patients with GD after starting enzyme replacement therapy. SAEM with three chains and the AGQ algorithm provided good estimates of the parameters, much better than SAEM with one chain and Laplace, which often provided poor estimates. Despite a small number of repeated events, SAEM with three chains and AGQ gave small biases and RRMSE. Type I errors were close to 5%, and power varied as expected for SAEM with three chains and AGQ. The probability of having at least one event under treatment was 19.1%.
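
    The two estimation-performance metrics used above, relative bias and RRMSE, are easy to reproduce in a toy setting. The sketch below simulates replicate datasets from a deliberately simplified data-generating model (constant-hazard repeated events with no frailty and a naive rate estimator); it is a stand-in for the SAEM/AGQ/Laplace comparison, not the study's actual simulation design, and all parameter values are assumptions.

      # Relative bias and RRMSE of a rate estimate across replicate simulated datasets.
      import numpy as np

      rng = np.random.default_rng(1)
      true_lambda = 0.4          # events per subject-year (assumed)
      follow_up = 2.0            # years of observation per subject (assumed)
      n_subjects, n_replicates = 100, 500

      estimates = np.empty(n_replicates)
      for r in range(n_replicates):
          # events per subject over follow-up ~ Poisson(lambda * t)
          events = rng.poisson(true_lambda * follow_up, size=n_subjects)
          estimates[r] = events.sum() / (n_subjects * follow_up)   # MLE of the rate

      rel_bias = (estimates.mean() - true_lambda) / true_lambda
      rrmse = np.sqrt(np.mean((estimates - true_lambda) ** 2)) / true_lambda
      print(f"relative bias = {100 * rel_bias:.2f}%, RRMSE = {100 * rrmse:.2f}%")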

  5. Relation of Parallel Discrete Event Simulation algorithms with physical models

    NASA Astrophysics Data System (ADS)

    Shchur, L. N.; Shchur, L. V.

    2015-09-01

    We extend the concept of local simulation times in parallel discrete event simulation (PDES) in order to take into account the architecture of current hardware and software in high-performance computing. We briefly review previous research on the mapping of PDES onto physical problems, and emphasise how physical results may help to predict the behaviour of parallel algorithms.

  6. Can discrete event simulation be of use in modelling major depression?

    PubMed Central

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-01-01

    Background Depression is among the major contributors to worldwide disease burden and adequate modelling requires a framework designed to depict real world disease progression, as well as its economic implications, as closely as possible. Objectives In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. Methods We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. Results The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model; to do so would also require defining multiple health states, which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order, while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). Conclusion DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes. PMID:17147790

  7. Can discrete event simulation be of use in modelling major depression?

    PubMed

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-12-05

    Depression is among the major contributors to worldwide disease burden and adequate modelling requires a framework designed to depict real world disease progression, as well as its economic implications, as closely as possible. In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model; to do so would also require defining multiple health states, which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order, while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes.
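
    The core DES idea described in these two records, individual patients with their own attributes moving from event to event in time order, can be sketched in a few lines. The event types, rates, and attribute effects below are invented for illustration and are not the authors' depression model; other attributes (gender, treatment attitude, adverse events) would enter the rate functions in the same way.

      # Toy discrete event simulation of one patient's depressive episodes and remissions.
      import random

      random.seed(0)
      HORIZON = 5.0   # years of follow-up (assumed)


      def episode_rate(prior_episodes, age):
          # hazard of a new episode rises with disease history and age (illustrative only)
          return 0.4 * (1 + 0.3 * prior_episodes) * (1.2 if age > 60 else 1.0)


      def simulate_patient(age):
          t, episodes, history = 0.0, 0, []
          while True:
              t += random.expovariate(episode_rate(episodes, age))   # time to next episode
              if t > HORIZON:
                  break
              episodes += 1
              history.append((round(t, 2), "episode"))
              t += random.expovariate(6.0)   # episode duration until remission (assumed mean ~2 months)
              if t > HORIZON:
                  break
              history.append((round(t, 2), "remission"))
          return history


      print(simulate_patient(age=65))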

  8. Comparison of thunderstorm simulations from WRF-NMM and WRF-ARW models over East Indian Region.

    PubMed

    Litta, A J; Mary Ididcula, Sumam; Mohanty, U C; Kiran Prasad, S

    2012-01-01

    Thunderstorms are typical mesoscale systems dominated by intense convection. Mesoscale models are essential for the accurate prediction of such high-impact weather events. In the present study, an attempt has been made to compare simulations of three thunderstorm events using the NMM and ARW cores of the WRF system and to validate the model results against observations. Both models performed well in capturing stability indices, which are indicators of severe convective activity. Comparison of model-simulated radar reflectivity imagery with observations revealed that the NMM model simulated the propagation of the squall line well, while the squall line movement was slow in ARW. The model-simulated spatial plots of cloud top temperature show that the NMM model better captured the genesis, intensification, and propagation of the thunder squall than the ARW model. The statistical analysis of rainfall indicates better performance of NMM than ARW. Comparison of model-simulated thunderstorm-affected parameters with observations showed that NMM performed better than ARW in capturing the sharp rise in humidity and drop in temperature. This suggests that the NMM model has the potential to provide unique and valuable information for severe thunderstorm forecasters over the east Indian region.

  9. Validation of a DICE Simulation Against a Discrete Event Simulation Implemented Entirely in Code.

    PubMed

    Möller, Jörgen; Davis, Sarah; Stevenson, Matt; Caro, J Jaime

    2017-10-01

    Modeling is an essential tool for health technology assessment, and various techniques for conceptualizing and implementing such models have been described. Recently, a new method has been proposed, the discretely integrated condition event (DICE) simulation, which enables frequently employed approaches to be specified using a common, simple structure that can be entirely contained and executed within widely available spreadsheet software. To assess whether a DICE simulation provides equivalent results to an existing discrete event simulation, a comparison was undertaken. A model of osteoporosis and its management, programmed entirely in Visual Basic for Applications and made public by the National Institute for Health and Care Excellence (NICE) Decision Support Unit, was downloaded and used to guide construction of its DICE version in Microsoft Excel®. The DICE model was then run using the same inputs and settings, and the results were compared. The DICE version produced results that are nearly identical to the original ones, with differences that would not affect the decision direction of the incremental cost-effectiveness ratios (<1% discrepancy), despite the stochastic nature of the models. The main limitation of the simple DICE version is its slow execution speed. DICE simulation did not alter the results and, thus, should provide a valid way to design and implement decision-analytic models without requiring specialized software or custom programming. Additional efforts need to be made to speed up execution.
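
    As a rough, hedged illustration of the general DICE idea (a model specified as conditions, values that persist over time, plus events that change those conditions and schedule further events), the sketch below runs a tiny condition/event loop. The conditions, event types, costs and utility value are invented and bear no relation to the NICE osteoporosis model discussed above.

      # Toy "conditions + events" loop in the spirit of a DICE specification.
      import heapq

      conditions = {"time": 0.0, "alive": True, "cost": 0.0, "qaly": 0.0}
      event_queue = [(1.0, "annual_checkup"), (7.5, "death")]
      heapq.heapify(event_queue)

      while event_queue and conditions["alive"]:
          t, name = heapq.heappop(event_queue)
          # accrue QALYs for the interval just elapsed (utility assumed constant at 0.8)
          conditions["qaly"] += 0.8 * (t - conditions["time"])
          conditions["time"] = t
          if name == "annual_checkup":
              conditions["cost"] += 120.0                              # visit cost (assumed)
              heapq.heappush(event_queue, (t + 1.0, "annual_checkup"))
          elif name == "death":
              conditions["alive"] = False

      print(conditions)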

  10. TRACE Model for Simulation of Anticipated Transients Without Scram in a BWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng L. Y.; Baek J.; Cuadra,A.

    2013-11-10

    A TRACE model has been developed for using the TRACE/PARCS computational package [1, 2] to simulate anticipated transients without scram (ATWS) events in a boiling water reactor (BWR). The model represents a BWR/5 housed in a Mark II containment. The reactor and the balance-of-plant systems are modeled in sufficient detail to enable the evaluation of plant responses and the effectiveness of automatic and operator actions to mitigate this beyond-design-basis accident. The TRACE model implements features that facilitate the simulation of ATWS events initiated by turbine trip and closure of the main steam isolation valves (MSIV). It also incorporates control logic to initiate actions to mitigate the ATWS events, such as water level control, emergency depressurization, and injection of boron via the standby liquid control system (SLCS). Two different approaches have been used to model boron mixing in the lower plenum of the reactor vessel: modulating coolant flow in the lower plenum with a flow valve, and using control logic.

  11. Modeling hard clinical end-point data in economic analyses.

    PubMed

    Kansal, Anuraag R; Zheng, Ying; Palencia, Roberto; Ruffolo, Antonio; Hass, Bastian; Sorensen, Sonja V

    2013-11-01

    The availability of hard clinical end-point data, such as that on cardiovascular (CV) events among patients with type 2 diabetes mellitus, is increasing, and as a result there is growing interest in using hard end-point data of this type in economic analyses. This study investigated published approaches for modeling hard end-points from clinical trials and evaluated their applicability in health economic models with different disease features. A review of cost-effectiveness models of interventions in clinically significant therapeutic areas (CV diseases, cancer, and chronic lower respiratory diseases) was conducted in PubMed and Embase using a defined search strategy. Only studies integrating hard end-point data from randomized clinical trials were considered. For each study included, clinical input characteristics and the modeling approach were summarized and evaluated. A total of 33 articles (23 CV, eight cancer, two respiratory) were accepted for detailed analysis. Decision trees, Markov models, discrete event simulations, and hybrids were used. Event rates were incorporated as constant rates, time-dependent risks, or risk equations based on patient characteristics. Risks dependent on time and/or patient characteristics were used where major event rates were >1%/year in models with fewer health states (<7). Models of infrequent events or with numerous health states generally preferred constant event rates. The detailed modeling information and terminology varied, sometimes requiring interpretation. Key considerations for cost-effectiveness models incorporating hard end-point data include the frequency and characteristics of the relevant clinical events and how the trial data are reported. When event risk is low, simplification of both the model structure and event rate modeling is recommended. When events are common, such as in high-risk populations, more detailed modeling approaches, including individual simulations or explicitly time-dependent event rates, are more appropriate to accurately reflect the trial data.
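
    One recurring technicality when bringing trial event rates into such models is converting a constant event rate into a per-cycle probability, or sampling an individual time-to-event for a discrete event simulation. The sketch below shows both conversions under a constant-hazard assumption; the rate value is arbitrary and illustrative.

      # Constant-hazard conversions: p = 1 - exp(-r * t), and exponential time-to-event sampling.
      import math
      import random


      def rate_to_probability(rate_per_year: float, cycle_years: float) -> float:
          return 1.0 - math.exp(-rate_per_year * cycle_years)


      def sample_time_to_event(rate_per_year: float) -> float:
          return random.expovariate(rate_per_year)


      print(rate_to_probability(0.02, 1.0))   # annual probability for a 2%/yr event rate
      print(sample_time_to_event(0.02))       # one sampled event time, in years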

  12. A conditional approach to determining the effect of anthropogenic climate change on very rare events.

    NASA Astrophysics Data System (ADS)

    Wehner, Michael; Pall, Pardeep; Zarzycki, Colin; Stone, Daithi

    2016-04-01

    Probabilistic extreme event attribution is especially difficult for weather events that are caused by extremely rare large-scale meteorological patterns. Traditional modeling techniques have involved using ensembles of climate models, either fully coupled or with prescribed ocean and sea ice. Ensemble sizes for the latter case range from several hundred to tens of thousands. However, even if the simulations are constrained by the observed ocean state, the requisite large-scale meteorological pattern may not occur frequently enough, or even at all, in free-running climate model simulations. We present a method to ensure that simulated events similar to the observed event are modeled with enough fidelity that robust statistics can be determined given the large-scale meteorological conditions. By initializing suitably constrained short-term ensemble hindcasts of both the actual weather system and a counterfactual weather system in which the human interference in the climate system is removed, the human contribution to the magnitude of the event can be determined. However, the change (if any) in the probability of an event of the observed magnitude is conditional not only on the state of the ocean/sea-ice system but also on the prescribed initial conditions determined by the causal large-scale meteorological pattern. We will discuss the implications of this technique through two examples: the 2013 Colorado flood and the 2013 Typhoon Haiyan.

  13. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.

  14. Evaluation of a new satellite-based precipitation dataset for climate studies in the Xiang River basin, Southern China

    NASA Astrophysics Data System (ADS)

    Zhu, Q.; Xu, Y. P.; Hsu, K. L.

    2017-12-01

    A new satellite-based precipitation dataset, Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR), with a long-term time series dating back to 1983, can be a valuable dataset for climate studies. This study investigates the feasibility of using PERSIANN-CDR as a reference dataset for climate studies. Sixteen CMIP5 models are evaluated over the Xiang River basin, southern China, by comparing their performance on precipitation projection and streamflow simulation, particularly for extreme precipitation and streamflow events. The results show that PERSIANN-CDR is a valuable dataset for climate studies, even for extreme precipitation events. The precipitation estimates and their extreme events from the CMIP5 models are improved significantly, compared with rain gauge observations, after bias-correction against the PERSIANN-CDR precipitation estimates. Comparing streamflows simulated with raw and bias-corrected precipitation estimates from the 16 CMIP5 models, 10 out of 16 are improved after bias-correction. The impact of bias-correction on extreme streamflow events is less consistent, with only eight of the 16 models showing clear improvement after bias-correction. Concerning the performance of the raw CMIP5 models on precipitation, IPSL-CM5A-MR outperforms the other CMIP5 models, while MRI-CGCM3 performs best on extreme events, with better results on six extreme precipitation metrics. Case studies also show that raw CCSM4, CESM1-CAM5, and MRI-CGCM3 outperform the other models on streamflow simulation, while MIROC5-ESM-CHEM, MIROC5-ESM and IPSL-CM5A-MR behave better than the other models after bias-correction.
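
    The abstract does not spell out the bias-correction method; empirical quantile mapping against the PERSIANN-CDR reference is one common choice and is sketched here purely as an assumption, not as the study's actual procedure. All data below are synthetic.

      # Empirical quantile mapping of a biased precipitation series onto a reference distribution.
      import numpy as np


      def quantile_map(model_hist, reference, model_to_correct):
          quantiles = np.linspace(0.0, 1.0, 101)
          model_q = np.quantile(model_hist, quantiles)
          ref_q = np.quantile(reference, quantiles)
          # locate each value's quantile in the model's historical distribution,
          # then read off the reference value at that quantile
          ranks = np.interp(model_to_correct, model_q, quantiles)
          return np.interp(ranks, quantiles, ref_q)


      rng = np.random.default_rng(2)
      persiann_cdr = rng.gamma(1.5, 8.0, 5000)   # toy daily precipitation reference
      cmip5_raw = rng.gamma(1.2, 6.0, 5000)      # toy biased model precipitation
      corrected = quantile_map(cmip5_raw, persiann_cdr, cmip5_raw)
      print(round(cmip5_raw.mean(), 2), round(corrected.mean(), 2), round(persiann_cdr.mean(), 2))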

  15. Apparent and internal validity of a Monte Carlo-Markov model for cardiovascular disease in a cohort follow-up study.

    PubMed

    Nijhuis, Rogier L; Stijnen, Theo; Peeters, Anna; Witteman, Jacqueline C M; Hofman, Albert; Hunink, M G Myriam

    2006-01-01

    To determine the apparent and internal validity of the Rotterdam Ischemic heart disease & Stroke Computer (RISC) model, a Monte Carlo-Markov model, designed to evaluate the impact of cardiovascular disease (CVD) risk factors and their modification on life expectancy (LE) and cardiovascular disease-free LE (DFLE) in a general population (hereinafter, these will be referred to together as (DF)LE). The model is based on data from the Rotterdam Study, a cohort follow-up study of 6871 subjects aged 55 years and older who visited the research center for risk factor assessment at baseline (1990-1993) and completed a follow-up visit 7 years later (original cohort). The transition probabilities and risk factor trends used in the RISC model were based on data from 3501 subjects (the study cohort). To validate the RISC model, the number of simulated CVD events during 7 years' follow-up were compared with the observed number of events in the study cohort and the original cohort, respectively, and simulated (DF)LEs were compared with the (DF)LEs calculated from multistate life tables. Both in the study cohort and in the original cohort, the simulated distribution of CVD events was consistent with the observed number of events (CVD deaths: 7.1% v. 6.6% and 7.4% v. 7.6%, respectively; non-CVD deaths: 11.2% v. 11.5% and 12.9% v. 13.0%, respectively). The distribution of (DF)LEs estimated with the RISC model consistently encompassed the (DF)LEs calculated with multistate life tables. The simulated events and (DF)LE estimates from the RISC model are consistent with observed data from a cohort follow-up study.

  16. The Materosion project, a sediment cascade modeling for torrential sediment transfers: final results and perspectives

    NASA Astrophysics Data System (ADS)

    Rudaz, Benjamin; Loye, Alexandre; Mazotti, Benoit; Bardou, Eric; Jaboyedoff, Michel

    2013-04-01

    The Materosion project, conducted jointly by the Swiss canton of Valais (CREALP) and the University of Lausanne (CRET), aims at forecasting sediment transfer in alpine torrents using the sediment cascade concept. The study site is the upper Anniviers valley, around the village of Zinal (Valais). The torrents are divided into homogeneous reaches, to and from which sediments are transported by debris flows and bedload transport events. The model runs simulations of 100 years with a 1-month time step, each month being assigned a random meteorological event ranging from no activity up to high-magnitude debris flows. These events are calibrated using local rain data and the observed corresponding debris flow frequencies. The model is applied to ten torrent systems with variable geological contexts, watershed geometries and sediment supplies. Given the high number of possible event scenarios, 10'000 simulations per torrent are performed, giving a statistical distribution of cumulated volumes and an event size distribution. A way to visualize the complex result data is proposed, and a back-analysis of the internal sediment cascade dynamics is performed. The back-analysis shows that the result distributions stabilize after ~5'000 simulations. The model results, especially the range of debris flow volumes, are crucial for maintaining mitigation measures such as retention dams, and give clues for future sediment cascade modeling.

  17. Modeling a maintenance simulation of the geosynchronous platform

    NASA Technical Reports Server (NTRS)

    Kleiner, A. F., Jr.

    1980-01-01

    A modeling technique used to conduct a simulation study comparing various maintenance routines for a space platform is discussed. A system model is described and illustrated, the basic concepts of a simulation pass are detailed, and sections on failures and maintenance are included. The operation of the system across time is best modeled by a discrete event approach with two basic events: failure and maintenance of the system. Each overall simulation run consists of introducing a particular model of the physical system, together with a maintenance policy, demand function, and mission lifetime. The system is then run through many passes, each pass corresponding to one mission, and the model is re-initialized before each pass. Statistics are compiled at the end of each pass, and after the last pass a report is printed. Items of interest typically include the time to first maintenance, the total number of maintenance trips for each pass, the average capability of the system, etc.
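
    A minimal sketch of the pass structure described above is given below: a loop with two event types (failure followed by maintenance), re-initialized for each pass, with statistics accumulated across passes. The mission lifetime, failure rate, repair delay and policy are illustrative assumptions, not the values of the original study.

      # Toy two-event (failure/maintenance) simulation repeated over many passes.
      import random

      random.seed(3)
      MISSION_LIFETIME = 10.0   # years (assumed)
      FAILURE_RATE = 0.8        # failures per year (assumed)
      REPAIR_DELAY = 0.25       # years from failure to the maintenance visit (assumed)


      def run_pass():
          t, trips, first_maintenance = 0.0, 0, None
          while True:
              t += random.expovariate(FAILURE_RATE)   # next failure event
              if t > MISSION_LIFETIME:
                  break
              t += REPAIR_DELAY                       # maintenance event restores the system
              if t > MISSION_LIFETIME:
                  break
              trips += 1
              if first_maintenance is None:
                  first_maintenance = t
          return trips, first_maintenance


      results = [run_pass() for _ in range(10_000)]
      mean_trips = sum(r[0] for r in results) / len(results)
      print("average maintenance trips per mission:", round(mean_trips, 2))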

  18. Climate change increases the probability of heavy rains in Northern England/Southern Scotland like those of storm Desmond—a real-time event attribution revisited

    NASA Astrophysics Data System (ADS)

    Otto, Friederike E. L.; van der Wiel, Karin; van Oldenborgh, Geert Jan; Philip, Sjoukje; Kew, Sarah F.; Uhe, Peter; Cullen, Heidi

    2018-02-01

    On 4-6 December 2015, storm Desmond caused very heavy rainfall in Northern England and Southern Scotland which led to widespread flooding. A week after the event we provided an initial assessment of the influence of anthropogenic climate change on the likelihood of one-day precipitation events averaged over an area encompassing Northern England and Southern Scotland using data and methods available immediately after the event occurred. The analysis was based on three independent methods of extreme event attribution: historical observed trends, coupled climate model simulations and a large ensemble of regional model simulations. All three methods agreed that the effect of climate change was positive, making precipitation events like this about 40% more likely, with a provisional 2.5%-97.5% confidence interval of 5%-80%. Here we revisit the assessment using more station data, an additional monthly event definition, a second global climate model and regional model simulations of winter 2015/16. The overall result of the analysis is similar to the real-time analysis with a best estimate of a 59% increase in event frequency, but a larger confidence interval that does include no change. It is important to highlight that the observational data in the additional monthly analysis does not only represent the rainfall associated with storm Desmond but also that of storms Eve and Frank occurring towards the end of the month.

  19. Event ambiguity fuels the effective spread of rumors

    NASA Astrophysics Data System (ADS)

    Xu, Jiuping; Zhang, Yi

    2015-08-01

    In this paper, a new rumor spreading model which quantifies a specific rumor spreading feature is proposed. The specific feature focused on is the important role that event ambiguity plays in the rumor spreading process. To study the impact of this event ambiguity on the spread of rumors, the probability p(t) that an initially unaware individual becomes a rumor spreader at time t is constructed. p(t) reflects the extent of event ambiguity, and a parameter c of p(t) is used to measure the speed at which the event moves from ambiguity to confirmation. A principle is also given for deciding on the correct value of the parameter c. A rumor spreading model is then developed with this function added as a parameter to the traditional model. Several rumor spreading simulations are conducted with different values of c on both regular networks and ER random networks. The simulation results indicate that a rumor spreads faster and more broadly when c is smaller. This shows that if events remain ambiguous over a longer time, rumor spreading appears to be more effective, and the spreading is influenced more significantly by the parameter c in a random network than in a regular network. We then determine the parameters of this model by fitting data from the case of the missing Malaysian plane and apply the model to an analysis of that event. The simulation results demonstrate that the most critical time for authorities to control rumor spreading is in the early stages of a critical event.
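
    The qualitative mechanism can be sketched as follows: the spreading probability decays as the event moves from ambiguity to confirmation, and decays faster for larger c. The decay form p(t) = p0*exp(-c*t), the simplified one-step spreading rule, and all parameter values are assumptions made for illustration; the paper's actual p(t) and spreading rules are defined differently in detail.

      # Toy cascade on an ER random network with an ambiguity-driven, decaying spread probability.
      import math
      import random

      random.seed(4)
      N, AVG_DEGREE, P0, STEPS = 2000, 6, 0.4, 60


      def er_graph(n, avg_degree):
          p = avg_degree / (n - 1)
          adj = [[] for _ in range(n)]
          for i in range(n):
              for j in range(i + 1, n):
                  if random.random() < p:
                      adj[i].append(j)
                      adj[j].append(i)
          return adj


      def spread(c):
          adj = er_graph(N, AVG_DEGREE)
          spreaders = set(range(5))            # a few initial spreaders
          informed = set(spreaders)
          for t in range(STEPS):
              p_t = P0 * math.exp(-c * t)      # assumed decay of spreading probability with confirmation
              new = set()
              for u in spreaders:
                  for v in adj[u]:
                      if v not in informed and random.random() < p_t:
                          new.add(v)
              spreaders = new
              informed |= new
              if not spreaders:
                  break
          return len(informed) / N


      for c in (0.05, 0.2, 0.8):
          print(f"c = {c}: informed fraction = {spread(c):.2f}")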

  20. Widespread, Very Heavy Precipitation Events in Contemporary and Scenario Summer Climates from NARCCAP Simulations

    NASA Astrophysics Data System (ADS)

    Kawazoe, S.; Gutowski, W. J., Jr.

    2015-12-01

    We analyze the ability of regional climate models (RCMs) to simulate very heavy daily precipitation and its supporting processes in both contemporary and future-scenario simulations during summer (JJA). RCM output comes from North American Regional Climate Change Assessment Program (NARCCAP) simulations, which are all run at a spatial resolution of 50 km. Analysis focuses on the upper Mississippi basin in summer, for 1982-1998 in the contemporary climate and 2052-2068 in the scenario climate. We also compare simulated precipitation and supporting processes with those obtained from observed precipitation and reanalysis atmospheric states. Precipitation observations are from the University of Washington (UW) and the Climate Prediction Center (CPC) gridded datasets. Utilizing two observational datasets helps determine whether any uncertainties arise from differences in precipitation gridding schemes. Reanalysis fields come from the North American Regional Reanalysis. The NARCCAP models generally reproduce the observed precipitation-intensity spectrum well, while producing overly strong precipitation at high intensity thresholds. In the future-scenario climate, there is a decrease in frequency for light to moderate precipitation intensities, while an increase in frequency is seen for the higher-intensity events. Further analysis focuses on precipitation events exceeding the 99.5th percentile that occur simultaneously at several points in the region, yielding so-called "widespread events". For widespread events, we analyze local and large-scale environmental parameters, such as 2-m temperature and specific humidity, 500-hPa geopotential heights, Convective Available Potential Energy (CAPE), and vertically integrated moisture flux convergence, among others, to compare atmospheric states and processes leading to such events in the models and observations. The results suggest that an analysis of atmospheric states supporting very heavy precipitation events is a more fruitful path for understanding and detecting changes than simply looking at precipitation itself.
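
    The "widespread event" selection described above can be sketched as follows: compute the 99.5th percentile of daily precipitation at each grid point, then flag days on which several points exceed their local threshold simultaneously. The synthetic data and the minimum point count (here five) are illustrative assumptions, not the study's criteria.

      # Flag days with simultaneous exceedance of local 99.5th-percentile thresholds.
      import numpy as np

      rng = np.random.default_rng(6)
      days, points = 1564, 40                             # e.g., 17 summers x 92 days, toy grid
      precip = rng.gamma(0.6, 4.0, size=(days, points))   # synthetic daily precipitation

      thresholds = np.percentile(precip, 99.5, axis=0)    # per-point 99.5th percentile
      exceed_count = (precip > thresholds).sum(axis=1)    # points exceeding their threshold each day
      widespread_days = np.flatnonzero(exceed_count >= 5)
      print(f"{widespread_days.size} widespread very-heavy-precipitation days found")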

  1. Parallel Discrete Molecular Dynamics Simulation With Speculation and In-Order Commitment*†

    PubMed Central

    Khan, Md. Ashfaquzzaman; Herbordt, Martin C.

    2011-01-01

    Discrete molecular dynamics simulation (DMD) uses simplified and discretized models enabling simulations to advance by event rather than by timestep. DMD is an instance of discrete event simulation and so is difficult to scale: even in this multi-core era, all reported DMD codes are serial. In this paper we discuss the inherent difficulties of scaling DMD and present our method of parallelizing DMD through event-based decomposition. Our method is microarchitecture inspired: speculative processing of events exposes parallelism, while in-order commitment ensures correctness. We analyze the potential of this parallelization method for shared-memory multiprocessors. Achieving scalability required extensive experimentation with scheduling and synchronization methods to mitigate serialization. The speed-up achieved for a variety of system sizes and complexities is nearly 6× on an 8-core and over 9× on a 12-core processor. We present and verify analytical models that account for the achieved performance as a function of available concurrency and architectural limitations. PMID:21822327

  2. Parallel Discrete Molecular Dynamics Simulation With Speculation and In-Order Commitment.

    PubMed

    Khan, Md Ashfaquzzaman; Herbordt, Martin C

    2011-07-20

    Discrete molecular dynamics simulation (DMD) uses simplified and discretized models enabling simulations to advance by event rather than by timestep. DMD is an instance of discrete event simulation and so is difficult to scale: even in this multi-core era, all reported DMD codes are serial. In this paper we discuss the inherent difficulties of scaling DMD and present our method of parallelizing DMD through event-based decomposition. Our method is microarchitecture inspired: speculative processing of events exposes parallelism, while in-order commitment ensures correctness. We analyze the potential of this parallelization method for shared-memory multiprocessors. Achieving scalability required extensive experimentation with scheduling and synchronization methods to mitigate serialization. The speed-up achieved for a variety of system sizes and complexities is nearly 6× on an 8-core and over 9× on a 12-core processor. We present and verify analytical models that account for the achieved performance as a function of available concurrency and architectural limitations.
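
    The event-driven core that both records start from, advancing the simulation from one predicted collision to the next instead of by fixed timesteps, can be illustrated with a serial toy model: equal-mass point particles on a line that exchange velocities on contact. This is only a sketch of the event-driven idea; the parallel speculation and in-order commitment machinery of the paper is not reproduced, and all values are illustrative.

      # Serial event-driven "collision to collision" toy dynamics for equal-mass particles in 1D.
      import random

      random.seed(5)
      n = 6
      x = sorted(random.uniform(0.0, 10.0) for _ in range(n))    # ordered positions
      v = [random.uniform(-1.0, 1.0) for _ in range(n)]          # velocities

      t, t_end, n_events = 0.0, 50.0, 0
      while t < t_end:
          # predict the next collision between each approaching adjacent pair
          candidates = []
          for i in range(n - 1):
              dv = v[i] - v[i + 1]
              if dv > 0:                                         # pair is closing
                  candidates.append(((x[i + 1] - x[i]) / dv, i))
          if not candidates:
              break                                              # no further events will occur
          dt, i = min(candidates)
          if t + dt > t_end:
              break
          t += dt
          x = [xi + vi * dt for xi, vi in zip(x, v)]             # advance everyone to the event
          v[i], v[i + 1] = v[i + 1], v[i]                        # elastic swap for equal masses
          n_events += 1

      print(f"processed {n_events} collision events up to t = {t:.2f}")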

  3. Flood-inundation and flood-mitigation modeling of the West Branch Wapsinonoc Creek Watershed in West Branch, Iowa

    USGS Publications Warehouse

    Cigrand, Charles V.

    2018-03-26

    The U.S. Geological Survey (USGS), in cooperation with the city of West Branch and the Herbert Hoover National Historic Site of the National Park Service, assessed flood-mitigation scenarios within the West Branch Wapsinonoc Creek watershed. The scenarios are intended to demonstrate several means of decreasing peak streamflows and improving the conveyance of overbank flows from the West Branch Wapsinonoc Creek and its tributary Hoover Creek where they flow through the city and the Herbert Hoover National Historic Site located within the city. Hydrologic and hydraulic models of the watershed were constructed to assess the flood-mitigation scenarios. To accomplish this, the models used the U.S. Army Corps of Engineers Hydrologic Engineering Center-Hydrologic Modeling System (HEC–HMS) version 4.2 to simulate the amount of runoff and streamflow produced from single rain events. The Hydrologic Engineering Center-River Analysis System (HEC–RAS) version 5.0 was then used to construct an unsteady-state model that may be used for routing streamflows, mapping areas that may be inundated during floods, and simulating the effects of different measures taken to decrease the effects of floods on people and infrastructure. Both models were calibrated to three historic rainfall events that produced peak streamflows ranging between the 2-year and 10-year flood-frequency recurrence intervals at the USGS streamgage (05464942) on Hoover Creek. The historic rainfall events were calibrated by using data from two USGS streamgages along with surveyed high-water marks from one of the events. The calibrated HEC–HMS model was then used to simulate streamflows from design rainfall events of 24-hour duration ranging from a 20-percent to a 1-percent annual exceedance probability. These simulated streamflows were incorporated into the HEC–RAS model. The unsteady-state HEC–RAS model was calibrated to represent existing conditions within the watershed. HEC–RAS simulations with the existing conditions and streamflows from the design rainfall events were then run to serve as a baseline for evaluating flood-mitigation scenarios. After these simulations were completed, three different flood-mitigation scenarios were developed with HEC–RAS: a detention-storage scenario, a conveyance-improvement scenario, and a combination of both. In the detention-storage scenario, four in-channel detention structures were placed upstream from the city of West Branch to attenuate peak streamflows. To investigate possible improvements to conveying floodwaters through the city of West Branch, a section of abandoned railroad embankment and an old truss bridge were removed in the model, because these structures were producing backwater areas during flooding events. The third scenario combines the detention and conveyance scenarios so that their joint efficiency could be evaluated. The scenarios with the design rainfall events were run in the HEC–RAS model so that their flood-mitigation effects could be analyzed across a wide range of flood magnitudes.

  4. The role of density-dependent individual growth in the persistence of freshwater salmonid populations.

    PubMed

    Vincenzi, Simone; Crivelli, Alain J; Jesensek, Dusan; De Leo, Giulio A

    2008-06-01

    Theoretical and empirical models of population dynamics have paid little attention to the implications of density-dependent individual growth for the persistence and regulation of small freshwater salmonid populations. We have therefore designed a study aimed at testing our hypothesis that density-dependent individual growth is a process that enhances population recovery and reduces extinction risk in salmonid populations in a variable environment subject to disturbance events. This hypothesis was tested in two newly introduced marble trout (Salmo marmoratus) populations living in Slovenian streams (Zakojska and Gorska) subject to severe autumn floods. We developed a discrete-time stochastic individual-based model of population dynamics for each population, with demographic parameters and compensatory responses tightly calibrated on data from individually tagged marble trout. The occurrence of severe flood events causing population collapses was explicitly accounted for in the model. We used the model in a population viability analysis setting to estimate the quasi-extinction risk and demographic indexes of the two marble trout populations when individual growth was density-dependent. We ran a set of simulations in which the effect of floods on population abundance was explicitly accounted for and another set of simulations in which flood events were not included in the model. These simulation results were compared with those of scenarios in which individual growth was modelled with density-independent von Bertalanffy growth curves. Our results show how density-dependent individual growth may confer remarkable resilience to marble trout populations in the case of major flood events. The resilience to flood events shown by the simulation results can be explained by the increase in size-dependent fecundity as a consequence of the drop in population size after a severe flood, which allows the population to quickly recover to pre-event conditions. Our results suggest that density-dependent individual growth plays a potentially powerful role in the persistence of freshwater salmonids living in streams subject to recurrent yet unpredictable flood events.

  5. CulSim: A simulator of emergence and resilience of cultural diversity

    NASA Astrophysics Data System (ADS)

    Ulloa, Roberto

    CulSim is an agent-based computer simulation program that allows further exploration of influential and recent models of the emergence of cultural groups grounded in sociological theories. CulSim provides a collection of tools to analyze the resilience of cultural diversity when events affect agents, institutions or global parameters of the simulations; upon combination, events can be used to approximate historical circumstances. The software provides a graphical and a text-based user interface, and so makes this agent-based modeling methodology accessible to a variety of users from different research fields.

  6. Improving synoptic and intraseasonal variability in CFSv2 via stochastic representation of organized convection

    NASA Astrophysics Data System (ADS)

    Goswami, B. B.; Khouider, B.; Phani, R.; Mukhopadhyay, P.; Majda, A.

    2017-01-01

    To better represent organized convection in the Climate Forecast System version 2 (CFSv2), a stochastic multicloud model (SMCM) parameterization is adopted and a 15-year climate run is made. The last 10 years of the simulation are analyzed here. While retaining an equally good mean state (if not better) as the parent model, the CFS-SMCM simulation shows significant improvement in synoptic and intraseasonal variability. The CFS-SMCM provides a better account of convectively coupled equatorial waves and the Madden-Julian oscillation (MJO). The CFS-SMCM exhibits improvements in the northward and eastward propagation of the intraseasonal oscillation of convection, including MJO propagation beyond the maritime continent barrier, which is the Achilles' heel of coarse-resolution global climate models (GCMs). The distribution of precipitation events is better simulated in CFS-SMCM and spreads naturally toward high-precipitation events. Deterministic GCMs tend to simulate a narrow distribution with too much drizzling precipitation and too few high-precipitation events.

  7. Simulated CONUS Flash Flood Climatologies from Distributed Hydrologic Models

    NASA Astrophysics Data System (ADS)

    Flamig, Z.; Gourley, J. J.; Vergara, H. J.; Kirstetter, P. E.; Hong, Y.

    2016-12-01

    This study will describe a CONUS flash flood climatology created over the period from 2002 through 2011. The MRMS reanalysis precipitation dataset was used as forcing for the Ensemble Framework For Flash Flood Forecasting (EF5). This high-resolution 1-km, 5-minute dataset is ideal for simulating flash floods with a distributed hydrologic model. EF5 features multiple water balance components, including SAC-SMA, CREST, and a hydrophobic model, all coupled with kinematic wave routing. The EF5/SAC-SMA and EF5/CREST water balance schemes were used to create dual flash flood climatologies based on the differing water balance principles. For the period from 2002 through 2011, the daily maximum streamflow, unit streamflow, and time of peak streamflow were stored along with the minimum soil moisture. These variables describe the state of the soils right before a flash flood event and the peak streamflow simulated during the flash flood event. The results will be shown, compared and contrasted. The resulting model simulations will be verified on basins smaller than 1,000 sq km with USGS gauges to ensure the distributed hydrologic models are reliable. The results will also be compared spatially to Storm Data flash flood event observations to judge the degree of agreement between the simulated climatologies and observations.

  8. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte Carlo-based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charges and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGF-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how GCR event rates mapped to biological signaling induction and relaxation times. We considered several hypotheses related to signaling and cancer risk, and then performed simulations for conditions where aberrant or adaptive signaling would occur on long-duration space missions. Our results do not support the conventional assumptions of dose, linearity and additivity. A discussion is given of how event-based systems biology models, which focus on biological signaling as the mechanism to propagate damage or adaptation, can be further developed for cancer and CNS space radiation risk projections.

  9. Simulation of deep ventilation in Crater Lake, Oregon, 1951–2099

    USGS Publications Warehouse

    Wood, Tamara M.; Wherry, Susan A.; Piccolroaz, Sebastiano; Girdner, Scott F

    2016-05-04

    The frequency of deep ventilation events in Crater Lake, a caldera lake in the Oregon Cascade Mountains, was simulated in six future climate scenarios, using a 1-dimensional deep ventilation model (1DDV) that was developed to simulate the ventilation of deep water initiated by reverse stratification and subsequent thermobaric instability. The model was calibrated and validated with lake temperature data collected from 1994 to 2011. Wind and air temperature data from three general circulation models and two representative concentration pathways were used to simulate the change in lake temperature and the frequency of deep ventilation events in possible future climates. The lumped model air2water was used to project lake surface temperature, a required boundary condition for the lake model, based on air temperature in the future climates. The 1DDV model was used to simulate daily water temperature profiles through 2099. All future climate scenarios projected increased water temperature throughout the water column and a substantive reduction in the frequency of deep ventilation events. The least extreme scenario projected the frequency of deep ventilation events to decrease from about 1 in 2 years in current conditions to about 1 in 3 years by 2100. The most extreme scenario considered projected the frequency of deep ventilation events to be about 1 in 7.7 years by 2100. All scenarios predicted that the temperature of the entire water column will be greater than 4 °C for increasing lengths of time in the future and that the conditions required for thermobaric instability induced mixing will become rare or non-existent. The disruption of deep ventilation by itself does not provide a complete picture of the potential ecological and water quality consequences of warming climate to Crater Lake. Estimating the effect of warming climate on deep water oxygen depletion and water clarity will require careful modeling studies to combine the physical mixing processes affected by the atmosphere with the multitude of factors affecting the growth of algae and corresponding water clarity.

  10. Integrated Medical Model (IMM) 4.0 Enhanced Functionalities

    NASA Technical Reports Server (NTRS)

    Young, M.; Keenan, A. B.; Saile, L.; Boley, L. A.; Walton, M. E.; Shah, R. V.; Kerstman, E. L.; Myers, J. G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic simulation model that uses input data on 100 medical conditions to simulate expected medical events, the resources required to treat them, and the resulting impact to the mission for specific crew and mission characteristics. The newest development version of IMM, IMM v4.0, adds capabilities that remove some of the conservative assumptions that underlie the current operational version, IMM v3. While IMM v3 provides the framework to simulate whether a medical event occurred, IMM v4.0 also simulates when the event occurred during a mission timeline. This allows for more accurate estimation of mission time lost and resource utilization. In addition to the mission timeline, IMM v4.0 features two enhancements that address IMM v3 assumptions regarding medical event treatment. Medical events in IMM v3 are assigned the untreated outcome if any resource required to treat the event is unavailable. IMM v4.0 allows for partially treated outcomes that are proportional to the amount of required resources available, thus removing the dichotomous treatment assumption. An additional capability of IMM v4.0 is the use of an alternative medical resource when the primary resource assigned to the condition is depleted, more accurately reflecting the real-world system. The additional capabilities defining IMM v4.0 (the mission timeline, partial treatment, and alternate drug) result in more realistic predicted mission outcomes. The primary model outcomes of IMM v4.0 for the ISS6 mission, including mission time lost, probability of evacuation, and probability of loss of crew life, are compared to those produced by the current operational version of IMM to showcase the enhanced prediction capabilities.

  11. A simulation framework for mapping risks in clinical processes: the case of in-patient transfers.

    PubMed

    Dunn, Adam G; Ong, Mei-Sing; Westbrook, Johanna I; Magrabi, Farah; Coiera, Enrico; Wobcke, Wayne

    2011-05-01

    To model how individual violations in routine clinical processes cumulatively contribute to the risk of adverse events in hospital using an agent-based simulation framework. An agent-based simulation was designed to model the cascade of common violations that contribute to the risk of adverse events in routine clinical processes. Clinicians and the information systems that support them were represented as a group of interacting agents using data from direct observations. The model was calibrated using data from 101 patient transfers observed in a hospital and results were validated for one of two scenarios (a misidentification scenario and an infection control scenario). Repeated simulations using the calibrated model were undertaken to create a distribution of possible process outcomes. The likelihood of end-of-chain risk is the main outcome measure, reported for each of the two scenarios. The simulations demonstrate end-of-chain risks of 8% and 24% for the misidentification and infection control scenarios, respectively. Over 95% of the simulations in both scenarios are unique, indicating that the in-patient transfer process diverges from prescribed work practices in a variety of ways. The simulation allowed us to model the risk of adverse events in a clinical process, by generating the variety of possible work subject to violations, a novel prospective risk analysis method. The in-patient transfer process has a high proportion of unique trajectories, implying that risk mitigation may benefit from focusing on reducing complexity rather than augmenting the process with further rule-based protocols.

  12. Teaching sexual history-taking skills using the Sexual Events Classification System.

    PubMed

    Fidler, Donald C; Petri, Justin Daniel; Chapman, Mark

    2010-01-01

    The authors review the literature on educational programs for teaching sexual history-taking skills and describe novel techniques for teaching these skills. Psychiatric residents enrolled in a brief sexual history-taking course that included instruction on the Sexual Events Classification System, feedback on residents' video-recorded interviews with simulated patients, discussion of videos that simulated bad interviews with simulated patients, and a competency scoring form used to score a video of a simulated interview. After the course, residents completed an anonymous survey to assess the usefulness of the experience. Most residents felt more comfortable taking sexual histories after the course. They described the Sexual Events Classification System and simulated interviews as practical methods for teaching sexual history-taking skills. The Sexual Events Classification System and simulated patient experiences may serve as a practical model for teaching sexual history-taking skills to general psychiatric residents.

  13. Sensitivity of the WRF model to the lower boundary in an extreme precipitation event - Madeira island case study

    NASA Astrophysics Data System (ADS)

    Teixeira, J. C.; Carvalho, A. C.; Carvalho, M. J.; Luna, T.; Rocha, A.

    2014-08-01

    The advances in satellite technology in recent years have made feasible the acquisition of high-resolution information on the Earth's surface. Examples of such information include elevation and land use, which have become more detailed. Including this information in numerical atmospheric models can improve their results in simulating lower boundary forced events, by providing detailed information on their characteristics. Consequently, this work aims to study the sensitivity of the weather research and forecast (WRF) model to different topography as well as land-use simulations in an extreme precipitation event. The test case focused on a topographically driven precipitation event over the island of Madeira, which triggered flash floods and mudslides in the southern parts of the island. Difference fields between simulations were computed, showing that the change in the data sets produced statistically significant changes to the flow, the planetary boundary layer structure and precipitation patterns. Moreover, model results show an improvement in model skill in the windward region for precipitation and in the leeward region for wind, in spite of the non-significant enhancement in the overall results with higher-resolution data sets of topography and land use.

  14. The Influence of Aerosol Hygroscopicity on Precipitation Intensity During a Mesoscale Convective Event

    NASA Astrophysics Data System (ADS)

    Kawecki, Stacey; Steiner, Allison L.

    2018-01-01

    We examine how aerosol composition affects precipitation intensity using the Weather Research and Forecasting Model with Chemistry (version 3.6). By changing the prescribed default hygroscopicity values to updated values from laboratory studies, we test model assumptions about individual component hygroscopicity values of ammonium, sulfate, nitrate, and organic species. We compare a baseline simulation (BASE, using default hygroscopicity values) with four sensitivity simulations (SULF, increasing the sulfate hygroscopicity; ORG, decreasing organic hygroscopicity; SWITCH, using a concentration-dependent hygroscopicity value for ammonium; and ALL, including all three changes) to understand the role of aerosol composition on precipitation during a mesoscale convective system (MCS). Overall, the hygroscopicity changes influence the spatial patterns and intensity of precipitation. Focusing on the maximum precipitation in the model domain downwind of an urban area, we find that changing the individual component hygroscopicities leads to bulk hygroscopicity changes, especially in the ORG simulation. Reducing bulk hygroscopicity (e.g., ORG simulation) initially causes fewer activated drops, weakened updrafts in the midtroposphere, and increased precipitation from larger hydrometeors. Increasing bulk hygroscopicity (e.g., SULF simulation) simulates more numerous and smaller cloud drops and increases precipitation. In the ALL simulation, a stronger cold pool and downdrafts lead to precipitation suppression later in the MCS evolution. In this downwind region, the combined changes in hygroscopicity (ALL) reduce the overprediction of intense events (>70 mm d-1) and better capture the range of moderate-intensity (30-60 mm d-1) events. The results of this single MCS analysis suggest that aerosol composition can play an important role in simulating high-intensity precipitation events.

  15. A New Look at Stratospheric Sudden Warmings. Part II: Evaluation of Numerical Model Simulations

    NASA Technical Reports Server (NTRS)

    Charlton, Andrew J.; Polvani, Lorenza M.; Perlwitz, Judith; Sassi, Fabrizio; Manzini, Elisa; Shibata, Kiyotaka; Pawson, Steven; Nielsen, J. Eric; Rind, David

    2007-01-01

    The simulation of major midwinter stratospheric sudden warmings (SSWs) in six stratosphere-resolving general circulation models (GCMs) is examined. The GCMs are compared to a new climatology of SSWs, based on the dynamical characteristics of the events. First, the number, type, and temporal distribution of SSW events are evaluated. Most of the models show a lower frequency of SSW events than the climatology, which has a mean frequency of 6.0 SSWs per decade. Statistical tests show that three of the six models produce significantly fewer SSWs than the climatology, between 1.0 and 2.6 SSWs per decade. Second, four process-based diagnostics are calculated for all of the SSW events in each model. It is found that SSWs in the GCMs compare favorably with the dynamical benchmarks for SSWs established in the first part of the study. These results indicate that GCMs are capable of quite accurately simulating the dynamics required to produce SSWs, but with lower frequency than the climatology. Further dynamical diagnostics hint that, in at least one case, this is due to a lack of meridional heat flux in the lower stratosphere. Even though the SSWs simulated by most GCMs are dynamically realistic when compared to the NCEP-NCAR reanalysis, the reason for the relative paucity of SSWs in GCMs remains an important and open question.

  16. Dynamically adaptive data-driven simulation of extreme hydrological flows

    NASA Astrophysics Data System (ADS)

    Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint

    2018-02-01

    Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.

  17. Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, Diana Carolina; Halldin, Sven; Keith, Beven; Chong-Yu, Xu

    2013-04-01

    Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high-water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated within the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.

  18. A Spiking Neural Simulator Integrating Event-Driven and Time-Driven Computation Schemes Using Parallel CPU-GPU Co-Processing: A Case Study.

    PubMed

    Naveros, Francisco; Luque, Niceto R; Garrido, Jesús A; Carrillo, Richard R; Anguita, Mancia; Ros, Eduardo

    2015-07-01

    Time-driven simulation methods in traditional CPU architectures perform well and precisely when simulating small-scale spiking neural networks. Nevertheless, they still have drawbacks when simulating large-scale systems. Conversely, event-driven simulation methods in CPUs and time-driven simulation methods in graphic processing units (GPUs) can outperform CPU time-driven methods under certain conditions. With this performance improvement in mind, we have developed an event-and-time-driven spiking neural network simulator suitable for a hybrid CPU-GPU platform. Our neural simulator is able to efficiently simulate bio-inspired spiking neural networks consisting of different neural models, which can be distributed heterogeneously in both small layers and large layers or subsystems. For the sake of efficiency, the low-activity parts of the neural network can be simulated in CPU using event-driven methods while the high-activity subsystems can be simulated in either CPU (a few neurons) or GPU (thousands or millions of neurons) using time-driven methods. In this brief, we have undertaken a comparative study of these different simulation methods. For benchmarking the different simulation methods and platforms, we have used a cerebellar-inspired neural-network model consisting of a very dense granular layer and a Purkinje layer with a smaller number of cells (according to biological ratios). Thus, this cerebellar-like network includes a dense diverging neural layer (increasing the dimensionality of its internal representation and sparse coding) and a converging neural layer (integration) similar to many other biologically inspired and also artificial neural networks.

  19. Multi-Scale Simulations of Past and Future Projections of Hydrology in Lake Tahoe Basin, California-Nevada (Invited)

    NASA Astrophysics Data System (ADS)

    Niswonger, R. G.; Huntington, J. L.; Dettinger, M. D.; Rajagopal, S.; Gardner, M.; Morton, C. G.; Reeves, D. M.; Pohll, G. M.

    2013-12-01

    Water resources in the Tahoe basin are susceptible to long-term climate change and extreme events because it is a middle-altitude, snow-dominated basin that experiences large inter-annual climate variations. Lake Tahoe provides critical water supply for its basin and downstream populations, but changes in water supply are obscured by complex climatic and hydrologic gradients across the high-relief, geologically complex basin. An integrated surface-water and groundwater model of the Lake Tahoe basin has been developed using GSFLOW to assess the effects of climate change and extreme events on surface and groundwater resources. Key hydrologic mechanisms identified with this model explain recent changes in the water resources of the region. Critical vulnerabilities of regional water supplies and hazards were also explored. Maintaining a balance between (a) accurate representation of spatial features (e.g., geology, streams, and topography) and hydrologic response (i.e., groundwater, stream, lake, and wetland flows and storages) and (b) computational efficiency is a necessity for the desired model applications. Potential climatic influences on water resources are analyzed here in simulations of long-term water availability and flood responses to selected 100-year climate-model projections. GSFLOW is also used to simulate a scenario depicting an especially extreme storm event that was constructed from a combination of two historical atmospheric-river storm events as part of the USGS MultiHazards Demonstration Project. Simulated groundwater levels, streamflow, wetlands, and lake levels compare well with measured values for a 30-year historical simulation period. Results are consistent for both small and large model grid cell sizes, owing to the model's ability to represent water table altitude, streams, and other hydrologic features at the sub-grid scale. Simulated hydrologic responses are affected by climate change, with less groundwater available during more frequent droughts. Simulated floods for the region indicate issues related to drainage in the developed areas around Lake Tahoe and necessary dam releases that create downstream flood risks.

  20. Comparison of Thunderstorm Simulations from WRF-NMM and WRF-ARW Models over East Indian Region

    PubMed Central

    Litta, A. J.; Mary Idicula, Sumam; Mohanty, U. C.; Kiran Prasad, S.

    2012-01-01

    Thunderstorms are typical mesoscale systems dominated by intense convection. Mesoscale models are essential for the accurate prediction of such high-impact weather events. In the present study, an attempt has been made to compare the simulated results of three thunderstorm events using the NMM and ARW model cores of the WRF system and to validate the model results with observations. Both models performed well in capturing stability indices, which are indicators of severe convective activity. Comparison of model-simulated radar reflectivity imageries with observations revealed that the NMM model simulated the propagation of the squall line well, while the squall line movement was slow in ARW. From the model-simulated spatial plots of cloud top temperature, we can see that the NMM model better captured the genesis, intensification, and propagation of the thunder squall than the ARW model. The statistical analysis of rainfall indicates the better performance of NMM than ARW. Comparison of model-simulated thunderstorm-affected parameters with those observed showed that NMM performed better than ARW in capturing the sharp rise in humidity and drop in temperature. This suggests that the NMM model has the potential to provide unique and valuable information for severe thunderstorm forecasters over the east Indian region. PMID:22645480

  1. The Impacts of Bias in Cloud-Radiation-Dynamics Interactions on Central Pacific Seasonal and El Niño Simulations in Contemporary GCMs

    NASA Astrophysics Data System (ADS)

    Li, J.-L. F.; Suhas, E.; Richardson, Mark; Lee, Wei-Liang; Wang, Yi-Hui; Yu, Jia-Yuh; Lee, Tong; Fetzer, Eric; Stephens, Graeme; Shen, Min-Hua

    2018-02-01

    Most of the global climate models (GCMs) in the Coupled Model Intercomparison Project, phase 5 do not include precipitating ice (aka falling snow) in their radiation calculations. We examine the importance of the radiative effects of precipitating ice on simulated surface wind stress and sea surface temperatures (SSTs) in terms of seasonal variation and in the evolution of central Pacific El Niño (CP-El Niño) events. Using controlled simulations with the CESM1 model, we show that the exclusion of precipitating ice radiative effects generates a persistent excessive upper-level radiative cooling and an increasingly unstable atmosphere over convective regions such as the western Pacific and tropical convergence zones. The invigorated convection leads to persistent anomalous low-level outflows which weaken the easterly trade winds, reducing upper-ocean mixing and leading to a positive SST bias in the model mean state. In CP-El Niño events, this means that outflow from the modeled convection in the central Pacific reduces winds to the east, allowing unrealistic eastward propagation of warm SST anomalies following the peak in CP-El Niño activity. Including the radiative effects of precipitating ice reduces these model biases and improves the simulated life cycle of the CP-El Niño. Improved simulations of present-day tropical seasonal variations and CP-El Niño events would increase the confidence in simulating their future behavior.

  2. Estimation and Correction of bias of long-term simulated climate data from Global Circulation Models (GCMs)

    NASA Astrophysics Data System (ADS)

    Mehan, S.; Gitau, M. W.

    2017-12-01

    Global circulation models (GCMs) are often used to simulate long-term climate data for use in hydrologic studies. However, some bias (difference between simulated values and observed data) has been observed, especially when simulating precipitation events. The bias is especially evident with respect to simulating dry and wet days. This is because GCMs tend to underestimate large precipitation events, with the associated precipitation amounts being distributed to some dry days, thus leading to a larger number of wet days each with some amount of rainfall. The accuracy of precipitation simulations impacts the accuracy of other simulated components such as flow and water quality. It is therefore very important to correct the bias associated with precipitation before it is used for any modeling applications. This study aims to correct the bias specifically associated with precipitation events, with a focus on the Western Lake Erie Basin (WLEB). Analytical, statistical, and extreme event analyses for three different stations (Adrian, MI; Norwalk, OH; and Fort Wayne, IN) in the WLEB were carried out to quantify the bias. Findings indicated that GCMs overestimated the wet sequences and underestimated dry-day probabilities. The numbers of wet sequences simulated by nine GCMs, each from two different open sources, were 310-678 (Fort Wayne, IN); 318-600 (Adrian, MI); and 346-638 (Norwalk, OH), compared with 166, 150, and 180, respectively, in the observed data. Predicted conditional probabilities of a dry day followed by a wet day (P(D|W)) ranged between 0.16-0.42 (Fort Wayne, IN); 0.29-0.41 (Adrian, MI); and 0.13-0.40 (Norwalk, OH) from the different GCMs, compared to 0.52 (Fort Wayne, IN and Norwalk, OH) and 0.54 (Adrian, MI) from the observed climate data. There was a difference of 0-8.5% between the distribution of simulated climate values and observed climate data for precipitation and temperature for all three stations (Cohen's d effect size < 0.2). Further work involves the use of stochastic weather generators to correct the conditional probabilities and better capture the dry and wet events for use in hydrologic and water resources modeling.
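
    The wet/dry statistics the abstract compares (wet-sequence counts and conditional transition probabilities) can be computed directly from a daily precipitation series. The sketch below is illustrative only; the 1 mm wet-day threshold is an assumption, not taken from the study.

        def wet_dry_statistics(precip_mm, wet_threshold=1.0):
            wet = [p >= wet_threshold for p in precip_mm]
            # Count wet sequences: maximal runs of consecutive wet days.
            wet_sequences = sum(1 for i, w in enumerate(wet)
                                if w and (i == 0 or not wet[i - 1]))
            # Conditional probability that a dry day is followed by a wet day.
            dry_to_wet = sum(1 for a, b in zip(wet, wet[1:]) if (not a) and b)
            dry_days = sum(1 for a in wet[:-1] if not a)
            p_wet_after_dry = dry_to_wet / dry_days if dry_days else float("nan")
            return {"wet_sequences": wet_sequences,
                    "P(wet day | previous day dry)": p_wet_after_dry}

        series = [0.0, 0.2, 5.1, 12.0, 0.0, 0.0, 3.3, 0.0, 7.8, 0.1]   # mm/day, illustrative
        print(wet_dry_statistics(series))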

  3. Discrete diffusion models to study the effects of Mg2+ concentration on the PhoPQ signal transduction system

    PubMed Central

    2010-01-01

    Background The challenge today is to develop a modeling and simulation paradigm that integrates structural, molecular and genetic data for a quantitative understanding of physiology and behavior of biological processes at multiple scales. This modeling method requires techniques that maintain a reasonable accuracy of the biological process and also reduces the computational overhead. This objective motivates the use of new methods that can transform the problem from energy and affinity based modeling to information theory based modeling. To achieve this, we transform all dynamics within the cell into a random event time, which is specified through an information domain measure like probability distribution. This allows us to use the “in silico” stochastic event based modeling approach to find the molecular dynamics of the system. Results In this paper, we present the discrete event simulation concept using the example of the signal transduction cascade triggered by extra-cellular Mg2+ concentration in the two component PhoPQ regulatory system of Salmonella Typhimurium. We also present a model to compute the information domain measure of the molecular transport process by estimating the statistical parameters of inter-arrival time between molecules/ions coming to a cell receptor as external signal. This model transforms the diffusion process into the information theory measure of stochastic event completion time to get the distribution of the Mg2+ departure events. Using these molecular transport models, we next study the in-silico effects of this external trigger on the PhoPQ system. Conclusions Our results illustrate the accuracy of the proposed diffusion models in explaining the molecular/ionic transport processes inside the cell. Also, the proposed simulation framework can incorporate the stochasticity in cellular environments to a certain degree of accuracy. We expect that this scalable simulation platform will be able to model more complex biological systems with reasonable accuracy to understand their temporal dynamics. PMID:21143785

  4. Discrete diffusion models to study the effects of Mg2+ concentration on the PhoPQ signal transduction system.

    PubMed

    Ghosh, Preetam; Ghosh, Samik; Basu, Kalyan; Das, Sajal K; Zhang, Chaoyang

    2010-12-01

    The challenge today is to develop a modeling and simulation paradigm that integrates structural, molecular and genetic data for a quantitative understanding of physiology and behavior of biological processes at multiple scales. This modeling method requires techniques that maintain a reasonable accuracy of the biological process and also reduces the computational overhead. This objective motivates the use of new methods that can transform the problem from energy and affinity based modeling to information theory based modeling. To achieve this, we transform all dynamics within the cell into a random event time, which is specified through an information domain measure like probability distribution. This allows us to use the "in silico" stochastic event based modeling approach to find the molecular dynamics of the system. In this paper, we present the discrete event simulation concept using the example of the signal transduction cascade triggered by extra-cellular Mg2+ concentration in the two component PhoPQ regulatory system of Salmonella Typhimurium. We also present a model to compute the information domain measure of the molecular transport process by estimating the statistical parameters of inter-arrival time between molecules/ions coming to a cell receptor as external signal. This model transforms the diffusion process into the information theory measure of stochastic event completion time to get the distribution of the Mg2+ departure events. Using these molecular transport models, we next study the in-silico effects of this external trigger on the PhoPQ system. Our results illustrate the accuracy of the proposed diffusion models in explaining the molecular/ionic transport processes inside the cell. Also, the proposed simulation framework can incorporate the stochasticity in cellular environments to a certain degree of accuracy. We expect that this scalable simulation platform will be able to model more complex biological systems with reasonable accuracy to understand their temporal dynamics.

  5. Review of GEM Radiation Belt Dropout and Buildup Challenges

    NASA Astrophysics Data System (ADS)

    Tu, Weichao; Li, Wen; Morley, Steve; Albert, Jay

    2017-04-01

    In Summer 2015 the US NSF GEM (Geospace Environment Modeling) focus group named "Quantitative Assessment of Radiation Belt Modeling" started the "RB dropout" and "RB buildup" challenges, focused on quantitative modeling of radiation belt buildups and dropouts. This is a community effort which includes selecting challenge events, gathering model inputs that are required to model the radiation belt dynamics during these events (e.g., various magnetospheric waves, plasmapause and density models, electron phase space density data), simulating the challenge events using different types of radiation belt models, and validating the model results by comparison to in situ observations of radiation belt electrons (from Van Allen Probes, THEMIS, GOES, LANL/GEO, etc.). The goal is to quantitatively assess the relative importance of various acceleration, transport, and loss processes in the observed radiation belt dropouts and buildups. Since 2015, the community has selected four "challenge" events under four different categories: "storm-time enhancements", "non-storm enhancements", "storm-time dropouts", and "non-storm dropouts". Model inputs and data for each selected event have been coordinated and shared within the community to establish a common basis for simulations and testing. Modelers within and outside the US with different types of radiation belt models (diffusion-type, diffusion-convection-type, test particle codes, etc.) have participated in our challenge and shared their simulation results and comparisons with spacecraft measurements. Significant progress has been made in quantitative modeling of radiation belt buildups and dropouts, as well as in assessing the models with new measures of model performance. In this presentation, I will review the activities from our "RB dropout" and "RB buildup" challenges and the progress achieved in understanding radiation belt physics and improving model validation and verification.

  6. The Influence of Preferential Flow on Pressure Propagation and Landslide Triggering of the Rocca Pitigliana Landslide

    NASA Astrophysics Data System (ADS)

    Shao, W.; Bogaard, T.; Bakker, M.; Berti, M.; Savenije, H. H. G.

    2016-12-01

    The fast pore water pressure response to rain events is an important triggering factor for slope instability. The fast pressure response may be caused by preferential flow that bypasses the soil matrix. Currently, most of the hydro-mechanical models simulate pore water pressure using a single-permeability model, which cannot quantify the effects of preferential flow on pressure propagation and landslide triggering. Previous studies showed that a model based on the linear-diffusion equation can simulate the fast pressure propagation in near-saturated landslides such as the Rocca Pitigliana landslide. In such a model, the diffusion coefficient depends on the degree of saturation, which makes it difficult to use the model for predictions. In this study, the influence of preferential flow on pressure propagation and slope stability is investigated with a 1D dual-permeability model coupled with an infinite-slope stability approach. The dual-permeability model uses two modified Darcy-Richards equations to simultaneously simulate the matrix flow and preferential flow in hillslopes. The simulated pressure head is used in an infinite-slope stability analysis to identify the influence of preferential flow on the fast pressure response and landslide triggering. The dual-permeability model simulates the height and arrival of the pressure peak reasonably well. Performance of the dual-permeability model is as good as or better than the linear-diffusion model even though the dual-permeability model is calibrated for two single pulse rain events only, while the linear-diffusion model is calibrated for each rain event separately.

  7. The impact of bathymetry input on flood simulations

    NASA Astrophysics Data System (ADS)

    Khanam, M.; Cohen, S.

    2017-12-01

    Flood prediction and mitigation systems are essential for improving public safety and community resilience worldwide. Hydraulic simulations of flood events are becoming an increasingly efficient tool for studying and predicting flood events and susceptibility. A consistent limitation of hydraulic simulations of riverine dynamics is the lack of information about river bathymetry, as most terrain data record only the water surface elevation. The impact of this limitation on the accuracy of hydraulic flood simulations has not been well studied over a large range of flood magnitudes and modeling frameworks. Advancing our understanding of this topic is timely given emerging national and global efforts to develop automated flood prediction systems (e.g., the NOAA National Water Center). Here we study the response of flood simulations to the incorporation of different bathymetry and floodplain survey sources. Different hydraulic models are compared: Mike-Flood, a 2D hydrodynamic model, and GSSHA, a hydrology/hydraulics model. We test the hypothesis that the impact of including or excluding bathymetry data on hydraulic model results will vary in magnitude as a function of river size. This will allow researchers and stakeholders more accurate predictions of flood events, providing useful information that will help local communities in vulnerable flood zones mitigate flood hazards. It will also help evaluate the accuracy and efficiency of different modeling frameworks and gauge their dependency on detailed bathymetry input data.

  8. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2010-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring. WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed WP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce a realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  9. Can the GEOS CCM Simulate the Temperature Response to Warm Pool El Nino Events in the Antarctic Stratosphere?

    NASA Technical Reports Server (NTRS)

    Hurwitz, M. M.; Song, I.-S.; Oman, L. D.; Newman, P. A.; Molod, A. M.; Frith, S. M.; Nielsen, J. E.

    2011-01-01

    "Warm pool" (WP) El Nino events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. During austral spring, WP El Nino events are associated with an enhancement of convective activity in the South Pacific Convergence Zone, provoking a tropospheric planetary wave response and thus increasing planetary wave driving of the Southern Hemisphere stratosphere. These conditions lead to higher polar stratospheric temperatures and to a weaker polar jet during austral summer, as compared with neutral ENSO years. Furthermore, this response is sensitive to the phase of the quasi-biennial oscillation (QBO): a stronger warming is seen in WP El Nino events coincident with the easterly phase of the quasi-biennial oscillation (QBO) as compared with WP El Nino events coincident with a westerly or neutral QBO. The Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) is used to further explore the atmospheric response to ENSO. Time-slice simulations are forced by composited SSTs from observed NP El Nino and neutral ENSO events. The modeled eddy heat flux, temperature and wind responses to WP El Nino events are compared with observations. A new gravity wave drag scheme has been implemented in the GEOS CCM, enabling the model to produce e realistic, internally generated QBO. By repeating the above time-slice simulations with this new model version, the sensitivity of the WP El Nino response to the phase of the quasi-biennial oscillation QBO is estimated.

  10. A Model Independent General Search for new physics in ATLAS

    NASA Astrophysics Data System (ADS)

    Amoroso, S.; ATLAS Collaboration

    2016-04-01

    We present results of a model-independent general search for new phenomena in proton-proton collisions at a centre-of-mass energy of 8 TeV with the ATLAS detector at the LHC. The data set corresponds to a total integrated luminosity of 20.3 fb-1. Event topologies involving isolated electrons, photons and muons, as well as jets, including those identified as originating from b-quarks (b-jets) and missing transverse momentum are investigated. The events are subdivided according to their final states into exclusive event classes. For the 697 classes with a Standard Model expectation greater than 0.1 events, a search algorithm tests the compatibility of data against the Monte Carlo simulated background in three kinematic variables sensitive to new physics effects. No significant deviation is found in data. The number and size of the observed deviations follow the Standard Model expectation obtained from simulated pseudo-experiments.

  11. Collaborative Project: Development of an Isotope-Enabled CESM for Testing Abrupt Climate Changes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zhengyu

    One of the most important validations for a state-of-the-art Earth System Model (ESM) with respect to climate changes is the simulation of the climate evolution and abrupt climate change events in the Earth's history of the last 21,000 years. However, one great challenge for model validation is that ESMs usually do not directly simulate geochemical variables that can be compared directly with past proxy records. In this proposal, we have met this challenge by developing the capability to simulate major isotopes in a state-of-the-art ESM, the Community Earth System Model (CESM), enabling us to make direct model-data comparison by comparing the model directly against proxy climate records. Our isotope-enabled ESM incorporates the capability of simulating key isotopes and geotracers, notably δ18O, δD, δ14C, δ13C, Nd, and Pa/Th. The isotope-enabled ESM has been used to perform simulations for the last 21,000 years. The direct comparison of these simulations with proxy records has shed light on the mechanisms of important climate change events.

  12. THE STORM WATER MANAGEMENT MODEL (SWMM) AND RELATED WATERSHED TOOLS DEVELOPMENT

    EPA Science Inventory

    The Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. It is the only publicly available model capable of performing a comprehensiv...

  13. Recent examples of mesoscale numerical forecasts of severe weather events along the east coast

    NASA Technical Reports Server (NTRS)

    Kocin, P. J.; Uccellini, L. W.; Zack, J. W.; Kaplan, M. L.

    1984-01-01

    Mesoscale numerical forecasts utilizing the Mesoscale Atmospheric Simulation System (MASS) are documented for two East Coast severe weather events. The two events are the thunderstorm and heavy snow bursts in the Washington, D.C. - Baltimore, MD region on 8 March 1984 and the devastating tornado outbreak across North and South Carolina on 28 March 1984. The forecasts are presented to demonstrate the ability of the model to simulate dynamical interactions and diabatic processes and to note some of the problems encountered when using mesoscale models for day-to-day forecasting.

  14. Conditional Probabilities of Large Earthquake Sequences in California from the Physics-based Rupture Simulator RSQSim

    NASA Astrophysics Data System (ADS)

    Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.

    2017-12-01

    Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.
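
    One simple form of the sequence statistic described, estimated from a synthetic catalog, is the conditional probability that a large event is followed by another large event within a fixed window. The sketch below is not CISM/RSQSim code; the magnitude threshold and one-year window are arbitrary illustrative choices.

        def conditional_followup_probability(catalog, mag_threshold=7.0, window_years=1.0):
            """catalog: list of (time_in_years, magnitude) tuples, sorted by time."""
            large = [t for t, m in catalog if m >= mag_threshold]
            if len(large) < 2:
                return float("nan")
            followed = sum(1 for i in range(len(large) - 1)
                           if large[i + 1] - large[i] <= window_years)
            return followed / (len(large) - 1)

        toy_catalog = [(12.4, 7.1), (12.9, 6.2), (13.1, 7.3), (58.0, 7.0), (120.5, 7.4)]
        print(conditional_followup_probability(toy_catalog))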

  15. Enabling parallel simulation of large-scale HPC network systems

    DOE PAGES

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; ...

    2016-04-07

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at a flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  16. Enabling parallel simulation of large-scale HPC network systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at a flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  17. Pickup Protons: Comparisons using the Three-Dimensional MHD HHMS-PI model and Ulysses SWICS Measurements

    NASA Technical Reports Server (NTRS)

    Intriligator, Devrie S.; Detman, Thomas; Gloeckler, George; Gloeckler, Christine; Dryer, Murray; Sun, Wei; Intriligator, James; Deehr, Charles

    2012-01-01

    We report the first comparisons of pickup proton simulation results with in situ measurements of pickup protons obtained by the SWICS instrument on Ulysses. Simulations were run using the three-dimensional (3D) time-dependent Hybrid Heliospheric Modeling System with Pickup Protons (HHMS-PI). HHMS-PI is an MHD solar wind model, expanded to include the basic physics of pickup protons from neutral hydrogen that drifts into the heliosphere from the local interstellar medium. We use the same model and input data developed by Detman et al. (2011) to now investigate the pickup protons. The simulated interval of 82 days in 2003-2004 includes both quiet solar wind (SW) and the October-November 2003 solar events (the Halloween 2003 solar storms). The HHMS-PI pickup proton simulations generally agree with the SWICS measurements, and the HHMS-PI simulated solar wind generally agrees with SWOOPS (also on Ulysses) measurements. Many specific features in the observations are well represented by the model. We simulated twenty specific solar events associated with the Halloween 2003 storm. We give the specific values of the solar input parameters for the HHMS-PI simulations that provide the best combined agreement in the times of arrival of the solar-generated shocks at both ACE and Ulysses. We show graphical comparisons of simulated and observed parameters, and we give quantitative measures of the agreement of simulated with observed parameters. We suggest that some of the variations in the pickup proton density during the Halloween 2003 solar events may be attributed to depletion of the inflowing local interstellar medium (LISM) neutral hydrogen (H) caused by its increased conversion to pickup protons in the immediately preceding shock.

  18. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks

    PubMed Central

    Naveros, Francisco; Garrido, Jesus A.; Carrillo, Richard R.; Ros, Eduardo; Luque, Niceto R.

    2017-01-01

    Modeling and simulating the neural structures which make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranged from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity together with a better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running in CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications which constitute the main contribution of this study systematically outperform the traditional event- and time-driven techniques under increasing levels of neural complexity. PMID:28223930
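
    The bi-fixed-step idea, a coarse step while the dynamics are slow and a finer step when they stiffen, can be sketched for a leaky integrate-and-fire neuron as below. The parameters and the threshold-proximity switching rule are illustrative assumptions, not the implementation described in the paper.

        def simulate_lif(i_ext, t_end=0.2, dt_coarse=1e-3, dt_fine=1e-4,
                         tau=0.02, v_rest=-0.065, v_thresh=-0.050, v_reset=-0.065, r=1e7):
            """Forward-Euler LIF neuron advanced with two fixed step sizes."""
            t, v, spikes = 0.0, v_rest, []
            while t < t_end:
                # Switch to the fine step when the membrane potential nears threshold.
                dt = dt_fine if v > v_thresh - 0.005 else dt_coarse
                v += (-(v - v_rest) + r * i_ext) / tau * dt
                t += dt
                if v >= v_thresh:
                    spikes.append(t)
                    v = v_reset
            return spikes

        print(len(simulate_lif(i_ext=2.0e-9)))   # number of spikes in 200 ms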

  19. Simulating Impacts of Disruptions to Liquid Fuels Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Michael; Corbet, Thomas F.; Baker, Arnold B.

    This report presents a methodology for estimating the impacts of events that damage or disrupt liquid fuels infrastructure. The impact of a disruption depends on which components of the infrastructure are damaged, the time required for repairs, and the position of the disrupted components in the fuels supply network. Impacts are estimated for seven stressing events in regions of the United States, which were selected to represent a range of disruption types. For most of these events the analysis is carried out using the National Transportation Fuels Model (NTFM) to simulate the system-level liquid fuels sector response. Results are presented for each event, and a brief cross comparison of event simulation results is provided.

  20. Reversible Parallel Discrete-Event Execution of Large-scale Epidemic Outbreak Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Seal, Sudip K

    2010-01-01

    The spatial scale, runtime speed and behavioral detail of epidemic outbreak simulations together require the use of large-scale parallel processing. In this paper, an optimistic parallel discrete event execution of a reaction-diffusion simulation model of epidemic outbreaks is presented, with an implementation over the µsik simulator. Rollback support is achieved with the development of a novel reversible model that combines reverse computation with a small amount of incremental state saving. Parallel speedup and other runtime performance metrics of the simulation are tested on a small (8,192-core) Blue Gene/P system, while scalability is demonstrated on 65,536 cores of a large Cray XT5 system. Scenarios representing large population sizes (up to several hundred million individuals in the largest case) are exercised.
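
    Reverse computation replaces full state copies with event handlers that can be undone exactly. The toy sketch below is in the spirit of (but not taken from) the reversible epidemic model: the forward handler of an infection event moves one individual between compartments, and the reverse handler restores the state when an optimistic execution is rolled back.

        class Region:
            def __init__(self, susceptible, infected):
                self.susceptible = susceptible
                self.infected = infected

        def infection_forward(region):
            """Process an infection event; no state copy needs to be saved."""
            region.susceptible -= 1
            region.infected += 1

        def infection_reverse(region):
            """Exactly undo infection_forward during a rollback."""
            region.infected -= 1
            region.susceptible += 1

        r = Region(susceptible=100, infected=1)
        infection_forward(r)
        infection_reverse(r)                       # optimistic rollback
        assert (r.susceptible, r.infected) == (100, 1)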

  1. Assessing the applicability of WRF optimal parameters under the different precipitation simulations in the Greater Beijing Area

    NASA Astrophysics Data System (ADS)

    Di, Zhenhua; Duan, Qingyun; Wang, Chen; Ye, Aizhong; Miao, Chiyuan; Gong, Wei

    2018-03-01

    Forecasting skills of complex weather and climate models have been improved by tuning the sensitive parameters that exert the greatest impact on simulated results, using increasingly effective optimization methods. However, whether the optimal parameter values still work when the model simulation conditions vary is a scientific question deserving of study. In this study, a highly effective optimization method, adaptive surrogate model-based optimization (ASMO), was first used to tune nine sensitive parameters from four physical parameterization schemes of the Weather Research and Forecasting (WRF) model to obtain better summer precipitation forecasts over the Greater Beijing Area in China. Then, to assess the applicability of the optimal parameter values, simulation results from the WRF model with default and optimal parameter values were compared across precipitation events, boundary conditions, spatial scales, and physical processes in the Greater Beijing Area. Summer precipitation events from six years were used to calibrate and evaluate the optimal parameter values of the WRF model. Three boundary data sets and two spatial resolutions were adopted to evaluate the superiority of the calibrated optimal parameters over the default parameters under WRF simulations with different boundary conditions and spatial resolutions. Physical interpretations of the optimal parameters, indicating how they improve precipitation simulation results, were also examined. All the results showed that the optimal parameters obtained by ASMO are superior to the default parameters for predicting summer precipitation in the Greater Beijing Area because they are not constrained by specific precipitation events, boundary conditions, or spatial resolutions. The optimal values of the nine parameters were determined from 127 parameter samples using the ASMO method, showing that ASMO is highly efficient for optimizing WRF model parameters.

  2. MoSeS: Modelling and Simulation for e-Social Science.

    PubMed

    Townend, Paul; Xu, Jie; Birkin, Mark; Turner, Andy; Wu, Belinda

    2009-07-13

    MoSeS (Modelling and Simulation for e-Social Science) is a research node of the National Centre for e-Social Science. MoSeS uses e-Science techniques to execute an event-driven model that simulates discrete demographic processes; this allows us to project the UK population 25 years into the future. This paper describes the architecture, simulation methodology and latest results obtained by MoSeS.

  3. Simulation of an Extreme Off Season Rainy Event over Senegal Using WRF ARW Model: A focus on dynamic, thermodynamic processes and predictability

    NASA Astrophysics Data System (ADS)

    Sarr, A.

    2016-12-01

    This study investigates a lesser-known class of weather events: off-season rains that affect the western Sahel, mainly Senegal, Cape Verde and Mauritania, during boreal winter. They are characterized by cloudy conditions at mid-levels, which can trigger light, long-lasting rains. An extreme case occurred from 9 to 11 January 2002, producing unusually heavy rains that had dramatic consequences for livestock and irrigated crops. The Weather Research and Forecasting model (WRF ARW version 3.4) is used to simulate the event, which affected the western coast around the land/ocean interface and caused huge damage in Senegal and Mauritania. The model was able to reasonably simulate the event and its intensity 2 to 3 days in advance, demonstrating the usefulness of such a tool for early warning systems (EWS), which could help mitigate the impacts. The location of the rain band was closer to the observed situation in the higher-resolution domains. The study identified key dynamic and thermodynamic conditions associated with the event. The evolution of precipitable water (PW) played a central role in the intensity of the event. The deep trough associated with the disturbance forced a northeastward transport of moisture from the Inter-Tropical Convergence Zone (ITCZ) over the ocean towards Senegal and Mauritania.

  4. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis.

    PubMed

    Tran-Duy, An; Boonen, Annelies; van de Laar, Mart A F J; Franke, Angelinus C; Severens, Johan L

    2011-12-01

    To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). The discrete event simulation paradigm was selected for model development. Drug efficacy was modelled as changes in disease activity (Bath Ankylosing Spondylitis Disease Activity Index (BASDAI)) and functional status (Bath Ankylosing Spondylitis Functional Index (BASFI)), which were linked to costs and health utility using statistical models fitted to an observational AS cohort. Published clinical data were used to estimate drug efficacy and time to events. Two strategies were compared: (1) five available non-steroidal anti-inflammatory drugs (strategy 1) and (2) the same as strategy 1 plus two tumour necrosis factor α inhibitors (strategy 2). 13,000 patients were followed up individually until death. For probabilistic sensitivity analysis, Monte Carlo simulations were performed with 1000 sets of parameters sampled from the appropriate probability distributions. The models successfully generated valid data on treatments, BASDAI, BASFI, utility, quality-adjusted life years (QALYs) and costs at time points with intervals of 1-3 months over the simulation length of 70 years. The incremental cost per QALY gained in strategy 2 compared with strategy 1 was €35,186. At a willingness-to-pay threshold of €80,000, it was 99.9% certain that strategy 2 was cost-effective. The modelling framework provides great flexibility to implement complex algorithms representing treatment selection, disease progression and changes in costs and utilities over time for patients with AS. Results obtained from the simulation are plausible.
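
    A heavily compressed sketch of the discrete event idea, not the published model, is shown below: each simulated patient works through a drug sequence, with time on each drug and treatment response sampled from assumed distributions, and utility and costs accumulated between events. Every number in the sketch is a placeholder.

        import heapq, random

        def simulate_patient(drug_sequence, horizon_years=70.0, seed=0):
            rng = random.Random(seed)
            t, utility, cost, qaly = 0.0, 0.55, 0.0, 0.0
            events = [(0.0, 0)]                    # (event time, index of next drug)
            while events:
                time, idx = heapq.heappop(events)
                if time > horizon_years:
                    break
                qaly += utility * (time - t)       # accumulate QALYs since the last event
                t = time
                if idx < len(drug_sequence):
                    name, annual_cost, response_prob = drug_sequence[idx]
                    duration = rng.expovariate(1.0 / 3.0)      # mean 3 years on the drug
                    cost += annual_cost * duration
                    if rng.random() < response_prob:
                        utility = min(0.85, utility + 0.15)    # responder gains utility
                    heapq.heappush(events, (t + duration, idx + 1))
                else:
                    utility = 0.45                 # sequence exhausted, no further options
            qaly += utility * max(0.0, horizon_years - t)
            return qaly, cost

        nsaids_only = [("nsaid%d" % i, 300.0, 0.3) for i in range(5)]
        print(simulate_patient(nsaids_only))       # (lifetime QALYs, lifetime drug cost)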

  5. Answering Questions about Complex Events

    DTIC Science & Technology

    2008-12-19

    in their environment. To reason about events requires a means of describing, simulating, and analyzing their underlying dynamic processes. For our...that are relevant to our goal of connecting inference and reasoning about processes to answering questions about events. We start with a...different event and process descriptions, ontologies, and models. 2.1.1 Logical AI In AI, formal approaches to model the ability to reason about

  6. On simulating large earthquakes by Green's-function addition of smaller earthquakes

    NASA Astrophysics Data System (ADS)

    Joyner, William B.; Boore, David M.

    Simulation of ground motion from large earthquakes has been attempted by a number of authors using small earthquakes (subevents) as Green's functions and summing them, generally in a random way. We present a simple model for the random summation of subevents to illustrate how seismic scaling relations can be used to constrain methods of summation. In the model, η identical subevents are added together with their start times randomly distributed over the source duration T and their waveforms scaled by a factor κ. The subevents can be considered to be distributed on a fault with later start times at progressively greater distances from the focus, simulating the irregular propagation of a coherent rupture front. For simplicity, the distance between source and observer is assumed large compared to the source dimensions of the simulated event. By proper choice of η and κ the spectrum of the simulated event deduced from these assumptions can be made to conform at both low- and high-frequency limits to any arbitrary seismic scaling law. For the ω-squared model with similarity (that is, with constant Mo fo^3 scaling, where fo is the corner frequency), the required values are η = (Mo/Moe)^(4/3) and κ = (Mo/Moe)^(-1/3), where Mo is the moment of the simulated event and Moe is the moment of the subevent. The spectra resulting from other choices of η and κ will not conform at both high and low frequency. If η is determined by the ratio of the rupture area of the simulated event to that of the subevent and κ = 1, the simulated spectrum will conform at high frequency to the ω-squared model with similarity, but not at low frequency. Because the high-frequency part of the spectrum is generally the important part for engineering applications, however, this choice of values for η and κ may be satisfactory in many cases. If η is determined by the ratio of the moment of the simulated event to that of the subevent and κ = 1, the simulated spectrum will conform at low frequency to the ω-squared model with similarity, but not at high frequency. Interestingly, the high-frequency scaling implied by this latter choice of η and κ corresponds to an ω-squared model with constant Mo fo^4, a scaling law proposed by Nuttli, although questioned recently by Haar and others. Simple scaling with κ equal to unity and η equal to the moment ratio would work if the high-frequency spectral decay were ω^(-1.5) instead of ω^(-2). Just the required decay is exhibited by the stochastic source model recently proposed by Joyner, if the dislocation-time function is deconvolved out of the spectrum. Simulated motions derived from such source models could be used as subevents rather than recorded motions, as is usually done. This strategy is a promising approach to simulation of ground motion from an extended rupture.
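
    A minimal numerical sketch of the summation scheme is given below, assuming the ω-squared-with-similarity choices η = (Mo/Moe)^(4/3) and κ = (Mo/Moe)^(-1/3) quoted above; the subevent waveform and all numbers are placeholders, not data from the paper.

        import numpy as np

        def simulate_large_event(subevent, dt, moment_ratio, duration, seed=0):
            """Sum eta kappa-scaled copies of the subevent, start times uniform on [0, T)."""
            rng = np.random.default_rng(seed)
            eta = int(round(moment_ratio ** (4.0 / 3.0)))    # number of subevents
            kappa = moment_ratio ** (-1.0 / 3.0)             # amplitude scale factor
            out = np.zeros(int(round(duration / dt)) + len(subevent))
            for t0 in rng.uniform(0.0, duration, size=eta):
                i0 = int(t0 / dt)
                out[i0:i0 + len(subevent)] += kappa * subevent
            return out

        dt = 0.01                                            # s
        tt = np.arange(0.0, 1.0, dt)
        subevent = np.sin(2.0 * np.pi * 2.0 * tt) * np.exp(-3.0 * tt)   # placeholder record
        motion = simulate_large_event(subevent, dt, moment_ratio=100.0, duration=8.0)
        print(motion.size, float(np.abs(motion).max()))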

  7. Enhancement of the Logistics Battle Command Model: Architecture Upgrades and Attrition Module Development

    DTIC Science & Technology

    2017-01-05

    module. 15. SUBJECT TERMS Logistics, attrition, discrete event simulation, Simkit, LBC 16. SECURITY CLASSIFICATION OF: Unclassified 17. LIMITATION...stochastics, and discrete event model programmed in Java building largely on the Simkit library. The primary purpose of the LBC model is to support...equations makes them incompatible with the discrete event construct of LBC. Bullard further advances this methodology by developing a stochastic

  8. Stochastic generation of hourly rainstorm events in Johor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nojumuddin, Nur Syereena; Yusof, Fadhilah; Yusop, Zulkifli

    2015-02-03

    Engineers and researchers in water-related studies are often faced with the problem of insufficiently long rainfall records. Practical and effective methods must be developed to generate unavailable data from the limited available data. Therefore, this paper presents a Monte Carlo based stochastic hourly rainfall generation model to complement the unavailable data. The Monte Carlo simulation used in this study is based on the best fit of storm characteristics. Using Maximum Likelihood Estimation (MLE) and the Anderson-Darling goodness-of-fit test, the lognormal distribution appeared to fit the rainstorm characteristics best, so the Monte Carlo simulation was based on the lognormal distribution. The proposed model was verified by comparing the statistical moments of rainstorm characteristics from the combination of observed rainstorm events from 10 years of records and simulated rainstorm events for the remaining 30 years with those from the entire 40 years of observed rainfall data, based on the hourly rainfall data at station J1 in Johor over the period 1972-2011. The absolute percentage errors of the duration-depth, duration-inter-event time and depth-inter-event time relationships were used as the accuracy test. The results showed the first four product-moments of the observed rainstorm characteristics were close to those of the simulated rainstorm characteristics. The proposed model can be used as a basis to derive rainfall intensity-duration-frequency relationships in Johor.
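
    A minimal sketch of the Monte Carlo generation step, sampling rainstorm characteristics from lognormal distributions; the (mu, sigma) values are placeholders, not the MLE fits for station J1.

    # Sketch of Monte Carlo generation of rainstorm events from fitted lognormal
    # distributions (parameter values below are placeholders, not the fitted values).
    import numpy as np

    rng = np.random.default_rng(seed=1)

    # (mu, sigma) of the underlying normal for each rainstorm characteristic
    params = {
        "depth_mm":      (2.5, 1.0),
        "duration_h":    (1.2, 0.8),
        "inter_event_h": (4.0, 1.1),
    }

    def generate_events(n_events):
        events = []
        for _ in range(n_events):
            events.append({k: rng.lognormal(mu, sigma) for k, (mu, sigma) in params.items()})
        return events

    simulated = generate_events(1000)
    mean_depth = np.mean([e["depth_mm"] for e in simulated])   # compare to observed moments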

  9. Staffs’ and managers’ perceptions of how and when discrete event simulation modelling can be used as a decision support in quality improvement: a focus group discussion study at two hospital settings in Sweden

    PubMed Central

    Hvitfeldt-Forsberg, Helena; Mazzocato, Pamela; Glaser, Daniel; Keller, Christina; Unbeck, Maria

    2017-01-01

    Objective To explore healthcare staffs’ and managers’ perceptions of how and when discrete event simulation modelling can be used as a decision support in improvement efforts. Design Two focus group discussions were performed. Setting Two settings were included: a rheumatology department and an orthopaedic section, both situated in Sweden. Participants Healthcare staff and managers (n=13) from the two settings. Interventions Two workshops were performed, one at each setting. Workshops were initiated by a short introduction to simulation modelling. Results from the respective simulation model were then presented and discussed in the following focus group discussion. Results Categories from the content analysis are presented according to the research questions of how and when simulation modelling can assist healthcare improvement. Regarding how, the participants mentioned that simulation modelling could act as a tool for support and a way to visualise problems, potential solutions and their effects. Regarding when, simulation modelling could be used both locally and by management, as well as a pedagogical tool to develop and test innovative ideas and to involve everyone in the improvement work. Conclusions Its potential as an information and communication tool and as an instrument for pedagogic work within healthcare improvement renders a broader application and value of simulation modelling than previously reported. PMID:28588107

  10. Improving simulated long-term responses of vegetation to temperature and precipitation extremes using the ACME land model

    NASA Astrophysics Data System (ADS)

    Ricciuto, D. M.; Warren, J.; Guha, A.

    2017-12-01

    While carbon and energy fluxes in current Earth system models generally have reasonable instantaneous responses to extreme temperature and precipitation events, they often do not adequately represent the long-term impacts of these events. For example, simulated net primary productivity (NPP) may decrease during an extreme heat wave or drought, but may recover rapidly to pre-event levels following the conclusion of the extreme event. However, field measurements indicate that long-lasting damage to leaves and other plant components often occurs, potentially affecting the carbon and energy balance for months after the extreme event. The duration and frequency of such extreme conditions are likely to shift in the future, and therefore it is critical for Earth system models to better represent these processes for more accurate predictions of future vegetation productivity and land-atmosphere feedbacks. Here we modify the structure of the Accelerated Climate Model for Energy (ACME) land surface model to represent long-term impacts and test the improved model against observations from experiments that applied extreme conditions in growth chambers. Additionally, we test the model against eddy covariance measurements that followed extreme conditions at selected locations in North America, and against satellite-measured vegetation indices following regional extreme events.

  11. Modeling of single event transients with dual double-exponential current sources: Implications for logic cell characterization

    DOE PAGES

    Black, Dolores Archuleta; Robinson, William H.; Wilcox, Ian Zachary; ...

    2015-08-07

    Single event effects (SEE) are a reliability concern for modern microelectronics. Bit corruptions can be caused by single event upsets (SEUs) in the storage cells or by sampling single event transients (SETs) from a logic path. Accordingly, an accurate prediction of soft error susceptibility from SETs requires good models to convert collected charge into compact descriptions of the current injection process. This paper describes a simple, yet effective, method to model the current waveform resulting from a charge collection event for SET circuit simulations. The model uses two double-exponential current sources in parallel, and the results illustrate why a conventional model based on one double-exponential source can be incomplete. Furthermore, a small set of logic cells with varying input conditions, drive strength, and output loading are simulated to extract the parameters for the dual double-exponential current sources. As a result, the parameters are based upon both the node capacitance and the restoring current (i.e., drive strength) of the logic cell.
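
    A hedged sketch of the dual double-exponential waveform idea: two double-exponential current sources in parallel, one prompt and one delayed component. All amplitudes and time constants below are illustrative, not extracted parameters.

    # Sketch of a single-event transient current built from two double-exponential
    # sources in parallel; values are illustrative only.
    import numpy as np

    def double_exp(t, I0, tau_rise, tau_fall):
        # classic double-exponential pulse shape
        return I0 * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))

    def dual_double_exp(t, prompt=(1.0e-3, 5e-12, 50e-12), delayed=(0.2e-3, 50e-12, 500e-12)):
        # "prompt" and "delayed" components, each (I0 [A], tau_rise [s], tau_fall [s])
        return double_exp(t, *prompt) + double_exp(t, *delayed)

    t = np.linspace(0.0, 2e-9, 2001)
    i_set = dual_double_exp(t)
    collected_charge = float(np.sum(i_set[:-1] * np.diff(t)))   # integrate current to get charge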

  12. Detection and characterization of debris avalanche and pyroclastic flow dynamics from the simulation of the seismic signal they generate: application to Montserrat, Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Zhao, J.; Mangeney, A.; Moretti, L.; Stutzmann, E.; Calder, E. S.; Smith, P. J.; Capdeville, Y.; Le Friant, A.; Cole, P.; Luckett, R.; Robertson, R.

    2011-12-01

    Gravitational instabilities such as debris avalanches or pyroclastic flows represent one of the major natural hazards for populations who live in mountainous or volcanic areas. Detection and understanding of the dynamics of these events is crucial for risk assessment. Furthermore, during an eruption, a series of explosions and gravitational flows can occur, making it difficult to retrieve the characteristics of the individual gravitational events such as their volume, velocity, etc. In this context, the seismic signal generated by these events provides a unique tool to extract information on the history of the eruptive process and to validate gravitational flow models. We analyze here a series of events including explosions, a debris avalanche and pyroclastic flows that occurred in Montserrat in December 1997. The seismic signal is composed of six main pulses. The characteristics of the seismic signals generated by pyroclastic flows (amplitude, emergent onset, frequency spectrum, etc.) are described and linked to the volume of the individual events estimated from past field surveys. As a first step, we simulate the waveform of each event by assuming that the generation process reduces to a simple force applied at the surface of the topography. Going further, we perform a detailed numerical simulation of the Boxing Day debris avalanche and of the following pyroclastic flow using a landslide model able to take into account the 3D topography. The stress field generated by the gravitational flows on the topography is then applied as a surface boundary condition in a wave propagation model, making it possible to simulate the seismic signal generated by the avalanche and pyroclastic flow. Comparison between the simulated signal and the seismic signal recorded at the Puerto Rico seismic station located 450 km away from the source shows that this method allows us to reproduce the low-frequency seismic signal and to constrain the volume and frictional behavior of the individual events. As a result, simulation of seismic signals generated by gravitational flows provides insight into the history of eruptive sequences and into the characteristics of the individual events.

  13. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    PubMed

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.

  14. Rainfall Stochastic models

    NASA Astrophysics Data System (ADS)

    Campo, M. A.; Lopez, J. J.; Rebole, J. P.

    2012-04-01

    This work was carried out in the north of Spain. The San Sebastian meteorological station, where precipitation records are available every ten minutes, was selected. Precipitation data cover the period from October 1927 to September 1997. Pulse models describe the temporal process of rainfall as a succession of rain cells: a main storm process whose origins are distributed in time according to a Poisson process and a secondary process that generates a random number of rain cells within each storm. Among the different pulse models, the Bartlett-Lewis model was used. On the other hand, alternative renewal processes and Markov chains describe the way in which the process will evolve in the future depending only on the current state; they therefore do not depend on past events. Two basic processes are considered when describing the occurrence of rain: the alternation of wet and dry periods, and the temporal distribution of rainfall in each rain event, which determines the rainwater collected in each of the intervals that make up the rain. This allows the introduction of alternative renewal processes and Markov chains of three states, where the interstorm time is given by either of the two dry states, short or long. Thus, the stochastic model of Markov chains tries to reproduce the basis of pulse models, the succession of storms, each one composed of a series of rain cells separated by short dry intervals, without the theoretical complexity of pulse models. In a first step, we analyzed all variables involved in the sequential process of the rain: rain event duration, duration of non-rain periods, average rainfall intensity in rain events, and finally, temporal distribution of rainfall within the rain event. Additionally, for Bartlett-Lewis pulse model calibration, the main descriptive statistics were calculated for each month, considering the seasonality of rainfall in each month. In a second step, both models were calibrated. Finally, synthetic series were simulated with the calibrated parameters; series were generated every ten minutes and aggregated to hourly values. Preliminary results show adequate simulation of the main features of rain. The main variables are well simulated for ten-minute time series and also for one-hour precipitation series, which are those that generate the higher design rainfall in hydrology. For scales below one hour, rainfall durations are not well reproduced by the simulation. A hypothesis is an excessive number of simulated events, which causes further fragmentation of storms, resulting in an excess of short rain events (less than 1 hour), and therefore also of short inter-event times, compared with those that occur in the observed series.
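
    A simplified Poisson-cluster generator in the spirit of the Bartlett-Lewis model described above; all rates are placeholders rather than the calibrated San Sebastian parameters, and the scheme omits several features of the full model.

    # Simplified Poisson-cluster (Bartlett-Lewis type) rainfall generator, illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_rainfall(total_hours, lam=0.02, beta=0.5, gamma=0.1, eta=1.0, mu_x=2.0, dt=1/6):
        """lam: storm arrival rate [1/h]; beta: cell arrival rate within a storm [1/h];
        gamma: 1/mean storm activity duration [1/h]; eta: 1/mean cell duration [1/h];
        mu_x: mean cell intensity [mm/h]; dt: time step [h] (10 min)."""
        n_steps = int(total_hours / dt)
        rain = np.zeros(n_steps)
        t = rng.exponential(1.0 / lam)                 # first storm origin
        while t < total_hours:
            activity = rng.exponential(1.0 / gamma)    # window in which cells can start
            cell_start = t
            while cell_start < t + activity:
                dur = rng.exponential(1.0 / eta)
                inten = rng.exponential(mu_x)
                i0 = int(cell_start / dt)
                i1 = min(n_steps, int((cell_start + dur) / dt) + 1)
                rain[i0:i1] += inten * dt              # depth added per 10-min step
                cell_start += rng.exponential(1.0 / beta)
            t += rng.exponential(1.0 / lam)            # next storm origin
        return rain

    ten_minute_series = simulate_rainfall(24 * 365)
    hourly_series = ten_minute_series.reshape(-1, 6).sum(axis=1)   # aggregate to hourly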

  15. Autonomous control of production networks using a pheromone approach

    NASA Astrophysics Data System (ADS)

    Armbruster, D.; de Beer, C.; Freitag, M.; Jagalski, T.; Ringhofer, C.

    2006-04-01

    The flow of parts through a production network is usually pre-planned by a central control system. Such central control fails in the presence of highly fluctuating demand and/or unforeseen disturbances. To manage such dynamic networks with low work-in-progress and short throughput times, an autonomous control approach is proposed. Autonomous control means a decentralized routing by the autonomous parts themselves. The parts' decisions are based on backward-propagated information about the throughput times of finished parts on the different routes, so routes with shorter throughput times attract subsequent parts to use them again. This process can be compared to ants leaving pheromones on their way to communicate with following ants. The paper focuses on a mathematical description of such autonomously controlled production networks. A fluid model with limited service rates in a general network topology is derived and compared to a discrete-event simulation model. Whereas the discrete-event simulation of production networks is straightforward, the formulation of the addressed scenario in terms of a fluid model is challenging. Here it is shown how several problems in the fluid model formulation (e.g. discontinuities) can be handled mathematically. Finally, some simulation results for the pheromone-based control with both the discrete-event simulation model and the fluid model are presented for a time-dependent influx.
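
    A minimal sketch of the pheromone-style routing rule described above: each part chooses among parallel routes with probability weighted by the inverse of remembered throughput times, and the route memory decays (evaporates) as new throughput times are reported back. Class and parameter names are ours.

    # Illustrative pheromone-style route choice with exponential "evaporation".
    import random

    class PheromoneRouter:
        def __init__(self, routes, alpha=0.3):
            self.alpha = alpha                          # smoothing (evaporation) factor
            self.tpt = {r: 1.0 for r in routes}         # remembered throughput times

        def choose_route(self):
            weights = {r: 1.0 / t for r, t in self.tpt.items()}   # shorter time -> larger weight
            x = random.uniform(0.0, sum(weights.values()))
            for r, w in weights.items():
                x -= w
                if x <= 0.0:
                    return r
            return r

        def report(self, route, throughput_time):
            # backward-propagated information from a finished part
            self.tpt[route] = (1 - self.alpha) * self.tpt[route] + self.alpha * throughput_time

    router = PheromoneRouter(["line_A", "line_B", "line_C"])
    chosen = router.choose_route()
    router.report(chosen, throughput_time=4.2)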

  16. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remain underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.

  17. Simulation of metals transport and toxicity at a mine-impacted watershed: California Gulch, Colorado.

    PubMed

    Velleux, Mark L; Julien, Pierre Y; Rojas-Sanchez, Rosalia; Clements, William H; England, John F

    2006-11-15

    The transport and toxicity of metals at the California Gulch, Colorado mine-impacted watershed were simulated with a spatially distributed watershed model. Using a database of observations for the period 1984-2004, hydrology, sediment transport, and metals transport were simulated for a June 2003 calibration event and a September 2003 validation event. Simulated flow volumes were within approximately 10% of observed conditions. Observed ranges of total suspended solids, cadmium, copper, and zinc concentrations were also successfully simulated. The model was then used to simulate the potential impacts of a 1-in-100-year rainfall event. Driven by large flows and corresponding soil and sediment erosion for the 1-in-100-year event, estimated solids and metals export from the watershed is 10,000 metric tons for solids, 215 kg for Cd, 520 kg for Cu, and 15,300 kg for Zn. As expressed by the cumulative criterion unit (CCU) index, metals concentrations far exceed toxic effects thresholds, suggesting a high probability of toxic effects downstream of the gulch. More detailed Zn source analyses suggest that much of the Zn exported from the gulch originates from slag piles adjacent to the lower gulch floodplain and an old mining site located near the head of the lower gulch.

  18. On the performance of voltage stepping for the simulation of adaptive, nonlinear integrate-and-fire neuronal networks.

    PubMed

    Kaabi, Mohamed Ghaith; Tonnelier, Arnaud; Martinez, Dominique

    2011-05-01

    In traditional event-driven strategies, spike timings are analytically given or calculated with arbitrary precision (up to machine precision). Exact computation is possible only for simplified neuron models, mainly the leaky integrate-and-fire model. In a recent paper, Zheng, Tonnelier, and Martinez (2009) introduced an approximate event-driven strategy, named voltage stepping, that allows the generic simulation of nonlinear spiking neurons. Promising results were achieved in the simulation of single quadratic integrate-and-fire neurons. Here, we assess the performance of voltage stepping in network simulations by considering more complex neurons (quadratic integrate-and-fire neurons with adaptation) coupled with multiple synapses. To handle the discrete nature of synaptic interactions, we recast voltage stepping in a general framework, the discrete event system specification. The efficiency of the method is assessed through simulations and comparisons with a modified time-stepping scheme of the Runge-Kutta type. We demonstrated numerically that the original order of voltage stepping is preserved when simulating connected spiking neurons, independent of the network activity and connectivity.
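
    A minimal sketch of the voltage-stepping idea for a quadratic integrate-and-fire neuron, advancing the state by a fixed voltage increment and deriving the time increment from the local dynamics; this illustrates the principle only, not the published scheme or its discrete event system specification formulation.

    # Voltage stepping for dv/dt = v**2 + I (dimensionless quadratic integrate-and-fire).
    # Illustration of the principle; parameters and reset rule are arbitrary.
    def voltage_step_qif(v0, I, dV=0.01, v_spike=10.0, v_reset=-1.0, t_max=100.0):
        v, t = v0, 0.0
        spike_times = []
        while t < t_max:
            dvdt = v * v + I
            if abs(dvdt) < 1e-9:               # stop near a fixed point to avoid division by zero
                break
            dt = dV / abs(dvdt)                # time needed to move one voltage step
            v += dV if dvdt > 0 else -dV
            t += dt
            if v >= v_spike:
                spike_times.append(t)
                v = v_reset
        return spike_times

    spikes = voltage_step_qif(v0=0.0, I=0.5)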

  19. Using Discrete Event Simulation to predict KPI's at a Projected Emergency Room.

    PubMed

    Concha, Pablo; Neriz, Liliana; Parada, Danilo; Ramis, Francisco

    2015-01-01

    Discrete Event Simulation (DES) is a powerful factor in the design of clinical facilities. DES enables facilities to be built or adapted to achieve the expected Key Performance Indicators (KPI's) such as average waiting times according to acuity, average stay times and others. Our computational model was built and validated using expert judgment and supporting statistical data. One scenario studied resulted in a 50% decrease in the average cycle time of patients compared to the original model, mainly by modifying the patient's attention model.

  20. STORM WATER MANAGEMENT MODEL USER'S MANUAL VERSION 5.0

    EPA Science Inventory

    The EPA Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. SWMM was first developed in 1971 and has undergone several major upgrade...

  1. Urban nonpoint source pollution buildup and washoff models for simulating storm runoff quality in the Los Angeles County.

    PubMed

    Wang, Long; Wei, Jiahua; Huang, Yuefei; Wang, Guangqian; Maqsood, Imran

    2011-07-01

    Many urban nonpoint source pollution models utilize pollutant buildup and washoff functions to simulate storm runoff quality of urban catchments. In this paper, two urban pollutant washoff load models are derived using pollutant buildup and washoff functions. The first model assumes that there is no residual pollutant after a storm event while the second one assumes that there is always residual pollutant after each storm event. The developed models are calibrated and verified with observed data from an urban catchment in the Los Angeles County. The application results show that the developed model with consideration of residual pollutant is more capable of simulating nonpoint source pollution from urban storm runoff than that without consideration of residual pollutant. For the study area, residual pollutant should be considered in pollutant buildup and washoff functions for simulating urban nonpoint source pollution when the total runoff volume is less than 30 mm. Copyright © 2011 Elsevier Ltd. All rights reserved.
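
    The buildup/washoff functions the two models build on can be sketched as exponential buildup during dry weather and exponential washoff during events; the hedged sketch below (coefficients illustrative, not the calibrated Los Angeles County values) contrasts the two residual-pollutant assumptions.

    # Illustrative exponential buildup/washoff formulation.
    import math

    def buildup(dry_days, B_max=50.0, k_b=0.4, B_residual=0.0):
        """Pollutant mass on the surface [kg/ha] after a dry period, starting from any residual."""
        return B_residual + (B_max - B_residual) * (1.0 - math.exp(-k_b * dry_days))

    def washoff(B0, runoff_mm, k_w=0.18):
        """Mass washed off during a storm and the residual left on the catchment."""
        washed = B0 * (1.0 - math.exp(-k_w * runoff_mm))
        return washed, B0 - washed

    # First model in the abstract: residual is assumed zero after every event.
    # Second model: the residual is carried forward into the next buildup period.
    B = buildup(dry_days=5)
    load, residual = washoff(B, runoff_mm=25.0)
    B_next = buildup(dry_days=3, B_residual=residual)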

  2. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.

  3. The influence of preferential flow on pressure propagation and landslide triggering of the Rocca Pitigliana landslide

    NASA Astrophysics Data System (ADS)

    Shao, Wei; Bogaard, Thom; Bakker, Mark; Berti, Matteo

    2016-12-01

    The fast pore water pressure response to rain events is an important triggering factor for slope instability. The fast pressure response may be caused by preferential flow that bypasses the soil matrix. Currently, most hydro-mechanical models simulate pore water pressure using a single-permeability model, which cannot quantify the effects of preferential flow on pressure propagation and landslide triggering. Previous studies showed that a model based on the linear-diffusion equation can simulate the fast pressure propagation in near-saturated landslides such as the Rocca Pitigliana landslide. In such a model, the diffusion coefficient depends on the degree of saturation, which makes it difficult to use the model for predictions. In this study, the influence of preferential flow on pressure propagation and slope stability is investigated with a 1D dual-permeability model coupled with an infinite-slope stability approach. The dual-permeability model uses two modified Darcy-Richards equations to simultaneously simulate the matrix flow and preferential flow in hillslopes. The simulated pressure head is used in an infinite-slope stability analysis to identify the influence of preferential flow on the fast pressure response and landslide triggering. The dual-permeability model simulates the height and arrival time of the pressure peak reasonably well. Performance of the dual-permeability model is as good as or better than that of the linear-diffusion model, even though the dual-permeability model is calibrated for two single-pulse rain events only, while the linear-diffusion model is calibrated for each rain event separately. In conclusion, the 1D dual-permeability model is a promising tool for landslides under similar conditions.
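
    A hedged sketch of the infinite-slope stability check that the simulated pressure head feeds into; the standard factor-of-safety expression is used with illustrative soil values, not the parameters adopted for Rocca Pitigliana.

    # Infinite-slope factor of safety driven by a simulated pressure head (illustrative values).
    import math

    def factor_of_safety(h_p, z=2.0, beta_deg=30.0, c=5.0e3, phi_deg=32.0,
                         gamma=19.0e3, gamma_w=9.81e3):
        """h_p: pressure head at the slip surface [m]; z: slip-surface depth [m];
        beta: slope angle; c: effective cohesion [Pa]; phi: friction angle;
        gamma, gamma_w: unit weights of soil and water [N/m^3]."""
        beta, phi = math.radians(beta_deg), math.radians(phi_deg)
        u = max(0.0, gamma_w * h_p)                       # pore water pressure
        normal = gamma * z * math.cos(beta) ** 2          # total normal stress on the slip plane
        shear = gamma * z * math.sin(beta) * math.cos(beta)
        return (c + (normal - u) * math.tan(phi)) / shear

    # FS < 1 flags triggering; feeding the simulated pressure peak here shows how a fast
    # preferential-flow response can destabilize the slope.
    fs = factor_of_safety(h_p=1.2)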

  4. On the contributions of diffusion and thermal activation to electron transfer between Phormidium laminosum plastocyanin and cytochrome f: Brownian dynamics simulations with explicit modeling of nonpolar desolvation interactions and electron transfer events.

    PubMed

    Gabdoulline, Razif R; Wade, Rebecca C

    2009-07-08

    The factors that determine the extent to which diffusion and thermal activation processes govern electron transfer (ET) between proteins are debated. The process of ET between plastocyanin (PC) and cytochrome f (CytF) from the cyanobacterium Phormidium laminosum was initially thought to be diffusion-controlled but later was found to be under activation control (Schlarb-Ridley, B. G.; et al. Biochemistry 2005, 44, 6232). Here we describe Brownian dynamics simulations of the diffusional association of PC and CytF, from which ET rates were computed using a detailed model of ET events that was applied to all of the generated protein configurations. The proteins were modeled as rigid bodies represented in atomic detail. In addition to electrostatic forces, which were modeled as in our previous simulations of protein-protein association, the proteins interacted by a nonpolar desolvation (hydrophobic) force whose derivation is described here. The simulations yielded close to realistic residence times of transient protein-protein encounter complexes of up to tens of microseconds. The activation barrier for individual ET events derived from the simulations was positive. Whereas the electrostatic interactions between P. laminosum PC and CytF are weak, simulations for a second cyanobacterial PC-CytF pair, that from Nostoc sp. PCC 7119, revealed ET rates influenced by stronger electrostatic interactions. In both cases, the simulations imply significant contributions to ET from both diffusion and thermal activation processes.

  5. Highly Efficient Computation of the Basal kon using Direct Simulation of Protein-Protein Association with Flexible Molecular Models.

    PubMed

    Saglam, Ali S; Chong, Lillian T

    2016-01-14

    An essential baseline for determining the extent to which electrostatic interactions enhance the kinetics of protein-protein association is the "basal" kon, which is the rate constant for association in the absence of electrostatic interactions. However, since such association events are beyond the millisecond time scale, it has not been practical to compute the basal kon by directly simulating the association with flexible models. Here, we computed the basal kon for barnase and barstar, two of the most rapidly associating proteins, using highly efficient, flexible molecular simulations. These simulations involved (a) pseudoatomic protein models that reproduce the molecular shapes and the electrostatic and diffusion properties of all-atom models, and (b) application of the weighted ensemble path sampling strategy, which enhanced the efficiency of generating association events by >130-fold. We also examined the extent to which the computed basal kon is affected by inclusion of intermolecular hydrodynamic interactions in the simulations.

  6. Changes in record-breaking temperature events in China and projections for the future

    NASA Astrophysics Data System (ADS)

    Deng, Hanqing; Liu, Chun; Lu, Yanyu; He, Dongyan; Tian, Hong

    2017-06-01

    As global warming intensifies, more record-breaking (RB) temperature events are reported in many places around the world where temperatures are higher than ever before. The RB temperatures have caused severe impacts on ecosystems and human society. Here, we address changes in RB temperature events occurring over China in the past (1961-2014) as well as future projections (2006-2100) using observational data and the newly available simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5). The number of RB events has a significant multi-decadal variability in China, and the intensity shows a strong decrease from 1961 to 2014. However, more frequent RB events occurred in mid-eastern and northeastern China over the last 30 years (1981-2010). Comparisons with observational data indicate that multi-model ensemble (MME) simulations from the CMIP5 models perform well in simulating RB events for the historical run period (1961-2005). The CMIP5 MME shows a relatively large uncertainty for the change in intensity. From 2051 to 2100, fewer RB events are projected to occur in most parts of China under the RCP 2.6 scenario. Over the longer period from 2006 to 2100, a remarkable increase is expected for the entire country under the RCP 8.5 scenario, with the maximum numbers of RB events increasing by approximately 600 per year at the end of the twenty-first century.

  7. Dust Storm Feature Identification and Tracking from 4D Simulation Data

    NASA Astrophysics Data System (ADS)

    Yu, M.; Yang, C. P.

    2016-12-01

    Dust storms cause significant damage to health, property and the environment worldwide every year. To help mitigate the damage, dust forecasting models simulate and predict upcoming dust events, providing valuable information to scientists, decision makers, and the public. Normally, the model simulations are conducted in four dimensions (i.e., latitude, longitude, elevation and time) and represent the three-dimensional (3D), spatially heterogeneous features of the storm and its evolution over space and time. This research proposes an automatic multi-threshold, region-growing identification algorithm to identify critical dust storm features and track the evolution of dust storm events through space and time. In addition, a spatiotemporal data model is proposed, which can support the characterization and representation of dust storm events and their dynamic patterns. Quantitative and qualitative evaluations of the algorithm are conducted to test its sensitivity and its capability to identify and track dust storm events. This study has the potential to support better early warning for decision-makers and the public, thus making hazard mitigation plans more effective.
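
    A minimal sketch of a multi-threshold, region-growing pass on one 2-D model slice; the threshold values, the 4-connectivity choice and the function names are our assumptions, not the authors' algorithm.

    # Illustrative region growing: seed at high-concentration cores, grow down to a lower threshold.
    import numpy as np
    from collections import deque

    def grow_region(field, seed, low_threshold):
        """Flood-fill from a seed cell, accepting 4-connected neighbours above low_threshold."""
        ny, nx = field.shape
        region, queue, visited = [], deque([seed]), {seed}
        while queue:
            j, i = queue.popleft()
            if field[j, i] < low_threshold:
                continue
            region.append((j, i))
            for dj, di in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nj, ni = j + dj, i + di
                if 0 <= nj < ny and 0 <= ni < nx and (nj, ni) not in visited:
                    visited.add((nj, ni))
                    queue.append((nj, ni))
        return region

    def identify_features(field, core_threshold=500.0, low_threshold=200.0):
        """Seed features at cells above core_threshold, grow each down to low_threshold."""
        features, claimed = [], np.zeros(field.shape, dtype=bool)
        for seed in zip(*np.where(field >= core_threshold)):
            if not claimed[seed]:
                region = grow_region(field, seed, low_threshold)
                for cell in region:
                    claimed[cell] = True
                features.append(region)
        return features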

  8. A predictive model of nuclear power plant crew decision-making and performance in a dynamic simulation environment

    NASA Astrophysics Data System (ADS)

    Coyne, Kevin Anthony

    The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. The ADS-IDAC computer code was improved to support additional branching events and provide a better representation of the IDAC cognitive model. An operator decision-making engine capable of responding to dynamic changes in situational context was implemented. The IDAC human performance model was fully integrated with a detailed nuclear plant model in order to realistically simulate plant accident scenarios. Finally, the improved ADS-IDAC model was calibrated, validated, and updated using actual nuclear plant crew performance data. This research led to the following general conclusions: (1) A relatively small number of branching rules are capable of efficiently capturing a wide spectrum of crew-to-crew variabilities. (2) Compared to traditional static risk assessment methods, ADS-IDAC can provide a more realistic and integrated assessment of human error events by directly determining the effect of operator behaviors on plant thermal hydraulic parameters. (3) The ADS-IDAC approach provides an efficient framework for capturing actual operator performance data such as timing of operator actions, mental models, and decision-making activities.
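
    A toy illustration of discrete dynamic event tree expansion of the kind described above: simple branching rules spawn alternative crew responses, each with an associated probability, at every branching point. The rules, probabilities and function names are illustrative, not the ADS-IDAC rule set.

    # Illustrative DDET expansion from simple branching rules.
    def expand_ddet(scenario, branch_points, rules, max_depth=3):
        """scenario: list of (branch_point, response, probability) triples accumulated so far."""
        if not branch_points or max_depth == 0:
            return [scenario]
        point, rest = branch_points[0], branch_points[1:]
        branches = []
        for response, prob in rules[point]:
            branches += expand_ddet(scenario + [(point, response, prob)], rest, rules, max_depth - 1)
        return branches

    rules = {
        "procedure_step":  [("nominal_speed", 0.7), ("fast_execution", 0.2), ("step_skipped", 0.1)],
        "indication_read": [("correct_reading", 0.9), ("memorized_value", 0.1)],
    }
    sequences = expand_ddet([], ["procedure_step", "indication_read"], rules)
    # each sequence's probability is the product of its branch probabilities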

  9. An Aircraft Encounter with Turbulence in the Vicinity of a Thunderstorm

    NASA Technical Reports Server (NTRS)

    Hamilton, David W.; Proctor, Fred H.

    2003-01-01

    Large eddy simulations of three convective turbulence events are investigated and compared with observational data. Two events were characterized by severe turbulence and the other by moderate turbulence. Two of the events occurred during NASA's turbulence flight experiments in the spring of 2002, and the third was an event identified by the Flight Operational Quality Assurance (FOQA) Program. Each event was associated with developing or ongoing convection and was characterized by regions of low to moderate radar reflectivity. Model comparisons with observations are favorable. The data sets from these simulations can be used to test turbulence detection sensors.

  10. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  11. Single-Event Upset (SEU) model verification and threshold determination using heavy ions in a bipolar static RAM

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Thieberger, P.; Wegner, H. E.

    1985-01-01

    Single-Event Upset (SEU) response of a bipolar low-power Schottky-diode-clamped TTL static RAM has been observed using Br ions in the 100-240 MeV energy range and O ions in the 20-100 MeV range. These data complete the experimental verification of circuit-simulation SEU modeling for this device. The threshold for onset of SEU has been observed by the variation of energy, ion species and angle of incidence. The results obtained from the computer circuit-simulation modeling and experimental model verification demonstrate a viable methodology for modeling SEU in bipolar integrated circuits.

  12. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
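
    As a toy example of the kind of Monte Carlo model such packages automate, the sketch below samples patient arrivals and service times from exponential distributions and estimates the average wait at a single server; all numbers are illustrative.

    # Minimal Monte Carlo patient-flow model (single server, illustrative parameters).
    import random

    def simulate_clinic(n_patients=200, mean_interarrival=12.0, mean_service=10.0):
        arrival, server_free, waits = 0.0, 0.0, []
        for _ in range(n_patients):
            arrival += random.expovariate(1.0 / mean_interarrival)   # minutes between arrivals
            start = max(arrival, server_free)                        # wait if the server is busy
            waits.append(start - arrival)
            server_free = start + random.expovariate(1.0 / mean_service)
        return sum(waits) / len(waits)

    # repeat the experiment many times to estimate the distribution of the mean wait
    mean_wait = sum(simulate_clinic() for _ in range(1000)) / 1000.0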

  13. Modeling of episodic particulate matter events using a 3-D air quality model with fine grid: Applications to a pair of cities in the US/Mexico border

    NASA Astrophysics Data System (ADS)

    Choi, Yu-Jin; Hyde, Peter; Fernando, H. J. S.

    High (episodic) particulate matter (PM) events over the sister cities of Douglas (AZ) and Agua Prieta (Sonora), located in the US-Mexico border, were simulated using the 3D Eulerian air quality model, MODELS-3/CMAQ. The best available input information was used for the simulations, with pollution inventory specified on a fine grid. In spite of inherent uncertainties associated with the emission inventory as well as the chemistry and meteorology of the air quality simulation tool, model evaluations showed acceptable PM predictions, while demonstrating the need for including the interaction between meteorology and emissions in an interactive mode in the model, a capability currently unavailable in MODELS-3/CMAQ when dealing with PM. Sensitivity studies on boundary influence indicate an insignificant regional (advection) contribution of PM to the study area. The contribution of secondary particles to the occurrence of high PM events was trivial. High PM episodes in the study area, therefore, are purely local events that largely depend on local meteorological conditions. The major PM emission sources were identified as vehicular activities on unpaved/paved roads and wind-blown dust. The results will be of immediate utility in devising PM mitigation strategies for the study area, which is one of the US EPA-designated non-attainment areas with respect to PM.

  14. Multi-day activity scheduling reactions to planned activities and future events in a dynamic model of activity-travel behavior

    NASA Astrophysics Data System (ADS)

    Nijland, Linda; Arentze, Theo; Timmermans, Harry

    2014-01-01

    Modeling multi-day planning has so far received little attention in activity-based transport demand modeling. However, new dynamic activity-based approaches are currently being developed. The frequency and inflexibility of planned activities and events in the activity schedules of individuals indicate the importance of incorporating those pre-planned activities in the new generation of dynamic travel demand models. Elaborating and combining previous work on event-driven activity generation, the aim of this paper is to develop and illustrate an extension of a need-based model of activity generation that takes into account possible influences of pre-planned activities and events. This paper describes the theory and shows the results of simulations with the extension. The simulation was conducted for six different activities, and the parameter values used were consistent with an earlier estimation study. The results show that the model works well and that the influences of the parameters are consistent and logical and have clear interpretations. These findings offer further evidence of face and construct validity for the suggested modeling approach.

  15. An evaluation of the performance of a WRF multi-physics ensemble for heatwave events over the city of Melbourne in southeast Australia

    NASA Astrophysics Data System (ADS)

    Imran, H. M.; Kala, J.; Ng, A. W. M.; Muthukumaran, S.

    2018-04-01

    Appropriate choice of physics options among the many physics parameterizations is important when using the Weather Research and Forecasting (WRF) model. The responses of different physics parameterizations of the WRF model may vary with geographical location, the application of interest, and the temporal and spatial scales being investigated. Several studies have evaluated the performance of the WRF model in simulating the mean climate and extreme rainfall events for various regions in Australia. However, no study has explicitly evaluated the sensitivity of the WRF model in simulating heatwaves. Therefore, this study evaluates the performance of a WRF multi-physics ensemble that comprises 27 model configurations for a series of heatwave events in Melbourne, Australia. Unlike most previous studies, we not only evaluate temperature, but also wind speed and relative humidity, which are key factors influencing heatwave dynamics. No specific ensemble member showed the best performance for all events and all variables across all evaluation metrics. This study also found that the choice of planetary boundary layer (PBL) scheme had the largest influence, the radiation scheme a moderate influence, and the microphysics scheme the least influence on temperature simulations. The PBL and microphysics schemes were found to be more sensitive than the radiation scheme for wind speed and relative humidity. Additionally, the study tested the role of an Urban Canopy Model (UCM) and three Land Surface Models (LSMs). Although the UCM did not play a significant role, the Noah LSM showed better performance than the CLM4 and Noah-MP LSMs in simulating the heatwave events. The study finally identifies an optimal configuration of WRF that will be a useful modelling tool for further investigations of heatwaves in Melbourne. Although our results are inevitably region-specific, they will be useful to WRF users investigating heatwave dynamics elsewhere.

  16. The Co-Evolution of Knowledge and Event Memory

    ERIC Educational Resources Information Center

    Nelson, Angela B.; Shiffrin, Richard M.

    2013-01-01

    We present a theoretical framework and a simplified simulation model for the co-evolution of knowledge and event memory, both termed SARKAE (Storing and Retrieving Knowledge and Events). Knowledge is formed through the accrual of individual events, a process that operates in tandem with the storage of individual event memories. In 2 studies, new…

  17. Computer simulation techniques for artificial modification of the ionosphere. Final report 31 jan 79-30 apr 81

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vance, B.; Mendillo, M.

    1981-04-30

    A three-dimensional model of the ionosphere was developed including chemical reactions and neutral and plasma transport. The model uses Finite Element Simulation to simulate ionospheric modification rather than solving a set of differential equations. The initial conditions of the Los Alamos Scientific Laboratory experiments, Lagopedo Uno and Dos, were input to the model, and these events were simulated. Simulation results were compared to ground and rocketborne electron-content measurements. A simulation of the transport of released SF6 was also made.

  18. A Coupled Earthquake-Tsunami Simulation Framework Applied to the Sumatra 2004 Event

    NASA Astrophysics Data System (ADS)

    Vater, Stefan; Bader, Michael; Behrens, Jörn; van Dinther, Ylona; Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Uphoff, Carsten; Wollherr, Stephanie; van Zelst, Iris

    2017-04-01

    Large earthquakes along subduction zone interfaces have generated destructive tsunamis near Chile in 1960, Sumatra in 2004, and northeast Japan in 2011. In order to better understand these extreme events, we have developed tools for physics-based, coupled earthquake-tsunami simulations. This simulation framework is applied to the 2004 Indian Ocean M 9.1-9.3 earthquake and tsunami, a devastating event that resulted in the loss of more than 230,000 lives. The earthquake rupture simulation is performed using an ADER discontinuous Galerkin discretization on an unstructured tetrahedral mesh with the software SeisSol. Advantages of this approach include accurate representation of complex fault and sea floor geometries and a parallelized and efficient workflow in high-performance computing environments. Accurate and efficient representation of the tsunami evolution and inundation at the coast is achieved with an adaptive mesh discretizing the shallow water equations with a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme. With the application of the framework to this historic event, we aim to better understand the involved mechanisms between the dynamic earthquake within the earth's crust, the resulting tsunami wave within the ocean, and the final coastal inundation process. Earthquake model results are constrained by GPS surface displacements and tsunami model results are compared with buoy and inundation data. This research is part of the ASCETE Project, "Advanced Simulation of Coupled Earthquake and Tsunami Events", funded by the Volkswagen Foundation.

  19. FastSim: A Fast Simulation for the SuperB Detector

    NASA Astrophysics Data System (ADS)

    Andreassen, R.; Arnaud, N.; Brown, D. N.; Burmistrov, L.; Carlson, J.; Cheng, C.-h.; Di Simone, A.; Gaponenko, I.; Manoni, E.; Perez, A.; Rama, M.; Roberts, D.; Rotondo, M.; Simi, G.; Sokoloff, M.; Suzuki, A.; Walsh, J.

    2011-12-01

    We have developed a parameterized (fast) simulation for detector optimization and physics reach studies of the proposed SuperB Flavor Factory in Italy. Detector components are modeled as thin sections of planes, cylinders, disks or cones. Particle-material interactions are modeled using simplified cross-sections and formulas. Active detectors are modeled using parameterized response functions. Geometry and response parameters are configured using xml files with a custom-designed schema. Reconstruction algorithms adapted from BaBar are used to build tracks and clusters. Multiple sources of background signals can be merged with primary signals. Pattern recognition errors are modeled statistically by randomly misassigning nearby tracking hits. Standard BaBar analysis tuples are used as an event output. Hadronic B meson pair events can be simulated at roughly 10 Hz.

  20. Characterizing Drought Events from a Hydrological Model Ensemble

    NASA Astrophysics Data System (ADS)

    Smith, Katie; Parry, Simon; Prudhomme, Christel; Hannaford, Jamie; Tanguy, Maliko; Barker, Lucy; Svensson, Cecilia

    2017-04-01

    Hydrological droughts are a slow onset natural hazard that can affect large areas. Within the United Kingdom there have been eight major drought events over the last 50 years, with several events acting at the continental scale, and covering the entire nation. Many of these events have lasted several years and had significant impacts on agriculture, the environment and the economy. Generally in the UK, due to a northwest-southeast gradient in rainfall and relief, as well as varying underlying geology, droughts tend to be most severe in the southeast, which can threaten water supplies to the capital in London. With the impacts of climate change likely to increase the severity and duration of drought events worldwide, it is crucial that we gain an understanding of the characteristics of some of the longer and more extreme droughts of the 19th and 20th centuries, so we may utilize this information in planning for the future. Hydrological models are essential both for reconstructing such events that predate streamflow records, and for use in drought forecasting. However, whilst the uncertainties involved in modelling hydrological extremes on the flooding end of the flow regime have been studied in depth over the past few decades, the uncertainties in simulating droughts and low flow events have not yet received such rigorous academic attention. The "Cascade of Uncertainty" approach has been applied to explore uncertainty and coherence across simulations of notable drought events from the past 50 years using the airGR family of daily lumped catchment models. Parameter uncertainty has been addressed using a Latin Hypercube sampled experiment of 500,000 parameter sets per model (GR4J, GR5J and GR6J), over more than 200 catchments across the UK. The best performing model parameterisations, determined using a multi-objective function approach, have then been taken forward for use in the assessment of the impact of model parameters and model structure on drought event detection and characterization. This ensemble approach allows for uncertainty estimates and confidence intervals to be explored in simulations of drought event characteristics, such as duration and severity, which would not otherwise be available from a deterministic approach. The acquired understanding of uncertainty in drought events may then be applied to historic drought reconstructions, supplying evidence which could prove vital in decision making scenarios.
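
    A minimal sketch of Latin hypercube sampling over model parameter ranges; the four ranges below stand in for the parameters of a GR4J-type lumped model and are illustrative placeholders, not the ranges used in the study.

    # Latin hypercube sampling: one random point in each of n equal-probability strata per parameter.
    import numpy as np

    def latin_hypercube(n_samples, ranges, rng=None):
        rng = rng or np.random.default_rng()
        samples = np.empty((n_samples, len(ranges)))
        for j, (lo, hi) in enumerate(ranges):
            # stratified uniform draws, then a random permutation to decouple the columns
            u = (rng.permutation(n_samples) + rng.random(n_samples)) / n_samples
            samples[:, j] = lo + u * (hi - lo)
        return samples

    # e.g. four parameters of a GR4J-type lumped model (illustrative ranges)
    param_sets = latin_hypercube(500000, [(1, 2000), (-10, 10), (1, 500), (0.5, 5)])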

  1. Single-Event Effects in High-Frequency Linear Amplifiers: Experiment and Analysis

    NASA Astrophysics Data System (ADS)

    Zeinolabedinzadeh, Saeed; Ying, Hanbin; Fleetwood, Zachary E.; Roche, Nicolas J.-H.; Khachatrian, Ani; McMorrow, Dale; Buchner, Stephen P.; Warner, Jeffrey H.; Paki-Amouzou, Pauline; Cressler, John D.

    2017-01-01

    The single-event transient (SET) response of two different silicon-germanium (SiGe) X-band (8-12 GHz) low noise amplifier (LNA) topologies is fully investigated in this paper. The two LNAs were designed and implemented in a 130 nm SiGe HBT BiCMOS process technology. Two-photon absorption (TPA) laser pulses were utilized to induce transients within various devices in these LNAs. Impulse response theory is identified as a useful tool for predicting the settling behavior of the LNAs subjected to heavy ion strikes. Comprehensive device- and circuit-level modeling and simulations were performed to accurately simulate the behavior of the circuits under ion strikes. The simulations agree well with the TPA measurements. The simulation, modeling and analysis presented in this paper can be applied to other circuit topologies for SET modeling and prediction.

  2. Analysing the response of European ecosystems to droughts and heat waves within ISI-MIP2 simulations.

    NASA Astrophysics Data System (ADS)

    Dury, M.; Henrot, A. J.; Francois, L. M.; Munhoven, G.; Jacquemin, I.; Friend, A. D.; Rademacher, T. T.; Hacket Pain, A. J.; Hickler, T.

    2015-12-01

    With unprecedented speed and extent, future climate change can be expected to severely impact terrestrial ecosystems through more frequent extreme events such as droughts and heat waves. What will be the impacts of these extreme events on ecosystem functioning and structure? How far will net primary production be reduced by such events? What will be the impact on plant mortality? Could such events trigger changes in the abundance of plant species, thus leading to biome shifts? In this contribution, we propose to use ISI-MIP2 historical model simulations from the biome sector to analyse the response of ecosystems to droughts and heat waves, trying to understand the differences between several vegetation models (e.g. CARAIB, HYBRID, LPJ). The analysis will focus on Europe. It will compare and assess the model responses for a series of well-marked drought or heat wave events in the simulated historical period, such as those that occurred in 1976, 2003 or 2010. This analysis will be performed in terms of several important environmental variables, such as soil water and hydric stress, runoff, PFT abundance, net primary productivity and biomass, fire frequency, turnover of soil organic matter, etc. Whenever possible, the response of the models will be compared to available data for the most recent well-marked events. Examples of data to be used are eddy covariance measurements, satellite data (including leaf area and fire occurrence) and tree rings.

  3. POLARIS: Agent-based modeling framework development and implementation for integrated travel demand and network and operations simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auld, Joshua; Hope, Michael; Ley, Hubert

    This paper discusses the development of an agent-based modelling software development kit, and the implementation and validation of a model built with it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, an interprocess exchange engine, and a memory allocator, as well as a number of ancillary utilities: a visualization library, a database IO library, and a scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows several aspects of the transportation system that are typically handled by separate stand-alone software applications to be modeled in a high-performance and extensible manner. The issue of integrating models such as dynamic traffic assignment and disaggregate demand models has been a long-standing one for transportation modelers. The integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process for handling all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events such as accidents, congestion and weather events show the potential of the system.
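
    A minimal sketch of the discrete-event-engine pattern named above, reduced to its serial core: a time-ordered event queue processed in one loop. This illustrates the pattern only and is not POLARIS code.

    # Minimal discrete event engine built on a priority queue.
    import heapq
    import itertools

    class EventEngine:
        def __init__(self):
            self._queue = []
            self._counter = itertools.count()   # tie-breaker for events at equal times
            self.now = 0.0

        def schedule(self, time, handler, *args):
            heapq.heappush(self._queue, (time, next(self._counter), handler, args))

        def run(self, until=float("inf")):
            while self._queue and self._queue[0][0] <= until:
                self.now, _, handler, args = heapq.heappop(self._queue)
                handler(self.now, *args)        # handlers may schedule further events

    engine = EventEngine()
    engine.schedule(5.0, lambda t: engine.schedule(t + 10.0, lambda t2: None))
    engine.run(until=100.0)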

  4. Convection-Resolving Climate Change Simulations: Intensification of Heavy Hourly Precipitation Events

    NASA Astrophysics Data System (ADS)

    Ban, N.; Schmidli, J.; Schar, C.

    2014-12-01

    Reliable climate-change projections of extreme precipitation events are of great interest to decision makers, due to potentially important hydrological impacts such as floods, land slides and debris flows. Low-resolution climate models generally project increases of heavy precipitation events with climate change, but there are large uncertainties related to the limited spatial resolution and the parameterized representation of atmospheric convection. Here we employ a convection-resolving version of the COSMO model across an extended region (1100 km x 1100 km) covering the European Alps to investigate the differences between parameterized and explicit convection in climate-change scenarios. We conduct 10-year long integrations at resolutions of 12 and 2 km. Validation using ERA-Interim driven simulations reveals major improvements with the 2 km resolution, in particular regarding the diurnal cycle of mean precipitation and the representation of hourly extremes. In addition, 2 km simulations replicate the observed super-adiabatic scaling at precipitation stations, i.e. peak hourly events increase faster with temperature than the Clausius-Clapeyron scaling of 7%/K (see Ban et al. 2014). Convection-resolving climate change scenarios are conducted using control (1991-2000) and scenario (2081-2090) simulations driven by a CMIP5 GCM (i.e. the MPI-ESM-LR) under the IPCC RCP8.5 scenario. Comparison between the 12 and 2 km resolutions with parameterized and explicit convection, respectively, reveals close agreement in terms of mean summer precipitation amounts (decrease by 30%), and regarding slight increases of heavy day-long events (amounting to 15% for the 90th percentile of wet-day precipitation). However, the different resolutions yield large differences regarding extreme hourly precipitation, with the 2 km version projecting substantially faster increases of heavy hourly precipitation events (about 30% increases for the 90th-percentile hourly events). Ban, N., J. Schmidli and C. Schär (2014): Evaluation of the convection-resolving regional climate modeling approach in decade-long simulations. J. Geophys. Res. Atmos., 119, 7889-7907, doi:10.1002/2014JD021478

  5. MESA: An Interactive Modeling and Simulation Environment for Intelligent Systems Automation

    NASA Technical Reports Server (NTRS)

    Charest, Leonard

    1994-01-01

    This report describes MESA, a software environment for creating applications that automate NASA mission operations. MESA enables intelligent automation by utilizing model-based reasoning techniques developed in the field of Artificial Intelligence. Model-based reasoning techniques are realized in MESA through native support of causal modeling and discrete event simulation.

  6. Two case studies on NARCCAP precipitation extremes

    NASA Astrophysics Data System (ADS)

    Weller, Grant B.; Cooley, Daniel; Sain, Stephan R.; Bukovsky, Melissa S.; Mearns, Linda O.

    2013-09-01

    We introduce novel methodology to examine the ability of six regional climate models (RCMs) in the North American Regional Climate Change Assessment Program (NARCCAP) ensemble to simulate past extreme precipitation events seen in the observational record over two different regions and seasons. Our primary objective is to examine the strength of daily correspondence of extreme precipitation events between observations and the output of both the RCMs and the driving reanalysis product. To explore this correspondence, we employ methods from multivariate extreme value theory. These methods require that we account for marginal behavior, and we first model and compare climatological quantities which describe tail behavior of daily precipitation for both the observations and model output before turning attention to quantifying the correspondence of the extreme events. Daily precipitation in a West Coast region of North America is analyzed in two seasons, and it is found that the simulated extreme events from the reanalysis-driven NARCCAP models exhibit strong daily correspondence to extreme events in the observational record. Precipitation over a central region of the United States is examined, and we find some daily correspondence between winter extremes simulated by reanalysis-driven NARCCAP models and those seen in observations, but no such correspondence is found for summer extremes. Furthermore, we find greater discrepancies among the NARCCAP models in the tail characteristics of the distribution of daily summer precipitation over this region than seen in precipitation over the West Coast region. We find that the models which employ spectral nudging exhibit stronger tail dependence to observations in the central region.
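    One simple empirical measure of the tail dependence discussed above is the chi(u) statistic sketched below: the probability that one series exceeds its u-quantile given that the other does. This is a generic illustration with synthetic data, not necessarily the exact multivariate extreme value machinery used in the study.

```python
import numpy as np

def chi_hat(x, y, u=0.95):
    """Empirical tail-dependence measure chi(u): the conditional probability
    that y exceeds its u-quantile given that x exceeds its own u-quantile."""
    x, y = np.asarray(x), np.asarray(y)
    ranks_x = x.argsort().argsort() / (len(x) - 1)   # pseudo-uniform margins
    ranks_y = y.argsort().argsort() / (len(y) - 1)
    exceed_x = ranks_x > u
    return np.mean(ranks_y[exceed_x] > u) if exceed_x.any() else np.nan

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 3.0, 5000)                   # stand-in for observed daily precip
model = 0.7 * obs + rng.gamma(2.0, 1.0, 5000)     # model output correlated with obs
# Values near 1 indicate strong tail dependence; values near 0 indicate none.
print(f"chi(0.95) = {chi_hat(obs, model, 0.95):.2f}")
```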

  7. Modeling four occurred debris flow events in the Dolomites area (North-Eastern Italian Alps)

    NASA Astrophysics Data System (ADS)

    Boreggio, Mauro; Gregoretti, Carlo; Degetto, Massimo; Bernard, Martino

    2016-04-01

    Four debris flow events that occurred in the Dolomites area (North-Eastern Italian Alps) are modeled by back-analysis. The four events are those that occurred at Rio Lazer (Trento) on the 4th of November 1966, at Fiames (Belluno) on the 5th of July 2006, at Rovina di Cancia (Belluno) on the 18th of July 2009, and at Rio Val Molinara (Trento) on the 15th of August 2010. In all the events, runoff entrained sediments present in natural channels and formed a solid-liquid wave that routed downstream. The first event concerns the routing of a debris flow over an inhabited fan. The second concerns the deviation of a debris flow from its usual path due to an obstruction, with the excavation of a channel in the scree and downstream spreading in a wood. The third concerns the routing of a debris flow in a channel ending in a reservoir, the overtopping of the reservoir, and the final spreading in the inhabited area. The fourth concerns the routing of a debris flow along the main channel downstream of the initiation area until it spread just upstream of a village. All four debris flows are simulated by modeling the runoff that entrained sediment in order to determine the solid-liquid hydrograph. The routing of the solid-liquid hydrograph is simulated by a bi-phase cell model based on the kinematic approach. The comparison between simulated and measured erosion and deposition depths is satisfactory, and nearly the same parameters for computing erosion and deposition were used for all four events. The maps of erosion and deposition depths are obtained by comparing the results of post-event surveys with the pre-event DEM. The post-event surveys were conducted using different instruments (LiDAR and GPS) or a combination of photos and single-point depth measurements (in this last case the deposition/erosion depths can be obtained by means of stereoscopy techniques).

  8. Numerical modeling of coronal mass ejections based on various pre-event model atmospheres

    NASA Technical Reports Server (NTRS)

    Suess, S. T.; Wang, A. H.; Wu, S. T.; Poletto, G.

    1994-01-01

    We examine how the initial state (pre-event corona) affects the numerical MHD simulation for a coronal mass ejection (CME). Earlier simulations based on a pre-event corona with a homogeneous density and temperature distribution at the lower boundary (i.e., the solar surface) have been used to analyze the role of streamer properties in determining the characteristics of loop-like transients. The present paper extends these studies to show how a broader class of global coronal properties leads not only to different types of CMEs, but also modifies the adjacent quiet corona and/or coronal holes. We consider four pre-event coronal cases: (1) constant boundary conditions and a polytropic gas with gamma = 1.05; (2) non-constant (latitude dependent) boundary conditions and a polytropic gas with gamma = 1.05; (3) constant boundary conditions with a volumetric energy source and gamma = 1.67; (4) non-constant (latitude dependent) boundary conditions with a volumetric energy source and gamma = 1.67. In all models, the pre-event magnetic fields separate the corona into closed field regions (streamers) and open field regions. CME initiation is simulated by introducing at the base of the corona, within the streamer region, a standard pressure pulse and velocity change. Boundary values are determined using MHD characteristic theory. The simulations show how different CMEs, including loop-like transients, clouds, and bright rays, might occur. There are significant new features in comparison to published results. We conclude that the pre-event corona is a crucial factor in dictating CME properties.

  9. Numerical Modeling of Coronal Mass Ejections Based on Various Pre-event Model Atmospheres

    NASA Technical Reports Server (NTRS)

    Wang, A. H.; Wu, S. T.; Suess, S. T.; Poletto, G.

    1995-01-01

    We examine how the initial state (pre-event corona) affects the numerical MHD simulation for a coronal mass ejection (CME). Earlier simulations based on a pre-event corona with a homogeneous density and temperature distribution at the lower boundary (i.e., the solar surface) have been used to analyze the role of streamer properties in determining the characteristics of loop-like transients. The present paper extends these studies to show how a broader class of global coronal properties leads not only to different types of CMEs, but also modifies the adjacent quiet corona and/or coronal holes. We consider four pre-event coronal cases: (1) constant boundary conditions and a polytropic gas with gamma = 1.05; (2) non-constant (latitude dependent) boundary conditions and a polytropic gas with gamma = 1.05; (3) constant boundary conditions with a volumetric energy source and gamma = 1.67; (4) non-constant (latitude dependent) boundary conditions with a volumetric energy source and gamma = 1.67. In all models, the pre-event magnetic fields separate the corona into closed field regions (streamers) and open field regions. CME initiation is simulated by introducing at the base of the corona, within the streamer region, a standard pressure pulse and velocity change. Boundary values are determined using magnetohydrodynamic (MHD) characteristic theory. The simulations show how different CMEs, including loop-like transients, clouds and bright rays, might occur. There are significant new features in comparison to published results. We conclude that the pre-event corona is a crucial factor in dictating CME properties.

  10. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    NASA Astrophysics Data System (ADS)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events such as sea level rise and its contribution to storm surges, extreme heat events, droughts, and heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal prediction systems such as the UK Met Office GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in a world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations from a variety of CMIP5 model ensembles. Here, we present results for the UK 2013/14 winter floods as a proof of concept, and we show validation and testing results that demonstrate the robustness of our method. We also revisit the record temperatures over Europe in 2014 and present a detailed analysis of this attribution exercise, as it is one of the events demonstrating that we can make a sensible statement of how the odds of such a year occurring have changed while it is still unfolding.
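    The core calculation behind such attribution statements is often a probability (risk) ratio between the factual and counterfactual ensembles. The sketch below illustrates the idea with synthetic ensembles and an invented threshold; it does not use WWA data or results.

```python
import numpy as np

# Toy probabilistic event attribution (synthetic numbers, not WWA results):
# compare exceedance probabilities of an extreme threshold between a
# "factual" (observed-climate) and a "counterfactual" (no-anthropogenic-forcing)
# ensemble, and report the probability ratio.
rng = np.random.default_rng(42)
factual = rng.normal(loc=1.2, scale=1.0, size=10000)        # e.g. seasonal anomaly
counterfactual = rng.normal(loc=0.8, scale=1.0, size=10000)

threshold = 3.0                                              # the observed extreme event
p1 = np.mean(factual > threshold)
p0 = np.mean(counterfactual > threshold)
print(f"P(factual) = {p1:.4f}, P(counterfactual) = {p0:.4f}, ratio = {p1 / p0:.1f}")
```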

  11. A generic discrete-event simulation model for outpatient clinics in a large public hospital.

    PubMed

    Weerawat, Waressara; Pichitlamken, Juta; Subsombat, Peerapong

    2013-01-01

    The orthopedic outpatient department (OPD) ward in a large Thai public hospital is modeled using Discrete-Event Stochastic (DES) simulation. Key Performance Indicators (KPIs) are used to measure effects across various clinical operations during different shifts throughout the day. By considering various KPIs such as wait times to see doctors, percentage of patients who can see a doctor within a target time frame, and the time that the last patient completes their doctor consultation, bottlenecks are identified and resource-critical clinics can be prioritized. The simulation model quantifies the chronic, high patient congestion that is prevalent amongst Thai public hospitals with very high patient-to-doctor ratios. Our model can be applied across five different OPD wards by modifying the model parameters. Throughout this work, we show how DES models can be used as decision-support tools for hospital management.
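    As a toy illustration of the wait-time KPIs such a model produces, the sketch below simulates a single first-come-first-served doctor using the standard single-server recursion. The arrival rate, service rate, and target are invented; the actual model is far richer, covering multiple clinics, shifts, and resources.

```python
import random

# Toy single-doctor clinic DES (illustrative only). KPIs: mean patient wait
# time and the fraction of patients seen within a 30-minute target.
random.seed(0)
NUM_PATIENTS, MEAN_IAT, MEAN_SERVICE, TARGET = 200, 5.0, 6.0, 30.0

arrivals, t = [], 0.0
for _ in range(NUM_PATIENTS):
    t += random.expovariate(1.0 / MEAN_IAT)        # exponential inter-arrival times
    arrivals.append(t)

doctor_free, waits = 0.0, []
for arr in arrivals:                               # FIFO single server
    start = max(arr, doctor_free)                  # start when both patient and doctor ready
    waits.append(start - arr)
    doctor_free = start + random.expovariate(1.0 / MEAN_SERVICE)

print(f"mean wait = {sum(waits)/len(waits):.1f} min, "
      f"seen within target = {100*sum(w <= TARGET for w in waits)/len(waits):.0f}%")
```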

  12. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    NASA Astrophysics Data System (ADS)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach iterates until the difference between subsequent solutions satisfies pre-determined termination criteria. The effectiveness of the proposed approach is illustrated by an example, which yields results closer to optimal, with a much shorter solving time, than those obtained from a conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing a real supply chain network. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.
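    The iterative structure described above can be sketched as a loop in which an analytical step proposes a design, a stochastic simulation evaluates it, and the process stops when successive solutions agree. Every function and number below is a hypothetical stand-in, not the authors' formulation.

```python
import random

# Structural sketch only (hypothetical functions, not the authors' models):
# an analytical step proposes a capacity, a stochastic simulation estimates the
# service level it achieves, and the loop repeats until successive capacities
# differ by less than a tolerance.
random.seed(1)

def analytical_capacity(target_level, estimated_level, capacity):
    # Stand-in for the optimization model: scale capacity toward the target.
    return capacity * target_level / max(estimated_level, 1e-6)

def simulated_service_level(capacity):
    # Stand-in for the DES: noisy, saturating response to capacity.
    return min(1.0, capacity / 120.0) * random.uniform(0.97, 1.03)

capacity, target, tolerance = 80.0, 0.95, 1.0
for iteration in range(100):
    level = simulated_service_level(capacity)
    new_capacity = analytical_capacity(target, level, capacity)
    if abs(new_capacity - capacity) < tolerance:
        capacity = new_capacity
        break
    capacity = new_capacity

print(f"stopped after {iteration + 1} iterations: capacity ≈ {capacity:.1f}, "
      f"service level ≈ {level:.2f}")
```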

  13. Quasi-Dynamic Versus Fully-Dynamic Simulations of Slip Accumulation on Faults with Enhanced Dynamic Weakening

    NASA Astrophysics Data System (ADS)

    Lapusta, N.; Thomas, M.; Noda, H.; Avouac, J.

    2012-12-01

    Long-term simulations that incorporate both seismic events and aseismic slip are quite important for studies of earthquake physics but computationally challenging. To study long deformation histories, most simulation methods do not incorporate full inertial effects (wave propagation) during simulated earthquakes, using quasi-dynamic approximations instead. Here we compare the results of quasi-dynamic simulations to fully dynamic ones for a range of problems to determine the applicability of the quasi-dynamic approach. Intuitively, the quasi-dynamic approach should do relatively well in problems where wave-mediated effects are relatively simple, but should have a substantially different (and hence wrong) response when wave-mediated stress transfers dominate the character of the seismic events. This is exactly what we observe in our simulations. We consider a 2D model of a rate-and-state fault with a seismogenic (steady-state velocity-weakening) zone surrounded by creeping (steady-state velocity-strengthening) areas. If the seismogenic zone is described by the standard Dieterich-Ruina rate-and-state friction, the resulting earthquake sequences consist of relatively simple crack-like ruptures, and the inclusion of true wave-propagation effects mostly serves to concentrate stress more efficiently at the rupture front. Hence, in such models, rupture speeds and slip rates are significantly (several times) lower in the quasi-dynamic simulations compared to the fully dynamic ones, but the total slip, the crack-like nature of seismic events, and the overall pattern of earthquake sequences are comparable, consistent with prior studies. Such behavior can be classified as qualitatively similar but quantitatively different, and it motivates the popularity of quasi-dynamic methods in simulations. However, the comparison changes dramatically once we consider a model with enhanced dynamic weakening in the seismogenic zone in the form of flash heating. In this case, the fully dynamic simulations produce seismic ruptures in the form of short-duration slip pulses, where the pulses form due to a combination of enhanced weakening and wave effects. The quasi-dynamic simulations of the same model produce completely different results, with large crack-like ruptures, different total slips, different rupture patterns, and a different prestress state before large, model-spanning events. Such qualitative differences between the quasi-dynamic and fully dynamic simulations should arise in any model where inertial effects lead to qualitative differences, such as cases with a supershear transition or faults with different materials on the two sides. We will also present results from our ongoing work on how the quasi-dynamic and fully dynamic simulations compare for cases with heterogeneous fault properties.

  14. Predicting Strong Ground-Motion Seismograms for Magnitude 9 Cascadia Earthquakes Using 3D Simulations with High Stress Drop Sub-Events

    NASA Astrophysics Data System (ADS)

    Frankel, A. D.; Wirth, E. A.; Stephenson, W. J.; Moschetti, M. P.; Ramirez-Guzman, L.

    2015-12-01

    We have produced broadband (0-10 Hz) synthetic seismograms for magnitude 9.0 earthquakes on the Cascadia subduction zone by combining synthetics from simulations with a 3D velocity model at low frequencies (≤ 1 Hz) with stochastic synthetics at high frequencies (≥ 1 Hz). We use a compound rupture model consisting of a set of M8 high stress drop sub-events superimposed on a background slip distribution of up to 20 m that builds relatively slowly. The 3D simulations were conducted using a finite difference program and the finite element program Hercules. The high-frequency (≥ 1 Hz) energy in this rupture model is primarily generated in the portion of the rupture with the M8 sub-events. In our initial runs, we included four M7.9-8.2 sub-events similar to those that we used to successfully model the strong ground motions recorded from the 2010 M8.8 Maule, Chile earthquake. At periods of 2-10 s, the 3D synthetics exhibit substantial amplification (about a factor of 2) for sites in the Puget Lowland and even more amplification (up to a factor of 5) for sites in the Seattle and Tacoma sedimentary basins, compared to rock sites outside of the Puget Lowland. This regional and more localized basin amplification found in the simulations is supported by observations from local earthquakes. There are substantial variations in the simulated M9 time histories and response spectra caused by differences in the hypocenter location, slip distribution, down-dip extent of rupture, coherence of the rupture front, and location of sub-events. We examined the sensitivity of the 3D synthetics to the velocity model of the Seattle basin. We found significant differences in S-wave focusing and surface wave conversions between a 3D model of the basin from a spatially-smoothed tomographic inversion of Rayleigh-wave phase velocities and a model that has an abrupt southern edge of the Seattle basin, as observed in seismic reflection profiles.
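    A common way to combine deterministic low-frequency and stochastic high-frequency synthetics is matched low-pass/high-pass filtering at the crossover frequency. The sketch below illustrates that generic merge with synthetic traces; it is not the authors' processing chain, and the signals and filter order are invented.

```python
import numpy as np
from scipy import signal

# Illustrative hybrid broadband merge: low-pass the deterministic synthetic and
# high-pass the stochastic synthetic at a 1 Hz crossover, then sum them.
dt, n = 0.01, 8192                         # 100 Hz sampling, ~82 s record
t = np.arange(n) * dt
rng = np.random.default_rng(3)
deterministic = np.sin(2 * np.pi * 0.3 * t) * np.exp(-((t - 30) / 10) ** 2)  # long-period pulse
stochastic = rng.normal(0, 0.2, n) * np.exp(-((t - 30) / 15) ** 2)           # high-frequency coda

fc, fs = 1.0, 1.0 / dt                     # 1 Hz crossover, 100 Hz sampling rate
b_lo, a_lo = signal.butter(4, fc / (0.5 * fs), btype="low")
b_hi, a_hi = signal.butter(4, fc / (0.5 * fs), btype="high")

broadband = (signal.filtfilt(b_lo, a_lo, deterministic) +
             signal.filtfilt(b_hi, a_hi, stochastic))
print(f"peak broadband amplitude: {np.abs(broadband).max():.3f}")
```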

  15. Physically-based modelling of high magnitude torrent events with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Wing-Yuen Chow, Candace; Ramirez, Jorge; Zimmermann, Markus; Keiler, Margreth

    2017-04-01

    High magnitude torrent events are associated with the rapid propagation of vast quantities of water and available sediment downslope, where human settlements may be established. Assessing the vulnerability of built structures to these events is part of consequence analysis, in which hazard intensity is related to the degree of loss sustained. The specific contribution of the presented work is a procedure to simulate these damaging events with physically-based modelling and to attach uncertainty information to the simulated results. This is a first step in the development of vulnerability curves based on several intensity parameters (i.e. maximum velocity, sediment deposition depth and impact pressure). The investigation process begins with the collection, organization and interpretation of detailed post-event documentation and photograph-based observation data of affected structures at three sites that exemplify the impact of highly destructive mudflows and floods on settlements in Switzerland. Hazard intensity proxies are then simulated with the physically-based FLO-2D model (O'Brien et al., 1993). Prior to modelling, global sensitivity analysis is conducted to support a better understanding of model behaviour and parameterization and the quantification of uncertainties (Song et al., 2015). Including information that describes the degree of confidence in the simulated results supports the credibility of vulnerability curves developed from the modelled data. First, key parameters are identified and selected based on a literature review. Truncated a priori ranges of parameter values are then defined by expert solicitation. Local sensitivity analysis is performed based on manual calibration to provide an understanding of the parameters relevant to the case studies of interest. Finally, automated parameter estimation is performed to comprehensively search for optimal parameter combinations and associated values, which are evaluated using the observed data collected in the first stage of the investigation. O'Brien, J.S., Julien, P.Y., Fullerton, W. T., 1993. Two-dimensional water flood and mudflow simulation. Journal of Hydraulic Engineering 119(2): 244-261.
 Song, X., Zhang, J., Zhan, C., Xuan, Y., Ye, M., Xu C., 2015. Global sensitivity analysis in hydrological modeling: Review of concepts, methods, theoretical frameworks, Journal of Hydrology 523: 739-757.

  16. Total Dust Deposition Flux During Precipitation in Toyama, Japan, in the Spring of 2009: A Sensitivity Analysis with the NASA GEOS-5 Model

    NASA Technical Reports Server (NTRS)

    Yasunari, Teppei J.; Colarco, Peter R.; Lau, William K. M.; Osada, Kazuo; Kido, Mizuka; Mahanama, Sarith P. P.; Kim, Kyu-Myong; Da Silva, Arlindo M.

    2015-01-01

    We compared the observed total dust deposition fluxes during precipitation (TDP), mainly at Toyama in Japan, during the period January-April 2009 with results from four NASA GEOS-5 global model experiments. The modeled results were obtained from three previous experiments and one newly conducted experiment, all driven by assimilated meteorology and simulating aerosol distributions for the time period. We focus mainly on the observations of two distinct TDP events at Toyama, Japan, reported in Osada et al. (2011), in February (Event B) and March 2009 (Event C). Although all of our GEOS-5 simulations captured aspects of the observed TDP, we found that our low horizontal spatial resolution control experiment generally performed the worst. The other three experiments were run at a higher spatial resolution, with the first differing only in that respect from the control, the second additionally imposing a prescribed corrected precipitation product, and the final experiment additionally assimilating aerosol optical depth based on MODIS observations. During Event C, the increased horizontal resolution increased TDP along with an increase in precipitation. There was no significant improvement, however, due to the imposition of the corrected precipitation product. The simulation that incorporated aerosol data assimilation performed by far the best for this event, but even so it could reproduce less than half of the observed TDP despite significantly increased atmospheric dust mass concentrations. All three of the high spatial resolution experiments had higher simulated precipitation at Toyama than was observed and than in the lower resolution control run. During Event B, the aerosol data assimilation run did not perform appreciably better than the other higher resolution simulations, suggesting that upstream conditions (i.e., upstream cloudiness), or vertical or horizontal misplacement of the dust plume, did not allow for significant improvement in the simulated aerosol distributions. Furthermore, a detailed comparison of observed hourly precipitation and surface particulate mass concentration data suggests that the observed TDP during Event B was highly dependent on short periods of weak precipitation correlated with elevated dust surface concentrations, important details possibly not captured well in a current global model.

  17. Gulf Coast megaregion evacuation traffic simulation modeling and analysis.

    DOT National Transportation Integrated Search

    2015-12-01

    This paper describes a project to develop a micro-level traffic simulation for a megaregion. To accomplish this, a mass evacuation event was modeled using a traffic demand generation process that created a spatial and temporal distribution of dep...

  18. Scientific Benefits of Space Science Models Archiving at Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Kuznetsova, Maria M.; Berrios, David; Chulaki, Anna; Hesse, Michael; MacNeice, Peter J.; Maddox, Marlo M.; Pulkkinen, Antti; Rastaetter, Lutz; Taktakishvili, Aleksandre

    2009-01-01

    The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. CCMC provides a web-based Run-on-Request system, by which interested scientists can request simulations for a broad range of space science problems. To allow the models to be driven by data relevant to particular events, CCMC developed a tool that automatically downloads data from data archives and transforms them into the required formats. CCMC also provides a tailored web-based visualization interface for the model output, as well as the capability to download the simulation output in a portable format. CCMC offers a variety of visualization and output analysis tools to aid scientists in the interpretation of simulation results. In the eight years since the Run-on-Request system became available, the CCMC has archived the results of almost 3000 runs covering significant space weather events and time intervals of interest identified by the community. The simulation results archived at CCMC also include a library of general-purpose runs with modeled conditions that are used for education and research. Archiving the results of simulations performed in support of several Modeling Challenges helps to evaluate the progress in space weather modeling over time. We will highlight the scientific benefits of the CCMC space science model archive and discuss plans for further development of advanced methods to interact with simulation results.

  19. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    NASA Astrophysics Data System (ADS)

    Florian Wellmann, J.; Thiele, Sam T.; Lindsay, Mark D.; Jessell, Mark W.

    2016-03-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilize the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.

  20. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    NASA Astrophysics Data System (ADS)

    Wellmann, J. F.; Thiele, S. T.; Lindsay, M. D.; Jessell, M. W.

    2015-11-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilise the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.

  1. Simulation of air admission in a propeller hydroturbine during transient events

    NASA Astrophysics Data System (ADS)

    Nicolle, J.; Morissette, J.-F.

    2016-11-01

    In this study, multiphysics simulations are carried out in order to model fluid loading and structural stresses on propeller blades during startup and runaway. It is found that air admission plays an important role during these transient events and that biphasic simulations are therefore required. At the speed-no-load regime, a large air pocket with a vertical free surface forms in the centre of the runner, displacing the water flow near the shroud. This significantly affects the torque developed on the blades and thus the structural loading. The resulting pressures are applied to a quasi-static structural model, and good agreement is obtained with experimental strain gauge data.

  2. WRF Simulations of the 20-22 January 2007 Snow Events over Eastern Canada: Comparison with In-Situ and Satellite Observations

    NASA Technical Reports Server (NTRS)

    Shi, J. J.; Tao, W.-K.; Matsui, T.; Cifelli, R.; Huo, A.; Lang, S.; Tokay, A.; Peters-Lidard, C.; Jackson, G.; Rutledge, S.

    2009-01-01

    One of the grand challenges of the Global Precipitation Measurement (GPM) mission is to improve cold-season precipitation measurements in middle and high latitudes through the use of high-frequency passive microwave radiometry. For this, the Weather Research and Forecasting (WRF) model with the Goddard microphysics scheme is coupled with a satellite data simulation unit (WRF-SDSU) that has been developed to facilitate over-land snowfall retrieval algorithms by providing a virtual cloud library and microwave brightness temperature (Tb) measurements consistent with the GPM Microwave Imager (GMI). This study tested the Goddard cloud microphysics scheme in WRF for two snowstorm events, a lake-effect event and a synoptic event, that occurred between 20 and 22 January 2007 over the Canadian CloudSat/CALIPSO Validation Project (C3VP) site in Ontario, Canada. The 24-h accumulated snowfall predicted by the WRF model with the Goddard microphysics was comparable to the accumulated snowfall observed by the ground-based radar for both events. The model correctly predicted the onset and ending of both snow events at the CARE site. The WRF simulations capture the basic cloud properties seen by the ground-based radar and satellite (CloudSat, AMSU-B) observations, as well as the observed cloud streak organization in the lake-effect event. This latter result reveals that WRF was able to capture the cloud macro-structure reasonably well.

  3. On the Numerical Study of Heavy Rainfall in Taiwan

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chen, Ching-Sen; Chen, Yi-Leng; Jou, Ben Jong-Dao; Lin, Pay-Liam; Starr, David OC. (Technical Monitor)

    2001-01-01

    Heavy rainfall events are frequently observed over the western side of the CMR (Central Mountain Range), which runs through Taiwan in a north-south orientation, in a southwesterly flow regime, and over the northeastern side of the CMR in a northeasterly flow regime. Previous studies have revealed the mechanisms by which the heavy rainfall events form, and some have examined the characteristics of the heavy rainfall via numerical simulations. In this paper, some of the previous numerical studies on heavy rainfall events around Taiwan during the Mei-Yu season (May and June), summer (non-typhoon cases) and autumn will be reviewed. Associated mechanisms proposed from observational studies will be reviewed first, and then characteristics of numerically simulated heavy rainfall events will be presented. The formation mechanisms of heavy rainfall from simulated results and from observational analysis are then compared and discussed. Based on these previous modeling studies, we will also discuss which major observations and modeling processes will be needed to understand heavy precipitation in the future.

  4. Response to droughts and heat waves of the productivity of natural and agricultural ecosystems in Europe within ISI-MIP2 historical simulations

    NASA Astrophysics Data System (ADS)

    François, Louis; Henrot, Alexandra-Jane; Dury, Marie; Jacquemin, Ingrid; Munhoven, Guy; Friend, Andrew; Rademacher, Tim T.; Hacket Pain, Andrew J.; Hickler, Thomas; Tian, Hanqin; Morfopoulos, Catherine; Ostberg, Sebastian; Chang, Jinfeng; Rafique, Rashid; Nishina, Kazuya

    2016-04-01

    According to the projections of climate models, extreme events such as droughts and heat waves are expected to become more frequent and more severe in the future. Such events are known to severely impact the productivity of both natural and agricultural ecosystems, and hence to affect ecosystem services such as crop yield and ecosystem carbon sequestration potential. Dynamic vegetation models are conventional tools to evaluate the productivity and carbon sequestration of ecosystems and their response to climate change. However, how far are these models able to correctly represent the sensitivity of ecosystems to droughts and heat waves? How do the responses of natural and agricultural ecosystems compare to each other, in terms of drought-induced changes in productivity and carbon sequestration? In this contribution, we use ISI-MIP2 model historical simulations from the biome sector to tentatively answer these questions. Nine dynamic vegetation models have participated in the biome sector intercomparison of ISI-MIP2: CARAIB, DLEM, HYBRID, JULES, LPJ-GUESS, LPJml, ORCHIDEE, VEGAS and VISIT. We focus the analysis on well-marked droughts or heat waves that occurred in Europe after 1970, such as the 1976, 2003 and 2010 events. For the most recent studied events, the model results are compared to the response observed at several eddy covariance sites in Europe and, at a larger scale, to the changes in crop productivities reported in national statistics or to the drought impacts on gross primary productivity derived from satellite data (Terra MODIS instrument). The sensitivity of the models to the climatological dataset used in the simulations, as well as to the inclusion or not of anthropogenic land use, is also analysed within the studied events. Indeed, the ISI-MIP simulations have been run with four different historical climatic forcings, as well as for several land use/land cover configurations (natural vegetation, fixed land use and variable land use).

  5. A new concept for simulation of vegetated land surface dynamics - Part 1: The event driven phenology model

    NASA Astrophysics Data System (ADS)

    Kovalskyy, V.; Henebry, G. M.

    2012-01-01

    Phenologies of the vegetated land surface are being used increasingly for diagnosis and prognosis of climate change consequences. Current prospective and retrospective phenological models stand far apart in their approaches to the subject. We report on an exploratory attempt to implement a phenological model based on a new event driven concept which has both diagnostic and prognostic capabilities in the same modeling framework. This Event Driven Phenological Model (EDPM) is shown to simulate land surface phenologies and phenophase transition dates in agricultural landscapes based on assimilation of weather data and land surface observations from spaceborne sensors. The model enables growing season phenologies to develop in response to changing environmental conditions and disturbance events. It also has the ability to ingest remotely sensed data to adjust its output to improve representation of the modeled variable. We describe the model and report results of initial testing of the EDPM using Level 2 flux tower records from the Ameriflux sites at Mead, Nebraska, USA, and at Bondville, Illinois, USA. Simulating the dynamics of normalized difference vegetation index based on flux tower data, the predictions by the EDPM show good agreement (RMSE < 0.08; r2 > 0.8) for maize and soybean during several growing seasons at different locations. This study presents the EDPM used in the companion paper (Kovalskyy and Henebry, 2011) in a coupling scheme to estimate daily actual evapotranspiration over multiple growing seasons.

  6. A new concept for simulation of vegetated land surface dynamics - Part 1: The event driven phenology model

    NASA Astrophysics Data System (ADS)

    Kovalskyy, V.; Henebry, G. M.

    2011-05-01

    Phenologies of the vegetated land surface are being used increasingly for diagnosis and prognosis of climate change consequences. Current prospective and retrospective phenological models stand far apart in their approaches to the subject. We report on an exploratory attempt to implement a phenological model based on a new event driven concept which has both diagnostic and prognostic capabilities in the same modeling framework. This Event Driven Phenological Model (EDPM) is shown to simulate land surface phenologies and phenophase transition dates in agricultural landscapes based on assimilation of weather data and land surface observations from spaceborne sensors. The model enables growing season phenologies to develop in response to changing environmental conditions and disturbance events. It also has the ability to ingest remotely sensed data to adjust its output to improve representation of the modeled variable. We describe the model and report results of initial testing of the EDPM using Level 2 flux tower records from the Ameriflux sites at Mead, Nebraska, USA, and at Bondville, Illinois, USA. Simulating the dynamics of normalized difference vegetation index based on flux tower data, the predictions by the EDPM show good agreement (RMSE < 0.08; r2>0.8) for maize and soybean during several growing seasons at different locations. This study presents the EDPM used in the companion paper (Kovalskyy and Henebry, 2011) in a coupling scheme to estimate daily actual evapotranspiration over multiple growing seasons.

  7. Modeling effectiveness of management practices for flood mitigation using GIS spatial analysis functions in Upper Cilliwung watershed

    NASA Astrophysics Data System (ADS)

    Darma Tarigan, Suria

    2016-01-01

    Flooding is caused by excessive rainfall flowing downstream as cumulative surface runoff. A flooding event is the result of complex interactions of natural system components such as rainfall events, land use, soil, topography, and channel characteristics. Modeling flooding events as the result of the interaction of those components is a central theme in watershed management, and such models are usually used to test the performance of various management practices in flood mitigation. There are various types of management practices for flood mitigation, including vegetative and structural measures. Existing hydrological models such as SWAT and HEC-HMS have limitations in accommodating discrete management practices such as infiltration wells, small farm reservoirs, and silt pits in their analysis due to their lumped structure. The aim of this research is to use the raster spatial analysis functions of a Geo-Information System (RGIS-HM) to model a flooding event in the Ciliwung watershed and to simulate the impact of discrete management practices on surface runoff reduction. The model was validated using the flooding event of the Ciliwung watershed on 29 January 2004, for which hourly hydrograph data and rainfall data were available. The validation provided a good result, with a Nash-Sutcliffe efficiency of 0.8. We also compared the RGIS-HM with the Netlogo Hydrological Model (NL-HM). The RGIS-HM has similar capability to the NL-HM in simulating discrete management practices at the watershed scale.

  8. Stochastic summation of empirical Green's functions

    USGS Publications Warehouse

    Wennerberg, Leif

    1990-01-01

    Two simple strategies are presented that use random delay times for repeatedly summing the record of a relatively small earthquake to simulate the effects of a larger earthquake. The simulations do not assume any fault plane geometry or rupture dynamics, but rely only on the ω−2 spectral model of an earthquake source and elementary notions of source complexity. The strategies simulate ground motions for all frequencies within the bandwidth of the record of the event used as a summand. The first strategy, which introduces the basic ideas, is a single-stage procedure that consists of simply adding many small events with random time delays. The probability distribution for delays has the property that its amplitude spectrum is determined by the ratio of ω−2 spectra, and its phase spectrum is identically zero. A simple expression is given for the computation of this zero-phase scaling distribution. The moment rate function resulting from the single-stage simulation is quite simple and hence is probably not realistic for high-frequency (>1 Hz) ground motion of events larger than ML ∼ 4.5 to 5. The second strategy is a two-stage summation that simulates source complexity with a few random subevent delays determined using the zero-phase scaling distribution, and then clusters energy around these delays to get an ω−2 spectrum for the sum. Thus, the two-stage strategy allows simulations of complex events of any size for which the ω−2 spectral model applies. Interestingly, a single-stage simulation with too few ω−2 records to get a good fit to an ω−2 large-event target spectrum yields a record whose spectral asymptotes are consistent with the ω−2 model, but that includes a region in its spectrum between the corner frequencies of the larger and smaller events reasonably approximated by a power law trend. This spectral feature has also been discussed as reflecting the process of partial stress release (Brune, 1970), an asperity failure (Boatwright, 1984), or the breakdown of ω−2 scaling due to rupture significantly longer than the width of the seismogenic zone (Joyner, 1984).
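    A minimal sketch of the zero-phase scaling idea is given below: the delay distribution's amplitude spectrum is taken as the ratio of large-event to small-event ω−2 (Brune-type) spectra with zero phase, and transformed back to the time domain. The corner frequencies and moments are invented for illustration; this is not Wennerberg's expression or code.

```python
import numpy as np

# Sketch of the zero-phase scaling distribution: its amplitude spectrum is the
# ratio of large-event to small-event omega-squared (Brune) source spectra,
# and its phase spectrum is identically zero.
def brune(f, moment, fc):
    return moment / (1.0 + (f / fc) ** 2)

dt, n = 0.01, 4096
f = np.fft.rfftfreq(n, dt)

ratio = brune(f, moment=1000.0, fc=0.5) / brune(f, moment=1.0, fc=5.0)
p = np.fft.irfft(ratio, n)                 # zero phase => purely real spectrum
p = np.fft.fftshift(p)                     # centre the (symmetric) pulse in time
p /= p.sum() * dt                          # normalize so it integrates to one

print(f"peak of scaling distribution: {p.max():.3f} at "
      f"t = {(np.argmax(p) - n // 2) * dt:.2f} s")
```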

  9. Simulation of earthquake caused building damages for the development of fast reconnaissance techniques

    NASA Astrophysics Data System (ADS)

    Schweier, C.; Markus, M.; Steinle, E.

    2004-04-01

    Catastrophic events like strong earthquakes can cause great losses in life and economic value. An increase in the efficiency of reconnaissance techniques could help to reduce the loss of life, as many victims die after and not during the event. A basic prerequisite for improving rescue teams' work is improved planning of the measures, which can only be done on the basis of reliable and detailed information about the actual situation in the affected regions. Therefore, a bundle of projects at Karlsruhe University aims at the development of a tool for fast information retrieval after strong earthquakes. The focus is on urban areas, as most losses occur there. In this paper the approach for a damage analysis of buildings will be presented. It consists of an automatic methodology to model buildings in three dimensions, a comparison of pre- and post-event models to detect changes, and a subsequent classification of the changes into damage types. The process is based on information extraction from airborne laserscanning data, i.e. digital surface models (DSMs) acquired through scanning of an area with pulsed laser light. To date, there are no laserscanning-derived DSMs available to the authors that were taken of areas that suffered damage from earthquakes. Therefore, it was necessary to simulate such data for the development of the damage detection methodology. In this paper two different methodologies used for simulating the data will be presented. The first method is to create CAD models of undamaged buildings based on their construction plans and alter them artificially as though they had suffered serious damage. A laserscanning data set is then simulated from these models, which can be compared with real laserscanning data acquired of the buildings (in their intact state). The other approach is to use measurements of actually damaged buildings and simulate their intact state. The geometrical structure of these damaged buildings can be modeled based on digital photography taken after the event by evaluating the images with photogrammetric methods. The intact state of the buildings is simulated based on on-site investigations, and finally laserscanning data are simulated for both states.
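    The change-detection step can be pictured as differencing pre- and post-event DSM rasters and flagging cells with large height loss. The sketch below uses a synthetic scene and an invented threshold, and omits the authors' 3D building modeling and damage-type classification.

```python
import numpy as np

# Minimal pre/post-event DSM comparison (illustrative only): cells losing more
# than a threshold of height between epochs are flagged as changed.
rng = np.random.default_rng(7)
pre_dsm = np.full((50, 50), 2.0) + rng.normal(0, 0.05, (50, 50))   # ground ~2 m
pre_dsm[20:30, 20:30] += 10.0                                      # a 10 m building

post_dsm = pre_dsm.copy()
post_dsm[20:30, 20:25] -= 8.0                                      # half the building collapses

height_change = post_dsm - pre_dsm
collapsed = height_change < -3.0                                   # 3 m loss threshold
print(f"cells flagged as collapsed: {collapsed.sum()} "
      f"({100 * collapsed.mean():.1f}% of the scene)")
```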

  10. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romano, Paul K.; Siegel, Andrew R.

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size to achieve vector efficiency greater than 90%. Lastly, when the execution times for events are allowed to vary, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration.
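    A toy version of such an efficiency argument, assuming constant execution time per event and counting wasted lanes in partially filled vector passes, is sketched below. The partitioning of the bank among event types is invented for illustration and this is not the paper's analytic model.

```python
import numpy as np

# Toy illustration: with a bank of N particles split among several event types,
# each type is processed in ceil(n_type / W) vector passes of width W, so
# partially filled passes reduce vector efficiency.
def vector_efficiency(bank_size, vector_width, n_event_types=5, trials=200, seed=0):
    rng = np.random.default_rng(seed)
    efficiencies = []
    for _ in range(trials):
        counts = rng.multinomial(bank_size, np.ones(n_event_types) / n_event_types)
        passes = np.ceil(counts / vector_width).sum()
        efficiencies.append(bank_size / (passes * vector_width))
    return float(np.mean(efficiencies))

for bank in (128, 1024, 8192):
    print(f"bank = {bank:5d}, width = 64: efficiency ≈ "
          f"{vector_efficiency(bank, 64):.2f}")
```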

  11. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    DOE PAGES

    Romano, Paul K.; Siegel, Andrew R.

    2017-07-01

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size to achieve vector efficiency greater than 90%. Lastly, when the execution times for events are allowed to vary, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration.

  12. Simulations of carbon fiber composite delamination tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kay, G

    2007-10-25

    Simulations of mode I interlaminar fracture toughness tests of a carbon-reinforced composite material (BMS 8-212) were conducted with LSDYNA. The fracture toughness tests were performed by U.C. Berkeley. The simulations were performed to investigate the validity and practicality of employing decohesive elements to represent the interlaminar bond failures that are prevalent in carbon-fiber composite structure penetration events. The simulations employed a decohesive element formulation that was verified on a simple two-element model before being used to perform the full model simulations. Care was required during the simulations to ensure that the explicit time integration of LSDYNA duplicated the near steady-state testing conditions. In general, this study validated the use of decohesive elements to represent the interlaminar bond failures seen in carbon-fiber composite structures, but the practicality of employing the elements to represent the bond failures seen in carbon-fiber composite structures during penetration events was not established.

  13. The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model.

    PubMed

    Fabian, M Patricia; Stout, Natasha K; Adamkiewicz, Gary; Geggel, Amelia; Ren, Cizao; Sandel, Megan; Levy, Jonathan I

    2012-09-18

    In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma but in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. Our model simulates the effect of environmental factors (combustion pollutants and allergens), medication compliance, seasonality, and medical history on asthma outcomes (symptom-days, medication use, hospitalizations, and emergency room visits). The model can be used to evaluate building interventions and green building construction practices on pollutant concentrations, energy savings, and asthma healthcare utilization costs, and demonstrates the value of a simulation approach for studying complex diseases such as asthma.
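    The exposure-to-outcome chain described above can be caricatured as follows: pollutant and allergen levels set FEV1%, which sets a per-period exacerbation probability. Every coefficient below is invented for illustration and none corresponds to the published risk equations.

```python
import random

# Highly simplified, hypothetical illustration of the exposure -> FEV1% ->
# outcome chain (all coefficients are invented, not the published model).
random.seed(5)

def fev1_percent(pm25, no2, allergen):
    # Toy linear decrements from baseline lung function.
    return max(40.0, 95.0 - 0.3 * pm25 - 0.1 * no2 - 5.0 * allergen)

def weekly_exacerbation_prob(fev1):
    # Toy risk equation: lower FEV1% -> higher weekly exacerbation probability.
    return min(0.5, 0.02 + 0.004 * (90.0 - fev1))

events = 0
for week in range(52):
    pm25 = random.uniform(5, 35)             # ug/m3, varies with season/ventilation
    no2 = random.uniform(10, 60)             # ppb
    allergen = random.choice([0, 1])         # cockroach allergen present or not
    if random.random() < weekly_exacerbation_prob(fev1_percent(pm25, no2, allergen)):
        events += 1
print(f"simulated exacerbation-weeks in one year: {events}")
```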

  14. The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model

    PubMed Central

    2012-01-01

    Background In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma but in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. Methods We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Results Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. Conclusions We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. Our model simulates the effect of environmental factors (combustion pollutants and allergens), medication compliance, seasonality, and medical history on asthma outcomes (symptom-days, medication use, hospitalizations, and emergency room visits). The model can be used to evaluate building interventions and green building construction practices on pollutant concentrations, energy savings, and asthma healthcare utilization costs, and demonstrates the value of a simulation approach for studying complex diseases such as asthma. PMID:22989068

  15. Impact of Assimilation on Heavy Rainfall Simulations Using WRF Model: Sensitivity of Assimilation Results to Background Error Statistics

    NASA Astrophysics Data System (ADS)

    Rakesh, V.; Kantharao, B.

    2017-03-01

    Data assimilation is considered one of the effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing the data assimilation methodology. One of the critical components that determines the amount and propagation of observation information into the analysis is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation impact the simulation of heavy rainfall events over Karnataka, a southern state in India. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gauge observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one that used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically using regional BES than global BES. These results have important practical implications for the design of forecast platforms and decision-making during extreme weather events.

  16. Monte Carlo Simulation Using HyperCard and Lotus 1-2-3.

    ERIC Educational Resources Information Center

    Oulman, Charles S.; Lee, Motoko Y.

    Monte Carlo simulation is a computer modeling procedure for mimicking observations on a random variable. A random number generator is used in generating the outcome for the events that are being modeled. The simulation can be used to obtain results that otherwise require extensive testing or complicated computations. This paper describes how Monte…
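    The procedure described above translates directly from spreadsheet formulas to a few lines of Python: a random number generator drives repeated trials of an event, and the empirical results approximate quantities that would otherwise require extensive testing or computation. The dice example below is an illustrative stand-in for whatever random variable is being mimicked.
    ```python
    import random

    def one_trial(rng):
        """One simulated observation: the sum of two six-sided dice."""
        return rng.randint(1, 6) + rng.randint(1, 6)

    rng = random.Random(42)          # seeded generator so the run is reproducible
    n_trials = 100_000
    outcomes = [one_trial(rng) for _ in range(n_trials)]

    mean = sum(outcomes) / n_trials
    p_seven = outcomes.count(7) / n_trials
    print(f"estimated mean = {mean:.3f} (exact 7.0)")
    print(f"estimated P(sum=7) = {p_seven:.4f} (exact {6/36:.4f})")
    ```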

  17. Discrete-Event Simulation Models of Plasmodium falciparum Malaria

    PubMed Central

    McKenzie, F. Ellis; Wong, Roger C.; Bossert, William H.

    2008-01-01

    We develop discrete-event simulation models using a single “timeline” variable to represent the Plasmodium falciparum lifecycle in individual hosts and vectors within interacting host and vector populations. Where they are comparable, our conclusions regarding the relative importance of vector mortality and the durations of host immunity and parasite development are congruent with those of classic differential-equation models of malaria epidemiology. However, our results also imply that in regions with intense perennial transmission, the influence of mosquito mortality on malaria prevalence in humans may be rivaled by that of the duration of host infectivity. PMID:18668185

  18. Using Coupled Groundwater-Surface Water Models to Simulate Eco-Regional Differences in Climate Change Impacts on Hydrological Drought Regimes in British Columbia

    NASA Astrophysics Data System (ADS)

    Dierauer, J. R.; Allen, D. M.

    2016-12-01

    Climate change is expected to lead to an increase in extremes, including daily maximum temperatures, heat waves, and meteorological droughts, which will likely result in shifts in the hydrological drought regime (i.e. the frequency, timing, duration, and severity of drought events). While many studies have used hydrologic models to simulate climate change impacts on water resources, only a small portion of these studies have analyzed impacts on low flows and/or hydrological drought. This study is the first to use a fully coupled groundwater-surface water (gw-sw) model to study climate change impacts on hydrological drought. Generic catchment-scale gw-sw models were created for each of the six major eco-regions in British Columbia using the MIKE-SHE/MIKE-11 modelling code. Daily precipitation and temperature time series downscaled using bias-correction spatial disaggregation for the simulated period of 1950-2100 were obtained from the Pacific Climate Impacts Consortium (PCIC). Streamflow and groundwater drought events were identified from the simulated time series for each catchment model using a moving-window quantile threshold. The frequency, timing, duration, and severity of drought events were compared between the reference period (1961-2000) and two future time periods (2031-2060, 2071-2100). Results show how hydrological drought regimes across the different British Columbia eco-regions will be impacted by climate change.
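    The moving-window quantile threshold used above to extract drought events from a simulated series can be sketched as follows. The 20th-percentile level, the 31-day window, and the synthetic flow record are illustrative assumptions rather than the study's settings.
    ```python
    import numpy as np

    def daily_threshold(flow, doy, window=15, q=20):
        """Variable threshold: for each day of year, the q-th percentile of all flows
        falling within +/- window days of that day of year (pooled over all years)."""
        thresh = np.empty(366)
        for d in range(1, 367):
            # circular day-of-year distance handles the year boundary
            dist = np.minimum(np.abs(doy - d), 366 - np.abs(doy - d))
            thresh[d - 1] = np.percentile(flow[dist <= window], q)
        return thresh

    def drought_events(flow, doy, thresh):
        """Return (start, duration, severity) for runs of days below the threshold."""
        below = flow < thresh[doy - 1]
        events, start = [], None
        for i, b in enumerate(np.append(below, False)):
            if b and start is None:
                start = i
            elif not b and start is not None:
                deficit = float(np.sum(thresh[doy[start:i] - 1] - flow[start:i]))
                events.append((start, i - start, deficit))
                start = None
        return events

    # Synthetic 40-year daily record with a seasonal cycle plus noise.
    rng = np.random.default_rng(0)
    doy = np.tile(np.arange(1, 366), 40)
    flow = 50 + 30 * np.sin(2 * np.pi * doy / 365) + rng.gamma(2.0, 10.0, doy.size)

    thresh = daily_threshold(flow, doy)
    events = drought_events(flow, doy, thresh)
    print(f"{len(events)} drought events; mean duration "
          f"{np.mean([e[1] for e in events]):.1f} days")
    ```
    Comparing the event statistics between a reference slice and a future slice of the same series gives the frequency, duration, and severity changes discussed above.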

  19. Simulations of cloud-radiation interaction using large-scale forcing derived from the CINDY/DYNAMO northern sounding array

    DOE PAGES

    Wang, Shuguang; Sobel, Adam H.; Fridlind, Ann; ...

    2015-09-25

    The recently completed CINDY/DYNAMO field campaign observed two Madden-Julian oscillation (MJO) events in the equatorial Indian Ocean from October to December 2011. Prior work has indicated that the moist static energy anomalies in these events grew and were sustained to a significant extent by radiative feedbacks. We present here a study of radiative fluxes and clouds in a set of cloud-resolving simulations of these MJO events. The simulations are driven by the large scale forcing dataset derived from the DYNAMO northern sounding array observations, and carried out in a doubly-periodic domain using the Weather Research and Forecasting (WRF) model. Simulated cloud properties and radiative fluxes are compared to those derived from the S-Polka radar and satellite observations. Furthermore, to accommodate the uncertainty in simulated cloud microphysics, a number of single moment (1M) and double moment (2M) microphysical schemes in the WRF model are tested.

  20. Data-Model and Inter-Model Comparisons of the GEM Outflow Events Using the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Welling, D. T.; Eccles, J. V.; Barakat, A. R.; Kistler, L. M.; Haaland, S.; Schunk, R. W.; Chappell, C. R.

    2015-12-01

    Two storm periods were selected by the Geospace Environment Modeling Ionospheric Outflow focus group for community collaborative study because of their high magnetospheric activity and extensive data coverage: the September 27 - October 4, 2002 corotating interaction region event and the October 22 - 29 coronal mass ejection event. During both events, the FAST, Polar, Cluster, and other missions made key observations, creating prime periods for data-model comparison. The GEM community has come together to simulate these periods using many different methods in order to evaluate models, compare results, and expand our knowledge of ionospheric outflow and its effects on global dynamics. This paper presents Space Weather Modeling Framework (SWMF) simulations of these important periods compared against observations from the Polar TIDE, Cluster CODIF and EFW instruments. Emphasis will be given to the second event. Density and velocity of oxygen and hydrogen throughout the lobes, plasma sheet, and inner magnetosphere will be the focus of these comparisons. For these simulations, the SWMF couples the multifluid version of BATS-R-US MHD to a variety of ionospheric outflow models of varying complexity. The simplest is outflow arising from constant MHD inner boundary conditions. Two first-principles-based models are also leveraged: the Polar Wind Outflow Model (PWOM), a fluid treatment of outflow dynamics, and the Generalized Polar Wind (GPW) model, which combines fluid and particle-in-cell approaches. Each model is capable of capturing a different set of energization mechanisms, yielding different outflow results. The data-model comparisons will illustrate how well each approach captures reality and which energization mechanisms are most important. Inter-model comparisons will illustrate how the different outflow specifications affect the magnetosphere. Specifically, it is found that the GPW provides increased heavy ion outflow over a broader spatial range than the alternative models, improving comparisons in some regions but degrading the agreement in others. This work will also assess our current capability to reproduce ionosphere-magnetosphere mass coupling.

  1. Discrete event simulation modelling of patient service management with Arena

    NASA Astrophysics Data System (ADS)

    Guseva, Elena; Varfolomeyeva, Tatyana; Efimova, Irina; Movchan, Irina

    2018-05-01

    This paper describes a simulation modeling methodology aimed at solving practical problems in the research and analysis of complex systems. The paper gives a review of simulation platforms and an example of simulation model development with Arena 15.0 (Rockwell Automation). The provided example of a simulation model for patient service management helps to evaluate the workload of the clinic doctors, determine the number of general practitioners, surgeons, traumatologists and other specialized doctors required for patient service, and develop recommendations to ensure timely delivery of medical care and improve the efficiency of the clinic operation.
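    The same kind of patient-service question can be prototyped outside Arena with a small queueing model; the sketch below uses the SimPy library in Python. The arrival rate, consultation time, and staffing level are illustrative assumptions used only to show how doctor workload and waiting times fall out of such a model.
    ```python
    import random
    import simpy

    # Minimal patient-service sketch (SimPy, not Arena).  All rates are assumed.
    ARRIVAL_MEAN = 6.0        # minutes between patient arrivals
    CONSULT_MEAN = 15.0       # minutes per consultation
    N_DOCTORS = 3             # e.g., general practitioners on duty

    waits = []

    def patient(env, doctors):
        arrived = env.now
        with doctors.request() as req:
            yield req                              # queue for the next free doctor
            waits.append(env.now - arrived)        # record waiting time
            yield env.timeout(random.expovariate(1.0 / CONSULT_MEAN))

    def arrivals(env, doctors):
        while True:
            yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
            env.process(patient(env, doctors))

    random.seed(1)
    env = simpy.Environment()
    doctors = simpy.Resource(env, capacity=N_DOCTORS)
    env.process(arrivals(env, doctors))
    env.run(until=8 * 60)                          # one 8-hour clinic day

    print(f"patients seen: {len(waits)}")
    print(f"mean wait: {sum(waits)/len(waits):.1f} min")
    ```
    Re-running the model over a range of N_DOCTORS values is the simulation analogue of the staffing recommendations described above.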

  2. Bayesian Approach for Flexible Modeling of Semicompeting Risks Data

    PubMed Central

    Han, Baoguang; Yu, Menggang; Dignam, James J.; Rathouz, Paul J.

    2016-01-01

    Summary Semicompeting risks data arise when two types of events, non-terminal and terminal, are observed. When the terminal event occurs first, it censors the non-terminal event, but not vice versa. To account for possible dependent censoring of the non-terminal event by the terminal event and to improve prediction of the terminal event using the non-terminal event information, it is crucial to model their association properly. Motivated by a breast cancer clinical trial data analysis, we extend the well-known illness-death models to allow flexible random effects to capture heterogeneous association structures in the data. Our extension also represents a generalization of the popular shared frailty models that usually assume that the non-terminal event does not affect the hazards of the terminal event beyond a frailty term. We propose a unified Bayesian modeling approach that can utilize existing software packages for both model fitting and individual specific event prediction. The approach is demonstrated via both simulation studies and a breast cancer data set analysis. PMID:25274445
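    The simulation studies mentioned above rest on generating semicompeting risks data in which a shared frailty induces dependence between the two event times and the terminal event censors the non-terminal one. A minimal data-generating sketch follows; the baseline hazards, frailty variance, and censoring time are illustrative assumptions, not the paper's settings.
    ```python
    import numpy as np

    # Shared gamma frailty multiplies the hazards of both events for each subject;
    # the terminal event censors the non-terminal event but not vice versa.
    rng = np.random.default_rng(7)
    n = 5000
    theta = 0.5                      # frailty variance (assumed)
    lam_nt, lam_t = 0.10, 0.05       # baseline exponential hazards per year (assumed)
    admin_censor = 10.0              # administrative censoring at 10 years (assumed)

    frailty = rng.gamma(shape=1/theta, scale=theta, size=n)   # mean 1, variance theta
    t_nonterminal = rng.exponential(1.0 / (lam_nt * frailty))
    t_terminal = rng.exponential(1.0 / (lam_t * frailty))

    # Observed data under semicompeting risks censoring.
    y_nt = np.minimum.reduce([t_nonterminal, t_terminal, np.full(n, admin_censor)])
    d_nt = (t_nonterminal <= np.minimum(t_terminal, admin_censor)).astype(int)
    y_t = np.minimum(t_terminal, admin_censor)
    d_t = (t_terminal <= admin_censor).astype(int)

    print("observed non-terminal events:", d_nt.sum())
    print("observed terminal events:    ", d_t.sum())
    print("both events observed:        ", int(np.sum(d_nt * d_t)))
    ```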

  3. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romano, Paul K.; Siegel, Andrew R.

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector size in order to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
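    The constant-execution-time case can be illustrated by a toy model: a particle bank of size N is processed W particles at a time, histories terminate between event iterations, and vector efficiency is the fraction of lanes doing useful work. The per-event survival probability below is an illustrative assumption, not a value from the study, but the qualitative bank-size dependence is the same.
    ```python
    import math

    def vector_efficiency(bank_size, vector_width, survive_prob=0.9):
        """Fraction of vector lanes doing useful work over a whole batch."""
        useful_lanes = 0
        total_lanes = 0
        alive = bank_size
        while alive > 0:
            ops = math.ceil(alive / vector_width)      # vector operations this iteration
            useful_lanes += alive
            total_lanes += ops * vector_width
            alive = int(alive * survive_prob)          # histories surviving to next event
        return useful_lanes / total_lanes

    width = 8
    for ratio in (1, 5, 20, 100):
        eff = vector_efficiency(bank_size=ratio * width, vector_width=width)
        print(f"bank/vector ratio {ratio:4d}: efficiency {eff:.2f}")
    ```
    As the bank-to-vector-width ratio grows, late iterations with partially filled lanes matter less and the efficiency approaches one, mirroring the trend described above.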

  4. Estimating hypothetical present-day insured losses for past intense hurricanes in the French Antilles

    NASA Astrophysics Data System (ADS)

    Thornton, James; Desarthe, Jérémy; Naulin, Jean-Philippe; Garnier, Emmanuel; Liu, Ye; Moncoulon, David

    2015-04-01

    On the islands of the French Antilles, the period for which systematic meteorological measurements and historic event loss data are available is short relative to the recurrence intervals of very intense, damaging hurricanes. Additionally, the value of property at risk changes through time. As such, the recent past can only provide limited insight into potential losses from extreme storms in coming years. Here we present some research that seeks to overcome, as far as is possible, the limitations of record length in assessing the possible impacts of near-future hurricanes on insured properties. First, using the archives of the French overseas departments (which included administrative and weather reports, inventories of damage to houses, crops and trees, as well as some meteorological observations after 1950) we reconstructed the spatial patterns of hazard intensity associated with three historical events. They are: i) the 1928 Hurricane (Guadeloupe), ii) Hurricane Betsy (1956, Guadeloupe) and iii) Hurricane David (1979, Martinique). These events were selected because all were damaging, and the information available on each is rich. Then, using a recently developed catastrophe model for hurricanes affecting Guadeloupe, Martinique, Saint-Barthélemy and Saint-Martin, we simulated the hypothetical losses to insured properties that the reconstructed events might cause if they were to reoccur today. The model simulated damage due to wind, rainfall-induced flooding and storm surge flooding. These 'what if' scenarios provided an initial indication of the potential present-day exposure of the insurance industry to intense hurricanes. However, we acknowledge that historical events are unlikely to repeat exactly. We therefore extended the study by producing a stochastic event catalogue containing a large number of synthetic but plausible hurricane events. Instrumental data were used as a basis for event generation, but importantly the statistical methods we applied permit the extrapolation of simulated events beyond the observed intensity ranges. The event catalogue enabled the model to be run in a probabilistic mode; the losses for each synthetic event in a 10,000-year period were simulated. In this way, the aleatory uncertainty associated with future hazard outcomes was addressed. In conclusion, we consider how the reconstructed event hazard intensities and losses compare with the distribution of 32,320 events in the stochastic event set. Further comparisons are made with a longer chronology of tropical cyclones in the Antilles (going back to the 17th Century) prepared solely from documentary sources. Overall, the novelty of this work lies in the integration of data sources that are frequently overlooked in catastrophe model development and evaluation.
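    The "probabilistic mode" described above, in which losses are simulated for every synthetic event in a 10,000-year catalogue, can be sketched with an assumed Poisson event frequency and lognormal loss severity; none of the numbers below come from the catastrophe model itself.
    ```python
    import numpy as np

    rng = np.random.default_rng(2015)
    n_years = 10_000
    event_rate = 0.6                       # damaging hurricanes per year (assumed)
    mu, sigma = np.log(50e6), 1.5          # lognormal loss parameters (assumed)

    annual_loss = np.zeros(n_years)
    n_events_total = 0
    for year in range(n_years):
        n_events = rng.poisson(event_rate)          # how many events hit this year
        n_events_total += n_events
        if n_events:
            annual_loss[year] = rng.lognormal(mu, sigma, n_events).sum()

    print(f"{n_events_total} synthetic events over {n_years} years")
    for rp in (10, 100, 1000):             # return periods of annual aggregate loss
        loss = np.quantile(annual_loss, 1 - 1 / rp)
        print(f"{rp:5d}-year aggregate loss: {loss/1e6:8.1f} M")
    ```
    Placing the loss of a reconstructed historical event on this exceedance curve is one way to express how the 1928, 1956, and 1979 scenarios compare with the stochastic event set.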

  5. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  6. Simulator for Microlens Planet Surveys

    NASA Astrophysics Data System (ADS)

    Ipatov, Sergei I.; Horne, Keith; Alsubai, Khalid A.; Bramich, Daniel M.; Dominik, Martin; Hundertmark, Markus P. G.; Liebig, Christine; Snodgrass, Colin D. B.; Street, Rachel A.; Tsapras, Yiannis

    2014-04-01

    We summarize the status of a computer simulator for microlens planet surveys. The simulator generates synthetic light curves of microlensing events observed with specified networks of telescopes over specified periods of time. Particular attention is paid to models for sky brightness and seeing, calibrated by fitting to data from the OGLE survey and RoboNet observations in 2011. Time intervals during which events are observable are identified by accounting for positions of the Sun and the Moon, and other restrictions on telescope pointing. Simulated observations are then generated for an algorithm that adjusts target priorities in real time with the aim of maximizing planet detection zone area summed over all the available events. The exoplanet detection capability of observations was compared for several telescopes.

  7. Budget impact analysis of thrombolysis for stroke in Spain: a discrete event simulation model.

    PubMed

    Mar, Javier; Arrospide, Arantzazu; Comas, Mercè

    2010-01-01

    Thrombolysis within the first 3 hours after the onset of symptoms of a stroke has been shown to be a cost-effective treatment because treated patients are 30% more likely than nontreated patients to have no residual disability. The objective of this study was to calculate by means of a discrete event simulation model the budget impact of thrombolysis in Spain. The budget impact analysis was based on stroke incidence rates and the estimation of the prevalence of stroke-related disability in Spain and its translation to hospital and social costs. A discrete event simulation model was constructed to represent the flow of patients with stroke in Spain. If 10% of patients with stroke from 2000 to 2015 would receive thrombolytic treatment, the prevalence of dependent patients in 2015 would decrease from 149,953 to 145,922. For the first 6 years, the cost of intervention would surpass the savings. Nevertheless, the number of cases in which patient dependency was avoided would steadily increase, and after 2006 the cost savings would be greater, with a widening difference between the cost of intervention and the cost of nonintervention, until 2015. The impact of thrombolysis on society's health and social budget indicates a net benefit after 6 years, and the improvement in health grows continuously. The validation of the model demonstrates the adequacy of the discrete event simulation approach in representing the epidemiology of stroke to calculate the budget impact.

  8. Quasi-dynamic versus fully dynamic simulations of earthquakes and aseismic slip with and without enhanced coseismic weakening

    NASA Astrophysics Data System (ADS)

    Thomas, Marion Y.; Lapusta, Nadia; Noda, Hiroyuki; Avouac, Jean-Philippe

    2014-03-01

    Physics-based numerical simulations of earthquakes and slow slip, coupled with field observations and laboratory experiments, can, in principle, be used to determine fault properties and potential fault behaviors. Because of the computational cost of simulating inertial wave-mediated effects, their representation is often simplified. The quasi-dynamic (QD) approach approximately accounts for inertial effects through a radiation damping term. We compare QD and fully dynamic (FD) simulations by exploring the long-term behavior of rate-and-state fault models with and without additional weakening during seismic slip. The models incorporate a velocity-strengthening (VS) patch in a velocity-weakening (VW) zone, to consider rupture interaction with a slip-inhibiting heterogeneity. Without additional weakening, the QD and FD approaches generate qualitatively similar slip patterns with quantitative differences, such as slower slip velocities and rupture speeds during earthquakes and more propensity for rupture arrest at the VS patch in the QD cases. Simulations with additional coseismic weakening produce qualitatively different patterns of earthquakes, with near-periodic pulse-like events in the FD simulations and much larger crack-like events accompanied by smaller events in the QD simulations. This is because the FD simulations with additional weakening allow earthquake rupture to propagate at a much lower level of prestress than the QD simulations. The resulting much larger ruptures in the QD simulations are more likely to propagate through the VS patch, unlike for the cases with no additional weakening. Overall, the QD approach should be used with caution, as the QD simulation results could drastically differ from the true response of the physical model considered.

  9. Evaluation of the HadGEM3-A simulations in view of detection and attribution of human influence on extreme events in Europe

    NASA Astrophysics Data System (ADS)

    Vautard, Robert; Christidis, Nikolaos; Ciavarella, Andrew; Alvarez-Castro, Carmen; Bellprat, Omar; Christiansen, Bo; Colfescu, Ioana; Cowan, Tim; Doblas-Reyes, Francisco; Eden, Jonathan; Hauser, Mathias; Hegerl, Gabriele; Hempelmann, Nils; Klehmet, Katharina; Lott, Fraser; Nangini, Cathy; Orth, René; Radanovics, Sabine; Seneviratne, Sonia I.; van Oldenborgh, Geert Jan; Stott, Peter; Tett, Simon; Wilcox, Laura; Yiou, Pascal

    2018-04-01

    A detailed analysis is carried out to assess the HadGEM3-A global atmospheric model skill in simulating extreme temperatures, precipitation and storm surges in Europe in view of their attribution to human influence. The analysis is performed based on an ensemble of 15 atmospheric simulations forced with observed sea surface temperatures for the 54-year period 1960-2013. These simulations, together with dual simulations without human influence in the forcing, are intended to be used in weather and climate event attribution. The analysis investigates the main processes leading to extreme events, including atmospheric circulation patterns, their links with temperature extremes, land-atmosphere and troposphere-stratosphere interactions. It also compares observed and simulated variability, trends and generalized extreme value theory parameters for temperature and precipitation. One of the most striking findings is the ability of the model to capture North-Atlantic atmospheric weather regimes as obtained from a cluster analysis of sea level pressure fields. The model also reproduces the main observed weather patterns responsible for temperature and precipitation extreme events. However, biases are found in many physical processes. Slightly excessive drying may be the cause of an overestimated summer interannual variability and too intense heat waves, especially in central/northern Europe. However, this does not seem to hinder proper simulation of summer temperature trends. Cold extremes appear well simulated, as well as the underlying blocking frequency and stratosphere-troposphere interactions. Extreme precipitation amounts are overestimated and too variable. The atmospheric conditions leading to storm surges were also examined in the Baltic region. There, the simulated weather conditions do not appear to produce strong enough storm surges, but winds were found to be in very good agreement with reanalyses. The performance in reproducing atmospheric weather patterns indicates that biases mainly originate from local and regional physical processes. This makes local bias adjustment meaningful for climate change attribution.

  10. The need for enhanced initial moisture information in simulations of a complex summertime precipitation event

    NASA Technical Reports Server (NTRS)

    Waight, Kenneth T., III; Zack, John W.; Karyampudi, V. Mohan

    1989-01-01

    Initial simulations of the June 28, 1986 Cooperative Huntsville Meteorological Experiment case illustrate the need for mesoscale moisture information in a summertime situation in which deep convection is organized by weak large scale forcing. A methodology is presented for enhancing the initial moisture field from a combination of IR satellite imagery, surface-based cloud observations, and manually digitized radar data. The Mesoscale Atmospheric Simulation Model is utilized to simulate the events of June 28-29. This procedure ensures that areas known to have precipitation at the time of initialization will be nearly saturated on the grid scale, which should decrease the time needed by the model to produce the observed convection associated with Bonnie, a relatively weak hurricane that had moved onshore two days earlier. This method will also result in an initial distribution of model cloudiness (transmissivity) that is very similar to that of the IR satellite image.

  11. Stochastic Simulation of Actin Dynamics Reveals the Role of Annealing and Fragmentation

    PubMed Central

    Fass, Joseph; Pak, Chi; Bamburg, James; Mogilner, Alex

    2008-01-01

    Recent observations of F-actin dynamics call for theoretical models to interpret and understand the quantitative data. A number of existing models rely on simplifications and do not take into account F-actin fragmentation and annealing. We use Gillespie’s algorithm for stochastic simulations of the F-actin dynamics including fragmentation and annealing. The simulations vividly illustrate that fragmentation and annealing have little influence on the shape of the polymerization curve and on nucleotide profiles within filaments but drastically affect the F-actin length distribution, making it exponential. We find that recent surprising measurements of high length diffusivity at the critical concentration cannot be explained by fragmentation and annealing events unless both fragmentation rates and frequency of undetected fragmentation and annealing events are greater than previously thought. The simulations compare well with experimentally measured actin polymerization data and lend additional support to a number of existing theoretical models. PMID:18279896
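    The core of Gillespie's algorithm referenced above is to draw the waiting time to the next event from the total propensity and then select which event fires in proportion to its propensity. The sketch below applies this to filament fragmentation and annealing only (no polymerization); the rate constants and initial condition are illustrative assumptions, not the study's parameters.
    ```python
    import random

    K_FRAG = 1e-4      # fragmentation rate per inter-subunit bond (assumed)
    K_ANN = 2e-4       # annealing rate per pair of filaments (assumed)

    def gillespie(filaments, t_end, rng):
        """Evolve a list of filament lengths by fragmentation and annealing events."""
        t = 0.0
        while t < t_end:
            n = len(filaments)
            a_frag = K_FRAG * sum(L - 1 for L in filaments)   # total breakable bonds
            a_ann = K_ANN * n * (n - 1) / 2.0                 # filament pairs
            a_total = a_frag + a_ann
            if a_total == 0:
                break
            t += rng.expovariate(a_total)                     # time to next event
            if rng.random() < a_frag / a_total:
                # fragmentation: pick a bond uniformly over all bonds
                i = rng.choices(range(n), weights=[L - 1 for L in filaments])[0]
                L = filaments.pop(i)
                cut = rng.randint(1, L - 1)
                filaments += [cut, L - cut]
            else:
                # annealing: join two distinct filaments end to end
                i, j = rng.sample(range(n), 2)
                joined = filaments[i] + filaments[j]
                filaments = [L for k, L in enumerate(filaments) if k not in (i, j)]
                filaments.append(joined)
        return filaments

    rng = random.Random(3)
    final = gillespie([200] * 100, t_end=5000.0, rng=rng)     # 100 filaments of 200 subunits
    print(f"{len(final)} filaments, mean length {sum(final)/len(final):.1f} subunits")
    ```
    Tracking the length list over time shows how fragmentation and annealing reshape the length distribution even while the total amount of polymer is conserved, which is the qualitative point made above.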

  12. Analysis on flood generation processes by means of a continuous simulation model

    NASA Astrophysics Data System (ADS)

    Fiorentino, M.; Gioia, A.; Iacobellis, V.; Manfreda, S.

    2006-03-01

    In the present research, we used continuous hydrological simulation to investigate the key variables responsible for flood peak formation. For this purpose, a distributed hydrological model (DREAM) is used in cascade with a rainfall generator (IRP-Iterated Random Pulse) to simulate a large number of extreme events, providing insight into the main controls of flood generation mechanisms. The investigated variables are those used in theoretically derived probability distributions of floods based on the concept of partial contributing area (e.g., Iacobellis and Fiorentino, 2000). The continuous simulation model is used to investigate the hydrological losses occurring during extreme events, the variability of the source area contributing to the flood peak, and its lag time. Results suggest interesting simplifications of the theoretical probability distribution of floods according to the different climatic and geomorphologic environments. The study is applied to two basins located in Southern Italy with different climatic characteristics.

  13. Airlift Operation Modeling Using Discrete Event Simulation (DES)

    DTIC Science & Technology

    2009-12-01

    [Excerpt from the report's contents and acronym list: Java; Simkit; JRE (Java Runtime Environment); JVM (Java Virtual Machine); lbs (pounds); LAM (Load Allocation Mode); LRM (Landing Spot Reassignment Mode); LEGO (Listener Event ...). Software development environment: the models were constructed using Java.]

  14. Spontaneous abrupt climate change due to an atmospheric blocking-sea-ice-ocean feedback in an unforced climate model simulation.

    PubMed

    Drijfhout, Sybren; Gleeson, Emily; Dijkstra, Henk A; Livina, Valerie

    2013-12-03

    Abrupt climate change is abundant in geological records, but climate models rarely have been able to simulate such events in response to realistic forcing. Here we report on a spontaneous abrupt cooling event, lasting for more than a century, with a temperature anomaly similar to that of the Little Ice Age. The event was simulated in the preindustrial control run of a high-resolution climate model, without imposing external perturbations. Initial cooling started with a period of enhanced atmospheric blocking over the eastern subpolar gyre. In response, a southward progression of the sea-ice margin occurred, and the sea-level pressure anomaly was locked to the sea-ice margin through thermal forcing. The cold-core high steered more cold air to the area, reinforcing the sea-ice concentration anomaly east of Greenland. The sea-ice surplus was carried southward by ocean currents around the tip of Greenland. South of 70 °N, sea ice already started melting and the associated freshwater anomaly was carried to the Labrador Sea, shutting off deep convection. There, surface waters were exposed longer to atmospheric cooling and sea surface temperature dropped, causing an even larger thermally forced high above the Labrador Sea. In consequence, east of Greenland, anomalous winds changed from north to south, terminating the event with similar abruptness to its onset. Our results imply that only climate models that possess sufficient resolution to correctly represent atmospheric blocking, in combination with a sensitive sea-ice model, are able to simulate this kind of abrupt climate change.

  15. Spontaneous abrupt climate change due to an atmospheric blocking–sea-ice–ocean feedback in an unforced climate model simulation

    PubMed Central

    Drijfhout, Sybren; Gleeson, Emily; Dijkstra, Henk A.; Livina, Valerie

    2013-01-01

    Abrupt climate change is abundant in geological records, but climate models rarely have been able to simulate such events in response to realistic forcing. Here we report on a spontaneous abrupt cooling event, lasting for more than a century, with a temperature anomaly similar to that of the Little Ice Age. The event was simulated in the preindustrial control run of a high-resolution climate model, without imposing external perturbations. Initial cooling started with a period of enhanced atmospheric blocking over the eastern subpolar gyre. In response, a southward progression of the sea-ice margin occurred, and the sea-level pressure anomaly was locked to the sea-ice margin through thermal forcing. The cold-core high steered more cold air to the area, reinforcing the sea-ice concentration anomaly east of Greenland. The sea-ice surplus was carried southward by ocean currents around the tip of Greenland. South of 70°N, sea ice already started melting and the associated freshwater anomaly was carried to the Labrador Sea, shutting off deep convection. There, surface waters were exposed longer to atmospheric cooling and sea surface temperature dropped, causing an even larger thermally forced high above the Labrador Sea. In consequence, east of Greenland, anomalous winds changed from north to south, terminating the event with similar abruptness to its onset. Our results imply that only climate models that possess sufficient resolution to correctly represent atmospheric blocking, in combination with a sensitive sea-ice model, are able to simulate this kind of abrupt climate change. PMID:24248352

  16. Transient modeling in simulation of hospital operations for emergency response.

    PubMed

    Paul, Jomon Aliyas; George, Santhosh K; Yi, Pengfei; Lin, Li

    2006-01-01

    Rapid estimates of hospital capacity after an event that may cause a disaster can assist disaster-relief efforts. Because hospital operations are dynamic following such an event, it is necessary to model the behavior of the system accurately. A transient modeling approach using simulation and exponential functions is presented, along with its applications in an earthquake situation. The parameters of the exponential model are regressed using outputs from designed simulation experiments. The developed model is capable of representing transient patient waiting times during a disaster. Most importantly, the modeling approach allows real-time capacity estimation of hospitals of various sizes and capabilities. Further, this research analyzes the effects of priority-based routing of patients within the hospital on patient waiting times, determined using various patient mixes. The model guides the patients based on the severity of injuries and queues the patients requiring critical care depending on their remaining survivability time. The model also accounts for the impact of prehospital transport time on patient waiting time.
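    The regression step, fitting an exponential transient to simulation outputs so that capacity can be estimated in real time, can be sketched as follows. The functional form W(t) = W_ss(1 - exp(-t/tau)), the synthetic "simulation output," and all parameter values are illustrative assumptions.
    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def transient_wait(t, w_ss, tau):
        """Waiting time rising from zero toward a steady-state value w_ss."""
        return w_ss * (1.0 - np.exp(-t / tau))

    # Pretend these came from designed simulation experiments (hours after the event).
    rng = np.random.default_rng(1)
    t_hours = np.linspace(0, 48, 25)
    w_observed = transient_wait(t_hours, w_ss=6.0, tau=8.0) + rng.normal(0, 0.3, t_hours.size)

    popt, pcov = curve_fit(transient_wait, t_hours, w_observed, p0=[5.0, 5.0])
    w_ss_hat, tau_hat = popt
    print(f"estimated steady-state wait: {w_ss_hat:.2f} h, time constant: {tau_hat:.1f} h")
    print(f"predicted wait 12 h after the event: {transient_wait(12, *popt):.2f} h")
    ```
    Once the parameters are regressed for hospitals of different sizes and patient mixes, evaluating the fitted function is fast enough for the real-time capacity estimates described above.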

  17. Simulating Coronal Loop Implosion and Compressible Wave Modes in a Flare Hit Active Region

    NASA Astrophysics Data System (ADS)

    Sarkar, Aveek; Vaidya, Bhargav; Hazra, Soumitra; Bhattacharyya, Jishnu

    2017-12-01

    There is considerable observational evidence of implosion of magnetic loop systems inside solar coronal active regions following high-energy events like solar flares. In this work, we propose that such collapse can be modeled in three dimensions quite accurately within the framework of ideal magnetohydrodynamics. We furthermore argue that the dynamics of loop implosion is only sensitive to the transmitted disturbance of one or more of the system variables, e.g., velocity generated at the event site. This indicates that to understand loop implosion, it is sensible to leave the event site out of the simulated active region. Toward our goal, a velocity pulse is introduced to model the transmitted disturbance generated at the event site. Magnetic field lines inside our simulated active region are traced in real time, and it is demonstrated that the subsequent dynamics of the simulated loops closely resemble observed imploding loops. Our work highlights the role of plasma β in regards to the rigidity of the loop systems and how that might affect the imploding loops’ dynamics. Compressible magnetohydrodynamic modes such as kink and sausage are also shown to be generated during such processes, in accordance with observations.

  18. Modeling of ESD events from polymeric surfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pfeifer, Kent Bryant

    2014-03-01

    Transient electrostatic discharge (ESD) events are studied to assemble a predictive model of discharge from polymer surfaces. An analog circuit simulation is produced and its response is compared to various literature sources to explore its capabilities and limitations. Results suggest that polymer ESD events can be predicted to within an order of magnitude. These results compare well to empirical findings from other sources having similar reproducibility.

  19. Evaluation of Precipitation Simulated by Seven SCMs against the ARM Observations at the SGP Site

    NASA Technical Reports Server (NTRS)

    Song, Hua; Lin, Wuyin; Lin, Yanluan; Wolf, Audrey B.; Neggers, Roel; Donner, Leo J.; Del Genio, Anthony D.; Liu, Yangang

    2013-01-01

    This study evaluates the performances of seven single-column models (SCMs) by comparing simulated surface precipitation with observations at the Atmospheric Radiation Measurement Program Southern Great Plains (SGP) site from January 1999 to December 2001. Results show that although most SCMs can reproduce the observed precipitation reasonably well, there are significant and interesting differences in their details. In the cold season, the model-observation differences in the frequency and mean intensity of rain events tend to compensate each other for most SCMs. In the warm season, most SCMs produce more rain events in daytime than in nighttime, whereas the observations have more rain events in nighttime. The mean intensities of rain events in these SCMs are much stronger in daytime, but weaker in nighttime, than the observations. The higher frequency of rain events during warm-season daytime in most SCMs is related to the fact that most SCMs produce a spurious precipitation peak around the regime of weak vertical motions but rich in moisture content. The models also show distinct biases between nighttime and daytime in simulating significant rain events. In nighttime, all the SCMs have a lower frequency of moderate-to-strong rain events than the observations for both seasons. In daytime, most SCMs have a higher frequency of moderate-to-strong rain events than the observations, especially in the warm season. Further analysis reveals distinct meteorological backgrounds for large underestimation and overestimation events. The former occur in the strong ascending regimes with negative low-level horizontal heat and moisture advection, whereas the latter occur in the weak or moderate ascending regimes with positive low-level horizontal heat and moisture advection.

  20. Bayesian Techniques for Comparing Time-dependent GRMHD Simulations to Variable Event Horizon Telescope Observations

    NASA Astrophysics Data System (ADS)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.
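    One way to picture the variability-aware comparison described above is to marginalize the data likelihood over simulation snapshots instead of comparing against a single time-averaged image. The sketch below uses synthetic placeholder numbers for the "visibility amplitudes," noise level, and snapshot statistics; it is not EHT data or GRMHD output, only an illustration of the log-mean-exp marginalization.
    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_baselines, n_snapshots = 8, 200
    sigma_noise = 0.05

    # Model: each snapshot scatters around a mean visibility amplitude per baseline.
    model_mean = rng.uniform(0.2, 1.0, n_baselines)
    snapshots = model_mean + rng.normal(0, 0.15, (n_snapshots, n_baselines))

    # One "observed" epoch drawn from the same variable model plus measurement noise.
    data = snapshots[0] + rng.normal(0, sigma_noise, n_baselines)

    def gauss_loglike(data, model, sigma):
        return -0.5 * np.sum(((data - model) / sigma) ** 2 + np.log(2 * np.pi * sigma**2))

    # Time-independent approach: compare against the time-averaged image only.
    ll_static = gauss_loglike(data, snapshots.mean(axis=0), sigma_noise)

    # Variability-aware approach: marginalize over snapshots (log of mean likelihood).
    ll_snap = np.array([gauss_loglike(data, s, sigma_noise) for s in snapshots])
    ll_variable = np.log(np.mean(np.exp(ll_snap - ll_snap.max()))) + ll_snap.max()

    print(f"log-likelihood, time-averaged model:        {ll_static:10.2f}")
    print(f"log-likelihood, marginalized over snapshots: {ll_variable:10.2f}")
    ```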

  1. BAYESIAN TECHNIQUES FOR COMPARING TIME-DEPENDENT GRMHD SIMULATIONS TO VARIABLE EVENT HORIZON TELESCOPE OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.

  2. Flood simulation and verification with IoT sensors

    NASA Astrophysics Data System (ADS)

    Chang, Che-Hao; Hsu, Chih-Tsung; Wu, Shiang-Jen; Huang, Sue-Wei

    2017-04-01

    2D dynamic flood simulation is a vivid tool for demonstrating the area that may be exposed to the impact of a rise in water level. With progress in high-resolution digital terrain models, the simulation results are convincing, yet they are rarely proven to be close to what actually happened. Because of the dynamic and uncertain nature of floods, the exposed area usually cannot be well defined during an event. Recent developments in IoT sensors provide low-power, long-distance communication that helps us collect real-time flood depths. With these time series of flood depths at different locations, we are able to verify the simulation results for a given flood event. In this study, 16 flood gauges with IoT specifications and two flood events in Annan district, Tainan city, Taiwan, are examined. During the event on 11 June 2016, 12 flood gauges worked well and 8 of them provided observations that matched the simulation.

  3. Capabilities of a Global 3D MHD Model for Monitoring Extremely Fast CMEs

    NASA Astrophysics Data System (ADS)

    Wu, C. C.; Plunkett, S. P.; Liou, K.; Socker, D. G.; Wu, S. T.; Wang, Y. M.

    2015-12-01

    Since the start of the space era, spacecraft have recorded many extremely fast coronal mass ejections (CMEs) which have resulted in severe geomagnetic storms. Accurate and timely forecasting of the space weather effects of these events is important for protecting expensive space assets and astronauts and avoiding communications interruptions. Here, we will introduce a newly developed global, three-dimensional (3D) magnetohydrodynamic (MHD) model (G3DMHD). The model takes the solar magnetic field maps at 2.5 solar radii (Rs) and interpolates the solar wind plasma and field out to 18 Rs using the algorithm of Wang and Sheeley (1990, JGR). The output is used as the inner boundary condition for a 3D MHD model. The G3DMHD model is capable of simulating (i) extremely fast CME events with propagation speeds faster than 2500 km/s; and (ii) multiple CME events in sequence or simultaneously. We will demonstrate the simulation results (and comparison with in-situ observations) for the fastest CME on record, on 23 July 2012, the event with the shortest transit time, in March 1976, and the well-known historic Carrington event of 1859.

  4. Simulation of the last sapropel event using high-regional oceanic model

    NASA Astrophysics Data System (ADS)

    Vadsaria, Tristan; Ramstein, Gilles; Li, Laurent; Dutay, jean-Claude

    2017-04-01

    For decades, the simulation of sapropel events has remained a challenge. These events, occurring periodically in the Mediterranean Sea, produce a strong stratification of the water column and interrupt intermediate and deep convection, thereby leading to a decrease in deep-water oxygen, evidence of which is recorded in marine sediment cores. Data from Mediterranean sediments have thus helped to better understand the anoxia process, in particular for the last sapropel event, S1, which lasted 3000 years about 10 kyr ago. However, the causal link between insolation changes, the African monsoon variations thought to trigger sapropel events, and anoxia has still to be quantified. From a modelling point of view, a prerequisite for studying sapropel events is to capture the seasonal winds that are instrumental in producing convection in the Mediterranean Sea. Recently, several high-resolution modelling studies have begun to fill this gap by building different scenarios (Grimm et al., 2015). Combining an atmospheric GCM (LMDZ4) and a high-resolution oceanic model (NEMOMED8, 1/8 degree resolution) dedicated to the Mediterranean Sea, our first objective is to test whether monsoon precipitation triggered by insolation changes can increase the Nile run-off enough to stratify the eastern Mediterranean Sea. We notably show that a 15 mSv increase in Nile runoff triggers a large decrease of convection in the whole eastern Mediterranean Sea, associated with strong anoxia in bottom waters. Comparisons of our first experiments with δ18O and ɛ-Nd data will also be presented. Future work includes extending our simulations to investigate whether sapropel events can be maintained on longer time scales.

  5. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2013-07-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible but not yet observed flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with the surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of the insured flood losses are located inside the floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: generation of fictive river flows based on the historical records of the river gauge network, and generation of fictive rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps concern river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (the downstream Argens).

  6. Storm Water Management Model Reference Manual Volume I, Hydrology

    EPA Science Inventory

    SWMM is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. The runoff component of SWMM operates on a collection of subcatchment areas that receive precipitation and gene...

  7. Estimating winter wheat phenological parameters: Implications for crop modeling

    USDA-ARS?s Scientific Manuscript database

    Crop parameters, such as the timing of developmental events, are critical for accurate simulation results in crop simulation models, yet uncertainty often exists in determining the parameters. Factors contributing to the uncertainty include: a) sources of variation within a plant (i.e., within diffe...

  8. Storm Water Management Model Reference Manual Volume II – Hydraulics

    EPA Science Inventory

    SWMM is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. The runoff component of SWMM operates on a collection of subcatchment areas that receive precipitation and gene...

  9. PSPs and ERPs: applying the dynamics of post-synaptic potentials to individual units in simulation of temporally extended Event-Related Potential reading data.

    PubMed

    Laszlo, Sarah; Armstrong, Blair C

    2014-05-01

    The Parallel Distributed Processing (PDP) framework is built on neural-style computation, and is thus well-suited for simulating the neural implementation of cognition. However, relatively little cognitive modeling work has concerned neural measures, instead focusing on behavior. Here, we extend a PDP model of reading-related components in the Event-Related Potential (ERP) to simulation of the N400 repetition effect. We accomplish this by incorporating the dynamics of cortical post-synaptic potentials--the source of the ERP signal--into the model. Simulations demonstrate that application of these dynamics is critical for model elicitation of repetition effects in the time and frequency domains. We conclude that by advancing a neurocomputational understanding of repetition effects, we are able to posit an interpretation of their source that is both explicitly specified and mechanistically different from the well-accepted cognitive one.

  10. Sensitivity of the Atmospheric Response to Warm Pool El Nino Events to Modeled SSTs and Future Climate Forcings

    NASA Technical Reports Server (NTRS)

    Hurwitz, Margaret M.; Garfinkel, Chaim I.; Newman, Paul A.; Oman, Luke D.

    2013-01-01

    Warm pool El Nino (WPEN) events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. Under present-day climate conditions, WPEN events generate poleward propagating wavetrains and enhance midlatitude planetary wave activity, weakening the stratospheric polar vortices. The late 21st century extratropical atmospheric response to WPEN events is investigated using the Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM), version 2. GEOSCCM simulations are forced by projected late 21st century concentrations of greenhouse gases (GHGs) and ozone-depleting substances (ODSs) and by SSTs and sea ice concentrations from an existing ocean-atmosphere simulation. Despite known ocean-atmosphere model biases, the prescribed SST fields represent a best estimate of the structure of late 21st century WPEN events. The future Arctic vortex response is qualitatively similar to that observed in recent decades but is weaker in late winter. This response reflects the weaker SST forcing in the Nino 3.4 region and subsequently weaker Northern Hemisphere tropospheric teleconnections. The Antarctic stratosphere does not respond to WPEN events in a future climate, reflecting a change in tropospheric teleconnections: The meridional wavetrain weakens while a more zonal wavetrain originates near Australia. Sensitivity simulations show that a strong poleward wavetrain response to WPEN requires a strengthening and southeastward extension of the South Pacific Convergence Zone; this feature is not captured by the late 21st century modeled SSTs. Expected future increases in GHGs and decreases in ODSs do not affect the polar stratospheric responses to WPEN.

  11. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention upon ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures for a supplied set of initiating events, synonymously termed extreme events. The implemented simulation method is currently confined to simulating the steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning that at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper proposes to demonstrate a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.

  12. Evolution of Precipitation Structure During the November DYNAMO MJO Event: Cloud-Resolving Model Intercomparison and Cross Validation Using Radar Observations

    NASA Astrophysics Data System (ADS)

    Li, Xiaowen; Janiga, Matthew A.; Wang, Shuguang; Tao, Wei-Kuo; Rowe, Angela; Xu, Weixin; Liu, Chuntao; Matsui, Toshihisa; Zhang, Chidong

    2018-04-01

    The evolution of precipitation structures is simulated and compared with radar observations for the November Madden-Julian Oscillation (MJO) event during the DYNAmics of the MJO (DYNAMO) field campaign. Three ground-based, ship-borne, and spaceborne precipitation radars and three cloud-resolving models (CRMs) driven by observed large-scale forcing are used to study precipitation structures at different locations over the central equatorial Indian Ocean. Convective strength is represented by 0-dBZ echo-top heights, and convective organization by contiguous 17-dBZ areas. The multi-radar and multi-model framework allows for more stringent model validations. The emphasis is on testing the models' ability to simulate subtle differences observed at different radar sites when the MJO event passed through. The results show that CRMs forced by site-specific large-scale forcing can reproduce not only common features in cloud populations but also subtle variations observed by different radars. The comparisons also revealed common deficiencies in CRM simulations where they underestimate radar echo-top heights for the strongest convection within large, organized precipitation features. Cross validations with multiple radars and models also enable quantitative comparisons in CRM sensitivity studies using different large-scale forcing, microphysical schemes and parameters, resolutions, and domain sizes. In terms of radar echo-top height temporal variations, many model sensitivity tests have better correlations than radar/model comparisons, indicating robustness in model performance on this aspect. It is further shown that well-validated model simulations could be used to constrain uncertainties in observed echo-top heights when the low-resolution surveillance scanning strategy is used.

  13. Diagnosing causes of extreme aerosol optical depth events

    NASA Astrophysics Data System (ADS)

    Bernstein, D. N.; Sullivan, R.; Crippa, P.; Thota, A.; Pryor, S. C.

    2017-12-01

    Aerosol burdens and optical properties exhibit substantial spatiotemporal variability, and simulation of current and possible future aerosol burdens and characteristics exhibits relatively high uncertainty due to uncertainties in emission estimates and in chemical and physical processes associated with aerosol formation, dynamics and removal. We report research designed to improve understanding of the causes and characteristics of extreme aerosol optical depth (AOD) at the regional scale, and diagnose and attribute model skill in simulating these events. Extreme AOD events over the US Midwest are selected by identifying all dates on which AOD in a MERRA-2 reanalysis grid cell exceeds the local seasonally computed 90th percentile (p90) value during 2004-2016 and then finding the dates on which the highest number of grid cells exceed their local p90. MODIS AOD data are subsequently used to exclude events dominated by wildfires. MERRA-2 data are also analyzed within a synoptic classification to determine in what ways the extreme AOD events are atypical and to identify possible meteorological `finger-prints' that can be detected in regional climate model simulations of future climate states to project possible changes in the occurrence of extreme AOD. Then WRF-Chem v3.6 is applied at 12-km resolution and regridded to the MERRA-2 resolution over eastern North America to quantify model performance, and also evaluated using in situ measurements of columnar AOD (AERONET) and near-surface PM2.5 (US EPA). Finally the sensitivity to (i) spin-up time (including procedure used to spin-up the chemistry), (ii) modal versus sectional aerosol schemes, (iii) meteorological nudging, (iv) chemistry initial and boundary conditions, and (v) anthropogenic emissions is quantified. Despite recent declines in mean AOD, supraregional (> 1000 km) extreme AOD events continue to occur. During these events AOD exceeds 0.6 in many Midwestern grid cells for multiple consecutive days. In all seasons WRF-Chem exhibits some skill in reproducing the intensity of these events, but not the precise location of the AOD maximum. Model skill is generally (but not uniformly) highest for simulations employing MOZART LBC/IBC, modal aerosol description, meteorological nudging and a 3 day spin-up, with little or no sensitivity to longer spin up times.
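    The event-selection step described above, computing each grid cell's local seasonal 90th-percentile AOD, flagging exceedances, and ranking dates by how many cells exceed their local p90, can be sketched as follows. The synthetic AOD field and the crude two-season split are placeholders for the MERRA-2 reanalysis and its calendar.
    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n_days, ny, nx = 4 * 365, 10, 12
    aod = rng.gamma(shape=2.0, scale=0.08, size=(n_days, ny, nx))   # synthetic AOD
    day_of_year = np.arange(n_days) % 365
    season = (day_of_year // 182).clip(0, 1)        # crude 2-season split (assumed)

    exceed = np.zeros_like(aod, dtype=bool)
    for s in (0, 1):
        idx = season == s
        p90 = np.percentile(aod[idx], 90, axis=0)   # local seasonal p90 per grid cell
        exceed[idx] = aod[idx] > p90                # broadcast over the spatial grid

    cells_over_p90 = exceed.sum(axis=(1, 2))        # per-date count of exceeding cells
    top_dates = np.argsort(cells_over_p90)[::-1][:5]
    for d in top_dates:
        print(f"day {d:4d}: {cells_over_p90[d]:3d} of {ny*nx} cells above local p90")
    ```
    In the actual workflow, the top-ranked dates would then be screened against MODIS AOD to remove wildfire-dominated cases before running the WRF-Chem sensitivity experiments listed above.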

  14. WRF-Chem Model Simulations of Arizona Dust Storms

    NASA Astrophysics Data System (ADS)

    Mohebbi, A.; Chang, H. I.; Hondula, D.

    2017-12-01

    The online Weather Research and Forecasting model with coupled chemistry module (WRF-Chem) is applied to simulate the transport, deposition and emission of dust aerosols in an intense dust outbreak event that took place on July 5th, 2011 over Arizona. The Goddard Chemistry Aerosol Radiation and Transport (GOCART), Air Force Weather Agency (AFWA), and University of Cologne (UoC) parameterization schemes for dust emission were evaluated. The model was found to simulate well the synoptic meteorological conditions, which are also widely documented in previous studies. The chemistry module performance in reproducing the atmospheric desert dust load was evaluated using the horizontal Aerosol Optical Depth (AOD) field from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Terra/Aqua satellites, employing the standard Dark Target (DT) and Deep Blue (DB) algorithms, and from the Aerosol Robotic Network (AERONET). To assess the temporal variability of the dust storm, Particulate Matter mass concentration data (PM10 and PM2.5) from Arizona Department of Environmental Quality (AZDEQ) ground-based air quality stations were used. The promising performance of WRF-Chem indicates that the model is capable of simulating the correct timing and loading of a dust event in the planetary boundary layer (PBL), which can be used to forecast approaching severe dust events and to communicate effective early warnings.

  15. GRMHD Simulations of Visibility Amplitude Variability for Event Horizon Telescope Images of Sgr A*

    NASA Astrophysics Data System (ADS)

    Medeiros, Lia; Chan, Chi-kwan; Özel, Feryal; Psaltis, Dimitrios; Kim, Junhan; Marrone, Daniel P.; Sądowski, Aleksander

    2018-04-01

    The Event Horizon Telescope will generate horizon scale images of the black hole in the center of the Milky Way, Sgr A*. Image reconstruction using interferometric visibilities rests on the assumption of a stationary image. We explore the limitations of this assumption using high-cadence disk- and jet-dominated GRMHD simulations of Sgr A*. We also employ analytic models that capture the basic characteristics of the images to understand the origin of the variability in the simulated visibility amplitudes. We find that, in all simulations, the visibility amplitudes for baselines oriented parallel and perpendicular to the spin axis of the black hole follow general trends that do not depend strongly on accretion-flow properties. This suggests that fitting Event Horizon Telescope observations with simple geometric models may lead to a reasonably accurate determination of the orientation of the black hole on the plane of the sky. However, in the disk-dominated models, the locations and depths of the minima in the visibility amplitudes are highly variable and are not related simply to the size of the black hole shadow. This suggests that using time-independent models to infer additional black hole parameters, such as the shadow size or the spin magnitude, will be severely affected by the variability of the accretion flow.
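
    Under standard van Cittert-Zernike assumptions, the visibility sampled by an interferometer baseline is the two-dimensional Fourier transform of the sky image, so amplitude cuts along baselines parallel and perpendicular to a chosen axis can be read off an FFT of a model image. The sketch below illustrates only that relation; it is not the authors' GRMHD imaging pipeline, and the image, pixel scale and orientation are hypothetical.

      # Illustrative only: visibility amplitudes of a model image along two
      # orthogonal baseline orientations via the 2-D FFT.
      import numpy as np

      def visibility_amplitudes(image, pixel_rad):
          """image: (N, N) brightness map; pixel_rad: pixel size in radians.
          Returns baseline lengths (wavelengths) and |V| along the two image axes."""
          n = image.shape[0]
          vis = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(image)))
          vis /= np.abs(vis).max()                              # normalize to zero-baseline flux
          uv = np.fft.fftshift(np.fft.fftfreq(n, d=pixel_rad))  # spatial frequencies (wavelengths)
          center = n // 2
          amp_parallel = np.abs(vis[center, :])                 # cut along one axis
          amp_perpendicular = np.abs(vis[:, center])            # cut along the orthogonal axis
          return uv, amp_parallel, amp_perpendicular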

  16. Validation of a Simulation Process for Assessing the Response of a Vehicle and Its Occupants to an Explosive Threat

    DTIC Science & Technology

    2010-01-01

    gross vehicle response; and the effects of blast mitigation material, restraint system, and seat design to the loads developed on the members of an...occupant. A Blast Event Simulation sysTem (BEST) has been developed for facilitating the easy use of the LS-DYNA solvers for conducting a...et al, 1999] for modeling blast events. In this paper the Eulerian solver of LS-DYNA is employed for simulating the soil – explosive – air

  17. Synthetic drought event sets: thousands of meteorological drought events for risk-based management under present and future conditions

    NASA Astrophysics Data System (ADS)

    Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.

    2016-04-01

    Droughts and related water scarcity can have large impacts on societies and consist of interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to impact these and thereby the frequency and intensity of the events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled at 25km over Europe, are being fed into hydrological models of various complexity and used for the estimation of drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (10'000s) of Global Climate Model (GCM) simulations, downscaled at 25km over Europe by a nested Regional Climate Model (RCM). Simulations include the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of sequences of spatio-temporally consistent weather, which are consistent with the boundary forcing such as the ocean, greenhouse gases and solar forcing. The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its meteorological drivers, and how it can be expected to change in the future. Finally, we assess the applicability of this methodology to other regions. [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.
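
    The construction of an event set by extracting sequences of dry conditions can be illustrated with a simple run-length scan over a precipitation series. The rolling window, threshold and minimum duration below are hypothetical placeholders, not the MaRIUS drought definitions.

      # Illustrative sketch: flag periods when a standardized rolling precipitation
      # anomaly stays below a threshold for at least a minimum duration.
      import numpy as np

      def extract_dry_events(precip, window=90, threshold=-1.0, min_length=60):
          """precip: (time,) daily precipitation. Returns (start, end) index pairs."""
          rolling = np.convolve(precip, np.ones(window) / window, mode="same")
          anomaly = (rolling - rolling.mean()) / rolling.std()   # standardized deficit
          dry = anomaly < threshold
          events, start = [], None
          for i, flag in enumerate(dry):
              if flag and start is None:
                  start = i
              elif not flag and start is not None:
                  if i - start >= min_length:
                      events.append((start, i))
                  start = None
          if start is not None and len(dry) - start >= min_length:
              events.append((start, len(dry)))
          return events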

  18. Simulation of three lahars in the Mount St Helens area, Washington using a one-dimensional, unsteady-state streamflow model

    USGS Publications Warehouse

    Laenen, Antonius; Hansen, R.P.

    1988-01-01

    A one-dimensional, unsteady-state, open-channel model was used to analytically reproduce three lahar events. Factors contributing to the success of the modeling were: (1) the lahars were confined to a channel, (2) channel roughness was defined by field information, and (3) the volume of the flow remained relatively unchanged for the duration of the peak. Manning's 'n' values used in computing conveyance in the model were subject to the changing rheology of the debris flow and were calculated from field cross-section information (velocities used in these calculations were derived from super-elevation or run-up formulas). For the events modeled in this exercise, Manning's 'n' calculations ranged from 0.020 to 0.099. In all lahar simulations, the rheology of the flow changed in a downstream direction during the course of the event. Chen's 'U', the mudflow consistency index, changed approximately an order of magnitude for each event. The 'U' values ranged from 5-2,260 kg/m for the three events modeled. The empirical approach adopted in this paper is useful as a tool to help predict debris-flow behavior, but does not lead to understanding the physical processes of debris flows. (Author's abstract)
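
    The velocity and roughness back-calculations mentioned above rest on two standard open-channel relations: a forced-vortex superelevation (run-up) estimate of the mean velocity, and Manning's equation solved for 'n'. The sketch below shows those generic relations with made-up example numbers; the report's exact formulations and field values are not reproduced.

      # Generic back-calculation relations of the kind referred to in the abstract.
      import math

      def velocity_from_superelevation(delta_h, channel_width, bend_radius, g=9.81):
          """Mean velocity (m/s) from superelevation delta_h (m) in a channel bend
          of width channel_width (m) and centerline radius bend_radius (m)."""
          return math.sqrt(g * bend_radius * delta_h / channel_width)

      def manning_n_from_velocity(velocity, hydraulic_radius, slope):
          """Back-calculated Manning's 'n' (SI) from velocity (m/s), hydraulic
          radius (m) and energy slope (dimensionless)."""
          return hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope) / velocity

      # Hypothetical example: 2 m of superelevation in a 30 m wide bend of 100 m radius
      v = velocity_from_superelevation(2.0, 30.0, 100.0)                 # ~8.1 m/s
      n = manning_n_from_velocity(v, hydraulic_radius=3.0, slope=0.02)   # ~0.036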

  19. Can dynamically downscaled climate model outputs improve projections of extreme precipitation events?

    EPA Science Inventory

    Many of the storms that generate damaging floods are caused by locally intense, sub-daily precipitation, yet the spatial and temporal resolution of the most widely available climate model outputs are both too coarse to simulate these events. Thus there is often a disconnect betwe...

  20. Sensitivity of summer ensembles of fledgling superparameterized U.S. mesoscale convective systems to cloud resolving model microphysics and grid configuration

    DOE PAGES

    Elliott, Elizabeth J.; Yu, Sungduk; Kooperman, Gabriel J.; ...

    2016-05-01

    The sensitivities of simulated mesoscale convective systems (MCSs) in the central U.S. to microphysics and grid configuration are evaluated here in a global climate model (GCM) that also permits global-scale feedbacks and variability. Since conventional GCMs do not simulate MCSs, studying their sensitivities in a global framework useful for climate change simulations has not previously been possible. To date, MCS sensitivity experiments have relied on controlled cloud resolving model (CRM) studies with limited domains, which avoid internal variability and neglect feedbacks between local convection and larger-scale dynamics. However, recent work with superparameterized (SP) GCMs has shown that eastward propagating MCS-like events are captured when embedded CRMs replace convective parameterizations. This study uses a SP version of the Community Atmosphere Model version 5 (SP-CAM5) to evaluate MCS sensitivities, applying an objective empirical orthogonal function algorithm to identify MCS-like events, and harmonizing composite storms to account for seasonal and spatial heterogeneity. A five-summer control simulation is used to assess the magnitude of internal and interannual variability relative to 10 sensitivity experiments with varied CRM parameters, including ice fall speed, one-moment and two-moment microphysics, and grid spacing. MCS sensitivities were found to be subtle with respect to internal variability, and indicate that ensembles of over 100 storms may be necessary to detect robust differences in SP-GCMs. Furthermore, these results emphasize that the properties of MCSs can vary widely across individual events, and improving their representation in global simulations with significant internal variability may require comparison to long (multidecadal) time series of observed events rather than single season field campaigns.

  1. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software

    PubMed Central

    Dols, W. Stuart; Persily, Andrew K.; Morrow, Jayne B.; Matzke, Brett D.; Sego, Landon H.; Nuffer, Lisa L.; Pulsipher, Brent A.

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, have simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority area zones within the building and then sampling designs and strategies could be developed based on those zones. PMID:27134782

  3. Evolution and propagation of the July 23, 2012, CME-driven shock: A 3-D MHD simulation result

    NASA Astrophysics Data System (ADS)

    Wu, S. T.; Dryer, M.; Liou, K.; Wu, C. C.

    2016-12-01

    The interplanetary shock associated with the July 23, 2012 CME event is studied with the H3DMHD 3-D magnetohydrodynamic (MHD) simulation model. This backside CME event has been actively studied, probably due to its extremely fast propagation speed (~2000 km/s) and large magnetic field magnitude (~100 nT) at 1 AU. Some workers have even compared this event with the Carrington event. In this study we focus on the acceleration and deceleration of the shock at the cobpoints. The H3DMHD is a data (photospheric magnetic field) driven model, which combines the HAF kinematic model for regions sunward of 18 Rs and the 3DMHD ideal MHD model for regions antisunward of 18 Rs out to 1.5 AU. To simulate the CME a Gaussian velocity pulse is manually applied to the inner simulation boundary at 2.5 Rs above the flare site, with the initial peak velocity (~3000 km/s) taken from coronagraph measurements. In situ measurements of the solar wind parameters at STEREO-A are used to validate the simulation result, in particular the arrival time of the shock at STEREO-A. It is found that, for this particular event, the CME-driven shock strength varies significantly across the shock surface. In general, the shock strength slowly weakened while propagating outward but stayed hypersonic (> Mach 5) in a cone-shaped region a few tens of degrees wide surrounding the shock nose. We will discuss our result in the context of the acceleration/deceleration of the shock in a much slower background solar wind and the relationship of the shock strength with the flux of solar energetic particles observed by STEREO-A.

  4. Nowcasting, forecasting and hindcasting Harvey and Irma inundation in near-real time using a continental 2D hydrodynamic model

    NASA Astrophysics Data System (ADS)

    Sampson, C. C.; Wing, O.; Quinn, N.; Smith, A.; Neal, J. C.; Schumann, G.; Bates, P.

    2017-12-01

    During an ongoing natural disaster data are required on: (1) the current situation (nowcast); (2) its likely immediate evolution (forecast); and (3) a consistent view post-event of what actually happened (hindcast or reanalysis). We describe methods used to achieve all three tasks for flood inundation during the Harvey and Irma events using a continental scale 2D hydrodynamic model (Wing et al., 2017). The model solves the local inertial form of the Shallow Water equations over a regular grid of 1 arcsecond (~30 m). Terrain data are taken from the USGS National Elevation Dataset with known flood defences represented using the U.S. Army Corps of Engineers National Levee Dataset. Channels are treated as sub-grid scale features using the HydroSHEDS global hydrography data set. The model is driven using river flows, rainfall and coastal water levels. It simulates river flooding in basins > 50 km2, and fluvial and coastal flooding everywhere. Previous wide-area validation tests show this model to be capable of matching FEMA maps and USGS local models built with bespoke data with hit rates of 86% and 92% respectively (Wing et al., 2017). Boundary conditions were taken from NOAA QPS data to produce nowcast and forecast simulations in near real time, before updating with NOAA observations to produce the hindcast. During the event, simulation results were supplied to major insurers and multi-nationals who used them to estimate their likely capital exposure and to mitigate flood damage to their infrastructure whilst the event was underway. Simulations were validated against modelled flood footprints computed by FEMA and USACE, and composite satellite imagery produced by the Dartmouth Flood Observatory. For the Harvey event, hit rates ranged from 60-84% against these data sources, but a lack of metadata meant it was difficult to perform like-for-like comparisons. The satellite data also appeared to miss known flooding in urban areas that was picked up in the models. Despite these limitations, the validation was able to pick out areas, notably along the Colorado River near Houston, where our model under-performed, and to identify areas for future development. The study shows that high resolution near real-time inundation predictions over very large areas during complex events with multiple flood drivers are now a reality.
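
    The local inertial form of the shallow water equations referred to above can be advanced with an explicit update and semi-implicit Manning friction, in the spirit of the Bates et al. (2010) formulation. The one-dimensional sketch below is illustrative only, with hypothetical grid, roughness and boundary choices; it is not the Wing et al. (2017) continental model code.

      # 1-D local inertial update with semi-implicit friction (illustrative).
      import numpy as np

      def local_inertial_step(h, z, q, dx, dt, n_mann=0.03, g=9.81, h_min=1e-3):
          """h: cell water depths; z: bed elevations; q: unit discharge at the
          len(h)-1 interior faces. Returns updated (h, q) after one time step."""
          eta = h + z                                            # water surface elevation
          h_flow = np.maximum(np.maximum(eta[1:], eta[:-1]) - np.maximum(z[1:], z[:-1]), 0.0)
          slope = (eta[1:] - eta[:-1]) / dx
          num = q - g * h_flow * dt * slope
          den = 1.0 + g * dt * n_mann ** 2 * np.abs(q) / np.maximum(h_flow, h_min) ** (7.0 / 3.0)
          q_new = np.where(h_flow > h_min, num / den, 0.0)
          flux = np.concatenate(([0.0], q_new, [0.0]))           # closed domain boundaries
          h_new = h - dt * np.diff(flux) / dx                    # mass conservation
          return h_new, q_new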

  5. Linkages between observed, modeled Saharan dust loading and meningitis in Senegal during 2012 and 2013

    NASA Astrophysics Data System (ADS)

    Diokhane, Aminata Mbow; Jenkins, Gregory S.; Manga, Noel; Drame, Mamadou S.; Mbodji, Boubacar

    2016-04-01

    The Sahara desert transports large quantities of dust over the Sahelian region during the Northern Hemisphere winter and spring seasons (December-April). In episodic events, high dust concentrations are found at the surface, negatively impacting respiratory health. Bacterial meningitis in particular is known to affect populations that live in the Sahelian zone, which is otherwise known as the meningitis belt. During the winter and spring of 2012, suspected meningitis cases (SMCs) were three times higher than in 2013. We show higher surface particulate matter concentrations at Dakar, Senegal and elevated atmospheric dust loading in Senegal for the period of 1 January-31 May during 2012 relative to 2013. We analyze simulated particulate matter over Senegal from the Weather Research and Forecasting (WRF) model during 2012 and 2013. The results show higher simulated dust concentrations during the winter season of 2012 for Senegal. The WRF model correctly captures the large dust events from 1 January-31 March but has shown less skill during April and May for simulated dust concentrations. The results also show that the boundary conditions are the key feature for correctly simulating large dust events and initial conditions are less important.

  6. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    NASA Astrophysics Data System (ADS)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremors around subduction zones have become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low frequency seismic waves overlapping each other. This finding provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. Slow stick slip events are traditionally studied with direct shear or double direct shear experiment setups, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gauge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented in this study. This model consists of a cylindrical sample subjected to triaxial stress. Similar to the previous model, a weak gauge layer at 45 degrees is added into the sample, on which shear slipping is allowed. Several different simulations are conducted on this sample. While the confining stress is maintained at the same level in different simulations, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviour, from stick slip to slow stick slip, is observed based on the stress-strain relationship. Currently, the stick slip and slow stick slip events are identified strictly from the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gauge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle flow code, a discrete element method based numerical simulation code developed by Itasca Inc.

  7. Three occurred debris flows in North-Eastern Italian Alps: documentation and modeling

    NASA Astrophysics Data System (ADS)

    Boreggio, Mauro; Gregoretti, Carlo; Degetto, Massimo; Bernard, Martino

    2015-04-01

    Three debris-flow events are documented and modeled by back-analysis: those that occurred at Rio Lazer on 4 November 1966, at Fiames on 5 July 2006, and at Rovina di Cancia on 18 July 2009. All three sites are located in the North-Eastern Italian Alps. In all the events, runoff entrained sediment present in natural channels and formed a solid-liquid wave that routed downstream. The first event concerns the routing of a debris flow on an inhabited fan. Maps of the sediment deposition pattern were built using post-event photos through stereoscopy techniques. The second event concerns the routing of a debris flow along the main channel descending from Pomagagnon Fork. Due to the obstruction of the cross-section, the debris flow deviated from the original path on the left side and routed downstream by cutting a new channel on the fan; it dispersed along multiple paths when it met the wooded area. Maps of erosion and deposition depths were built using a combination of lidar and GPS data. The third event concerns the routing of a debris flow in the Rovina di Cancia channel, which filled the reservoir built at the end of the channel and locally overtopped the retaining wall on the left side. A wave of mud and debris inundated the area downstream of the overtopping point. Maps of erosion and deposition depths were obtained by subtracting two GPS surveys, pre- and post-event. All three debris flows are simulated by modeling the runoff that entrained sediment, in order to determine the solid-liquid hydrograph downstream of the triggering areas. The routing of the solid-liquid hydrograph was simulated by a bi-phase cell model based on the kinematic approach. The comparison between simulated and measured erosion and deposition depths is satisfactory. The same parameters for computing erosion and deposition were used for all three events.

  8. Parallel discrete-event simulation schemes with heterogeneous processing elements.

    PubMed

    Kim, Yup; Kwon, Ikhyun; Chae, Huiseung; Yook, Soon-Hyung

    2014-07-01

    To understand the effects of nonidentical processing elements (PEs) on parallel discrete-event simulation (PDES) schemes, two stochastic growth models, the restricted solid-on-solid (RSOS) model and the Family model, are investigated by simulations. The RSOS model is the model for the PDES scheme governed by the Kardar-Parisi-Zhang equation (KPZ scheme). The Family model is the model for the scheme governed by the Edwards-Wilkinson equation (EW scheme). Two kinds of distributions for nonidentical PEs are considered. In the first kind, the computing capacities of PEs are not much different, whereas in the second kind the capacities are extremely widespread. The KPZ scheme on complex networks shows synchronizability and scalability regardless of the kind of PEs. The EW scheme never shows synchronizability for a random configuration of PEs of the first kind. However, by regularizing the arrangement of PEs of the first kind, the EW scheme can be made to show synchronizability. In contrast, the EW scheme never shows synchronizability for any configuration of PEs of the second kind.
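
    The generic picture behind such PDES schemes is a "virtual-time horizon": each PE carries a local virtual time and, under a conservative rule, advances it only when it does not lead its neighbours. The toy sketch below illustrates that rule with heterogeneous attempt rates standing in for nonidentical PEs; it is not the paper's RSOS or Family model implementation.

      # Toy conservative PDES: a PE advances its local virtual time (LVT) only when
      # it does not lead its ring neighbours; slower PEs attempt updates less often.
      import random

      def simulate_virtual_time_horizon(capacities, sweeps, seed=0):
          """capacities: relative PE speeds in (0, 1] on a 1-D ring. Returns LVTs."""
          random.seed(seed)
          n = len(capacities)
          lvt = [0.0] * n
          for _ in range(sweeps):
              for i in range(n):
                  if random.random() > capacities[i]:            # slower PE attempts less often
                      continue
                  left, right = lvt[(i - 1) % n], lvt[(i + 1) % n]
                  if lvt[i] <= left and lvt[i] <= right:         # conservative update rule
                      lvt[i] += random.expovariate(1.0)          # unit-mean virtual-time step
          return lvt

      # The spread (width) of the LVT surface measures desynchronization among PEs.
      times = simulate_virtual_time_horizon([1.0 if i % 2 else 0.5 for i in range(64)], sweeps=1000)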

  9. Extended Magnetohydrodynamics with Embedded Particle-in-Cell Simulation of Ganymede's Magnetosphere

    NASA Technical Reports Server (NTRS)

    Toth, Gabor; Jia, Xianzhe; Markidis, Stefano; Peng, Ivy Bo; Chen, Yuxi; Daldorff, Lars K. S.; Tenishev, Valeriy M.; Borovikov, Dmitry; Haiducek, John D.; Gombosi, Tamas I.

    2016-01-01

    We have recently developed a new modeling capability to embed the implicit particle-in-cell (PIC) model iPIC3D into the Block-Adaptive-Tree-Solarwind-Roe-Upwind-Scheme magnetohydrodynamic (MHD) model. The MHD with embedded PIC domains (MHD-EPIC) algorithm is a two-way coupled kinetic-fluid model. As one of the very first applications of the MHD-EPIC algorithm, we simulate the interaction between Jupiter's magnetospheric plasma and Ganymede's magnetosphere. We compare the MHD-EPIC simulations with pure Hall MHD simulations and compare both model results with Galileo observations to assess the importance of kinetic effects in controlling the configuration and dynamics of Ganymede's magnetosphere. We find that the Hall MHD and MHD-EPIC solutions are qualitatively similar, but there are significant quantitative differences. In particular, the density and pressure inside the magnetosphere show different distributions. For our baseline grid resolution the PIC solution is more dynamic than the Hall MHD simulation and it compares significantly better with the Galileo magnetic measurements than the Hall MHD solution. The power spectra of the observed and simulated magnetic field fluctuations agree extremely well for the MHD-EPIC model. The MHD-EPIC simulation also produced a few flux transfer events (FTEs) that have magnetic signatures very similar to an observed event. The simulation shows that the FTEs often exhibit complex 3-D structures with their orientations changing substantially between the equatorial plane and the Galileo trajectory, which explains the magnetic signatures observed during the magnetopause crossings. The computational cost of the MHD-EPIC simulation was only about 4 times more than that of the Hall MHD simulation.

  10. Discrete event simulation model of sudden cardiac death predicts high impact of preventive interventions.

    PubMed

    Andreev, Victor P; Head, Trajen; Johnson, Neil; Deo, Sapna K; Daunert, Sylvia; Goldschmidt-Clermont, Pascal J

    2013-01-01

    Sudden Cardiac Death (SCD) is responsible for at least 180,000 deaths a year and incurs an average cost of $286 billion annually in the United States alone. Herein, we present a novel discrete event simulation model of SCD, which quantifies the chains of events associated with the formation, growth, and rupture of atheroma plaques, and the subsequent formation of clots, thrombosis and onset of arrhythmias within a population. The predictions generated by the model are in good agreement both with results obtained from pathological examinations on the frequencies of three major types of atheroma, and with epidemiological data on the prevalence and risk of SCD. These model predictions allow for identification of interventions and, importantly, for the optimal time of intervention leading to high potential impact on SCD risk reduction (up to 8-fold reduction in the number of SCDs in the population) as well as the increase in life expectancy.
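
    A discrete event simulation of this kind is typically organised around a priority queue of timestamped events. The skeleton below shows only that mechanism, with hypothetical states and annual transition rates; it is not the published SCD model.

      # Generic event-queue skeleton for an individual's disease-progression chain.
      import heapq
      import random

      def simulate_individual(rates, horizon_years=30.0, seed=None):
          """rates: dict state -> (next_state, annual transition rate)."""
          rng = random.Random(seed)
          history, queue = [], []
          state = "healthy"
          heapq.heappush(queue, (rng.expovariate(rates[state][1]), rates[state][0]))
          while queue:
              t, state = heapq.heappop(queue)
              if t > horizon_years:
                  break
              history.append((t, state))                         # record the event
              if state == "sudden_cardiac_death" or state not in rates:
                  break
              nxt, rate = rates[state]
              heapq.heappush(queue, (t + rng.expovariate(rate), nxt))  # schedule next event
          return history

      rates = {"healthy": ("plaque_formation", 0.05),
               "plaque_formation": ("plaque_rupture", 0.02),
               "plaque_rupture": ("sudden_cardiac_death", 0.5)}
      print(simulate_individual(rates, seed=1))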

  11. Changes in the frequency of extreme air pollution events over the Eastern United States and Europe

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Fiore, A. M.; Fang, Y.; Staehelin, J.

    2011-12-01

    Over the past few decades, thresholds for national air quality standards, intended to protect public health and welfare, have been lowered repeatedly. At the same time observations over Europe and the Eastern U.S. demonstrate that extreme air pollution events (high O3 and PM2.5) are typically associated with stagnation events. Recent work showed that in a changing climate high air pollution events are likely to increase in frequency and duration. Within this work we examine meteorological and surface ozone observations from CASTNet over the U.S. and EMEP over Europe and "idealized" simulations with the GFDL AM3 chemistry-climate model, which isolate the role of climate change on air quality. Specifically, we examine an "idealized 1990s" simulation, forced with 20-year mean monthly climatologies for sea surface temperatures and sea ice from observations for 1981-2000, and an "idealized 2090s" simulation forced by the observed climatologies plus the multi-model mean changes in sea surface temperature and sea ice simulated by 19 IPCC AR-4 models under the A1B scenario for 2081-2100. With innovative statistical tools (empirical orthogonal functions (EOFs) and extreme value theory (EVT)), we analyze the frequency distribution of past, present and future extreme air pollution events over the Eastern United States and Europe. The upper tail of observed values at individual stations (e.g., within CASTNet), i.e., the extremes (maximum daily 8-hour average (MDA8) O3 > 60 ppb), is poorly described by a Gaussian distribution. However, further analysis showed that Peak-Over-Threshold models better capture the extremes and allow us to estimate return levels of pollution events above certain threshold values of interest. We next apply EOF analysis to identify regions that vary coherently within the ground-based monitoring networks. Over the United States, the first EOF obtained from the model in both the 1990s and 2090s idealized simulations identifies the Northeast as a region that varies coherently. Correlation analysis reveals that this EOF pattern is most strongly expressed in association with high surface temperature and high surface pressure conditions, consistent with previous work showing that observed O3 episodes over this area reflect the combined impacts of stagnation and increased chemical production. Next steps include the extension of this analysis applying EVT tools to the principal component time series associated with this EOF. The combination of EOF and EVT tools applied to the GFDL AM3 1990s vs. 2090s idealized simulations will enable us to quantify changes in the return levels of air pollution extremes. Therefore, the combination of observational data and numerical and statistical models should allow us to identify key driving forces behind high air pollution events and to estimate changes in the frequency of such events under different climate change scenarios.
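
    The Peak-Over-Threshold approach mentioned above amounts to fitting a generalized Pareto distribution to exceedances over a threshold and converting the fit into return levels. A minimal sketch, assuming a daily MDA8 O3 series and a 60 ppb threshold, is given below; it is illustrative only and is not the GFDL AM3 analysis.

      # Fit a generalized Pareto distribution to threshold exceedances and compute
      # a return level (illustrative; series and threshold are assumptions).
      import numpy as np
      from scipy.stats import genpareto

      def pot_return_level(series, threshold, return_period_years, obs_per_year=365):
          exceedances = series[series > threshold] - threshold
          zeta = len(exceedances) / len(series)                 # exceedance probability
          shape, _, scale = genpareto.fit(exceedances, floc=0.0)
          m = return_period_years * obs_per_year                # observations per return period
          if abs(shape) < 1e-6:                                 # exponential-tail limit
              return threshold + scale * np.log(m * zeta)
          return threshold + scale / shape * ((m * zeta) ** shape - 1.0)

      # e.g. the 10-year return level of MDA8 O3 above a 60 ppb threshold:
      # level = pot_return_level(mda8_o3, threshold=60.0, return_period_years=10)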

  12. Simulating statistics of lightning-induced and man-made fires

    NASA Astrophysics Data System (ADS)

    Krenn, R.; Hergarten, S.

    2009-04-01

    The frequency-area distributions of forest fires show power-law behavior with scaling exponents α in a quite narrow range, relating wildfire research to the theoretical framework of self-organized criticality. Examples of self-organized critical behavior can be found in computer simulations of simple cellular automata. The established self-organized critical Drossel-Schwabl forest fire model (DS-FFM) is one of the most widespread models in this context. Despite its qualitative agreement with event-size statistics from nature, its applicability is still questioned. Apart from general concerns that the DS-FFM apparently oversimplifies the complex nature of forest dynamics, it significantly overestimates the frequency of large fires. We present a straightforward modification of the model rules that increases the scaling exponent α by approximately 1/3 and brings the simulated event-size statistics close to those observed in nature. In addition, combined simulations of both the original and the modified model predict a dependence of the overall distribution on the ratio of lightning-induced and man-made fires as well as a difference between their respective event-size statistics. The increase of the scaling exponent with decreasing lightning probability as well as the splitting of the partial distributions are confirmed by the analysis of the Canadian Large Fire Database. As a consequence, lightning-induced and man-made forest fires cannot be treated separately in wildfire modeling, hazard assessment and forest management.
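
    For readers unfamiliar with the DS-FFM, the original rules (tree growth with probability p, lightning with probability f, instantaneous burning of the struck cluster) can be written in a few lines. The sketch below implements only those original rules, not the modified rules proposed in the abstract, and the parameter values are illustrative.

      # Minimal Drossel-Schwabl forest-fire cellular automaton (original rules).
      import random

      def dsffm_fire_sizes(size=128, p=0.01, f=0.0001, steps=200000, seed=0):
          random.seed(seed)
          trees, fires = set(), []
          for _ in range(steps):
              cell = (random.randrange(size), random.randrange(size))
              if cell not in trees:
                  if random.random() < p:
                      trees.add(cell)                            # tree growth
              elif random.random() < f:                          # lightning strikes a tree
                  stack, burned = [cell], set()
                  while stack:                                   # burn the connected cluster
                      x, y = stack.pop()
                      if (x, y) in trees and (x, y) not in burned:
                          burned.add((x, y))
                          stack.extend([((x + 1) % size, y), ((x - 1) % size, y),
                                        (x, (y + 1) % size), (x, (y - 1) % size)])
                  trees -= burned
                  fires.append(len(burned))                      # record the event size
          return fires                                           # input to frequency-area statistics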

  13. Staffs' and managers' perceptions of how and when discrete event simulation modelling can be used as a decision support in quality improvement: a focus group discussion study at two hospital settings in Sweden.

    PubMed

    Hvitfeldt-Forsberg, Helena; Mazzocato, Pamela; Glaser, Daniel; Keller, Christina; Unbeck, Maria

    2017-06-06

    To explore healthcare staffs' and managers' perceptions of how and when discrete event simulation modelling can be used as a decision support in improvement efforts. Two focus group discussions were performed. Two settings were included: a rheumatology department and an orthopaedic section, both situated in Sweden. Healthcare staff and managers (n=13) from the two settings. Two workshops were performed, one at each setting. Workshops were initiated by a short introduction to simulation modelling. Results from the respective simulation model were then presented and discussed in the following focus group discussion. Categories from the content analysis are presented according to the research questions of how and when simulation modelling can assist healthcare improvement. Regarding how, the participants mentioned that simulation modelling could act as a tool for support and a way to visualise problems, potential solutions and their effects. Regarding when, simulation modelling could be used both locally and by management, as well as a pedagogical tool to develop and test innovative ideas and to involve everyone in the improvement work. Its potential as an information and communication tool and as an instrument for pedagogic work within healthcare improvement renders a broader application and value of simulation modelling than previously reported.

  14. Identifying the most hazardous synoptic meteorological conditions for Winter UK PM10 exceedances

    NASA Astrophysics Data System (ADS)

    Webber, Chris; Dacre, Helen; Collins, Bill; Masato, Giacomo

    2016-04-01

    Summary: We investigate the relationship between synoptic scale meteorological variability and local scale pollution concentrations within the UK. Synoptic conditions representative of atmospheric blocking are associated with significant increases in UK PM10 concentration ([PM10]), with the probability of exceeding harmful [PM10] limits also increased. Once these relationships had been diagnosed, the Met Office Unified Model (UM) was used to replicate them, using idealised source regions of PM10. This helped to determine the PM10 source regions most influential throughout UK PM10 exceedance events and to test whether the model was capable of capturing the relationships between UK PM10 and atmospheric blocking. Finally, a time slice simulation for 2050-2060 helped to answer the question of whether PM10 exceedance events are more likely to occur in a changing climate. Introduction: Atmospheric blocking events are well understood to lead to conditions conducive to pollution events within the UK. The literature shows that synoptic conditions able to deflect the Northwest Atlantic storm track from the UK often lead to the highest UK pollution concentrations. Rossby wave breaking (RWB) has been identified as a mechanism that results in atmospheric blocking, and its relationship with UK [PM10] is explored using metrics designed in Masato et al. (2013). Climate simulations with the Met Office UM enable these relationships between RWB and PM10 to be found within the model. Subsequently, the frequency of events that lead to hazardous PM10 concentrations ([PM10]) in a future climate can be determined within a climate simulation. An understanding of the impact meteorology has on UK [PM10] in a changing climate will help inform policy makers regarding the importance of limiting PM10 emissions and ensuring safe air quality in the future. Methodology and Results: Three blocking metrics were used to subset RWB into four categories. These RWB categories were all shown to increase UK [PM10] and to increase the probability of exceeding a UK [PM10] threshold when they occurred within constrained regions. Further analysis highlighted that Omega block events lead to the greatest probability of exceeding hazardous UK [PM10] limits. These events facilitated the advection of European PM10 while also providing stagnant conditions over the UK, facilitating PM10 accumulation. The Met Office UM was nudged to ERA-Interim reanalysis wind and temperature fields to replicate the relationships found using observed UK [PM10]. Inert tracers were implemented in the model to represent UK PM10 source regions throughout Europe. The modelled tracers were seen to correlate well with observed [PM10], and Figure 1 highlights the correlations between a RWB metric and observed (a) and modelled (b) [PM10]. A further free-running model simulation highlighted the deficiency of the Met Office UM in capturing RWB frequency, with a reduction over the Northwest Atlantic/European region. A final time slice simulation was undertaken for the period 2050-2060, using Representative Concentration Pathway 8.5, to determine the change in frequency of UK PM10 exceedance events, due to changing meteorology, in a future climate. Conclusions: RWB has been shown to increase UK [PM10] and to lead to greater probabilities of exceeding a harmful [PM10] threshold. Omega block events have been determined to be the most hazardous RWB subset, due to a combination of European advection and UK stagnation. Simulations with the Met Office UM replicated the relationships seen between observed UK [PM10] and RWB, using inert tracers. Finally, time slice simulations were undertaken to determine the change in frequency of UK [PM10] exceedance events in a changing climate. References: Masato, G., Hoskins, B. J., Woollings, T., 2013: Wave-breaking Characteristics of Northern Hemisphere Winter Blocking: A Two-Dimensional Approach. J. Climate, 26, 4535-4549.

  15. Evaluation and comparison of different RCMs simulations of the Mediterranean climate: a view on the impact of model resolution and Mediterranean sea coupling.

    NASA Astrophysics Data System (ADS)

    Panthou, Gérémy; Vrac, Mathieu; Drobinski, Philippe; Bastin, Sophie; Somot, Samuel; Li, Laurent

    2015-04-01

    As regularly stated by numerous authors, the Mediterranean climate is considered a major climate 'hot spot'. At least three reasons may explain this statement. First, this region is known for being regularly affected by extreme hydro-meteorological events (heavy precipitation and flash-floods during the autumn season; droughts and heat waves during spring and summer). Second, the vulnerability of populations to these extreme events is expected to increase during the XXIst century (at least due to the projected population growth in this region). Finally, global circulation models project that this regional climate will be highly sensitive to climate change. Moreover, global warming is expected to intensify the hydrological cycle and thus to increase the frequency of extreme hydro-meteorological events. In order to propose adaptation strategies, robust estimation of the future evolution of the Mediterranean climate and the associated extreme hydro-meteorological events (in terms of intensity/frequency) is of great relevance. However, these projections are characterized by large uncertainties. Many components of the simulation chain contribute to these large uncertainties: (i) uncertainties concerning the emission scenario; (ii) climate model simulations suffer from parameterization errors and uncertainties concerning the initial state of the climate; and (iii) additional uncertainties are introduced by the (dynamical or statistical) downscaling techniques and the impact model. Narrowing these uncertainties as much as possible is a major challenge of current climate research. One way to do so is to reduce the uncertainties associated with each component. In this study, we are interested in evaluating the potential improvement of: (i) coupled RCM simulations (with the Mediterranean Sea) in comparison with atmosphere-only (stand-alone) RCM simulations, and (ii) RCM simulations at finer resolution in comparison with coarser resolution. For that purpose, three different RCMs (WRF, ALADIN, LMDZ4) were run, forced by ERA-Interim reanalyses, within the MED-CORDEX experiment. For each RCM, different versions (coupled/stand-alone, high/low resolution) were realized. A large set of scores was developed and applied in order to evaluate the performance of these different RCM simulations. These scores were applied to three variables (daily precipitation amount, mean daily air temperature and dry spell length). Particular attention was given to the RCM capability to reproduce the seasonal and spatial pattern of extreme statistics. Results show that the differences between coupled and stand-alone RCMs are localized very near the Mediterranean Sea and that the model resolution has only a slight impact on the scores obtained. Overall, the main differences between the RCM simulations come from the RCM used. Keywords: Mediterranean climate, extreme hydro-meteorological events, RCM simulations, evaluation of climate simulations

  16. Storm Water Management Model Reference Manual Volume III – Water Quality

    EPA Science Inventory

    SWMM is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. The runoff component of SWMM operates on a collection of subcatchment areas that receive precipitation and gene...

  17. An event-based approach to understanding decadal fluctuations in the Atlantic meridional overturning circulation

    NASA Astrophysics Data System (ADS)

    Allison, Lesley; Hawkins, Ed; Woollings, Tim

    2015-01-01

    Many previous studies have shown that unforced climate model simulations exhibit decadal-scale fluctuations in the Atlantic meridional overturning circulation (AMOC), and that this variability can have impacts on surface climate fields. However, the robustness of these surface fingerprints across different models is less clear. Furthermore, with the potential for coupled feedbacks that may amplify or damp the response, it is not known whether the associated climate signals are linearly related to the strength of the AMOC changes, or if the fluctuation events exhibit nonlinear behaviour with respect to their strength or polarity. To explore these questions, we introduce an objective and flexible method for identifying the largest natural AMOC fluctuation events in multicentennial/multimillennial simulations of a variety of coupled climate models. The characteristics of the events are explored, including their magnitude, meridional coherence and spatial structure, as well as links with ocean heat transport and the horizontal circulation. The surface fingerprints in ocean temperature and salinity are examined, and compared with the results of linear regression analysis. It is found that the regressions generally provide a good indication of the surface changes associated with the largest AMOC events. However, there are some exceptions, including a nonlinear change in the atmospheric pressure signal, particularly at high latitudes, in HadCM3. Some asymmetries are also found between the changes associated with positive and negative AMOC events in the same model. Composite analysis suggests that there are signals that are robust across the largest AMOC events in each model, which provides reassurance that the surface changes associated with one particular event will be similar to those expected from regression analysis. However, large differences are found between the AMOC fingerprints in different models, which may hinder the prediction and attribution of such events in reality.

  18. Analysis of convection-permitting simulations for capturing heavy rainfall events over Myanmar Region

    NASA Astrophysics Data System (ADS)

    Acierto, R. A. E.; Kawasaki, A.

    2017-12-01

    Perennial flooding due to heavy rainfall events has strong impacts on society and the economy. With the increasing pressures of rapid development and potential climate change impacts, Myanmar is experiencing a rapid increase in disaster risk. Heavy rainfall hazard assessment is key to quantifying such disaster risk under both current and future conditions. Downscaling using regional climate models (RCMs) such as the Weather Research and Forecasting (WRF) model has been used extensively for assessing such heavy rainfall events. However, the use of convective parameterizations can introduce large errors in simulating rainfall. Convection-permitting simulations have been used to deal with this problem by increasing the resolution of RCMs to 4 km. This study focuses on heavy rainfall events during the wet seasons (May to September) of the six-year period 2010-2015 in Myanmar. The investigation primarily uses rain gauge observations to evaluate heavy rainfall events downscaled to 4 km resolution with ERA-Interim boundary conditions and a 12 km to 4 km one-way nesting method. The study aims to provide a basis for the production of high-resolution climate projections over Myanmar in order to contribute to flood hazard and risk assessment.

  19. Probabilistic modelling of flood events using the entropy copula

    NASA Astrophysics Data System (ADS)

    Li, Fan; Zheng, Qian

    2016-11-01

    The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible method to analyze flood frequency. Due to the statistical dependence among the flood event variables (i.e., flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, copula methods have been widely used for constructing multivariate dependence structures; however, the copula family must be chosen before application, and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, offers a way to avoid this relatively subjective step by combining the theories of copula and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and a comparison of accuracy with popular copulas was made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the computational difficulties of extending to three dimensions directly. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.
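
    The general copula recipe (simulate dependent uniform variates, then transform them through fitted marginals) can be illustrated with a Gaussian copula, shown below purely for illustration; the entropy copula and the Gibbs sampler used in the paper are not reproduced here, and the gamma marginals are an assumption.

      # Gaussian-copula simulation of dependent flood variables (peak, volume, duration).
      import numpy as np
      from scipy import stats

      def simulate_flood_events(obs, n_sim=1000, seed=0):
          """obs: (n, 3) array of observed [peak, volume, duration]."""
          rng = np.random.default_rng(seed)
          # Dependence structure: correlation of the normal scores (Gaussian copula)
          ranks = (np.argsort(np.argsort(obs, axis=0), axis=0) + 0.5) / len(obs)
          corr = np.corrcoef(stats.norm.ppf(ranks), rowvar=False)
          z = rng.multivariate_normal(np.zeros(3), corr, size=n_sim)
          u = stats.norm.cdf(z)                        # uniform margins, copula dependence
          # Marginals: gamma distributions fitted to each variable (an assumption)
          sims = np.empty_like(u)
          for j in range(3):
              a, loc, scale = stats.gamma.fit(obs[:, j], floc=0.0)
              sims[:, j] = stats.gamma.ppf(u[:, j], a, loc=loc, scale=scale)
          return sims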

  20. Correlated Production and Analog Transport of Fission Neutrons and Photons using Fission Models FREYA, FIFRELIN and the Monte Carlo Code TRIPOLI-4®.

    NASA Astrophysics Data System (ADS)

    Verbeke, Jérôme M.; Petit, Odile; Chebboubi, Abdelhazize; Litaize, Olivier

    2018-01-01

    Fission modeling in general-purpose Monte Carlo transport codes often relies on average nuclear data provided by international evaluation libraries. As such, only average fission multiplicities are available and correlations between fission neutrons and photons are missing. Whereas uncorrelated fission physics is usually sufficient for standard reactor core and radiation shielding calculations, correlated fission secondaries are required for specialized nuclear instrumentation and detector modeling. For coincidence counting detector optimization for instance, precise simulation of fission neutrons and photons that remain correlated in time from birth to detection is essential. New developments were recently integrated into the Monte Carlo transport code TRIPOLI-4 to model fission physics more precisely, the purpose being to access event-by-event fission events from two different fission models: FREYA and FIFRELIN. TRIPOLI-4 simulations can now be performed, either by connecting via an API to the LLNL fission library including FREYA, or by reading external fission event data files produced by FIFRELIN beforehand. These new capabilities enable us to easily compare results from Monte Carlo transport calculations using the two fission models in a nuclear instrumentation application. In the first part of this paper, broad underlying principles of the two fission models are recalled. We then present experimental measurements of neutron angular correlations for 252Cf(sf) and 240Pu(sf). The correlations were measured for several neutron kinetic energy thresholds. In the latter part of the paper, simulation results are compared to experimental data. Spontaneous fissions in 252Cf and 240Pu are modeled by FREYA or FIFRELIN. Emitted neutrons and photons are subsequently transported to an array of scintillators by TRIPOLI-4 in analog mode to preserve their correlations. Angular correlations between fission neutrons obtained independently from these TRIPOLI-4 simulations, using either FREYA or FIFRELIN, are compared to experimental results. For 240Pu(sf), the measured correlations were used to tune the model parameters.

  1. Model simulations of dense bottom currents in the Western Baltic Sea

    NASA Astrophysics Data System (ADS)

    Burchard, Hans; Janssen, Frank; Bolding, Karsten; Umlauf, Lars; Rennau, Hannes

    2009-01-01

    Only recently, medium-intensity inflow events into the Baltic Sea have gained more awareness because of their potential to ventilate intermediate layers in the Southern Baltic Sea basins. With the present high-resolution model study of the Western Baltic Sea a first attempt is made to obtain model-based realistic estimates of turbulent mixing in this area, where dense bottom currents resulting from medium-intensity inflow events are weakened by turbulent entrainment. The numerical model simulation, which is carried out using the General Estuarine Transport Model (GETM) during nine months in 2003 and 2004, is first validated by means of three automatic stations at the Drogden and Darss Sills and in the Arkona Sea. In order to obtain good agreement between observations and model results, the 0.5×0.5 nautical mile bathymetry had to be adjusted to account for the fact that even at that scale many relevant topographic features are not resolved. Current velocity, salinity and turbulence observations during a medium-intensity inflow event through the Øresund are then compared to the model results. Given the general problems of point-to-point comparisons between observations and model simulations, the agreement is fairly good, with the characteristic features of the inflow event well represented by the model simulations. Two different bulk measures for mixing activity are then introduced: the vertically integrated decay of salinity variance, which is equal to the production of micro-scale salinity variance, and the vertically integrated turbulent salt flux, which is related to an increase of potential energy due to vertical mixing of stably stratified flow. Both measures give qualitatively similar results and identify the Drogden and Darss Sills as well as the Bornholm Channel as mixing hot spots. Further regions of strong mixing are the dense bottom current pathways from these sills into the Arkona Sea, areas around Kriegers Flak (a shoal in the western Arkona Sea) and north-west of the island of Rügen.

  2. Simulation Exploration Experience 2018 Overview

    NASA Technical Reports Server (NTRS)

    Paglialonga, Stephen; Elfrey, Priscilla; Crues, Edwin Z.

    2018-01-01

    The Simulation Exploration Experience (SEE) joins students, industry, professional associations, and faculty together for an annual modeling and simulation (M&S) challenge. SEE champions collaborative collegiate-level modeling and simulation by providing a venue for students to work in highly dispersed inter-university teams to design, develop, test, and execute simulated missions associated with space exploration. Participating teams gain valuable knowledge, skills, and increased employability by working closely with industry professionals, NASA, and faculty advisors. This presentation gives an overview of the SEE and the upcoming 2018 SEE event.

  3. Markov modeling and discrete event simulation in health care: a systematic comparison.

    PubMed

    Standfield, Lachlan; Comans, Tracy; Scuffham, Paul

    2014-04-01

    The aim of this study was to assess if the use of Markov modeling (MM) or discrete event simulation (DES) for cost-effectiveness analysis (CEA) may alter healthcare resource allocation decisions. A systematic literature search and review of empirical and non-empirical studies comparing MM and DES techniques used in the CEA of healthcare technologies was conducted. Twenty-two pertinent publications were identified. Two publications compared MM and DES models empirically, one presented a conceptual DES and MM, two described a DES consensus guideline, and seventeen drew comparisons between MM and DES through the authors' experience. The primary advantages described for DES over MM were the ability to model queuing for limited resources, capture individual patient histories, accommodate complexity and uncertainty, represent time flexibly, model competing risks, and accommodate multiple events simultaneously. The disadvantages of DES over MM were the potential for model overspecification, increased data requirements, specialized expensive software, and increased model development, validation, and computational time. Where individual patient history is an important driver of future events an individual patient simulation technique like DES may be preferred over MM. Where supply shortages, subsequent queuing, and diversion of patients through other pathways in the healthcare system are likely to be drivers of cost-effectiveness, DES modeling methods may provide decision makers with more accurate information on which to base resource allocation decisions. Where these are not major features of the cost-effectiveness question, MM remains an efficient, easily validated, parsimonious, and accurate method of determining the cost-effectiveness of new healthcare interventions.
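
    To make the contrast concrete: a Markov cohort model advances an entire cohort through health states with a fixed transition matrix each cycle, whereas a DES draws event times for individual patients. The sketch below shows the Markov side with hypothetical states, cycle length and probabilities.

      # Minimal Markov cohort model (states, cycle length and probabilities are hypothetical).
      import numpy as np

      states = ["well", "sick", "dead"]
      transition = np.array([[0.90, 0.08, 0.02],       # annual transition probabilities
                             [0.10, 0.75, 0.15],
                             [0.00, 0.00, 1.00]])
      cohort = np.array([1000.0, 0.0, 0.0])            # everyone starts in 'well'

      trace = [cohort]
      for year in range(20):                           # 20 one-year cycles
          cohort = cohort @ transition
          trace.append(cohort)

      life_years = sum(c[0] + c[1] for c in trace[1:]) / 1000.0   # mean undiscounted life-years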

  4. Assessment of upper-ocean variability and the Madden-Julian Oscillation in extended-range air-ocean coupled mesoscale simulations

    NASA Astrophysics Data System (ADS)

    Hong, Xiaodong; Reynolds, Carolyn A.; Doyle, James D.; May, Paul; O'Neill, Larry

    2017-06-01

    Atmosphere-ocean interaction, particularly the ocean response to strong atmospheric forcing, is a fundamental component of the Madden-Julian Oscillation (MJO). In this paper, we examine how model errors in previous MJO events can affect the simulation of subsequent MJO events due to increased errors that develop in the upper ocean before the MJO initiation stage. Two fully coupled numerical simulations with 45-km and 27-km horizontal resolutions were integrated for a two-month period from November to December 2011 using the Navy's limited area Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS®). Three MJO events occurred in succession during the simulations, in early November, mid-November, and mid-December. The 45-km simulation shows an excessive warming of the SSTs during the suppressed phase that occurs before the initiation of the second MJO event due to erroneously strong surface net heat fluxes. The simulated second MJO event stalls over the Maritime Continent, which prevents the recovery of the deep mixed layer and associated barrier layer. Cross-wavelet analysis of solar radiation and SSTs reveals that the diurnal warming is absent during the second suppressed phase after the second MJO event. The mixed layer heat budget indicates that the cooling is primarily caused by horizontal advection associated with the stalling of the second MJO event, and the cool SSTs fail to initiate the third MJO event. When the horizontal resolution is increased to 27 km, three MJOs are simulated and compare well with observations on multi-month timescales. The higher-resolution simulation of the second MJO event and the more realistic upper-ocean response promote the onset of the third MJO event. Simulations performed with analyzed SSTs indicate that the stalling of the second MJO in the 45-km run is a robust feature, regardless of ocean forcing, while the diurnal cycle analysis indicates that both the 45-km and 27-km ocean resolutions respond realistically when provided with realistic atmospheric forcing. Thus, the problem in the 45-km simulation appears to originate in the atmosphere. Additional simulations show that while the details of the simulations are sensitive to small changes in the initial integration time, the large differences between the 45-km and 27-km runs during the suppressed phase in early December are robust.

  5. Simulations of forest mortality in Colorado River basin

    NASA Astrophysics Data System (ADS)

    Wei, L.; Xu, C.; Johnson, D. J.; Zhou, H.; McDowell, N.

    2017-12-01

    The Colorado River Basin (CRB) has experienced multiple severe forest mortality events under the recent changing climate. Such forest mortality events may have great impacts on ecosystem services and the water budget of the watershed. It is hence important to estimate and predict forest mortality in the CRB under climate change. We simulated forest mortality in the CRB with a model of plant hydraulics within FATES (the Functionally Assembled Terrestrial Ecosystem Simulator) coupled to the DOE Earth system model ACME (Accelerated Climate Modeling for Energy) at a 0.5 x 0.5 degree resolution. Moreover, we incorporated a stable carbon isotope (δ13C) module into ACME(FATES) and used it as a new predictor of forest mortality. The δ13C values of plants with the C3 photosynthetic pathway (almost all trees are C3 plants) can indicate the water stress plants are experiencing (the more intense the stress, the less negative the δ13C value). We set a δ13C threshold in the model simulation, above which forest mortality initiates. We validated the mortality simulations against field data from the Forest Inventory and Analysis (FIA) program, which were aggregated to the same spatial resolution as the model simulations. Different mortality schemes in the model (carbon starvation, hydraulic failure, and δ13C) were tested and compared. Each scheme demonstrated its strengths, and the plant hydraulics module provided more reliable simulations of forest mortality than the earlier ACME(FATES) version. Further testing is required for better forest mortality modelling.

  6. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present the application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.

  7. Spacecraft Solar Particle Event (SPE) Shielding: Shielding Effectiveness as a Function of SPE model as Determined with the FLUKA Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Koontz, Steve; Atwell, William; Reddell, Brandon; Rojdev, Kristina

    2010-01-01

    Analysis of both satellite and surface neutron monitor data demonstrates that the widely utilized Exponential model of solar particle event (SPE) proton kinetic energy spectra can seriously underestimate SPE proton flux, especially at the highest kinetic energies. The more recently developed Band model produces better agreement with neutron monitor data for ground level events (GLEs) and is believed to be considerably more accurate at high kinetic energies. Here, we report the results of modeling and simulation studies in which the radiation transport code FLUKA (FLUktuierende KAskade) is used to determine the changes in total ionizing dose (TID) and single-event environments (SEE) behind aluminum, polyethylene, carbon, and titanium shielding masses when the assumed form (i.e., Band or Exponential) of the solar particle event (SPE) kinetic energy spectra is changed. The FLUKA simulations are fully three-dimensional, with an isotropic particle flux incident on a concentric spherical shell shielding mass and detector structure. The effects are reported for both energetic primary protons penetrating the shield mass and secondary particle showers caused by energetic primary protons colliding with shielding mass nuclei. Our results, in agreement with previous studies, show that use of the Exponential form of the event

  8. Longitudinal decorrelation measures of flow magnitude and event-plane angles in ultrarelativistic nuclear collisions

    NASA Astrophysics Data System (ADS)

    Bożek, Piotr; Broniowski, Wojciech

    2018-03-01

    We discuss the forward-backward correlations of harmonic flow in Pb +Pb collisions at the CERN Large Hadron Collider, applying standard multibin measures as well as new measures proposed here. We illustrate the methods with hydrodynamic model simulations based on event-by-event initial conditions from the wounded quark model with asymmetric rapidity emission profiles. Within the model, we examine independently the event-plane angle and the flow magnitude decorrelations. We find a specific hierarchy between various flow decorrelation measures and confirm certain factorization relations. We find qualitative agreement of the model and the data from the ATLAS and CMS Collaborations.

  9. Assessing performance and validating finite element simulations using probabilistic knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolin, Ronald M.; Rodriguez, E. A.

    Two probabilistic approaches for assessing performance are presented. The first approach assesses probability of failure by simultaneously modeling all likely events. The probability each event causes failure along with the event's likelihood of occurrence contribute to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
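    As an illustration of the sampling step described above, here is a minimal Latin-hypercube sampling sketch in Python; the three uncertain inputs (load, strength, degradation) and their ranges are hypothetical and are not the quantities used in the study.

    # Minimal Latin-hypercube sampling: one stratified draw per interval for each
    # input, with the strata permuted independently per input.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_inputs = 100, 3

    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_inputs))) / n_samples
    for j in range(n_inputs):                      # decouple the strata across inputs
        u[:, j] = rng.permutation(u[:, j])

    # Map the stratified uniforms to hypothetical input ranges.
    load = 40.0 + 30.0 * u[:, 0]                   # uniform(40, 70)
    strength = 55.0 + 20.0 * u[:, 1]               # uniform(55, 75)
    degradation = 10.0 * u[:, 2]                   # uniform(0, 10)

    p_fail = np.mean(load > strength - degradation)
    print(f"estimated probability of failure: {p_fail:.2f}")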

  10. Autocalibration of a one-dimensional hydrodynamic-ecological model (DYRESM 4.0-CAEDYM 3.1) using a Monte Carlo approach: simulations of hypoxic events in a polymictic lake

    NASA Astrophysics Data System (ADS)

    Luo, Liancong; Hamilton, David; Lan, Jia; McBride, Chris; Trolle, Dennis

    2018-03-01

    Automated calibration of complex deterministic water quality models with a large number of biogeochemical parameters can reduce time-consuming iterative simulations involving empirical judgements of model fit. We undertook autocalibration of the one-dimensional hydrodynamic-ecological lake model DYRESM-CAEDYM, using a Monte Carlo sampling (MCS) method, in order to test the applicability of this procedure for shallow, polymictic Lake Rotorua (New Zealand). The calibration procedure involved independently minimizing the root-mean-square error (RMSE), maximizing the Pearson correlation coefficient (r), and maximizing the Nash-Sutcliffe efficiency coefficient (Nr) for comparisons of model state variables against measured data. An assigned number of parameter permutations was used for 10 000 simulation iterations. The "optimal" temperature calibration produced an RMSE of 0.54 °C, an Nr value of 0.99, and an r value of 0.98 through the whole water column based on comparisons with 540 observed water temperatures collected between 13 July 2007 and 13 January 2009. The modeled bottom dissolved oxygen concentration (20.5 m below surface) was compared with 467 available observations. The calculated RMSE of the simulations compared with the measurements was 1.78 mg L-1, the Nr value was 0.75, and the r value was 0.87. The autocalibrated model was further tested for an independent data set by simulating bottom-water hypoxia events from 15 January 2009 to 8 June 2011 (875 days). This verification produced an accurate simulation of five hypoxic events corresponding to DO < 2 mg L-1 during the summers of 2009-2011. The RMSE was 2.07 mg L-1, the Nr value 0.62, and the r value 0.81, based on the available data set of 738 days. The autocalibration software for DYRESM-CAEDYM developed here is substantially less time-consuming and more efficient in parameter optimization than traditional manual calibration, which has been the standard practice for similar complex water quality models.
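    For reference, the three goodness-of-fit measures used as calibration objectives above can be computed as in the short Python sketch below; the simulated and observed values are hypothetical stand-ins for the temperature and dissolved oxygen series used in the study.

    # RMSE, Pearson r, and Nash-Sutcliffe efficiency for a simulated/observed pair.
    import numpy as np

    obs = np.array([12.1, 13.4, 15.0, 14.2, 11.8, 10.9])   # hypothetical observations
    sim = np.array([11.8, 13.9, 14.6, 14.8, 12.3, 10.5])   # hypothetical model output

    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    r = np.corrcoef(sim, obs)[0, 1]
    nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    print(f"RMSE={rmse:.2f}, r={r:.2f}, NSE={nse:.2f}")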

  11. Auto-calibration of a one-dimensional hydrodynamic-ecological model using a Monte Carlo approach: simulation of hypoxic events in a polymictic lake

    NASA Astrophysics Data System (ADS)

    Luo, L.

    2011-12-01

    Automated calibration of complex deterministic water quality models with a large number of biogeochemical parameters can reduce time-consuming iterative simulations involving empirical judgements of model fit. We undertook auto-calibration of the one-dimensional hydrodynamic-ecological lake model DYRESM-CAEDYM, using a Monte Carlo sampling (MCS) method, in order to test the applicability of this procedure for shallow, polymictic Lake Rotorua (New Zealand). The calibration procedure involved independently minimising the root-mean-square error (RMSE), maximizing the Pearson correlation coefficient (r) and the Nash-Sutcliffe efficiency coefficient (Nr) for comparisons of model state variables against measured data. An assigned number of parameter permutations was used for 10,000 simulation iterations. The 'optimal' temperature calibration produced an RMSE of 0.54 °C, Nr-value of 0.99 and r-value of 0.98 through the whole water column based on comparisons with 540 observed water temperatures collected between 13 July 2007 and 13 January 2009. The modeled bottom dissolved oxygen concentration (20.5 m below surface) was compared with 467 available observations. The calculated RMSE of the simulations compared with the measurements was 1.78 mg L-1, the Nr-value was 0.75 and the r-value was 0.87. The autocalibrated model was further tested for an independent data set by simulating bottom-water hypoxia events for the period 15 January 2009 to 8 June 2011 (875 days). This verification produced an accurate simulation of five hypoxic events corresponding to DO < 2 mg L-1 during summer of 2009-2011. The RMSE was 2.07 mg L-1, Nr-value 0.62 and r-value of 0.81, based on the available data set of 738 days. The auto-calibration software of DYRESM-CAEDYM developed here is substantially less time-consuming and more efficient in parameter optimisation than traditional manual calibration, which has been the standard practice for similar complex water quality models.

  12. A systematic comparison of recurrent event models for application to composite endpoints.

    PubMed

    Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine

    2018-01-04

    Many clinical trials focus on the comparison of the treatment effect between two or more groups concerning a rarely occurring event. In this situation, showing a relevant effect with an acceptable power requires the observation of a large number of patients over a long period of time. For feasibility reasons, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects the fact that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints could be analyzed by models for recurrent events. There exist a number of such models, e.g. regression models based on count data or Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods have already been compared in the literature, there exists no systematic investigation of the special requirements regarding composite endpoints. Within this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson model show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators which can deviate considerably under commonly met data scenarios. Based on the conducted simulation study, this paper helps to understand the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.

  13. Wildland fire probabilities estimated from weather model-deduced monthly mean fire danger indices

    Treesearch

    Haiganoush K. Preisler; Shyh-Chin Chen; Francis Fujioka; John W. Benoit; Anthony L. Westerling

    2008-01-01

    The National Fire Danger Rating System indices deduced from a regional simulation weather model were used to estimate probabilities and numbers of large fire events on monthly and 1-degree grid scales. The weather model simulations and forecasts are ongoing experimental products from the Experimental Climate Prediction Center at the Scripps Institution of Oceanography...

  14. A large-signal dynamic simulation for the series resonant converter

    NASA Technical Reports Server (NTRS)

    King, R. J.; Stuart, T. A.

    1983-01-01

    A simple nonlinear discrete-time dynamic model for the series resonant dc-dc converter is derived using approximations appropriate to most power converters. This model is useful for the dynamic simulation of a series resonant converter using only a desktop calculator. The model is compared with a laboratory converter for a large transient event.

  15. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
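    To make the discrete-event approach mentioned above concrete, the following is a minimal Python sketch of a single-queue emergency department with a fixed number of beds; the arrival rate, service rate, bed count, and horizon are hypothetical, and real models would add triage classes, staffing, and boarding.

    # Minimal discrete-event simulation of an ED queue (hypothetical parameters).
    import heapq, random

    random.seed(1)
    ARRIVAL_RATE, SERVICE_RATE, BEDS, HORIZON = 6.0, 1.0, 8, 1000.0  # per hour / hours

    events = [(random.expovariate(ARRIVAL_RATE), "arrival")]
    queue, busy, waits = [], 0, []
    while events:
        t, kind = heapq.heappop(events)
        if t > HORIZON:
            break
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))
            if busy < BEDS:
                busy += 1
                waits.append(0.0)
                heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
            else:
                queue.append(t)
        else:                       # a departure frees a bed
            if queue:
                waits.append(t - queue.pop(0))
                heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
            else:
                busy -= 1

    print(f"patients seen: {len(waits)}, mean wait: {sum(waits) / len(waits):.2f} h")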

  16. Synthetic Seismograms of Explosive Sources Calculated by the Earth Simulator

    NASA Astrophysics Data System (ADS)

    Tsuboi, S.; Matsumoto, H.; Rozhkov, M.; Stachnik, J.

    2017-12-01

    We calculate broadband synthetic seismograms using the spectral-element method (Komatitsch & Tromp, 2001) for recent explosive events on the northern Korean peninsula. We use the Earth Simulator supercomputer system at JAMSTEC to compute these synthetic seismograms. The simulations are performed on 8,100 processors, which require 2,025 nodes of the Earth Simulator. We use one chunk with an angular distance of 40 degrees to compute the synthetic seismograms. On this number of nodes, a simulation of 5 minutes of wave propagation accurate at periods of 1.5 seconds and longer requires about 10 hours of CPU time. We use the CMT solution of Rozhkov et al. (2016) as a source model for this event. One example CMT solution for this source model has a 28% double-couple component and a 51% isotropic component. The hypocenter depth of this solution is 1.4 km. Comparisons of the synthetic waveforms with the observations show that the arrival times of the Pn and Pg waves match well. The comparison also shows that the amplitude agreement for other phases is not as good, which demonstrates that the crustal structure included in the simulation should be improved. The observed surface waves are also modeled well in the synthetics, which shows that the CMT solution we have used for this computation correctly captures the source characteristics of this event. Because the hypocenter locations of artificial explosive sources are already known, we may evaluate the crustal structure along the propagation path from waveform modeling for these sources. We may discuss the limitations of a one-dimensional crustal structure model by comparing synthetic waveforms for a 3D crustal structure with the observed seismograms.

  17. Extreme Landfalling Atmospheric River Events in Arizona: Possible Future Changes

    NASA Astrophysics Data System (ADS)

    Singh, I.; Dominguez, F.

    2016-12-01

    Changing climate could impact the frequency and intensity of extreme atmospheric river (AR) events. This can have important consequences for regions like the Southwestern United States that rely upon AR-related precipitation for meeting their water demand and are prone to AR-related flooding. This study investigates the effects of climate change on extreme AR events in the Salt and Verde river basins in Central Arizona using a pseudo global warming (PGW) method. First, the five most extreme events that affected the region were selected. High-resolution control simulations of these events using the Weather Research and Forecasting (WRF) model realistically captured the magnitude and spatial distribution of precipitation. Subsequently, following the PGW approach, the WRF initial and lateral boundary conditions were perturbed. The perturbation signals were obtained from an ensemble of 9 General Circulation Models for two warming scenarios - Representative Concentration Pathway (RCP) 4.5 and RCP 8.5. Several simulations were conducted changing the temperature and relative humidity fields. The PGW simulations reveal that while the overall dynamics of the storms did not change significantly, there was a marked strengthening of the associated integrated water vapor transport (IVT) plumes. There was a general increase in the precipitation over the basins due to increased moisture availability, but with heterogeneous spatial changes. Additionally, no significant changes in the strength of the pre-cold-frontal low-level jet were observed in the future simulations.

  18. Kinetic Monte Carlo modeling of chemical reactions coupled with heat transfer.

    PubMed

    Castonguay, Thomas C; Wang, Feng

    2008-03-28

    In this paper, we describe two types of effective events for describing heat transfer in a kinetic Monte Carlo (KMC) simulation that may involve stochastic chemical reactions. Simulations employing these events are referred to as KMC-TBT and KMC-PHE. In KMC-TBT, heat transfer is modeled as the stochastic transfer of "thermal bits" between adjacent grid points. In KMC-PHE, heat transfer is modeled by integrating the Poisson heat equation for a short time. Either approach is capable of capturing the time dependent system behavior exactly. Both KMC-PHE and KMC-TBT are validated by simulating pure heat transfer in a rod and a square and modeling a heated desorption problem where exact numerical results are available. KMC-PHE is much faster than KMC-TBT and is used to study the endothermic desorption of a lattice gas. Interesting findings from this study are reported.
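    A toy version of the "thermal bit" picture on a one-dimensional rod is sketched below in Python; it simply moves one bit at a time between randomly chosen neighbouring sites, which is a simplification of the rate-based KMC event selection described in the paper (site count, step count, and initial condition are hypothetical).

    # Toy "thermal bit" relaxation on a 1D rod with insulating ends.
    import random

    random.seed(0)
    n_sites, n_steps = 20, 200_000
    bits = [0] * n_sites
    bits[0] = 1000                      # hot end of the rod

    for _ in range(n_steps):
        i = random.randrange(n_sites)
        if bits[i] == 0:
            continue
        j = i + random.choice((-1, 1))  # move one bit to a random neighbour
        if 0 <= j < n_sites:
            bits[i] -= 1
            bits[j] += 1

    print("final bit counts (proportional to temperature):", bits)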

  19. Kinetic Monte Carlo modeling of chemical reactions coupled with heat transfer

    NASA Astrophysics Data System (ADS)

    Castonguay, Thomas C.; Wang, Feng

    2008-03-01

    In this paper, we describe two types of effective events for describing heat transfer in a kinetic Monte Carlo (KMC) simulation that may involve stochastic chemical reactions. Simulations employing these events are referred to as KMC-TBT and KMC-PHE. In KMC-TBT, heat transfer is modeled as the stochastic transfer of "thermal bits" between adjacent grid points. In KMC-PHE, heat transfer is modeled by integrating the Poisson heat equation for a short time. Either approach is capable of capturing the time dependent system behavior exactly. Both KMC-PHE and KMC-TBT are validated by simulating pure heat transfer in a rod and a square and modeling a heated desorption problem where exact numerical results are available. KMC-PHE is much faster than KMC-TBT and is used to study the endothermic desorption of a lattice gas. Interesting findings from this study are reported.

  20. Spiking neural network model for memorizing sequences with forward and backward recall.

    PubMed

    Borisyuk, Roman; Chik, David; Kazanovich, Yakov; da Silva Gomes, João

    2013-06-01

    We present an oscillatory network of conductance based spiking neurons of Hodgkin-Huxley type as a model of memory storage and retrieval of sequences of events (or objects). The model is inspired by psychological and neurobiological evidence on sequential memories. The building block of the model is an oscillatory module which contains excitatory and inhibitory neurons with all-to-all connections. The connection architecture comprises two layers. A lower layer represents consecutive events during their storage and recall. This layer is composed of oscillatory modules. Plastic excitatory connections between the modules are implemented using an STDP type learning rule for sequential storage. Excitatory neurons in the upper layer project star-like modifiable connections toward the excitatory lower layer neurons. These neurons in the upper layer are used to tag sequences of events represented in the lower layer. Computer simulations demonstrate good performance of the model including difficult cases when different sequences contain overlapping events. We show that the model with STDP type or anti-STDP type learning rules can be applied for the simulation of forward and backward replay of neural spikes respectively. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
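    The STDP-type rule mentioned above can be illustrated with a minimal pair-based weight update in Python; the learning rates and time constants are hypothetical, and reversing the signs gives the anti-STDP variant used for backward replay.

    # Pair-based STDP: potentiate when pre precedes post, depress otherwise.
    import math

    A_PLUS, A_MINUS = 0.01, 0.012        # hypothetical learning rates
    TAU_PLUS, TAU_MINUS = 20.0, 20.0     # hypothetical time constants (ms)

    def stdp_dw(t_pre, t_post):
        """Weight change for a single pre/post spike pair (times in ms)."""
        dt = t_post - t_pre
        if dt >= 0:                      # pre before post -> potentiation
            return A_PLUS * math.exp(-dt / TAU_PLUS)
        return -A_MINUS * math.exp(dt / TAU_MINUS)

    w = 0.5
    for t_pre, t_post in [(10.0, 15.0), (40.0, 38.0), (60.0, 61.0)]:
        w += stdp_dw(t_pre, t_post)
    print(f"weight after three spike pairs: {w:.4f}")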

  1. Reducing uncertainty in Climate Response Time Scale by Bayesian Analysis of the 8.2 ka event

    NASA Astrophysics Data System (ADS)

    Lorenz, A.; Held, H.; Bauer, E.; Schneider von Deimling, T.

    2009-04-01

    We analyze the possibility of reducing uncertainty in the climate response time scale by utilizing Greenland ice-core data that contain the 8.2 ka event, within a Bayesian model-data intercomparison with the Earth system model of intermediate complexity CLIMBER-2.3. Within a stochastic version of the model, it has been possible to mimic the 8.2 ka event within a plausible experimental setting and with relatively good accuracy considering the timing of the event in comparison to other modeling exercises [1]. The simulation of the centennial cold event is effectively determined by the oceanic cooling rate, which depends largely on the ocean diffusivity described by diffusion coefficients with relatively wide uncertainty ranges. The idea is to discriminate between different diffusivity values according to their likelihood of correctly representing the duration of the 8.2 ka event, and thus to exploit the paleo data to constrain uncertainty in model parameters, in analogy to [2]. In implementing this inverse Bayesian analysis with this model, the technical difficulty arises of establishing the related likelihood numerically in addition to the uncertain model parameters: while mainstream uncertainty analyses can assume a quasi-Gaussian shape of the likelihood, with weather fluctuating around a long-term mean, the 8.2 ka event, as a highly nonlinear effect, precludes such an a priori assumption. As a result of this study [3], the Bayesian analysis showed a reduction of uncertainty in the vertical ocean diffusivity parameters by a factor of 2 compared to prior knowledge. This learning effect on the model parameters is propagated to other model outputs of interest, e.g. the inverse ocean heat capacity, which is important for the dominant time scale of the climate response to anthropogenic forcing and which, in combination with climate sensitivity, strongly influences the climate system's reaction in the near- and medium-term future. References: [1] E. Bauer, A. Ganopolski, M. Montoya: Simulation of the cold climate event 8200 years ago by meltwater outburst from Lake Agassiz. Paleoceanography 19:PA3014 (2004). [2] T. Schneider von Deimling, H. Held, A. Ganopolski, S. Rahmstorf: Climate sensitivity estimated from ensemble simulations of glacial climates. Climate Dynamics 27, 149-163, DOI 10.1007/s00382-006-0126-8 (2006). [3] A. Lorenz: Diploma Thesis, U Potsdam (2007).

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Elizabeth J.; Yu, Sungduk; Kooperman, Gabriel J.

    The sensitivities of simulated mesoscale convective systems (MCSs) in the central U.S. to microphysics and grid configuration are evaluated here in a global climate model (GCM) that also permits global-scale feedbacks and variability. Since conventional GCMs do not simulate MCSs, studying their sensitivities in a global framework useful for climate change simulations has not previously been possible. To date, MCS sensitivity experiments have relied on controlled cloud resolving model (CRM) studies with limited domains, which avoid internal variability and neglect feedbacks between local convection and larger-scale dynamics. However, recent work with superparameterized (SP) GCMs has shown that eastward propagating MCS-like events are captured when embedded CRMs replace convective parameterizations. This study uses a SP version of the Community Atmosphere Model version 5 (SP-CAM5) to evaluate MCS sensitivities, applying an objective empirical orthogonal function algorithm to identify MCS-like events, and harmonizing composite storms to account for seasonal and spatial heterogeneity. A five-summer control simulation is used to assess the magnitude of internal and interannual variability relative to 10 sensitivity experiments with varied CRM parameters, including ice fall speed, one-moment and two-moment microphysics, and grid spacing. MCS sensitivities were found to be subtle with respect to internal variability, and indicate that ensembles of over 100 storms may be necessary to detect robust differences in SP-GCMs. Furthermore, these results emphasize that the properties of MCSs can vary widely across individual events, and improving their representation in global simulations with significant internal variability may require comparison to long (multidecadal) time series of observed events rather than single season field campaigns.

  3. Proactive modeling of water quality impacts of extreme precipitation events in a drinking water reservoir.

    PubMed

    Jeznach, Lillian C; Hagemann, Mark; Park, Mi-Hyun; Tobiason, John E

    2017-10-01

    Extreme precipitation events are of concern to managers of drinking water sources because these occurrences can affect both water supply quantity and quality. However, little is known about how these low probability events impact organic matter and nutrient loads to surface water sources and how these loads may impact raw water quality. This study describes a method for evaluating the sensitivity of a water body of interest from watershed input simulations under extreme precipitation events. An example application of the method is illustrated using the Wachusett Reservoir, an oligo-mesotrophic surface water reservoir in central Massachusetts and a major drinking water supply to metropolitan Boston. Extreme precipitation event simulations during the spring and summer resulted in total organic carbon, UV-254 (a surrogate measurement for reactive organic matter), and total algae concentrations at the drinking water intake that exceeded recorded maximums. Nutrient concentrations after storm events were less likely to exceed recorded historical maximums. For this particular reservoir, increasing inter-reservoir transfers of water with lower organic matter content after a large precipitation event has been shown in practice and in model simulations to decrease organic matter levels at the drinking water intake, therefore decreasing treatment associated oxidant demand, energy for UV disinfection, and the potential for formation of disinfection byproducts. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Numerical Simulation of the 9-10 June 1972 Black Hills Storm Using CSU RAMS

    NASA Technical Reports Server (NTRS)

    Nair, U. S.; Hjelmfelt, Mark R.; Pielke, Roger A., Sr.

    1997-01-01

    Strong easterly flow of low-level moist air over the eastern slopes of the Black Hills on 9-10 June 1972 generated a storm system that produced a flash flood, devastating the area. Based on observations from this storm event, and also from the similar Big Thompson 1976 storm event, conceptual models have been developed to explain the unusually high precipitation efficiency. In this study, the Black Hills storm is simulated using the Colorado State University Regional Atmospheric Modeling System. Simulations with homogeneous and inhomogeneous initializations and different grid structures are presented. The conceptual models of storm structure proposed by previous studies are examined in light of the present simulations. Both homogeneous and inhomogeneous initialization results capture the intense nature of the storm, but the inhomogeneous simulation produced a precipitation pattern closer to the observed pattern. The simulations point to stationary tilted updrafts, with precipitation falling out to the rear as the preferred storm structure. Experiments with different grid structures point to the importance of removing the lateral boundaries far from the region of activity. Overall, simulation performance in capturing the observed behavior of the storm system was enhanced by use of inhomogeneous initialization.

  5. Enhancing Flood Prediction Reliability Using Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Merwade, V.

    2017-12-01

    Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to relying on the prediction from one model simulation, using an ensemble of predictions that considers uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to obtain reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global-based BMA (BMA_G) prediction, which is in turn superior to the ensemble mean prediction. Additionally, the high-frequency flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
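    A toy sketch of the weighting idea behind BMA is shown below in Python: each ensemble member receives a weight proportional to its Gaussian likelihood against the training observations, and the weighted mean is the deterministic prediction. This is a simplification of the expectation-maximization procedure typically used to fit BMA weights and variances, and all numbers (stages, member errors, error standard deviation) are hypothetical.

    # Toy BMA-style weighting of ensemble members against training observations.
    import numpy as np

    rng = np.random.default_rng(2)
    obs = np.array([2.1, 2.6, 3.4, 3.0, 2.4])                 # hypothetical stages (m)
    members = obs + rng.normal(0.0, 0.3, size=(4, obs.size))  # 4 hypothetical members
    sigma = 0.3                                               # assumed error std (m)

    log_lik = -0.5 * np.sum(((members - obs) / sigma) ** 2, axis=1)
    weights = np.exp(log_lik - log_lik.max())
    weights /= weights.sum()

    bma_mean = weights @ members                              # weighted prediction
    print("weights:", np.round(weights, 3))
    print("BMA mean prediction:", np.round(bma_mean, 2))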

  6. Acceleration techniques for dependability simulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
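    Random variate generation of the kind mentioned above is typically done by inverse-transform sampling; the short Python sketch below draws exponential interarrival or service times by inverting the exponential CDF (the rate value is hypothetical).

    # Inverse-transform sampling of exponential variates.
    import math, random

    random.seed(3)

    def exponential_variate(rate):
        """Invert F(x) = 1 - exp(-rate * x) at a uniform random u."""
        return -math.log(1.0 - random.random()) / rate

    samples = [exponential_variate(rate=0.5) for _ in range(10_000)]
    print(f"sample mean {sum(samples) / len(samples):.2f} (theoretical mean 2.00)")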

  7. Influence of January 2009 stratospheric warming on HF radio wave propagation in the low-latitude ionosphere

    NASA Astrophysics Data System (ADS)

    Kotova, Darya; Klimenko, Maksim; Klimenko, Vladimir; Zaharov, Veniamin; Bessarab, Fedor; Korenkov, Yuriy

    2016-12-01

    We have considered the influence of the January 23-27, 2009 sudden stratospheric warming (SSW) event on HF radio wave propagation in the equatorial ionosphere. This event took place during extremely low solar and geomagnetic activity. We use the simulation results obtained with the Global Self-consistent Model of the Thermosphere, Ionosphere and Protonosphere (GSM TIP) for simulating environmental changes during the SSW event. We both qualitatively and quantitatively reproduced total electron content disturbances obtained from global ground network receiver observations of GPS navigation satellite signals, by setting an additional electric potential and TIME-GCM model output at a height of 80 km. In order to study the influence of this SSW event on HF radio wave propagation and attenuation, we used the numerical model of radio wave propagation based on geometrical optics approximation. It is shown that the sudden stratospheric warming leads to radio signal attenuation and deterioration of radio communication in the daytime equatorial ionosphere.

  8. A coupled hydrological-hydraulic flood inundation model calibrated using post-event measurements and integrated uncertainty analysis in a poorly gauged Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Hdeib, Rouya; Abdallah, Chadi; Moussa, Roger; Colin, Francois

    2017-04-01

    Developing flood inundation maps of defined exceedance probabilities is required to provide information on the flood hazard and the associated risk. A methodology has been developed to model flood inundation in poorly gauged basins, where reliable information on the hydrological characteristics of floods is uncertain and only partially captured by traditional rain-gauge networks. Flood inundation modelling is performed by coupling a hydrological rainfall-runoff (RR) model (HEC-HMS) with a hydraulic model (HEC-RAS). The RR model is calibrated against the January 2013 flood event in the Awali River basin, Lebanon (300 km2), whose flood peak discharge was estimated by post-event measurements. The resulting flows of the RR model are defined as boundary conditions of the hydraulic model, which is run to generate the corresponding water surface profiles and is calibrated against 20 cross sections surveyed after the January 2013 flood event. An uncertainty analysis is performed to assess the results of the models. Consequently, the coupled flood inundation model is simulated with design storms, and flood inundation maps of defined exceedance probabilities are generated. The peak discharges estimated by the simulated RR model were in close agreement with the results from different empirical and statistical methods. This methodology can be extended to other poorly gauged basins facing common stage-gauge failure or characterized by floods with a stage exceeding the gauge measurement level, or higher than that defined by the rating curve.

  9. Contribution of rainfall, snow and ice melt to the hydrological regime of the Arve upper catchment and to severe flood events

    NASA Astrophysics Data System (ADS)

    Lecourt, Grégoire; Revuelto, Jesús; Morin, Samuel; Zin, Isabella; Lafaysse, Matthieu; Condom, Thomas; Six, Delphine; Vionnet, Vincent; Charrois, Luc; Dumont, Marie; Gottardi, Frédéric; Laarman, Olivier; Coulaud, Catherine; Esteves, Michel; Lebel, Thierry; Vincent, Christian

    2016-04-01

    In Alpine catchments, the hydrological response to meteorological events is highly influenced by the precipitation phase (liquid or solid) and by snow and ice melt. It is thus necessary to accurately simulate the snowpack evolution and its spatial distribution to perform relevant hydrological simulations. This work is focused on the upper Arve Valley (Western Alps). This 205 km2 catchment has large glaciated areas (roughly 32% of the study area) and covers a large range of elevations (1000-4500 m a.s.l.). Snow presence is significant year-round. The area is also characterized by steep terrain and strong vegetation heterogeneity. Modelling hydrological processes in such a complex catchment is therefore challenging. The detailed ISBA land surface model (including the Crocus snowpack scheme) has been applied to the study area using a topography-based discretization (classifying terrain by aspect, elevation, slope and glacier presence). The meteorological forcing used to run the simulations is the reanalysis from the SAFRAN model, which assimilates meteorological observations from the Meteo-France networks. Conceptual reservoirs with calibrated emptying parameters are used to represent the underground water storage. This approach has been tested by simulating the discharge of the Arve catchment and three sub-catchments over 1990-2015. The simulations were evaluated with respect to observed water discharges for several headwaters with varying glaciated areas. They allow us to quantify the relative contribution of rainfall, snow and ice melt to the hydrological regime of the basin. Additionally, we present a detailed analysis of several particular flood events. For these events, the ability of the model to correctly represent the catchment behaviour is investigated, with particular attention to the relevance of the simulated snowpack. In particular, its spatial distribution is evaluated using MODIS snow cover maps, point snowpack observations and summer glacier mass balance estimations.

  10. Simulations of Sea Level Rise Effects on Complex Coastal Systems

    NASA Astrophysics Data System (ADS)

    Niedoroda, A. W.; Ye, M.; Saha, B.; Donoghue, J. F.; Reed, C. W.

    2009-12-01

    It is now established that complex coastal systems with elements such as beaches, inlets, bays, and rivers adjust their morphologies according to time-varying balances between the processes that control the exchange of sediment. Accelerated sea level rise introduces a major perturbation into these sediment-sharing systems. A modeling framework based on a new SL-PR model, which is an advanced version of the aggregate-scale CST Model, and the event-scale CMS-2D and CMS-Wave combination has been used to simulate the recent evolution of a portion of the Florida panhandle coast. This combination of models provides a method to evaluate coefficients in the aggregate-scale model that were previously treated as fitted parameters. That is, by carrying out simulations of a complex coastal system with runs of the event-scale model representing more than a year, it is now possible to directly relate the coefficients in the large-scale SL-PR model to measurable physical parameters in the current and wave fields. This cross-scale modeling procedure has been used to simulate the shoreline evolution at Santa Rosa Island, a long barrier island on the northern Gulf Coast that houses significant military infrastructure. The model has been used to simulate 137 years of measured shoreline change and to extend these results to predictions of future rates of shoreline migration.

  11. From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model

    NASA Astrophysics Data System (ADS)

    Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.

    2014-12-01

    European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand the potential impacts of future events and the role reinsurance can play in mitigating the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry and the development of a new innovative approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.

  12. ELVES Research at the Pierre Auger Observatory: Optical Emission Simulation and Time Evolution, WWLLN-LIS-Auger Correlations, and Double ELVES Observations and Simulation.

    NASA Astrophysics Data System (ADS)

    Merenda, K. D.

    2016-12-01

    Since 2013, the Pierre Auger Cosmic Ray Observatory in Mendoza, Argentina, has extended its trigger algorithm to detect emissions of light consistent with the signature of very low frequency perturbations due to electromagnetic pulse sources (ELVES). Correlations with the World Wide Lightning Location Network (WWLLN), the Lightning Imaging Sensor (LIS), and simulated events were used to assess the quality of the reconstructed data. The fluorescence detector (FD) is a pixel-array telescope sensitive to the deep UV emissions of ELVES. The detector provides a time resolution of 100 nanoseconds, the finest ever applied to the study of ELVES. Four eyes, separated by approximately 40 kilometers, consist of six telescopes and span a total of 360 degrees of azimuth angle. The detector operates at night when storms are not in the field of view. An existing 3D EMP model solves Maxwell's equations using a three-dimensional finite-difference time-domain scheme to describe the propagation of electromagnetic pulses from lightning sources to the ionosphere. The simulation also provides a projection of the resulting ELVES onto the pixel array of the FD. A full reconstruction of simulated events is under development. We introduce a comparison of the analog signal time evolution between Auger reconstructed data and simulated events on individual FD pixels. In conjunction, we will present a study of the angular distribution of light emission around the vertical and above the causative lightning source. We will also contrast Auger double ELVES events separated by at most 5 microseconds with Monte Carlo simulations. These events are too short to be explained by multiple return strokes, ground reflections, or compact intra-cloud lightning sources. Reconstructed ELVES data are 40% correlated with WWLLN data, and an analysis with the LIS database is underway.

  13. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting mass and volume constraints.
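    The Monte Carlo structure described above can be illustrated with the toy Python sketch below, in which a few conditions occur with assumed per-mission probabilities and consume medical kit resources and mission days; the condition list, probabilities, durations, resource counts, and evacuation rule are all hypothetical and are not iMED inputs.

    # Toy Monte Carlo sketch of mission medical risk (all inputs hypothetical).
    import random

    random.seed(4)
    CONDITIONS = {   # name: (probability per mission, days lost, kit units used)
        "back injury":    (0.10, 2.0, 1),
        "skin infection": (0.05, 1.0, 2),
        "kidney stone":   (0.01, 5.0, 4),
    }
    N_TRIALS, KIT_UNITS = 10_000, 5

    evac, days_lost_total = 0, 0.0
    for _ in range(N_TRIALS):
        kit, days_lost = KIT_UNITS, 0.0
        for p, days, units in CONDITIONS.values():
            if random.random() < p:
                days_lost += days
                kit -= units
        days_lost_total += days_lost
        if kit < 0:              # ran out of resources -> assume evacuation
            evac += 1

    print(f"P(evacuation) ~ {evac / N_TRIALS:.3f}, "
          f"mean days lost ~ {days_lost_total / N_TRIALS:.2f}")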

  14. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting mass and volume constraints.

  15. Mathematical Modeling of Early Cellular Innate and Adaptive Immune Responses to Ischemia/Reperfusion Injury and Solid Organ Allotransplantation

    PubMed Central

    Day, Judy D.; Metes, Diana M.; Vodovotz, Yoram

    2015-01-01

    A mathematical model of the early inflammatory response in transplantation is formulated with ordinary differential equations. We first consider the inflammatory events associated only with the initial surgical procedure and the subsequent ischemia/reperfusion (I/R) events that cause tissue damage to the host as well as the donor graft. These events release damage-associated molecular pattern molecules (DAMPs), thereby initiating an acute inflammatory response. In simulations of this model, resolution of inflammation depends on the severity of the tissue damage caused by these events and the patient’s (co)-morbidities. We augment a portion of a previously published mathematical model of acute inflammation with the inflammatory effects of T cells in the absence of antigenic allograft mismatch (but with DAMP release proportional to the degree of graft damage prior to transplant). Finally, we include the antigenic mismatch of the graft, which leads to the stimulation of potent memory T cell responses, leading to further DAMP release from the graft and concomitant increase in allograft damage. Regulatory mechanisms are also included at the final stage. Our simulations suggest that surgical injury and I/R-induced graft damage can be well-tolerated by the recipient when each is present alone, but that their combination (along with antigenic mismatch) may lead to acute rejection, as seen clinically in a subset of patients. An emergent phenomenon from our simulations is that low-level DAMP release can tolerize the recipient to a mismatched allograft, whereas different restimulation regimens resulted in an exaggerated rejection response, in agreement with published studies. We suggest that mechanistic mathematical models might serve as an adjunct for patient- or sub-group-specific predictions, simulated clinical studies, and rational design of immunosuppression. PMID:26441988

  16. Fully 3D modeling of tokamak vertical displacement events with realistic parameters

    NASA Astrophysics Data System (ADS)

    Pfefferle, David; Ferraro, Nathaniel; Jardin, Stephen; Bhattacharjee, Amitava

    2016-10-01

    In this work, we model the complex multi-domain and highly non-linear physics of Vertical Displacement Events (VDEs), one of the most damaging off-normal events in tokamaks, with the implicit 3D extended MHD code M3D-C1. The code has recently acquired the capability to include finite-thickness conducting structures within the computational domain. By exploiting the possibility of running a linear 3D calculation on top of a non-linear 2D simulation, we monitor the non-axisymmetric stability and assess the eigen-structure of kink modes as the simulation proceeds. Once a stability boundary is crossed, a fully 3D non-linear calculation is launched for the remainder of the simulation, starting from an earlier time of the 2D run. This procedure, along with adaptive zoning, greatly increases the efficiency of the calculation and makes it possible to perform VDE simulations with realistic parameters and high resolution. Simulations are being validated with NSTX data, where both axisymmetric (toroidally averaged) and non-axisymmetric induced and conductive (halo) currents have been measured. This work is supported by US DOE Grant DE-AC02-09CH11466.

  17. Using a discrete-event simulation to balance ambulance availability and demand in static deployment systems.

    PubMed

    Wu, Ching-Han; Hwang, Kevin P

    2009-12-01

    To improve ambulance response time, matching ambulance availability with the emergency demand is crucial. To maintain the standard of 90% of response times within 9 minutes, the authors introduce a discrete-event simulation method to estimate the threshold for expanding the ambulance fleet when demand increases and to find the optimal dispatching strategies when provisional events create temporary decreases in ambulance availability. The simulation model was developed with information from the literature. Although the development was theoretical, the model was validated on the emergency medical services (EMS) system of Tainan City. The data are divided: one part is for model development, and the other for validation. For increasing demand, the effect was modeled on response time when call arrival rates increased. For temporary availability decreases, the authors simulated all possible alternatives of ambulance deployment in accordance with the number of out-of-routine-duty ambulances and the durations of three types of mass gatherings: marathon races (06:00-10:00 hr), rock concerts (18:00-22:00 hr), and New Year's Eve parties (20:00-01:00 hr). Statistical analysis confirmed that the model reasonably represented the actual Tainan EMS system. The response-time standard could not be reached when the incremental ratio of call arrivals exceeded 56%, which is the threshold for the Tainan EMS system to expand its ambulance fleet. When provisional events created temporary availability decreases, the Tainan EMS system could spare at most two ambulances from the standard configuration, except between 20:00 and 01:00, when it could spare three. The model also demonstrated that the current Tainan EMS has two excess ambulances that could be dropped. The authors suggest dispatching strategies to minimize the response times in routine daily emergencies. Strategies of capacity management based on this model improved response times. The more ambulances that are out of routine duty, the better the performance of the optimal strategies that are based on this model.
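    The kind of capacity question examined above can be prototyped with the short Python sketch below, which dispatches each call to the earliest-available ambulance and reports the fraction of responses within 9 minutes as the call arrival rate grows; the fleet size, travel-time range, and job duration are hypothetical, not the Tainan values.

    # Toy estimate of the share of responses within 9 minutes (hypothetical inputs).
    import random

    def pct_within_9min(calls_per_hour, n_ambulances, n_calls=20_000, seed=5):
        random.seed(seed)
        free_at = [0.0] * n_ambulances          # when each ambulance becomes free
        t, within = 0.0, 0
        for _ in range(n_calls):
            t += random.expovariate(calls_per_hour) * 60.0           # minutes
            idx = min(range(n_ambulances), key=lambda i: free_at[i])
            start = max(t, free_at[idx])                             # possible wait
            travel = random.uniform(4.0, 8.0)                        # minutes to scene
            if start - t + travel <= 9.0:
                within += 1
            free_at[idx] = start + travel + random.expovariate(1 / 45.0)  # job time
        return within / n_calls

    for rate in (4, 6, 8):
        print(f"{rate} calls/h: {pct_within_9min(rate, n_ambulances=6):.2%} within 9 min")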

  18. "Physically-based" numerical experiment to determine the dominant hillslope processes during floods?

    NASA Astrophysics Data System (ADS)

    Gaume, Eric; Esclaffer, Thomas; Dangla, Patrick; Payrastre, Olivier

    2016-04-01

    To study the dynamics of hillslope responses during flood events, a fully coupled "physically-based" model for the combined numerical simulation of surface runoff and underground flows has been developed. Particular attention has been given to the selection of appropriate numerical schemes for the modelling of both processes and of their coupling. Surprisingly, the most difficult question to solve, from a numerical point of view, was not related to the coupling of two processes with contrasted kinetics such as surface and underground flows, but to the high-gradient infiltration fronts appearing in soils, a source of numerical diffusion, instabilities and sometimes divergence. Once elaborated, the model was successfully tested against results of high-quality experiments conducted on a laboratory sandy slope in the early eighties, which is still considered a reference hillslope experimental setting (Abdul & Gillham). The model appeared able to accurately simulate the pore pressure distributions observed in this 1.5 meter deep and wide laboratory hillslope, as well as its outflow hydrograph shapes and the measured respective contributions of direct runoff and groundwater to these outflow hydrographs. Building on this success, the same model has been used to simulate the response of a theoretical 100-meter wide and 10% sloped hillslope, with a 2 meter deep pervious soil and impervious bedrock. Three rain events have been tested: a 100 millimeter rainfall event over 10 days, over 1 day or over one hour. The simulated responses are not hydrologically realistic; in particular, the fast component of the response, which is generally observed in the real world and explains flood events, is almost absent from the simulated response. On reflection, the simulation results appear entirely logical given the proposed model. The simulated response, in fact a recession hydrograph, corresponds to a piston flow of a relatively uniformly saturated hillslope leading to a constant discharge over several days. Some ingredients are clearly missing from the proposed model to reproduce hydrologically sensible responses. Heterogeneities are necessary to generate a variety of residence times, and in particular preferential flows must clearly be present to generate the fast component of hillslope responses. The importance of preferential flows in hillslope hydrology has since been confirmed by several hillslope field experiments. We also let readers draw their own conclusions about the numerous numerical models that look very much like the model proposed here, though generally much more simplified, which represent watersheds as far too homogeneous, neglect heterogeneities and preferential flows, and yet claim to be "physically based"…

  19. On extending parallelism to serial simulators

    NASA Technical Reports Server (NTRS)

    Nicol, David; Heidelberger, Philip

    1994-01-01

    This paper describes an approach to discrete event simulation modeling that appears to be effective for developing portable and efficient parallel execution of models of large distributed systems and communication networks. In this approach, the modeler develops submodels using an existing sequential simulation modeling tool, using the full expressive power of the tool. A set of modeling language extensions permits automatically synchronized communication between submodels; however, the automation requires that any such communication take a nonzero amount of simulation time. Within this modeling paradigm, a variety of conservative synchronization protocols can transparently support conservative execution of submodels on potentially different processors. A specific implementation of this approach, U.P.S. (Utilitarian Parallel Simulator), is described, along with performance results on the Intel Paragon.
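
    A minimal sketch of the conservative, lookahead-based synchronization idea described here (communication between submodels always takes a nonzero amount of simulation time, which supplies the lookahead) is given below. It is not the U.P.S. implementation; the two logical processes, their events and the lookahead value are illustrative.

```python
import heapq

LOOKAHEAD = 1.0   # a message sent at time t cannot arrive before t + LOOKAHEAD

class LP:
    """One submodel (logical process) with its own clock and event queue."""
    def __init__(self, name):
        self.name, self.clock, self.events = name, 0.0, []
        self.channel_clock = 0.0          # latest clock value reported by the peer

    def schedule(self, time, label):
        heapq.heappush(self.events, (time, label))

    def safe_time(self):
        # events strictly earlier than this bound can be processed safely
        return self.channel_clock + LOOKAHEAD

    def process_safe_events(self, peer):
        while self.events and self.events[0][0] < self.safe_time():
            t, label = heapq.heappop(self.events)
            self.clock = t
            # every handled event sends a message to the peer, delayed by LOOKAHEAD
            peer.schedule(t + LOOKAHEAD, f"msg from {self.name}:{label}")
        # report our clock; the peer adds LOOKAHEAD to obtain its safe bound
        peer.channel_clock = max(peer.channel_clock, self.clock)

a, b = LP("A"), LP("B")
a.schedule(0.2, "init"); b.schedule(0.4, "init")
for _ in range(6):                 # round-robin until the demo rounds are exhausted
    a.process_safe_events(b)
    b.process_safe_events(a)
print("final clocks:", a.clock, b.clock)
```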

  20. Will climate change increase the risk for critical infrastructure failures in Europe due to extreme precipitation?

    NASA Astrophysics Data System (ADS)

    Nissen, Katrin; Ulbrich, Uwe

    2016-04-01

    An event based detection algorithm for extreme precipitation is applied to a multi-model ensemble of regional climate model simulations. The algorithm determines the extent, location, duration and severity of extreme precipitation events. We assume that precipitation in excess of the local present-day 10-year return value will potentially exceed the capacity of the drainage systems that protect critical infrastructure elements. This assumption is based on legislation for the design of drainage systems which is in place in many European countries. Thus, events exceeding the local 10-year return value are detected. In this study we distinguish between sub-daily events (3-hourly) with high precipitation intensities and long-duration events (1-3 days) with high precipitation amounts. The climate change simulations investigated here were conducted within the EURO-CORDEX framework and have a horizontal resolution of approximately 12.5 km. The period 1971-2100, forced with observed and scenario (RCP 8.5 and RCP 4.5) greenhouse gas concentrations, was analysed. Examined are changes in event frequency, event duration and size. The simulations show an increase in the number of extreme precipitation events for the future climate period over most of the area, which is strongest in Northern Europe. The strength and statistical significance of the signal increase with increasing greenhouse gas concentrations. This work has been conducted within the EU project RAIN (Risk Analysis of Infrastructure Networks in response to extreme weather).
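
    The core detection step, flagging precipitation in excess of the local present-day 10-year return value, can be sketched as below with an empirical return level estimated from annual maxima. The grid, the data and the use of an empirical quantile (rather than, e.g., a fitted extreme-value distribution) are simplifying assumptions; the actual algorithm additionally tracks event extent, duration and severity across connected grid cells.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic 3-hourly precipitation [mm], shape (time, y, x): 30 "years" of 120 steps
years, steps, ny, nx = 30, 120, 4, 5
precip = rng.gamma(shape=0.4, scale=6.0, size=(years * steps, ny, nx))

# local 10-yr return level approximated by the 90th percentile of the annual maxima
annual_max = precip.reshape(years, steps, ny, nx).max(axis=1)
return_level_10yr = np.quantile(annual_max, 1.0 - 1.0 / 10.0, axis=0)

# flag exceedances of the local threshold (here applied to the same synthetic series)
exceed = precip > return_level_10yr              # broadcast over the time axis
events_per_cell = exceed.sum(axis=0)
print("10-yr return level, cell (0,0):", round(float(return_level_10yr[0, 0]), 1), "mm")
print("exceedance count per cell:\n", events_per_cell)
```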

  1. The validity of flow approximations when simulating catchment-integrated flash floods

    NASA Astrophysics Data System (ADS)

    Bout, B.; Jetten, V. G.

    2018-01-01

    Within hydrological models, flow approximations are commonly used to reduce computation time. The validity of these approximations is strongly determined by flow height, flow velocity and the spatial resolution of the model. In this presentation, the validity and performance of the kinematic, diffusive and dynamic flow approximations are investigated for use in a catchment-based flood model. In particular, the validity during flood events and for varying spatial resolutions is investigated. The OpenLISEM hydrological model is extended to implement both these flow approximations and channel flooding based on dynamic flow. The flow approximations are used to recreate measured discharge in three catchments, among which is the hydrograph of the 2003 flood event in the Fella river basin. Furthermore, spatial resolutions are varied for the flood simulation in order to investigate the influence of spatial resolution on these flow approximations. Results show that the kinematic, diffusive and dynamic flow approximations provide the lowest to highest accuracy, respectively, in recreating measured discharge. Kinematic flow, which is commonly used in hydrological modelling, substantially over-estimates hydrological connectivity in simulations with a spatial resolution below 30 m. Since the spatial resolutions of models have increased strongly over the past decades, the usage of routed kinematic flow should be reconsidered. The combination of diffusive or dynamic overland flow and dynamic channel flooding provides high accuracy in recreating the 2003 Fella river flood event. Finally, in the case of flood events, spatial modelling of kinematic flow substantially over-estimates hydrological connectivity and flow concentration since pressure forces are neglected, leading to significant errors.
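
    As a reference point for the kinematic approximation discussed here, the sketch below routes overland flow down a single 1-D plane with an explicit kinematic-wave scheme (Manning friction). It is not the OpenLISEM implementation, and the slope, roughness and rainfall values are illustrative.

```python
import numpy as np

# 1-D kinematic wave on a plane: dh/dt + dq/dx = r,  q = (S0**0.5 / n) * h**(5/3)
L, nx = 100.0, 50                 # hillslope length [m], number of cells
dx = L / nx
S0, n_man = 0.05, 0.1             # slope [-], Manning roughness
r = 25e-3 / 3600.0                # rainfall excess: 25 mm/h -> [m/s]
dt, t_end = 0.2, 1800.0           # time step [s], 30-minute run (satisfies CFL here)

h = np.zeros(nx)                  # flow depth [m]
hydrograph = []
for _ in range(int(t_end / dt)):
    q = (S0 ** 0.5 / n_man) * h ** (5.0 / 3.0)        # unit discharge [m2/s]
    q_in = np.concatenate(([0.0], q[:-1]))            # upwind (upslope) inflow
    h = np.maximum(h + dt * (r - (q - q_in) / dx), 0.0)
    hydrograph.append(q[-1])                          # outflow at the downslope edge

print("peak unit discharge %.4f m2/s" % max(hydrograph))
```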

  2. Sources of suspended-sediment loads in the lower Nueces River watershed, downstream from Lake Corpus Christi to the Nueces Estuary, south Texas, 1958–2010

    USGS Publications Warehouse

    Ockerman, Darwin J.; Heitmuller, Franklin T.; Wehmeyer, Loren L.

    2013-01-01

    During 2010, additional suspended-sediment data were collected during selected runoff events to provide new data for model testing and to help better understand the sources of suspended-sediment loads. The model was updated and used to estimate and compare sediment yields from each of 64 subwatersheds comprising the lower Nueces River watershed study area for three selected runoff events: November 20-21, 2009, September 7-8, 2010, and September 20-21, 2010. These three runoff events were characterized by heavy rainfall centered near the study area, during which minimal streamflow and suspended-sediment load entered the lower Nueces River upstream from Wesley E. Seale Dam. During all three runoff events, model simulations showed that the greatest sediment yields originated from subwatersheds that were largely cropland. In particular, the Bayou Creek subwatersheds were major contributors of suspended-sediment load to the lower Nueces River during the selected runoff events. During the November 2009 runoff event, high suspended-sediment concentrations in the Nueces River water withdrawn for the City of Corpus Christi public-water supply caused problems during the water-treatment process, resulting in failure to meet State water-treatment standards for turbidity in drinking water. Model simulations of the November 2009 runoff event showed that the Bayou Creek subwatersheds were the primary source of suspended-sediment loads during that runoff event.

  3. Simulations of Cloud-Radiation Interaction Using Large-Scale Forcing Derived from the CINDY/DYNAMO Northern Sounding Array

    NASA Technical Reports Server (NTRS)

    Wang, Shuguang; Sobel, Adam H.; Fridlind, Ann; Feng, Zhe; Comstock, Jennifer M.; Minnis, Patrick; Nordeen, Michele L.

    2015-01-01

    The recently completed CINDY/DYNAMO field campaign observed two Madden-Julian oscillation (MJO) events in the equatorial Indian Ocean from October to December 2011. Prior work has indicated that the moist static energy anomalies in these events grew and were sustained to a significant extent by radiative feedbacks. We present here a study of radiative fluxes and clouds in a set of cloud-resolving simulations of these MJO events. The simulations are driven by the large-scale forcing data set derived from the DYNAMO northern sounding array observations, and carried out in a doubly periodic domain using the Weather Research and Forecasting (WRF) model. Simulated cloud properties and radiative fluxes are compared to those derived from the S-PolKa radar and satellite observations. To accommodate the uncertainty in simulated cloud microphysics, a number of single-moment (1M) and double-moment (2M) microphysical schemes in the WRF model are tested. The 1M schemes tend to underestimate radiative flux anomalies in the active phases of the MJO events, while the 2M schemes perform better, but can overestimate radiative flux anomalies. All the tested microphysics schemes exhibit biases in the shapes of the histograms of radiative fluxes and radar reflectivity. Histograms of radiative fluxes and brightness temperature indicate that radiative biases are not evenly distributed; the most significant bias occurs in rainy areas with OLR less than 150 W m-2 in the 2M schemes. Analysis of simulated radar reflectivities indicates that this radiative flux uncertainty is closely related to the simulated stratiform cloud coverage. Single-moment schemes underestimate stratiform cloudiness by a factor of 2, whereas 2M schemes simulate much more stratiform cloud.

  4. Novel high-fidelity realistic explosion damage simulation for urban environments

    NASA Astrophysics Data System (ADS)

    Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya

    2010-04-01

    Realistic building damage simulation has a significant impact in modern modeling and simulation systems, especially in the diverse panoply of military and civil applications where these simulation systems are widely used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and their surrounding entities. However, none of the existing building damage simulation systems sufficiently realizes the criteria of realism required for effective military applications. In this paper, we present a novel physics-based, high-fidelity and runtime-efficient explosion simulation system to realistically simulate destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions based on static and/or dynamic detonation conditions. The system also takes account of rubble pile formation and applies a generic and scalable multi-component based object representation to describe scene entities, and a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system utilizes a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system has the capability to realistically simulate rubble generation, rubble flyout and their primary and secondary impacts on surrounding objects including buildings, constructions, vehicles and pedestrians in clusters of sequential and parallel damage events.

  5. Effective precipitation duration for runoff peaks based on catchment modelling

    NASA Astrophysics Data System (ADS)

    Sikorska, A. E.; Viviroli, D.; Seibert, J.

    2018-01-01

    Although precipitation intensities may vary greatly during a single flood event, detailed information about these intensities may not be required to accurately simulate floods with a hydrological model, which reacts rather to cumulative precipitation sums. This raises two questions: to what extent is it important to preserve sub-daily precipitation intensities, and how long does it effectively rain from the hydrological point of view? Both questions might seem straightforward to answer with a direct analysis of past precipitation events, but this requires some arbitrary choices regarding the length of a precipitation event. To avoid these arbitrary decisions, here we present an alternative approach to characterize the effective length of a precipitation event which is based on runoff simulations of large floods. More precisely, we quantify the fraction of a day over which the daily precipitation has to be distributed to faithfully reproduce the large annual and seasonal floods that were generated by the hourly precipitation rate time series. New precipitation time series were generated by first aggregating the hourly observed data into daily totals and then evenly distributing them over sub-daily periods (n hours). These simulated time series were used as input to a hydrological bucket-type model, and the resulting runoff flood peaks were compared to those obtained when using the original precipitation time series. We then define the effective daily precipitation duration as the number of hours n for which the largest peaks are simulated best. For nine mesoscale Swiss catchments this effective daily precipitation duration was about half a day, which indicates that detailed information on precipitation intensities is not necessarily required to accurately estimate the peaks of the largest annual and seasonal floods. These findings support the use of simple disaggregation approaches to make use of past daily precipitation observations or daily precipitation simulations (e.g. from climate models) for hydrological modeling at an hourly time step.
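
    The disaggregation experiment can be sketched as follows: aggregate an hourly series to daily totals, spread each total evenly over n hours, route both series through a runoff model, and compare the simulated peaks. The single linear-reservoir bucket and the synthetic rainfall below are placeholders for the calibrated model and observations used in the study.

```python
import numpy as np

def bucket_runoff(rain_mm_per_hr, k=0.05):
    """Single linear reservoir: storage gains rainfall, releases Q = k * S each hour."""
    storage, q = 0.0, []
    for p in rain_mm_per_hr:
        storage += p
        out = k * storage
        storage -= out
        q.append(out)
    return np.array(q)

rng = np.random.default_rng(2)
hourly = rng.gamma(0.15, 4.0, size=24 * 60)          # synthetic 60-day hourly rain [mm]
peak_ref = bucket_runoff(hourly).max()               # peak from the original hourly series

daily = hourly.reshape(-1, 24).sum(axis=1)
for n in (3, 6, 12, 24):                             # candidate effective durations [h]
    disagg = np.zeros_like(hourly).reshape(-1, 24)
    disagg[:, :n] = (daily / n)[:, None]             # spread each daily total over n hours
    ratio = bucket_runoff(disagg.ravel()).max() / peak_ref
    print(f"n = {n:2d} h  ->  simulated/reference peak = {ratio:.2f}")
```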

  6. Simulation of Runoff Changes Caused by Cropland to Forest Conversion in the Upper Yangtze River Region, SW China

    PubMed Central

    Yu, Pengtao; Wang, Yanhui; Coles, Neil; Xiong, Wei; Xu, Lihong

    2015-01-01

    The "Grain for Green Project" is a country-wide ecological program to converse marginal cropland to forest, which has been implemented in China since 2002. To quantify influence of this significant vegetation change, Guansihe Hydrological (GSH) Model, a validated physically-based distributed hydrological model, was applied to simulate runoff responses to land use change in the Guansihe watershed that is located in the upper reaches of the Yangtze River basin in Southwestern China with an area of only 21.1 km2. Runoff responses to two single rainfall events, 90 mm and 206 mm respectively, were simulated for 16 scenarios of cropland to forest conversion. The model simulations indicated that the total runoff generated after conversion to forest was strongly dependent on whether the land was initially used for dry croplands without standing water in fields or constructed (or walled) paddy fields. The simulated total runoff generated from the two rainfall events displayed limited variation for the conversion of dry croplands to forest, while it strongly decreased after paddy fields were converted to forest. The effect of paddy terraces on runoff generation was dependent on the rainfall characteristics and antecedent moisture (or saturation) conditions in the fields. The reduction in simulated runoff generated from intense rainfall events suggested that afforestation and terracing might be effective in managing runoff and had the potential to mitigate flooding in southwestern China. PMID:26192181

  7. Influence of urban surface properties and rainfall characteristics on surface water flood outputs - insights from a physical modelling environment

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian; Yu, Dapeng

    2017-04-01

    Surface water (pluvial) flooding occurs when excess rainfall from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flood events pose a major hazard to urban regions across the world, with nearly two thirds of flood damages in the UK being caused by surface water flood events. The perceived risk of surface water flooding appears to have increased in recent years due to several factors, including (i) precipitation increases associated with climatic change and variability; (ii) population growth, meaning more people are occupying flood risk areas; and (iii) land-use changes. Because urban areas are often associated with a high proportion of impermeable land-uses (e.g. tarmacked or paved surfaces and buildings) and a reduced coverage of vegetated, permeable surfaces, urban surface water flood risk during high intensity precipitation events is often exacerbated. To investigate the influence of urbanisation and terrestrial factors on surface water flood outputs, scenarios of rainfall intensity, catchment slope, permeability, and building density/layout were designed within a novel 9 m2 physical modelling environment. The two-tiered physical model used consists of (i) a low-cost, nozzle-type rainfall simulator component which is able to simulate consistent, uniformly distributed rainfall events of varying duration and intensity, and (ii) a reconfigurable, modular plot surface. All experiments within the physical modelling environment were subjected to a spatiotemporally uniform 45-minute simulated rainfall event, while terrestrial factors on the physical model plot surface were altered systematically to investigate their hydrological response through modelled outflow and depth profiles. Results from the closed, controlled physical modelling experiments suggest that meteorological factors, such as the duration and intensity of simulated rainfall, and terrestrial factors, such as model slope, surface permeability and building density, have a significant influence on physical model hydrological outputs. For example, changes in building density across the urban model catchment are shown to result in hydrographs having (i) a more rapid rising limb; (ii) higher peak discharges; (iii) a reduction in the total hydrograph time; and (iv) a faster falling limb, with the dense building scenario having a 22% increase in peak discharge when compared to the no building scenario. Furthermore, the layout of buildings across the plot surface and their proximity to the outflow unit (i.e. downstream, upstream or to the side of the physical model outlet) is shown to influence outflow hydrograph response, with downstream concentrated building scenarios resulting in a delay in hydrograph onset time and a reduction in the time of the total outflow hydrograph event.

  8. The partly Aalen's model for recurrent event data with a dependent terminal event.

    PubMed

    Chen, Chyong-Mei; Shen, Pao-Sheng; Chuang, Ya-Wen

    2016-01-30

    Recurrent event data are commonly observed in biomedical longitudinal studies. In many instances, there exists a terminal event, which precludes the occurrence of additional repeated events, and usually there is also a nonignorable correlation between the terminal event and the recurrent events. In this article, we propose a partly Aalen's additive model with a multiplicative frailty for the rate function of the recurrent event process and assume a Cox frailty model for the terminal event time. A shared gamma frailty is used to describe the correlation between the two types of events. Consequently, this joint model can provide information on the temporal influence of absolute covariate effects on the rate of the recurrent event process, which is usually helpful in the decision-making process for physicians. An estimating equation approach is developed to estimate the marginal and association parameters in the joint model. The consistency of the proposed estimator is established. Simulation studies demonstrate that the proposed approach is appropriate for practical use. We apply the proposed method to a peritonitis cohort data set for illustration. Copyright © 2015 John Wiley & Sons, Ltd.
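
    A data-generating sketch of the shared gamma frailty structure described here is shown below: one frailty per subject multiplies both the recurrent-event intensity and the terminal-event hazard, so subjects with many recurrent events also tend to reach the terminal event earlier. The rates, frailty variance and censoring time are illustrative, and the paper's estimating-equation inference is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_subject(theta=0.5, rate_rec=1.0, rate_term=0.2, censor=5.0):
    """One subject: a shared gamma frailty Z multiplies both event intensities."""
    z = rng.gamma(shape=1.0 / theta, scale=theta)       # E[Z] = 1, Var[Z] = theta
    terminal = rng.exponential(1.0 / (z * rate_term))   # dependent terminal event time
    follow_up = min(terminal, censor)
    times, t = [], 0.0
    while True:                                          # homogeneous Poisson process given Z
        t += rng.exponential(1.0 / (z * rate_rec))
        if t > follow_up:
            break
        times.append(t)
    return times, follow_up, terminal < censor

n_events, n_deaths = [], 0
for _ in range(2000):
    times, fu, died = simulate_subject()
    n_events.append(len(times))
    n_deaths += died
print("mean recurrent events per subject:", round(np.mean(n_events), 2),
      "| terminal events observed:", n_deaths)
```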

  9. Structure, Function, and Applications of the Georgetown-Einstein (GE) Breast Cancer Simulation Model.

    PubMed

    Schechter, Clyde B; Near, Aimee M; Jayasekera, Jinani; Chandler, Young; Mandelblatt, Jeanne S

    2018-04-01

    The Georgetown University-Albert Einstein College of Medicine breast cancer simulation model (Model GE) has evolved over time in structure and function to reflect advances in knowledge about breast cancer, improvements in early detection and treatment technology, and progress in computing resources. This article describes the model and provides examples of model applications. The model is a discrete-event microsimulation of single life histories of women from multiple birth cohorts. Events are simulated in the absence of screening and treatment, and interventions are then applied to assess their impact on population breast cancer trends. The model accommodates differences in natural history associated with estrogen receptor (ER) and human epidermal growth factor receptor 2 (HER2) biomarkers, as well as conventional breast cancer risk factors. The approach for simulating breast cancer natural history is phenomenological, relying on dates, stage, and age of clinical and screen detection for a tumor molecular subtype without explicitly modeling tumor growth. The inputs to the model are regularly updated to reflect current practice. Numerous technical modifications, including the use of object-oriented programming (C++) and more efficient algorithms, along with hardware advances, have increased program efficiency, permitting simulations of large samples. The model results consistently match key temporal trends in US breast cancer incidence and mortality. The model has been used in collaboration with other CISNET models to assess cancer control policies and will be applied to evaluate clinical trial design, recurrence risk, and polygenic risk-based screening.

  10. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    PubMed

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for selecting the best covariate to split on from that of the best split-point search for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of which have more than two levels (many split-points). The second dataset is based on the survival of patients with extremely drug-resistant tuberculosis (XDR TB) and consists of mainly categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data that consist of covariates with many split-points, based on the values of the bootstrap cross-validated estimates for integrated Brier scores. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates of the dataset in question.

  11. Modeling sediment yield in small catchments at event scale: Model comparison, development and evaluation

    NASA Astrophysics Data System (ADS)

    Tan, Z.; Leung, L. R.; Li, H. Y.; Tesfa, T. K.

    2017-12-01

    Sediment yield (SY) has significant impacts on river biogeochemistry and aquatic ecosystems, but it is rarely represented in Earth System Models (ESMs). Existing SY models focus on estimating SY from large river basins or individual catchments, so it is not clear how well they simulate SY in ESMs at larger spatial scales and globally. In this study, we compare the strengths and weaknesses of eight well-known SY models in simulating annual mean SY at about 400 small catchments ranging in size from 0.22 to 200 km2 in the US, Canada and Puerto Rico. In addition, we also investigate the performance of these models in simulating event-scale SY at six catchments in the US using high-quality hydrological inputs. The model comparison shows that none of the models can reproduce the SY at large spatial scales, but the Morgan model performs better than the others despite its simplicity. In all model simulations, large underestimates occur in catchments with very high SY. A possible pathway to reduce the discrepancies is to incorporate sediment detachment by landsliding, which is currently not included in the models being evaluated. We propose a new SY model that is based on the Morgan model but includes a landsliding soil detachment scheme that is under development. Along with the results of the model comparison and evaluation, preliminary findings from the revised Morgan model will be presented.

  12. Modeling 2D and 3D diffusion.

    PubMed

    Saxton, Michael J

    2007-01-01

    Modeling obstructed diffusion is essential to the understanding of diffusion-mediated processes in the crowded cellular environment. Simple Monte Carlo techniques for modeling obstructed random walks are explained and related to Brownian dynamics and more complicated Monte Carlo methods. Random number generation is reviewed in the context of random walk simulations. Programming techniques and event-driven algorithms are discussed as ways to speed simulations.
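
    A minimal Monte Carlo sketch of obstructed diffusion on a 2-D lattice is shown below: immobile point obstacles block a fraction of sites, moves into blocked sites are rejected, and the mean-squared displacement is compared with the unobstructed case. The lattice size, obstacle fraction and walk length are illustrative assumptions.

```python
import random

def msd_obstructed(obstacle_fraction=0.3, size=200, n_walkers=400,
                   n_steps=1000, seed=3):
    """Mean-squared displacement of lattice random walkers among point obstacles."""
    random.seed(seed)
    blocked = {(x, y) for x in range(size) for y in range(size)
               if random.random() < obstacle_fraction}
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    total_sq = 0.0
    for _ in range(n_walkers):
        while True:                                       # start on a free site
            x0, y0 = random.randrange(size), random.randrange(size)
            if (x0, y0) not in blocked:
                break
        x, y = x0, y0
        for _ in range(n_steps):
            dx, dy = random.choice(moves)
            nx, ny = x + dx, y + dy
            if (nx % size, ny % size) not in blocked:     # obstacles tile periodically
                x, y = nx, ny                             # blocked moves are rejected
        total_sq += (x - x0) ** 2 + (y - y0) ** 2
    return total_sq / n_walkers

print("MSD, 30% obstacles:", round(msd_obstructed(0.3), 1))
print("MSD, free lattice :", round(msd_obstructed(0.0), 1))
```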

  13. Exact subthreshold integration with continuous spike times in discrete-time neural network simulations.

    PubMed

    Morrison, Abigail; Straube, Sirko; Plesser, Hans Ekkehard; Diesmann, Markus

    2007-01-01

    Very large networks of spiking neurons can be simulated efficiently in parallel under the constraint that spike times are bound to an equidistant time grid. Within this scheme, the subthreshold dynamics of a wide class of integrate-and-fire-type neuron models can be integrated exactly from one grid point to the next. However, the loss in accuracy caused by restricting spike times to the grid can have undesirable consequences, which has led to interest in interpolating spike times between the grid points to retrieve an adequate representation of network dynamics. We demonstrate that the exact integration scheme can be combined naturally with off-grid spike events found by interpolation. We show that by exploiting the existence of a minimal synaptic propagation delay, the need for a central event queue is removed, so that the precision of event-driven simulation on the level of single neurons is combined with the efficiency of time-driven global scheduling. Further, for neuron models with linear subthreshold dynamics, even local event queuing can be avoided, resulting in much greater efficiency on the single-neuron level. These ideas are exemplified by two implementations of a widely used neuron model. We present a measure for the efficiency of network simulations in terms of their integration error and show that for a wide range of input spike rates, the novel techniques we present are both more accurate and faster than standard techniques.
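
    The combination of grid-based exact integration with off-grid spike times described here can be illustrated for a leaky integrate-and-fire neuron with delta-current synapses: between grid points the membrane potential is propagated with the exact exponential solution, and a threshold crossing is placed between grid points by linear interpolation. This is an illustrative toy under assumed parameters, not the implementation discussed in the record; higher-order interpolation of the crossing is equally possible.

```python
import math

tau, R = 10.0, 1.0                    # membrane time constant [ms], resistance
I_dc, V_th, V_reset = 16.0, 15.0, 0.0 # constant drive, threshold, reset [mV]
h = 1.0                               # simulation grid step [ms]
prop = math.exp(-h / tau)             # exact subthreshold propagator over one step
V_inf = R * I_dc                      # steady-state voltage for the constant drive

# synaptic input: voltage jump [mV] applied at the nearest grid point (delta currents)
spikes_in = {20.0: 2.0, 21.0: 2.0, 55.0: -3.0}

V, t, out_spikes = 0.0, 0.0, []
for _ in range(100):
    V_prev, t_prev = V, t
    V = V_inf + (V - V_inf) * prop          # exact integration from grid point to grid point
    t += h
    V += spikes_in.get(t, 0.0)
    if V >= V_th:
        # off-grid spike time by linear interpolation between the two grid points
        frac = (V_th - V_prev) / (V - V_prev)
        out_spikes.append(t_prev + frac * h)
        V = V_reset
print("spike times [ms]:", [round(s, 2) for s in out_spikes])
```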

  14. Application of RADSAFE to Model Single Event Upset Response of a 0.25 micron CMOS SRAM

    NASA Technical Reports Server (NTRS)

    Warren, Kevin M.; Weller, Robert A.; Sierawski, Brian; Reed, Robert A.; Mendenhall, Marcus H.; Schrimpf, Ronald D.; Massengill, Lloyd; Porter, Mark; Wilkerson, Jeff; LaBel, Kenneth A.; hide

    2006-01-01

    The RADSAFE simulation framework is described and applied to model Single Event Upsets (SEU) in a 0.25 micron CMOS 4Mbit Static Random Access Memory (SRAM). For this circuit, the RADSAFE approach produces trends similar to those expected from classical models, but more closely represents the physical mechanisms responsible for SEU in the SRAM circuit.

  15. The Role of Temporal Evolution in Modeling Atmospheric Emissions from Tropical Fires

    NASA Technical Reports Server (NTRS)

    Marlier, Miriam E.; Voulgarakis, Apostolos; Shindell, Drew T.; Faluvegi, Gregory S.; Henry, Candise L.; Randerson, James T.

    2014-01-01

    Fire emissions associated with tropical land use change and maintenance influence atmospheric composition, air quality, and climate. In this study, we explore the effects of representing fire emissions at daily versus monthly resolution in a global composition-climate model. We find that simulations of aerosols are impacted more by the temporal resolution of fire emissions than trace gases such as carbon monoxide or ozone. Daily-resolved datasets concentrate emissions from fire events over shorter time periods and allow them to more realistically interact with model meteorology, reducing how often emissions are concurrently released with precipitation events and in turn increasing peak aerosol concentrations. The magnitude of this effect varies across tropical ecosystem types, ranging from smaller changes in modeling the low intensity, frequent burning typical of savanna ecosystems to larger differences when modeling the short-term, intense fires that characterize deforestation events. The utility of modeling fire emissions at a daily resolution also depends on the application, such as modeling exceedances of particulate matter concentrations over air quality guidelines or simulating regional atmospheric heating patterns.

  16. Statistical Analysis of Notational AFL Data Using Continuous Time Markov Chains

    PubMed Central

    Meyer, Denny; Forbes, Don; Clarke, Stephen R.

    2006-01-01

    Animal biologists commonly use continuous time Markov chain models to describe patterns of animal behaviour. In this paper we consider the use of these models for describing AFL football. In particular we test the assumptions for continuous time Markov chain models (CTMCs), with time, distance and speed values associated with each transition. Using a simple event categorisation it is found that a semi-Markov chain model is appropriate for this data. This validates the use of Markov Chains for future studies in which the outcomes of AFL matches are simulated. Key Points: A comparison of four AFL matches suggests similarity in terms of transition probabilities for events and the mean times, distances and speeds associated with each transition. The Markov assumption appears to be valid. However, the speed, time and distance distributions associated with each transition are not exponential, suggesting that a semi-Markov model can be used to model and simulate play. Team-identified events and directions associated with transitions are required to develop the model into a tool for the prediction of match outcomes. PMID:24357946
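
    The modelling chain described here can be sketched in two steps: estimate transition probabilities and holding times from an event-coded match sequence, then simulate play as a (semi-)Markov chain. The event categories and sequence below are made up for illustration, not the AFL notational data, and the simulation uses mean holding times rather than the full empirical distributions the paper argues for.

```python
import random
from collections import defaultdict

# notional event sequence: (state, holding_time_seconds)
sequence = [("kick", 3.1), ("mark", 2.0), ("kick", 3.5), ("handball", 1.2),
            ("kick", 2.8), ("turnover", 4.0), ("kick", 3.0), ("mark", 2.2),
            ("handball", 1.0), ("kick", 3.3), ("goal", 6.0), ("kick", 2.9)]

counts = defaultdict(lambda: defaultdict(int))
hold = defaultdict(list)
for (s, dt), (s_next, _) in zip(sequence, sequence[1:]):
    counts[s][s_next] += 1            # empirical transition counts
    hold[s].append(dt)                # holding time spent in state s

P = {s: {t: c / sum(row.values()) for t, c in row.items()} for s, row in counts.items()}
mean_hold = {s: sum(v) / len(v) for s, v in hold.items()}

def simulate(start="kick", n=15, seed=4):
    """Semi-Markov simulation: empirical transition probabilities + mean holding times."""
    random.seed(seed)
    s, t, path = start, 0.0, []
    for _ in range(n):
        path.append((round(t, 1), s))
        t += mean_hold.get(s, 1.0)
        nxt = P.get(s)
        if not nxt:
            break
        s = random.choices(list(nxt), weights=nxt.values())[0]
    return path

print(simulate())
```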

  17. Statistical Analysis of Notational AFL Data Using Continuous Time Markov Chains.

    PubMed

    Meyer, Denny; Forbes, Don; Clarke, Stephen R

    2006-01-01

    Animal biologists commonly use continuous time Markov chain models to describe patterns of animal behaviour. In this paper we consider the use of these models for describing AFL football. In particular we test the assumptions for continuous time Markov chain models (CTMCs), with time, distance and speed values associated with each transition. Using a simple event categorisation it is found that a semi-Markov chain model is appropriate for this data. This validates the use of Markov Chains for future studies in which the outcomes of AFL matches are simulated. Key Points: A comparison of four AFL matches suggests similarity in terms of transition probabilities for events and the mean times, distances and speeds associated with each transition. The Markov assumption appears to be valid. However, the speed, time and distance distributions associated with each transition are not exponential, suggesting that a semi-Markov model can be used to model and simulate play. Team-identified events and directions associated with transitions are required to develop the model into a tool for the prediction of match outcomes.

  18. Ice-shelf collapse from subsurface warming as a trigger for Heinrich events

    PubMed Central

    Marcott, Shaun A.; Clark, Peter U.; Padman, Laurie; Klinkhammer, Gary P.; Springer, Scott R.; Liu, Zhengyu; Otto-Bliesner, Bette L.; Carlson, Anders E.; Ungerer, Andy; Padman, June; He, Feng; Cheng, Jun; Schmittner, Andreas

    2011-01-01

    Episodic iceberg-discharge events from the Hudson Strait Ice Stream (HSIS) of the Laurentide Ice Sheet, referred to as Heinrich events, are commonly attributed to internal ice-sheet instabilities, but their systematic occurrence at the culmination of a large reduction in the Atlantic meridional overturning circulation (AMOC) indicates a climate control. We report Mg/Ca data on benthic foraminifera from an intermediate-depth site in the northwest Atlantic and results from a climate-model simulation that reveal basin-wide subsurface warming at the same time as large reductions in the AMOC, with temperature increasing by approximately 2 °C over a 1–2 kyr interval prior to a Heinrich event. In simulations with an ocean model coupled to a thermodynamically active ice shelf, the increase in subsurface temperature increases basal melt rate under an ice shelf fronting the HSIS by a factor of approximately 6. By analogy with recent observations in Antarctica, the resulting ice-shelf loss and attendant HSIS acceleration would produce a Heinrich event. PMID:21808034

  19. Improving Energy Efficiency for the Vehicle Assembly Industry: A Discrete Event Simulation Approach

    NASA Astrophysics Data System (ADS)

    Oumer, Abduaziz; Mekbib Atnaw, Samson; Kie Cheng, Jack; Singh, Lakveer

    2016-11-01

    This paper presents a Discrete Event Simulation (DES) model for investigating and improving energy efficiency in a vehicle assembly line. The car manufacturing industry is one of the highest energy-consuming industries. Using the Rockwell Arena DES package, a detailed model was constructed for an actual vehicle assembly plant. The sources of energy considered in this research are electricity and fuel, which are the two main types of energy sources used in a typical vehicle assembly plant. The model depicts the performance measurement for process-specific energy measures of the painting, welding, and assembling processes. A sound energy efficiency model within this industry has a two-fold advantage: reducing CO2 emissions and the costs associated with fuel and electricity consumption. The paper starts with an overview of challenges in energy consumption within the facilities of an automotive assembly line and highlights the parameters for energy efficiency. The results of the simulation model indicated improvements toward the energy-saving objectives and reduced costs.

  20. Using Multi-scale Dynamic Rupture Models to Improve Ground Motion Estimates: ALCF-2 Early Science Program Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ely, Geoffrey P.

    2013-10-31

    This project uses dynamic rupture simulations to investigate high-frequency seismic energy generation. The relevant phenomena (frictional breakdown, shear heating, effective normal-stress fluctuations, material damage, etc.) controlling rupture are strongly interacting and span many orders of magnitude in spatial scale, requiring high-resolution simulations that couple disparate physical processes (e.g., elastodynamics, thermal weakening, pore-fluid transport, and heat conduction). Compounding the computational challenge, we know that natural faults are not planar, but instead have roughness that can be approximated by power laws, potentially leading to large, multiscale fluctuations in normal stress. The capacity to perform 3D rupture simulations that couple these processes will provide guidance for constructing appropriate source models for high-frequency ground motion simulations. The improved rupture models from our multi-scale dynamic rupture simulations will be used to conduct physics-based (3D waveform modeling-based) probabilistic seismic hazard analysis (PSHA) for California. These calculations will provide numerous important seismic hazard results, including a state-wide extended earthquake rupture forecast with rupture variations for all significant events, a synthetic seismogram catalog for thousands of scenario events and more than 5000 physics-based seismic hazard curves for California.

  1. SAPS simulation with GITM/UCLA-RCM coupled model

    NASA Astrophysics Data System (ADS)

    Lu, Y.; Deng, Y.; Guo, J.; Zhang, D.; Wang, C. P.; Sheng, C.

    2017-12-01

    Ion velocities in the subauroral region observed by satellites during storm time often show a significant westward component. These high-speed westward streams are distinct from the convection pattern and are known as subauroral polarization streams (SAPS). During the March 17th, 2013 storm, the DMSP F18 satellite observed several SAPS cases while crossing the subauroral region. In this study, the Global Ionosphere Thermosphere Model (GITM) has been coupled to the UCLA-RCM model to simulate the impact of SAPS during the March 2013 event on the ionosphere/thermosphere. The particle precipitation and electric field from RCM are used to drive GITM, and the conductance calculated from GITM is fed back to RCM to make the coupling self-consistent. GITM simulations with different SAPS specifications will be compared, and the simulated neutral wind will be compared with GOCE satellite observations. The comparison between runs with and without SAPS will isolate the effect of SAPS and illustrate its impact on TIDs/TADs propagating in both poleward and equatorward directions.

  2. Projecting Future Changes in Extreme Weather During the North American Monsoon in the Southwest with High Resolution, Convective-Permitting Regional Atmospheric Modeling

    NASA Astrophysics Data System (ADS)

    Chang, H. I.; Castro, C. L.; Luong, T. M.; Lahmers, T.; Jares, M.; Carrillo, C. M.

    2014-12-01

    Most severe weather during the North American monsoon in the Southwest U.S. occurs in association with organized convection, including microbursts, dust storms, flash flooding and lightning. Our objective is to project how monsoon severe weather is changing due to anthropogenic global warming. We first consider a dynamically downscaled reanalysis (35 km grid spacing), generated with the Weather Research and Forecasting (WRF) model during the period 1948-2010. Individual severe weather events, identified by favorable thermodynamic conditions of instability and precipitable water, are then simulated in short-term, numerical weather prediction-type simulations of 24 h at a convective-permitting scale (2 km grid spacing). Changes in the character of severe weather events within this period likely reflect long-term climate change driven by anthropogenic forcing. Next, we apply the identical model simulation and analysis procedures to several dynamically downscaled CMIP3 and CMIP5 models for the period 1950-2100, to assess how monsoon severe weather may change in the future and if these changes correspond with what is already occurring per the downscaled reanalysis and available observational data. The CMIP5 models we are downscaling (HadGEM and MPI-ECHAM6) will be included as part of North American CORDEX. The regional model experimental design for severe weather event projection reasonably accounts for the known operational forecast prerequisites for severe monsoon weather. The convective-permitting simulations show that monsoon convection appears to be reasonably well captured with the use of the dynamically downscaled reanalysis, in comparison to Stage IV precipitation data. The regional model tends to initiate convection too early, though correctly simulates the diurnal maximum in convection in the afternoon and subsequent westward propagation of thunderstorms. Projected changes in extreme event precipitation will be described in relation to the long-term changes in thermodynamic and dynamic forcing mechanisms for severe weather. Results from this project will be used for climate change impacts assessment for U.S. military installations in the region.

  3. Dual Interlocked Logic for Single-Event Transient Mitigation

    DTIC Science & Technology

    2017-03-01

    SPICE simulation and fault-injection analysis. Exemplar SPICE simulations have been performed in a 32nm partially-depleted silicon-on-insulator...in this work. The model has been validated at the 32nm SOI technology node with extensive heavy-ion data [7]. For the SPICE simulations, three

  4. Observations and predictability of gap winds in a steep, narrow, fire-prone canyon in central Idaho, USA

    NASA Astrophysics Data System (ADS)

    Wagenbrenner, N. S.; Forthofer, J.; Gibson, C.; Lamb, B. K.

    2017-12-01

    Frequent strong gap winds were measured in a deep, steep, wildfire-prone river canyon of central Idaho, USA during July-September 2013. Analysis of archived surface pressure data indicates that the gap wind events were driven by regional scale surface pressure gradients. The events always occurred between 0400 and 1200 LT and typically lasted 3-4 hours. The timing makes these events particularly hazardous for wildland firefighting applications since the morning is typically a period of reduced fire activity and unsuspecting firefighters could be easily endangered by the onset of strong downcanyon winds. The gap wind events were not explicitly forecast by operational numerical weather prediction (NWP) models due to the small spatial scale of the canyon (approximately 1-2 km wide) compared to the horizontal resolution of operational NWP models (3 km or greater). Custom WRF simulations initialized with NARR data were run at 1 km horizontal resolution to assess whether higher resolution NWP could accurately simulate the observed gap winds. Here, we show that the 1 km WRF simulations captured many of the observed gap wind events, although the strength of the events was underpredicted. We also present evidence from these WRF simulations which suggests that the Salmon River Canyon is near the threshold of WRF-resolvable terrain features when the standard WRF coordinate system and discretization schemes are used. Finally, we show that the strength of the gap wind events can be predicted reasonably well as a function of the surface pressure gradient across the gap, which could be useful in the absence of high-resolution NWP. These are important findings for wildland firefighting applications in narrow gaps where routine forecasts may not provide warning for wind effects induced by high-resolution terrain features.
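
    The closing point, that gap wind strength can be predicted reasonably well from the surface pressure gradient across the gap, amounts to a simple regression; the sketch below fits one with numpy.polyfit on synthetic placeholder data rather than the actual observations, and the coefficients are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
# synthetic placeholders: pressure difference across the gap [hPa] and peak gap wind [m/s]
dp = rng.uniform(0.5, 6.0, size=40)
wind = 2.0 + 2.5 * dp + rng.normal(0.0, 1.5, size=40)

slope, intercept = np.polyfit(dp, wind, deg=1)          # least-squares linear fit
predicted = intercept + slope * dp
rmse = np.sqrt(np.mean((predicted - wind) ** 2))
print(f"wind ~ {intercept:.1f} + {slope:.1f} * dp   (RMSE {rmse:.1f} m/s)")
print("forecast for dp = 4 hPa:", round(intercept + slope * 4.0, 1), "m/s")
```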

  5. Accounting for costs, QALYs, and capacity constraints: using discrete-event simulation to evaluate alternative service delivery and organizational scenarios for hospital-based glaucoma services.

    PubMed

    Crane, Glenis J; Kymes, Steven M; Hiller, Janet E; Casson, Robert; Martin, Adam; Karnon, Jonathan D

    2013-11-01

    Decision-analytic models are routinely used as a framework for cost-effectiveness analyses of health care services and technologies; however, these models mostly ignore resource constraints. In this study, we use a discrete-event simulation model to inform a cost-effectiveness analysis of alternative options for the organization and delivery of clinical services in the ophthalmology department of a public hospital. The model is novel, given that it represents both disease outcomes and resource constraints in a routine clinical setting. A 5-year discrete-event simulation model representing glaucoma patient services at the Royal Adelaide Hospital (RAH) was implemented and calibrated to patient-level data. The data were sourced from routinely collected waiting and appointment lists, patient record data, and the published literature. Patient-level costs and quality-adjusted life years were estimated for a range of alternative scenarios, including combinations of alternate follow-up times, booking cycles, and treatment pathways. The model shows that a) extending booking cycle length from 4 to 6 months, b) extending follow-up visit times by 2 to 3 months, and c) using laser in preference to medication are more cost-effective than current practice at the RAH eye clinic. The current simulation model provides a useful tool for informing improvements in the organization and delivery of glaucoma services at a local level (e.g., within a hospital), on the basis of expected effects on costs and health outcomes while accounting for current capacity constraints. Our model may be adapted to represent glaucoma services at other hospitals, whereas the general modeling approach could be applied to many other clinical service areas.

  6. A satellite and model based flood inundation climatology of Australia

    NASA Astrophysics Data System (ADS)

    Schumann, G.; Andreadis, K.; Castillo, C. J.

    2013-12-01

    To date there is no coherent and consistent database on observed or simulated flood event inundation and magnitude at large scales (continental to global). The only compiled data set showing a consistent history of flood inundation area and extent at a near global scale is provided by the MODIS-based Dartmouth Flood Observatory. However, MODIS satellite imagery is only available from 2000 and is hampered by a number of issues associated with flood mapping using optical images (e.g. classification algorithms, cloud cover, vegetation). Here, we present for the first time a proof-of-concept study in which we employ a computationally efficient 2-D hydrodynamic model (LISFLOOD-FP) complemented with a sub-grid channel formulation to generate a complete flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent. The model was built completely from freely available SRTM-derived data, including channel widths, bank heights and floodplain topography, which was corrected for vegetation canopy height using a global ICESat canopy dataset. Channel hydraulics were resolved using actual channel data and bathymetry was estimated within the model using hydraulic geometry. On the floodplain, the model simulated the flow paths and inundation variables at a 1 km resolution. The developed model was run over a period of 40 years and a floodplain inundation climatology was generated and compared to satellite flood event observations. Our proof-of-concept study demonstrates that this type of model can reliably simulate past flood events with reasonable accuracy in both time and space. The Australian model was forced with both observed flow climatology and VIC-simulated flows in order to assess the feasibility of a model-based flood inundation climatology at the global scale.

  7. Disaster Response Modeling Through Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Wang, Jeffrey; Gilmer, Graham

    2012-01-01

    Organizations today are required to plan against a rapidly changing, high-cost environment. This is especially true for first responders to disasters and other incidents, where critical decisions must be made in a timely manner to save lives and resources. Discrete-event simulations enable organizations to make better decisions by visualizing complex processes and the impact of proposed changes before they are implemented. A discrete-event simulation using Simio software has been developed to effectively analyze and quantify the imagery capabilities of domestic aviation resources conducting relief missions. This approach has helped synthesize large amounts of data to better visualize process flows, manage resources, and pinpoint capability gaps and shortfalls in disaster response scenarios. Simulation outputs and results have supported decision makers in the understanding of high risk locations, key resource placement, and the effectiveness of proposed improvements.

  8. Modeling Powered Aerodynamics for the Orion Launch Abort Vehicle Aerodynamic Database

    NASA Technical Reports Server (NTRS)

    Chan, David T.; Walker, Eric L.; Robinson, Philip E.; Wilson, Thomas M.

    2011-01-01

    Modeling the aerodynamics of the Orion Launch Abort Vehicle (LAV) has presented many technical challenges to the developers of the Orion aerodynamic database. During a launch abort event, the aerodynamic environment around the LAV is very complex as multiple solid rocket plumes interact with each other and the vehicle. It is further complicated by vehicle separation events such as between the LAV and the launch vehicle stack or between the launch abort tower and the crew module. The aerodynamic database for the LAV was developed mainly from wind tunnel tests involving powered jet simulations of the rocket exhaust plumes, supported by computational fluid dynamic simulations. However, limitations in both methods have made it difficult to properly capture the aerodynamics of the LAV in experimental and numerical simulations. These limitations have also influenced decisions regarding the modeling and structure of the aerodynamic database for the LAV and led to compromises and creative solutions. Two database modeling approaches are presented in this paper (incremental aerodynamics and total aerodynamics), with examples showing strengths and weaknesses of each approach. In addition, the unique problems presented to the database developers by the large data space required for modeling a launch abort event illustrate the complexities of working with multi-dimensional data.

  9. Earthquake cycle modeling of multi-segmented faults: dynamic rupture and ground motion simulation of the 1992 Mw 7.3 Landers earthquake.

    NASA Astrophysics Data System (ADS)

    Petukhin, A.; Galvez, P.; Somerville, P.; Ampuero, J. P.

    2017-12-01

    We perform earthquake cycle simulations to study the characteristics of source scaling relations and strong ground motions in multi-segmented fault ruptures. For earthquake cycle modeling, a quasi-dynamic solver (QDYN, Luo et al, 2016) is used to nucleate events and the fully dynamic solver (SPECFEM3D, Galvez et al., 2014, 2016) is used to simulate earthquake ruptures. The Mw 7.3 Landers earthquake has been chosen as a target earthquake to validate our methodology. The SCEC fault geometry for the three-segmented Landers rupture is included and extended at both ends to a total length of 200 km. We followed the 2-D spatially correlated Dc distributions based on Hillers et al. (2007), which associate the Dc distribution with different degrees of fault maturity. The fault maturity is related to the variability of Dc on a microscopic scale. Large variations of Dc represent immature faults and lower variations of Dc represent mature faults. Moreover we impose a taper (a-b) at the fault edges and limit the fault depth to 15 km. Using these settings, earthquake cycle simulations are performed to nucleate seismic events on different sections of the fault, and dynamic rupture modeling is used to propagate the ruptures. The fault segmentation brings complexity into the rupture process. For instance, the change of strike between fault segments enhances strong variations of stress. In fact, Oglesby and Mai (2012) show the normal stress varies from positive (clamping) to negative (unclamping) between fault segments, which leads to favorable or unfavorable conditions for rupture growth. To replicate these complexities and the effect of fault segmentation in the rupture process, we perform earthquake cycles with dynamic rupture modeling and generate events similar to the Mw 7.3 Landers earthquake. We extract the asperities of these events and analyze the scaling relations between rupture area, average slip and combined area of asperities versus moment magnitude. Finally, the simulated ground motions will be validated by comparison of simulated response spectra with recorded response spectra and with response spectra from ground motion prediction models. This research is sponsored by the Japan Nuclear Regulation Authority.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkins, Casey J.; Brigantic, Robert T.; Keating, Douglas H.

    There is a need to develop and demonstrate technical approaches for verifying potential future agreements to limit and reduce total warhead stockpiles. To facilitate this aim, warhead monitoring systems employ both concepts of operations (CONOPS) and technologies. A systems evaluation approach can be used to assess the relative performance of CONOPS and technologies in their ability to achieve monitoring system objectives, which include: 1) confidence that a treaty accountable item (TAI) initialized by the monitoring system is as declared; 2) confidence that there is no undetected diversion from the monitoring system; and 3) confidence that a TAI is dismantled as declared. Although there are many quantitative methods that can be used to assess system performance for the above objectives, this paper focuses on a simulation perspective, primarily for the ability to support analysis of the probabilities that are used to define operating characteristics of CONOPS and technologies. This paper describes a discrete event simulation (DES) model comprising three major sub-models: TAI lifecycle flow, monitoring activities, and declaration behavior. The DES model seeks to capture all processes and decision points associated with the progressions of virtual TAIs, with notional characteristics, through the monitoring system from initialization through dismantlement. The simulation updates TAI progression (i.e., whether the generated test objects are accepted and rejected at the appropriate points) all the way through dismantlement. Evaluation of TAI lifecycles primarily serves to assess how the order, frequency, and combination of functions in the CONOPS affect system performance as a whole. It is important, however, to note that discrete event simulation is also capable (at a basic level) of addressing vulnerabilities in the CONOPS and interdependencies between individual functions as well. This approach is beneficial because it does not rely on complex mathematical models, but instead attempts to recreate the real world system as a decision and event driven simulation. Finally, because the simulation addresses warhead confirmation, chain of custody, and warhead dismantlement in a modular fashion, a discrete-event model could be easily adapted to multiple CONOPS for the exploration of a large number of “what if” scenarios.

  11. Diagnosing Possible Anthropogenic Contributions to Heavy Colorado Rainfall in September 2013

    NASA Astrophysics Data System (ADS)

    Pall, Pardeep; Patricola, Christina; Wehner, Michael; Stone, Dáithí; Paciorek, Christopher; Collins, William

    2015-04-01

    Unusually heavy rainfall occurred over the Colorado Front Range during early September 2013, with record or near-record totals recorded in several locations. It was associated predominantly with a stationary large-scale weather pattern (akin to the North American Monsoon, which occurs earlier in the year) that drove a strong plume of deep moisture inland from the Gulf of Mexico against the Front Range foothills. The resulting floods across the South Platte River basin impacted several thousands of people and many homes, roads, and businesses. To diagnose possible anthropogenic contributions to the odds of such heavy rainfall, we adapt an existing event attribution paradigm of modelling an 'event that was' for September 2013 and comparing it to a modelled 'event that might have been' for that same time but for the absence of historical anthropogenic drivers of climate. Specifically, we first perform 'event that was' simulations with the regional Weather Research and Forecasting (WRF) model at 12 km resolution over North America, driven by NCEP2 re-analysis. We then re-simulate, having adjusted the re-analysis to 'event that might have been' conditions by modifying atmospheric greenhouse gas and other pollutant concentrations, temperature, humidity, and winds, as well as sea ice coverage and sea-surface temperatures - all according to estimates from global climate model simulations. Thus our findings are highly conditional on the driving re-analysis and adjustments therein, but the setup allows us to elucidate possible mechanisms responsible for heavy Colorado rainfall in September 2013. Our model results suggest that, given an insignificant change in the pattern of large-scale driving weather, there is an increase in atmospheric water vapour under anthropogenic climate warming, leading to a substantial increase in the probability of heavy rainfall occurring over the South Platte River basin in September 2013.

  12. The more extreme nature of North American monsoon precipitation in the Southwestern United States

    NASA Astrophysics Data System (ADS)

    Chang, H. I.; Luong, T. M.; Castro, C. L.; Lahmers, T. M.; Adams, D. K.; Ochoa-Moya, C.

    2017-12-01

    Most severe weather in the Southwestern United States occurs during the North American monsoon. This research examines how monsoon extreme weather events will change with respect to occurrence and intensity. A new technique for severe weather event projection has been developed, using convective-permitting regional atmospheric modeling of the days with the highest instability and atmospheric moisture. The guiding principle is to use a weather-forecast-based approach to climate change projection, with a modeling paradigm in which organized convective structures and their behavior are explicitly physically represented in the simulation design. Of particular interest is the simulation of severe weather events caused by mesoscale convective systems (MCSs), which account for a greater proportion of monsoon rainfall downwind of the Mogollon Rim in Arizona, in the central and southwestern portions of the state. The convective-permitting model simulations are performed for identified severe weather event days for both historical and future climate projections, similar to an operational weather forecast. There have been significant long-term changes in atmospheric thermodynamic and dynamic conditions over the past sixty years. Monsoon thunderstorms are tending to be more 'thermodynamically dominated', with less tendency to organize and propagate. Though there tend to be fewer strong, organized MCS-type convective events during the monsoon, when they do occur their associated precipitation is tending to be more intense. The area of central and southwestern Arizona, corresponding to the part of the state most impacted by MCSs during the monsoon, appears to be a local hot spot where precipitation and downdraft winds are becoming more intense. These changes are consistent with the historical observed precipitation data and with model projections of historical and future climate from dynamically downscaled CMIP3 and CMIP5 models.

  13. A modelling study of long term green roof retention performance.

    PubMed

    Stovin, Virginia; Poë, Simon; Berretta, Christian

    2013-12-15

    This paper outlines the development of a conceptual hydrological flux model for the long term continuous simulation of runoff and drought risk for green roof systems. A green roof's retention capacity depends upon its physical configuration, but it is also strongly influenced by local climatic controls, including the rainfall characteristics and the restoration of retention capacity associated with evapotranspiration during dry weather periods. The model includes a function that links evapotranspiration rates to substrate moisture content, and is validated against observed runoff data. The model's application to typical extensive green roof configurations is demonstrated with reference to four UK locations characterised by contrasting climatic regimes, using 30-year rainfall time-series inputs at hourly simulation time steps. It is shown that retention performance is dependent upon local climatic conditions. Volumetric retention ranges from 0.19 (cool, wet climate) to 0.59 (warm, dry climate). Per event retention is also considered, and it is demonstrated that retention performance decreases significantly when high return period events are considered in isolation. For example, in Sheffield the median per-event retention is 1.00 (many small events), but the median retention for events exceeding a 1 in 1 yr return period threshold is only 0.10. The simulation tool also provides useful information about the likelihood of drought periods, for which irrigation may be required. A sensitivity study suggests that green roofs with reduced moisture-holding capacity and/or low evapotranspiration rates will tend to offer reduced levels of retention, whilst high moisture-holding capacity and low evapotranspiration rates offer the strongest drought resistance. Copyright © 2013 Elsevier Ltd. All rights reserved.
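    A minimal sketch of the type of conceptual flux model outlined above (not the authors' code): an hourly bucket store that spills runoff once a retention capacity is exceeded and loses water to evapotranspiration scaled by the current moisture content. The capacity, potential ET rate, and rainfall series below are hypothetical values chosen only to illustrate the retention calculation.

```python
def green_roof_runoff(rainfall_mm, capacity_mm=30.0, pet_mm_per_hr=0.1):
    """Hourly water balance: the store fills with rain, spills as runoff,
    and drains by ET scaled to the relative moisture content."""
    storage, runoff = 0.0, []
    for rain in rainfall_mm:
        storage += rain
        spill = max(0.0, storage - capacity_mm)        # runoff once capacity exceeded
        storage -= spill
        et = pet_mm_per_hr * (storage / capacity_mm)   # ET limited by moisture content
        storage = max(0.0, storage - et)
        runoff.append(spill)
    return runoff

# Example: a 10 mm/h storm for 5 hours followed by a dry spell.
series = [10.0] * 5 + [0.0] * 20
total_rain = sum(series)
total_runoff = sum(green_roof_runoff(series))
print("retention fraction:", 1 - total_runoff / total_rain)
```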

  14. A fuzzy rumor spreading model based on transmission capacity

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Xu, Jiuping; Wu, Yue

    This paper proposes a rumor spreading model that considers three main factors: the event importance, event ambiguity, and the public's critical sense, each of which is defined by decision makers using linguistic descriptions and then transformed into triangular fuzzy numbers. To calculate the resultant force of these three factors, the transmission capacity and a new parameter category with fuzzy variables are determined. A rumor spreading model is then proposed which has fuzzy parameters rather than the fixed parameters of traditional models, and which considers the comprehensive factors affecting rumors from three aspects rather than examining special factors from a particular aspect. The proposed rumor spreading model is tested using different parameters for several different conditions on BA networks, and three special cases are simulated. The simulation results for all three cases suggest that events of low importance, those that only clarify facts, and those facing a strong public critical sense do not result in rumors. Therefore, the model assessment results were shown to be in agreement with reality. Parameters for the model were then determined and applied to an analysis of the 7.23 Yong-Wen line major transportation accident (YWMTA). When the simulated data were compared with the real data from this accident, the results demonstrated that the interval for the rumor spreading key point in the model was accurate, and that the key point for the YWMTA rumor spread fell into the range estimated by the model.
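    The paper's full fuzzy formulation is beyond the scope of a short example, but a minimal sketch of how triangular fuzzy numbers for the three factors could be defuzzified and combined into a single transmission capacity is shown below; the membership triples, the centroid defuzzification, and the multiplicative combination are all assumptions for illustration, not the model's actual equations.

```python
def centroid(tfn):
    """Centroid (defuzzified value) of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

# Hypothetical linguistic assessments converted to triangular fuzzy numbers.
importance = (0.6, 0.8, 1.0)   # "high importance"
ambiguity  = (0.4, 0.5, 0.6)   # "medium ambiguity"
critical   = (0.1, 0.2, 0.3)   # "low public critical sense"

# Illustrative resultant force: importance and ambiguity promote spreading,
# while the public's critical sense suppresses it.
transmission_capacity = centroid(importance) * centroid(ambiguity) * (1 - centroid(critical))
print(round(transmission_capacity, 3))
```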

  15. Enterprise Systems Analysis

    DTIC Science & Technology

    2016-03-14

    flows, or continuous state changes, with feedback loops and lags modeled in the flow system. Agent-based simulations operate using a discrete event... DeLand, S. M., Rutherford, B. M., Diegert, K. V., & Alvin, K. F. (2002). Error and uncertainty in modeling and simulation. Reliability Engineering... intrinsic complexity of the underlying social systems fundamentally limits the ability to make

  16. Examining the Effects of Mosaic Land Cover on Extreme Events in Historical Downscaled WRF Simulations

    EPA Science Inventory

    The representation of land use and land cover (hereby referred to as “LU”) is a challenging aspect of dynamically downscaled simulations, as a mesoscale model that is utilized as a regional climate model (RCM) may be limited in its ability to represent LU over multi-d...

  17. Implementing system simulation of C3 systems using autonomous objects

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1987-01-01

    The basis of all conflict recognition in simulation is a common frame of reference. Synchronous discrete-event simulation relies on the fixed points in time as the basic frame of reference. Asynchronous discrete-event simulation relies on fixed-points in the model space as the basic frame of reference. Neither approach provides sufficient support for autonomous objects. The use of a spatial template as a frame of reference is proposed to address these insufficiencies. The concept of a spatial template is defined and an implementation approach offered. Discussed are the uses of this approach to analyze the integration of sensor data associated with Command, Control, and Communication systems.

  18. VEEP - Vehicle Economy, Emissions, and Performance program

    NASA Technical Reports Server (NTRS)

    Heimburger, D. A.; Metcalfe, M. A.

    1977-01-01

    VEEP is a general-purpose discrete event simulation program being developed to study the performance, fuel economy, and exhaust emissions of a vehicle modeled as a collection of its separate components. It is written in SIMSCRIPT II.5. The purpose of this paper is to present the design methodology, describe the simulation model and its components, and summarize the preliminary results. Topics include chief programmer team concepts, the SDDL design language, program portability, user-oriented design, the program's user command syntax, the simulation procedure, and model validation.

  19. Exploration Supply Chain Simulation

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Exploration Supply Chain Simulation project was chartered by the NASA Exploration Systems Mission Directorate to develop a software tool, with proper data, to quantitatively analyze supply chains for future program planning. This tool is a discrete-event simulation that uses the basic supply chain concepts of planning, sourcing, making, delivering, and returning. This supply chain perspective is combined with other discrete or continuous simulation factors. Discrete resource events (such as launch or delivery reviews) are represented as organizational functional units. Continuous resources (such as civil service or contractor program functions) are defined as enabling functional units. Concepts of fixed and variable costs are included in the model to allow the discrete events to interact with cost calculations. The definition file is intrinsic to the model, but a blank start can be initiated at any time. The current definition file is an Orion Ares I crew launch vehicle. Parameters stretch from Kennedy Space Center across and into other program entities (Michoud Assembly Facility, Alliant Techsystems, Stennis Space Center, Johnson Space Center, etc.), though these will only gain detail as the file continues to evolve. The Orion Ares I file definition in the tool continues to evolve, and analysis from this tool is expected in 2008. This is the first application of such business-driven modeling to a NASA/government and aerospace contractor endeavor.

  20. Simulation of the June 11, 2010, flood along the Little Missouri River near Langley, Arkansas, using a hydrologic model coupled to a hydraulic model

    USGS Publications Warehouse

    Westerman, Drew A.; Clark, Brian R.

    2013-01-01

    The results from the precipitation-runoff hydrologic model, the one-dimensional unsteady-state hydraulic model, and a separate two-dimensional model developed as part of a coincident study, each complement the other in terms of streamflow timing, water-surface elevations, and velocities propagated by the June 11, 2010, flood event. The simulated grids for water depth and stream velocity from each model were directly compared by subtracting the one-dimensional hydraulic model grid from the two-dimensional model grid. The absolute mean difference for the simulated water depth was 0.9 foot. Additionally, the absolute mean difference for the simulated stream velocity was 1.9 feet per second.
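    The grid comparison described above amounts to a cell-by-cell subtraction followed by an absolute mean; a sketch with tiny hypothetical depth grids (not the study data) is:

```python
import numpy as np

# Hypothetical water-depth grids (metres) from a 1D and a 2D hydraulic model.
depth_1d = np.array([[1.2, 0.8, 0.0],
                     [2.1, 1.5, 0.4]])
depth_2d = np.array([[1.0, 1.1, 0.2],
                     [2.4, 1.3, 0.0]])

diff = depth_2d - depth_1d                 # cell-by-cell difference grid
abs_mean_diff = np.mean(np.abs(diff))      # absolute mean difference statistic
print(f"absolute mean difference: {abs_mean_diff:.2f} m")
```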

  1. Assessing the Impacts of Flooding Caused by Extreme Rainfall Events Through a Combined Geospatial and Numerical Modeling Approach

    NASA Astrophysics Data System (ADS)

    Santillan, J. R.; Amora, A. M.; Makinano-Santillan, M.; Marqueso, J. T.; Cutamora, L. C.; Serviano, J. L.; Makinano, R. M.

    2016-06-01

    In this paper, we present a combined geospatial and two-dimensional (2D) flood modeling approach to assess the impacts of flooding due to extreme rainfall events. We developed and implemented this approach in the Tago River Basin in the province of Surigao del Sur in Mindanao, Philippines, an area which suffered great damage from flooding caused by Tropical Storms Lingling and Jangmi in 2014. The geospatial component of the approach involves extraction of several layers of information, such as detailed topography/terrain and man-made features (buildings, roads, bridges) from 1-m spatial resolution LiDAR Digital Surface and Terrain Models (DSMs/DTMs), and recent land cover from Landsat 7 ETM+ and Landsat 8 OLI images. We then used these layers as inputs in developing a Hydrologic Engineering Center Hydrologic Modeling System (HEC HMS)-based hydrologic model, and a hydraulic model based on the 2D module of the latest version of HEC River Analysis System (RAS), to dynamically simulate and map the depth and extent of flooding due to extreme rainfall events. The extreme rainfall events used in the simulation represent six hypothetical rainfall events with return periods of 2, 5, 10, 25, 50, and 100 years. For each event, maximum flood depth maps were generated from the simulations, and these maps were further transformed into hazard maps by categorizing the flood depth into low, medium and high hazard levels. Using both the flood hazard maps and the layers of information extracted from remotely sensed datasets in spatial overlay analysis, we were then able to estimate and assess the impacts of these flooding events on buildings, roads, bridges and land cover. Results of the assessments revealed increases in the number of buildings, roads and bridges, and in the area of land cover, exposed to various flood hazards as rainfall events become more extreme. The wealth of information generated from the flood impact assessment using this approach can be very useful to the local government units and the concerned communities within the Tago River Basin as an aid in determining, in advance, the infrastructure (buildings, roads and bridges) and land cover that can be affected by different extreme rainfall flood scenarios.
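    A minimal sketch of the hazard-categorization and overlay step described above: flood depths are binned into low, medium and high classes and intersected with an exposure layer. The depth thresholds, grids, and building mask are hypothetical.

```python
import numpy as np

# Hypothetical maximum flood depth grid (metres) from a 2D simulation.
depth = np.array([[0.1, 0.6, 1.8],
                  [0.0, 0.9, 2.3]])

# Hypothetical hazard thresholds: low < 0.5 m, medium 0.5-1.5 m, high > 1.5 m.
hazard = np.digitize(depth, bins=[0.5, 1.5])   # 0 = low, 1 = medium, 2 = high

# Hypothetical building-footprint mask derived from LiDAR/land-cover layers.
buildings = np.array([[1, 0, 1],
                      [0, 1, 1]], dtype=bool)

# Spatial overlay: count building cells exposed to each hazard class.
for level, name in enumerate(["low", "medium", "high"]):
    print(name, int(np.sum(buildings & (hazard == level))))
```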

  2. A 30m resolution hydrodynamic model of the entire conterminous United States.

    NASA Astrophysics Data System (ADS)

    Bates, P. D.; Neal, J. C.; Smith, A.; Sampson, C.; Johnson, K.; Wing, O.

    2016-12-01

    In this paper we describe the development and validation of a 30m resolution hydrodynamic model covering the entire conterminous United States. The model can be used to simulate inundation and water depths resulting from either return period flows (so equivalent to FEMA Flood Insurance Rate Maps), hindcasts of historic events or forecasts of future river flow from a rainfall-runoff or land surface model. As topographic data the model uses the U.S. Geological Survey National Elevation Dataset or NED, and return period flows are generated using a regional flood frequency analysis methodology (Smith et al., 2015. Worldwide flood frequency estimation. Water Resources Research, 51, 539-553). Flood defences nationwide are represented using data from the US Army Corps of Engineers. Using these data flows are simulated using an explicit and highly efficient finite difference solution of the local inertial form of the Shallow Water equations identical to that implemented in the LISFLOOD-FP model. Even with this efficient numerical solution a simulation at this resolution over a whole continent is a huge undertaking, and a variety of High Performance Computing technologies therefore need to be employed to make these simulations possible. The size of the output datasets is also challenging, and to solve this we use the GIS and graphical display functions of Google Earth Engine to facilitate easy visualisation and interrogation of the results. The model is validated against the return period flood extents contained in FEMA Flood Insurance Rate Maps and real flood event data from the Texas 2015 flood event which was hindcast using the model. Finally, we present an application of the model to the Upper Mississippi river basin where simulations both with and without flood defences are used to determine floodplain areas benefitting from protection in order to quantify the benefits of flood defence spending.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horiike, S.; Okazaki, Y.

    This paper describes a performance estimation tool developed for modeling and simulation of open distributed energy management systems to support their design. The approach of discrete event simulation with detailed models is considered for efficient performance estimation. The tool includes basic models constituting a platform, e.g., Ethernet, communication protocol, operating system, etc. Application software is modeled by specifying CPU time, disk access size, communication data size, etc. Different types of system configurations for various system activities can be easily studied. Simulation examples show how the tool is utilized for the efficient design of open distributed energy management systems.

  4. Complex discrete dynamics from simple continuous population models.

    PubMed

    Gamarra, Javier G P; Solé, Ricard V

    2002-05-01

    Nonoverlapping generations have been classically modelled as difference equations in order to account for the discrete nature of reproductive events. However, other events, such as resource consumption or mortality, are continuous and take place in within-generation time. We therefore consider a realistic hybrid two-dimensional ODE model of resources and consumers with discrete reproduction events. Numerical and analytical approaches showed that the resulting dynamics resemble a Ricker map, including the period-doubling route to chaos. Stochastic simulations with a handling-time parameter for indirect competition among juveniles may affect the qualitative behaviour of the model.
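    The Ricker-like behaviour mentioned above can be illustrated directly by iterating the Ricker map N_{t+1} = N_t exp(r(1 - N_t/K)) for increasing growth rates, which walks through the period-doubling route to chaos; the parameter values below are illustrative only and unrelated to the paper's fitted model.

```python
import math

def ricker_orbit(r, n0=0.5, K=1.0, burn_in=500, keep=8):
    """Iterate the Ricker map and return the long-run attractor values."""
    n = n0
    orbit = []
    for t in range(burn_in + keep):
        n = n * math.exp(r * (1.0 - n / K))
        if t >= burn_in:
            orbit.append(round(n, 4))
    return orbit

for r in (1.5, 2.3, 2.6, 3.0):   # fixed point -> 2-cycle -> 4-cycle -> chaos
    print(r, sorted(set(ricker_orbit(r))))
```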

  5. Modeling Real-Time Coordination of Distributed Expertise and Event Response in NASA Mission Control Center Operations

    NASA Astrophysics Data System (ADS)

    Onken, Jeffrey

    This dissertation introduces a multidisciplinary framework for the enabling of future research and analysis of alternatives for control centers for real-time operations of safety-critical systems. The multidisciplinary framework integrates functional and computational models that describe the dynamics in fundamental concepts of previously disparate engineering and psychology research disciplines, such as group performance and processes, supervisory control, situation awareness, events and delays, and expertise. The application in this dissertation is the real-time operations within the NASA Mission Control Center in Houston, TX. This dissertation operationalizes the framework into a model and simulation, which simulates the functional and computational models in the framework according to user-configured scenarios for a NASA human-spaceflight mission. The model and simulation generates data according to the effectiveness of the mission-control team in supporting the completion of mission objectives and detecting, isolating, and recovering from anomalies. Accompanying the multidisciplinary framework is a proof of concept, which demonstrates the feasibility of such a framework. The proof of concept demonstrates that variability occurs where expected based on the models. The proof of concept also demonstrates that the data generated from the model and simulation is useful for analyzing and comparing MCC configuration alternatives because an investigator can give a diverse set of scenarios to the simulation and the output compared in detail to inform decisions about the effect of MCC configurations on mission operations performance.

  6. A Study of Umbilical Communication Interface of Simulator Kernel to Enhance Visibility and Controllability

    NASA Astrophysics Data System (ADS)

    Koo, Cheol Hea; Lee, Hoon Hee; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

    In aerospace research and practical development, the use of simulation in software development, component design and system operation continues to grow, and it is growing ever faster. This trend stems from how easy simulation is to handle and how powerful its output is. Simulation brings many benefits from several of its characteristics: it is easy to handle (it is never broken or damaged by mistake), it never wears out (it never gets old), and it is cost effective (once built, it can be distributed to 100-1000 people). GenSim (Generic Simulator), which is being developed by KARI and is compatible with the ESA SMP standard, provides such a simulation platform to support flight software validation and mission operation verification. The user interface of GenSim is shown in Figure 1 [1,2]. As most simulation platforms typically do, GenSim provides a GRD (Graphical Display) and an AND (Alpha Numeric Display). Frequently, however, more complex and powerful handling of the simulated data is required for actual system validation, for example mission operation. In Figure 2, a system simulation result for COMS (Communication, Ocean, and Meteorological Satellite, launched on June 28, 2008) is drawn by the Celestia 3D program. In this case, the data needed by Celestia are provided by one of the simulation models resident in the system simulator through a UDP network connection. However, the required display format, data size, and communication rate vary, so the developer has to manage the connection protocol manually for each case. This brings chaos into the design and development of the simulation models, and ultimately leads to performance issues. A performance issue occurs when the magnitude of the required data is higher than the capacity of the simulation kernel to process the required data safely. The problem is that the data sent to a visualization tool such as Celestia are provided by a simulation model, not by the kernel. Because the simulation model has no way to know the load on the simulation kernel as it processes simulation events, the model simply sends the data as frequently as needed. This can create many potential problems, such as lack of responsiveness, missed deadlines, and data integrity problems with the model data during the simulation. SIMSAT and EuroSim give a warning message if a user request, such as printing a log, cannot be processed as planned or requested. As a consequence the requested event is delayed or cannot be processed at all, which means that this behaviour may violate the planned deadline. In most soft real-time simulation this can be neglected and merely causes minor inconvenience to users. It should be noted, however, that if a user request is not managed properly in a critical situation, the simulation results may end in a mess. Having traced the disadvantages of letting the simulation model serve such user requests, we conclude that the simulation model is not the appropriate place to provide this kind of service, and that such work should be minimized as much as possible.

  7. Working Together: An Empirical Analysis of a Multiclass Legislative-Executive Branch Simulation

    ERIC Educational Resources Information Center

    Kalaf-Hughes, Nicole; Mills, Russell W.

    2016-01-01

    Much of the research on the use of simulations in the political science classroom focuses on how simulations model different events in the real world, including political campaigns, international diplomacy, and legislative bargaining. In the case of American Politics, many simulations focus on the behavior of Congress and the legislative process,…

  8. High-fidelity numerical modeling of the Upper Mississippi River under extreme flood condition

    NASA Astrophysics Data System (ADS)

    Khosronejad, Ali; Le, Trung; DeWall, Petra; Bartelt, Nicole; Woldeamlak, Solomon; Yang, Xiaolei; Sotiropoulos, Fotis

    2016-12-01

    We present data-driven numerical simulations of extreme flooding in a large-scale river coupling coherent-structure resolving hydrodynamics with bed morphodynamics under live-bed conditions. The study area is a ∼ 3.2 km long and ∼ 300 m wide reach of the Upper Mississippi River, near Minneapolis MN, which contains several natural islands and man-made hydraulic structures. We employ the large-eddy simulation (LES) and bed-morphodynamic modules of the Virtual Flow Simulator (VFS-Rivers) model, a recently developed in-house code, to investigate the flow and bed evolution of the river during a 100-year flood event. The coupling of the two modules is carried out via a fluid-structure interaction approach using a nested domain approach to enhance the resolution of bridge scour predictions. We integrate data from airborne Light Detection and Ranging (LiDAR), sub-aqueous sonar apparatus on-board a boat and in-situ laser scanners to construct a digital elevation model of the river bathymetry and surrounding flood plain, including islands and bridge piers. A field campaign under base-flow condition is also carried out to collect mean flow measurements via Acoustic Doppler Current Profiler (ADCP) to validate the hydrodynamic module of the VFS-Rivers model. Our simulation results for the bed evolution of the river under the 100-year flood reveal complex sediment transport dynamics near the bridge piers consisting of both scour and refilling events due to the continuous passage of sand dunes. We find that the scour depth near the bridge piers can reach to a maximum of ∼ 9 m. The data-driven simulation strategy we present in this work exemplifies a practical simulation-based-engineering-approach to investigate the resilience of infrastructures to extreme flood events in intricate field-scale riverine systems.

  9. Photosynthetic productivity and its efficiencies in ISIMIP2a biome models: benchmarking for impact assessment studies

    NASA Astrophysics Data System (ADS)

    Ito, Akihiko; Nishina, Kazuya; Reyer, Christopher P. O.; François, Louis; Henrot, Alexandra-Jane; Munhoven, Guy; Jacquemin, Ingrid; Tian, Hanqin; Yang, Jia; Pan, Shufen; Morfopoulos, Catherine; Betts, Richard; Hickler, Thomas; Steinkamp, Jörg; Ostberg, Sebastian; Schaphoff, Sibyll; Ciais, Philippe; Chang, Jinfeng; Rafique, Rashid; Zeng, Ning; Zhao, Fang

    2017-08-01

    Simulating vegetation photosynthetic productivity (or gross primary production, GPP) is a critical feature of the biome models used for impact assessments of climate change. We conducted a benchmarking of global GPP simulated by eight biome models participating in the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP2a) with four meteorological forcing datasets (30 simulations), using independent GPP estimates and recent satellite data of solar-induced chlorophyll fluorescence as a proxy of GPP. The simulated global terrestrial GPP ranged from 98 to 141 Pg C yr-1 (1981-2000 mean); considerable inter-model and inter-data differences were found. Major features of spatial distribution and seasonal change of GPP were captured by each model, showing good agreement with the benchmarking data. All simulations showed incremental trends of annual GPP, seasonal-cycle amplitude, radiation-use efficiency, and water-use efficiency, mainly caused by the CO2 fertilization effect. The incremental slopes were higher than those obtained by remote sensing studies, but comparable with those by recent atmospheric observation. Apparent differences were found in the relationship between GPP and incoming solar radiation, for which forcing data differed considerably. The simulated GPP trends co-varied with a vegetation structural parameter, leaf area index, at model-dependent strengths, implying the importance of constraining canopy properties. In terms of extreme events, GPP anomalies associated with a historical El Niño event and large volcanic eruption were not consistently simulated in the model experiments due to deficiencies in both forcing data and parameterized environmental responsiveness. Although the benchmarking demonstrated the overall advancement of contemporary biome models, further refinements are required, for example, for solar radiation data and vegetation canopy schemes.

  10. High-Resolution Mesoscale Simulations of the 6-7 May 2000 Missouri Flash Flood: Impact of Model Initialization and Land Surface Treatment

    NASA Technical Reports Server (NTRS)

    Baker, R. David; Wang, Yansen; Tao, Wei-Kuo; Wetzel, Peter; Belcher, Larry R.

    2004-01-01

    High-resolution mesoscale model simulations of the 6-7 May 2000 Missouri flash flood event were performed to test the impact of model initialization and land surface treatment on the timing, intensity, and location of extreme precipitation. In this flash flood event, a mesoscale convective system (MCS) produced over 340 mm of rain in roughly 9 hours in some locations. Two different types of model initialization were employed: 1) NCEP global reanalysis with 2.5-degree grid spacing and 12-hour temporal resolution, and 2) Eta reanalysis with 40-km grid spacing and 3-hour temporal resolution. In addition, two different land surface treatments were considered. A simple land scheme (SLAB) keeps soil moisture fixed at initial values throughout the simulation, while a more sophisticated land model (PLACE) allows for interactive feedback. Simulations with high-resolution Eta model initialization show considerable improvement in the intensity of precipitation due to the presence in the initialization of a residual mesoscale convective vortex (MCV) from a previous MCS. Simulations with the PLACE land model show improved location of heavy precipitation. Since soil moisture can vary over time in the PLACE model, surface energy fluxes exhibit strong spatial gradients. These surface energy flux gradients help produce a strong low-level jet (LLJ) in the correct location. The LLJ then interacts with the cold outflow boundary of the MCS to produce new convective cells. The simulation with both high-resolution model initialization and time-varying soil moisture best reproduces the intensity and location of the observed rainfall.

  11. Modeling of fault reactivation and induced seismicity during hydraulic fracturing of shale-gas reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutqvist, Jonny; Rinaldi, Antonio P.; Cappa, Frédéric

    2013-07-01

    We have conducted numerical simulation studies to assess the potential for injection-induced fault reactivation and notable seismic events associated with shale-gas hydraulic fracturing operations. The modeling is generally tuned towards conditions usually encountered in the Marcellus shale play in the Northeastern US at an approximate depth of 1500 m (~4,500 feet). Our modeling simulations indicate that when faults are present, micro-seismic events are possible, the magnitude of which is somewhat larger than that associated with micro-seismic events originating from regular hydraulic fracturing because of the larger surface area that is available for rupture. The results of our simulations indicated fault rupture lengths of about 10 to 20 m, which, in rare cases, can extend to over 100 m, depending on the fault permeability, the in situ stress field, and the fault strength properties. In addition to a single event rupture length of 10 to 20 m, repeated events and aseismic slip amounted to a total rupture length of 50 m, along with a shear offset displacement of less than 0.01 m. This indicates that the possibility of hydraulically induced fractures at great depth (thousands of meters) causing activation of faults and creation of a new flow path that can reach shallow groundwater resources (or even the surface) is remote. The expected low permeability of faults in producible shale is clearly a limiting factor for the possible rupture length and seismic magnitude. In fact, for a fault that is initially nearly impermeable, the only possibility of a larger fault slip event would be opening by hydraulic fracturing; this would allow pressure to penetrate the matrix along the fault and to reduce the frictional strength over a sufficiently large fault surface patch. However, our simulation results show that if the fault is initially impermeable, hydraulic fracturing along the fault results in numerous small micro-seismic events along with the propagation, effectively preventing larger events from occurring. Nevertheless, care should be taken with continuous monitoring of induced seismicity during the entire injection process to detect any runaway fracturing along faults.

  12. Evaluation of rainfall structure on hydrograph simulation: Comparison of radar and interpolated methods, a study case in a tropical catchment

    NASA Astrophysics Data System (ADS)

    Velasquez, N.; Ochoa, A.; Castillo, S.; Hoyos Ortiz, C. D.

    2017-12-01

    The skill of river discharge simulation using hydrological models strongly depends on the quality and spatio-temporal representativeness of precipitation during storm events. All precipitation measurement strategies have their own strengths and weaknesses that translate into discharge simulation uncertainties. Distributed hydrological models are based on evolving rainfall fields in the same time scale as the hydrological simulation. In general, rainfall measurements from a dense and well maintained rain gauge network provide a very good estimation of the total volume for each rainfall event, however, the spatial structure relies on interpolation strategies introducing considerable uncertainty in the simulation process. On the other hand, rainfall retrievals from radar reflectivity achieve a better spatial structure representation but with higher uncertainty in the surface precipitation intensity and volume depending on the vertical rainfall characteristics and radar scan strategy. To assess the impact of both rainfall measurement methodologies on hydrological simulations, and in particular the effects of the rainfall spatio-temporal variability, a numerical modeling experiment is proposed including the use of a novel QPE (Quantitative Precipitation Estimation) method based on disdrometer data in order to estimate surface rainfall from radar reflectivity. The experiment is based on the simulation of 84 storms, the hydrological simulations are carried out using radar QPE and two different interpolation methods (IDW and TIN), and the assessment of simulated peak flow. Results show significant rainfall differences between radar QPE and the interpolated fields, evidencing a poor representation of storms in the interpolated fields, which tend to miss the precise location of the intense precipitation cores, and to artificially generate rainfall in some areas of the catchment. Regarding streamflow modelling, the potential improvement achieved by using radar QPE depends on the density of the rain gauge network and its distribution relative to the precipitation events. The results for the 84 storms show a better model skill using radar QPE than the interpolated fields. Results using interpolated fields are highly affected by the dominant rainfall type and the basin scale.
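    For reference, inverse distance weighting (IDW), one of the two interpolation methods compared with radar QPE above, can be sketched in a few lines; the gauge coordinates, rainfall totals, and power exponent are hypothetical.

```python
import numpy as np

def idw(xy_gauges, values, xy_target, power=2.0):
    """Inverse distance weighting of gauge rainfall to a target grid cell."""
    d = np.linalg.norm(xy_gauges - xy_target, axis=1)
    if np.any(d == 0):                      # target coincides with a gauge
        return float(values[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

# Hypothetical gauge locations (km coordinates) and storm totals (mm).
gauges = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])
totals = np.array([12.0, 40.0, 25.0])
print(idw(gauges, totals, np.array([4.0, 3.0])))
```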

  13. ADAPT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reynolds, John; Jankovsky, Zachary; Metzroth, Kyle G

    2018-04-04

    The purpose of the ADAPT code is to generate Dynamic Event Trees (DET) using a user specified set of simulators. ADAPT can utilize any simulation tool which meets a minimal set of requirements. ADAPT is based on the concept of DET which uses explicit modeling of the deterministic dynamic processes that take place during a nuclear reactor plant system (or other complex system) evolution along with stochastic modeling. When DET are used to model various aspects of Probabilistic Risk Assessment (PRA), all accident progression scenarios starting from an initiating event are considered simultaneously. The DET branching occurs at user specifiedmore » times and/or when an action is required by the system and/or the operator. These outcomes then decide how the dynamic system variables will evolve in time for each DET branch. Since two different outcomes at a DET branching may lead to completely different paths for system evolution, the next branching for these paths may occur not only at separate times, but can be based on different branching criteria. The computational infrastructure allows for flexibility in ADAPT to link with different system simulation codes, parallel processing of the scenarios under consideration, on-line scenario management (initiation as well as termination), analysis of results, and user friendly graphical capabilities. The ADAPT system is designed for a distributed computing environment; the scheduler can track multiple concurrent branches simultaneously. The scheduler is modularized so that the DET branching strategy can be modified (e.g. biasing towards the worst-case scenario/event). Independent database systems store data from the simulation tasks and the DET structure so that the event tree can be constructed and analyzed later. ADAPT is provided with a user-friendly client which can easily sort through and display the results of an experiment, precluding the need for the user to manually inspect individual simulator runs.« less

  14. Radiation Damage to Nervous System: Designing Optimal Models for Realistic Neuron Morphology in Hippocampus

    NASA Astrophysics Data System (ADS)

    Batmunkh, Munkhbaatar; Bugay, Alexander; Bayarchimeg, Lkhagvaa; Lkhagva, Oidov

    2018-02-01

    The present study is focused on the development of optimal models of neuron morphology for Monte Carlo microdosimetry simulations of initial radiation-induced events of heavy charged particles in the specific types of cells of the hippocampus, which is the most radiation-sensitive structure of the central nervous system. The neuron geometry and particles track structures were simulated by the Geant4/Geant4-DNA Monte Carlo toolkits. The calculations were made for beams of protons and heavy ions with different energies and doses corresponding to real fluxes of galactic cosmic rays. A simple compartmental model and a complex model with realistic morphology extracted from experimental data were constructed and compared. We estimated the distribution of the energy deposition events and the production of reactive chemical species within the developed models of CA3/CA1 pyramidal neurons and DG granule cells of the rat hippocampus under exposure to different particles with the same dose. Similar distributions of the energy deposition events and concentration of some oxidative radical species were obtained in both the simplified and realistic neuron models.

  15. An assessment of the ability of Bartlett-Lewis type of rainfall models to reproduce drought statistics

    NASA Astrophysics Data System (ADS)

    Pham, M. T.; Vanhaute, W. J.; Vandenberghe, S.; De Baets, B.; Verhoest, N. E. C.

    2013-12-01

    Of all natural disasters, the economic and environmental consequences of droughts are among the highest because of their longevity and widespread spatial extent. Because of their extreme behaviour, studying droughts generally requires long time series of historical climate data. Rainfall is a very important variable for calculating drought statistics, for quantifying historical droughts or for assessing the impact on other hydrological (e.g. water stage in rivers) or agricultural (e.g. irrigation requirements) variables. Unfortunately, time series of historical observations are often too short for such assessments. To circumvent this, one may rely on the synthetic rainfall time series from stochastic point process rainfall models, such as Bartlett-Lewis models. The present study investigates whether drought statistics are preserved when simulating rainfall with Bartlett-Lewis models. Therefore, a 105 yr 10 min rainfall time series obtained at Uccle, Belgium is used as a test case. First, drought events were identified on the basis of the Effective Drought Index (EDI), and each event was characterized by two variables, i.e. drought duration (D) and drought severity (S). As both parameters are interdependent, a multivariate distribution function, which makes use of a copula, was fitted. Based on the copula, four types of drought return periods are calculated for observed as well as simulated droughts and are used to evaluate the ability of the rainfall models to simulate drought events with the appropriate characteristics. Overall, all Bartlett-Lewis model types studied fail to preserve extreme drought statistics, which is attributed to the model structure and to the model stationarity caused by maintaining the same parameter set during the whole simulation period.
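    A minimal sketch of how drought duration (D) and severity (S) could be extracted from an index series such as EDI: an event is a run of consecutive values below a threshold, its duration is the run length, and its severity is the accumulated deficit. The threshold, series, and deficit definition here are assumptions for illustration, not the study's exact procedure.

```python
def drought_events(index, threshold=-1.0):
    """Return (duration, severity) pairs for runs of index values below threshold."""
    events, run, deficit = [], 0, 0.0
    for value in index + [0.0]:           # sentinel closes any trailing run
        if value < threshold:
            run += 1
            deficit += threshold - value  # accumulated shortfall below threshold
        elif run > 0:
            events.append((run, round(deficit, 2)))
            run, deficit = 0, 0.0
    return events

# Hypothetical daily EDI series.
edi = [0.2, -1.2, -1.5, -0.8, -1.1, -2.0, -1.3, 0.1]
print(drought_events(edi))
```

    Pairs of (D, S) values extracted this way from observed and simulated series are what a copula-based return-period analysis would then compare.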

  16. A copula-based assessment of Bartlett-Lewis type of rainfall models for preserving drought statistics

    NASA Astrophysics Data System (ADS)

    Pham, M. T.; Vanhaute, W. J.; Vandenberghe, S.; De Baets, B.; Verhoest, N. E. C.

    2013-06-01

    Of all natural disasters, the economic and environmental consequences of droughts are among the highest because of their longevity and widespread spatial extent. Because of their extreme behaviour, studying droughts generally requires long time series of historical climate data. Rainfall is a very important variable for calculating drought statistics, for quantifying historical droughts or for assessing the impact on other hydrological (e.g. water stage in rivers) or agricultural (e.g. irrigation requirements) variables. Unfortunately, time series of historical observations are often too short for such assessments. To circumvent this, one may rely on synthetic rainfall time series from stochastic point process rainfall models, such as Bartlett-Lewis models. The present study investigates whether drought statistics are preserved when simulating rainfall with Bartlett-Lewis models. Therefore, a 105 yr 10 min rainfall time series obtained at Uccle, Belgium is used as a test case. First, drought events were identified on the basis of the Effective Drought Index (EDI), and each event was characterized by two variables, i.e. drought duration (D) and drought severity (S). As both parameters are interdependent, a multivariate distribution function, which makes use of a copula, was fitted. Based on the copula, four types of drought return periods are calculated for observed as well as simulated droughts and are used to evaluate the ability of the rainfall models to simulate drought events with the appropriate characteristics. Overall, all Bartlett-Lewis model types studied fail to preserve extreme drought statistics, which is attributed to the model structure and to the model stationarity caused by maintaining the same parameter set during the whole simulation period.

  17. Simulation of extreme reservoir level distribution with the SCHADEX method (EXTRAFLO project)

    NASA Astrophysics Data System (ADS)

    Paquet, Emmanuel; Penot, David; Garavaglia, Federico

    2013-04-01

    The standard practice for the design of dam spillway structures and gates is to consider the maximum reservoir level reached for a given hydrologic scenario. This scenario has several components: peak discharge, flood volumes over different durations, discharge gradients, etc. Within a probabilistic analysis framework, several scenarios can be associated with different return times, although a reference return level (e.g. 1000 years) is often prescribed by local regulation rules or usual practice. Using a continuous simulation method for extreme flood estimation is a convenient way to provide a great variety of hydrological scenarios to feed a hydraulic model of dam operation: flood hydrographs are explicitly simulated by a rainfall-runoff model fed by a stochastic rainfall generator. The maximum reservoir level reached will be conditioned by the scale and the dynamics of the generated hydrograph, by the filling of the reservoir prior to the flood, and by the dam gate and spillway operation during the event. Simulating a great number of floods allows building a probabilistic distribution of maximum reservoir levels, from which a design value can be chosen at a definite return level. An alternative approach is proposed here, based on the SCHADEX method for extreme flood estimation proposed by Paquet et al. (2006, 2013). SCHADEX is a so-called "semi-continuous" stochastic simulation method in that flood events are simulated on an event basis and are superimposed on a continuous simulation of the catchment saturation hazard using rainfall-runoff modelling. The SCHADEX process works at the study time-step (e.g. daily), and the peak flow distribution is deduced from the simulated daily flow distribution by a peak-to-volume ratio. A reference hydrograph relevant for extreme floods is proposed. In the standard version of the method, both the peak-to-volume ratio and the reference hydrograph are constant. An enhancement of this method is presented, with variable peak-to-volume ratios and hydrographs applied to each simulated event. This allows accounting for different flood dynamics, depending on the season, the generating precipitation event, the soil saturation state, etc. In both cases, a hydraulic simulation of dam operation is performed in order to compute the distribution of maximum reservoir levels. Results are detailed for an extreme return level, showing that a 1000-year return level reservoir level can be reached during flood events whose components (peaks, volumes) are not necessarily associated with such a return level. The presentation will be illustrated by the example of a fictive dam on the Tech River at Reynes (South of France, 477 km²). This study has been carried out within the EXTRAFLO project, Task 8 (https://extraflo.cemagref.fr/). References: Paquet, E., Gailhard, J. and Garçon, R. (2006), Evolution of the GRADEX method: improvement by atmospheric circulation classification and hydrological modeling, La Houille Blanche, 5, 80-90. doi:10.1051/lhb:2006091. Paquet, E., Garavaglia, F., Garçon, R. and Gailhard, J. (2012), The SCHADEX method: a semi-continuous rainfall-runoff simulation for extreme flood estimation, Journal of Hydrology, under revision.
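    A toy sketch of the semi-continuous idea of turning a simulated daily flow into a flood hydrograph via a peak-to-volume ratio and routing it through a simple reservoir to obtain a maximum level rise; the ratio, triangular hydrograph shape, release capacity, and reservoir geometry are all hypothetical, not SCHADEX parameters.

```python
import numpy as np

def max_reservoir_level_rise(daily_flow_m3s, peak_to_volume=1.8, hours=24,
                             spill_capacity_m3s=400.0, area_km2=2.0):
    """Scale a triangular hydrograph by a peak-to-volume ratio and route it
    through a reservoir with a fixed maximum release; return the level rise (m)."""
    peak = peak_to_volume * daily_flow_m3s
    t = np.arange(hours + 1)
    inflow = np.interp(t, [0, hours / 2, hours], [0.0, peak, 0.0])  # m3/s
    stored, max_rise = 0.0, 0.0
    for q_in in inflow:
        q_out = min(q_in + stored / 3600.0, spill_capacity_m3s)     # limited release
        stored = max(0.0, stored + (q_in - q_out) * 3600.0)         # m3 over one hour
        max_rise = max(max_rise, stored / (area_km2 * 1e6))         # metres of rise
    return max_rise

# Hypothetical simulated daily flows (m3/s) drawn from a stochastic event set.
print([round(max_reservoir_level_rise(q), 2) for q in (300.0, 500.0, 800.0)])
```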

  18. Sensitivity analysis of the DRAINWAT model applied to an agricultural watershed in the lower coastal plain, North Carolina, USA

    Treesearch

    Hyunwoo Kim; Devendra M. Amatya; Stephen W. Broome; Dean L. Hesterberg; Minha Choi

    2011-01-01

    The DRAINWAT, DRAINmod for WATershed model, was selected for hydrological modelling to obtain water table depths and drainage outflows at Open Grounds Farm in Carteret County, North Carolina, USA. Six simulated storm events from the study period were compared with the measured data and analysed. Simulation results from the whole study period and selected rainfall...

  19. Hydro-meteorological drought event sets in the UK based on a large ensemble of global-regional climate simulations: climatology, drivers and changes in the future

    NASA Astrophysics Data System (ADS)

    Guillod, B. P.; Massey, N.; Otto, F. E. L.; Allen, M. R.; Jones, R.; Hall, J. W.

    2016-12-01

    Extreme events being rare by definition, accurately quantifying the probabilities associated with a given event is difficult. This is particularly true for droughts, for which only a few events are available in the observational record owing to their long-lasting characteristics. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying present and future risks associated with droughts in the UK. To do so, a large number of modelled weather time series for "synthetic" drought events are being fed into hydrological and impact models to assess their impacts on various sectors (social sciences, economy, industry, agriculture, and ecosystems). Here, we present and analyse the hydro-meteorological drought event sets that have been produced with a new version of weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (10'000s) of Global Climate Model simulations, downscaled at 25km over Europe by a nested Regional Climate Model. Simulations include the past 100 years as well as two future time slices (2030s and 2080s), and provide a large number of sequences of spatio-temporally coherent weather, which are consistent with the boundary forcing such as the ocean, greenhouse gases and solar forcing. Besides presenting the methodology and validation of the event sets, we provide insights into drought risk in the UK and the drivers of drought. In particular, we examine their sensitivity to sea surface temperature and sea ice patterns, both in the recent past and for future projections. How drought risk in the UK can be expected to change in the future will also be discussed. Finally, we assess the applicability of this methodology to other regions. Reference: [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.

  20. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Leblois, E.; Onfroy, T.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2014-09-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible (but which have not yet occurred) flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2010 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90 % of the insured flood losses, the probabilistic event set must combine the river overflow (small and large catchments) with the surface runoff, due to heavy rainfall, on the slopes of the watershed. Indeed, internal studies of the CCR (Caisse Centrale de Reassurance) claim database have shown that approximately 45 % of the insured flood losses are located inside the floodplains and 45 % outside. Another 10 % is due to sea surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: a generation of fictive river flows based on the historical records of the river gauge network and a generation of fictive rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (Macif) and to generate flood areas associated with hazard return periods. The flood maps concern river overflow and surface water runoff. Validation of these maps is conducted by comparison with the address located claim data on a small catchment (downstream Argens).

  1. The 2010 Pakistan floods: high-resolution simulations with the WRF model

    NASA Astrophysics Data System (ADS)

    Viterbo, Francesca; Parodi, Antonio; Molini, Luca; Provenzale, Antonello; von Hardenberg, Jost; Palazzi, Elisa

    2013-04-01

    Estimating current and future water resources in high mountain regions with complex orography is a difficult but crucial task. In particular, the French-Italian project PAPRIKA is focused on two specific regions in the Hindu-Kush - Himalaya - Karakorum (HKKH) region: the Shigar basin in Pakistan, at the foot of K2, and the Khumbu valley in Nepal, at the foot of Mount Everest. In this framework, we use the WRF model to simulate precipitation and meteorological conditions at high resolution in areas with extreme orographic slopes, comparing the model output with station and satellite data. Once the model is validated, we shall run a set of three future time slices at very high spatial resolution, for the periods 2046-2050, 2071-2075 and 2096-2100, nested in different climate change scenarios (EXtreme PREcipitation and Hydrological climate Scenario Simulations - EXPRESS-Hydro project). As a prelude to this study, here we discuss the simulation of specific, high-intensity rainfall events in this area. In this paper we focus on the 2010 Pakistan floods, which began in late July 2010, produced heavy monsoon rains in the Khyber Pakhtunkhwa, Sindh, Punjab and Balochistan regions of Pakistan, and affected the Indus River basin. Approximately one-fifth of Pakistan's total land area was underwater, with a death toll of about 2000 people. This event has been simulated with the WRF model (version 3.3) in cloud-permitting mode (d01 14 km and d02 3.5 km); different convective closures and microphysics parameterizations have been used. A deeper understanding of the processes responsible for this event has been gained through comparison with rainfall depth observations, radiosounding data and geostationary/polar satellite images.

  2. A stochastic automata network for earthquake simulation and hazard estimation

    NASA Astrophysics Data System (ADS)

    Belubekian, Maya Ernest

    1998-11-01

    This research develops a model for simulation of earthquakes on seismic faults with available earthquake catalog data. The model allows estimation of the seismic hazard at a site of interest and assessment of the potential damage and loss in a region. There are two approaches for studying the earthquakes: mechanistic and stochastic. In the mechanistic approach, seismic processes, such as changes in stress or slip on faults, are studied in detail. In the stochastic approach, earthquake occurrences are simulated as realizations of a certain stochastic process. In this dissertation, a stochastic earthquake occurrence model is developed that uses the results from dislocation theory for the estimation of slip released in earthquakes. The slip accumulation and release laws and the event scheduling mechanism adopted in the model result in a memoryless Poisson process for the small and moderate events and in a time- and space-dependent process for large events. The minimum and maximum of the hazard are estimated by the model when the initial conditions along the faults correspond to a situation right after a largest event and after a long seismic gap, respectively. These estimates are compared with the ones obtained from a Poisson model. The Poisson model overestimates the hazard after the maximum event and underestimates it in the period of a long seismic quiescence. The earthquake occurrence model is formulated as a stochastic automata network. Each fault is divided into cells, or automata, that interact by means of information exchange. The model uses a statistical method called bootstrap for the evaluation of the confidence bounds on its results. The parameters of the model are adjusted to the target magnitude patterns obtained from the catalog. A case study is presented for the city of Palo Alto, where the hazard is controlled by the San Andreas, Hayward and Calaveras faults. The results of the model are used to evaluate the damage and loss distribution in Palo Alto. The sensitivity analysis of the model results to the variation in basic parameters shows that the maximum magnitude has the most significant impact on the hazard, especially for long forecast periods.
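    The contrast drawn above between a memoryless Poisson recurrence and a time-dependent process for large events can be sketched with two toy catalog generators; the rates, loading, and slip threshold below are hypothetical and unrelated to the dissertation's calibrated model.

```python
import random

rng = random.Random(42)

def poisson_catalog(rate_per_yr, years):
    """Memoryless occurrence: exponential inter-event times."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_per_yr)
        if t > years:
            return times
        times.append(t)

def slip_predictable_catalog(loading_per_yr, slip_threshold, years):
    """Time-dependent rule: an event occurs once accumulated slip reaches a threshold."""
    slip, times = 0.0, []
    for year in range(1, years + 1):
        slip += loading_per_yr * rng.uniform(0.8, 1.2)   # noisy tectonic loading
        if slip >= slip_threshold:
            times.append(float(year))
            slip = 0.0
    return times

print(len(poisson_catalog(0.05, 1000)), len(slip_predictable_catalog(0.02, 0.4, 1000)))
```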

  3. Statistical variances of diffusional properties from ab initio molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei

    2018-12-01

    Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
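    A minimal sketch of the diffusivity-fitting step discussed above: compute an ensemble-averaged mean squared displacement (MSD), fit it against time, and apply the three-dimensional Einstein relation MSD = 6Dt. The synthetic random-walk trajectories stand in for AIMD data, and the quoted uncertainty is only the least-squares slope error, a simplification of the event-count-based variance proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic displacement data standing in for mobile-ion trajectories from AIMD:
# 50 ions, 2000 time steps of 2 fs, Gaussian hops in 3D (Angstrom units).
n_ions, n_steps, dt_fs = 50, 2000, 2.0
steps = rng.normal(scale=0.05, size=(n_steps, n_ions, 3))
traj = np.cumsum(steps, axis=0)                    # positions relative to the start

time_ps = np.arange(1, n_steps + 1) * dt_fs / 1000.0
msd = np.mean(np.sum(traj**2, axis=2), axis=1)     # ensemble-averaged MSD (A^2)

# Linear fit MSD = 6 D t; slope uncertainty taken from the fit covariance.
coeffs, cov = np.polyfit(time_ps, msd, 1, cov=True)
D = coeffs[0] / 6.0                                # diffusivity in A^2/ps
D_err = np.sqrt(cov[0, 0]) / 6.0
print(f"D = {D:.3f} +/- {D_err:.3f} A^2/ps")
```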

  4. Dry matter partitioning models for the simulation of individual fruit growth in greenhouse cucumber canopies

    PubMed Central

    Wiechers, Dirk; Kahlen, Katrin; Stützel, Hartmut

    2011-01-01

    Background and Aims Growth imbalances between individual fruits are common in indeterminate plants such as cucumber (Cucumis sativus). In this species, these imbalances can be related to differences in two growth characteristics, fruit growth duration until reaching a given size and fruit abortion. Both are related to distribution, and environmental factors as well as canopy architecture play a key role in their differentiation. Furthermore, events leading to a fruit reaching its harvestable size before or simultaneously with a prior fruit can be observed. Functional–structural plant models (FSPMs) allow for interactions between environmental factors, canopy architecture and physiological processes. Here, we tested hypotheses which account for these interactions by introducing dominance and abortion thresholds for the partitioning of assimilates between growing fruits. Methods Using the L-System formalism, an FSPM was developed which combined a model for architectural development, a biochemical model of photosynthesis and a model for assimilate partitioning, the last including a fruit growth model based on a size-related potential growth rate (RP). Starting from a distribution proportional to RP, the model was extended by including abortion and dominance. Abortion was related to source strength and dominance to sink strength. Both thresholds were varied to test their influence on fruit growth characteristics. Simulations were conducted for a dense row and a sparse isometric canopy. Key Results The simple partitioning models failed to simulate individual fruit growth realistically. The introduction of abortion and dominance thresholds gave the best results. Simulations of fruit growth durations and abortion rates were in line with measurements, and events in which a fruit was harvestable earlier than an older fruit were reproduced. Conclusions Dominance and abortion events need to be considered when simulating typical fruit growth traits. By integrating environmental factors, the FSPM can be a valuable tool to analyse and improve existing knowledge about the dynamics of assimilates partitioning. PMID:21715366
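    A toy sketch of proportional assimilate partitioning with simple abortion and dominance rules of the kind described above; the thresholds, rule definitions, and potential growth rates are hypothetical and are not the FSPM's actual formulation.

```python
def partition_assimilates(supply, rp, abortion_threshold=0.3, dominance_factor=2.0):
    """Share an assimilate supply among fruits with potential growth rates rp.
    Abortion rule: if the source/sink ratio drops below a threshold, the weakest
    sink receives nothing. Dominance rule: a fruit whose rp greatly exceeds the
    mean of the active fruits is served at full potential before the rest share."""
    shares = [0.0] * len(rp)
    active = list(range(len(rp)))

    if supply / sum(rp) < abortion_threshold:
        active.remove(min(active, key=lambda i: rp[i]))   # abort the weakest sink
    if not active:
        return shares

    mean_rp = sum(rp[i] for i in active) / len(active)
    for i in list(active):
        if rp[i] > dominance_factor * mean_rp:            # dominant fruit served first
            shares[i] = min(rp[i], supply)
            supply -= shares[i]
            active.remove(i)

    total = sum(rp[i] for i in active)
    for i in active:
        shares[i] = supply * rp[i] / total if total > 0 else 0.0
    return shares

# Hypothetical assimilate supply (g/day) and per-fruit potential growth rates.
print(partition_assimilates(5.0, [1.0, 4.0, 1.5]))
print(partition_assimilates(1.5, [1.0, 4.0, 1.5]))   # low supply triggers abortion
```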

  5. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
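
    A few of the named metrics can be illustrated with a short, self-contained sketch, assuming a simple exceedance threshold for turning a continuous model-observation pair into an event contingency table. This is not the CCMC implementation; the threshold and synthetic data are placeholders.

```python
import numpy as np

def rmse(model, obs):
    return float(np.sqrt(np.mean((model - obs) ** 2)))

def prediction_efficiency(model, obs):
    """1 - MSE / variance of the observations (a Nash-Sutcliffe-type score)."""
    return float(1.0 - np.mean((model - obs) ** 2) / np.var(obs))

def contingency(model, obs, threshold):
    """2x2 event contingency counts for exceeding a threshold."""
    m_evt, o_evt = model >= threshold, obs >= threshold
    hits = int(np.sum(m_evt & o_evt))
    false_alarms = int(np.sum(m_evt & ~o_evt))
    misses = int(np.sum(~m_evt & o_evt))
    correct_negatives = int(np.sum(~m_evt & ~o_evt))
    return hits, false_alarms, misses, correct_negatives

def pod(h, f, m, c):
    """Probability of detection."""
    return h / (h + m) if (h + m) else float("nan")

def pofd(h, f, m, c):
    """Probability of false detection."""
    return f / (f + c) if (f + c) else float("nan")

def heidke(h, f, m, c):
    """Heidke Skill Score from a 2x2 contingency table."""
    n = h + f + m + c
    expected = ((h + m) * (h + f) + (c + m) * (c + f)) / n
    return (h + c - expected) / (n - expected) if n != expected else float("nan")

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    obs = rng.normal(size=200)                      # synthetic observed time series
    model = obs + rng.normal(scale=0.5, size=200)   # synthetic model time series
    table = contingency(model, obs, threshold=1.0)
    print("RMSE:", rmse(model, obs))
    print("PE:  ", prediction_efficiency(model, obs))
    print("POD: ", pod(*table), " POFD:", pofd(*table), " HSS:", heidke(*table))
```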

  6. Modeling and evaluation of urban pollution events of atmospheric heavy metals from a large Cu-smelter.

    PubMed

    Chen, Bing; Stein, Ariel F; Castell, Nuria; Gonzalez-Castanedo, Yolanda; Sanchez de la Campa, A M; de la Rosa, J D

    2016-01-01

    Metal smelting and processing are highly polluting activities that have a strong influence on the levels of heavy metals in air, soil, and crops. We employ an atmospheric transport and dispersion model to predict the pollution levels originating from the second largest Cu-smelter in Europe. The model predicts that the concentrations of copper (Cu), zinc (Zn), and arsenic (As) in an urban area close to the Cu-smelter can reach 170, 70, and 30 ng m−3, respectively. The model captures all the observed urban pollution events, but the predicted elemental concentrations are lower than the observed values of ~300, ~500, and ~100 ng m−3 for Cu, Zn, and As, respectively. The comparison between model and observations showed an average correlation coefficient of 0.62 ± 0.13. The simulation shows that the transport of heavy metals reaches a peak in the afternoon over the urban area. The under-prediction of the peak is explained by winds that are simulated stronger than the monitoring data indicate. The stronger simulated winds enhance the transport and dispersion of heavy metals to the regional scale, diminishing the impact of pollution events in the urban area. The model, driven by high-resolution meteorology (2 km horizontal grid spacing), predicts the hourly evolution of atmospheric heavy metal pollution in the urban area adjacent to this industrial hotspot.

  7. Effective Integration of Earth Observation Data and Flood Modeling for Rapid Disaster Response: The Texas 2015 Case

    NASA Astrophysics Data System (ADS)

    Schumann, G.

    2016-12-01

    Routinely obtaining real-time 2-D inundation patterns of a flood event at a meaningful spatial resolution and over large scales is at the moment only feasible with either operational aircraft flights or satellite imagery. Having model simulations of floodplain inundation available to complement the remote sensing data is of course highly desirable, for both event re-analysis and forecasting of event inundation. Using the Texas 2015 flood disaster, we demonstrate the value of multi-scale EO data for large-scale 2-D floodplain inundation modeling and forecasting. A dynamic re-analysis of the Texas 2015 flood disaster was run using a 2-D flood model developed for accurate large-scale simulations. We simulated the major rivers entering the Gulf of Mexico and used flood maps produced from both optical and SAR satellite imagery to examine regional model sensitivities and assess associated performance. It was demonstrated that satellite flood maps can complement model simulations and add value, although this is largely dependent on a number of important factors, such as image availability, regional landscape topology, and model uncertainty. In the favourable case where model uncertainty is high, landscape topology is complex (i.e. an urbanized coastal area) and satellite flood maps are available (from SAR, for instance), satellite data can significantly reduce model uncertainty by identifying the "best possible" model parameter set. More often, however, model uncertainty is low and spatially contiguous flooding can be mapped from satellites easily enough, such as in large rural inland river floodplains, and consequently not much value from satellites can be added. Nevertheless, where a large number of flood maps are available, model credibility can be increased substantially. In the case presented here this was true for at least 60% of the many thousands of kilometers of river flow length simulated, where satellite flood maps existed. The next step of this project is to employ a technique termed the "targeted observation" approach, an assimilation-based procedure that quantifies the impact observations have on model predictions at the local scale and along the entire river system when assimilated with the model at specific "overpass" locations.

  8. Weather extremes in very large, high-resolution ensembles: the weatherathome experiment

    NASA Astrophysics Data System (ADS)

    Allen, M. R.; Rosier, S.; Massey, N.; Rye, C.; Bowery, A.; Miller, J.; Otto, F.; Jones, R.; Wilson, S.; Mote, P.; Stone, D. A.; Yamazaki, Y. H.; Carrington, D.

    2011-12-01

    Resolution and ensemble size are often seen as alternatives in climate modelling. Models with sufficient resolution to simulate many classes of extreme weather cannot normally be run often enough to assess the statistics of rare events, still less how these statistics may be changing. As a result, assessments of the impact of external forcing on regional climate extremes must be based either on statistical downscaling from relatively coarse-resolution models, or statistical extrapolation from 10-year to 100-year events. Under the weatherathome experiment, part of the climateprediction.net initiative, we have compiled the Met Office Regional Climate Model HadRM3P to run on personal computers volunteered by the general public at 25 and 50 km resolution, embedded within the HadAM3P global atmosphere model. With a global network of about 50,000 volunteers, this allows us to run time-slice ensembles of essentially unlimited size, exploring the statistics of extreme weather under a range of scenarios for surface forcing and atmospheric composition, allowing for uncertainty in both boundary conditions and model parameters. Current experiments, developed with the support of Microsoft Research, focus on three regions: the Western USA, Europe and Southern Africa. We initially simulate the period 1959-2010 to establish which variables are realistically simulated by the model and on what scales. Our next experiments are focussing on the Event Attribution problem, exploring how the probability of various types of extreme weather would have been different over the recent past in a world unaffected by human influence, following the design of Pall et al (2011), but extended to a longer period and higher spatial resolution. We will present the first results of this unique, global, participatory experiment and discuss the implications for the attribution of recent weather events to anthropogenic influence on climate.

  9. Disease management research using event graphs.

    PubMed

    Allore, H G; Schruben, L W

    2000-08-01

    Event Graphs, conditional representations of stochastic relationships between discrete events, simulate disease dynamics. In this paper, we demonstrate how Event Graphs, at an appropriate abstraction level, also extend and organize scientific knowledge about diseases. They can identify promising treatment strategies and directions for further research and provide enough detail for testing combinations of new medicines and interventions. Event Graphs can be enriched to incorporate and validate data and test new theories to reflect an expanding dynamic scientific knowledge base and establish performance criteria for the economic viability of new treatments. To illustrate, an Event Graph is developed for mastitis, a costly dairy cattle disease, for which extensive scientific literature exists. With only a modest amount of imagination, the methodology presented here can be seen to apply modeling to any disease, human, plant, or animal. The Event Graph simulation presented here is currently being used in research and in a new veterinary epidemiology course. Copyright 2000 Academic Press.
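
    The event-scheduling idea underlying Event Graphs can be sketched in a few lines: each event handler updates state and conditionally schedules further events on a future-event list. The toy infection/recovery process below is purely illustrative and is not the mastitis model described in the paper.

```python
import heapq
import random

class EventGraphSim:
    """Minimal event-scheduling simulator in the spirit of Event Graphs: each
    event handler changes state and conditionally schedules further events."""

    def __init__(self, seed=42):
        self.clock = 0.0
        self.fel = []        # future event list: (time, sequence, handler name)
        self._seq = 0
        self.rng = random.Random(seed)
        self.infected = 0
        self.recovered = 0

    def schedule(self, delay, handler):
        self._seq += 1
        heapq.heappush(self.fel, (self.clock + delay, self._seq, handler))

    def infect(self):
        self.infected += 1
        # scheduling edge: an infection schedules its own recovery
        self.schedule(self.rng.expovariate(1 / 5.0), "recover")
        # conditional edge: further infections occur while animals remain susceptible
        if self.infected + self.recovered < 20:
            self.schedule(self.rng.expovariate(1 / 2.0), "infect")

    def recover(self):
        self.infected -= 1
        self.recovered += 1

    def run(self, horizon=100.0):
        self.schedule(0.0, "infect")
        while self.fel:
            t, _, handler = heapq.heappop(self.fel)
            if t > horizon:
                break
            self.clock = t
            getattr(self, handler)()
        return self.infected, self.recovered

if __name__ == "__main__":
    print(EventGraphSim().run())  # (currently infected, recovered) at the horizon
```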

  10. Scalable File Systems for High Performance Computing Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, S A

    2007-10-03

    Simulations of mode I interlaminar fracture toughness tests of a carbon-reinforced composite material (BMS 8-212) were conducted with LSDYNA. The fracture toughness tests were performed by U.C. Berkeley. The simulations were performed to investigate the validity and practicality of employing decohesive elements to represent interlaminar bond failures that are prevalent in carbon-fiber composite structure penetration events. The simulations employed a decohesive element formulation that was verified on a simple two element model before being employed to perform the full model simulations. Care was required during the simulations to ensure that the explicit time integration of LSDYNA duplicated the near steady-state testing conditions. In general, this study validated the use of employing decohesive elements to represent the interlaminar bond failures seen in carbon-fiber composite structures, but the practicality of employing the elements to represent the bond failures seen in carbon-fiber composite structures during penetration events was not established.

  11. Determining minimum staffing levels during snowstorms using an integrated simulation, regression, and reliability model.

    PubMed

    Kunkel, Amber; McLay, Laura A

    2013-03-01

    Emergency medical services (EMS) provide life-saving care and hospital transport to patients with severe trauma or medical conditions. Severe weather events, such as snow events, may lead to adverse patient outcomes by increasing call volumes and service times. Adequate staffing levels during such weather events are critical for ensuring that patients receive timely care. To determine staffing levels that depend on weather, we propose a model that uses a discrete event simulation of a reliability model to identify minimum staffing levels that provide timely patient care, with regression used to provide the input parameters. The system is said to be reliable if there is a high degree of confidence that ambulances can immediately respond to a given proportion of patients (e.g., 99 %). Four weather scenarios capture varying levels of snow falling and snow on the ground. An innovative feature of our approach is that we evaluate the mitigating effects of different extrinsic response policies and intrinsic system adaptation. The models use data from Hanover County, Virginia to quantify how snow reduces EMS system reliability and necessitates increasing staffing levels. The model and its analysis can assist in EMS preparedness by providing a methodology to adjust staffing levels during weather events. A key observation is that when it is snowing, intrinsic system adaptation has similar effects on system reliability as one additional ambulance.
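
    As a rough illustration of the reliability notion used above, the sketch below runs a simple discrete-event loss simulation and estimates the fraction of calls that find at least one ambulance idle, for a range of staffing levels. The arrival rate, service time, and structure are invented; the paper's model additionally incorporates regression-based inputs and the weather scenarios described above.

```python
import heapq
import random

def immediate_response_fraction(n_ambulances, call_rate, mean_service, horizon, seed=0):
    """Fraction of calls that find at least one ambulance idle (Erlang-loss style).
    Calls arriving when every unit is busy count as not immediately served."""
    rng = random.Random(seed)
    busy_until = []                 # min-heap of completion times of busy units
    t, served, total = 0.0, 0, 0
    while True:
        t += rng.expovariate(call_rate)      # next call arrival
        if t > horizon:
            break
        total += 1
        while busy_until and busy_until[0] <= t:
            heapq.heappop(busy_until)        # release units that have finished
        if len(busy_until) < n_ambulances:
            served += 1
            heapq.heappush(busy_until, t + rng.expovariate(1.0 / mean_service))
    return served / total if total else float("nan")

if __name__ == "__main__":
    for n in range(2, 7):
        p = immediate_response_fraction(n, call_rate=1.5, mean_service=1.0, horizon=10_000)
        print(f"{n} ambulances -> immediate response for {p:.1%} of calls")
```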

  12. Simulating the influence of snow surface processes on soil moisture dynamics and streamflow generation in an alpine catchment

    NASA Astrophysics Data System (ADS)

    Wever, Nander; Comola, Francesco; Bavay, Mathias; Lehning, Michael

    2017-08-01

    The assessment of flood risks in alpine, snow-covered catchments requires an understanding of the linkage between the snow cover, soil and discharge in the stream network. Here, we apply the comprehensive, distributed model Alpine3D to investigate the role of soil moisture in the predisposition of the Dischma catchment in Switzerland to high flows from rainfall and snowmelt. The recently updated soil module of the physics-based multilayer snow cover model SNOWPACK, which solves the surface energy and mass balance in Alpine3D, is verified against soil moisture measurements at seven sites and various depths inside and in close proximity to the Dischma catchment. Measurements and simulations in such terrain are difficult, and consequently soil moisture was simulated with varying degrees of success. Differences between simulated and measured soil moisture mainly arise from an overestimation of soil freezing and an absence of a groundwater description in the Alpine3D model. Both were found to have an influence on the soil moisture measurements. Using the Alpine3D simulation as the surface scheme for a spatially explicit hydrologic response model using a travel time distribution approach for interflow and baseflow, streamflow simulations were performed for the discharge from the catchment. The streamflow simulations provided a closer agreement with observed streamflow when driving the hydrologic response model with soil water fluxes at 30 cm depth in the Alpine3D model. Performance decreased when using the 2 cm soil water flux, thereby mostly ignoring soil processes. This illustrates that the role of soil moisture is important to take into account when understanding the relationship between both snowpack runoff and rainfall and catchment discharge in high alpine terrain. However, using the soil water flux at 60 cm depth to drive the hydrologic response model also decreased its performance, indicating that an optimal soil depth to include in surface simulations exists and that the runoff dynamics are controlled by only a shallow soil layer. Runoff coefficients (i.e. the ratio of discharge to rainfall) based on measurements for high rainfall and snowmelt events were found to be dependent on the simulated initial soil moisture state at the onset of an event, further illustrating the important role of soil moisture for the hydrological processes in the catchment. The runoff coefficients using simulated discharge were found to reproduce this dependency, which shows that the Alpine3D model framework can be successfully applied to assess the predisposition of the catchment to flood risks from both snowmelt and rainfall events.

  13. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other, less-detailed models. The DES team continues to innovate and expand DES capabilities to address KSC's planning needs.

  14. Simulating Sustainment for an Unmanned Logistics System Concept of Operation in Support of Distributed Operations

    DTIC Science & Technology

    2017-06-01

    ...designed experiment to model and explore a ship-to-shore logistics process supporting dispersed units via three types of ULSs, which vary primarily in... Keywords: systems, simulation, discrete event simulation, design of experiments, data analysis, simplekit, nearly orthogonal and balanced designs.

  15. Joint modeling of longitudinal data and discrete-time survival outcome.

    PubMed

    Qiu, Feiyou; Stein, Catherine M; Elston, Robert C

    2016-08-01

    A predictive joint shared parameter model is proposed for discrete time-to-event and longitudinal data. A discrete survival model with frailty and a generalized linear mixed model for the longitudinal data are joined to predict the probability of events. This joint model focuses on predicting discrete time-to-event outcome, taking advantage of repeated measurements. We show that the probability of an event in a time window can be more precisely predicted by incorporating the longitudinal measurements. The model was investigated by comparison with a two-step model and a discrete-time survival model. Results from both a study on the occurrence of tuberculosis and simulated data show that the joint model is superior to the other models in discrimination ability, especially as the latent variables related to both survival times and the longitudinal measurements depart from 0. © The Author(s) 2013.
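
    A minimal sketch of the general idea, under strong simplifying assumptions: a per-subject summary of the longitudinal marker (here an ordinary least-squares slope standing in for the latent variable) feeds a logistic discrete-time hazard, from which the probability of an event within a given number of intervals follows. The coefficients and data are invented, and this is not the authors' shared-parameter model.

```python
import numpy as np

def subject_slope(times, measurements):
    """Per-subject least-squares slope of the longitudinal marker (a toy summary
    standing in for the latent variable of a shared-parameter model)."""
    slope, _ = np.polyfit(times, measurements, 1)
    return slope

def discrete_time_event_probability(baseline_logit, beta, slope, n_intervals):
    """P(event within n_intervals) under a logistic discrete-time hazard
    h = expit(baseline_logit + beta * slope), held constant over intervals."""
    hazard = 1.0 / (1.0 + np.exp(-(baseline_logit + beta * slope)))
    return 1.0 - (1.0 - hazard) ** n_intervals

if __name__ == "__main__":
    # made-up longitudinal marker for a single subject
    times = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    marker = np.array([1.1, 1.4, 1.9, 2.4, 2.8])
    s = subject_slope(times, marker)
    p = discrete_time_event_probability(baseline_logit=-3.0, beta=1.2,
                                        slope=s, n_intervals=4)
    print(f"marker slope = {s:.2f}, P(event within 4 intervals) = {p:.3f}")
```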

  16. Evaluation of NASA's end-to-end data systems using DSDS+

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Davenport, William; Message, Philip

    1994-01-01

    The Data Systems Dynamic Simulator (DSDS+) is a software tool being developed by the authors to evaluate candidate architectures for NASA's end-to-end data systems. Via modeling and simulation, we are able to quickly predict the performance characteristics of each architecture, to evaluate 'what-if' scenarios, and to perform sensitivity analyses. As such, we are using modeling and simulation to help NASA select the optimal system configuration, and to quantify the performance characteristics of this system prior to its delivery. This paper is divided into the following six sections: (1) The role of modeling and simulation in the systems engineering process. In this section, we briefly describe the different types of results obtained by modeling each phase of the systems engineering life cycle, from concept definition through operations and maintenance; (2) Recent applications of DSDS+. In this section, we describe ongoing applications of DSDS+ in support of the Earth Observing System (EOS), and we present some of the simulation results generated for candidate system designs. So far, we have modeled individual EOS subsystems (e.g. the Solid State Recorders used onboard the spacecraft), and we have also developed an integrated model of the EOS end-to-end data processing and data communications systems (from the payloads onboard to the principal investigator facilities on the ground); (3) Overview of DSDS+. In this section we define what a discrete-event model is, and how it works. The discussion is presented relative to the DSDS+ simulation tool that we have developed, including its run-time optimization algorithms that enable DSDS+ to execute substantially faster than comparable discrete-event simulation tools; (4) Summary. In this section, we summarize our findings and 'lessons learned' during the development and application of DSDS+ to model NASA's data systems; (5) Further Information; and (6) Acknowledgements.

  17. Evaluation of NCMRWF unified model vertical cloud structure with CloudSat over the Indian summer monsoon region

    NASA Astrophysics Data System (ADS)

    Jayakumar, A.; Mamgain, Ashu; Jisesh, A. S.; Mohandas, Saji; Rakhi, R.; Rajagopal, E. N.

    2016-05-01

    Representation of rainfall distribution and monsoon circulation in the high-resolution versions of the NCMRWF Unified Model (NCUM-REG) for short-range forecasting of extreme rainfall events depends strongly on key factors such as the vertical cloud distribution, convection, and the convection/cloud relationship in the model. Hence it is highly relevant to evaluate the vertical structure of cloud and precipitation in the model over the monsoon environment. In this regard, we exploit the long observational record of CloudSat by conditioning it on the synoptic situation of the model simulation period. Simulations were run at 4-km grid length with the convective parameterization effectively switched off and on. Since the sample of CloudSat overpasses through the monsoon domain is small, the aforementioned methodology can only qualitatively evaluate the vertical cloud structure for the model simulation period. It is envisaged that the present study will open up the possibility of further improvement of the high-resolution version of NCUM in the tropics for rainfall events associated with the Indian summer monsoon.

  18. Using discrete event computer simulation to improve patient flow in a Ghanaian acute care hospital.

    PubMed

    Best, Allyson M; Dixon, Cinnamon A; Kelton, W David; Lindsell, Christopher J; Ward, Michael J

    2014-08-01

    Crowding and limited resources have increased the strain on acute care facilities and emergency departments worldwide. These problems are particularly prevalent in developing countries. Discrete event simulation is a computer-based tool that can be used to estimate how changes to complex health care delivery systems such as emergency departments will affect operational performance. Using this modality, our objective was to identify operational interventions that could potentially improve patient throughput of one acute care setting in a developing country. We developed a simulation model of acute care at a district level hospital in Ghana to test the effects of resource-neutral (eg, modified staff start times and roles) and resource-additional (eg, increased staff) operational interventions on patient throughput. Previously captured deidentified time-and-motion data from 487 acute care patients were used to develop and test the model. The primary outcome was the modeled effect of interventions on patient length of stay (LOS). The base-case (no change) scenario had a mean LOS of 292 minutes (95% confidence interval [CI], 291-293). In isolation, adding staffing, changing staff roles, and varying shift times did not substantially affect overall patient LOS. For example, adding 2 registration workers, history takers, and physicians resulted in only a 23.8-minute (95% CI, 22.3-25.3) LOS decrease. However, when shift start times were coordinated with patient arrival patterns, potential mean LOS was decreased by 96 minutes (95% CI, 94-98), and with the simultaneous combination of staff roles (registration and history taking), there was an overall mean LOS reduction of 152 minutes (95% CI, 150-154). Resource-neutral interventions identified through discrete event simulation modeling have the potential to improve acute care throughput in this Ghanaian municipal hospital. Discrete event simulation offers another approach to identifying potentially effective interventions to improve patient flow in emergency and acute care in resource-limited settings. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Algorithms and architecture for multiprocessor based circuit simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deutsch, J.T.

    Accurate electrical simulation is critical to the design of high performance integrated circuits. Logic simulators can verify function and give first-order timing information. Switch level simulators are more effective at dealing with charge sharing than standard logic simulators, but cannot provide accurate timing information or discover DC problems. Delay estimation techniques and cell level simulation can be used in constrained design methods, but must be tuned for each application, and circuit simulation must still be used to generate the cell models. None of these methods has the guaranteed accuracy that many circuit designers desire, and none can provide detailed waveform information. Detailed electrical-level simulation can predict circuit performance if devices and parasitics are modeled accurately. However, the computational requirements of conventional circuit simulators make it impractical to simulate current large circuits. In this dissertation, the implementation of Iterated Timing Analysis (ITA), a relaxation-based technique for accurate circuit simulation, on a special-purpose multiprocessor is presented. The ITA method is an SOR-Newton, relaxation-based method which uses event-driven analysis and selective trace to exploit the temporal sparsity of the electrical network. Because event-driven selective trace techniques are employed, this algorithm lends itself to implementation on a data-driven computer.
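
    The relaxation idea can be illustrated on a toy linear resistive network solved node-by-node with successive over-relaxation (SOR). This sketch omits the Newton linearization of nonlinear devices and the event-driven selective trace that make ITA efficient; the circuit values are invented.

```python
import numpy as np

def sor_nodal_solve(G, I, omega=1.2, tol=1e-10, max_iter=10_000):
    """Solve the nodal equations G v = I for node voltages by SOR relaxation.
    G: conductance matrix (n x n), I: injected currents (n,)."""
    n = len(I)
    v = np.zeros(n)
    for _ in range(max_iter):
        max_delta = 0.0
        for k in range(n):                       # node-by-node sweep
            sigma = G[k] @ v - G[k, k] * v[k]    # contribution of the other nodes
            v_new = (1 - omega) * v[k] + omega * (I[k] - sigma) / G[k, k]
            max_delta = max(max_delta, abs(v_new - v[k]))
            v[k] = v_new
        if max_delta < tol:
            break  # an event-driven scheme would instead revisit only "active" nodes
    return v

if __name__ == "__main__":
    # two-node divider: 1 V source through 1 kOhm into node 1,
    # 2 kOhm between nodes 1 and 2, 1 kOhm from node 2 to ground
    g1, g12, g2 = 1e-3, 0.5e-3, 1e-3
    G = np.array([[g1 + g12, -g12],
                  [-g12, g12 + g2]])
    I = np.array([1.0 * g1, 0.0])   # Norton equivalent of the voltage source
    print(sor_nodal_solve(G, I))    # roughly [0.75, 0.25] volts
```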

  20. Developing a discrete event simulation model for university student shuttle buses

    NASA Astrophysics Data System (ADS)

    Zulkepli, Jafri; Khalid, Ruzelan; Nawawi, Mohd Kamal Mohd; Hamid, Muhammad Hafizan

    2017-11-01

    Providing shuttle buses for university students to attend their classes is crucial, especially when the number of students is large and the distances between their classes and residential halls are long. These factors, combined with the non-optimal current bus services, typically force the students to wait longer, which eventually leads to complaints. Considerably reducing the waiting time therefore requires providing the optimal number of buses to transport students from location to location, together with effective route schedules that fulfil the students' demand in the relevant time ranges. The optimal bus number and schedules are to be determined and tested using a flexible decision platform. This paper thus models the current student shuttle bus services in a university using a Discrete Event Simulation approach. The model can flexibly simulate whatever changes are configured to the current system and report their effects on the performance measures. How the model was conceptualized and formulated for future system configurations is the main interest of this paper.

  1. HEPPA-II model-measurement intercomparison project: EPP indirect effects during the dynamically perturbed NH winter 2008-2009

    NASA Astrophysics Data System (ADS)

    Funke, Bernd; Ball, William; Bender, Stefan; Gardini, Angela; Harvey, V. Lynn; Lambert, Alyn; López-Puertas, Manuel; Marsh, Daniel R.; Meraner, Katharina; Nieder, Holger; Päivärinta, Sanna-Mari; Pérot, Kristell; Randall, Cora E.; Reddmann, Thomas; Rozanov, Eugene; Schmidt, Hauke; Seppälä, Annika; Sinnhuber, Miriam; Sukhodolov, Timofei; Stiller, Gabriele P.; Tsvetkova, Natalia D.; Verronen, Pekka T.; Versick, Stefan; von Clarmann, Thomas; Walker, Kaley A.; Yushkov, Vladimir

    2017-03-01

    We compare simulations from three high-top (with upper lid above 120 km) and five medium-top (with upper lid around 80 km) atmospheric models with observations of odd nitrogen (NOx = NO + NO2), temperature, and carbon monoxide from seven satellite instruments (ACE-FTS on SciSat, GOMOS, MIPAS, and SCIAMACHY on Envisat, MLS on Aura, SABER on TIMED, and SMR on Odin) during the Northern Hemisphere (NH) polar winter 2008/2009. The models included in the comparison are the 3-D chemistry transport model 3dCTM, the ECHAM5/MESSy Atmospheric Chemistry (EMAC) model, FinROSE, the Hamburg Model of the Neutral and Ionized Atmosphere (HAMMONIA), the Karlsruhe Simulation Model of the Middle Atmosphere (KASIMA), the modelling tools for SOlar Climate Ozone Links studies (SOCOL and CAO-SOCOL), and the Whole Atmosphere Community Climate Model (WACCM4). The comparison focuses on the energetic particle precipitation (EPP) indirect effect, that is, the polar winter descent of NOx largely produced by EPP in the mesosphere and lower thermosphere. A particular emphasis is given to the impact of the sudden stratospheric warming (SSW) in January 2009 and the subsequent elevated stratopause (ES) event associated with enhanced descent of mesospheric air. The chemistry climate model simulations have been nudged toward reanalysis data in the troposphere and stratosphere while being unconstrained above. An odd nitrogen upper boundary condition obtained from MIPAS observations has further been applied to medium-top models. Most models provide a good representation of the mesospheric tracer descent in general, and the EPP indirect effect in particular, during the unperturbed (pre-SSW) period of the NH winter 2008/2009. The observed NOx descent into the lower mesosphere and stratosphere is generally reproduced within 20 %. Larger discrepancies of a few model simulations could be traced back either to the impact of the models' gravity wave drag scheme on the polar wintertime meridional circulation or to a combination of prescribed NOx mixing ratio at the uppermost model layer and low vertical resolution. In March-April, after the ES event, however, modelled mesospheric and stratospheric NOx distributions deviate significantly from the observations. The too-fast and early downward propagation of the NOx tongue, encountered in most simulations, coincides with a temperature high bias in the lower mesosphere (0.2-0.05 hPa), likely caused by an overestimation of descent velocities. In contrast, upper-mesospheric temperatures (at 0.05-0.001 hPa) are generally underestimated by the high-top models after the onset of the ES event, being indicative for too-slow descent and hence too-low NOx fluxes. As a consequence, the magnitude of the simulated NOx tongue is generally underestimated by these models. Descending NOx amounts simulated with medium-top models are on average closer to the observations but show a large spread of up to several hundred percent. This is primarily attributed to the different vertical model domains in which the NOx upper boundary condition is applied. In general, the intercomparison demonstrates the ability of state-of-the-art atmospheric models to reproduce the EPP indirect effect in dynamically and geomagnetically quiescent NH winter conditions. 
The encountered differences between observed and simulated NOx, CO, and temperature distributions during the perturbed phase of the 2009 NH winter, however, emphasize the need for model improvements in the dynamical representation of elevated stratopause events in order to allow for a better description of the EPP indirect effect under these particular conditions.

  2. Bootstrap-based methods for estimating standard errors in Cox's regression analyses of clustered event times.

    PubMed

    Xiao, Yongling; Abrahamowicz, Michal

    2010-03-30

    We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
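
    The cluster-resampling step itself is simple and can be sketched generically; here the resampled statistic is a plain mean on synthetic clustered data rather than a Cox coefficient, to keep the example dependency-free. In an actual analysis the statistic function would refit Cox's model on each resample.

```python
import numpy as np

def cluster_bootstrap_se(cluster_ids, statistic, data, n_boot=500, seed=0):
    """Cluster bootstrap: resample whole clusters with replacement, recompute
    the statistic on the concatenated resample, and return the SD of the
    bootstrap replicates as the standard error."""
    rng = np.random.default_rng(seed)
    clusters = np.unique(cluster_ids)
    groups = {c: data[cluster_ids == c] for c in clusters}
    replicates = []
    for _ in range(n_boot):
        chosen = rng.choice(clusters, size=len(clusters), replace=True)
        resample = np.concatenate([groups[c] for c in chosen])
        replicates.append(statistic(resample))
    return float(np.std(replicates, ddof=1))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_clusters, per_cluster = 30, 8
    cluster_effects = rng.normal(scale=0.5, size=n_clusters)
    cluster_ids = np.repeat(np.arange(n_clusters), per_cluster)
    # clustered, right-skewed "event times" (synthetic; no censoring, no Cox fit)
    times = np.exp(cluster_effects[cluster_ids] + rng.normal(size=cluster_ids.size))
    naive_se = times.std(ddof=1) / np.sqrt(times.size)
    boot_se = cluster_bootstrap_se(cluster_ids, np.mean, times)
    print(f"naive SE of the mean = {naive_se:.3f}, cluster-bootstrap SE = {boot_se:.3f}")
```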

  3. User's guide to the Event Monitor: Part of Prognosis Model Version 6

    Treesearch

    Nicholas L. Crookston

    1990-01-01

    Describes how to use the Event Monitor to dynamically invoke management activities in the Prognosis Model for Stand Development. The program accepts statements of conditions -- expressed as logical expressions of stand-state variables -- to be met and sets of activities to be simulated when the conditions are met. The combination of a condition and a set of activities...

  4. Numerical simulation of a dust event in northeastern Germany with a new dust emission scheme in COSMO-ART

    USDA-ARS?s Scientific Manuscript database

    The dust emission scheme of Shao (2004) has been implemented into the regional atmospheric model COSMO-ART and has been applied to a severe dust event in northeastern Germany on 8th April 2011. The model sensitivity to soil moisture and vegetation cover has been studied. Soil moisture has been found...

  5. Learning Unknown Event Models

    DTIC Science & Technology

    2014-07-01

    ...knowledge engineering, but it is often impractical due to high environment variance, or unknown events... In Proceedings of the Twenty-Eighth AAAI Conference on Artificial Intelligence, 27-31 July 2014.

  6. Atmospheric icing of structures: Observations and simulations

    NASA Astrophysics Data System (ADS)

    Ágústsson, H.; Elíasson, Á. J.; Thorsteins, E.; Rögnvaldsson, Ó.; Ólafsson, H.

    2012-04-01

    This study compares observed icing in a test span in complex orography at Hallormsstaðaháls (575 m) in East-Iceland with parameterized icing based on an icing model and dynamically downscaled weather at high horizontal resolution. Four icing events have been selected from an extensive dataset of observed atmospheric icing in Iceland. A total of 86 test-spans have been erected since 1972 at 56 locations in complex terrain with more than 1000 icing events documented. The events used here have peak observed ice load between 4 and 36 kg/m. Most of the ice accretion is in-cloud icing but it may partly be mixed with freezing drizzle and wet snow icing. The calculation of atmospheric icing is made in two steps. First the atmospheric data is created by dynamically downscaling the ECMWF-analysis to high resolution using the non-hydrostatic mesoscale Advanced Research WRF-model. The horizontal resolution of 9, 3, 1 and 0.33 km is necessary to allow the atmospheric model to reproduce correctly local weather in the complex terrain of Iceland. Secondly, the Makkonen-model is used to calculate the ice accretion rate on the conductors based on the simulated temperature, wind, cloud and precipitation variables from the atmospheric data. In general, the atmospheric model correctly simulates the atmospheric variables and icing calculations based on the atmospheric variables correctly identify the observed icing events, but underestimate the load due to too slow ice accretion. This is most obvious when the temperature is slightly below 0°C and the observed icing is most intense. The model results improve significantly when additional observations of weather from an upstream weather station are used to nudge the atmospheric model. However, the large variability in the simulated atmospheric variables results in high temporal and spatial variability in the calculated ice accretion. Furthermore, there is high sensitivity of the icing model to the droplet size and the possibility that some of the icing may be due to freezing drizzle or wet snow instead of in-cloud icing of super-cooled droplets. In addition, the icing model (Makkonen) may not be accurate for the highest icing loads observed.
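
    The accretion calculation can be sketched using the standard Makkonen-type rate equation, dM/dt = α1 α2 α3 · w · V · A, integrated over hourly model output. In this sketch the collision, sticking, and accretion efficiencies are fixed constants, whereas the full model derives them from droplet spectra, temperature, and the icing heat balance; all input values are illustrative.

```python
def ice_accretion(lwc_series, wind_series, diameter, dt_hours=1.0,
                  alpha1=0.6, alpha2=1.0, alpha3=0.8):
    """Integrate the Makkonen-form rate dM/dt = a1*a2*a3 * w * V * A over a
    series of (hourly) model outputs for one metre of cylindrical conductor.

    lwc_series: liquid water content, kg/m^3; wind_series: wind speed, m/s;
    diameter: conductor diameter, m. alpha1..3 (collision, sticking, accretion
    efficiencies) are fixed constants here purely for illustration."""
    area = diameter * 1.0  # projected area of a 1 m conductor section, m^2
    load = 0.0
    for w, v in zip(lwc_series, wind_series):
        load += alpha1 * alpha2 * alpha3 * w * v * area * dt_hours * 3600.0
    return load  # kg per metre of conductor

if __name__ == "__main__":
    hours = 36  # an illustrative 36-hour in-cloud icing episode
    load = ice_accretion(lwc_series=[0.3e-3] * hours,
                         wind_series=[12.0] * hours,
                         diameter=0.03)
    print(f"accreted load ~ {load:.1f} kg/m")
```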

  7. Simulation of a dust episode over Eastern Mediterranean using a high-resolution atmospheric chemistry general circulation model

    NASA Astrophysics Data System (ADS)

    Abdel Kader, Mohamed; Zittis, Georgios; Astitha, Marina; Lelieveld, Jos; Tymvios, Fillipos

    2013-04-01

    An extended episode of low visibility took place over the Eastern Mediterranean in late September 2011, caused by a strong increase in dust concentrations, analyzed from observations of PM10 (particulate matter with diameter <10 μm). A high-resolution version of the atmospheric chemistry general circulation model EMAC (ECHAM5/Messy2.41 Atmospheric Chemistry) was used to simulate the emissions, transport and deposition of airborne desert dust. The model configuration involves a spectral resolution of T255 (0.5°, ~50 km) and 31 vertical levels in the troposphere and lower stratosphere. The model was nudged towards ERA40 reanalysis data to represent the actual meteorological conditions. The dust emissions were calculated online at each model time step, and the aerosol microphysics were treated using the GMXe submodel (Global Modal-aerosol eXtension). The model includes a sulphur chemistry mechanism to simulate the transformation of the dust particles from the insoluble (at emission) to soluble modes, which promotes dust removal by precipitation. The model successfully reproduces the dust distribution according to observations by the MODIS satellite instruments and ground-based AERONET stations. The PM10 concentration is also compared with in-situ measurements over Cyprus, resulting in good agreement. The model results show two subsequent dust events originating from the Negev and Sahara deserts. The first dust event resulted from the transport of dust from the Sahara on the 21st of September and lasted only briefly (hours) as the dust particles were efficiently removed by precipitation simulated by the model and observed by the TRMM (Tropical Rainfall Measuring Mission) satellites. The second event resulted from dust transport from the Negev desert to the Eastern Mediterranean during the period 26th - 30th September with a peak concentration at 2500 m elevation. This event lasted for four days and diminished due to dry deposition. The observed reduced visibility over Cyprus resulted from the sedimentation of dust originating from the Negev, followed by dry deposition at the surface. The dust particles were both pristine and polluted (sulphate coated), and we evaluate the role of mixing in the duration and extent of the episodes.

  8. High Level Architecture Distributed Space System Simulation for Simulation Interoperability Standards Organization Simulation Smackdown

    NASA Technical Reports Server (NTRS)

    Li, Zuqun

    2011-01-01

    Modeling and simulation play a very important role in mission design. They not only reduce design cost, but also prepare astronauts for their mission tasks. The SISO Smackdown is a simulation event that facilitates modeling and simulation in academia. The scenario of this year's Smackdown was to simulate a lunar base supply mission. The mission objective was to transfer Earth supply cargo to a lunar base supply depot and retrieve He-3 to take back to Earth. Federates for this scenario include the environment federate, Earth-Moon transfer vehicle, lunar shuttle, lunar rover, supply depot, mobile ISRU plant, exploratory hopper, and communication satellite. These federates were built by teams from all around the world, including teams from MIT, JSC, University of Alabama in Huntsville, University of Bordeaux from France, and University of Genoa from Italy. This paper focuses on the lunar shuttle federate, which was programmed by the USRP intern team from NASA JSC. The shuttle was responsible for providing transportation between lunar orbit and the lunar surface. The lunar shuttle federate was built using the NASA standard simulation package called Trick, and it was extended with HLA functions using TrickHLA. HLA functions of the lunar shuttle federate include sending and receiving interactions, publishing and subscribing to attributes, and packing and unpacking fixed record data. The dynamics of the lunar shuttle were modeled with three degrees of freedom, and the state propagation obeyed two-body dynamics. The descending trajectory of the lunar shuttle was designed by first defining a unique descending orbit in 2D space, and then defining a unique orbit in 3D space with the assumption of a non-rotating moon. Finally, this assumption was removed to define the initial position of the lunar shuttle so that it would start descending one second after joining the execution. VPN software from SonicWall was used to connect federates with the RTI during testing and the Smackdown event. HLA software from Pitch Technology and MAK Technology was used to edit and extend the FOM and provide HLA services for federation execution. The SISO Smackdown event for 2011 was held in Boston, Massachusetts. The federation execution lasted for one hour, and the event was very successful in catching the attention of university students and faculty.

  9. Biological Event Modeling for Response Planning

    NASA Astrophysics Data System (ADS)

    McGowan, Clement; Cecere, Fred; Darneille, Robert; Laverdure, Nate

    People worldwide continue to fear a naturally occurring or terrorist-initiated biological event. Responsible decision makers have begun to prepare for such a biological event, but critical policy and system questions remain: What are the best courses of action to prepare for and react to such an outbreak? Where should resources be stockpiled? How many hospital resources—doctors, nurses, intensive-care beds—will be required? Will quarantine be necessary? Decision analysis tools, particularly modeling and simulation, offer ways to address and help answer these questions.

  10. CHARYBDIS: a black hole event generator

    NASA Astrophysics Data System (ADS)

    Harris, Christopher M.; Richardson, Peter; Webber, Bryan R.

    2003-08-01

    CHARYBDIS is an event generator which simulates the production and decay of miniature black holes at hadronic colliders as might be possible in certain extra dimension models. It interfaces via the Les Houches accord to general purpose Monte Carlo programs like HERWIG and PYTHIA which then perform the parton evolution and hadronization. The event generator includes the extra-dimensional `grey-body' effects as well as the change in the temperature of the black hole as the decay progresses. Various options for modelling the Planck-scale terminal decay are provided.

  11. Protocols for efficient simulations of long-time protein dynamics using coarse-grained CABS model.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian

    2014-01-01

    Coarse-grained (CG) modeling is a well-acknowledged simulation approach for getting insight into long-time scale protein folding events at reasonable computational cost. Depending on the design of a CG model, the simulation protocols vary from highly case-specific ones, requiring user-defined assumptions about the folding scenario, to more sophisticated blind prediction methods for which only a protein sequence is required. Here we describe the framework protocol for the simulations of long-term dynamics of globular proteins, with the use of the CABS CG protein model and sequence data. The simulations can start from a random or a selected (e.g., native) structure. The described protocol has been validated using experimental data for protein folding model systems; the prediction results agreed well with the experimental results.

  12. Uncertainty based modeling of rainfall-runoff: Combined differential evolution adaptive Metropolis (DREAM) and K-means clustering

    NASA Astrophysics Data System (ADS)

    Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara

    2015-09-01

    Simulation of rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large scale climate signals have been proved to be effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps of rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data driven models are developed and used to forecast rainfall using large scale climate signals as rainfall predictors. Due to high effect of different sources of uncertainty on the output of hydrologic models, in the second step uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined, through developing a weighting method based on K-means clustering. Model parameters and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is remarkably improved up to 50% in comparison to the simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used in identification of the main factors affecting flood hazard analysis.
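
    One plausible reading of the combination step is sketched below: time steps are grouped into flow regimes with a small K-means routine, and within each regime the alternative model outputs are weighted inversely to their squared error before blending. This is an illustrative interpretation with invented data, not the authors' exact weighting scheme.

```python
import numpy as np

def kmeans_1d(x, k, n_iter=100, seed=0):
    """Tiny 1-D K-means: returns a cluster label for each sample of x."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)
    labels = np.zeros(len(x), dtype=int)
    for _ in range(n_iter):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        new_centers = np.array([x[labels == j].mean() if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels

def combine_models(simulations, observed, k=3):
    """Weight each model inversely to its squared error within each flow regime
    (regimes obtained by K-means on the observed flows), then blend."""
    labels = kmeans_1d(observed, k)
    combined = np.zeros_like(observed, dtype=float)
    for j in range(k):
        idx = labels == j
        errors = np.array([np.mean((sim[idx] - observed[idx]) ** 2)
                           for sim in simulations])
        weights = 1.0 / (errors + 1e-12)
        weights /= weights.sum()
        combined[idx] = sum(w * sim[idx] for w, sim in zip(weights, simulations))
    return combined

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    t = np.arange(365)
    observed = 5 + 4 * np.sin(2 * np.pi * t / 365) ** 2 + rng.gamma(1.0, 1.0, size=t.size)
    models = [observed + rng.normal(0, s, size=t.size) for s in (0.5, 1.0, 2.0)]
    blended = combine_models(models, observed)
    for name, series in (("best single model", models[0]), ("blended", blended)):
        print(name, "RMSE:", round(float(np.sqrt(np.mean((series - observed) ** 2))), 3))
```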

  13. Pseudo-global warming controls on the intensity and morphology of extreme convective storm events

    NASA Astrophysics Data System (ADS)

    Trapp, R. J.

    2015-12-01

    This research seeks to answer the basic question of how current-day extreme convective storm events might be represented under future anthropogenic climate change. We adapt the "pseudo-global warming" (PGW) methodology employed by Lackmann (2013, 2015) and others, who have investigated flooding and tropical cyclone events under climate change. Here, we exploit coupled atmosphere-ocean GCM data contributed to the CMIP5 archive, and take the mean 3D atmospheric state simulated during May 1990-1999 and subtract it from that simulated during May 2090-2099. Such 3D changes in temperature, humidity, geopotential height, and winds are added to synoptic/meso-scale analyses (NAM-ANL) of specific events, and this modified atmospheric state is then used for initial and boundary conditions for real-data WRF model simulations of the events at high resolution. Comparison of an ensemble of these simulations with control (CTRL) simulations facilitates assessment of PGW effects. In contrast to the robust development of supercellular convection in our CTRL simulations, the combined effects of increased CIN and decreased forcing under PGW led to a failure of convection initiation in many of our ensemble members. Those members that had sufficient matching between the CIN and forcing tended to generate stronger convective updrafts than in the CTRL simulations, because of the relatively higher CAPE under PGW. And, the members with enhanced updrafts also tended to have enhanced vertical rotation. In fact, such mesocyclonic rotation and attendant supercellular morphology were even found in simulations that were driven with PGW-reduced environmental wind shear.
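
    The arithmetic of the PGW perturbation is simple enough to sketch directly: the decadal-mean future state minus the decadal-mean historical state gives a delta that is added to the event analysis. Array shapes, variable names, and values below are placeholders; in practice the GCM delta would be interpolated to the analysis grid and applied to each required field.

```python
import numpy as np

def pgw_perturbation(future_fields, historical_fields):
    """Climate-change delta: decadal mean of the future fields minus decadal
    mean of the historical fields (axis 0 = time sample)."""
    return future_fields.mean(axis=0) - historical_fields.mean(axis=0)

def apply_pgw(analysis, delta):
    """Add the GCM-derived delta to the event analysis used for the model's
    initial and boundary conditions (identical grids assumed for simplicity)."""
    return analysis + delta

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    nlev, nlat, nlon = 20, 40, 60
    hist = 280 + rng.normal(size=(10, nlev, nlat, nlon))  # May of each year, 1990-1999 (toy)
    futr = 283 + rng.normal(size=(10, nlev, nlat, nlon))  # May of each year, 2090-2099 (toy)
    analysis = 281 + rng.normal(size=(nlev, nlat, nlon))  # stand-in for an event analysis
    delta = pgw_perturbation(futr, hist)
    perturbed = apply_pgw(analysis, delta)
    print("domain-mean temperature delta:", round(float(delta.mean()), 2), "K")
```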

  14. Characterization of extreme sea level at the European coast

    NASA Astrophysics Data System (ADS)

    Elizalde, Alberto; Jorda, Gabriel; Mathis, Moritz; Mikolajewicz, Uwe

    2015-04-01

    Extreme high sea levels arise as a combination of storm surges and particularly high tide events. Future climate simulations not only project changes in the atmospheric circulation, which induce changes in the wind conditions, but also an increase in the global mean sea level by thermal expansion and ice melting. Such changes increase the risk of coastal flooding, which represents a possible hazard for human activities. Therefore, it is important to investigate the pattern of sea level variability and long-term trends in coastal areas. To further analyze extreme sea level events at the European coast in future climate projections, a new setup of the global ocean model MPIOM coupled with the regional atmosphere model REMO is prepared. The MPIOM irregular grid has enhanced resolution in the European region to resolve the North and Mediterranean Seas (up to 11 x 11 km in the North Sea). The ocean model also includes the full luni-solar ephemeridic tidal potential for tide simulation. To simulate the air-sea interaction, the regional atmospheric model REMO is interactively coupled to the ocean model over Europe. This region corresponds to the EuroCORDEX domain with a 50 x 50 km resolution. Besides the standard fluxes of heat, mass (freshwater), momentum and turbulent energy input, the ocean model is also forced with sea level pressure, in order to capture the full variation of sea level. The hydrological budget within the study domain is closed using a hydrological discharge model. With this model, simulations for present climate and future climate scenarios are carried out to study transient changes in the sea level and extreme events. As a first step, two simulations (coupled and uncoupled ocean) driven by reanalysis data (ERA40) have been conducted. They are used as reference runs to evaluate the climate projection simulations. For selected coastal locations, sea level time series are separated into their different components: tides, short-term atmospheric influence (1-30 days), seasonal cycle and interannual variability. Each sea level component is statistically compared with data from local tide gauges.

  15. A study of long-term trends in mineral dust aerosol distributions in Asia using a general circulation model

    NASA Astrophysics Data System (ADS)

    Mukai, Makiko; Nakajima, Teruyuki; Takemura, Toshihiko

    2004-10-01

    Dust events have been observed in Japan with high frequency since 2000. On the other hand, the frequency of dust storms is said to have decreased in the desert regions of China since about the middle of the 1970s. This study simulates dust storms and transportation of mineral dust aerosols in the east Asia region from 1981 to 2001 using an aerosol transport model, Spectral Radiation-Transport Model for Aerosol Species (SPRINTARS), implemented in the Center for Climate System Research/National Institute for Environmental Studies atmospheric global circulation model, in order to investigate the main factors that control a dust event and its long-term variation. The model was forced to simulate a real atmospheric condition by a nudging technique using European Centre for Medium-Range Weather Forecasts reanalysis data on wind velocities, temperature, specific humidity, soil wetness, and snow depth. From a comparison between the long-term change in the dust emission and model parameters, it is found that the wind speed near the surface level had a significant influence on the dust emission, and snow is also an important factor in the early spring dust emission. The simulated results suggested that dust emissions from northeast China have a great impact on dust mass concentration in downwind regions, such as the cities of northeastern China, Korea, and Japan. When the frequency of dust events was high in Japan, a low-pressure system tended to develop over the northeast China region that caused strong winds. From 2000 to 2001 the simulated dust emission flux decreased in the Taklimakan desert and the northwestern part of China, while it increased in the Gobi desert and the northeastern part of China. Consequently, dust particles seem to be transported more from the latter region by prevailing westerlies in the springtime to downwind areas as actually observed. In spite of the similarity, however, there is still a large disagreement between observed and simulated dust frequencies and concentrations. A more realistic land surface and uplift mechanism of dust particles should be modeled to improve the model simulation. Desertification of the northeastern China region may be another reason for this disagreement.

  16. Model-Observation "Data Cubes" for the DOE Atmospheric Radiation Measurement Program's LES ARM Symbiotic Simulation and Observation (LASSO) Workflow

    NASA Astrophysics Data System (ADS)

    Vogelmann, A. M.; Gustafson, W. I., Jr.; Toto, T.; Endo, S.; Cheng, X.; Li, Z.; Xiao, H.

    2015-12-01

    The Department of Energy's Atmospheric Radiation Measurement (ARM) Climate Research Facilities' Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) Workflow is currently being designed to provide output from routine LES to complement its extensive observations. The modeling portion of the LASSO workflow is presented by Gustafson et al., which will initially focus on shallow convection over the ARM megasite in Oklahoma, USA. This presentation describes how the LES output will be combined with observations to construct multi-dimensional and dynamically consistent "data cubes", aimed at providing the best description of the atmospheric state for use in analyses by the community. The megasite observations are used to constrain large-eddy simulations that provide a complete spatial and temporal coverage of observables and, further, the simulations also provide information on processes that cannot be observed. Statistical comparisons of model output with their observables are used to assess the quality of a given simulated realization and its associated uncertainties. A data cube is a model-observation package that provides: (1) metrics of model-observation statistical summaries to assess the simulations and the ensemble spread; (2) statistical summaries of additional model property output that cannot be or are very difficult to observe; and (3) snapshots of the 4-D simulated fields from the integration period. Searchable metrics are provided that characterize the general atmospheric state to assist users in finding cases of interest, such as categorization of daily weather conditions and their specific attributes. The data cubes will be accompanied by tools designed for easy access to cube contents from within the ARM archive and externally, the ability to compare multiple data streams within an event as well as across events, and the ability to use common grids and time sampling, where appropriate.

  17. Application of Satellite and Ozonesonde Data to the Study of Nighttime Tropospheric Ozone Impacts and Relationship to Air Quality

    NASA Astrophysics Data System (ADS)

    Osterman, G. B.; Eldering, A.; Neu, J. L.; Tang, Y.; McQueen, J.; Pinder, R. W.

    2011-12-01

    To help protect human health and ecosystems, regional-scale atmospheric chemistry models are used to forecast high ozone events and to design emission control strategies that decrease the frequency and severity of ozone events. Despite the impact that nighttime aloft ozone can have on surface ozone, regional-scale atmospheric chemistry models often do not simulate nighttime ozone concentrations well, nor do they sufficiently capture the ozone transport patterns. Fully characterizing the importance of nighttime ozone has been hampered by limited measurements of the vertical distribution of ozone and ozone precursors. The main focus of this work is to begin to utilize remote sensing data sets to characterize the impact of nighttime aloft ozone on air quality events. We will describe our plans to use NASA satellite data sets, transport models, and air quality models to study ozone transport, focusing primarily on nighttime ozone, and provide initial results. We will use satellite and ozonesonde data to help understand how well the air quality models are simulating ozone in the lower free troposphere and attempt to characterize the impact of nighttime ozone on air quality events. Our specific objectives are: 1) Characterize nighttime aloft ozone using remote sensing data and sondes. 2) Evaluate the ability of the Community Multi-scale Air Quality (CMAQ) model and the National Air Quality Forecast Capability (NAQFC) model to capture the nighttime aloft ozone and its relationship to air quality events. 3) Analyze a set of air quality events and determine their relationship to the nighttime aloft ozone. We will achieve our objectives by utilizing the ozone profile data from the NASA Earth Observing System (EOS) Tropospheric Emission Spectrometer (TES) and other sensors, ozonesonde data collected during the Aura mission (IONS), EPA AirNow ground station ozone data, the CMAQ continental-scale air quality model, and the National Air Quality Forecast model.

  18. Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.

    2010-12-01

    Fusing the results of simulation tools with statistical analysis methods has contributed to our better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons - Coulomb stick and slip friction law in order to drive the earthquake process by means of a back-slip model where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determines whether events on any given fault element show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular “earthquake” along the entire fault length. Results are tabulated and then differenced with an expected correlation, calculated by assuming a uniform distribution of events in time. We generate a correlation score matrix, which indicates how weakly or strongly correlated each fault element is to every other element over the course of the Virtual California simulation. We calculate correlation scores by summing the difference between the actual and expected correlations over all time window lengths and normalizing by the time window size. The correlation score matrix can focus attention on the most interesting areas for more in-depth analysis of event correlation vs. time. The previous study included 59 faults (639 elements) in the model, comprising all the faults except the creeping section of the San Andreas. The analysis spanned 40,000 years of Virtual California-generated earthquake data. The newly revised Virtual California model includes 70 faults and 8720 fault elements, and spans 110,000 years. Due to computational considerations, we will evaluate the elements comprising the southern California region, which our previous study indicated showed interesting fault interaction and event triggering/quiescence relationships.
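    The pairwise correlation scoring described above can be sketched as follows. This is a simplified reconstruction of the described procedure, not the authors' code; the window lengths and the uniform-rate expectation are illustrative assumptions.

```python
import numpy as np

def correlation_scores(event_times, total_time, windows=(1.0, 5.0, 10.0)):
    """Correlation score matrix between fault elements: for each ordered pair,
    count events on element j that follow an event on element i within a time
    window, subtract the count expected for events uniformly distributed in
    time, sum over window lengths, and normalize by window size.
    event_times: dict {element_id: sorted array of slip times (years)}."""
    ids = sorted(event_times)
    score = np.zeros((len(ids), len(ids)))
    for a, i in enumerate(ids):
        ti = np.asarray(event_times[i])
        for b, j in enumerate(ids):
            if a == b:
                continue
            tj = np.asarray(event_times[j])
            for w in windows:
                actual = sum(np.sum((tj > t) & (tj <= t + w)) for t in ti)
                expected = len(ti) * len(tj) * w / total_time
                score[a, b] += (actual - expected) / w
    return ids, score
```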

  19. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
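    A toy illustration of such a coupling is sketched below: a discrete event loop schedules task completions while a continuous, system-dynamics-style productivity state feeds back into the event times. The rates and feedback rule are invented for the example and are not the paper's model.

```python
import heapq, random

def run_hybrid(n_tasks=50, base_rate=1.0, seed=1):
    """Discrete event process model (task completions drawn from an event queue)
    coupled to a continuously evolving productivity state that degrades as work
    accumulates, mimicking a system-dynamics feedback on the workforce."""
    random.seed(seed)
    t, productivity, done = 0.0, base_rate, 0
    events = [(random.expovariate(productivity), "finish_task")]
    while events and done < n_tasks:
        t, _ = heapq.heappop(events)
        done += 1
        # continuous feedback: schedule pressure slowly erodes productivity
        productivity = max(0.3, base_rate - 0.01 * done)
        heapq.heappush(events, (t + random.expovariate(productivity), "finish_task"))
    return t  # simulated completion time

print(run_hybrid())
```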

  20. Heavy precipitation in a changing climate: Does short-term summer precipitation increase faster?

    NASA Astrophysics Data System (ADS)

    Ban, Nikolina; Schmidli, Juerg; Schär, Christoph

    2015-04-01

    Climate models project that heavy precipitation events intensify with climate change. It is generally accepted that extreme day-long events will increase at a rate of about 6-7% per degree of warming, consistent with the Clausius-Clapeyron relation. However, recent studies suggest that sub-daily (e.g. hourly) precipitation extremes may increase at about twice this rate (referred to as super-adiabatic scaling). Conventional climate models are not suited to assess such events, due to their limited spatial resolution and the need to parameterize convective precipitation (i.e. thunderstorms and rain showers). Here we employ a convection-resolving version of the COSMO model across an extended region (1100 km x 1100 km) covering the European Alps to investigate the differences between parameterized and explicit convection in climate-change scenarios. We conduct 10-year-long integrations at resolutions of 12 and 2 km. Validation using ERA-Interim-driven simulations reveals major improvements at the 2 km resolution, in particular regarding the diurnal cycle of mean precipitation and the representation of hourly extremes. In addition, the 2 km simulations replicate the observed super-adiabatic scaling at precipitation stations, i.e. peak hourly events increase faster with environmental temperature than the Clausius-Clapeyron scaling of 7%/K (see Ban et al. 2014). Convection-resolving climate change scenarios are conducted using control (1991-2000) and scenario (2081-2090) simulations driven by a CMIP5 GCM (the MPI-ESM-LR) under the IPCC RCP8.5 scenario. Consistent with previous results, the projections reveal a significant decrease of mean summer precipitation (by 30%). However, unlike previous studies, we find that both extreme day-long and hour-long precipitation events intensify asymptotically at the Clausius-Clapeyron rate in the 2 km simulation (Ban et al. 2015). Differences from previous studies might be due to the model or region considered, but we also show that it is inconsistent to extrapolate present-day super-adiabatic precipitation scaling into the future. The applicability of the Clausius-Clapeyron scaling across the whole event spectrum is a potentially useful result for climate impact adaptation. Ban, N., J. Schmidli and C. Schär (2015): Heavy precipitation in a changing climate: Does short-term summer precipitation increase faster? Submitted to GRL. Ban, N., J. Schmidli and C. Schär (2014): Evaluation of the convection-resolving regional climate modeling approach in decade-long simulations. J. Geophys. Res. Atmos., 119, 7889-7907, doi:10.1002/2014JD021478

  1. Arctic daily temperature and precipitation extremes: Observed and simulated physical behavior

    NASA Astrophysics Data System (ADS)

    Glisan, Justin Michael

    Simulations using a six-member ensemble of Pan-Arctic WRF (PAW) were produced on two Arctic domains with 50-km resolution to analyze precipitation and temperature extremes for various periods. The first study used a domain developed for the Regional Arctic Climate Model (RACM). Initial simulations revealed deep atmospheric circulation biases over the northern Pacific Ocean, manifested in pressure, geopotential height, and temperature fields. Possible remedies to correct these large biases, such as modifying the physical domain or using different initial/boundary conditions, were unsuccessful. Spectral (interior) nudging was introduced as a way of constraining the model to be more consistent with observed behavior. However, such control over numerical model behavior raises concerns over how much nudging may affect unforced variability and extremes. Strong nudging may reduce or filter out extreme events, since the nudging pushes the model toward a relatively smooth, large-scale state. The question then becomes---what is the minimum spectral nudging needed to correct biases while not limiting the simulation of extreme events? To determine this, we use varying degrees of spectral nudging, using WRF's standard nudging as a reference point during January and July 2007. Results suggest that there is a marked lack of sensitivity to varying degrees of nudging. Moreover, given that nudging is an artificial forcing applied in the model, an important outcome of this work is that nudging strength apparently can be considerably smaller than WRF's standard strength and still produce reliable simulations. In the remaining studies, we used the same PAW setup to analyze daily precipitation extremes simulated over a 19-year period on the CORDEX Arctic domain for winter and summer. We defined these seasons as the three-month period leading up to and including the climatological sea ice maximum and minimum, respectively. Analysis focused on four North American regions defined using climatological records, regional weather patterns, and geographical/topographical features. We compared simulated extremes with those occurring at corresponding observing stations in the U.S. National Climate Data Center's (NCDC's) Global Summary of the Day. Our analysis focused on variations in features of the extremes such as magnitudes, spatial scales, and temporal regimes. Using composites of extreme events, we also analyzed the processes producing these extremes, comparing circulation, pressure, temperature and humidity fields from the ERA-Interim reanalysis and the model output. The analysis revealed the importance of atmospheric convection in the Arctic for some extreme precipitation events and the overall importance of topographic precipitation. The analysis established the physical credibility of the simulations for extreme behavior, laying a foundation for examining projected changes in extreme precipitation. It also highlighted the utility of the model for extracting behavior that one cannot discern directly from the observations, such as summer convective precipitation.

  2. A Study of Heavy Precipitation Events in Taiwan During 10-13 August, 1994. Part 2; Mesoscale Model Simulations

    NASA Technical Reports Server (NTRS)

    Tao, Wei Kuo; Chen, C.-S.; Jia, Y.; Baker, D.; Lang, S.; Wetzel, P.; Lau, W. K.-M.

    2001-01-01

    Several heavy precipitation episodes occurred over Taiwan from August 10 to 13, 1994. Precipitation patterns and characteristics are quite different between the precipitation events that occurred on August 10 and 11 and those on August 12 and 13. In Part I (Chen et al. 2001), the environmental situation and precipitation characteristics are analyzed using the EC/TOGA data, ground-based radar data, surface rainfall patterns, surface wind data, and upper air soundings. In this study (Part II), the Penn State/NCAR Mesoscale Model (MM5) is used to study the precipitation characteristics of these heavy precipitation events. Various physical processes (schemes) developed at NASA Goddard Space Flight Center (i.e., a cloud microphysics scheme, a radiative transfer model, and a land-soil-vegetation surface model) have recently been implemented into the MM5. These physical packages are described in the paper. Two-way interactive nested grids are used with horizontal resolutions of 45, 15, and 5 km. The model results indicated that cloud physics, land surface, and radiation processes generally do not change the location (horizontal distribution) of heavy precipitation. The Goddard 3-class ice scheme produced more rainfall than the 2-class scheme. The Goddard multi-broad-band radiative transfer model reduced precipitation compared to a one-broad-band (emissivity) radiation model. The Goddard land-soil-vegetation surface model also reduced the rainfall compared to a simple surface model in which the surface temperature is computed from a surface energy budget following the "force-restore" method. However, model runs including all Goddard physical processes enhanced precipitation significantly for both cases. The results from these runs are in better agreement with observations. Despite improved simulations using different physical schemes, there are still some deficiencies in the model simulations. Some potential problems are discussed. Sensitivity tests (removing either terrain or radiative processes) are performed to identify the physical processes that determine the precipitation patterns and characteristics for heavy rainfall events. These sensitivity tests indicated that terrain can play a major role in determining the exact location of both precipitation events. The terrain can also play a major role in determining the intensity of precipitation for both events; however, it has a large impact on one event but a smaller one on the other. The radiative processes are also important for determining the precipitation patterns for one case but not the other. The radiative processes can also affect the total rainfall for both cases to different extents.

  3. Debris flow run-out simulation and analysis using a dynamic model

    NASA Astrophysics Data System (ADS)

    Melo, Raquel; van Asch, Theo; Zêzere, José L.

    2018-02-01

    Only two months after a huge forest fire occurred in the upper part of a valley located in central Portugal, several debris flows were triggered by intense rainfall. The event caused infrastructural and economic damage, although no lives were lost. The present research aims to simulate the run-out of two debris flows that occurred during the event as well as to calculate, via back-analysis, the rheological parameters and the excess rain involved. Thus, a dynamic model was used which integrates surface runoff, concentrated erosion along the channels, and propagation and deposition of flow material. Afterwards, the model was validated using 32 debris flows triggered during the same event that were not considered for calibration. The rheological and entrainment parameters obtained for the most accurate simulation were then used to perform three scenarios of debris flow run-out at the basin scale. The results were compared with the locations of buildings exposed in the study area, and the worst-case scenario showed a potential inundation that may affect 345 buildings. In addition, six streams where debris flows occurred in the past and caused material damage and loss of life were identified.

  4. Broadening of cloud droplet spectra through turbulent entrainment and eddy hopping

    NASA Astrophysics Data System (ADS)

    Abade, Gustavo; Grabowski, Wojciech; Pawlowska, Hanna

    2017-11-01

    This work discusses the effect of cloud turbulence and turbulent entrainment on the evolution of the cloud droplet-size spectrum. We simulate an ensemble of idealized turbulent cloud parcels that are subject to entrainment events, modeled as a random Poisson process. Entrainment events, subsequent turbulent mixing inside the parcel, supersaturation fluctuations, and the resulting stochastic droplet growth by condensation are simulated using a Monte Carlo scheme. Quantities characterizing the turbulence intensity, entrainment rate and the mean fraction of environmental air entrained in an event are specified as external parameters. Cloud microphysics is described by applying Lagrangian particles, the so-called superdroplets. They are either unactivated cloud condensation nuclei (CCN) or cloud droplets that form from activated CCN. The model accounts for the transport of environmental CCN into the cloud by the entraining eddies at the cloud edge. Turbulent mixing of the entrained dry air with cloudy air is described using a linear model. We show that turbulence plays an important role in aiding entrained CCN to activate, providing a source of small cloud droplets and thus broadening the droplet size distribution. Further simulation results will be reported at the meeting.
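    As a simple illustration of treating entrainment as a random Poisson process, the snippet below draws entrainment event times from exponential inter-arrival times; the rate and parcel lifetime are arbitrary and this is not the authors' superdroplet scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def entrainment_times(rate_per_s, duration_s):
    """Sample entrainment event times as a homogeneous Poisson process by
    accumulating exponentially distributed inter-arrival times."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_per_s)
        if t > duration_s:
            return np.array(times)
        times.append(t)

# e.g. one entrainment event every ~300 s on average over a 30-minute parcel ascent
print(entrainment_times(rate_per_s=1 / 300.0, duration_s=1800.0))
```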

  5. Optimization Routine for Generating Medical Kits for Spaceflight Using the Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Graham, Kimberli; Myers, Jerry; Goodenow, Deb

    2017-01-01

    The Integrated Medical Model (IMM) is a MATLAB model that provides probabilistic assessment of the medical risk associated with human spaceflight missions. Different simulations or profiles can be run in which input conditions regarding both mission characteristics and crew characteristics may vary. For each simulation, the IMM records the total medical events that occur and "treats" each event with resources drawn from import scripts. IMM outputs include Total Medical Events (TME), Crew Health Index (CHI), probability of Evacuation (pEVAC), and probability of Loss of Crew Life (pLOCL). The Crew Health Index is determined by the amount of quality time lost (QTL). Previously, an optimization code was implemented in order to efficiently generate medical kits. The kits were optimized to have the greatest benefit possible, given a mass and/or volume constraint. A 6-crew, 14-day lunar mission was chosen for the simulation and run through the IMM for 100,000 trials. A built-in MATLAB solver, mixed-integer linear programming, was used for the optimization routine. Kits were generated in 10% increments ranging from 10% to 100% of the benefit constraint. Conditions where mass alone was minimized, where volume alone was minimized, and where mass and volume were minimized jointly were tested.
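    The kit-generation step is essentially a knapsack-type problem. The abstract describes MATLAB's built-in mixed-integer linear programming solver; the sketch below shows the same idea with SciPy's milp (assuming SciPy >= 1.9), minimizing mass subject to retaining a chosen fraction of the total benefit. The item benefits and masses are made up for the example.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# hypothetical kit items: benefit score and mass (kg) per item
benefit = np.array([5.0, 3.0, 4.0, 2.0, 6.0])
mass = np.array([1.2, 0.4, 0.9, 0.2, 2.0])

def min_mass_kit(benefit_fraction):
    """Select items (0/1) that minimize total mass while keeping at least the
    given fraction of the maximum possible benefit."""
    target = benefit_fraction * benefit.sum()
    res = milp(c=mass,
               constraints=LinearConstraint(benefit[np.newaxis, :], lb=target, ub=np.inf),
               integrality=np.ones(benefit.size),
               bounds=Bounds(0, 1))
    return np.round(res.x).astype(int), res.fun

for frac in (0.5, 0.8, 1.0):
    print(frac, *min_mass_kit(frac))
```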

  6. The Impact of the Geometrical Structure of the DNA on Parameters of the Track-Event Theory for Radiation Induced Cell Kill.

    PubMed

    Schneider, Uwe; Vasi, Fabiano; Besserer, Jürgen

    2016-01-01

    When fractionation schemes for hypofractionation and stereotactic body radiotherapy are considered, a reliable cell survival model at high dose is needed for calculating doses of similar biological effectiveness. An alternative to the LQ-model is the track-event theory, which is based on the probabilities of one-track and two-track events. A one-track event (OTE) is always represented by at least two simultaneous double strand breaks. A two-track event (TTE) results in one double strand break. Therefore at least two two-track events on the same or different chromosomes are necessary to produce an event which leads to cell sterilization. It is obvious that the probabilities of OTEs and TTEs must somehow depend on the geometrical structure of the chromatin. In terms of the track-event theory the ratio ε of the probabilities of OTEs and TTEs includes the geometrical dependence and is obtained in this work by simple Monte Carlo simulations. For this work it was assumed that the anchors of loop-forming chromatin are most sensitive to radiation-induced cell death. Therefore two adjacent tetranucleosomes representing the loop anchors were digitized. The probability ratio ε of OTEs and TTEs was factorized into a radiation-quality-dependent part and a geometrical part: ε = εion ∙ εgeo. εgeo was obtained for two situations by applying Monte Carlo simulation for DNA on the tetranucleosomes themselves and for linker DNA. Low energy electrons were represented by randomly distributed ionizations and high energy electrons by ionizations which were simulated on rays. εion was determined for electrons by using results from nanodosimetric measurements. The calculated ε was compared to the ε obtained from fits of the track-event model to 42 sets of experimental human cell survival data. When the two tetranucleosomes are in direct contact and the hits are randomly distributed, εgeo and ε are 0.12 and 0.85, respectively. When the hits are simulated on rays, εgeo and ε are 0.10 and 0.71. For the linker DNA, εgeo and ε for randomly distributed hits are 0.010 and 0.073, and for hits on rays 0.0058 and 0.041, respectively. The calculated ε fits the experimentally obtained ε = 0.64±0.32 best for hits on the tetranucleosomes when they are close to each other, for both high and low energy electrons. The parameter εgeo of the track-event model was obtained by pure geometrical considerations of the chromatin structure and is 0.095 ± 0.022. It can be used as a fixed parameter in the track-event theory.

  7. Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought

    NASA Astrophysics Data System (ADS)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.

    2017-10-01

    Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in the past years. The approach of "event attribution" compares the occurrence-probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
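    The core quantity behind most attribution methods is the ratio of event probabilities in the factual and counterfactual ensembles. A minimal sketch of that calculation (with synthetic data and an arbitrary threshold) is given below; it is generic and not any one of the specific methods compared in the study.

```python
import numpy as np

def probability_ratio(factual, counterfactual, threshold):
    """Probability ratio PR = p1/p0 of exceeding an event threshold in factual
    vs. counterfactual ensembles, plus the fraction of attributable risk."""
    p1 = np.mean(np.asarray(factual) >= threshold)
    p0 = np.mean(np.asarray(counterfactual) >= threshold)
    pr = p1 / p0 if p0 > 0 else np.inf
    far = 1.0 - 1.0 / pr if pr > 0 else np.nan   # FAR = 1 - p0/p1
    return p1, p0, pr, far

rng = np.random.default_rng(42)
fact = rng.normal(1.0, 1.0, 1000)      # synthetic drought index, factual climate
counter = rng.normal(0.0, 1.0, 1000)   # synthetic drought index, counterfactual climate
print(probability_ratio(fact, counter, threshold=2.0))
```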

  8. Discrete Event Simulation of Distributed Team Communication

    DTIC Science & Technology

    2012-03-22

    performs, and auditory information that is provided through multiple audio devices with speech response. This paper extends previous discrete event workload...2008, pg. 1) notes that "Architecture modeling furnishes abstractions for use in managing complexities, allowing engineers to visualise the proposed

  9. Simulation of Rate-Related (Dead-Time) Losses In Passive Neutron Multiplicity Counting Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, L.G.; Norman, P.I.; Leadbeater, T.W.

    Passive Neutron Multiplicity Counting (PNMC) based on Multiplicity Shift Register (MSR) electronics (a form of time correlation analysis) is a widely used non-destructive assay technique for quantifying spontaneously fissile materials such as Pu. At high event rates, dead-time losses perturb the count rates with the Singles, Doubles and Triples being increasingly affected. Without correction these perturbations are a major source of inaccuracy in the measured count rates and assay values derived from them. This paper presents the simulation of dead-time losses and investigates the effect of applying different dead-time models on the observed MSR data. Monte Carlo methods have been used to simulate neutron pulse trains for a variety of source intensities and with ideal detection geometry, providing an event by event record of the time distribution of neutron captures within the detection system. The action of the MSR electronics was modelled in software to analyse these pulse trains. Stored pulse trains were perturbed in software to apply the effects of dead-time according to the chosen physical process; for example, the ideal paralysable (extending) and non-paralysable models with an arbitrary dead-time parameter. Results of the simulations demonstrate the change in the observed MSR data when the system dead-time parameter is varied. In addition, the paralysable and non-paralysable models of dead-time are compared. These results form part of a larger study to evaluate existing dead-time corrections and to extend their application to correlated sources. (authors)
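    The two ideal dead-time models mentioned above differ only in whether a lost event restarts the dead period. A minimal sketch of applying both to a synthetic pulse train is shown below; the count rate and dead-time value are arbitrary and this is not the authors' MSR simulation code.

```python
import numpy as np

rng = np.random.default_rng(1)

def apply_dead_time(times, tau, paralysable):
    """Return the pulses recorded after an ideal dead-time of length tau.
    Non-paralysable: events arriving during the dead period are lost but do not
    extend it. Paralysable (extending): every event, recorded or not, restarts it."""
    recorded, t_block = [], -np.inf
    for t in times:
        if t >= t_block:
            recorded.append(t)
            t_block = t + tau
        elif paralysable:
            t_block = t + tau   # a lost event still extends the dead period
    return np.array(recorded)

# synthetic pulse train: ~1e5 counts/s for 0.1 s, 1 microsecond dead-time
pulses = np.sort(rng.uniform(0.0, 0.1, size=10_000))
for model in (False, True):
    kept = apply_dead_time(pulses, tau=1e-6, paralysable=model)
    print("paralysable" if model else "non-paralysable", len(kept))
```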

  10. Persistent cold air outbreaks over North America in a warming climate

    DOE PAGES

    Gao, Yang; Leung, L. Ruby; Lu, Jian; ...

    2015-03-30

    This study examines future changes of cold air outbreaks (CAO) using a multi-model ensemble of global climate simulations from the Coupled Model Intercomparison Project Phase 5 as well as regional high resolution climate simulations. In the future, while robust decrease of CAO duration dominates in most regions, the magnitude of decrease over northwestern U.S. is much smaller than the surrounding regions. We identified statistically significant increases in sea level pressure during CAO events centering over Yukon, Alaska, and Gulf of Alaska that advects continental cold air to northwestern U.S., leading to blocking and CAO events. Changes in large scale circulation contribute to about 50% of the enhanced sea level pressure anomaly conducive to CAO in northwestern U.S. in the future. High resolution regional simulations revealed potential contributions of increased existing snowpack to increased CAO in the near future over the Rocky Mountain, southwestern U.S., and Great Lakes areas through surface albedo effects, despite winter mean snow water equivalent decreases in the future. Overall, the multi-model projections emphasize that cold extremes do not completely disappear in a warming climate. Concomitant with the relatively smaller reduction in CAO events in northwestern U.S., the top 5 most extreme CAO events may still occur in the future, and wind chill warning will continue to have societal impacts in that region.

  11. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    PubMed

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches can be distinguished for FD: signal-processing-based FD and model-based FD. The former concerns developing algorithms to directly infer faults from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and expected values from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS which utilizes physically based water quality and hydraulics simulation models can outperform the signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (higher true positive alarms and fewer false alarms) and shorter detection time.
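    The essence of the model-based approach is a residual test: compare each station's reading with the value predicted by the hydraulic/water-quality simulation and raise an alarm when the joint discrepancy is too large. The sketch below illustrates that idea with synthetic data; the noise level, threshold, and joint statistic are illustrative assumptions, not the authors' MSEDS formulation.

```python
import numpy as np

def detect_events(measured, simulated, noise_std, threshold=3.0):
    """Flag time steps where the discrepancy between sensor readings and a
    physically based simulation, aggregated over all monitoring stations,
    exceeds `threshold` noise standard deviations.
    measured, simulated: arrays of shape (time, stations)."""
    z = (measured - simulated) / noise_std        # station-wise residuals
    joint = np.sqrt(np.mean(z ** 2, axis=1))      # multi-site statistic
    return np.where(joint > threshold)[0]

rng = np.random.default_rng(0)
sim = np.full((500, 6), 0.5)                       # simulated chlorine residual (mg/L)
obs = sim + rng.normal(0.0, 0.02, size=sim.shape)  # sensor noise
obs[300:320, 2:5] -= 0.3                           # injected contamination signature
print(detect_events(obs, sim, noise_std=0.02))
```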

  12. A framework for modeling scenario-based barrier island storm impacts

    USGS Publications Warehouse

    Mickey, Rangley; Long, Joseph W.; Dalyander, P. Soupy; Plant, Nathaniel G.; Thompson, David M.

    2018-01-01

    Methods for investigating the vulnerability of existing or proposed coastal features to storm impacts often rely on simplified parametric models or one-dimensional process-based modeling studies that focus on changes to a profile across a dune or barrier island. These simple studies tend to neglect the impacts to curvilinear or alongshore varying island planforms, influence of non-uniform nearshore hydrodynamics and sediment transport, irregular morphology of the offshore bathymetry, and impacts from low magnitude wave events (e.g. cold fronts). Presented here is a framework for simulating regionally specific, low and high magnitude scenario-based storm impacts to assess the alongshore variable vulnerabilities of a coastal feature. Storm scenarios based on historic hydrodynamic conditions were derived and simulated using the process-based morphologic evolution model XBeach. Model results show that the scenarios predicted similar patterns of erosion and overwash when compared to observed qualitative morphologic changes from recent storm events that were not included in the dataset used to build the scenarios. The framework model simulations were capable of predicting specific areas of vulnerability in the existing feature and the results illustrate how this storm vulnerability simulation framework could be used as a tool to help inform the decision-making process for scientists, engineers, and stakeholders involved in coastal zone management or restoration projects.

  13. Application of the finite element groundwater model FEWA to the engineered test facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, P.M.; Davis, E.C.

    1985-09-01

    A finite element model for water transport through porous media (FEWA) has been applied to the unconfined aquifer at the Oak Ridge National Laboratory Solid Waste Storage Area 6 Engineered Test Facility (ETF). The model was developed in 1983 as part of the Shallow Land Burial Technology - Humid Task (ONL-WL14) and was previously verified using several general hydrologic problems for which an analytic solution exists. Model application and calibration, as described in this report, consisted of modeling the ETF water table for three specialized cases: a one-dimensional steady-state simulation, a one-dimensional transient simulation, and a two-dimensional transient simulation. In the one-dimensional steady-state simulation, the FEWA output accurately predicted the water table during a long period in which there were no man-induced or natural perturbations to the system. The input parameters of most importance for this case were hydraulic conductivity and aquifer bottom elevation. In the two transient cases, the FEWA output has matched observed water table responses to a single rainfall event occurring in February 1983, yielding a calibrated finite element model that is useful for further study of additional precipitation events as well as contaminant transport at the experimental site.

  14. Modelling urban rainfall-runoff responses using an experimental, two-tiered physical modelling environment

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian; Yu, Dapeng

    2016-04-01

    Surface water (pluvial) flooding occurs when rainwater from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flooding poses a serious hazard to urban areas across the world, with the UK's perceived risk appearing to have increased in recent years due to surface water flood events seeming more severe and frequent. Surface water flood risk currently accounts for 1/3 of all UK flood risk, with approximately two million people living in urban areas at risk of a 1 in 200-year flood event. Research often focuses upon using numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation is limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data which numerical models are based upon is often erroneous and inconclusive. Physical models offer a novel, alternative and innovative environment to collect data within, creating a controlled, closed system where independent variables can be altered independently to investigate cause and effect relationships. A physical modelling environment provides a suitable platform to investigate rainfall-runoff processes occurring within an urban catchment. Despite this, physical modelling approaches are seldom used in surface water flooding research. Scaled laboratory experiments using a 9m2, two-tiered 1:100 physical model consisting of: (i) a low-cost rainfall simulator component able to simulate consistent, uniformly distributed (>75% CUC) rainfall events of varying intensity, and; (ii) a fully interchangeable, modular plot surface have been conducted to investigate and quantify the influence of a number of terrestrial and meteorological factors on overland flow and rainfall-runoff patterns within a modelled urban setting. Terrestrial factors investigated include altering the physical model's catchment slope (0°- 20°), as well as simulating a number of spatially-varied impermeability and building density/configuration scenarios. Additionally, the influence of different storm dynamics and intensities were investigated. Preliminary results demonstrate that rainfall-runoff responses in the physical modelling environment are highly sensitive to slight increases in catchment gradient and rainfall intensity and that more densely distributed building layouts significantly increase peak flows recorded at the physical model outflow when compared to sparsely distributed building layouts under comparable simulated rainfall conditions.

  15. sedFlow - a tool for simulating fractional bedload transport and longitudinal profile evolution in mountain streams

    NASA Astrophysics Data System (ADS)

    Heimann, F. U. M.; Rickenmann, D.; Turowski, J. M.; Kirchner, J. W.

    2015-01-01

    Especially in mountainous environments, the prediction of sediment dynamics is important for managing natural hazards, assessing in-stream habitats and understanding geomorphic evolution. We present the new modelling tool sedFlow for simulating fractional bedload transport dynamics in mountain streams. sedFlow is a one-dimensional model that aims to realistically reproduce the total transport volumes and overall morphodynamic changes resulting from sediment transport events such as major floods. The model is intended for temporal scales from the individual event (several hours to few days) up to longer-term evolution of stream channels (several years). The envisaged spatial scale covers complete catchments at a spatial discretisation of several tens of metres to a few hundreds of metres. sedFlow can deal with the effects of streambeds that slope uphill in a downstream direction and uses recently proposed and tested approaches for quantifying macro-roughness effects in steep channels. sedFlow offers different options for bedload transport equations, flow-resistance relationships and other elements which can be selected to fit the current application in a particular catchment. Local grain-size distributions are dynamically adjusted according to the transport dynamics of each grain-size fraction. sedFlow features fast calculations and straightforward pre- and postprocessing of simulation data. The high simulation speed allows for simulations of several years, which can be used, e.g., to assess the long-term impact of river engineering works or climate change effects. In combination with the straightforward pre- and postprocessing, the fast calculations facilitate efficient workflows for the simulation of individual flood events, because the modeller gets the immediate results as direct feedback to the selected parameter inputs. The model is provided together with its complete source code free of charge under the terms of the GNU General Public License (GPL) (www.wsl.ch/sedFlow). Examples of the application of sedFlow are given in a companion article by Heimann et al. (2015).

  16. Efficient Simulation of Tropical Cyclone Pathways with Stochastic Perturbations

    NASA Astrophysics Data System (ADS)

    Webber, R.; Plotkin, D. A.; Abbot, D. S.; Weare, J.

    2017-12-01

    Global Climate Models (GCMs) are known to statistically underpredict intense tropical cyclones (TCs) because they fail to capture the rapid intensification and high wind speeds characteristic of the most destructive TCs. Stochastic parametrization schemes have the potential to improve the accuracy of GCMs. However, current analysis of these schemes through direct sampling is limited by the computational expense of simulating a rare weather event at fine spatial gridding. The present work introduces a stochastically perturbed parametrization tendency (SPPT) scheme to increase simulated intensity of TCs. We adapt the Weighted Ensemble algorithm to simulate the distribution of TCs at a fraction of the computational effort required in direct sampling. We illustrate the efficiency of the SPPT scheme by comparing simulations at different spatial resolutions and stochastic parameter regimes. Stochastic parametrization and rare event sampling strategies have great potential to improve TC prediction and aid understanding of tropical cyclogenesis. Since rising sea surface temperatures are postulated to increase the intensity of TCs, these strategies can also improve predictions about climate change-related weather patterns. The rare event sampling strategies used in the current work are not only a novel tool for studying TCs, but they may also be applied to sampling any range of extreme weather events.
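    A heavily simplified version of the weighted-ensemble idea is sketched below: trajectories are binned by a progress coordinate (here a scalar "intensity"), and each occupied bin is resampled to a fixed number of members while the total statistical weight is conserved. The surrogate dynamics, binning, and member counts are invented for the illustration and bear no relation to the GCM configuration used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def we_resample(states, weights, bin_edges, per_bin=4):
    """One weighted-ensemble resampling step: split/merge trajectories within
    each occupied bin so it keeps `per_bin` members, conserving total weight."""
    new_states, new_weights = [], []
    bins = np.digitize(states, bin_edges)
    for b in np.unique(bins):
        idx = np.where(bins == b)[0]
        w = weights[idx]
        chosen = rng.choice(idx, size=per_bin, p=w / w.sum())
        new_states.extend(states[chosen])
        new_weights.extend([w.sum() / per_bin] * per_bin)
    return np.array(new_states), np.array(new_weights)

states = rng.normal(30.0, 5.0, size=20)     # e.g. peak wind speed (m/s) per walker
weights = np.full(20, 1.0 / 20)
for _ in range(5):
    states = states + rng.normal(0.0, 2.0, size=states.size)  # surrogate dynamics
    states, weights = we_resample(states, weights, np.arange(10.0, 80.0, 5.0))
print(states.size, round(weights.sum(), 6))  # total weight stays 1
```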

  17. A model for the perception of environmental sound based on notice-events.

    PubMed

    De Coensel, Bert; Botteldooren, Dick; De Muer, Tom; Berglund, Birgitta; Nilsson, Mats E; Lercher, Peter

    2009-08-01

    An approach is proposed to shed light on the mechanisms underlying human perception of environmental sound that intrudes in everyday living. Most research on exposure-effect relationships aims at relating overall effects to overall exposure indicators in an epidemiological fashion, without including available knowledge on the possible underlying mechanisms. Here, it is proposed to start from available knowledge on audition and perception to construct a computational framework for the effect of environmental sound on individuals. Obviously, at the individual level additional mechanisms (inter-sensory, attentional, cognitive, emotional) play a role in the perception of environmental sound. As a first step, current knowledge is made explicit by building a model mimicking some aspects of human auditory perception. This model is grounded in the hypothesis that long-term perception of environmental sound is determined primarily by short notice-events. The applicability of the notice-event model is illustrated by simulating a synthetic population exposed to typical Flemish environmental noise. From these simulation results, it is demonstrated that the notice-event model is able to mimic the differences between the annoyance caused by road traffic noise exposure and railway traffic noise exposure that are also observed empirically in other studies and thus could provide an explanation for these differences.

  18. Modeling solar energetic particle events using ENLIL heliosphere simulations

    NASA Astrophysics Data System (ADS)

    Luhmann, J. G.; Mays, M. L.; Odstrcil, D.; Li, Yan; Bain, H.; Lee, C. O.; Galvin, A. B.; Mewaldt, R. A.; Cohen, C. M. S.; Leske, R. A.; Larson, D.; Futaana, Y.

    2017-07-01

    Solar energetic particle (SEP) event modeling has gained renewed attention in part because of the availability of a decade of multipoint measurements from STEREO and L1 spacecraft at 1 AU. These observations are coupled with improving simulations of the geometry and strength of heliospheric shocks obtained by using coronagraph images to send erupted material into realistic solar wind backgrounds. The STEREO and ACE measurements in particular have highlighted the sometimes surprisingly widespread nature of SEP events. It is thus an opportune time for testing SEP models, which typically focus on protons 1-100 MeV, toward both physical insight to these observations and potentially useful space radiation environment forecasting tools. Some approaches emphasize the concept of particle acceleration and propagation from close to the Sun, while others emphasize the local field line connection to a traveling, evolving shock source. Among the latter is the previously introduced SEPMOD treatment, based on the widely accessible and well-exercised WSA-ENLIL-cone model. SEPMOD produces SEP proton time profiles at any location within the ENLIL domain. Here we demonstrate a SEPMOD version that accommodates multiple, concurrent shock sources occurring over periods of several weeks. The results illustrate the importance of considering longer-duration time periods and multiple CME contributions in analyzing, modeling, and forecasting SEP events.

  19. Study on wet scavenging of atmospheric pollutants in south Brazil

    NASA Astrophysics Data System (ADS)

    Wiegand, Flavio; Pereira, Felipe Norte; Teixeira, Elba Calesso

    2011-09-01

    The present paper presents the study of in-cloud and below-cloud SO₂ and SO₄²⁻ scavenging processes by applying numerical models in the Candiota region, located in the state of Rio Grande do Sul, South Brazil. The BRAMS (Brazilian Regional Atmospheric Modeling System) model was applied to simulate the vertical structure of the clouds, and the B.V.2 (Below-Cloud Beheng Version 2) scavenging model was applied to simulate in-cloud and below-cloud scavenging processes of the pollutants SO₂ and SO₄²⁻. Five events in 2004 were selected for this study and were sampled at the Candiota Airport station. The concentrations of SO₂ and SO₄²⁻ sampled in the air and the simulated meteorological parameters of rainfall episodes were used as input data in the B.V.2, which simulates raindrop interactions associated with the scavenging process. Results for the Candiota region showed that in-cloud scavenging processes were more significant than below-cloud scavenging processes for two of the five events studied, with a contribution of approximately 90-100% of SO₂ and SO₄²⁻ concentrations in rainwater. A few adjustments to the original version of B.V.2 were made to allow simulation of scavenging processes in several types of clouds, not only cumulus humilis and cumulus congestus.

  20. Discrete Event Simulation for Decision Modeling in Health Care: Lessons from Abdominal Aortic Aneurysm Screening

    PubMed Central

    Jones, Edmund; Masconi, Katya L.; Sweeting, Michael J.; Thompson, Simon G.; Powell, Janet T.

    2018-01-01

    Markov models are often used to evaluate the cost-effectiveness of new healthcare interventions but they are sometimes not flexible enough to allow accurate modeling or investigation of alternative scenarios and policies. A Markov model previously demonstrated that a one-off invitation to screening for abdominal aortic aneurysm (AAA) for men aged 65 y in the UK and subsequent follow-up of identified AAAs was likely to be highly cost-effective at thresholds commonly adopted in the UK (£20,000 to £30,000 per quality adjusted life-year). However, new evidence has emerged and the decision problem has evolved to include exploration of the circumstances under which AAA screening may be cost-effective, which the Markov model is not easily able to address. A new model to handle this more complex decision problem was needed, and the case of AAA screening thus provides an illustration of the relative merits of Markov models and discrete event simulation (DES) models. An individual-level DES model was built using the R programming language to reflect possible events and pathways of individuals invited to screening v. those not invited. The model was validated against key events and cost-effectiveness, as observed in a large, randomized trial. Different screening protocol scenarios were investigated to demonstrate the flexibility of the DES. The case of AAA screening highlights the benefits of DES, particularly in the context of screening studies.
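    The study's model was written in R; purely to illustrate the event-queue mechanics that distinguish an individual-level DES from a cohort Markov model, here is a skeletal Python sketch with invented rates and probabilities.

```python
import heapq, random

def simulate_person(invited, horizon_years=30, seed=None):
    """Minimal individual-level DES of a screening pathway: competing event
    times are drawn from exponential distributions (illustrative rates only)
    and the earliest event within the horizon decides the outcome."""
    rng = random.Random(seed)
    events = []
    heapq.heappush(events, (rng.expovariate(1 / 40.0), "death_other_cause"))
    if rng.random() < 0.05:                          # person has an AAA
        rupture_rate = 1 / 15.0
        if invited and rng.random() < 0.75:          # attends, AAA detected
            rupture_rate /= 10.0                     # surveillance/repair lowers risk
        heapq.heappush(events, (rng.expovariate(rupture_rate), "death_rupture"))
    t, kind = heapq.heappop(events)                  # earliest event wins
    return (horizon_years, "alive") if t > horizon_years else (t, kind)

print(simulate_person(invited=True, seed=1))
```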

  1. Model simulations of the impact of energetic particle precipitation onto the upper and middle atmosphere

    NASA Astrophysics Data System (ADS)

    Wieters, Nadine; Sinnhuber, Miriam; Winkler, Holger; Berger, Uwe; Maik Wissing, Jan; Stiller, Gabriele; Funke, Bernd; Notholt, Justus

    Solar eruptions and geomagnetic storms can produce fluxes of high-energy protons and electrons, so-called Solar Energetic Particle Events, which can enter the Earth's atmosphere especially in polar regions. These particle fluxes primarily cause ionisation and excitation in the upper atmosphere, and thereby the production of HOx and NOx species, which are catalysts for the reduction of ozone. To simulate such particle events, ionisation rates, calculated by the Atmospheric Ionization Module Osnabrück AIMOS (University of Osnabrück), have been implemented into the Bremen 3D Chemistry and Transport Model. To cover altitudes up to the mesopause, the model is driven by meteorological data, provided by the Leibniz-Institute Middle Atmosphere Model LIMA (IAP Kühlungsborn). For several electron and proton events during the highly solar-active period 2003/2004, model calculations have been carried out. To investigate the accordance of modeled to observed changes for atmospheric constituents like NO, NO2, HNO3, N2O5, ClO, and O3, results of these calculations will be compared to measurements by the Michelson Interferometer for Passive Atmospheric Sounding MIPAS (ENVISAT) instrument. Computed model results and comparisons with measurements will be presented.

  2. Survival curve estimation with dependent left truncated data using Cox's model.

    PubMed

    Mackenzie, Todd

    2012-10-19

    The Kaplan-Meier and closely related Lynden-Bell estimators are used to provide nonparametric estimation of the distribution of a left-truncated random variable. These estimators assume that the left-truncation variable is independent of the time-to-event. This paper proposes a semiparametric method for estimating the marginal distribution of the time-to-event that does not require independence. It models the conditional distribution of the time-to-event given the truncation variable using Cox's model for left truncated data, and uses inverse probability weighting. We report the results of simulations and illustrate the method using a survival study.

  3. Modeling and simulation of queuing system for customer service improvement: A case study

    NASA Astrophysics Data System (ADS)

    Xian, Tan Chai; Hong, Chai Weng; Hawari, Nurul Nazihah

    2016-10-01

    This study aims to develop a queuing model at UniMall by using a discrete event simulation approach to analyze the service performance that affects customer satisfaction. The performance measures considered in this model are the average time in the system, the total number of students served, the number of students in the waiting queue, the waiting time in the queue, and the maximum buffer length. ARENA simulation software is used to develop the simulation model and the output is analyzed. Based on the analysis of the output, it is recommended that the management of UniMall consider introducing shifts and adding another payment counter in the morning.
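    The study builds its model in ARENA; the same single-counter queue can be sketched as a small event-driven simulation in Python, shown below with illustrative arrival and service rates (a Lindley-recursion form of an M/M/1 queue).

```python
import random

def single_counter_simulation(arrival_rate, service_rate, n_customers=1000, seed=7):
    """Simulate one payment counter with exponential inter-arrival and service
    times; return the average time a customer spends in the system (hours)."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_customers):
        t += rng.expovariate(arrival_rate)
        arrivals.append(t)
    server_free_at, total_time = 0.0, 0.0
    for a in arrivals:
        start = max(a, server_free_at)                # wait if the counter is busy
        finish = start + rng.expovariate(service_rate)
        server_free_at = finish
        total_time += finish - a                      # waiting + service time
    return total_time / n_customers

# e.g. 50 students/hour arriving at a counter that serves 60/hour
print(single_counter_simulation(arrival_rate=50.0, service_rate=60.0))
```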

  4. A hybrid method for flood simulation in small catchments combining hydrodynamic and hydrological techniques

    NASA Astrophysics Data System (ADS)

    Bellos, Vasilis; Tsakiris, George

    2016-09-01

    The study presents a new hybrid method for the simulation of flood events in small catchments. It combines a physically based two-dimensional hydrodynamic model and the hydrological unit hydrograph theory. Unit hydrographs are derived using the FLOW-R2D model, which is based on the full form of the two-dimensional Shallow Water Equations, solved by a modified McCormack numerical scheme. The method is tested on a small catchment in a suburb of Athens, Greece, for a storm event which occurred in February 2013. The catchment is divided into three friction zones and unit hydrographs of 15 and 30 min are produced. The infiltration process is simulated by the empirical Kostiakov equation and the Green-Ampt model. The results from the implementation of the proposed hybrid method are compared with recorded data at the hydrometric station at the outlet of the catchment and with the results derived from the fully hydrodynamic model FLOW-R2D. It is concluded that for the case studied, the proposed hybrid method produces results close to those of the fully hydrodynamic simulation in substantially shorter computational time. This finding, if further verified in a variety of case studies, can be useful in devising effective hybrid tools for two-dimensional flood simulation that lead to accurate and considerably faster results than those achieved by fully hydrodynamic simulations.
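    Once a unit hydrograph has been derived with the hydrodynamic model, the cheap step of the hybrid method is a convolution of effective rainfall with that unit hydrograph. The sketch below shows only this convolution step with made-up numbers; it is not the FLOW-R2D code or the paper's calibrated hydrographs.

```python
import numpy as np

def runoff_from_unit_hydrograph(effective_rain_mm, unit_hydrograph):
    """Convolve an effective-rainfall series (mm per time step) with a unit
    hydrograph (discharge response to 1 mm of effective rain) to obtain the
    outlet hydrograph."""
    return np.convolve(effective_rain_mm, unit_hydrograph)

uh = np.array([0.0, 0.4, 1.0, 0.7, 0.3, 0.1])     # hypothetical 15-min UH (m^3/s per mm)
rain = np.array([0.0, 2.0, 5.0, 3.0, 0.0, 0.0])   # effective rainfall (mm per step)
print(runoff_from_unit_hydrograph(rain, uh))
```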

  5. SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeff S.

    1992-01-01

    Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.

  6. Predicting Pilot Performance in Off-Nominal Conditions: A Meta-Analysis and Model Validation

    NASA Technical Reports Server (NTRS)

    Wickens, C.D.; Hooey, B.L.; Gore, B.F.; Sebok, A.; Koenecke, C.; Salud, E.

    2009-01-01

    Pilot response to off-nominal (very rare) events represents a critical component to understanding the safety of next generation airspace technology and procedures. We describe a meta-analysis designed to integrate the existing data regarding pilot accuracy of detecting rare, unexpected events such as runway incursions in realistic flight simulations. Thirty-five studies were identified and pilot responses were categorized by expectancy, event location, and whether the pilot was flying with a highway-in-the-sky display. All three dichotomies produced large, significant effects on event miss rate. A model of human attention and noticing, N-SEEV, was then used to predict event noticing performance as a function of event salience and expectancy, and retinal eccentricity. Eccentricity is predicted from steady state scanning by the SEEV model of attention allocation. The model was used to predict miss rates for the expectancy, location and highway-in-the-sky (HITS) effects identified in the meta-analysis. The correlation between model-predicted results and data from the meta-analysis was 0.72.

  7. Importance of vesicle release stochasticity in neuro-spike communication.

    PubMed

    Ramezani, Hamideh; Akan, Ozgur B

    2017-07-01

    The aim of this paper is to propose a stochastic model for the vesicle release process, a part of neuro-spike communication. Hence, we study the biological events occurring in this process and use microphysiological simulations to observe the functionality of these events. Since the most important source of variability in vesicle release probability is the opening of voltage-dependent calcium channels (VDCCs), followed by the influx of calcium ions through these channels, we propose a stochastic model for this event while using a deterministic model for other variability sources. To capture the stochasticity of the calcium influx into the pre-synaptic neuron in our model, we study its statistics and find that it can be modeled by a distribution defined in terms of Normal and Logistic distributions.

  8. Numerical modeling of an intense precipitation event and its associated lightning activity over northern Greece

    NASA Astrophysics Data System (ADS)

    Pytharoulis, I.; Kotsopoulos, S.; Tegoulias, I.; Kartsios, S.; Bampzelis, D.; Karacostas, T.

    2016-03-01

    This study investigates an intense precipitation event and its lightning activity that affected northern Greece, and primarily Thessaloniki, on 15 July 2014. The precipitation measurement of 98.5 mm in 15 h at the Aristotle University of Thessaloniki set a new absolute record maximum. The thermodynamic analysis indicated that the event took place in an environment that could support deep thunderstorm activity. The development of this intense event was associated with significant low-level convergence and upper-level divergence even before its triggering, and with a positive vertical gradient of relative vorticity advection. The high resolution (1.667 km × 1.667 km) non-hydrostatic WRF-ARW numerical weather prediction model was used to simulate this intense precipitation event, while the Lightning Potential Index was utilized to calculate the potential for lightning activity. Sensitivity experiments suggested that although the strong synoptic forcing assumed the primary role in the occurrence of intense precipitation and lightning activity, their spatiotemporal variability was affected by topography. The application of the very fine resolution topography of the NASA Shuttle Radar Topography Mission improved the simulated precipitation and the calculated lightning potential.

  9. Hydrology of Fritchie Marsh, coastal Louisiana

    USGS Publications Warehouse

    Kuniansky, E.L.

    1985-01-01

    Fritchie Marsh, near Slidell, Louisiana, is being considered as a disposal site for sewage effluent. A two-dimensional, finite-element, surface-water modeling system was used to solve the shallow water equations for flow. Factors affecting flow patterns are channel locations, inlets, outlets, islands, marsh vegetation, marsh geometry, stage of the West Pearl River, flooding over the lower Pearl River basin, gravity tides, wind-induced currents, and sewage discharge to the marsh. Four steady-state simulations were performed for two hydrologic events at two rates of sewage discharge. The events, near tide with no wind or rain and neap tide with a tide differential across the marsh, were selected as worst-case events for sewage effluent dispersion and were treated as steady-state events. Because inflows and outflows to the marsh are tidally affected, steady-state simulations cannot fully define the hydraulic characteristics of the marsh for all hydrologic events. Model results and field data indicate that, during near tide with little or no rain, large parts of the marsh are stagnant, and sewage effluent, at existing and projected flows, has minimal effect on marsh flows. (USGS)

  10. Deep Space Storm Shelter Simulation Study

    NASA Technical Reports Server (NTRS)

    Dugan, Kathryn; Phojanamongkolkij, Nipa; Cerro, Jeffrey; Simon, Matthew

    2015-01-01

    Missions outside of Earth's magnetic field are impeded by the presence of radiation from galactic cosmic rays and solar particle events. To overcome this issue, NASA's Advanced Exploration Systems Radiation Works Storm Shelter (RadWorks) has been studying different radiation protective habitats to shield against the onset of solar particle event radiation. These habitats have the capability of protecting occupants by utilizing available materials such as food, water, brine, human waste, trash, and non-consumables to build short-term shelters. Protection comes from building a barrier with these materials that attenuates the radiation reaching the astronauts. The goal of this study is to develop a discrete event simulation modeling a solar particle event and the building of a protective shelter. The main hallway location within a larger habitat similar to the International Space Station (ISS) is analyzed. The outputs from this model are: 1) the total area covered on the shelter by the different materials, 2) the amount of radiation the crew members receive, and 3) the amount of time for setting up the habitat during specific points in a mission given an event occurs.
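
    A minimal discrete event simulation sketch in the spirit of the study above, using Python's heapq as the event queue. The event types, material areas, stow times, and crew size are invented placeholders, not values from the RadWorks work.

      import heapq

      # Hypothetical materials: (name, wall area covered in m^2, minutes to stow one unit)
      MATERIALS = [("water bags", 2.0, 5), ("food containers", 1.5, 4), ("trash bags", 1.0, 3)]
      SHELTER_AREA_NEEDED = 12.0  # m^2 of wall coverage, illustrative only

      def simulate_shelter_buildup(crew=4):
          """Return (elapsed minutes, area covered) when the shelter wall is complete."""
          clock, covered = 0.0, 0.0
          events = [(0.0, "spe_onset")]          # the solar particle event triggers the build
          queue = list(MATERIALS) * 3            # pretend each material type has 3 units on hand
          heapq.heapify(events)
          while events:
              clock, kind = heapq.heappop(events)
              if kind == "spe_onset":
                  for _ in range(crew):          # each crew member starts stowing a unit
                      if queue:
                          name, area, minutes = queue.pop()
                          heapq.heappush(events, (clock + minutes, f"placed:{name}:{area}"))
              elif kind.startswith("placed:"):
                  covered += float(kind.split(":")[2])
                  if covered >= SHELTER_AREA_NEEDED:
                      return clock, covered
                  if queue:                      # the now-free crew member grabs the next unit
                      name, area, minutes = queue.pop()
                      heapq.heappush(events, (clock + minutes, f"placed:{name}:{area}"))
          return clock, covered

      print(simulate_shelter_buildup())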

  11. Evaluating hourly rainfall characteristics over the U.S. Great Plains in dynamically downscaled climate model simulations using NASA-Unified WRF

    NASA Astrophysics Data System (ADS)

    Lee, Huikyo; Waliser, Duane E.; Ferraro, Robert; Iguchi, Takamichi; Peters-Lidard, Christa D.; Tian, Baijun; Loikith, Paul C.; Wright, Daniel B.

    2017-07-01

    Accurate simulation of extreme precipitation events remains a challenge in climate models. This study utilizes hourly precipitation data from ground stations and satellite instruments to evaluate rainfall characteristics simulated by the NASA-Unified Weather Research and Forecasting (NU-WRF) regional climate model at horizontal resolutions of 4, 12, and 24 km over the Great Plains of the United States. We also examine the sensitivity of the simulated precipitation to different spectral nudging approaches and cumulus parameterizations. The rainfall characteristics in the observations and simulations were defined as an hourly diurnal cycle of precipitation and a joint probability distribution function (JPDF) between duration and peak intensity of precipitation events over the Great Plains in summer. We calculated a JPDF for each data set and the overlapping area between observed and simulated JPDFs to measure the similarity between the two JPDFs. Comparison of the diurnal precipitation cycles between observations and simulations does not reveal the added value of high-resolution simulations. However, the performance of NU-WRF simulations measured by the JPDF metric strongly depends on horizontal resolution. The simulation with the highest resolution of 4 km shows the best agreement with the observations in simulating duration and intensity of wet spells. Spectral nudging does not affect the JPDF significantly. The effect of cumulus parameterizations on the JPDFs is considerable but smaller than that of horizontal resolution. The simulations with lower resolutions of 12 and 24 km show reasonable agreement only with the high-resolution observational data that are aggregated onto a coarse grid and spatially averaged.
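
    A sketch of the kind of JPDF-overlap similarity score described above: bin events by duration and peak intensity, normalize each 2-D histogram to sum to one, and take the summed element-wise minimum. The binning, units, and toy data below are assumptions; the study's exact choices may differ.

      import numpy as np

      def jpdf_overlap(obs_events, sim_events, dur_bins, int_bins):
          """obs_events / sim_events: arrays of (duration_h, peak_intensity_mm_per_h).
          Returns the overlap (0-1) between the two normalized joint histograms."""
          obs, sim = np.asarray(obs_events), np.asarray(sim_events)
          h_obs, _, _ = np.histogram2d(obs[:, 0], obs[:, 1], bins=[dur_bins, int_bins])
          h_sim, _, _ = np.histogram2d(sim[:, 0], sim[:, 1], bins=[dur_bins, int_bins])
          h_obs /= h_obs.sum()
          h_sim /= h_sim.sum()
          return np.minimum(h_obs, h_sim).sum()

      # Toy usage with made-up events: (duration in hours, peak intensity in mm/h).
      rng = np.random.default_rng(0)
      obs = np.column_stack([rng.gamma(2, 2.0, 500), rng.gamma(2, 5.0, 500)])
      sim = np.column_stack([rng.gamma(2, 2.2, 500), rng.gamma(2, 4.5, 500)])
      dur_bins = np.arange(0, 25, 1)
      int_bins = np.arange(0, 61, 5)
      print(round(jpdf_overlap(obs, sim, dur_bins, int_bins), 2))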

  12. Spatial and temporal variability in the R-5 infiltration data set: Déjà vu and rainfall-runoff simulations

    NASA Astrophysics Data System (ADS)

    Loague, Keith; Kyriakidis, Phaedon C.

    1997-12-01

    This paper is a continuation of the event-based rainfall-runoff model evaluation study reported by Loague and Freeze [1985]. Here we reevaluate the performance of a quasi-physically based rainfall-runoff model for three large events from the well-known R-5 catchment. Five different statistical criteria are used to judge model performance quantitatively. Temporal variability in the large R-5 infiltration data set [Loague and Gander, 1990] is filtered by working in terms of permeability. The transformed data set is reanalyzed via geostatistical methods to model the spatial distribution of permeability across the R-5 catchment. We present new estimates of the spatial distribution of infiltration, which are in turn used in our rainfall-runoff simulations with the Horton rainfall-runoff model. The new rainfall-runoff simulations, complicated by reinfiltration impacts at the smaller scales of characterization, indicate that the near-surface hydrologic response of the R-5 catchment is most probably dominated by a combination of the Horton and Dunne overland flow mechanisms.

  13. Extreme weather: Subtropical floods and tropical cyclones

    NASA Astrophysics Data System (ADS)

    Shaevitz, Daniel A.

    Extreme weather events have a large effect on society. As such, it is important to understand these events and to project how they may change in a future, warmer climate. The aim of this thesis is to develop a deeper understanding of two types of extreme weather events: subtropical floods and tropical cyclones (TCs). In the subtropics, the latitude is high enough that quasi-geostrophic dynamics are at least qualitatively relevant, while low enough that moisture may be abundant and convection strong. Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent, and large latent heat release. In the first part of this thesis, I examine the possible triggering of convection by the large-scale dynamics and investigate the coupling between the two. Specifically, two examples of extreme precipitation events in the subtropics are analyzed: the 2010 and 2014 floods of India and Pakistan, and the 2015 flood of Texas and Oklahoma. I invert the quasi-geostrophic omega equation to decompose the large-scale vertical motion profile into components due to synoptic forcing and diabatic heating. Additionally, I present model results from within the Column Quasi-Geostrophic framework. A single-column model and a cloud-resolving model are forced with the large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation with input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. It is found that convection was triggered primarily by mechanically forced orographic ascent over the Himalayas during the India/Pakistan floods and by upper-level potential vorticity disturbances during the Texas/Oklahoma flood. Furthermore, a climate attribution analysis conducted for the Texas/Oklahoma flood finds that anthropogenic climate change was responsible for a small amount of the rainfall during the event, but that the intensity of such an event may be greatly increased if it occurs in a future climate. In the second part of this thesis, I examine the ability of high-resolution global atmospheric models to simulate TCs. Specifically, I present an intercomparison of several models' ability to simulate the global characteristics of TCs in the current climate. This is a necessary first step before using these models to project future changes in TCs. Overall, the models were able to reproduce the geographic distribution of TCs reasonably well, with some of the models performing remarkably well. The intensity of TCs varied widely between the models, with some of this difference being due to model resolution.
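
    For reference, one standard textbook form of the quasi-geostrophic omega equation with a diabatic heating term is (the thesis may use a different but equivalent formulation):

      \left( \nabla^{2} + \frac{f_{0}^{2}}{\sigma}\,\frac{\partial^{2}}{\partial p^{2}} \right)\omega
        = \frac{f_{0}}{\sigma}\,\frac{\partial}{\partial p}\!\left[ \mathbf{v}_{g}\cdot\nabla\!\left( \zeta_{g} + f \right) \right]
        + \frac{1}{\sigma}\,\nabla^{2}\!\left[ \mathbf{v}_{g}\cdot\nabla\!\left( -\frac{\partial \Phi}{\partial p} \right) \right]
        - \frac{\kappa}{\sigma p}\,\nabla^{2} J

    where ω is the vertical (pressure) velocity, v_g the geostrophic wind, ζ_g the geostrophic relative vorticity, Φ the geopotential, σ the static stability parameter, κ = R/c_p, and J the diabatic heating rate. The first two right-hand-side terms correspond to the dry synoptic forcing and the last term to the diabatic-heating contribution in the decomposition mentioned above.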

  14. A Numerical Study of the Effect of Periodic Nutrient Supply on Pathways of Carbon in a Coastal Upwelling Regime

    NASA Technical Reports Server (NTRS)

    Carr, Mary-Elena

    1998-01-01

    A size-based ecosystem model was modified to include periodic upwelling events and used to evaluate the effect of episodic nutrient supply on the standing stock, carbon uptake, and carbon flow into mesozooplankton grazing and sinking flux in a coastal upwelling regime. Two ecosystem configurations were compared: a single food chain made up of net phytoplankton and mesozooplankton (one autotroph and one heterotroph, A1H1), and three interconnected food chains plus bacteria (three autotrophs and four heterotrophs, A3H4). The carbon pathways in the A1H1 simulations were under stronger physical control than those of the A3H4 runs, where the small size classes are not affected by frequent upwelling events. In the more complex food web simulations, the microbial pathway determines the total carbon uptake and grazing rates, and regenerated nitrogen accounts for more than half of the total primary production for periods of 20 days or longer between events. By contrast, new production, export of carbon through sinking and mesozooplankton grazing are more important in the A1H1 simulations. In the A3H4 simulations, the turnover time scale of the autotroph biomass increases as the period between upwelling events increases, because of the larger contribution of slow-growing net phytoplankton. The upwelling period was characterized for three upwelling sites from the alongshore wind speed measured by the NASA Scatterometer (NSCAT) and the corresponding model output compared with literature data. This validation exercise for three upwelling sites and a downstream embayment suggests that standing stock, carbon uptake and size fractionation were best supported by the A3H4 simulations, while the simulated sinking fluxes are not distinguishable in the two configurations.

  15. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent-based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each process; and programming each agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
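
    A toy sketch of the message-loop pattern the claim describes: one agent per process, each reacting to clock-tick, resources-received, and request-for-output messages. The agent behavior, process names, and cycle times are invented for illustration only.

      class ProcessAgent:
          """One agent per manufacturing process; reacts to discrete event messages."""
          def __init__(self, name, cycle_ticks):
              self.name = name
              self.cycle_ticks = cycle_ticks   # ticks needed to turn one input unit into one output
              self.inventory = 0               # received input units
              self.progress = 0
              self.output = 0

          def handle(self, event, payload=None):
              if event == "clock_tick":
                  if self.inventory > 0:
                      self.progress += 1
                      if self.progress >= self.cycle_ticks:
                          self.inventory -= 1
                          self.output += 1
                          self.progress = 0
              elif event == "resources_received":
                  self.inventory += payload
              elif event == "request_output":
                  produced, self.output = self.output, 0
                  return produced
              return None

      # Message loop on a single processor: broadcast each discrete event to every agent.
      agents = [ProcessAgent("machining", 3), ProcessAgent("assembly", 5)]
      for agent in agents:
          agent.handle("resources_received", 4)
      for tick in range(20):
          for agent in agents:
              agent.handle("clock_tick")
      print([agent.handle("request_output") for agent in agents])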

  16. Relative sea-level data from southwest Scotland constrain meltwater-driven sea-level jumps prior to the 8.2 kyr BP event

    NASA Astrophysics Data System (ADS)

    Lawrence, Thomas; Long, Antony J.; Gehrels, W. Roland; Jackson, Luke P.; Smith, David E.

    2016-11-01

    The most significant climate cooling of the Holocene is centred on 8.2 kyr BP (the '8.2 event'). Its cause is widely attributed to an abrupt slowdown of the Atlantic Meridional Overturning Circulation (AMOC) associated with the sudden drainage of Laurentide proglacial Lakes Agassiz and Ojibway, but model simulations have difficulty reproducing the event with a single-pulse scenario of freshwater input. Several lines of evidence point to multiple episodes of freshwater release from the decaying Laurentide Ice Sheet (LIS) between ∼8900 and ∼8200 cal yr BP, yet the precise number, timing and magnitude of these events - critical constraints for AMOC simulations - are far from resolved. Here we present a high-resolution relative sea level (RSL) record for the period 8800 to 7800 cal yr BP developed from estuarine and salt-marsh deposits in SW Scotland. We find that RSL rose abruptly in three steps by 0.35 m, 0.7 m, and 0.4 m (mean) at 8760-8640, 8595-8465, and 8323-8218 cal yr BP, respectively. The timing of these RSL steps correlates closely with short-lived events expressed in North Atlantic proxy climate and oceanographic records, providing evidence of at least three distinct episodes of enhanced meltwater discharge from the decaying LIS prior to the 8.2 event. Our observations can be used to test the fidelity of both climate and ice-sheet models in simulating abrupt change during the early Holocene.

  17. A proposed model for economic evaluations of major depressive disorder.

    PubMed

    Haji Ali Afzali, Hossein; Karnon, Jonathan; Gray, Jodi

    2012-08-01

    In countries like the UK and Australia, the comparability of model-based analyses is an essential aspect of reimbursement decisions for new pharmaceuticals, medical services, and technologies. Within disease areas, the use of models with alternative structures, types of modelling techniques, and/or data sources for common parameters reduces the comparability of evaluations of alternative technologies for the same condition. The aim of this paper is to propose a decision-analytic model to evaluate the long-term costs and benefits of alternative management options in patients with depression. The structure of the proposed model is based on the natural history of depression and includes clinical events that are important from both clinical and economic perspectives. Given its greater flexibility with respect to handling time, discrete event simulation (DES) is an appropriate simulation platform for modelling studies of depression. We argue that the proposed model can be used as a reference model in model-based studies of depression, improving the quality and comparability of such studies.

  18. Analysis and hindcast simulations of an extreme rainfall event in the Mediterranean area: The Genoa 2011 case

    NASA Astrophysics Data System (ADS)

    Fiori, E.; Comellas, A.; Molini, L.; Rebora, N.; Siccardi, F.; Gochis, D. J.; Tanelli, S.; Parodi, A.

    2014-03-01

    The city of Genoa, which lies between the Tyrrhenian Sea and the Apennine mountains (Liguria, Italy), was struck by severe flash floods on 4 November 2011. Nearly 500 mm of rain, a third of the average annual rainfall, fell in six hours. Six people perished and millions of euros in damages were incurred. The synoptic-scale meteorological system moved across the Atlantic Ocean and into the Mediterranean, generating floods that killed five people in southern France before moving over the Ligurian Sea and Genoa and producing the extreme event studied here. Cloud-permitting simulations (1 km) of the finger-like convective system responsible for the torrential event over Genoa were performed using the Advanced Research Weather Research and Forecasting model (ARW-WRF, version 3.3). Two different microphysics schemes (WSM6 and Thompson) as well as three different convection closures (explicit, Kain-Fritsch, and Betts-Miller-Janjic) were evaluated to gain a deeper understanding of the physical processes underlying the observed heavy rain event and of the model's capability to predict, in hindcast mode, its structure and evolution. The impact of forecast initialization and of model vertical discretization on hindcast results is also examined. Comparison between model hindcasts and observed fields provided by rain gauge, satellite, and radar data shows that this particular event is strongly sensitive to the details of the mesoscale initialization, despite having evolved from a relatively large-scale weather system. Only the meso-γ-scale details of the event were not well captured by the best configuration of the ARW-WRF model, and so peak hourly rainfall was not especially well reproduced. The results also show that specifying microphysical parameters suitable to these events has a positive impact on the prediction of heavy precipitation intensity values.

  19. Alternative Stable States, Coral Reefs, and Smooth Dynamics with a Kick.

    PubMed

    Ippolito, Stephen; Naudot, Vincent; Noonburg, Erik G

    2016-03-01

    We consider a computer simulation, which was found to be faithful to time-series data for Caribbean coral reefs, and an analytical model intended to help understand the dynamics of the simulation. The analytical model is a system of ordinary differential equations (ODE), and the authors claim this model demonstrates the existence of alternative stable states. Evidence for alternative stable states would be a sudden shift in coral and macroalgae populations while the grazing rate remains constant; in practice, however, the results of such shifts are often confounded by changes in grazing rate. Although the ODE suggest alternative stable states, the ODE need modification to account explicitly for shifts or discrete events such as hurricanes. The goal of this paper is to study the simulation dynamics through a simplified analytical representation. We proceed by modifying the original analytical model to incorporate discrete changes into the ODE. We then analyze the resulting dynamics and their bifurcations with respect to changes in grazing rate and hurricane frequency. In particular, a "kick" enabling the ODE to represent impulse events is added. Beyond adding the "kick", we employ the grazing function suggested by the simulation. The extended model was fit to the simulation data to support its use and predicts the existence of cycles that depend nonlinearly on grazing rate and hurricane frequency. These cycles may bring new insights to bear on reef health, restoration, and dynamics.
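
    A sketch of the "ODE plus kick" idea under purely illustrative dynamics: a toy coral-macroalgae competition system is integrated with forward Euler, and at hurricane times both covers are knocked down by fixed fractions. The equations, rates, kick sizes, and the simulate() function are placeholders, not the authors' model.

      def simulate(grazing=0.3, hurricane_period=15.0, years=100.0, dt=0.01):
          """Toy coral (C) / macroalgae (M) dynamics with impulsive hurricane 'kicks'."""
          n = int(years / dt)
          C, M = 0.4, 0.2
          next_hurricane = hurricane_period
          history = []
          for i in range(n):
              t = i * dt
              free = max(0.0, 1.0 - C - M)                  # free space on the reef
              dC = 0.3 * C * free - 0.05 * C - 0.1 * M * C  # growth, mortality, overgrowth by algae
              dM = 0.8 * M * free - grazing * M             # growth minus grazing pressure
              C, M = C + dt * dC, M + dt * dM
              if t >= next_hurricane:                       # the discrete "kick"
                  C *= 0.5                                  # hurricane removes half the coral cover
                  M *= 0.2                                  # and most of the macroalgae
                  next_hurricane += hurricane_period
              history.append((t, C, M))
          return history

      end_t, end_C, end_M = simulate()[-1]
      print(round(end_C, 3), round(end_M, 3))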

  20. Using the Statecharts paradigm for simulation of patient flow in surgical care.

    PubMed

    Sobolev, Boris; Harel, David; Vasilakis, Christos; Levy, Adrian

    2008-03-01

    Computer simulation of patient flow has been used extensively to assess the impacts of changes in the management of surgical care. However, little research is available on the utility of existing modeling techniques. The purpose of this paper is to examine the capacity of Statecharts, a system of graphical specification, for constructing a discrete-event simulation model of the perioperative process. The Statecharts specification paradigm was originally developed for representing reactive systems by extending the formalism of finite-state machines through notions of hierarchy, parallelism, and event broadcasting. Hierarchy permits subordination between states so that one state may contain other states. Parallelism permits more than one state to be active at any given time. Broadcasting of events allows one state to detect changes in another state. In the context of the perioperative process, hierarchy provides the means to describe steps within activities and to cluster related activities, parallelism provides the means to specify concurrent activities, and event broadcasting provides the means to trigger a series of actions in one activity according to transitions that occur in another activity. Combined with hierarchy and parallelism, event broadcasting offers a convenient way to describe the interaction of concurrent activities. We applied the Statecharts formalism to describe the progress of individual patients through surgical care as a series of asynchronous updates in patient records, generated in reaction to events produced by parallel finite-state machines representing concurrent clinical and managerial activities. We conclude that Statecharts successfully capture the behavioral aspects of surgical care delivery by specifying the permissible chronology of events, conditions, and actions.
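
    A small sketch of the three ingredients named above: hierarchy mimicked with dotted state names, parallelism via two machines active at once, and broadcasting via a shared bus. The states, events, and Machine class are invented and far simpler than a real Statecharts model of perioperative care.

      class Machine:
          def __init__(self, name, transitions, start, bus):
              self.name, self.transitions, self.state, self.bus = name, transitions, start, bus
              bus.append(self)

          def fire(self, event):
              key = (self.state, event)
              if key in self.transitions:
                  self.state, broadcast = self.transitions[key]
                  print(f"{self.name}: -> {self.state}")
                  if broadcast:                      # event broadcasting: notify the other machines
                      for m in self.bus:
                          if m is not self:
                              m.fire(broadcast)

      bus = []
      # Hierarchy is mimicked with dotted state names (e.g. 'surgery.incision').
      clinical = Machine("clinical",
          {("ward.waiting", "called_to_or"): ("surgery.incision", "patient_in_or"),
           ("surgery.incision", "closed"): ("recovery.pacu", "or_free")},
          "ward.waiting", bus)
      managerial = Machine("managerial",
          {("or.idle", "patient_in_or"): ("or.occupied", None),
           ("or.occupied", "or_free"): ("or.turnover", "clean_room")},
          "or.idle", bus)

      # The two machines run in parallel; a transition in one drives the other via broadcast.
      clinical.fire("called_to_or")   # clinical enters surgery, managerial OR becomes occupied
      clinical.fire("closed")         # clinical enters recovery, managerial OR goes to turnover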
