Sample records for years simulation models

  1. Dynamical Downscaling of NASA/GISS ModelE: Continuous, Multi-Year WRF Simulations

    NASA Astrophysics Data System (ADS)

    Otte, T.; Bowden, J. H.; Nolte, C. G.; Otte, M. J.; Herwehe, J. A.; Faluvegi, G.; Shindell, D. T.

    2010-12-01

    The WRF Model is being used at the U.S. EPA for dynamical downscaling of the NASA/GISS ModelE fields to assess regional impacts of climate change in the United States. The WRF model has been successfully linked to the ModelE fields in their raw hybrid vertical coordinate, and continuous, multi-year WRF downscaling simulations have been performed. WRF will be used to downscale decadal time slices of ModelE for recent past, current, and future climate as the simulations being conducted for the IPCC Fifth Assessment Report become available. This presentation will focus on the sensitivity to interior nudging within the RCM. The use of interior nudging for downscaled regional climate simulations has been somewhat controversial over the past several years but has been recently attracting attention. Several recent studies that have used reanalysis (i.e., verifiable) fields as a proxy for GCM input have shown that interior nudging can be beneficial toward achieving the desired downscaled fields. In this study, the value of nudging will be shown using fields from ModelE that are downscaled using WRF. Several different methods of nudging are explored, and it will be shown that the method of nudging and the choices made with respect to how nudging is used in WRF are critical to balance the constraint of ModelE against the freedom of WRF to develop its own fields.

  2. CHARMM-GUI 10 Years for Biomolecular Modeling and Simulation

    PubMed Central

    Jo, Sunhwan; Cheng, Xi; Lee, Jumin; Kim, Seonghoon; Park, Sang-Jun; Patel, Dhilon S.; Beaven, Andrew H.; Lee, Kyu Il; Rui, Huan; Roux, Benoît; MacKerell, Alexander D.; Klauda, Jeffrey B.; Qi, Yifei

    2017-01-01

    CHARMM-GUI, http://www.charmm-gui.org, is a web-based graphical user interface that prepares complex biomolecular systems for molecular simulations. CHARMM-GUI creates input files for a number of programs including CHARMM, NAMD, GROMACS, AMBER, GENESIS, LAMMPS, Desmond, OpenMM, and CHARMM/OpenMM. Since its original development in 2006, CHARMM-GUI has been widely adopted for various purposes and now contains a number of different modules designed to set up a broad range of simulations: (1) PDB Reader & Manipulator, Glycan Reader, and Ligand Reader & Modeler for reading and modifying molecules; (2) Quick MD Simulator, Membrane Builder, Nanodisc Builder, HMMM Builder, Monolayer Builder, Micelle Builder, and Hex Phase Builder for building all-atom simulation systems in various environments; (3) PACE CG Builder and Martini Maker for building coarse-grained simulation systems; (4) DEER Facilitator and MDFF/xMDFF Utilizer for experimentally guided simulations; (5) Implicit Solvent Modeler, PBEQ-Solver, and GCMC/BD Ion Simulator for implicit solvent related calculations; (6) Ligand Binder for ligand solvation and binding free energy simulations; and (7) Drude Prepper for preparation of simulations with the CHARMM Drude polarizable force field. Recently, new modules have been integrated into CHARMM-GUI, such as Glycolipid Modeler for generation of various glycolipid structures, and LPS Modeler for generation of lipopolysaccharide structures from various Gram-negative bacteria. These new features together with existing modules are expected to facilitate advanced molecular modeling and simulation thereby leading to an improved understanding of the molecular details of the structure and dynamics of complex biomolecular systems. Here, we briefly review these capabilities and discuss potential future directions in the CHARMM-GUI development project. PMID:27862047

  3. CHARMM-GUI 10 years for biomolecular modeling and simulation.

    PubMed

    Jo, Sunhwan; Cheng, Xi; Lee, Jumin; Kim, Seonghoon; Park, Sang-Jun; Patel, Dhilon S; Beaven, Andrew H; Lee, Kyu Il; Rui, Huan; Park, Soohyung; Lee, Hui Sun; Roux, Benoît; MacKerell, Alexander D; Klauda, Jeffrey B; Qi, Yifei; Im, Wonpil

    2017-06-05

CHARMM-GUI, http://www.charmm-gui.org, is a web-based graphical user interface that prepares complex biomolecular systems for molecular simulations. CHARMM-GUI creates input files for a number of programs including CHARMM, NAMD, GROMACS, AMBER, GENESIS, LAMMPS, Desmond, OpenMM, and CHARMM/OpenMM. Since its original development in 2006, CHARMM-GUI has been widely adopted for various purposes and now contains a number of different modules designed to set up a broad range of simulations: (1) PDB Reader & Manipulator, Glycan Reader, and Ligand Reader & Modeler for reading and modifying molecules; (2) Quick MD Simulator, Membrane Builder, Nanodisc Builder, HMMM Builder, Monolayer Builder, Micelle Builder, and Hex Phase Builder for building all-atom simulation systems in various environments; (3) PACE CG Builder and Martini Maker for building coarse-grained simulation systems; (4) DEER Facilitator and MDFF/xMDFF Utilizer for experimentally guided simulations; (5) Implicit Solvent Modeler, PBEQ-Solver, and GCMC/BD Ion Simulator for implicit solvent related calculations; (6) Ligand Binder for ligand solvation and binding free energy simulations; and (7) Drude Prepper for preparation of simulations with the CHARMM Drude polarizable force field. Recently, new modules have been integrated into CHARMM-GUI, such as Glycolipid Modeler for generation of various glycolipid structures, and LPS Modeler for generation of lipopolysaccharide structures from various Gram-negative bacteria. These new features together with existing modules are expected to facilitate advanced molecular modeling and simulation, thereby leading to an improved understanding of the structure and dynamics of complex biomolecular systems. Here, we briefly review these capabilities and discuss potential future directions in the CHARMM-GUI development project. © 2016 Wiley Periodicals, Inc.

  4. The effect of year-to-year variability of leaf area index on Variable Infiltration Capacity model performance and simulation of runoff

    NASA Astrophysics Data System (ADS)

    Tesemma, Z. K.; Wei, Y.; Peel, M. C.; Western, A. W.

    2015-09-01

    This study assessed the effect of using observed monthly leaf area index (LAI) on hydrological model performance and the simulation of runoff using the Variable Infiltration Capacity (VIC) hydrological model in the Goulburn-Broken catchment of Australia, which has heterogeneous vegetation, soil and climate zones. VIC was calibrated with both observed monthly LAI and long-term mean monthly LAI, which were derived from the Global Land Surface Satellite (GLASS) leaf area index dataset covering the period from 1982 to 2012. The model performance under wet and dry climates for the two different LAI inputs was assessed using three criteria, the classical Nash-Sutcliffe efficiency, the logarithm transformed flow Nash-Sutcliffe efficiency and the percentage bias. Finally, the deviation of the simulated monthly runoff using the observed monthly LAI from simulated runoff using long-term mean monthly LAI was computed. The VIC model predicted monthly runoff in the selected sub-catchments with model efficiencies ranging from 61.5% to 95.9% during calibration (1982-1997) and 59% to 92.4% during validation (1998-2012). Our results suggest systematic improvements, from 4% to 25% in Nash-Sutcliffe efficiency, in sparsely forested sub-catchments when the VIC model was calibrated with observed monthly LAI instead of long-term mean monthly LAI. There was limited systematic improvement in tree dominated sub-catchments. The results also suggest that the model overestimation or underestimation of runoff during wet and dry periods can be reduced to 25 mm and 35 mm respectively by including the year-to-year variability of LAI in the model, thus reflecting the responses of vegetation to fluctuations in climate and other factors. Hence, the year-to-year variability in LAI should not be neglected; rather it should be included in model calibration as well as simulation of monthly water balance.
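
The three performance criteria named in the abstract are standard hydrological measures. As a minimal sketch (not the authors' code), they can be computed as follows, where `obs` and `sim` are equal-length series of observed and simulated monthly runoff:

```python
import math

def nse(obs, sim):
    """Classical Nash-Sutcliffe efficiency: 1 is a perfect fit,
    0 means the model is no better than the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def log_nse(obs, sim, eps=1e-6):
    """NSE on logarithm-transformed flows, which emphasizes low-flow fit;
    eps guards against log of zero flow."""
    return nse([math.log(o + eps) for o in obs],
               [math.log(s + eps) for s in sim])

def pbias(obs, sim):
    """Percentage bias of simulated relative to observed totals."""
    return 100.0 * sum(s - o for o, s in zip(obs, sim)) / sum(obs)
```

Note that sign conventions for percentage bias differ between studies; in this sketch positive values indicate overestimation by the model.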

  5. Effect of year-to-year variability of leaf area index on variable infiltration capacity model performance and simulation of streamflow during drought

    NASA Astrophysics Data System (ADS)

    Tesemma, Z. K.; Wei, Y.; Peel, M. C.; Western, A. W.

    2014-09-01

    This study assessed the effect of using observed monthly leaf area index (LAI) on hydrologic model performance and the simulation of streamflow during drought using the variable infiltration capacity (VIC) hydrological model in the Goulburn-Broken catchment of Australia, which has heterogeneous vegetation, soil and climate zones. VIC was calibrated with both observed monthly LAI and long-term mean monthly LAI, which were derived from the Global Land Surface Satellite (GLASS) observed monthly LAI dataset covering the period from 1982 to 2012. The model performance under wet and dry climates for the two different LAI inputs was assessed using three criteria, the classical Nash-Sutcliffe efficiency, the logarithm transformed flow Nash-Sutcliffe efficiency and the percentage bias. Finally, the percentage deviation of the simulated monthly streamflow using the observed monthly LAI from simulated streamflow using long-term mean monthly LAI was computed. The VIC model predicted monthly streamflow in the selected sub-catchments with model efficiencies ranging from 61.5 to 95.9% during calibration (1982-1997) and 59 to 92.4% during validation (1998-2012). Our results suggest systematic improvements from 4 to 25% in the Nash-Sutcliffe efficiency in pasture dominated catchments when the VIC model was calibrated with the observed monthly LAI instead of the long-term mean monthly LAI. There was limited systematic improvement in tree dominated catchments. The results also suggest that the model overestimation or underestimation of streamflow during wet and dry periods can be reduced to some extent by including the year-to-year variability of LAI in the model, thus reflecting the responses of vegetation to fluctuations in climate and other factors. Hence, the year-to-year variability in LAI should not be neglected; rather it should be included in model calibration as well as simulation of monthly water balance.

  6. A finite element model of a six-year-old child for simulating pedestrian accidents.

    PubMed

    Meng, Yunzhu; Pak, Wansoo; Guleyupoglu, Berkan; Koya, Bharath; Gayzik, F Scott; Untaroiu, Costin D

    2017-01-01

Child pedestrian protection deserves more attention in vehicle safety design, since children are the most vulnerable road users and face the highest mortality rate. Pediatric Finite Element (FE) models can be used to simulate and understand pedestrian injury mechanisms during crashes in order to mitigate them. Thus, the objective of this study was to develop a computationally efficient (simplified) six-year-old (6YO-PS) pedestrian FE model and validate it against the latest published pediatric data. The 6YO-PS FE model was developed by morphing the existing GHBMC adult pedestrian model. Retrospective scan data were used to locally adjust the geometry as needed for accuracy. Component test simulations focused on only the lower extremities and pelvis, which are the first body regions impacted during pedestrian accidents. Three-point bending test simulations were performed on the femur and tibia, first with adult material properties and then updated using child material properties. Pelvis impact and knee bending tests were also simulated. Finally, a series of pediatric Car-to-Pedestrian Collision (CPC) simulations were performed with pre-impact velocities ranging from 20 km/h up to 60 km/h. The bone models assigned pediatric material properties showed lower stiffness and a good match in terms of fracture force to the test data (less than 6% error). The pelvis impact force predicted by the child model showed a similar trend to the test data. The whole pedestrian model was stable during CPC simulations and predicted common pedestrian injuries. Overall, the 6YO-PS FE model developed in this study showed good biofidelity at the component level (lower extremity and pelvis) and stability in CPC simulations. While more validation would improve it, the current model can be used to investigate lower limb injury mechanisms and to predict the impact parameters specified in regulatory testing protocols. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. A precipitation-runoff model for simulating natural streamflow conditions in the Smith River watershed, Montana, water years 1996-2008

    USGS Publications Warehouse

    Chase, Katherine J.; Caldwell, Rodney R.; Stanley, Andrea K.

    2014-01-01

    This report documents the construction of a precipitation-runoff model for simulating natural streamflow in the Smith River watershed, Montana. This Precipitation-Runoff Modeling System model, constructed in cooperation with the Meagher County Conservation District, can be used to examine the general hydrologic framework of the Smith River watershed, including quantification of precipitation, evapotranspiration, and streamflow; partitioning of streamflow between surface runoff and subsurface flow; and quantifying contributions to streamflow from several parts of the watershed. The model was constructed by using spatial datasets describing watershed topography, the streams, and the hydrologic characteristics of the basin soils and vegetation. Time-series data (daily total precipitation, and daily minimum and maximum temperature) were input to the model to simulate daily streamflow. The model was calibrated for water years 2002–2007 and evaluated for water years 1996–2001. Though water year 2008 was included in the study period to evaluate water-budget components, calibration and evaluation data were unavailable for that year. During the calibration and evaluation periods, simulated-natural flow values were compared to reconstructed-natural streamflow data. These reconstructed-natural streamflow data were calculated by adding Bureau of Reclamation’s depletions data to the observed streamflows. Reconstructed-natural streamflows represent estimates of streamflows for water years 1996–2007 assuming there was no agricultural water-resources development in the watershed. Additional calibration targets were basin mean monthly solar radiation and potential evapotranspiration. The model estimated the hydrologic processes in the Smith River watershed during the calibration and evaluation periods. Simulated-natural mean annual and mean monthly flows generally were the same or higher than the reconstructed-natural streamflow values during the calibration period, whereas

  8. Global climate simulations at 3000-year intervals for the last 21 000 years with the GENMOM coupled atmosphere–ocean model

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

We apply GENMOM, a coupled atmosphere–ocean climate model, to simulate eight equilibrium time slices at 3000-year intervals for the past 21 000 years, forced by changes in Earth–Sun geometry, atmospheric greenhouse gases (GHGs), continental ice sheets, and sea level. Simulated global cooling during the Last Glacial Maximum (LGM) is 3.8 °C, and the rate of post-glacial warming is in overall agreement with recently published temperature reconstructions. The greatest rate of warming occurs between 15 and 12 ka (2.4 °C over land, 0.7 °C over oceans, and 1.4 °C globally) in response to changes in radiative forcing from the diminished extent of the Northern Hemisphere (NH) ice sheets and increases in GHGs and NH summer insolation. The modeled LGM and 6 ka temperature and precipitation climatologies are generally consistent with proxy reconstructions, the PMIP2 and PMIP3 simulations, and other paleoclimate data–model analyses. The model does not capture the mid-Holocene “thermal maximum” and gradual cooling to preindustrial (PI) global temperature found in the data. Simulated monsoonal precipitation in North Africa peaks between 12 and 9 ka at values ~50% greater than those of the PI, and Indian monsoonal precipitation peaks at 12 and 9 ka at values ~45% greater than the PI. GENMOM captures the reconstructed LGM extent of NH and Southern Hemisphere (SH) sea ice. The simulated present-day Antarctic Circumpolar Current (ACC) is ~48% weaker than observed (62 versus 119 Sv). The simulated present-day Atlantic Meridional Overturning Circulation (AMOC) of 19.3 ± 1.4 Sv on the Bermuda Rise (33° N) is comparable with the observed value of 18.7 ± 4.8 Sv. AMOC at 33° N is reduced by ~15% during the LGM, and the largest post-glacial increase (~11%) occurs during the 15 ka time slice.

  9. Urology Residents' Experience and Attitude Toward Surgical Simulation: Presenting our 4-Year Experience With a Multi-institutional, Multi-modality Simulation Model.

    PubMed

    Chow, Alexander K; Sherer, Benjamin A; Yura, Emily; Kielb, Stephanie; Kocjancic, Ervin; Eggener, Scott; Turk, Thomas; Park, Sangtae; Psutka, Sarah; Abern, Michael; Latchamsetty, Kalyan C; Coogan, Christopher L

    2017-11-01

To evaluate urology residents' attitudes toward and experience with surgical simulation in residency education, using a multi-institutional, multi-modality model. Residents from 6 area urology training programs rotated through simulation stations in 4 consecutive sessions from 2014 to 2017. Workshops included GreenLight photovaporization of the prostate, ureteroscopic stone extraction, laparoscopic peg transfer, 3-dimensional laparoscopy rope pass, transobturator sling placement, intravesical injection, high-definition video system trainer, vasectomy, and UroLift. Faculty members provided teaching assistance, objective scoring, and verbal feedback. Participants completed a nonvalidated questionnaire evaluating the utility of the workshop and soliciting suggestions for improvement. Sixty-three of 75 participants (84%; postgraduate years 1-6) completed the exit questionnaire. Median ratings of exercise usefulness on a scale of 1-10 ranged from 7.5 to 9. On a scale of 0-10, cumulative median scores of the course remained high over the 4 years: time limit per station (9; interquartile range [IQR] 2), faculty instruction (9, IQR 2), ease of use (9, IQR 2), face validity (8, IQR 3), and overall course (9, IQR 2). On multivariate analysis, there was no difference in ratings of domains between postgraduate years. Sixty-seven percent (42/63) believe that simulation training should be a requirement of urology residency. Ninety-seven percent (63/65) viewed the laboratory as beneficial to their education. This workshop model is a valuable training experience for residents. Most participants believe that surgical simulation is beneficial and should be a requirement of urology residency. High ratings of usefulness for each exercise demonstrated the excellent face validity of the course. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach those skills to MBA students. The simulation presented here contains all fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue therefore that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
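
The fundamental functionality of a discrete-event simulation that the paper alludes to (an event calendar ordered by time, a simulation clock, and state updated one event at a time) can be sketched briefly. This generic single-server queue is an illustration of the technique only, not the Excel supply-chain model described above:

```python
import heapq

def simulate_queue(arrivals, service_time):
    """Minimal discrete-event simulation of a single-server queue.
    `arrivals` lists job arrival times; returns each job's departure time."""
    # The event calendar is a heap of (time, kind, job) tuples.
    events = [(t, "arrive", i) for i, t in enumerate(arrivals)]
    heapq.heapify(events)
    server_free_at = 0.0
    departures = {}
    while events:
        clock, kind, job = heapq.heappop(events)  # advance clock to next event
        if kind == "arrive":
            start = max(clock, server_free_at)    # wait if the server is busy
            server_free_at = start + service_time
            heapq.heappush(events, (server_free_at, "depart", job))
        else:
            departures[job] = clock
    return [departures[i] for i in range(len(arrivals))]
```

For example, three jobs arriving at times 0, 1 and 2 with a 2-unit service time depart at times 2, 4 and 6: each job queues behind its predecessor.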

  11. Marine radiocarbon reservoir age simulations for the past 50,000 years

    NASA Astrophysics Data System (ADS)

    Butzin, M.; Köhler, P.; Lohmann, G.

    2017-08-01

Radiocarbon (14C) dating calibration for the last glacial period largely relies on cross-dated marine 14C records. However, marine reservoirs are isotopically depleted with respect to the atmosphere and therefore have to be corrected by the Marine Radiocarbon Ages of surface waters (MRAs), whose temporal variabilities are largely unknown. Here we present simulations of the spatial and temporal variability in MRAs using a three-dimensional ocean circulation model covering the past 50,000 years. Our simulations are compared to reconstructions of past surface ocean Δ14C. Running the model with different climatic boundary conditions, we find that low-latitude to midlatitude MRAs have varied between 400 and 1200 14C years, with values of about 780 14C years at the Last Glacial Maximum. Reservoir ages exceeding 2000 14C years are simulated in the polar oceans. Our simulation results can be used as a first-order approximation of the MRA variability in future radiocarbon calibration efforts.
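
As background to the reservoir-age numbers quoted above: a conventional radiocarbon age follows from the normalized 14C activity via the Libby mean life (8033 years, the standard dating convention), and a marine reservoir age is the difference between the 14C ages of surface water and the contemporaneous atmosphere. A sketch of that arithmetic (the 5% depletion in the usage example is illustrative, not from the paper):

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; conventional value used in 14C dating

def c14_age(fraction_modern):
    """Conventional radiocarbon age from the normalized 14C activity F."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

def reservoir_age(f_surface_ocean, f_atmosphere):
    """Marine reservoir age: surface-ocean 14C age minus atmospheric 14C age
    for the same calendar moment."""
    return c14_age(f_surface_ocean) - c14_age(f_atmosphere)
```

For example, a surface ocean about 5% depleted in 14C relative to the contemporaneous atmosphere, `reservoir_age(0.95, 1.0)`, yields roughly 410 14C years, consistent with the low end of the low-latitude to midlatitude range quoted above.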

  12. Modeling and Simulation at NASA

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2009-01-01

This slide presentation is composed of two topics. The first reviews the use of modeling and simulation (M&S), particularly as it relates to the Constellation program and discrete event simulation (DES). DES is defined as process and system analysis, through time-based and resource-constrained probabilistic simulation models, that provides insight into operational system performance. The DES shows that the cycle for a launch, from manufacturing and assembly through launch and recovery, is about 45 days and that approximately 4 launches per year are practicable. The second topic reviews a NASA Standard for Modeling and Simulation. The Columbia Accident Investigation Board made some recommendations related to models and simulations. Some of the ideas inherent in the new standard are the documentation of M&S activities, an assessment of credibility, and reporting to decision makers, which should include the analysis of the results, a statement as to the uncertainty in the results, and the credibility of the results. There is also discussion of verification and validation (V&V) of models and of the different types of models and simulations.

  13. Spin-up simulation behaviors in a climate model to build a basement of long-time simulation

    NASA Astrophysics Data System (ADS)

    Lee, J.; Xue, Y.; De Sales, F.

    2015-12-01

It is essential to develop start-up information when conducting a long-time climate simulation. If an initial condition is already available from a previous simulation with the same type of model, this is not necessary; if not, the model needs a spin-up simulation to obtain an initial condition that is adjusted to and balanced with the model climatology. Otherwise, a severe spin-up may take several years. Some model variables in the initial fields, such as deep soil temperature and temperature in the deep ocean layers, affect the model's subsequent long-time simulation because of their long residual memories. To investigate the important factors in a spin-up simulation producing an atmospheric initial condition, we conducted two different spin-up simulations for the case in which no atmospheric condition is available from existing datasets. One simulation employed an atmospheric general circulation model (AGCM), namely the Global Forecast System (GFS) of the National Centers for Environmental Prediction (NCEP), while the other employed an atmosphere-ocean coupled general circulation model (CGCM), namely the Climate Forecast System (CFS) of NCEP. Both models share the same atmospheric modeling component; the only difference is the ocean model coupling, which in CFS is provided by the Modular Ocean Model version 4 (MOM4) of the Geophysical Fluid Dynamics Laboratory (GFDL). During a decade of spin-up simulation, prescribed sea-surface temperature (SST) fields for the target year were forced onto the GFS on a daily basis, while the CFS ingested only the first-time-step ocean condition and iterated freely for the rest of the period. Both models were forced by the CO2 concentration and solar constant of the target year. Our analyses of the spin-up simulation results indicate that freely simulated interaction between the ocean and the atmosphere is more helpful for producing the initial condition for the target year than fixed SST forcing. Since the GFS used prescribed forcing taken exactly from the target year, this result is unexpected.

  14. Nursing Simulation: A Review of the Past 40 Years

    ERIC Educational Resources Information Center

    Nehring, Wendy M.; Lashley, Felissa R.

    2009-01-01

    Simulation, in its many forms, has been a part of nursing education and practice for many years. The use of games, computer-assisted instruction, standardized patients, virtual reality, and low-fidelity to high-fidelity mannequins have appeared in the past 40 years, whereas anatomical models, partial task trainers, and role playing were used…

  15. Simulated root dynamics of a 160-year-old sugar maple (Acer saccharum Marsh.) tree with and without ozone exposure using the TREGRO model.

    PubMed

    Retzlaff, W. A.; Weinstein, D. A.; Laurence, J. A.; Gollands, B.

    1996-01-01

    Because of difficulties in directly assessing root responses of mature forest trees exposed to atmospheric pollutants, we have used the model TREGRO to analyze the effects of a 3- and a 10-year exposure to ozone (O(3)) on root dynamics of a simulated 160-year-old sugar maple (Acer saccharum Marsh.) tree. We used existing phenological, allometric, and growth data to parameterize TREGRO to produce a simulated 160-year-old tree. Simulations were based on literature values for sugar maple fine root production and senescence and the photosynthetic responses of sugar maple seedlings exposed to O(3) in open-top chambers. In the simulated 3-year exposure to O(3), 2 x ambient atmospheric O(3) concentrations reduced net carbon (C) gain of the 160-year-old tree. This reduction occurred in the C storage pools (total nonstructural carbohydrate, TNC), with most of the reduction occurring in coarse (woody) roots. Total fine root production and senescence were unaffected by the simulated 3-year exposure to O(3). However, extending the simulated O(3) exposure period to 10 years depleted the TNC pools of the coarse roots and reduced total fine root production. Similar reductions in TNC pools have been observed in forest-grown sugar maple trees exhibiting symptoms of stress. We conclude that modeling can aid in evaluating the belowground response of mature forest trees to atmospheric pollution stress and could indicate the potential for gradual deterioration of tree health under conditions of long-term stress, a situation similar to that underlying the decline of sugar maple trees.

  16. Magnetic biosensors: Modelling and simulation.

    PubMed

    Nabaei, Vahid; Chandrawati, Rona; Heidari, Hadi

    2018-04-30

In the past few years, magnetoelectronics has emerged as a promising new platform technology for biosensors for the detection, identification, localisation and manipulation of a wide spectrum of biological, physical and chemical agents. The methods are based on detecting the magnetic field of a magnetically labelled biomolecule interacting with a complementary biomolecule bound to a magnetic field sensor. This review presents various magnetic biosensor techniques from both the modelling and simulation and the analytical and numerical analysis points of view, together with their performance variations under magnetic fields in steady and nonstationary states. This is followed by magnetic sensor modelling and simulations using advanced multiphysics modelling software (e.g., Finite Element Method (FEM) tools) and in-house developed tools. Furthermore, the outlook and future directions of modelling and simulation of magnetic biosensors across different technologies and materials are critically discussed. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  17. Monte Carlo Simulation of Microscopic Stock Market Models

    NASA Astrophysics Data System (ADS)

    Stauffer, Dietrich

Computer simulations with random numbers, that is, Monte Carlo methods, have been applied extensively in recent years to model the fluctuations of stock markets or currency exchange rates. Here we concentrate on the percolation model of Cont and Bouchaud, to simulate, not to predict, market behavior.
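
A stripped-down version of the Cont-Bouchaud idea: traders aggregate into clusters, each cluster independently buys, sells, or stays inactive with a small activity probability, and the price moves in proportion to the net demand. In this sketch the cluster sizes are drawn from a simple heavy-tailed distribution as a stand-in for genuine percolation clusters, so it illustrates the mechanism rather than reproducing the percolation model itself:

```python
import random

def cont_bouchaud_step(cluster_sizes, activity=0.05, depth=1000.0, rng=random):
    """One price update: each cluster buys (+size) or sells (-size) with
    probability `activity` each, else stays inactive; returns the return."""
    demand = 0
    for size in cluster_sizes:
        u = rng.random()
        if u < activity:
            demand += size
        elif u < 2 * activity:
            demand -= size
    return demand / depth  # market depth scales net demand into a return

def simulate_returns(n_steps, n_clusters=100, seed=42):
    """Simulate a return series with fixed heavy-tailed cluster sizes."""
    rng = random.Random(seed)
    # Heavy-tailed sizes as a crude proxy for percolation cluster sizes.
    sizes = [int(1.0 / (1.0 - rng.random()) ** 0.5) for _ in range(n_clusters)]
    return [cont_bouchaud_step(sizes, rng=rng) for _ in range(n_steps)]
```

Because large clusters occasionally trade, the return series shows the fat-tailed fluctuations this model family is known for.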

  18. Microscopic Car Modeling for Intelligent Traffic and Scenario Generation in the UCF Driving Simulator : Year 2

    DOT National Transportation Integrated Search

    2000-01-01

    A multi-year project was initiated to introduce autonomous vehicles in the University of Central Florida (UCF) Driving Simulator for real-time interaction with the simulator vehicle. This report describes the progress during the second year. In the f...

  19. Preliminary analysis of one year long space climate simulation

    NASA Astrophysics Data System (ADS)

    Facsko, G.; Honkonen, I. J.; Juusola, L.; Viljanen, A.; Vanhamäki, H.; Janhunen, P.; Palmroth, M.; Milan, S. E.

    2013-12-01

One full year (155 Cluster orbits, from January 29, 2002 to February 2, 2003) is simulated using the Grand Unified Magnetosphere Ionosphere Coupling simulation (GUMICS) in the European Cluster Assimilation Technology (ECLAT) project. This enables us to study the performance of a global magnetospheric model on an unprecedented scale, both in terms of the amount of available observations and the length of the time series that can be compared. The solar wind for the simulated period, obtained from OMNIWeb, is used as input to GUMICS. We present an overview of various comparisons of GUMICS results to observations for the simulated year. Results along the Cluster reference spacecraft orbit are compared to Cluster measurements. The Cross Polar Cap Potential (CPCP) results are compared to SuperDARN measurements. The IMAGE electrojet indicators (IU, IL) calculated from the ionospheric currents of GUMICS are compared to observations. Finally, Geomagnetically Induced Currents (GIC) calculated from GUMICS results along the Finnish natural gas pipeline at Mäntsälä are also compared to measurements.

  20. A probabilistic model framework for evaluating year-to-year variation in crop productivity

    NASA Astrophysics Data System (ADS)

    Yokozawa, M.; Iizumi, T.; Tao, F.

    2008-12-01

Most models describing the relation between crop productivity and weather conditions have so far focused on mean changes in crop yield. To keep food supply stable under abnormal weather as well as climate change, evaluating the year-to-year variations in crop productivity rather than the mean changes is more essential. We here propose a new probabilistic model framework based on Bayesian inference and Monte Carlo simulation. As an example, we first introduce a model of paddy rice production in Japan, called PRYSBI (Process-based Regional rice Yield Simulator with Bayesian Inference; Iizumi et al., 2008). The model structure is the same as that of SIMRIW, which was developed and is used widely in Japan. The model includes three sub-models describing phenological development, biomass accumulation and maturing of the rice crop. These processes are formulated to capture the response of the rice plant to weather conditions. The model was originally developed to predict rice growth and yield at the paddy-plot scale. We applied it to evaluate large-scale rice production while keeping the same model structure; instead, we treated the parameters as stochastic variables. In order to let the model reproduce actual yields at the larger scale, model parameters were determined from agricultural statistical data for each prefecture of Japan together with weather data averaged over the region. The posterior probability distribution functions (PDFs) of the parameters included in the model were obtained using Bayesian inference, with the MCMC (Markov Chain Monte Carlo) algorithm used to numerically apply Bayes' theorem. To evaluate year-to-year changes in rice growth/yield under this framework, we first iterate simulations with sets of parameter values sampled from the estimated posterior PDF of each parameter and then take the ensemble mean weighted by the posterior PDFs. We will also present another example for maize productivity in China. The
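
The Bayesian machinery described here (posterior PDFs of model parameters obtained by MCMC, then ensemble simulation over posterior samples) can be illustrated with a toy Metropolis sampler for a single parameter. This is a generic sketch, not PRYSBI: the linear "crop model" `yield(theta) = 2 * theta`, the observed yields, and the observation noise are all hypothetical stand-ins:

```python
import math
import random

def metropolis(log_post, start, n_samples, step=0.5, seed=0):
    """Minimal Metropolis sampler: Gaussian random-walk proposals,
    accepted with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x, lp = start, log_post(start)
    samples = []
    for _ in range(n_samples):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if rng.random() < math.exp(min(0.0, lp_cand - lp)):
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Hypothetical data: observed yields around 4.0 (units arbitrary).
obs_yields = [3.9, 4.1, 4.0]
SIGMA = 0.5  # assumed observation noise

def log_post(theta):
    # Gaussian likelihood around the toy model yield(theta) = 2 * theta,
    # with an (implicit) flat prior.
    return -sum((y - 2.0 * theta) ** 2 for y in obs_yields) / (2 * SIGMA ** 2)

samples = metropolis(log_post, start=1.0, n_samples=5000)
posterior_mean = sum(samples[1000:]) / len(samples[1000:])  # drop burn-in
```

With the data centered on 4.0 and the toy model doubling the parameter, the posterior mean settles near 2.0; simulating the crop model once per posterior sample and averaging gives the ensemble-mean yield described in the abstract.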

  1. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
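    As a minimal illustration of the diffusion-simulation category, the following sketch (all rates hypothetical) implements an agent-based adoption model in which each agent is influenced both by an external source and by the current share of adopters:

```python
import random

def simulate_diffusion(n_agents=500, p=0.01, q=0.3, steps=50, seed=42):
    """Minimal agent-based diffusion-of-innovation model: each step, a
    non-adopter adopts with probability p (external influence) plus
    q times the current fraction of adopters (word of mouth)."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(steps):
        frac = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p + q * frac:
                adopted[i] = True
        history.append(sum(adopted) / n_agents)
    return history

curve = simulate_diffusion()   # fraction of adopters over time (S-shaped)
```

    Even this tiny model reproduces the characteristic S-curve of diffusion: slow external seeding, then rapid peer-driven growth, then saturation.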

  2. Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling

    NASA Astrophysics Data System (ADS)

    Schum, William K.; Doolittle, Christina M.; Boyarko, George A.

    2006-05-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models and a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics- and engineering-level modeling to mission- and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations, via direct integration or through DOD standards such as Distributed Interactive Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.

  3. Development and validation of a modified Hybrid-III six-year-old dummy model for simulating submarining in motor-vehicle crashes.

    PubMed

    Hu, Jingwen; Klinich, Kathleen D; Reed, Matthew P; Kokkolaras, Michael; Rupp, Jonathan D

    2012-06-01

    In motor-vehicle crashes, young school-aged children restrained by vehicle seat belt systems often suffer abdominal injuries due to submarining. However, the current anthropomorphic test device, the so-called "crash dummy," is not adequate for properly simulating submarining. In this study, a modified Hybrid-III six-year-old dummy model capable of simulating and predicting submarining was developed using MADYMO (TNO Automotive Safety Solutions). The model incorporated improved pelvis and abdomen geometry and properties previously tested in a modified physical dummy, and was calibrated and validated against four sled tests under two test conditions, with and without submarining, using a multi-objective optimization method. A sensitivity analysis using this validated child dummy model showed that dummy knee excursion, torso rotation angle, and the difference between head and knee excursions were good predictors of submarining status. It was also shown that restraint system design variables, such as lap belt angle, D-ring height, and seat coefficient of friction (COF), may have opposite effects on head and abdomen injury risks; therefore, child dummies and dummy models capable of simulating submarining are crucial for future restraint-system design optimization for young school-aged children.

  4. Combining Genome-Wide Information with a Functional Structural Plant Model to Simulate 1-Year-Old Apple Tree Architecture.

    PubMed

    Migault, Vincent; Pallas, Benoît; Costes, Evelyne

    2016-01-01

    In crops, optimizing target traits in breeding programs can be fostered by selecting appropriate combinations of architectural traits, which determine light interception and carbon acquisition. In apple tree, architectural traits have been observed to be under genetic control. However, architectural traits also result from many organogenetic and morphological processes interacting with the environment. The present study aimed at combining a functional-structural plant model (FSPM) built for apple tree, MAppleT, with the genetic determinism of architectural traits previously described in a bi-parental population. We focused on parameters related to organogenesis (phyllochron and immediate branching) and morphogenesis (internode length and leaf area) during the first year of tree growth. Two independent datasets, collected in 2004 and 2007 on 116 genotypes derived from a 'Starkrimson' × 'Granny Smith' cross, were used. The phyllochron was estimated as a function of thermal time, and sylleptic branching was modeled subsequently as depending on phyllochron. From a genetic map built with SNPs, marker effects were estimated for four MAppleT parameters with rrBLUP, using the 2007 data. These effects were then used in MAppleT to simulate tree development under the two sets of climatic conditions. The genome-wide prediction model gave consistent estimations of parameter values, with correlation coefficients between observed values and values estimated from SNP markers ranging from 0.79 to 0.96, although the accuracy of the prediction model under cross-validation schemes was lower. Three integrative traits (number of leaves, trunk length, and number of sylleptic laterals) were considered for validating MAppleT simulations. Under the 2007 climatic conditions, simulated values were close to observations, highlighting the correct simulation of genetic variability. However, under the 2004 conditions, which were not used for model calibration, the simulations differed from observations. This study demonstrates the possibility of
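    The rrBLUP step, estimating marker effects by ridge regression from SNP genotypes, can be sketched as follows on synthetic data (the matrix sizes, penalty `lam`, and genotype coding are illustrative assumptions, not the study's settings):

```python
import numpy as np

def ridge_marker_effects(X, y, lam=1.0):
    """Ridge-regression estimate of marker effects, the linear-algebra
    core of an rrBLUP-style prediction: beta = (X'X + lam*I)^-1 X'y."""
    n_markers = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_markers), X.T @ y)

rng = np.random.default_rng(0)
n_geno, n_markers = 100, 40
X = rng.choice([-1.0, 0.0, 1.0], size=(n_geno, n_markers))  # SNP genotypes
true_beta = 0.3 * rng.standard_normal(n_markers)            # marker effects
y = X @ true_beta + 0.2 * rng.standard_normal(n_geno)       # trait values

beta_hat = ridge_marker_effects(X, y, lam=5.0)
r = np.corrcoef(X @ beta_hat, y)[0, 1]   # fit to the training data
```

    In the study's workflow, the fitted marker effects would then be plugged into the FSPM's parameters to simulate each genotype's architecture.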

  5. Development of NASA's Models and Simulations Standard

    NASA Technical Reports Server (NTRS)

    Bertch, William J.; Zang, Thomas A.; Steele, Martin J.

    2008-01-01

    Following the Space Shuttle Columbia Accident Investigation, several NASA-wide actions were initiated. One of these actions was to develop a standard for the development, documentation, and operation of models and simulations. Over the course of two-and-a-half years, a team of NASA engineers representing nine of the ten NASA Centers developed a Models and Simulations Standard to address this action. The standard consists of two parts. The first is the traditional requirements section, addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with the standard. The second is a scale for evaluating the credibility of model and simulation results, using levels of merit associated with eight key factors. This paper provides a historical account of the challenges faced by, and the processes used in, this committee-based development effort; this account offers insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.

  6. INTEGRATING MESO-AND MICRO-SIMULATION MODELS TO EVALUATE TRAFFIC MANAGEMENT STRATEGIES, YEAR 2

    DOT National Transportation Integrated Search

    2017-07-04

    In the Year 1 Report, the Arizona State University (ASU) Project Team described the development of a hierarchical multi-resolution simulation platform to test proactive traffic management strategies. The scope was to integrate an easily available mic...

  7. Computable general equilibrium model fiscal year 2013 capability development report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Brian Keith; Rivera, Michael Kelly; Boero, Riccardo

    This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To do so, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine its simulation properties in more detail.

  8. Simulating spin models on GPU

    NASA Astrophysics Data System (ADS)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
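    The key to parallelizing Metropolis updates of the Ising model is the checkerboard decomposition, which can be illustrated with a vectorized CPU sketch (NumPy whole-array operations standing in for GPU threads):

```python
import numpy as np

def checkerboard_sweep(spins, beta, rng):
    """One Metropolis sweep of the 2D Ising model with periodic boundaries,
    split into two interleaved sublattices: sites of one colour do not
    interact with each other, so a whole sublattice can be updated at once,
    the same trick that maps the model onto thousands of GPU threads."""
    L = spins.shape[0]
    ii, jj = np.indices((L, L))
    for colour in (0, 1):
        neigh = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
                 np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        dE = 2.0 * spins * neigh                       # energy cost of a flip
        accept = rng.random((L, L)) < np.exp(-beta * dE)
        flip = ((ii + jj) % 2 == colour) & accept      # update one colour only
        spins[flip] *= -1
    return spins

rng = np.random.default_rng(0)
spins = np.ones((32, 32), dtype=int)      # ordered start
for _ in range(50):                       # low temperature: order persists
    checkerboard_sweep(spins, beta=1.0, rng=rng)
magnetization = abs(spins.mean())
```

    On a GPU the two sublattice passes become kernel launches in which each thread updates one site; the algorithm is identical, only the degree of hardware parallelism differs.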

  9. Common modeling system for digital simulation

    NASA Technical Reports Server (NTRS)

    Painter, Rick

    1994-01-01

    The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems architecture and an object-based/oriented, standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems and technologies, and typically digital representations of those systems and technologies. These digital representations (models) have also been developed for other reasons, such as studies and analyses, Cost and Operational Effectiveness Analysis (COEA) tradeoffs, etc. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts toward commonality in DOD M&S. In the typical scenario, an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization; even when it was procured, it ran on a unique platform, in a unique language, with unique interfaces, and consequently required unique maintenance. Additionally, the constructors of the model expended more effort writing the 'infrastructure' of the model/simulation (e.g., user interface, database/database management system, data journaling/archiving, graphical presentations, environment characteristics, other components in the simulation) than producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.

  10. Behaviour of oceanic 137Cs following the Fukushima Daiichi Nuclear Power Plant accident for four years simulated numerically by a regional ocean model

    NASA Astrophysics Data System (ADS)

    Tsumune, D.; Tsubono, T.; Aoyama, M.; Misumi, K.; Tateda, Y.

    2015-12-01

    A series of accidents at the Fukushima Dai-ichi Nuclear Power Plant (1F NPP) following the earthquake and tsunami of 11 March 2011 resulted in the release of radioactive materials to the ocean by two major pathways: direct release from the accident site and atmospheric deposition. We reconstructed the spatiotemporal variability of 137Cs activity in the regional ocean over four years using a suite of numerical models: regional-scale and North Pacific-scale oceanic dispersion models, an atmospheric transport model, a sediment transport model, a dynamic biological compartment model for marine biota, and a river runoff model. The direct release rate of 137Cs was estimated for the four years after the accident by comparing simulated results with activities observed very close to the site. The estimated total direct release was 3.6±0.7 PBq. The direct release rate decreased exponentially with time until the end of December 2012 and was almost constant thereafter, with only a very small rate of decrease after 2013. The daily release rate of 137Cs was estimated to be on the order of 10^10 Bq/day by the end of March 2015. The activity of directly released 137Cs was detectable only in the coastal zone after December 2012. Simulated 137Cs activities attributable to direct release were in good agreement with observed activities, a result that implies the estimated direct release rate is reasonable. No observations of 137Cs activity in the ocean exist for 11-21 March 2011, but observed data for marine biota should reflect the history of 137Cs activity in this early period. We therefore reconstructed the 137Cs activity history of this early period by considering atmospheric deposition, river input, and rainwater runoff from the 1F NPP site. Comparisons between the 137Cs activities of marine biota simulated by the dynamic biological compartment model and observed data also suggest that the simulated 137Cs activity attributable to atmospheric deposition was underestimated in this early period.
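    The release history described, an exponential decline that levels off at a near-constant residual leak on the order of 10^10 Bq/day, can be sketched as a simple two-phase rate model. The values of r0, k, and the time grid below are illustrative round numbers chosen only to reproduce the qualitative shape and a cumulative release of a few PBq; they are not the paper's estimates:

```python
import numpy as np

def release_rate(t_days, r0=1e14, k=0.03, floor=1e10):
    """Direct-release rate [Bq/day]: exponential decline from an initial
    rate r0, levelling off at a near-constant residual leak (floor).
    All parameter values are illustrative, not fitted."""
    return np.maximum(r0 * np.exp(-k * t_days), floor)

t = np.arange(1461)                  # four years, daily steps
rates = release_rate(t)
total_pbq = rates.sum() / 1e15       # cumulative release in PBq (1-day steps)
```

    In the study itself the rate history is inferred by matching dispersion-model output to near-site observations; this sketch only shows how such a piecewise rate integrates to a petabecquerel-scale total.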

  12. MoSeS: Modelling and Simulation for e-Social Science.

    PubMed

    Townend, Paul; Xu, Jie; Birkin, Mark; Turner, Andy; Wu, Belinda

    2009-07-13

    MoSeS (Modelling and Simulation for e-Social Science) is a research node of the National Centre for e-Social Science. MoSeS uses e-Science techniques to execute an events-driven model that simulates discrete demographic processes; this allows us to project the UK population 25 years into the future. This paper describes the architecture, simulation methodology and latest results obtained by MoSeS.
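    A discrete, events-driven demographic projection of the kind MoSeS performs can be sketched as follows; the mortality and fertility rates are invented for illustration and bear no relation to MoSeS's calibrated rates:

```python
import random

def project_population(ages, years=25, birth_rate=0.012, seed=0):
    """Minimal discrete demographic microsimulation (illustrative rates):
    each simulated year, every individual may die with an age-dependent
    probability, survivors age by one year, and births are added in
    proportion to the surviving population."""
    rng = random.Random(seed)
    death_prob = lambda age: min(1.0, 0.001 + 0.0001 * age ** 1.5)
    pop = list(ages)
    for _ in range(years):
        pop = [a + 1 for a in pop if rng.random() > death_prob(a)]
        births = sum(1 for _ in pop if rng.random() < birth_rate)
        pop.extend([0] * births)       # newborns enter at age 0
    return pop

start_rng = random.Random(1)
start = [start_rng.randint(0, 90) for _ in range(2000)]
projected = project_population(start)  # age list 25 years into the future
```

    A production microsimulation would add events such as migration, household formation, and marriage, and would draw its rates from census and survey data rather than fixed constants.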

  13. Regional model simulations of New Zealand climate

    NASA Astrophysics Data System (ADS)

    Renwick, James A.; Katzfey, Jack J.; Nguyen, Kim C.; McGregor, John L.

    1998-03-01

    Simulation of New Zealand climate is examined through the use of a regional climate model nested within the output of the Commonwealth Scientific and Industrial Research Organisation nine-level general circulation model (GCM). R21 resolution GCM output is used to drive a regional model run at 125 km grid spacing over the Australasian region. The 125 km run is used in turn to drive a simulation at 50 km resolution over New Zealand. Simulations with a full seasonal cycle are performed for 10 model years. The focus is on the quality of the simulation of present-day climate, but results of a doubled-CO2 run are discussed briefly. Spatial patterns of mean simulated precipitation and surface temperatures improve markedly as horizontal resolution is increased, through the better resolution of the country's orography. However, increased horizontal resolution leads to a positive bias in precipitation. At 50 km resolution, simulated frequency distributions of daily maximum/minimum temperatures are statistically similar to those of observations at many stations, while frequency distributions of daily precipitation appear to be statistically different to those of observations at most stations. Modeled daily precipitation variability at 125 km resolution is considerably less than observed, but is comparable to, or exceeds, observed variability at 50 km resolution. The sensitivity of the simulated climate to changes in the specification of the land surface is discussed briefly. Spatial patterns of the frequency of extreme temperatures and precipitation are generally well modeled. Under a doubling of CO2, the frequency of precipitation extremes changes only slightly at most locations, while air frosts become virtually unknown except at high-elevation sites.

  14. Comparison of an empirical forest growth and yield simulator and a forest gap simulator using actual 30-year growth from two even-aged forests in Kentucky

    Treesearch

    Daniel A. Yaussy

    2000-01-01

    Two individual-tree growth simulators are used to predict the growth and mortality on a 30-year-old forest site and an 80-year-old forest site in eastern Kentucky. The empirical growth and yield model (NE-TWIGS) was developed to simulate short-term (

  15. ENSO effects on MLT diurnal tides: A 21 year reanalysis data-driven GAIA model simulation

    NASA Astrophysics Data System (ADS)

    Liu, Huixin; Sun, Yang-Yi; Miyoshi, Yasunobu; Jin, Hidekatsu

    2017-05-01

    Tidal responses to El Niño-Southern Oscillation (ENSO) in the mesosphere and lower thermosphere (MLT) are investigated for the first time using reanalysis data-driven simulations covering 21 years. The simulation is carried out with the Ground-to-topside Atmosphere-Ionosphere model for Aeronomy (GAIA) during 1996-2016, which covers nine ENSO events. ENSO impacts on diurnal tides at 100 km altitude are analyzed and cross-compared among temperature (T), zonal wind (U), and meridional wind (V), which reveals the following salient features: (1) Tidal response can differ significantly among T, U, and V in terms of magnitude and latitudinal structure, making detection of ENSO effects sensitive to the parameter used and the location of a ground station; (2) the nonmigrating DE3 tide in T and U shows a prominent hemisphere asymmetric response to La Niña, with an increase between 0° and 30°N and a decrease between 30° and 0°S. In contrast, DE3 in V exhibits no significant response; (3) the migrating DW1 enhances during El Niño in equatorial regions for T and U but in off-equatorial regions for V. As the first ENSO study based on reanalysis-driven simulations, GAIA's full set of tidal responses in T, U, and V provides us with a necessary global context to better understand and cross-compare observations during ENSO events. Comparisons with observations during the 1997-98 El Niño and 2010-11 La Niña reveal good agreement in both magnitude and timing. Comparisons with "free-run" WACCM simulations (T) show consistent results in nonmigrating tides DE2 and DE3 but differences in the migrating DW1 tide.

  16. Workforce Modeling & Simulation Education and Training for Lifelong Learning: Modeling & Simulation Education Catalog

    DTIC Science & Technology

    2007-03-01

    Workforce Modeling & Simulation Education and Training for Lifelong Learning: Modeling & Simulation Education Catalog, by Jean Catalano and Jarema M. Didoszak (technical report, 11/06-02/07). This catalog was produced for the Workforce Modeling and Simulation Education and Training for Lifelong Learning project. The catalog contains searchable information about 253 courses from 23 U.S

  17. Southern Arizona hydroclimate over the last 3000 years: a comparison of speleothem elemental data and climate model simulations

    NASA Astrophysics Data System (ADS)

    King, J.; Harrington, M. D.; Cole, J. E.; Drysdale, R.; Woodhead, J. D.; Fasullo, J.; Stevenson, S.; Otto-Bliesner, B. L.; Overpeck, J. T.; Edwards, R. L.; Henderson, G. M.

    2017-12-01

    Understanding long-term hydroclimate is particularly important in semiarid regions where prolonged droughts may be exacerbated by a warming climate. In many regions, speleothem trace elements correlate with regional wet and dry climate signals. In the drought-prone Southwestern US (SW), wet and dry episodes are strongly influenced by seasonal changes in atmospheric circulation and teleconnections to remote forcing. Here, we address the need for seasonal moisture reconstructions using paleoclimate and climate model approaches. First, we present a high-resolution (sub-annual) record of speleothem trace elements spanning the last 3000 years from Fort Huachuca Cave, AZ, to investigate the variability of regional seasonal precipitation and sustained regional droughts. In a principal component (PC) analysis of the speleothem, trace elements associated with wet (Sr, Ba) and dry (P, Y, Zn) episodes load strongly and inversely, and the associated PC signals correlate with local gridded precipitation data over the last 50 years (R > 0.6, p < 0.1). These results suggest that the elemental signals provide a seasonal moisture record for Southern Arizona. We use the record to examine the frequency and timing of extreme droughts in the region and compare the speleothem record's frequency domain characteristics with other regional moisture records and with climate model output. The speleothem record demonstrates strong low-frequency variability with pronounced multi-decadal dry periods, a feature notably lacking in drought metrics from simulations of the last millennium. We also examine the seasonal SW precipitation response to modes of climate variability and external forcings, including volcanic eruptions, in both the speleothem record and the Community Earth System Model's Last Millennium Ensemble (CESM-LME). Notably, ENSO and volcanic forcing have a discernable effect on SW seasonal precipitation in model simulations, particularly when the two processes combine to shift the
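    The principal-component step, wet and dry elements loading inversely on a PC whose time series tracks moisture, can be illustrated on synthetic data (the loadings and noise level below are assumptions, not the speleothem measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 200
moisture = rng.standard_normal(n_years)          # hidden wet/dry signal

# Five synthetic elemental series with hypothetical loadings: the "wet"
# elements (Sr, Ba) track the moisture signal, the "dry" elements
# (P, Y, Zn) vary inversely with it, each with measurement noise.
signs = np.array([1.0, 1.0, -1.0, -1.0, -1.0])   # Sr, Ba, P, Y, Zn
data = np.outer(moisture, signs) + 0.3 * rng.standard_normal((n_years, 5))

# Principal components via SVD of the centred data matrix.
centred = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
loadings = Vt[0]             # PC1 loadings: wet vs dry elements load inversely
pc1 = U[:, 0] * s[0]         # PC1 time series recovers the moisture signal
```

    The sign of a principal component is arbitrary, which is why the recovered PC is judged by the magnitude of its correlation with the climate signal rather than its raw sign.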

  18. Utilization of Short-Simulations for Tuning High-Resolution Climate Model

    NASA Astrophysics Data System (ADS)

    Lin, W.; Xie, S.; Ma, P. L.; Rasch, P. J.; Qian, Y.; Wan, H.; Ma, H. Y.; Klein, S. A.

    2016-12-01

    Many physical parameterizations in atmospheric models are sensitive to resolution. Tuning models that involve a multitude of parameters at high resolution is computationally expensive, particularly when relying primarily on multi-year simulations. This work describes a complementary set of strategies for tuning high-resolution atmospheric models that uses ensembles of short simulations to reduce the computational cost and elapsed time. Specifically, we utilize the hindcast approach developed through the DOE Cloud Associated Parameterization Testbed (CAPT) project for high-resolution model tuning, guided by a combination of short (<10 days) and longer (~1 year) Perturbed Parameter Ensemble (PPE) simulations at low resolution to identify the sensitivity of model features to parameter changes. CAPT tests have been found effective in numerous previous studies at identifying model biases due to parameterized fast physics, and we demonstrate that the approach is also useful for tuning. After the most egregious errors are addressed through an initial "rough" tuning phase, longer simulations are performed to "home in" on model features that evolve over longer timescales. We explore these strategies to tune the DOE ACME (Accelerated Climate Modeling for Energy) model. For the ACME model at 0.25° resolution, it is confirmed that, given the same parameters, major biases in global mean statistics and many spatial features are consistent between Atmospheric Model Intercomparison Project (AMIP)-type simulations and CAPT-type hindcasts, with just a small number of short-term simulations for the latter over the corresponding season. The use of CAPT hindcasts to find parameter choices that reduce large model biases dramatically improves the turnaround time for tuning at high resolution, and improvement seen in CAPT hindcasts generally translates to improved AMIP-type simulations. An iterative CAPT-AMIP tuning approach is therefore adopted during each major tuning

  19. Aviation Safety Simulation Model

    NASA Technical Reports Server (NTRS)

    Houser, Scott; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    The Aviation Safety Simulation Model is a software tool that enables users to configure a terrain, a flight path, and an aircraft and simulate the aircraft's flight along the path. The simulation monitors the aircraft's proximity to terrain obstructions, and reports when the aircraft violates accepted minimum distances from an obstruction. This model design facilitates future enhancements to address other flight safety issues, particularly air and runway traffic scenarios. This report shows the user how to build a simulation scenario and run it. It also explains the model's output.
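    The core monitoring task, comparing a sampled flight path against a terrain profile and reporting minimum-distance violations, can be sketched as follows (a simplified illustration, not the NASA tool's implementation):

```python
def clearance_violations(path, terrain, min_clearance=150.0):
    """Check a sampled flight path against a terrain profile: both are
    lists of altitudes [m] at the same along-track sample points; return
    the indices where the aircraft is closer to the ground than
    min_clearance. The 150 m threshold is an illustrative assumption."""
    return [i for i, (alt, ground) in enumerate(zip(path, terrain))
            if alt - ground < min_clearance]

terrain = [0, 50, 120, 300, 450, 300, 100]     # ground elevation [m]
path    = [500, 500, 500, 500, 500, 500, 500]  # level flight at 500 m
bad = clearance_violations(path, terrain)      # -> [4]: only 50 m over the peak
```

    The real tool works in three dimensions and with configurable aircraft and terrain, but the reporting logic reduces to exactly this kind of point-by-point clearance test.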

  20. Simulation of pesticide dissipation in soil at the catchment scale over 23 years

    NASA Astrophysics Data System (ADS)

    Queyrel, Wilfried; Florence, Habets; Hélène, Blanchoud; Céline, Schott; Laurine, Nicola

    2014-05-01

    Pesticide applications lead to contamination risks in environmental compartments, with harmful effects on water resources used for drinking water. Pesticide fate modeling is assumed to be a relevant approach to study pesticide dissipation at the catchment scale. Simulations of five herbicides (atrazine, simazine, isoproturon, chlortoluron, metolachlor) and one metabolite (DEA) were carried out with the crop model STICS over a 23-year period (1990-2012). The model was applied using real agricultural practices over a small rural catchment (104 km²) located 60 km east of Paris (France), for two crops: wheat and maize. The objectives of the study were i) to highlight the main processes involved in long-term pesticide fate and transfer; ii) to assess the influence of the dynamics of the remaining mass of pesticide in soil on transfer; and iii) to determine the most sensitive parameters related to pesticide losses by leaching over the 23-year period. The simulated crop yield, water transfer, and nitrate and pesticide concentrations were first compared to observations over the 23-year period, where measurements were available at the catchment scale. Then, the main processes related to pesticide fate and transfer were evaluated using long-term simulations at a yearly time step and monthly average variations. Analyses of the monthly average variations focused on the impact of pesticide application, water transfer, and pesticide transformation on pesticide leaching. The evolution of the remaining mass of pesticide in soil, comprising the mobile (liquid) phase and the non-mobile phase (adsorbed at equilibrium and non-equilibrium), was studied to evaluate the impact of pesticide stored in soil on the fraction available for leaching. Finally, a sensitivity test was performed to identify the most sensitive parameters with respect to the remaining mass of pesticide in soil and leaching. 
The findings of the
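    The soil mass-balance processes discussed, first-order degradation set by a half-life, partitioning between sorbed and mobile phases, and leaching of part of the mobile fraction, can be sketched as a toy annual budget (all parameter values are illustrative, not STICS's formulation or the study's calibration):

```python
import math

def simulate_dissipation(applied_mg, years=23, dt50_days=60.0,
                         sorbed_fraction=0.8, leach_fraction=0.02):
    """Toy annual pesticide mass balance in soil: each year an application
    is added, the residue degrades first-order according to its half-life,
    and a share of the mobile (non-sorbed) residue leaches below the root
    zone. Returns the final residue and the yearly leached amounts [mg]."""
    k = math.log(2) / dt50_days          # first-order rate [1/day]
    residue, leached = 0.0, []
    for _ in range(years):
        residue += applied_mg            # annual application
        residue *= math.exp(-k * 365.0)  # degradation over one year
        mobile = residue * (1.0 - sorbed_fraction)
        loss = leach_fraction * mobile   # leaching from the mobile phase
        residue -= loss
        leached.append(loss)
    return residue, leached

residue, leached = simulate_dissipation(100.0)
```

    Even this crude budget shows the behaviour the abstract emphasizes: the stock carried over in soil rises toward a quasi-steady level, so the leachable fraction in later years depends on the accumulated residue, not just that year's application.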

  1. Computable General Equilibrium Model Fiscal Year 2013 Capability Development Report - April 2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Brian Keith; Rivera, Michael K.; Boero, Riccardo

    2014-04-01

    This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To do so, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine its simulation properties in more detail.

  2. Knowledge Based Simulation: An Artificial Intelligence Approach to System Modeling and Automating the Simulation Life Cycle.

    DTIC Science & Technology

    1988-04-13

    Knowledge Based Simulation: An Artificial Intelligence Approach to System Modeling and Automating the Simulation Life Cycle. Mark S. Fox, Nizwer Husain, Malcolm McRoberts and Y. V. Reddy. CMU-RI-TR-88-5, Intelligent Systems Laboratory, The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania. ...years of research in the application of Artificial Intelligence to Simulation. Our focus has been in two areas: the use of AI knowledge representation

  3. Simulated groundwater flow in the Ogallala and Arikaree aquifers, Rosebud Indian Reservation area, South Dakota - Revisions with data through water year 2008 and simulations of potential future scenarios

    USGS Publications Warehouse

    Long, Andrew J.; Putnam, Larry D.

    2010-01-01

    The Ogallala and Arikaree aquifers are important water resources in the Rosebud Indian Reservation area and are used extensively for irrigation, municipal, and domestic water supplies. Drought or increased withdrawals from the Ogallala and Arikaree aquifers in the Rosebud Indian Reservation area have the potential to affect water levels in these aquifers. This report documents revisions and recalibration of a previously published three-dimensional, numerical groundwater-flow model for this area. Data for a 30-year period (water years 1979 through 2008) were used in steady-state and transient numerical simulations of groundwater flow. Revisions to the model include (1) extension of the transient calibration period by 10 years, (2) the use of inverse modeling for steady-state calibration, (3) model calibration to base flow for an additional four surface-water drainage basins, (4) improved estimation of transient aquifer recharge, (5) improved delineation of vegetation types, and (6) reduced cell size near large-capacity water-supply wells. In addition, potential future scenarios were simulated to assess the potential effects of drought and increased groundwater withdrawals. The model comprised two layers: the upper layer represented the Ogallala aquifer and the lower layer represented the Arikaree aquifer. The model’s grid had 168 rows and 202 columns, most of which were 1,640 feet (500 meters) wide, with narrower rows and columns near large water-supply wells. Recharge to the Ogallala and Arikaree aquifers occurs from precipitation on the outcrop areas. The average recharge rates used for the steady-state simulation were 2.91 and 1.45 inches per year for the Ogallala aquifer and Arikaree aquifer, respectively, for a total rate of 255.4 cubic feet per second (ft3/s). Discharge from the aquifers occurs through evapotranspiration, discharge to streams as base flow and spring flow, and well withdrawals. Discharge rates for the steady-state simulation were 171
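    The per-aquifer recharge rates (inches per year) combine into a total flow (ft3/s) only through the outcrop areas, which are not restated in the abstract. A small sketch of the unit conversion, using purely hypothetical areas, shows the arithmetic:

```python
# Convert a recharge rate (inches/year) over an outcrop area (square
# miles) to a flow (cubic feet per second). The report combines
# 2.91 in/yr (Ogallala) and 1.45 in/yr (Arikaree) into 255.4 ft3/s;
# the outcrop areas below are hypothetical placeholders, NOT the
# report's values.

SECONDS_PER_YEAR = 365.25 * 24 * 3600
SQFT_PER_SQMI = 5280.0 ** 2

def recharge_cfs(rate_in_per_yr, area_sq_mi):
    rate_ft_per_yr = rate_in_per_yr / 12.0          # inches -> feet
    volume_ft3_per_yr = rate_ft_per_yr * area_sq_mi * SQFT_PER_SQMI
    return volume_ft3_per_yr / SECONDS_PER_YEAR

# Hypothetical outcrop areas, for illustration only:
total = recharge_cfs(2.91, 800.0) + recharge_cfs(1.45, 1200.0)
```

A handy anchor for the conversion: 12 inches/year over one square mile is a bit under 0.9 ft3/s.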

  4. Modeling and simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanham, R.; Vogt, W.G.; Mickle, M.H.

    1986-01-01

    This book presents the papers given at a conference on computerized simulation. Topics considered at the conference included expert systems, modeling in electric power systems, power systems operating strategies, energy analysis, a linear programming approach to optimum load shedding in transmission systems, econometrics, simulation in natural gas engineering, solar energy studies, artificial intelligence, vision systems, hydrology, multiprocessors, and flow models.

  5. Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.

    PubMed

    Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew

    2017-09-01

    Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and the key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. The articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on numbers of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved a moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggests an increased interest in simulation modelling in healthcare.

  6. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost-savings of the newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in work practices studied by the design team played a significant role in how work actually got done: actual lived work. Multi-tasking, informal assistance and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. 
In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to

  7. Comparison of Two Stochastic Daily Rainfall Models and their Ability to Preserve Multi-year Rainfall Variability

    NASA Astrophysics Data System (ADS)

    Kamal Chowdhury, AFM; Lockart, Natalie; Willgoose, Garry; Kuczera, George; Kiem, Anthony; Parana Manage, Nadeeka

    2016-04-01

    Stochastic simulation of rainfall is often required in the simulation of streamflow and reservoir levels for water security assessment. As reservoir water levels generally vary on monthly to multi-year timescales, it is important that these rainfall series accurately simulate the multi-year variability. However, the underestimation of multi-year variability is a well-known issue in daily rainfall simulation. Focusing on this issue, we developed a hierarchical Markov Chain (MC) model in a traditional two-part MC-Gamma Distribution modelling structure, but with a new parameterization technique. We used two parameters of a first-order MC process (transition probabilities of wet-to-wet and dry-to-dry days) to simulate the wet and dry days, and two parameters of a Gamma distribution (mean and standard deviation of wet-day rainfall) to simulate wet-day rainfall depths. We found that use of deterministic Gamma parameter values results in underestimation of the multi-year variability of rainfall depths. Therefore, we calculated the Gamma parameters for each month of each year from the observed data. Then, for each month, we fitted a multi-variate normal distribution to the calculated Gamma parameter values. In the model, we stochastically sampled these two Gamma parameters from the multi-variate normal distribution for each month of each year and used them to generate rainfall depths on wet days using the Gamma distribution. In another study, Mehrotra and Sharma (2007) proposed a semi-parametric Markov model. They also used a first-order MC process for rainfall occurrence simulation. However, the MC parameters were modified using an additional factor to incorporate the multi-year variability. Generally, the additional factor is analytically derived from the rainfall over pre-specified past periods (e.g., the last 30, 180, or 360 days). They used a non-parametric kernel density process to simulate the wet-day rainfall depths. In this study, we have compared the performance of our
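    A minimal sketch of the two-part structure described above: a first-order Markov chain for wet/dry occurrence and a Gamma distribution for wet-day depths, with the Gamma mean and standard deviation resampled each month to inject multi-year variability. All parameter values are invented for illustration, and independent normals stand in for the paper's month-specific multivariate normal.

```python
import random

# Two-part daily rainfall sketch: first-order Markov chain occurrence
# plus Gamma-distributed wet-day depths, where the Gamma mean/sd are
# themselves redrawn each month (the paper samples them jointly from a
# multivariate normal fitted per calendar month; independent normals
# are used here for brevity). All constants are illustrative.

P_WW, P_DD = 0.45, 0.80        # wet-to-wet / dry-to-dry transition probs
MEAN_MU, MEAN_SD = 8.0, 1.5    # monthly wet-day mean rainfall (mm)
SD_MU, SD_SD = 6.0, 1.0        # monthly wet-day rainfall sd (mm)

def simulate_month(n_days=30, wet=False):
    """Return daily rainfall depths (mm) for one month."""
    m = max(0.5, random.gauss(MEAN_MU, MEAN_SD))   # resampled Gamma mean
    s = max(0.5, random.gauss(SD_MU, SD_SD))       # resampled Gamma sd
    alpha, beta = (m / s) ** 2, s ** 2 / m         # Gamma shape/scale
    depths = []
    for _ in range(n_days):
        wet = random.random() < (P_WW if wet else 1.0 - P_DD)
        depths.append(random.gammavariate(alpha, beta) if wet else 0.0)
    return depths

series = [d for _ in range(120) for d in simulate_month()]  # ~10 years
```

The key design point is the moment-matching step: a Gamma with mean m and standard deviation s has shape (m/s)² and scale s²/m, so redrawing (m, s) monthly propagates year-to-year variability into the depth distribution rather than fixing it for the whole record.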

  8. Domestic Ice Breaking Simulation Model User Guide

    DTIC Science & Technology

    2012-04-01

    “Temperatures” sub-module. Notes on ice data sources: Selected Historical Ice Data; D9 Historical (SIGRID-coded) NBL waterways; D9 waterway numbers in the NBL scheme; D9 historical ice data (feet thickness) for main model waterways; SIGRID code conversion to feet of ice thickness; D9 historical ice data (feet thickness) for NBL waterways; SIGRID codes; years for ice data; types of ice; waterway; time; selected ice and weather data years; DOMICE simulation

  9. Implications of Simulation Conceptual Model Development for Simulation Management and Uncertainty Assessment

    NASA Technical Reports Server (NTRS)

    Pace, Dale K.

    2000-01-01

    A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.

  10. Simulation of Aerosols and Chemistry with a Unified Global Model

    NASA Technical Reports Server (NTRS)

    Chin, Mian

    2004-01-01

    This project continues the development of global simulation capabilities for tropospheric and stratospheric chemistry and aerosols in a unified global model, as part of our overall investigation of aerosol-chemistry-climate interaction. In the past year, we have enabled tropospheric chemistry simulations based on the GEOS-CHEM model, and added stratospheric chemical reactions into GEOS-CHEM such that a globally unified troposphere-stratosphere chemistry and transport can be simulated consistently without any simplifications. The tropospheric chemical mechanism in GEOS-CHEM includes 80 species and 150 reactions. 24 tracers are transported, including O3, NOx, total nitrogen (NOy), H2O2, CO, and several types of hydrocarbons. The chemical solver used in the GEOS-CHEM model is a highly accurate sparse-matrix vectorized Gear solver (SMVGEAR). The stratospheric chemical mechanism includes approximately 100 additional reactions and photolysis processes. Because of the large number of chemical reactions and photolysis processes and the very different photochemical regimes involved in the unified simulation, the model demands computer resources that are currently not practical. Therefore, several improvements will be pursued, such as massive parallelization, code optimization, or selecting a faster solver. We have also continued aerosol simulation (including sulfate, dust, black carbon, organic carbon, and sea salt) in the global model, covering most of the year 2002. These results have been made available to many groups worldwide and are accessible from the website http://code916.gsfc.nasa.gov/People/Chin/aot.html.

  11. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inference) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  12. Evolutionary Development of the Simulation by Logical Modeling System (SIBYL)

    NASA Technical Reports Server (NTRS)

    Wu, Helen

    1995-01-01

    Through the evolutionary development of the Simulation by Logical Modeling System (SIBYL) we have re-engineered the expensive and complex IBM mainframe-based Long-term Hardware Projection Model (LHPM) into a robust, cost-effective computer-based model that is easy to use. We achieved significant cost reductions and improved productivity in preparing long-term forecasts of Space Shuttle Main Engine (SSME) hardware. The LHPM for the SSME is a stochastic simulation model that projects the hardware requirements over 10 years. SIBYL is now the primary modeling tool for developing SSME logistics proposals and the Program Operating Plan (POP) for NASA, as well as divisional marketing studies.

  13. Simulation as a surgical teaching model.

    PubMed

    Ruiz-Gómez, José Luis; Martín-Parra, José Ignacio; González-Noriega, Mónica; Redondo-Figuero, Carlos Godofredo; Manuel-Palazuelos, José Carlos

    2018-01-01

    Teaching of surgery has been affected by many factors over recent years, such as the reduction of working hours, the optimization of operating room use, and patient safety. Traditional teaching methodology fails to reduce the impact of these factors on surgeons' training. Simulation as a teaching model minimizes such impact, and is more effective than traditional teaching methods for integrating knowledge and clinical-surgical skills. Simulation complements clinical assistance with training, creating a safe learning environment where patient safety is not affected, and ethical or legal conflicts are avoided. Simulation uses learning methodologies that allow individualized teaching, adapting it to the learning needs of each student. It also allows training of all kinds of technical, cognitive and behavioural skills. Copyright © 2017 AEC. Published by Elsevier España, S.L.U. All rights reserved.

  14. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  15. Notes on modeling and simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redondo, Antonio

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  16. Can climate models be tuned to simulate the global mean absolute temperature correctly?

    NASA Astrophysics Data System (ADS)

    Duan, Q.; Shi, Y.; Gong, W.

    2016-12-01

    The Intergovernmental Panel on Climate Change (IPCC) has already issued five assessment reports (ARs), which include simulations of the past climate and projections of the future climate under various scenarios. The participating models can simulate reasonably well the trend in global mean temperature change, especially over the last 150 years. However, there is a large, constant discrepancy in the global mean absolute temperature simulations over this period. This discrepancy remained in the same range between IPCC-AR4 and IPCC-AR5, and amounts to about 3 °C between the coldest model and the warmest model. This discrepancy has great implications for land processes, particularly those related to the cryosphere, and casts doubt on whether land-atmosphere-ocean interactions are correctly represented in those models. This presentation aims to explore whether this discrepancy can be reduced through model tuning. We present an automatic model calibration strategy to tune the parameters of a climate model so that the simulated global mean absolute temperature matches the observed data over the last 150 years. An intermediate-complexity model known as LOVECLIM is used in the study. This presentation will show the preliminary results.
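    The tuning idea can be illustrated with a toy zero-dimensional energy-balance model calibrated so that its equilibrium temperature matches a 288 K target. The model, the grid search, and every constant below are assumptions for illustration; they have nothing to do with LOVECLIM's actual physics or calibration machinery.

```python
# Toy illustration of tuning one parameter so a "model" reproduces an
# observed global mean absolute temperature. The model is a
# zero-dimensional energy balance, S/4 * (1 - albedo) = eps * sigma * T^4,
# solved for T; the tunable parameter is the planetary albedo.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant (W m-2 K-4)
S0 = 1361.0        # solar constant (W m-2)
EPS = 0.61         # effective emissivity (assumed)
T_OBS = 288.0      # target global mean temperature (K)

def equilibrium_temp(albedo):
    """Equilibrium temperature (K) of the energy-balance model."""
    return ((S0 / 4.0) * (1.0 - albedo) / (EPS * SIGMA)) ** 0.25

# Crude calibration: grid-search the albedo in [0.200, 0.399] for the
# value whose simulated temperature is closest to the target.
best = min((a / 1000.0 for a in range(200, 400)),
           key=lambda a: abs(equilibrium_temp(a) - T_OBS))
```

Real calibration strategies replace the grid search with an optimizer and the toy model with an expensive climate simulation, but the structure (parameter in, misfit against observations out, minimize) is the same.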

  17. A new synoptic scale resolving global climate simulation using the Community Earth System Model

    NASA Astrophysics Data System (ADS)

    Small, R. Justin; Bacmeister, Julio; Bailey, David; Baker, Allison; Bishop, Stuart; Bryan, Frank; Caron, Julie; Dennis, John; Gent, Peter; Hsu, Hsiao-ming; Jochum, Markus; Lawrence, David; Muñoz, Ernesto; diNezio, Pedro; Scheitlin, Tim; Tomas, Robert; Tribbia, Joseph; Tseng, Yu-heng; Vertenstein, Mariana

    2014-12-01

    High-resolution global climate modeling holds the promise of capturing planetary-scale climate modes and small-scale (regional and sometimes extreme) features simultaneously, including their mutual interaction. This paper discusses a new state-of-the-art high-resolution Community Earth System Model (CESM) simulation that was performed with these goals in mind. The atmospheric component was at 0.25° grid spacing, and ocean component at 0.1°. One hundred years of "present-day" simulation were completed. Major results were that annual mean sea surface temperature (SST) in the equatorial Pacific and El-Niño Southern Oscillation variability were well simulated compared to standard resolution models. Tropical and southern Atlantic SST also had much reduced bias compared to previous versions of the model. In addition, the high resolution of the model enabled small-scale features of the climate system to be represented, such as air-sea interaction over ocean frontal zones, mesoscale systems generated by the Rockies, and Tropical Cyclones. Associated single component runs and standard resolution coupled runs are used to help attribute the strengths and weaknesses of the fully coupled run. The high-resolution run employed 23,404 cores, costing 250 thousand processor-hours per simulated year and made about two simulated years per day on the NCAR-Wyoming supercomputer "Yellowstone."
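    The quoted run statistics are mutually consistent, as a quick back-of-the-envelope check shows:

```python
# Consistency check on the quoted CESM run statistics: 23,404 cores at
# 250,000 processor-hours per simulated year implies roughly
# cores * 24 hours / cost = simulated years per wall-clock day,
# assuming the whole allocation runs continuously.

cores = 23404
core_hours_per_sim_year = 250_000

sim_years_per_day = cores * 24 / core_hours_per_sim_year
```

This comes out to roughly 2.2, matching the reported "about two simulated years per day".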

  18. Variability simulations with a steady, linearized primitive equations model

    NASA Technical Reports Server (NTRS)

    Kinter, J. L., III; Nigam, S.

    1985-01-01

    Solutions of the steady primitive equations on a sphere, linearized about a zonally symmetric basic state, are computed for the purpose of simulating monthly mean variability in the troposphere. The basic states are observed winter monthly mean, zonal means of zonal and meridional velocities, temperatures and surface pressures computed from the 15-year NMC time series. A least-squares fit to a series of Legendre polynomials is used to compute the basic states between 20 H and the equator, and the hemispheres are assumed symmetric. The model is spectral in the zonal direction, and centered differences are employed in the meridional and vertical directions. Since the model is steady and linear, the solution is obtained by inversion of a block penta-diagonal matrix. The model simulates the climatology of the GFDL nine-level spectral general circulation model quite closely, particularly in middle latitudes above the boundary layer. This experiment is an extension of that simulation to examine the variability of the steady, linear solution.

  19. Understanding Contamination; Twenty Years of Simulating Radiological Contamination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emily Snyder; John Drake; Ryan James

    A wide variety of simulated contamination methods have been developed by researchers to reproducibly test radiological decontamination methods. Some twenty years ago a method of non-radioactive contamination simulation was proposed at the Idaho National Laboratory (INL) that mimicked the character of radioactive cesium and zirconium contamination on stainless steel. It involved baking the contamination into the surface of the stainless steel in order to 'fix' it into a tenacious, tightly bound oxide layer. This type of contamination was particularly applicable to nuclear processing facilities (and nuclear reactors) where oxide growth and exchange of radioactive materials within the oxide layer became the predominant model for material/contaminant interaction. Additional simulation methods and their empirically derived basis (from a nuclear fuel reprocessing facility) are discussed. In the last ten years the INL, working with the Defense Advanced Research Projects Agency (DARPA) and the National Homeland Security Research Center (NHSRC), has continued to develop contamination simulation methodologies. The most notable of these newer methodologies was developed to compare the efficacy of different decontamination technologies against radiological dispersal device (RDD, 'dirty bomb') contamination. There are many different scenarios for how RDD contamination may be spread, but the one most commonly used at the INL involves the dispersal of an aqueous solution containing radioactive Cs-137. This method was chosen during the DARPA projects, has continued through the NHSRC series of decontamination trials, and also gives a tenacious 'fixed' contamination. Much has been learned about the interaction of cesium contamination with building materials, particularly concrete, throughout these tests. The effects of porosity, the cation-exchange capacity of the material, and the amount of dirt and debris on the surface are very important factors. The interaction of

  20. Effect of a Simulation Exercise on Restorative Identification Skills of First Year Dental Hygiene Students.

    PubMed

    Lemaster, Margaret; Flores, Joyce M; Blacketer, Margaret S

    2016-02-01

    This study explored the effectiveness of simulated mouth models for improving the identification and recording of dental restorations, compared to traditional didactic instruction combined with 2-dimensional images. Simulation has been adopted into medical and dental education curricula to improve both student learning and patient safety outcomes. A 2-sample, independent t-test analysis of data was conducted to compare graded dental recordings of dental hygiene students using simulated mouth models and dental hygiene students using 2-dimensional photographs. Evaluations from graded dental charts were analyzed and compared between groups of students using the simulated mouth models containing random placements of custom preventive and restorative materials and traditional 2-dimensional representations of didactically described conditions. Results demonstrated a statistically significant (p≤0.0001) difference: in the experimental group, students using the simulated mouth models to identify and record dental conditions had a mean score of 86.73 and a variance of 33.84, while in the control group, students using traditional 2-dimensional images had a mean graded dental chart score of 74.43 and a variance of 14.25. Using modified simulation technology for dental charting identification may increase the level of dental charting skill competency in first year dental hygiene students. Copyright © 2016 The American Dental Hygienists’ Association.
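    With the reported group means and variances, the two-sample comparison can be reconstructed as a Welch t statistic. The abstract does not report group sizes, so n = 30 per group below is a purely hypothetical assumption for illustration.

```python
import math

# Reconstructing a two-sample (Welch) t statistic from reported summary
# statistics. Group means/variances come from the abstract; the group
# sizes are NOT reported there, so n = 30 per group is an assumption.

def welch_t(m1, v1, n1, m2, v2, n2):
    """Welch's t statistic and approximate (Welch-Satterthwaite) df."""
    se2 = v1 / n1 + v2 / n2                       # squared standard error
    t = (m1 - m2) / math.sqrt(se2)
    df = se2 ** 2 / ((v1 / n1) ** 2 / (n1 - 1) +
                     (v2 / n2) ** 2 / (n2 - 1))
    return t, df

t, df = welch_t(86.73, 33.84, 30, 74.43, 14.25, 30)
```

Under this assumed n, the statistic is far into the tail of the t distribution, which is at least consistent with the reported p≤0.0001.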

  1. Standards in Modeling and Simulation: The Next Ten Years MODSIM World Paper 2010

    NASA Technical Reports Server (NTRS)

    Collins, Andrew J.; Diallo, Saikou; Sherfey, Solomon R.; Tolk, Andreas; Turnitsa, Charles D.; Petty, Mikel; Wiesel, Eric

    2011-01-01

    The world has moved on since the introduction of the Distributed Interactive Simulation (DIS) standard in the early 1980s. The cold war may be over, but there is still a requirement to train for and analyze the next generation of threats that face the free world. The emergence of new and more powerful computer technology and techniques means that modeling and simulation (M&S) has become an important and growing part of satisfying this requirement. As an industry grows, the benefits from standardization within that industry grow with it. For example, it is difficult to imagine what the USA would be like without the 110-volt standard for domestic electricity supply. This paper contains an overview of the outcomes from a recent workshop to investigate the possible future of M&S standards within the federal government.

  2. Multi-millennia simulation of Greenland deglaciation from the Max-Planck-Institute Model (MPI-ISM) 2xCO2 simulation

    NASA Astrophysics Data System (ADS)

    Cabot, Vincent; Vizcaino, Miren; Mikolajewicz, Uwe

    2016-04-01

    Long-term coupled ice sheet and climate simulations are of great interest since they assess how the Greenland Ice Sheet (GrIS) will respond to global warming and how GrIS changes will impact the climate system. We have run the Max-Planck-Institute Earth System Model coupled with an Ice Sheet Model (SICOPOLIS) over a time period of 10500 years under two-times-CO2 forcing. This is a coupled atmosphere (ECHAM5T31), ocean (MPI-OM), dynamic vegetation (LPJ), and ice sheet (SICOPOLIS, 10 km horizontal resolution) model. Given the multi-millennia simulation, the horizontal spatial resolution of the atmospheric component is relatively coarse (3.75°). A time-saving technique (asynchronous coupling) is used once the global climate reaches quasi-equilibrium. In our doubling-CO2 simulation, the GrIS is expected to break up into two pieces (one ice cap in the far north and one ice sheet in the south and east) after 3000 years. During the first 500 simulation years, the GrIS climate and surface mass balance (SMB) are mainly affected by the greenhouse-effect-forced climate change. After simulated year 500, the global climate reaches quasi-equilibrium. Henceforth Greenland climate change is mainly due to ice sheet decay. GrIS albedo reduction enhances melt and acts as a powerful feedback for deglaciation. Due to increased cloudiness in the Arctic region as a result of global climate change, summer incoming shortwave radiation is substantially reduced over Greenland, reducing deglaciation rates. At the end of the simulation, Greenland becomes green, with forest growing over the newly deglaciated regions. References: Helsen, M. M., van de Berg, W. J., van de Wal, R. S. W., van den Broeke, M. R., and Oerlemans, J. (2013), Coupled regional climate-ice-sheet simulation shows limited Greenland ice loss during the Eemian, Climate of the Past, 9, 1773-1788, doi: 10.5194/cp-9-1773-2013; Helsen, M. M., van de Wal, R. S. W., van den Broeke, M. R., van de Berg, W. J., and Oerlemans, J
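    The asynchronous coupling technique mentioned above can be sketched abstractly: once the climate is near its forced equilibrium, the (cheaper) ice-sheet model is stepped several years per climate-model year under frozen forcing. Both component "models" below are trivial stand-ins, and every constant is an assumption for illustration only.

```python
# Sketch of asynchronous coupling between a slow climate model and a
# fast ice-sheet model. Stand-in physics: the climate relaxes toward a
# forced equilibrium warming; the ice sheet loses volume at a rate that
# scales with the warming. All constants are illustrative assumptions.

ACCEL = 10          # ice-sheet years per climate year near equilibrium
T_EQ = 4.0          # assumed 2xCO2 equilibrium warming (K)

def step_climate(temp):
    """One climate-model year: relax toward the forced equilibrium."""
    return temp + 0.05 * (T_EQ - temp)

def step_ice(volume, temp, years):
    """Ice-sheet model stepped `years` years under frozen forcing."""
    melt_rate = max(0.0, 0.002 * temp)   # fractional loss per year
    return volume * (1.0 - melt_rate) ** years

climate_temp = 0.0                       # warming anomaly (K)
ice_volume = 100.0                       # arbitrary units

for _ in range(500):
    climate_temp = step_climate(climate_temp)
    near_equilibrium = abs(T_EQ - climate_temp) < 0.1
    # Accelerate the ice component only once the climate has settled.
    ice_volume = step_ice(ice_volume, climate_temp,
                          ACCEL if near_equilibrium else 1)
```

The payoff is that the expensive component (the climate model) runs far fewer years than the ice sheet experiences, which is what makes 10500-year coupled integrations affordable.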

  3. Theory and observations: Model simulations of the period 1955-1985

    NASA Technical Reports Server (NTRS)

    Isaksen, Ivar S. A.; Eckman, R.; Lacis, A.; Ko, Malcolm K. W.; Prather, M.; Pyle, J.; Rodhe, H.; Stordal, Frode; Stolarski, R. S.; Turco, R. P.

    1989-01-01

    The main objective of the theoretical studies presented here is to apply models of stratospheric chemistry and transport in order to understand the processes that control stratospheric ozone and that are responsible for the observed variations. The model calculations are intended to simulate the observed behavior of atmospheric ozone over the past three decades (1955-1985), for which there exists a substantial record of both ground-based and, more recently, satellite measurements. Ozone concentrations in the atmosphere vary on different time scales and for several different causes. The models described here were designed to simulate the effect on ozone of changes in the concentrations of trace gases such as CFCs, CH4, N2O, and CO2. Year-to-year changes in ultraviolet radiation associated with the solar cycle are also included in the models. A third source of variability explicitly considered is the sporadic injection of large amounts of NOx into the stratosphere during atmospheric nuclear tests.

  4. DEVELOPMENT AND ANALYSIS OF AIR QUALITY MODELING SIMULATIONS FOR HAZARDOUS AIR POLLUTANTS

    EPA Science Inventory

    The concentrations of five hazardous air pollutants were simulated using the Community Multi Scale Air Quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results a...

  5. Progress in modeling and simulation.

    PubMed

    Kindler, E

    1998-01-01

    For modeling systems, computers are used more and more, while the other "media" (including the human intellect) that carry models are being abandoned. For modeling knowledge, i.e. more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For modeling processes that exist and develop in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of much use for simulation, while programming tools for simulation do not enable their users to exploit the advantages of object-oriented programming. Nevertheless, there are exceptions that make it possible to use general concepts represented in a computer for constructing simulation models and for modifying them easily. They are described in the present paper, together with precise definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions but risk introducing misunderstanding), and an outline of their applications and further development. Given that computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is directed to models of systems that themselves contain modeling components.

  6. Research on monocentric model of urbanization by agent-based simulation

    NASA Astrophysics Data System (ADS)

    Xue, Ling; Yang, Kaizhong

    2008-10-01

    Over the past years, GIS has been widely used for modeling urbanization from a variety of perspectives, such as digital terrain representation and overlay analysis using cell-based data platforms. Similarly, simulation of urban dynamics has been achieved with the use of cellular automata. In contrast to these approaches, agent-based simulation provides a much more powerful set of tools, allowing researchers to set up a computational counterpart of real environmental and urban systems for experimentation and scenario analysis. This paper reviews research on the economic mechanisms of urbanization, and an agent-based monocentric model is set up to further the understanding of the urbanization process and its mechanisms in China. We build an endogenous growth model with dynamic interactions between spatial agglomeration and urban development using agent-based simulation. It simulates the migration decisions of two main types of agents, namely rural and urban households, between rural and urban areas. The model contains multiple economic interactions that are crucial to understanding urbanization and industrialization in China. These adaptive agents can adjust their supply and demand according to the market situation through a learning algorithm. The simulation results show that this agent-based urban model is able to reproduce observed patterns and to produce plausible projections of reality.

  7. Bringing consistency to simulation of population models--Poisson simulation as a bridge between micro and macro simulation.

    PubMed

    Gustafsson, Leif; Sternad, Mikael

    2007-10-01

    Population models concern collections of discrete entities such as atoms, cells, humans, animals, etc., where the focus is on the number of entities in a population. Because of the complexity of such models, simulation is usually needed to reproduce their complete dynamic and stochastic behaviour. Two main types of simulation models are used for different purposes, namely micro-simulation models, where each individual is described with its particular attributes and behaviour, and macro-simulation models based on stochastic differential equations, where the population is described in aggregated terms by the number of individuals in different states. Consistency between micro- and macro-models is a crucial but often neglected aspect. This paper demonstrates how the Poisson Simulation technique can be used to produce a population macro-model consistent with the corresponding micro-model. This is accomplished by defining Poisson Simulation in strictly mathematical terms as a series of Poisson processes that generate sequences of Poisson distributions with dynamically varying parameters. The method can be applied to any population model. It provides the unique stochastic and dynamic macro-model consistent with a correct micro-model. The paper also presents a general macro form for stochastic and dynamic population models. In an appendix, Poisson Simulation is compared with Markov Simulation, showing a number of advantages. In particular, aggregation into state variables and aggregation of many events per time step make Poisson Simulation orders of magnitude faster than Markov Simulation. Furthermore, much larger and more complicated models can be built and executed with Poisson Simulation than is possible with the Markov approach.
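
    The core of the technique described above can be sketched in a few lines: per time step, each flow between population states is drawn from a Poisson distribution whose rate varies dynamically with the current state. The SIR-style compartments, rates, and seed below are illustrative assumptions, not the paper's own example.

```python
import numpy as np

def poisson_sim_sir(S0, I0, R0, beta, gamma, dt, steps, rng):
    """Macro-level SIR population model where the per-step transfers are
    Poisson draws with dynamically varying rates (Poisson Simulation)."""
    S, I, R = S0, I0, R0
    N = S0 + I0 + R0
    history = [(S, I, R)]
    for _ in range(steps):
        # Expected flows over dt, each realized as a Poisson draw
        infections = rng.poisson(beta * S * I / N * dt)
        recoveries = rng.poisson(gamma * I * dt)
        infections = min(infections, S)              # keep states non-negative
        recoveries = min(recoveries, I + infections)
        S -= infections
        I += infections - recoveries
        R += recoveries
        history.append((S, I, R))
    return history

rng = np.random.default_rng(42)
traj = poisson_sim_sir(990, 10, 0, beta=0.3, gamma=0.1, dt=0.5, steps=200, rng=rng)
S, I, R = traj[-1]
print(S + I + R)  # population is conserved by construction: 1000
```

    Because each flow is an integer Poisson draw, the macro-model reproduces the stochastic variability of an individual-based micro-model while aggregating many events per time step.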

  8. Hydrological and water quality processes simulation by the integrated MOHID model

    NASA Astrophysics Data System (ADS)

    Epelde, Ane; Antiguedad, Iñaki; Brito, David; Eduardo, Jauch; Neves, Ramiro; Sauvage, Sabine; Sánchez-Pérez, José Miguel

    2016-04-01

    Different modelling approaches have been used in recent decades to study the water quality degradation caused by non-point source pollution. In this study, the MOHID fully distributed, physics-based model was employed to simulate hydrological processes and nitrogen dynamics in a nitrate vulnerable zone: the Alegria River watershed (Basque Country, Northern Spain). The results of this study indicate that the MOHID code is suitable for simulating hydrological processes at the watershed scale, as the model shows satisfactory performance in simulating discharge (NSE: 0.74 and 0.76 during the calibration and validation periods, respectively). The agronomical component of the code allowed the simulation of agricultural practices, which led to adequate crop yield simulation in the model. Furthermore, nitrogen exportation also shows satisfactory performance (NSE: 0.64 and 0.69 during the calibration and validation periods, respectively). While the lack of field measurements does not allow the nutrient cycling processes to be evaluated in depth, it has been observed that the MOHID model simulates annual denitrification within the general ranges established for agricultural watersheds (in this study, 9 kg N ha-1 year-1). In addition, the model coherently simulates the spatial distribution of the denitrification process, which is directly linked to the simulated hydrological conditions. Thus, the model localizes the highest rates near the discharge zone of the aquifer and where the aquifer thickness is low. These results demonstrate the strength of this model for simulating watershed-scale hydrological processes, as well as crop production and the water quality degradation derived from agricultural activity (considering both nutrient exportation and nutrient cycling processes).

  9. Modeling and Simulation: PowerBoosting Productivity with Simulation.

    ERIC Educational Resources Information Center

    Riley, Suzanne

    Minnesota high school students and teachers are learning the technology of simulation and integrating it into business and industrial technology courses. Modeling and simulation is the science of using software to construct a system within an organization and then running simulations of proposed changes to assess results before funds are spent. In…

  10. Digital Simulation and Modelling.

    ERIC Educational Resources Information Center

    Hawthorne, G. B., Jr.

    A basically tutorial point of view is taken in this general discussion. The author examines the basic concepts and principles of simulation and modelling and the application of digital computers to these tasks. Examples of existing simulations, a discussion of the applicability and feasibility of simulation studies, a review of simulation…

  11. Development of the Transport Class Model (TCM) Aircraft Simulation From a Sub-Scale Generic Transport Model (GTM) Simulation

    NASA Technical Reports Server (NTRS)

    Hueschen, Richard M.

    2011-01-01

    A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, subscale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport-operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate-limits and generally fixed position limits. The simulation contains a generic 40,000 lb sea level thrust engine model. The engine model is a first order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides a means for interfacing a flight control system to use the simulation sensor variables and to command the surface actuators and throttle position of the engine model.
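
    The engine model described above, a first-order lag whose time constant varies with simulation conditions, can be sketched as a single Euler update. The time-constant schedule, step size, and spool-up scenario below are hypothetical illustrations, not the TCM's actual values.

```python
def first_order_engine(thrust_cmd, thrust, tau, dt):
    """One Euler step of a first-order engine lag:
    d(thrust)/dt = (command - thrust) / tau."""
    return thrust + (thrust_cmd - thrust) * dt / tau

# Spool up from idle toward a 40,000 lb command; the time constant
# switching at 20,000 lb is a hypothetical condition-dependent schedule.
thrust, dt = 0.0, 0.02
for _ in range(500):
    tau = 1.0 if thrust < 20000.0 else 2.0
    thrust = first_order_engine(40000.0, thrust, tau, dt)
print(round(thrust))  # close to, but below, the 40,000 lb command
```

    A flight control system would drive `thrust_cmd` from throttle position each frame, with the variable time constant shaping the transient response.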

  12. Exposing earth surface process model simulations to a large audience

    NASA Astrophysics Data System (ADS)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors, that can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with focus on hurricane Katrina and superstorm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least two model datasets a year, and will soon provide displays of global river sediment fluxes and changes of the sea-ice-free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Dataset storyboards and teacher follow-up materials associated with the simulations are developed to address common core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they are looking at numerical model results, that the underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical

  13. Integrated Water Resources Simulation Model for Rural Community

    NASA Astrophysics Data System (ADS)

    Li, Y.-H.; Liao, W.-T.; Tung, C.-P.

    2012-04-01

    The purpose of this study is to develop water resources simulation models for residential houses, constructed wetlands and farms, and then to integrate these models for a rural community. Domestic and irrigation water uses are the major water demands in a rural community. To build a model estimating domestic water demand for residential houses, the average water use per person per day is accounted for first, including water use in the kitchen, bathroom, toilet and laundry. On the other hand, rice is the major crop in the study region, and its productive efficiency sometimes depends on the quantity of irrigation water. The water demand can be estimated from crop water use, field leakage and water distribution loss. Irrigation water comes from rainfall, the water supply system and reclaimed water treated by constructed wetlands. In recent years, constructed wetlands have played an important role in water resource recycling. They can purify domestic wastewater for water recycling and reuse. After treatment in constructed wetlands, the reclaimed water can be reused for washing toilets, watering gardens and irrigating farms. Constructed wetlands are a highly economical means of treating wastewater, imitating the processing mechanisms of natural wetlands. In general, the treatment efficiency of constructed wetlands is determined by evapotranspiration, inflow, and water temperature. This study uses system dynamics modeling to develop models for different water resource components in a rural community. Furthermore, these models are integrated into a whole system. The model not only simulates how water moves through the different components, including residential houses, constructed wetlands and farms, but also evaluates the efficiency of water use. By analyzing the flow of water, the water resource simulation model can optimize water resource distribution under different scenarios, and the result can provide suggestions for designing the water resource system of a
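
    The stock-and-flow bookkeeping that such a system dynamics model performs can be sketched as a single update rule per time step. All fluxes, units, and the reuse fraction below are hypothetical illustrations of the structure, not values from the study.

```python
def community_water_step(storage, rain, supply, domestic_use, et, reuse_frac, dt):
    """One system-dynamics step for a community water stock (arbitrary units):
    wastewater treated in a constructed wetland is partly reclaimed for reuse."""
    reclaimed = reuse_frac * domestic_use      # wetland-treated water reused
    inflow = rain + supply + reclaimed
    outflow = domestic_use + et                # consumption plus evapotranspiration
    return storage + dt * (inflow - outflow)

storage = 100.0
for _ in range(30):  # a month of daily steps with constant hypothetical fluxes
    storage = community_water_step(storage, rain=2.0, supply=5.0,
                                   domestic_use=6.0, et=1.5, reuse_frac=0.4, dt=1.0)
print(round(storage, 1))  # 100 + 30 * (9.4 - 7.5) = 157.0
```

    Coupling several such stocks (house, wetland, farm) through their shared flows gives the integrated community model the abstract describes.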

  14. Stratospheric temperatures and tracer transport in a nudged 4-year middle atmosphere GCM simulation

    NASA Astrophysics Data System (ADS)

    van Aalst, M. K.; Lelieveld, J.; Steil, B.; Brühl, C.; Jöckel, P.; Giorgetta, M. A.; Roelofs, G.-J.

    2005-02-01

    We have performed a 4-year simulation with the Middle Atmosphere General Circulation Model MAECHAM5/MESSy, while slightly nudging the model's meteorology in the free troposphere (below 113 hPa) towards ECMWF analyses. We show that the nudging technique, which leaves the middle atmosphere almost entirely free, enables comparisons with synoptic observations. The model successfully reproduces many specific features of the interannual variability, including details of the Antarctic vortex structure. In the Arctic, the model captures general features of the interannual variability, but falls short in reproducing the timing of sudden stratospheric warmings. A detailed comparison of the nudged model simulations with ECMWF data shows that the model simulates realistic stratospheric temperature distributions and variabilities, including the temperature minima in the Antarctic vortex. Some small (a few K) model biases were also identified, including a summer cold bias at both poles, and a general cold bias in the lower stratosphere, most pronounced in midlatitudes. A comparison of tracer distributions with HALOE observations shows that the model successfully reproduces specific aspects of the instantaneous circulation. The main tracer transport deficiencies occur in the polar lowermost stratosphere. These are related to the tropopause altitude as well as the tracer advection scheme and model resolution. The additional nudging of equatorial zonal winds, forcing the quasi-biennial oscillation, significantly improves stratospheric temperatures and tracer distributions.
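
    Nudging (Newtonian relaxation), as used in this study and in the WRF downscaling work above, amounts to adding a relaxation term that pulls a prognostic variable toward a reference analysis on a chosen time scale. A minimal one-variable sketch, with hypothetical drift and relaxation values:

```python
def nudged_step(x, x_ref, dt, tau, forcing):
    """One Euler step of Newtonian relaxation (nudging):
    dx/dt = forcing(x) + (x_ref - x) / tau."""
    return x + dt * (forcing(x) + (x_ref - x) / tau)

# A toy model temperature drifting warm at 0.5 K/day, nudged toward a
# 220 K analysis with a 0.5-day relaxation time (all values hypothetical).
drift = lambda x: 0.5
x, analysis = 230.0, 220.0
for _ in range(1000):
    x = nudged_step(x, analysis, dt=0.01, tau=0.5, forcing=drift)
print(round(x, 2))  # settles at 220.25: analysis value plus drift * tau
```

    The choice of tau controls the balance the abstracts discuss: a short tau constrains the model tightly to the analyses, while a long tau leaves the model free to develop its own fields.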

  15. Modeling and simulation of flow field in giant magnetostrictive pump

    NASA Astrophysics Data System (ADS)

    Zhao, Yapeng; Ren, Shiyong; Lu, Quanguo

    2017-09-01

    In recent years, there has been significant research into the design and analysis of the giant magnetostrictive pump. In this paper, the flow field model of a giant magnetostrictive pump was established and the relationship between pressure loss and the working frequency of the piston was studied by numerical simulation. Then, the influence of different pump chamber heights on pressure loss in the giant magnetostrictive pump was studied by means of flow field simulation. Finally, the fluid pressure and velocity vector distribution in the giant magnetostrictive pump chamber were simulated.

  16. Reimplementation of the Biome-BGC model to simulate successional change.

    PubMed

    Bond-Lamberty, Ben; Gower, Stith T; Ahl, Douglas E; Thornton, Peter E

    2005-04-01

    Biogeochemical process models are increasingly employed to simulate current and future forest dynamics, but most simulate only a single canopy type. This limitation means that mixed stands, canopy succession and understory dynamics cannot be modeled, severe handicaps in many forests. The goals of this study were to develop a version of Biome-BGC that supported multiple, interacting vegetation types, and to assess its performance and limitations by comparing modeled results to published data from a 150-year boreal black spruce (Picea mariana (Mill.) BSP) chronosequence in northern Manitoba, Canada. Model data structures and logic were modified to support an arbitrary number of interacting vegetation types; an explicit height calculation was necessary to prioritize radiation and precipitation interception. Two vegetation types, evergreen needle-leaf and deciduous broadleaf, were modeled based on site-specific meteorological and physiological data. The new version of Biome-BGC reliably simulated observed changes in leaf area, net primary production and carbon stocks, and should be useful for modeling the dynamics of mixed-species stands and ecological succession. We discuss the strengths and limitations of Biome-BGC for this application, and note areas in which further work is necessary for reliable simulation of boreal biogeochemical cycling at a landscape scale.

  17. Simulation workshops with first year midwifery students.

    PubMed

    Catling, Christine; Hogan, Rosemarie; Fox, Deborah; Cummins, Allison; Kelly, Michelle; Sheehan, Athena

    2016-03-01

    Simulated teaching methods enable a safe learning environment that is structured, constructive and reflective. We prepared a 2-day simulation project to help prepare students for their first clinical practice. A quasi-experimental pre-test/post-test design was used. Qualitative data from the open-ended survey questions were analysed using content analysis. Confidence intervals and p-values were calculated to demonstrate the changes in participants' levels of understanding/ability or confidence in the clinical midwifery skills included in the simulation. Seventy-one midwifery students participated. Students rated their understanding, confidence, and abilities as higher after the simulation workshop, and higher still after their clinical experience. Five main themes arose from the qualitative data: having a learning experience, building confidence, identifying learning needs, developing communication skills and putting skills into practice. First year midwifery students felt well prepared for the clinical workplace following the simulation workshops. Self-rated understanding, confidence and abilities in clinical midwifery skills were significantly higher following consolidation during clinical placement. Longitudinal studies on the relationship between simulation activities and students' overall clinical experience, their intentions to remain in midwifery, and facility feedback would be desirable.

  18. Atmospheric Sulfur Cycle Simulated in The Global Model GOCART: Model Description and Global Properties

    NASA Technical Reports Server (NTRS)

    Chin, Mian; Rood, Richard B.; Lin, Shian-Jiann; Mueller, Jean-Francois; Thompson, Anne M.

    2000-01-01

    The Georgia Tech/Goddard Global Ozone Chemistry Aerosol Radiation and Transport (GOCART) model is used to simulate the atmospheric sulfur cycle. The model uses simulated meteorological data from the Goddard Earth Observing System Data Assimilation System (GEOS DAS). Global sulfur budgets from a 6-year simulation for SO2, sulfate, dimethylsulfide (DMS), and methanesulfonic acid (MSA) are presented in this paper. In a normal year without major volcanic perturbations, about 20% of the sulfate precursor emission is from natural sources (biogenic and volcanic) and 80% is anthropogenic; the same sources contribute 33% and 67%, respectively, to the total sulfate burden. A sulfate production efficiency of 0.41-0.42 is estimated in the model, an efficiency defined as the ratio of the amount of sulfate produced to the total amount of SO2 emitted and produced in the atmosphere. This value indicates that less than half of the SO2 entering the atmosphere contributes to sulfate production, the rest being removed by dry and wet deposition. In a simulation for 1990, we estimate a total sulfate production of 39 Tg S/yr, with 36% and 64% respectively from in-air and in-cloud oxidation of SO2. We also demonstrate that major volcanic eruptions, such as the Mt. Pinatubo eruption in 1991, can significantly change the sulfate formation pathways, distributions, abundance, and lifetime. Comparison with other models shows that the parameterizations for wet removal or wet production of sulfate are the most critical factors in determining the burdens of SO2 and sulfate. Therefore, a priority for future research should be to reduce the large uncertainties associated with the wet physical and chemical processes.
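
    The production efficiency defined above is simply sulfate produced divided by total SO2 entering the atmosphere (directly emitted plus produced in-air, e.g. from DMS oxidation). A worked example, with hypothetical fluxes chosen only to land in the quoted 0.41-0.42 range:

```python
# Hypothetical fluxes (Tg S/yr) illustrating the efficiency definition;
# only the 39 Tg S/yr sulfate production figure comes from the abstract.
so2_emitted = 70.0       # direct SO2 emissions
so2_produced = 25.0      # SO2 produced in-atmosphere (e.g. from DMS)
sulfate_produced = 39.0

efficiency = sulfate_produced / (so2_emitted + so2_produced)
print(round(efficiency, 3))  # about 0.41: the remaining SO2 is deposited
```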

  19. Transient climate simulations of the deglaciation 21-9 thousand years before present; PMIP4 Core experiment design and boundary conditions

    NASA Astrophysics Data System (ADS)

    Ivanovic, Ruza; Gregoire, Lauren; Kageyama, Masa; Roche, Didier; Valdes, Paul; Burke, Andrea; Drummond, Rosemarie; Peltier, W. Richard; Tarasov, Lev

    2016-04-01

    The last deglaciation, which marked the transition between the last glacial and present interglacial periods, was punctuated by a series of rapid (centennial and decadal) climate changes. Numerical climate models are useful for investigating mechanisms that underpin the events, especially now that some of the complex models can be run for multiple millennia. We have set up a Paleoclimate Modelling Intercomparison Project (PMIP) working group to coordinate efforts to run transient simulations of the last deglaciation, and to facilitate the dissemination of expertise between modellers and those engaged with reconstructing the climate of the last 21 thousand years. Here, we present the design of a coordinated Core simulation over the period 21-9 thousand years before present (ka) with time varying orbital forcing, greenhouse gases, ice sheets, and other geographical changes. A choice of two ice sheet reconstructions is given. Additional focussed simulations will also be coordinated on an ad-hoc basis by the working group, for example to investigate the effect of ice sheet and iceberg meltwater, and the uncertainty in other forcings. Some of these focussed simulations will concentrate on shorter durations around specific events to allow the more computationally expensive models to take part. Ivanovic, R. F., Gregoire, L. J., Kageyama, M., Roche, D. M., Valdes, P. J., Burke, A., Drummond, R., Peltier, W. R., and Tarasov, L.: Transient climate simulations of the deglaciation 21-9 thousand years before present; PMIP4 Core experiment design and boundary conditions, Geosci. Model Dev. Discuss., 8, 9045-9102, doi:10.5194/gmdd-8-9045-2015, 2015.

  20. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1988-01-01

    The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to, and by a specification language that is more natural to the user's problem domain and to the user's way of thinking about the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation systems. Specific emphasis is on the design and development of simulation tools that assist the modeler in defining or constructing a model of the system and then automatically write the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.

  1. Consistent biases in Antarctic sea ice concentration simulated by climate models

    NASA Astrophysics Data System (ADS)

    Roach, Lettie A.; Dean, Samuel M.; Renwick, James A.

    2018-01-01

    The simulation of Antarctic sea ice in global climate models often does not agree with observations. In this study, we examine the compactness of sea ice, as well as the regional distribution of sea ice concentration, in climate models from the latest Coupled Model Intercomparison Project (CMIP5) and in satellite observations. We find substantial differences in concentration values between different sets of satellite observations, particularly at high concentrations, requiring careful treatment when comparing to models. As a fraction of total sea ice extent, models simulate too much loose, low-concentration sea ice cover throughout the year, and too little compact, high-concentration cover in the summer. In spite of the differences in physics between models, these tendencies are broadly consistent across the population of 40 CMIP5 simulations, a result not previously highlighted. Separating models with and without an explicit lateral melt term, we find that inclusion of lateral melt may account for overestimation of low-concentration cover. Targeted model experiments with a coupled ocean-sea ice model show that choice of constant floe diameter in the lateral melt scheme can also impact representation of loose ice. This suggests that current sea ice thermodynamics contribute to the inadequate simulation of the low-concentration regime in many models.

  2. Integrating Visualizations into Modeling NEST Simulations

    PubMed Central

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate different data to investigate them effectively. Since a monolithic application, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and maintain, a software architecture that offers specialized visualization tools which run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into their modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach through common use cases we encountered in our collaborative work. PMID:26733860

  3. A Scoping Review: Conceptualizations and Pedagogical Models of Learning in Nursing Simulation

    ERIC Educational Resources Information Center

    Poikela, Paula; Teräs, Marianne

    2015-01-01

    Simulations have been implemented globally in nursing education for years with diverse conceptual foundations. The aim of this scoping review is to examine the literature regarding the conceptualizations of learning and pedagogical models in nursing simulations. A scoping review of peer-reviewed articles published between 2000 and 2013 was…

  4. Simulating Mass Removal of Groundwater Contaminant Plumes with Complex and Simple Models

    NASA Astrophysics Data System (ADS)

    Lopez, J.; Guo, Z.; Fogg, G. E.

    2016-12-01

    Chlorinated solvents used in industrial, commercial, and other applications continue to pose significant threats to human health through contamination of groundwater resources. A recent National Research Council report concludes that it is unlikely that remediation of these complex sites will be achieved in a time frame of 50-100 years under current methods and standards (NRC, 2013). Pump and treat has been a common strategy at many sites to contain and treat groundwater contamination. At these sites, extensive retention of contaminant mass in low-permeability materials (tailing) has been observed after years or decades of pumping. Although transport models can be built that contain enough of the complex, 3D heterogeneity to simulate the tailing and long cleanup times, this is seldom done because of the large data and computational burdens. Hence, useful, reliable models to simulate various cleanup strategies are rare. The purpose of this study is to explore other potential ways to simulate the mass-removal processes in less time and at lower cost that still produce robust results by capturing the effects of heterogeneity and long-term retention of mass. A site containing a trichloroethylene groundwater plume was selected as the study area. The plume is located within alluvial sediments in the Tucson Basin. A fully heterogeneous domain is generated first and MODFLOW is used to simulate the flow field. Contaminant transport is simulated using both MT3D and RWHet for the fully heterogeneous model. Other approaches, including dual-domain mass transfer and heterogeneous chemical reactions, are employed to simulate the mass removal in a less heterogeneous, or homogeneous, domain, and the results are compared to those obtained from the complex models. The capability of these simpler models to simulate remediation processes, and especially to capture the late-time tailing, is examined.
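
    The dual-domain mass-transfer approach mentioned above represents retention in low-permeability zones with a first-order exchange between a mobile and an immobile concentration. A minimal sketch (equal-volume domains, hypothetical rates, explicit Euler) showing how back-diffusion produces the late-time tailing that pure flushing cannot:

```python
import numpy as np

def dual_domain_flush(c_m, c_im, omega, flush_rate, dt, steps):
    """Flush a mobile zone while an immobile zone back-diffuses:
    dc_m/dt  = -flush_rate * c_m + omega * (c_im - c_m)
    dc_im/dt = -omega * (c_im - c_m)   (first-order mass transfer)."""
    hist = []
    for _ in range(steps):
        exch = omega * (c_im - c_m)
        c_m += dt * (-flush_rate * c_m + exch)
        c_im += dt * (-exch)
        hist.append(c_m)
    return np.array(hist)

conc = dual_domain_flush(c_m=1.0, c_im=1.0, omega=0.05,
                         flush_rate=1.0, dt=0.01, steps=5000)
# Late-time mobile concentration decays at roughly the slow exchange rate
# (omega), orders of magnitude above what flushing alone (exp(-t)) gives.
```

    With omega = 0, the mobile concentration would fall below 1e-20 over this window; the slow exchange term keeps it in the 1e-3 range, which is the tailing behavior the abstract describes.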

  5. A new approach to flow simulation using hybrid models

    NASA Astrophysics Data System (ADS)

    Solgi, Abazar; Zarei, Heidar; Nourani, Vahid; Bahmani, Ramin

    2017-11-01

    The necessity of flow prediction in rivers for proper management of water resources, the need to determine the inflow to dam reservoirs, the design of efficient flood warning systems, and so forth have always led water researchers to seek models with fast response and low error. In recent years, the development of artificial neural networks and wavelet theory, and the use of combined models, has helped researchers estimate river flow with increasing accuracy. In this study, daily and monthly scales were used for simulating the flow of the Gamasiyab River, Nahavand, Iran. The first simulation was done using two types of models, ANN and ANFIS. Then, using wavelet theory to decompose the input signals of the parameters used, sub-signals were obtained and fed into the ANN and ANFIS to obtain the hybrid WANN and WANFIS models. In this study, in addition to the parameters of precipitation and flow, temperature and evaporation were used to analyze their effects on the simulation. The results showed that using the wavelet transform improved the performance of the models on both monthly and daily scales. However, it had a greater effect on the monthly scale, and the WANFIS was the best model.
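
    The decomposition step behind such wavelet hybrids splits each input series into sub-signals before feeding them to the network. A one-level Haar transform is the simplest such decomposition; the flow series below is synthetic, and the study's actual wavelet and decomposition depth are not specified here.

```python
import numpy as np

def haar_dwt(signal):
    """One-level orthonormal Haar transform: returns the approximation
    (low-frequency) and detail (high-frequency) sub-signals."""
    s = np.asarray(signal, dtype=float)
    even, odd = s[0::2], s[1::2]
    approx = (even + odd) / np.sqrt(2.0)
    detail = (even - odd) / np.sqrt(2.0)
    return approx, detail

# Synthetic monthly flow with an annual cycle plus noise; the sub-signals
# would be fed to a regression model (ANN/ANFIS in the paper) as features.
t = np.arange(48)
flow = 10 + 3 * np.sin(2 * np.pi * t / 12) \
          + 0.5 * np.random.default_rng(1).standard_normal(48)
approx, detail = haar_dwt(flow)
features = np.column_stack([approx, detail])
print(features.shape)  # (24, 2): one feature pair per two-month step
```

    Because the Haar pair is orthonormal, the sub-signals preserve the energy of the original series, so no information is lost before the model sees the data.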

  6. THE MARK I BUSINESS SYSTEM SIMULATION MODEL

    DTIC Science & Technology

    of a large-scale business simulation model as a vehicle for doing research in management controls. The major results of the program were the...development of the Mark I business simulation model and the Simulation Package (SIMPAC). SIMPAC is a method and set of programs facilitating the construction...of large simulation models. The object of this document is to describe the Mark I Corporation model, state why parts of the business were modeled as they were, and indicate the research applications of the model. (Author)

  7. A Data Stream Model For Runoff Simulation In A Changing Environment

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Shao, J.; Zhang, H.; Wang, G.

    2017-12-01

    Runoff simulation is of great significance for water engineering design, water disaster control, and water resources planning and management in a catchment or region. A large number of methods, including concept-based process-driven models and statistics-based data-driven models, have been proposed and widely used worldwide during past decades. Most existing models assume that the relationship between runoff and its impacting factors is stationary. However, in a changing environment (e.g., climate change, human disturbance), this relationship usually evolves over time. In this study, we propose a data stream model for runoff simulation in a changing environment. Specifically, the proposed model works in three steps: learning a rule set, expanding rules, and simulation. The first step is to initialize a rule set. When a new observation arrives, the model checks which rule covers it and then uses that rule for simulation. Meanwhile, the Page-Hinckley (PH) change detection test is used to monitor the online simulation error of each rule; if a change is detected, the corresponding rule is removed from the rule set. In the second step, any rule that covers more than a given number of instances is expanded. In the third step, a simulation model for each leaf node is learned with a perceptron without an activation function and is updated as each new observation arrives. Taking the Fuxi River catchment as a case study, we applied the model to simulate monthly runoff in the catchment. Results show that an abrupt change is detected in 1997 by the Page-Hinckley change detection test, which is consistent with the historical record of flooding. In addition, the model achieves good simulation results with an RMSE of 13.326, and outperforms many established methods. The findings demonstrate that the proposed data stream model provides a promising way to simulate runoff in a changing environment.
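
    The Page-Hinckley test the model relies on is straightforward to implement. Below is a minimal one-sided version, detecting an upward shift in the mean of an error stream; the drift and threshold values are hypothetical, not those used in the paper.

```python
class PageHinkley:
    """One-sided Page-Hinckley test for an upward shift in a stream's mean.

    (The spelling "Page-Hinkley" is also common in the literature.)
    """
    def __init__(self, delta=0.005, threshold=5.0):
        self.delta = delta          # tolerated drift per observation
        self.threshold = threshold  # alarm threshold ("lambda")
        self.n = 0
        self.mean = 0.0
        self.cum = 0.0              # cumulative deviation m_t
        self.cum_min = 0.0          # running minimum M_t

    def update(self, x):
        """Feed one observation; return True when a change is signalled."""
        self.n += 1
        self.mean += (x - self.mean) / self.n      # incremental running mean
        self.cum += x - self.mean - self.delta
        self.cum_min = min(self.cum_min, self.cum)
        return self.cum - self.cum_min > self.threshold

detector = PageHinkley(delta=0.01, threshold=3.0)
stream = [0.1] * 100 + [1.5] * 30      # abrupt upward shift at t = 100
alarm_at = next((t for t, x in enumerate(stream) if detector.update(x)), None)
```

In the paper's setting the stream would be each rule's online simulation error, and an alarm triggers removal of the rule.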

  8. Modeling of Army Research Laboratory EMP simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miletta, J.R.; Chase, R.J.; Luu, B.B.

    1993-12-01

    Models are required that permit the estimation of emitted field signatures from EMP simulators to design the simulator antenna structure, to establish the usable test volumes, and to estimate human exposure risk. This paper presents the capabilities and limitations of a variety of EMP simulator models useful to the Army's EMP survivability programs. Comparisons among frequency and time-domain models are provided for two powerful US Army Research Laboratory EMP simulators: AESOP (Army EMP Simulator Operations) and VEMPS II (Vertical EMP Simulator II).

  9. Comparison of simulator fidelity model predictions with in-simulator evaluation data

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Mckissick, B. T.; Ashworth, B. R.

    1983-01-01

    A full factorial in simulator experiment of a single axis, multiloop, compensatory pitch tracking task is described. The experiment was conducted to provide data to validate extensions to an analytic, closed loop model of a real time digital simulation facility. The results of the experiment encompassing various simulation fidelity factors, such as visual delay, digital integration algorithms, computer iteration rates, control loading bandwidths and proprioceptive cues, and g-seat kinesthetic cues, are compared with predictions obtained from the analytic model incorporating an optimal control model of the human pilot. The in-simulator results demonstrate more sensitivity to the g-seat and to the control loader conditions than were predicted by the model. However, the model predictions are generally upheld, although the predicted magnitudes of the states and of the error terms are sometimes off considerably. Of particular concern is the large sensitivity difference for one control loader condition, as well as the model/in-simulator mismatch in the magnitude of the plant states when the other states match.

  10. SEIR model simulation for Hepatitis B

    NASA Astrophysics Data System (ADS)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah

    2017-09-01

    Mathematical modelling and simulation of Hepatitis B are discussed in this paper. The population is divided into four compartments, namely: Susceptible, Exposed, Infected, and Recovered (SEIR). Several factors affect the population in this model: vaccination, immigration, and emigration. The SEIR model yields a non-linear, four-dimensional system of ordinary differential equations (ODEs), which is then reduced to three dimensions. The SEIR model simulation is undertaken to predict the number of Hepatitis B cases. The results of the simulation indicate that the number of Hepatitis B cases will increase and then decrease over several months. Simulation using case counts from Makassar also found a basic reproduction number of less than one, which means that Makassar is not an endemic area for Hepatitis B. With approval from the proceedings editor, article 020185, titled "SEIR model simulation for Hepatitis B," is retracted from the public record, as it is a duplication of article 020198 published in the same volume.

  11. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and many have been ported to microcomputers. Graphics and animation capabilities were added to many of these languages to help users build models and evaluate simulation results. With all these languages and added features, the user is still faced with learning the simulation language. Furthermore, the time to construct and then validate a simulation model is always greater than originally anticipated. One approach to minimizing this time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive, intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the assistant, its elements, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.

  12. Magnetosphere Modeling: From Cartoons to Simulations

    NASA Astrophysics Data System (ADS)

    Gombosi, T. I.

    2017-12-01

    Over the last half century, physics-based global computer simulations have become a bridge between experiment and basic theory, and they now represent the "third pillar" of geospace research. Today, many of our scientific publications utilize large-scale simulations to interpret observations, test new ideas, plan campaigns, or design new instruments. Realistic simulations of the complex Sun-Earth system have been made possible by the dramatically increased power of both computing hardware and numerical algorithms. Early magnetosphere models were based on simple E&M concepts (like the Chapman-Ferraro cavity) and hydrodynamic analogies (the bow shock). At the beginning of the space age, current system models were developed, culminating in the sophisticated Tsyganenko-type description of the magnetic configuration. The first 3D MHD simulations of the magnetosphere were published in the early 1980s. A decade later there were several competing global models that were able to reproduce many fundamental properties of the magnetosphere. The leading models included the impact of the ionosphere by using a height-integrated electric potential description. Dynamic coupling of global and regional models started in the early 2000s by integrating a ring current model and a global magnetosphere model. It has been recognized for quite some time that plasma kinetic effects play an important role. Presently, global hybrid simulations of the dynamic magnetosphere are expected to be possible on exascale supercomputers, while fully kinetic simulations with realistic mass ratios are still decades away. In the 2010s several groups started to experiment with PIC simulations embedded in large-scale 3D MHD models. Presently this integrated MHD-PIC approach is at the forefront of magnetosphere simulations, and this technique is expected to lead to important advances in our understanding of magnetospheric physics. This talk will review the evolution of magnetosphere modeling from cartoons to current systems.

  13. A Generic Multibody Parachute Simulation Model

    NASA Technical Reports Server (NTRS)

    Neuhaus, Jason Richard; Kenney, Patrick Sean

    2006-01-01

    Flight simulation of dynamic atmospheric vehicles with parachute systems is a complex task that is not easily modeled in many simulation frameworks. In the past, the performance of vehicles with parachutes was analyzed by simulations dedicated to parachute operations, which were generally not used for any other portion of the vehicle flight trajectory. This approach required multiple simulation resources to completely analyze the performance of the vehicle. Recently, improved software engineering practices and increased computational power have allowed a single simulation to model the entire flight profile of a vehicle employing a parachute.

  14. SEIR model simulation for Hepatitis B

    NASA Astrophysics Data System (ADS)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah

    2017-09-01

    Mathematical modelling and simulation of Hepatitis B are discussed in this paper. The population is divided into four compartments, namely: Susceptible, Exposed, Infected, and Recovered (SEIR). Several factors affect the population in this model: vaccination, immigration, and emigration. The SEIR model yields a non-linear, four-dimensional system of ordinary differential equations (ODEs), which is then reduced to three dimensions. The SEIR model simulation is undertaken to predict the number of Hepatitis B cases. The results of the simulation indicate that the number of Hepatitis B cases will increase and then decrease over several months. Simulation using case counts from Makassar also found a basic reproduction number of less than one, which means that Makassar is not an endemic area for Hepatitis B.
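
    The SEIR dynamics described can be sketched as a fixed-step integration of the basic compartment ODEs. This is a minimal illustration: the vaccination and migration terms from the abstract are omitted, and all rate parameters are hypothetical.

```python
import numpy as np

def seir(beta=0.4, sigma=0.2, gamma=0.1, days=365, dt=0.1, n=1.0):
    """Forward-Euler integration of the closed SEIR system (no vital dynamics)."""
    s, e, i, r = 0.99 * n, 0.01 * n, 0.0, 0.0
    history = []
    for _ in range(int(days / dt)):
        ds = -beta * s * i / n              # new exposures
        de = beta * s * i / n - sigma * e   # incubation outflow at rate sigma
        di = sigma * e - gamma * i          # recovery at rate gamma
        dr = gamma * i
        s, e, i, r = s + ds * dt, e + de * dt, i + di * dt, r + dr * dt
        history.append((s, e, i, r))
    return np.array(history)

traj = seir()
r0 = 0.4 / 0.1   # basic reproduction number beta / gamma for this parameter set
```

With r0 > 1 the infected compartment rises and then falls, the qualitative behavior the abstract reports; with r0 < 1, as estimated for Makassar, the outbreak dies out without an epidemic peak.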

  15. One hundred years of Arctic ice cover variations as simulated by a one-dimensional, ice-ocean model

    NASA Astrophysics Data System (ADS)

    Hakkinen, S.; Mellor, G. L.

    1990-09-01

    A one-dimensional ice-ocean model consisting of a second moment, turbulent closure, mixed layer model and a three-layer snow-ice model has been applied to the simulation of Arctic ice mass and mixed layer properties. The results for the climatological seasonal cycle are discussed first and include the salt and heat balance in the upper ocean. The coupled model is then applied to the period 1880-1985, using the surface air temperature fluctuations from Hansen et al. (1983) and from Wigley et al. (1981). The analysis of the simulated large variations of the Arctic ice mass during this period (with similar changes in the mixed layer salinity) shows that the variability in the summer melt determines to a high degree the variability in the average ice thickness. The annual oceanic heat flux from the deep ocean and the maximum freezing rate and associated nearly constant minimum surface salinity flux did not vary significantly interannually. This also implies that the oceanic influence on the Arctic ice mass is minimal for the range of atmospheric variability tested.

  16. Climate Simulations based on a different-grid nested and coupled model

    NASA Astrophysics Data System (ADS)

    Li, Dan; Ji, Jinjun; Li, Yinpeng

    2002-05-01

    An atmosphere-vegetation interaction model (AVIM) has been coupled with a nine-layer General Circulation Model (GCM) of the Institute of Atmospheric Physics/State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics (IAP/LASG), which is rhomboidally truncated at zonal wave number 15, to simulate global climatic mean states. AVIM is a model with two-way feedback between land surface processes and eco-physiological processes on land. As the first step toward coupling land with the atmosphere completely, the physiological processes are fixed and only the physical part (generally named the SVAT (soil-vegetation-atmosphere transfer) scheme) of AVIM is nested into the IAP/LASG L9R15 GCM. The ocean part of the GCM is prescribed and its monthly sea surface temperature (SST) is the climatic mean value. Given the low resolution of the GCM, i.e., each grid cell spanning 7.5° of longitude and 4.5° of latitude, the vegetation is given a high resolution of 1.5° by 1.5° to nest and couple the fine grid cells of land with the coarse grid cells of the atmosphere. The coupled model has been integrated for 15 years and the mean of its last ten years of output was chosen for analysis. Compared with observed data and the NCEP reanalysis, the coupled model simulates the main characteristics of the global atmospheric circulation and the fields of temperature and moisture. In particular, the simulated precipitation and surface air temperature are sound. The work creates a solid base for coupling climate models with the biosphere.

  17. Shuttle operations simulation model programmers'/users' manual

    NASA Technical Reports Server (NTRS)

    Porter, D. G.

    1972-01-01

    The prospective user of the shuttle operations simulation (SOS) model is given sufficient information to enable him to perform simulation studies of the space shuttle launch-to-launch operations cycle. The procedures used for modifying the SOS model to meet user requirements are described. The various control card sequences required to execute the SOS model are given. The report is written for users with varying computer simulation experience. A description of the components of the SOS model is included that presents both an explanation of the logic involved in the simulation of the shuttle operations cycle and a description of the routines used to support the actual simulation.

  18. Reliable results from stochastic simulation models

    Treesearch

    Donald L., Jr. Gochenour; Leonard R. Johnson

    1973-01-01

    Development of a computer simulation model is usually done without fully considering how long the model should run (e.g., in computer time) before the results are reliable. However, construction of confidence intervals (CI) about critical output parameters from the simulation model makes it possible to determine the point where model results are reliable. If the results are...

  19. Simulating soil C stability with mechanistic systems models: a multisite comparison of measured fractions and modelled pools

    NASA Astrophysics Data System (ADS)

    Robertson, Andy; Schipanski, Meagan; Sherrod, Lucretia; Ma, Liwang; Ahuja, Lajpat; McNamara, Niall; Smith, Pete; Davies, Christian

    2016-04-01

    Agriculture, covering more than 30% of the global land area, has an exciting opportunity to help combat climate change by effectively managing its soil to promote increased carbon (C) sequestration. Further, soil C newly sequestered through agriculture needs to be stored in more stable forms in order to have a lasting impact on reducing atmospheric CO2 concentrations. While land uses in different climates and soils require different management strategies, the fundamental mechanisms that regulate C sequestration and stabilisation remain the same. These mechanisms are used by a number of different systems models to simulate C dynamics, and thus to assess the impacts of changes in management or climate. To evaluate the accuracy of these model simulations, our research uses a multidirectional approach to compare C stocks of physicochemical soil fractions collected at two long-term agricultural sites. Carbon stocks for a number of soil fractions were measured at two sites (Lincoln, UK; Colorado, USA) over 8 and 12 years, respectively. Both sites represent managed agricultural land but have notably different climates and levels of disturbance. The measured soil fractions act as proxies for varying degrees of stability, with the C contained within these fractions relatable to the C simulated within the soil pools of mechanistic systems models. Using stable isotope techniques at the UK site, specific turnover times of C within the different fractions were determined and compared with those simulated in the pools of three models of varying complexity (RothC, DayCent and RZWQM2). Further, C dynamics and N-mineralisation rates of the measured fractions at the US site were assessed and compared with results from the same three models. The UK site saw a significant increase in C stocks within the most stable fractions, with topsoil (0-30 cm) sequestration rates of just over 0.3 tC ha-1 yr-1 after only 8 years.
Further, the sum of all fractions reported C sequestration rates of nearly 1

  20. Conceptual and numerical models of groundwater flow in the Ogallala aquifer in Gregory and Tripp Counties, South Dakota, water years 1985–2009

    USGS Publications Warehouse

    Davis, Kyle W.; Putnam, Larry D.

    2013-01-01

    The Ogallala aquifer is an important water resource for the Rosebud Sioux Tribe in Gregory and Tripp Counties in south-central South Dakota and is used for irrigation, public supply, domestic, and stock water supplies. To better understand groundwater flow in the Ogallala aquifer, conceptual and numerical models of groundwater flow were developed for the aquifer. A conceptual model of the Ogallala aquifer was used to analyze groundwater flow and develop a numerical model to simulate groundwater flow in the aquifer. The MODFLOW–NWT model was used to simulate transient groundwater conditions for water years 1985–2009. The model was calibrated using statistical parameter estimation techniques. Potential future scenarios were simulated using the input parameters from the calibrated model for simulations of potential future drought and future increased pumping. Transient simulations were completed with the numerical model. A 200-year transient initialization period was used to establish starting conditions for the subsequent 25-year simulation of water years 1985–2009. The 25-year simulation was discretized into three seasonal stress periods per year and used to simulate transient conditions. A single-layer model was used to simulate flow and mass balance in the Ogallala aquifer with a grid of 133 rows and 282 columns and a uniform spacing of 500 meters (1,640 feet). Regional inflow and outflow were simulated along the western and southern boundaries using specified-head cells. All other boundaries were simulated using no-flow cells. Recharge to the aquifer occurs through precipitation on the outcrop area. Model calibration was accomplished using the Parameter Estimation (PEST) program that adjusted individual model input parameters and assessed the difference between estimated and model-simulated values of hydraulic head and base flow. This program was designed to estimate parameter values that are statistically the most likely set of values to result in the
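
    The calibration step described (PEST adjusting model inputs to minimize the misfit between estimated and simulated heads) can be illustrated with a deliberately tiny sketch: a hypothetical steady 1-D Darcy flow model whose single parameter, hydraulic conductivity, is estimated from noisy synthetic heads by least squares. None of this reflects the actual Ogallala model setup.

```python
import numpy as np

# Synthetic "truth": steady 1-D Darcy flow, h(x) = h0 - (q / K) * x,
# observed at 12 hypothetical wells with noisy head measurements.
h0, q, k_true = 100.0, 0.5, 2.5
x = np.linspace(100.0, 2000.0, 12)                # well distances (m)
rng = np.random.default_rng(42)
h_obs = h0 - (q / k_true) * x + rng.normal(0.0, 0.5, x.size)

# Calibration: choose K to minimize the sum of squared head residuals.
# For this linear model the optimum is a regression through the origin
# of drawdown on distance, whose slope equals q / K.
drawdown = h0 - h_obs
slope = (x @ drawdown) / (x @ x)
k_est = q / slope
```

Real codes such as PEST do the same residual-minimization over many parameters of a nonlinear model, using iterative sensitivity-based updates rather than a closed form.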

  1. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
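
    Of the four approaches listed, discrete-event simulation is perhaps the most common for ED patient-flow questions. A minimal sketch, with hypothetical rates and a single server standing in for, say, one triage station, generates exponential arrivals and services and tracks waiting times via the Lindley recurrence:

```python
import random

def mm1_queue(arrival_rate=4.0, service_rate=5.0, horizon=1000.0, seed=1):
    """Event-by-event simulation of a single-server queue (Lindley recurrence)."""
    rng = random.Random(seed)
    clock, server_free_at = 0.0, 0.0
    waits = []
    while True:
        clock += rng.expovariate(arrival_rate)   # next patient arrival time
        if clock > horizon:
            break
        start = max(clock, server_free_at)       # wait whenever the server is busy
        waits.append(start - clock)
        server_free_at = start + rng.expovariate(service_rate)
    return waits

waits = mm1_queue()
mean_wait = sum(waits) / len(waits)
# M/M/1 theory: E[Wq] = rho / (mu - lambda) with rho = lambda / mu = 0.8,
# i.e. about 0.8 time units here; the simulated mean should be of that order.
```

Production studies would add multiple servers, priority classes, and balking, but the event-driven skeleton is the same.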

  2. Year-round simulated methane emissions from a permafrost ecosystem in Northeast Siberia

    NASA Astrophysics Data System (ADS)

    Castro-Morales, Karel; Kleinen, Thomas; Kaiser, Sonja; Zaehle, Sönke; Kittler, Fanny; Kwon, Min Jung; Beer, Christian; Göckede, Mathias

    2018-05-01

    Wetlands of northern high latitudes are ecosystems highly vulnerable to climate change. Degradation effects include soil hydrologic changes due to permafrost thaw, formation of deeper active layers, and rising topsoil temperatures that accelerate the degradation of permafrost carbon and increase CO2 and CH4 emissions. In this work we present 2 years of modeled year-round CH4 emissions into the atmosphere from a Northeast Siberian region in the Russian Far East. We use a revised version of the process-based JSBACH-methane model that includes four CH4 transport pathways: plant-mediated transport, ebullition, and molecular diffusion in the presence or absence of snow. The gas is emitted through wetlands represented by grid-cell inundated areas simulated with a TOPMODEL approach. The magnitude of the summertime modeled CH4 emissions is comparable to ground-based CH4 fluxes measured with the eddy covariance technique and flux chambers in the same area of study, whereas wintertime modeled values are underestimated by one order of magnitude. In the annual balance, the most important mechanism for transport of methane into the atmosphere is through plants (61 %). This is followed by ebullition (~35 %), while summertime molecular diffusion is negligible (0.02 %) compared to diffusion through the snow during winter (~4 %). We investigate the relationship between temporal changes in the CH4 fluxes, soil temperature, and soil moisture content. Our results highlight the heterogeneity of CH4 emissions at the landscape scale and suggest that further improvements to the representation of large-scale hydrological conditions in the model will facilitate a more process-oriented land surface scheme and better simulate CH4 emissions under climate change. This is especially necessary at regional scales in Arctic ecosystems influenced by permafrost thaw.

  3. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    NASA Astrophysics Data System (ADS)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme-events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.

  4. Coupling a three-dimensional subsurface flow model with a land surface model to simulate stream-aquifer-land interactions

    NASA Astrophysics Data System (ADS)

    Huang, M.; Bisht, G.; Zhou, T.; Chen, X.; Dai, H.; Hammond, G. E.; Riley, W. J.; Downs, J.; Liu, Y.; Zachara, J. M.

    2016-12-01

    A fully coupled three-dimensional surface and subsurface land model is developed and applied to a site along the Columbia River to simulate three-way interactions among river water, groundwater, and land surface processes. The model features the coupling of the Community Land Model version 4.5 (CLM4.5) and a massively parallel multi-physics reactive transport model (PFLOTRAN). The coupled model (CLM-PFLOTRAN) is applied to a 400 m × 400 m study domain instrumented with groundwater monitoring wells in the Hanford 300 Area along the Columbia River. CLM-PFLOTRAN simulations are performed at three different spatial resolutions over the period 2011-2015 to evaluate the impact of spatial resolution on simulated variables. To demonstrate the difference in model simulations with and without lateral subsurface flow, a vertical-only CLM-PFLOTRAN simulation is also conducted for comparison. Results show that the coupled model is skillful in simulating stream-aquifer interactions, and that land-surface energy partitioning can be strongly modulated by groundwater-river water interactions in high water years owing to increased soil moisture availability caused by an elevated groundwater table. In addition, spatial resolution does not seem to affect the land surface energy flux simulations, although it is a key factor for accurately estimating the mass exchange rates at the boundaries and the associated biogeochemical reactions in the aquifer. The coupled model developed in this study establishes a solid foundation for understanding the co-evolution of hydrology and biogeochemistry along river corridors under historical and future hydro-climate changes.

  5. Two-year concurrent observation of isoprene at 20 sites over China: comparison with MEGAN-REAM model simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Yang, W.; Zhang, R.; Zhang, Z.; Lyu, S.; Yu, J.; Wang, Y.; Wang, G.; Wang, X.

    2017-12-01

    Isoprene, the most abundant non-methane hydrocarbon emitted from plants, directly and indirectly affects atmospheric photochemistry and radiative forcing, yet narrowing its emission uncertainties is a continuing challenge. Comparison of observed and modelled isoprene on large spatiotemporal scales would help identify the factors that control isoprene variability, but systematic field observation data are quite lacking. Here we collected ambient air samples in 1 L silonite-treated stainless steel canisters simultaneously at 20 sites over China every Wednesday at approximately 14:00 Beijing time from 2012 to 2014, and analyzed isoprene mixing ratios by preconcentrator-GC-MSD/FID. Observed isoprene mixing ratios were also compared with those simulated by coupling MEGAN 2.0 (Guenther et al., 2006) with a 3-D Regional chEmical trAnsport Model (REAM) (Zhang et al., 2017). Similar seasonal variations between observation and model simulation were obtained for most sampling sites, but overall the average isoprene mixing ratio during the growing months (May to October) was 0.37 ± 0.08 ppbv from the model simulation, about 32% lower than the 0.54 ± 0.20 ppbv based on ground-based observation, and this discrepancy was particularly significant in north China during wintertime. Further investigation demonstrated that the emission of biogenic isoprene in northwest China might be underestimated and that non-biogenic emissions, such as biomass/biofuel burning, might contribute to the elevated levels of isoprene during wintertime. Observation-based empirical formulas relating isoprene emission to solar radiation and temperature were also derived for different regions of China.

  6. KU-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Griffin, J. W.

    1980-01-01

    The preparation of a real-time computer simulation model of the Ku-band rendezvous radar, to be integrated into the shuttle mission simulator (SMS), the shuttle engineering simulator (SES), and the shuttle avionics integration laboratory (SAIL) simulator, is described. To meet crew training requirements, a radar tracking performance model and a target modeling method were developed. The parent simulation/radar simulation interface requirements and the method selected to model target scattering properties, including an application of this method to the SPAS spacecraft, are described. The radar search and acquisition mode performance model and the radar track mode signal processor model are examined and analyzed. The angle, angle rate, range, and range rate tracking loops are also discussed.

  7. A Five-Year CMAQ Model Performance for Wildfires and ...

    EPA Pesticide Factsheets

    Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. Two components of the biomass burning inventory, wildfires and prescribed fires, are routinely estimated in the national emissions inventory. However, there is a large amount of uncertainty in the development of these emission inventory sectors. We have completed a 5-year set of CMAQ model simulations (2008-2012) in which we simulated regional air quality with and without the wildfire and prescribed fire inventory. We will examine CMAQ model performance over regions with significant PM2.5 and ozone contributions from prescribed fires and wildfires. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  8. Simulation modeling of route guidance concept

    DOT National Transportation Integrated Search

    1997-01-01

    The methodology of a simulation model developed at the University of New South Wales, Australia, for the evaluation of performance of Dynamic Route Guidance Systems (DRGS) is described. The microscopic simulation model adopts the event update simulat...

  9. High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects

    NASA Technical Reports Server (NTRS)

    Schutt-Aine, Jose E.

    1996-01-01

    The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.
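    For a lossless interconnect, two of the electrical parameters mentioned above follow directly from the characteristic impedance and propagation delay. A minimal sketch of that relationship; the 50-ohm, FR-4-like numbers are illustrative, not values from this project:

    ```python
    import math

    def line_parameters(z0_ohms, delay_s_per_m):
        """Per-unit-length L and C of a lossless transmission line.

        From Z0 = sqrt(L/C) and t_pd = sqrt(L*C) it follows that
        L = Z0 * t_pd and C = t_pd / Z0.
        """
        L = z0_ohms * delay_s_per_m        # H/m
        C = delay_s_per_m / z0_ohms        # F/m
        return L, C

    # Illustrative 50-ohm interconnect with 6.6 ns/m delay (FR-4-like)
    L, C = line_parameters(50.0, 6.6e-9)
    z0_check = math.sqrt(L / C)            # should recover 50 ohms
    ```

    Inverting the two defining relations this way is a common first step before full-wave extraction of frequency-dependent parameters.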

  10. Advanced Space Shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1982-01-01

    A non-recursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed. It provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes, entitled Shuttle Simulation Turbulence Tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 120,000 meters. A description of the turbulence generation procedure is provided. The results of validating the simulated turbulence are described. Conclusions and recommendations are presented. One-dimensional von Karman spectra are tabulated, and the minimum frequency simulated is discussed. The results of spectral and statistical analyses of the SSTT are presented.
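    The non-recursive generation idea can be sketched as spectral synthesis: sum cosines whose amplitudes follow the von Karman spectrum and whose phases are random. This is a generic illustration of the technique, not the SSTT code; the turbulence intensity, scale length, airspeed, and truncation frequency are assumed values:

    ```python
    import math
    import random

    def von_karman_psd(omega, sigma, L, V):
        """One-sided von Karman longitudinal gust PSD, rad/s domain.
        sigma: gust intensity (m/s), L: scale length (m), V: airspeed (m/s)."""
        x = 1.339 * L * omega / V
        return sigma**2 * (2.0 * L / (math.pi * V)) / (1.0 + x * x) ** (5.0 / 6.0)

    def synthesize_gust(times, sigma=1.0, L=300.0, V=100.0, n_modes=512, seed=1):
        """Non-recursive synthesis: a sum of cosines with spectrum-weighted
        amplitudes sqrt(2*S(w)*dw) and uniformly random phases."""
        rng = random.Random(seed)
        w_max = 20.0 * V / L                      # truncation frequency (rad/s)
        dw = w_max / n_modes
        modes = []
        for k in range(n_modes):
            w = (k + 0.5) * dw
            a = math.sqrt(2.0 * von_karman_psd(w, sigma, L, V) * dw)
            modes.append((w, a, rng.uniform(0.0, 2.0 * math.pi)))
        return [sum(a * math.cos(w * t + phi) for w, a, phi in modes)
                for t in times]

    gusts = synthesize_gust([0.01 * i for i in range(2000)])
    ```

    Because the one-sided spectrum integrates to the gust variance, the synthesized series has standard deviation near sigma once the frequency range captures most of the spectral energy.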

  11. Modeling and simulation in biomedicine.

    PubMed Central

    Aarts, J.; Möller, D.; van Wijk van Brievingh, R.

    1991-01-01

    A group of researchers and educators in The Netherlands, Germany and Czechoslovakia have developed and adapted mathematical computer models of phenomena in the field of physiology and biomedicine for use in higher education. The models are graphical and highly interactive, and are all written in TurboPascal or the mathematical simulation language PSI. An educational shell has been developed to launch the models. The shell allows students to interact with the models and teachers to edit the models, to add new models and to monitor the achievements of the students. The models and the shell have been implemented on an MS-DOS personal computer. This paper describes the features of the modeling package and presents the modeling and simulation of the heart muscle as an example. PMID:1807745

  12. Modeling, simulation, and analysis at Sandia National Laboratories for health care systems

    NASA Astrophysics Data System (ADS)

    Polito, Joseph

    1994-12-01

    Modeling, Simulation, and Analysis are special competencies of the Department of Energy (DOE) National Laboratories which have been developed and refined through years of national defense work. Today, many of these skills are being applied to the problem of understanding the performance of medical devices and treatments. At Sandia National Laboratories we are developing models at all three levels of health care delivery: (1) phenomenology models for Observation and Test, (2) model-based outcomes simulations for Diagnosis and Prescription, and (3) model-based design and control simulations for the Administration of Treatment. A sampling of specific applications includes non-invasive sensors for blood glucose, ultrasonic scanning for development of prosthetics, automated breast cancer diagnosis, laser burn debridement, surgical staple deformation, minimally invasive control for administration of a photodynamic drug, and human-friendly decision support aids for computer-aided diagnosis. These and other projects are being performed at Sandia with support from the DOE and in cooperation with medical research centers and private companies. Our objective is to leverage government engineering, modeling, and simulation skills with the biotechnical expertise of the health care community to create a more knowledge-rich environment for decision making and treatment.

  13. The Mt. Hood challenge: cross-testing two diabetes simulation models.

    PubMed

    Brown, J B; Palmer, A J; Bisgaard, P; Chan, W; Pedula, K; Russell, A

    2000-11-01

    Starting from identical patients with type 2 diabetes, we compared the 20-year predictions of two computer simulation models, a 1998 version of the IMIB model and version 2.17 of the Global Diabetes Model (GDM). Primary measures of outcome were 20-year cumulative rates of: survival, first (incident) acute myocardial infarction (AMI), first stroke, proliferative diabetic retinopathy (PDR), macro-albuminuria (gross proteinuria, or GPR), and amputation. Standardized test patients were newly diagnosed males aged 45 or 75, with high and low levels of glycated hemoglobin (HbA(1c)), systolic blood pressure (SBP), and serum lipids. Both models generated realistic results and appropriate responses to changes in risk factors. Compared with the GDM, the IMIB model predicted much higher rates of mortality and AMI, and fewer strokes. These differences can be explained by differences in model architecture (Markov vs. microsimulation), different evidence bases for cardiovascular prediction (Framingham Heart Study cohort vs. Kaiser Permanente patients), and isolated versus interdependent prediction of cardiovascular events. Compared with IMIB, GDM predicted much higher lifetime costs, because of lower mortality and the use of a different costing method. It is feasible to cross-validate and explicate dissimilar diabetes simulation models using standardized patients. The wide differences in the model results that we observed demonstrate the need for cross-validation. We propose to hold a second 'Mt Hood Challenge' in 2001 and invite all diabetes modelers to attend.
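    The Markov cohort architecture attributed to the IMIB model can be illustrated with a deliberately tiny three-state example; the annual transition probabilities below are hypothetical placeholders, not values from IMIB or GDM:

    ```python
    def run_markov_cohort(years=20, p_ami=0.015, p_die_well=0.02,
                          p_die_post_ami=0.06):
        """Annual-cycle Markov cohort model with three states: alive without
        AMI, alive post-AMI, and dead. Probabilities are illustrative only."""
        well, post_ami, dead = 1.0, 0.0, 0.0
        cum_first_ami = 0.0
        for _ in range(years):
            new_ami = well * p_ami                  # incident AMIs this cycle
            well_deaths = well * p_die_well
            post_deaths = post_ami * p_die_post_ami
            well -= new_ami + well_deaths
            post_ami += new_ami - post_deaths
            dead += well_deaths + post_deaths
            cum_first_ami += new_ami
        return {"survival": well + post_ami,
                "cum_ami": cum_first_ami,
                "dead": dead}

    result = run_markov_cohort()
    ```

    Cross-model comparisons like the Mt. Hood exercise effectively run such recursions from standardized starting cohorts and compare the 20-year cumulative outputs.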

  14. Space shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1980-01-01

    The effects of atmospheric turbulence in both horizontal and near horizontal flight, during the return of the space shuttle, are important for determining design, control, and 'pilot-in-the-loop' effects. A nonrecursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed which provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes which are entitled shuttle simulation turbulence tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 10,000 meters. The turbulence generation procedure is described as well as the results of validating the simulated turbulence. Conclusions and recommendations are presented and references cited. The tabulated one-dimensional von Karman spectra and the results of spectral and statistical analyses of the SSTT are contained in the appendix.

  15. Simulated influences of Lake Agassiz on the climate of central North America 11,000 years ago

    USGS Publications Warehouse

    Hostetler, S.W.; Bartlein, P.J.; Clark, P.U.; Small, E.E.; Solomon, A.M.

    2000-01-01

    Eleven thousand years ago, large lakes existed in central and eastern North America along the margin of the Laurentide Ice Sheet. The large-scale North American climate at this time has been simulated with atmospheric general circulation models, but these relatively coarse global models do not resolve potentially important features of the mesoscale circulation that arise from interactions among the atmosphere, ice sheet, and proglacial lakes. Here we present simulations of the climate of central and eastern North America 11,000 years ago with a high-resolution, regional climate model nested within a general circulation model. The simulated climate is in general agreement with that inferred from palaeoecological evidence. Our experiments indicate that through mesoscale atmospheric feedbacks, the annual delivery of moisture to the Laurentide Ice Sheet was diminished at times of a large, cold Lake Agassiz relative to periods of lower lake stands. The resulting changes in the mass balance of the ice sheet may have contributed to fluctuations of the ice margin, thus affecting the routing of fresh water to the North Atlantic Ocean. A retreating ice margin during periods of high lake level may have opened an outlet for discharge of Lake Agassiz into the North Atlantic. A subsequent advance of the ice margin due to greater moisture delivery associated with a low lake level could have dammed the outlet, thereby reducing discharge to the North Atlantic. These variations may have been decisive in causing the Younger Dryas cold event.

  16. Simulation-Optimization Model for Seawater Intrusion Management at Pingtung Coastal Area, Taiwan

    NASA Astrophysics Data System (ADS)

    Huang, P. S.; Chiu, Y.

    2015-12-01

    In the 1970s, agriculture and aquaculture developed rapidly in the Pingtung coastal area of southern Taiwan. The groundwater aquifers were over-pumped, causing seawater intrusion. In order to remediate the contaminated groundwater and find the best strategies for groundwater usage, a management model to search for the optimal groundwater operational strategies is developed in this study. The objective function is to minimize the total amount of injection water, and a set of constraints is applied to ensure the groundwater levels and concentrations are satisfied. A three-dimensional density-dependent flow and transport simulation model, SEAWAT, developed by the U.S. Geological Survey, is selected to simulate the phenomenon of seawater intrusion. The simulation model is calibrated against field measurements and then replaced by a surrogate model of trained artificial neural networks (ANNs) to reduce the computational time. The ANNs are embedded in the management model to link the simulation and optimization models, and the global optimizer of differential evolution (DE) is applied to solve the management model. The optimal results show that the fully trained ANNs can substitute for the original simulation model and greatly reduce computational time. Under an appropriate setting of the objective function and constraints, DE can find the optimal injection rates at predefined barriers. The concentrations at the target locations decrease by more than 50 percent within the planning horizon of 20 years. Keywords: seawater intrusion, groundwater management, numerical model, artificial neural networks, differential evolution
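    The surrogate-plus-optimizer loop described above can be sketched in a few lines: a cheap stand-in function replaces the trained ANNs, a penalty enforces the concentration constraint, and a minimal DE/rand/1/bin search finds the injection rates. Every function, coefficient, and bound here is a hypothetical stand-in, not the study's calibrated model:

    ```python
    import random

    def surrogate_concentration(q1, q2):
        """Stand-in for the trained ANN surrogate: maps injection rates at two
        hypothetical barriers to a salinity value at a target well."""
        return 10.0 - 1.2 * q1 - 0.8 * q2 + 0.02 * q1 * q2

    def cost(x):
        """Total injection plus a heavy penalty when the predicted
        concentration exceeds the (hypothetical) target of 5.0."""
        q1, q2 = x
        excess = max(0.0, surrogate_concentration(q1, q2) - 5.0)
        return q1 + q2 + 1000.0 * excess

    def differential_evolution(cost, bounds, pop_size=20, gens=100,
                               F=0.7, CR=0.9, seed=3):
        """Minimal DE/rand/1/bin with greedy selection and bound clipping."""
        rng = random.Random(seed)
        dim = len(bounds)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        pop[0] = [hi for _, hi in bounds]       # include a known-feasible corner
        fit = [cost(x) for x in pop]
        for _ in range(gens):
            for i in range(pop_size):
                a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
                trial = [min(max(pop[a][d] + F * (pop[b][d] - pop[c][d]),
                                 bounds[d][0]), bounds[d][1])
                         if rng.random() < CR else pop[i][d]
                         for d in range(dim)]
                f = cost(trial)
                if f <= fit[i]:                 # keep the better vector
                    pop[i], fit[i] = trial, f
        best = min(range(pop_size), key=fit.__getitem__)
        return pop[best], fit[best]

    best_rates, best_cost = differential_evolution(cost, [(0.0, 8.0), (0.0, 8.0)])
    ```

    The point of the surrogate is speed: each cost evaluation here is microseconds, whereas a single SEAWAT run can take hours, which is what makes a population-based optimizer like DE affordable.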

  17. Closed loop models for analyzing the effects of simulator characteristics. [digital simulation of human operators

    NASA Technical Reports Server (NTRS)

    Baron, S.; Muralidharan, R.; Kleinman, D. L.

    1978-01-01

    The optimal control model of the human operator is used to develop closed loop models for analyzing the effects of (digital) simulator characteristics on predicted performance and/or workload. Two approaches are considered: the first utilizes a continuous approximation to the discrete simulation in conjunction with the standard optimal control model; the second involves a more exact discrete description of the simulator in a closed loop multirate simulation in which the optimal control model simulates the pilot. Both models predict that simulator characteristics can have significant effects on performance and workload.

  18. A Reduced Form Model for Ozone Based on Two Decades of CMAQ Simulations for the Continental United States

    EPA Science Inventory

    A Reduced Form Model (RFM) is a mathematical relationship between the inputs and outputs of an air quality model, permitting estimation of additional modeling scenarios without costly new regional-scale simulations. A 21-year Community Multiscale Air Quality (CMAQ) simulation for the con...
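    The input-output relationship that defines an RFM can be illustrated at its simplest: fit a response from a handful of full-model runs, then evaluate it for new inputs at negligible cost. The one-predictor linear fit and the emission-scaling numbers below are hypothetical; the actual ozone RFM is far richer:

    ```python
    def fit_linear_rfm(inputs, outputs):
        """Ordinary least squares for a one-predictor reduced form model:
        output ~ b0 + b1 * input (closed-form normal equations)."""
        n = len(inputs)
        mx = sum(inputs) / n
        my = sum(outputs) / n
        b1 = (sum((x - mx) * (y - my) for x, y in zip(inputs, outputs))
              / sum((x - mx) ** 2 for x in inputs))
        b0 = my - b1 * mx
        return b0, b1

    # Hypothetical CMAQ runs: NOx emission scaling factor -> seasonal mean ozone (ppb)
    scalings = [0.5, 0.75, 1.0, 1.25, 1.5]
    ozone = [38.0, 41.5, 45.0, 48.5, 52.0]     # synthetic, exactly linear here
    b0, b1 = fit_linear_rfm(scalings, ozone)
    predicted = b0 + b1 * 0.9                  # new scenario, no CMAQ run needed
    ```

    In practice RFMs use many predictors and nonlinear response surfaces, but the economics are the same: a few expensive simulations buy an inexpensive emulator.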

  19. Strategic Mobility 21: Modeling, Simulation, and Analysis

    DTIC Science & Technology

    2010-04-14

    using AnyLogic, which is a Java-programmed, multi-method simulation modeling tool developed by XJ Technologies. The last section examines the academic... simulation model from an Arena platform to an AnyLogic-based Web Service. MATLAB is useful for small problems with few nodes, but GAMS/CPLEX is better... Transportation Modeling Studio (TM). The SCASN modeling and simulation program was designed to be generic in nature to allow for use by both commercial and

  20. How model and input uncertainty impact maize yield simulations in West Africa

    NASA Astrophysics Data System (ADS)

    Waha, Katharina; Huth, Neil; Carberry, Peter; Wang, Enli

    2015-02-01

    Crop models are common tools for simulating crop yields and crop production in studies on food security and global change. Various uncertainties however exist, not only in the model design and model parameters, but also, and maybe even more importantly, in the soil, climate and management input data. We analyze the performance of the point-scale crop model APSIM and the global-scale crop model LPJmL with different climate and soil conditions under different agricultural management in the low-input maize-growing areas of Burkina Faso, West Africa. We test the models’ response to different levels of input information from little to detailed information on soil, climate (1961-2000) and agricultural management, and compare the models’ ability to represent the observed spatial (between locations) and temporal (between years) variability in crop yields. We found that the resolution of soil, climate and management information influences the simulated crop yields in both models. However, the difference between models is larger than between input datasets, and larger between simulations with different climate and management information than between simulations with different soil information. The observed spatial variability can be represented well by both models even with little information on soils and management, but APSIM simulates a higher variation between single locations than LPJmL. The agreement of simulated and observed temporal variability is lower due to non-climatic factors, e.g. investment in agricultural research and development between 1987 and 1991 in Burkina Faso, which resulted in a doubling of maize yields. The findings of our study highlight the importance of scale and model choice and show that the most detailed input data do not necessarily improve model performance.
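    The distinction between spatial and temporal skill drawn above can be made concrete by computing a correlation coefficient two ways over a location-by-year table (all yield values below are synthetic, not the study's data):

    ```python
    import math

    def pearson_r(a, b):
        """Pearson correlation of two equal-length sequences."""
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        return cov / math.sqrt(sum((x - ma) ** 2 for x in a) *
                               sum((y - mb) ** 2 for y in b))

    # Hypothetical maize yields (t/ha): rows = locations, columns = years
    observed  = [[1.0, 1.3, 0.8], [1.8, 2.0, 1.6], [2.6, 2.9, 2.4]]
    simulated = [[1.1, 1.2, 0.9], [1.7, 2.1, 1.5], [2.5, 3.0, 2.3]]

    def site_means(rows):
        return [sum(r) / len(r) for r in rows]

    # Spatial skill: correlate multi-year means across locations
    r_spatial = pearson_r(site_means(observed), site_means(simulated))
    # Temporal skill: correlate year-to-year values at one location
    r_temporal = pearson_r(observed[0], simulated[0])
    ```

    A model can score high spatially (it ranks good and poor sites correctly) while scoring low temporally, which is exactly the pattern the study reports when non-climatic yield trends intervene.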

  1. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
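    The system-dynamics feedback idea behind SEPS can be sketched with a toy stock-and-flow loop in which schedule pressure raises apparent productivity but also rework; every equation and constant here is illustrative, not taken from SEPS itself:

    ```python
    def simulate_project(total_tasks=1000.0, staff=10.0, nominal_rate=1.0,
                         planned_weeks=80.0, dt=1.0):
        """Toy system-dynamics loop: completed work is the stock, the flow is
        staff * productivity, and schedule pressure feeds back on both
        productivity and rework. Illustrative constants only."""
        done, week = 0.0, 0.0
        history = []
        while done < total_tasks and week < 500:
            remaining = total_tasks - done
            weeks_left = max(planned_weeks - week, 1.0)
            # pressure = required rate / nominal capacity, capped at 2x
            pressure = min((remaining / weeks_left) / (staff * nominal_rate), 2.0)
            productivity = nominal_rate * (0.8 + 0.2 * pressure)  # modest speed-up
            rework_fraction = 0.05 * pressure                     # errors under pressure
            done += staff * productivity * (1.0 - rework_fraction) * dt
            week += dt
            history.append((week, done))
        return week, history

    finish_week, history = simulate_project()
    ```

    Running policy variants (more staff, relaxed deadlines) through such a loop is the kind of tradeoff exploration the abstract describes; here the feedback makes the project overshoot its 80-week plan.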

  2. VHDL simulation with access to transistor models

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  3. Analysis of the Effect of Interior Nudging on Temperature and Precipitation Distributions of Multi-year Regional Climate Simulations

    NASA Astrophysics Data System (ADS)

    Nolte, C. G.; Otte, T. L.; Bowden, J. H.; Otte, M. J.

    2010-12-01

    There is disagreement in the regional climate modeling community as to the appropriateness of the use of interior nudging. Some investigators argue that the regional model should be minimally constrained and allowed to respond to regional-scale forcing, while others have noted that in the absence of interior nudging, significant large-scale discrepancies develop between the regional model solution and the driving coarse-scale fields. These discrepancies lead to reduced confidence in the ability of regional climate models to dynamically downscale global climate model simulations under climate change scenarios, and detract from the usability of the regional simulations for impact assessments. The advantages and limitations of interior nudging schemes for regional climate modeling are investigated in this study. Multi-year simulations using the WRF model driven by reanalysis data over the continental United States at 36km resolution are conducted using spectral nudging, grid point nudging, and for a base case without interior nudging. The means, distributions, and inter-annual variability of temperature and precipitation will be evaluated in comparison to regional analyses.

  4. Integration of logistic regression, Markov chain and cellular automata models to simulate urban expansion

    NASA Astrophysics Data System (ADS)

    Jokar Arsanjani, Jamal; Helbich, Marco; Kainz, Wolfgang; Darvishi Boloorani, Ali

    2013-04-01

    This research analyses the suburban expansion in the metropolitan area of Tehran, Iran. A hybrid model consisting of a logistic regression model, Markov chain (MC), and cellular automata (CA) was designed to improve the performance of the standard logistic regression model. Environmental and socio-economic variables dealing with urban sprawl were operationalised to create a probability surface of spatiotemporal states of built-up land use for the years 2006, 2016, and 2026. For validation, the model was evaluated by means of relative operating characteristic values for different sets of variables. The approach was calibrated for 2006 by cross-comparing actual and simulated land use maps. The achieved outcomes represent a match of 89% between the simulated and actual maps of 2006, which was considered satisfactory to accept the calibration. Thereafter, the calibrated hybrid approach was implemented for forthcoming years. Finally, future land use maps for 2016 and 2026 were predicted by means of this hybrid approach. The simulated maps illustrate a new wave of suburban development in the vicinity of Tehran at the western border of the metropolis during the next decades.
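    One way such a hybrid couples its pieces: the Markov chain fixes how many cells convert per period, the logistic-regression surface scores each cell, and the CA weights that score by built-up neighbours. The tiny grid, probabilities, and scoring rule below are illustrative, not the paper's calibrated model:

    ```python
    def ca_step(built, probability, demand):
        """One CA transition. `built` is a 0/1 grid, `probability` is the
        logistic-regression suitability surface, and `demand` is the number
        of conversions supplied by the Markov chain for this period."""
        rows, cols = len(built), len(built[0])

        def built_neighbours(r, c):
            return sum(built[rr][cc]
                       for rr in range(max(r - 1, 0), min(r + 2, rows))
                       for cc in range(max(c - 1, 0), min(c + 2, cols))
                       if (rr, cc) != (r, c))

        # Score unbuilt cells: suitability boosted by built-up neighbourhood
        candidates = [(probability[r][c] * (1 + built_neighbours(r, c)), r, c)
                      for r in range(rows) for c in range(cols) if not built[r][c]]
        candidates.sort(reverse=True)
        new_grid = [row[:] for row in built]
        for _, r, c in candidates[:demand]:
            new_grid[r][c] = 1
        return new_grid

    built = [[1, 1, 0], [1, 0, 0], [0, 0, 0]]
    prob  = [[0.9, 0.8, 0.4], [0.7, 0.8, 0.3], [0.2, 0.3, 0.1]]
    grown = ca_step(built, prob, demand=2)
    ```

    Iterating the step to a target year, then comparing the resulting map against the observed one, is the essence of the calibration-by-map-comparison described in the abstract.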

  5. Simulation of tropospheric chemistry and aerosols with the climate model EC-Earth

    NASA Astrophysics Data System (ADS)

    van Noije, T. P. C.; Le Sager, P.; Segers, A. J.; van Velthoven, P. F. J.; Krol, M. C.; Hazeleger, W.; Williams, A. G.; Chambers, S. D.

    2014-10-01

    We have integrated the atmospheric chemistry and transport model TM5 into the global climate model EC-Earth version 2.4. We present an overview of the TM5 model and the two-way data exchange between TM5 and the IFS model from the European Centre for Medium-Range Weather Forecasts (ECMWF), the atmospheric general circulation model of EC-Earth. In this paper we evaluate the simulation of tropospheric chemistry and aerosols in a one-way coupled configuration. We have carried out a decadal simulation for present-day conditions and calculated chemical budgets and climatologies of tracer concentrations and aerosol optical depth. For comparison we have also performed offline simulations driven by meteorological fields from ECMWF's ERA-Interim reanalysis and output from the EC-Earth model itself. Compared to the offline simulations, the online-coupled system produces more efficient vertical mixing in the troposphere, which reflects an improvement of the treatment of cumulus convection. The chemistry in the EC-Earth simulations is affected by the fact that the current version of EC-Earth produces a cold bias with too dry air in large parts of the troposphere. Compared to the ERA-Interim driven simulation, the oxidizing capacity in EC-Earth is lower in the tropics and higher in the extratropics. The atmospheric lifetime of methane in EC-Earth is 9.4 years, which is 7% longer than the lifetime obtained with ERA-Interim but remains well within the range reported in the literature. We further evaluate the model by comparing the simulated climatologies of surface radon-222 and carbon monoxide, tropospheric and surface ozone, and aerosol optical depth against observational data. The work presented in this study is the first step in the development of EC-Earth into an Earth system model with fully interactive atmospheric chemistry and aerosols.

  6. Development and analysis of air quality modeling simulations for hazardous air pollutants

    NASA Astrophysics Data System (ADS)

    Luecken, D. J.; Hutzell, W. T.; Gipson, G. L.

    The concentrations of five hazardous air pollutants were simulated using the community multi-scale air quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results are shown for formaldehyde, acetaldehyde, benzene, 1,3-butadiene and acrolein. Photochemical production in the atmosphere is predicted to dominate ambient formaldehyde and acetaldehyde concentrations, and to account for a significant fraction of ambient acrolein concentrations. Spatial and temporal variations are large throughout the domain over the year. Predicted concentrations are compared with observations for formaldehyde, acetaldehyde, benzene and 1,3-butadiene. Although the modeling results indicate an overall slight tendency towards underprediction, they reproduce episodic and seasonal behavior of pollutant concentrations at many monitors with good skill.

  7. Model improvements to simulate charging in SEM

    NASA Astrophysics Data System (ADS)

    Arat, K. T.; Klimpel, T.; Hagen, C. W.

    2018-03-01

    Charging of insulators is a complex phenomenon to simulate since the accuracy of the simulations is very sensitive to the interaction of electrons with matter and electric fields. In this study, we report model improvements for a previously developed Monte-Carlo simulator to more accurately simulate samples that charge. The improvements include both modelling of low energy electron scattering and charging of insulators. The new first-principle scattering models provide a more realistic charge distribution cloud in the material, and a better match between non-charging simulations and experimental results. Improvements on charging models mainly focus on redistribution of the charge carriers in the material with electron-beam-induced conductivity (EBIC) and a breakdown model, leading to a smoother distribution of the charges. Combined with a more accurate tracing of low energy electrons in the electric field, we managed to reproduce the dynamically changing charging contrast due to an induced positive surface potential.

  8. Collaborative modeling: the missing piece of distributed simulation

    NASA Astrophysics Data System (ADS)

    Sarjoughian, Hessam S.; Zeigler, Bernard P.

    1999-06-01

    The Department of Defense's overarching goal of performing distributed simulation by overcoming geographic and time constraints has brought the problem of distributed modeling to the forefront. The High Level Architecture standard is primarily intended for simulation interoperability. However, as indicated, the existence of a distributed modeling infrastructure plays a fundamental and central role in supporting the development of distributed simulations. In this paper, we describe some fundamental distributed modeling concepts and their implications for constructing successful distributed simulations. In addition, we discuss the Collaborative DEVS Modeling environment that has been devised to enable geographically dispersed modelers to collaborate and synthesize modular and hierarchical models. We provide an actual example of the use of Collaborative DEVS Modeler in application to a project involving corporate partners developing an HLA-compliant distributed simulation exercise.
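    The modular, hierarchical DEVS formalism mentioned above builds atomic models from a time-advance function, an output function, and transition functions. A minimal single-model sketch with a trivial root coordinator; the class and method names are generic DEVS vocabulary, not the Collaborative DEVS Modeler API:

    ```python
    class Generator:
        """Minimal DEVS atomic model: emits a job every `period` time units."""
        def __init__(self, period):
            self.period = period
            self.count = 0
        def time_advance(self):
            return self.period           # sigma: time until next internal event
        def output(self):
            return f"job-{self.count}"   # lambda: emitted at the internal event
        def internal_transition(self):
            self.count += 1              # delta_int: state change after output

    def simulate(model, until):
        """Root coordinator for a single atomic model: advance the clock by
        time_advance, collect the output, apply the internal transition."""
        t, events = 0.0, []
        while t + model.time_advance() <= until:
            t += model.time_advance()
            events.append((t, model.output()))
            model.internal_transition()
        return events

    events = simulate(Generator(period=2.0), until=7.0)
    ```

    Coupled DEVS models compose such atoms through ports and couplings, which is what makes the formalism a natural fit for collaboratively assembled, hierarchical model bases.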

  9. TREAT Modeling and Simulation Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark David

    2015-09-01

    This report describes a four-phase strategy for developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are identified as (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation, and (4) quality assurance. The document describes the four phases, the relationship between these research phases, and anticipated needs within each phase.

  10. Modifying a dynamic global vegetation model for simulating large spatial scale land surface water balance

    NASA Astrophysics Data System (ADS)

    Tang, G.; Bartlein, P. J.

    2012-01-01

    Water balance models of simple structure are easier to grasp and more clearly connect cause and effect than models of complex structure. Such models are essential for studying large spatial scale land surface water balance in the context of climate and land cover change, both natural and anthropogenic. This study aims to (i) develop a large spatial scale water balance model by modifying a dynamic global vegetation model (DGVM), and (ii) test the model's performance in simulating actual evapotranspiration (ET), soil moisture and surface runoff for the coterminous United States (US). Toward these ends, we first introduced development of the "LPJ-Hydrology" (LH) model by incorporating satellite-based land covers into the Lund-Potsdam-Jena (LPJ) DGVM instead of dynamically simulating them. We then ran LH using historical (1982-2006) climate data and satellite-based land covers at 2.5 arc-min grid cells. The simulated ET, soil moisture and surface runoff were compared to existing sets of observed or simulated data for the US. The results indicated that LH captures well the variation of monthly actual ET (R2 = 0.61, p < 0.01) in the Everglades of Florida over the years 1996-2001. The modeled monthly soil moisture for Illinois agrees well (R2 = 0.79, p < 0.01) with observations over the years 1984-2001. The modeled monthly stream flow for most of the 12 major rivers in the US is consistent (R2 > 0.46, p < 0.01; Nash-Sutcliffe coefficients > 0.52) with observed values over the years 1982-2006. The modeled spatial patterns of annual ET and surface runoff are in accordance with previously published data. Compared to its predecessor, LH simulates monthly stream flow better in winter and early spring by incorporating the effects of solar radiation on snowmelt. Overall, this study proves the feasibility of incorporating satellite-based land covers into a DGVM for simulating large spatial scale land surface water balance. LH developed in this study should be a useful
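    The Nash-Sutcliffe coefficient used above to judge streamflow skill is straightforward to compute; the monthly flows below are hypothetical:

    ```python
    def nash_sutcliffe(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
        is no better than always predicting the observed mean."""
        mean_obs = sum(observed) / len(observed)
        ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - ss_res / ss_tot

    # Hypothetical monthly streamflow (m^3/s)
    obs = [120.0, 95.0, 80.0, 150.0, 210.0, 170.0]
    sim = [110.0, 100.0, 85.0, 140.0, 200.0, 180.0]
    nse = nash_sutcliffe(obs, sim)
    ```

    Values above roughly 0.5, like the >0.52 reported for LH, are commonly read as satisfactory skill for monthly streamflow.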

  11. An Exercise Health Simulation Method Based on Integrated Human Thermophysiological Model

    PubMed Central

    Chen, Xiaohui; Yu, Liang; Yang, Kaixing

    2017-01-01

    Research on healthy exercise has garnered keen interest over the past few years. It is known that participation in a regular exercise program can help improve various aspects of cardiovascular function and reduce the risk of suffering from illness. However, exercise accidents like dehydration, exertional heatstroke, and even sudden death need to be brought to attention. If these exercise accidents can be analyzed and predicted before they happen, it will be possible to alleviate or avoid disease or mortality. To achieve this objective, an exercise health simulation approach is proposed, in which an integrated human thermophysiological model consisting of a human thermal regulation model and a nonlinear heart rate regulation model is reported. The human thermoregulatory mechanism as well as the heart rate response mechanism during exercise can be simulated. On the basis of the simulated physiological indicators, a fuzzy finite state machine is constructed to obtain the possible health transition sequence and predict the exercise health status. The experiment results show that our integrated exercise thermophysiological model can numerically simulate the thermal and physiological processes of the human body during exercise and that the predicted exercise health transition sequence from the finite state machine can be used in healthcare. PMID:28702074
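    A crisp (non-fuzzy) simplification of the health-state machine can illustrate the transition-sequence idea; the states, thresholds, and samples are invented for illustration and are not clinical guidance or the paper's fuzzy rules:

    ```python
    HEALTH_STATES = ["normal", "strained", "danger"]

    def classify(core_temp_c, heart_rate_bpm):
        """Crisp stand-in for fuzzy classification of physiological indicators."""
        if core_temp_c >= 39.5 or heart_rate_bpm >= 190:
            return "danger"
        if core_temp_c >= 38.5 or heart_rate_bpm >= 170:
            return "strained"
        return "normal"

    def toward(state, target):
        """Single step between adjacent states (no normal<->danger jumps)."""
        i, j = HEALTH_STATES.index(state), HEALTH_STATES.index(target)
        return HEALTH_STATES[i + (1 if j > i else -1)]

    def health_transitions(samples):
        """Walk simulated (temperature, heart-rate) samples through the
        machine, recording every intermediate state."""
        state, sequence = "normal", ["normal"]
        for temp, hr in samples:
            target = classify(temp, hr)
            while state != target:
                state = toward(state, target)
                sequence.append(state)
        return sequence

    seq = health_transitions([(37.2, 120), (38.7, 165), (39.6, 192), (38.0, 150)])
    ```

    The fuzzy version replaces the hard thresholds with membership functions, so a sample can partially activate several states; the transition-sequence output is the same in spirit.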

  12. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.
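    The rehostable-component idea can be sketched as a narrow facade that encapsulates both the vehicle and environment models, so each host application drives one small interface. The class and method names are illustrative, not the LaSRS++ interface:

    ```python
    class PointMassVehicle:
        """Trivial vehicle model: 1-D point mass under gravity (Euler steps)."""
        def __init__(self):
            self.state = {"altitude": 0.0, "vertical_speed": 0.0}
        def propagate(self, dt, env):
            self.state["vertical_speed"] += env["gravity"] * dt
            self.state["altitude"] += self.state["vertical_speed"] * dt

    class SimpleEnvironment:
        """Placeholder environment model: constant gravity, no atmosphere."""
        def sample(self, altitude):
            return {"gravity": -9.81}

    class SimulationComponent:
        """Rehostable facade: host applications see only initialize/step/
        outputs, while the bundled models stay encapsulated behind it."""
        def __init__(self, vehicle, environment):
            self._vehicle = vehicle
            self._environment = environment
        def initialize(self, state):
            self._vehicle.state = dict(state)
        def step(self, dt):
            env = self._environment.sample(self._vehicle.state["altitude"])
            self._vehicle.propagate(dt, env)
        def outputs(self):
            return dict(self._vehicle.state)

    sim = SimulationComponent(PointMassVehicle(), SimpleEnvironment())
    sim.initialize({"altitude": 1000.0, "vertical_speed": 0.0})
    for _ in range(10):
        sim.step(0.1)
    alt = sim.outputs()["altitude"]
    ```

    Because the environment model travels inside the component rather than being re-engineered per host, every application that embeds the facade reproduces identical vehicle behavior, which is the deployment property the paper argues for.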

  13. Evaluating the Credibility of Transport Processes in the Global Modeling Initiative 3D Model Simulations of Ozone Recovery

    NASA Technical Reports Server (NTRS)

    Strahan, Susan E.; Douglass, Anne R.

    2003-01-01

    The Global Modeling Initiative has integrated two 35-year simulations of an ozone recovery scenario with an offline chemistry and transport model using two different meteorological inputs. Physically based diagnostics, derived from satellite and aircraft data sets, are described and then used to evaluate the realism of temperature and transport processes in the simulations. Processes evaluated include barrier formation in the subtropics and polar regions, and extratropical wave-driven transport. Some diagnostics are especially relevant to simulation of lower stratospheric ozone, but most are applicable to any stratospheric simulation. The temperature evaluation, which is relevant to gas phase chemical reactions, showed that both sets of meteorological fields have near climatological values at all latitudes and seasons at 30 hPa and below. Both simulations showed weakness in upper stratospheric wave driving. The simulation using input from a general circulation model (GMI-GCM) showed a very good residual circulation in the tropics and northern hemisphere. The simulation with input from a data assimilation system (GMI-DAS) performed better in the midlatitudes than at high latitudes. Neither simulation forms a realistic barrier at the vortex edge, leading to uncertainty in the fate of ozone-depleted vortex air. Overall, tracer transport in the offline GMI-GCM has greater fidelity throughout the stratosphere than in the GMI-DAS.

  14. Aviation Safety Modeling and Simulation (ASMM) Propulsion Fleet Modeling: A Tool for Semi-Automatic Construction of CORBA-based Applications from Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche

    2003-01-01

    Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective and, through modeling and simulation, to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop a large-scale, detailed simulation for the analysis and design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling (the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity) require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's modeling and simulation work in characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment that lets programmers easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model will also be constructed to evaluate this technology's use over the next two to three years.

  15. A satellite simulator for TRMM PR applied to climate model simulations

    NASA Astrophysics Data System (ADS)

    Spangehl, T.; Schroeder, M.; Bodas-Salcedo, A.; Hollmann, R.; Riley Dellaripa, E. M.; Schumacher, C.

    2017-12-01

    Climate model simulations have to be compared against observation-based datasets in order to assess their skill in representing precipitation characteristics. Here we use a satellite simulator for TRMM PR to evaluate simulations with MPI-ESM (the Earth system model of the Max Planck Institute for Meteorology in Hamburg, Germany) performed within the MiKlip project (https://www.fona-miklip.de/, funded by the Federal Ministry of Education and Research in Germany). While classical evaluation methods focus on geophysical parameters such as precipitation amounts, applying the satellite simulator enables an evaluation in the instrument's parameter space, thereby reducing uncertainties on the reference side. The CFMIP Observation Simulator Package (COSP) provides a framework for applying satellite simulators to climate model simulations. The approach requires the introduction of sub-grid cloud and precipitation variability. Radar reflectivities are obtained by applying Mie theory, with the microphysical assumptions chosen to match the atmosphere component of MPI-ESM (ECHAM6). The results are found to be sensitive to the methods used to distribute the convective precipitation over the sub-grid boxes. Simple parameterization methods are used to introduce sub-grid variability of convective clouds and precipitation. To constrain uncertainties, a comprehensive comparison with sub-grid scale convective precipitation variability deduced from TRMM PR observations is carried out.

  16. Simulating the effects of the southern pine beetle on regional dynamics 60 years into the future

    Treesearch

    Jennifer K. Costanza; Jiri Hulcr; Frank H. Koch; Todd Earnhardt; Alexa J. McKerrow; Rob R. Dunn; Jaime A. Collazo

    2012-01-01

    We developed a spatially explicit model that simulated future southern pine beetle (Dendroctonus frontalis, SPB) dynamics and pine forest management for a real landscape over 60 years to inform regional forest management. The SPB has a considerable effect on forest dynamics in the Southeastern United States, especially in loblolly pine (...

  17. Target modelling for SAR image simulation

    NASA Astrophysics Data System (ADS)

    Willis, Chris J.

    2014-10-01

    This paper examines target models that might be used in simulations of Synthetic Aperture Radar imagery. We examine the basis for scattering phenomena in SAR, and briefly review the Swerling target model set, before considering extensions to this set discussed in the literature. Methods for simulating and extracting parameters for the extended Swerling models are presented. It is shown that in many cases the more elaborate extended Swerling models can be represented, to a high degree of fidelity, by simpler members of the model set. Further, it is shown that it is quite unlikely that these extended models would be selected when fitting models to typical data samples.
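
    The Swerling models mentioned above describe how target radar cross section (RCS) fluctuates between looks: Swerling I/II draws the RCS from an exponential density, while Swerling III/IV uses a chi-squared density with four degrees of freedom. A minimal sampling sketch (function names are ours, not from the paper):

    ```python
    import random
    import statistics

    def swerling1_rcs(mean_rcs, n, seed=0):
        """Swerling I/II: RCS samples follow an exponential density
        (many comparable scatterers, no dominant one)."""
        rng = random.Random(seed)
        return [rng.expovariate(1.0 / mean_rcs) for _ in range(n)]

    def swerling3_rcs(mean_rcs, n, seed=0):
        """Swerling III/IV: chi-squared with four degrees of freedom, i.e. a
        gamma density with shape 2 and scale mean_rcs/2 (one dominant
        scatterer plus many small ones)."""
        rng = random.Random(seed)
        return [rng.gammavariate(2.0, mean_rcs / 2.0) for _ in range(n)]

    s1 = swerling1_rcs(5.0, 200_000)
    s3 = swerling3_rcs(5.0, 200_000)
    # Both share the same mean RCS, but Swerling III fluctuates less deeply,
    # which is why fits to data can struggle to separate the model classes.
    print(round(statistics.mean(s1), 2), round(statistics.mean(s3), 2))
    ```

    The extended Swerling models discussed in the paper generalize these densities further, but as the abstract notes, typical data samples often cannot distinguish them from the simpler members shown here.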

  18. Simulating effects of fire on northern Rocky Mountain landscapes with the ecological process model FIRE-BGC.

    PubMed

    Keane, R E; Ryan, K C; Running, S W

    1996-03-01

    A mechanistic, biogeochemical succession model, FIRE-BGC, was used to investigate the role of fire in long-term landscape dynamics in northern Rocky Mountain coniferous forests of Glacier National Park, Montana, USA. FIRE-BGC is an individual-tree model, created by merging the gap-phase process-based model FIRESUM with the mechanistic ecosystem biogeochemical model FOREST-BGC, that has mixed spatial and temporal resolution in its simulation architecture. Ecological processes that act at a landscape level, such as fire and seed dispersal, are simulated annually from stand and topographic information. Stand-level processes, such as tree establishment, growth and mortality, organic matter accumulation and decomposition, and undergrowth plant dynamics, are simulated both daily and annually. Tree growth is mechanistically modeled based on the ecosystem process approach of FOREST-BGC, where carbon is fixed daily by forest canopy photosynthesis at the stand level. Carbon allocated to the tree stem at the end of the year generates the corresponding diameter and height growth. The model also explicitly simulates fire behavior and effects on landscape characteristics. We simulated the effects of fire on ecosystem characteristics of net primary productivity, evapotranspiration, standing crop biomass, nitrogen cycling and leaf area index over 200 years for the 50,000-ha McDonald Drainage in Glacier National Park. Results show increases in net primary productivity and available nitrogen when fires are included in the simulation. Standing crop biomass and evapotranspiration decrease under a fire regime. Shade-intolerant species dominate the landscape when fires are excluded. Model tree increment predictions compared well with field data.

  19. Use case driven approach to develop simulation model for PCS of APR1400 simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang

    2006-07-01

    The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of the Advanced Power Reactor (APR) 1400. The simulator consists of the process model, control logic model, and MMI for the APR1400 as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions. Lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceed down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper will introduce the functionality of the PCS simulation model, including a requirement analysis based on use cases and the validation results of the PCS model development. The PCS simulation model using use cases will first be used during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. Use case based simulation model development can be useful for the design and implementation of simulation models. (authors)

  20. Inability of CMIP5 Climate Models to Simulate Recent Multi-decadal Climate Change in the Tropical Pacific.

    NASA Astrophysics Data System (ADS)

    Power, S.; Delage, F.; Kociuba, G.; Wang, G.; Smith, I.

    2017-12-01

    Observed 15-year surface temperature trends beginning in 1998 or later have attracted a great deal of interest because of an apparent slowdown in the rate of global warming and contrasts between climate model simulations and observations of such trends. Many studies have addressed the statistical significance of these relatively short trends, whether they indicate a possible bias in models, and the implications for global warming generally. Here we analyse historical and projected changes in 38 CMIP5 climate models. All of the models simulate multi-decadal warming in the Pacific over the past half-century that exceeds observed values. This stark difference cannot be fully explained by observed, internal multi-decadal climate variability, even if allowance is made for an apparent tendency for models to underestimate internal multi-decadal variability in the Pacific. We also show that CMIP5 models are not able to simulate the magnitude of the strengthening of the Walker Circulation over the past thirty years. Some of the reasons for these major shortcomings in the ability of models to simulate multi-decadal variability in the Pacific, and the impact these findings have on our confidence in global 21st century projections, will be discussed.

  1. 50 years of computer simulation of the human thermoregulatory system.

    PubMed

    Hensley, Daniel W; Mark, Andrew E; Abella, Jayvee R; Netscher, George M; Wissler, Eugene H; Diller, Kenneth R

    2013-02-01

    This paper presents an updated and augmented version of the Wissler human thermoregulation model that has been developed continuously over the past 50 years. The existing Fortran code is translated into C with extensive embedded commentary. A graphical user interface (GUI) has been developed in Python to facilitate convenient user designation of input and output variables and formatting of data presentation. Use of the code with the GUI is described and demonstrated. New physiological elements were added to the model to represent the hands and feet, including the unique vascular structures adapted for heat transfer associated with glabrous skin. The heat transfer function and efficacy of glabrous skin is unique within the entire body based on the capacity for a very high rate of blood perfusion and the novel capability for dynamic regulation of blood flow. The model was applied to quantify the absolute and relative contributions of glabrous skin flow to thermoregulation for varying levels of blood perfusion. The model also was used to demonstrate how the unique features of glabrous skin blood flow may be recruited to implement thermal therapeutic procedures. We have developed proprietary methods to manipulate the control of glabrous skin blood flow in conjunction with therapeutic devices and simulated the effect of these methods with the model.

  2. Dynamic Evaluation of Long-Term Air Quality Model Simulations Over the Northeastern U.S.

    EPA Science Inventory

    Dynamic model evaluation assesses a modeling system's ability to reproduce changes in air quality induced by changes in meteorology and/or emissions. In this paper, we illustrate various approaches to dynamic model evaluation utilizing 18 years of air quality simulations perform...

  3. Modeling and Simulation of High Resolution Optical Remote Sensing Satellite Geometric Chain

    NASA Astrophysics Data System (ADS)

    Xia, Z.; Cheng, S.; Huang, Q.; Tian, G.

    2018-04-01

    High resolution satellites with longer focal lengths and larger apertures have been widely used in recent years for georeferencing observed scenes. A consistent end-to-end model of the high resolution remote sensing satellite geometric chain is presented, which consists of the scene, the three-line-array camera, the platform (including attitude and position information), the time system, and the processing algorithm. The integrated design of the camera and the star tracker is considered, and a simulation method for geolocation accuracy is put forward by introducing a new index: the angle between the camera and the star tracker. The model is validated by rigorously simulating geolocation accuracy according to the test method for ZY-3 satellite imagery. The simulation results show that the geolocation accuracy is within 25 m, which is highly consistent with the test results. The geolocation accuracy can be improved by about 7 m through the integrated design. The model, combined with the simulation method, is applicable to estimating geolocation accuracy before satellite launch.

  4. Modelling and simulation of a heat exchanger

    NASA Technical Reports Server (NTRS)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.

    1991-01-01

    Two models for two different control systems are developed for a parallel heat exchanger. First, spatially lumping a heat exchanger model produces a good approximate model of high system order. Model reduction techniques are then applied to obtain low-order models suitable for dynamic analysis and control design. The simulation method is discussed to ensure valid simulation results.
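
    The slow/fast separation that model reduction exploits can be illustrated with a toy two-mode system; the poles and gains below are invented for illustration, not taken from the paper. Truncating the fast mode to its DC gain leaves the step response nearly unchanged on the slow time scale.

    ```python
    import math

    # Full 2-mode model: a slow pole at -1 (gain 1.0) and a fast pole at -50
    # (gain 0.02).  Unit-step response is a sum of first-order rises.
    def full_step_response(t):
        return (1.0 - math.exp(-1.0 * t)) + 0.02 * (1.0 - math.exp(-50.0 * t))

    # Reduced model: the fast mode is truncated to its DC gain, which it
    # reaches "instantly" relative to the slow dynamics.
    def reduced_step_response(t):
        return (1.0 - math.exp(-1.0 * t)) + 0.02

    errs = [abs(full_step_response(t) - reduced_step_response(t))
            for t in (0.5, 1.0, 2.0, 5.0)]
    print(max(errs))  # tiny: the fast transient has already died out
    ```

    Formal techniques (modal or balanced truncation) automate this choice of which states to discard, but the underlying idea is the one shown: keep the dynamics that matter on the control design's time scale.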

  5. Modeling and Simulation of U-tube Steam Generator

    NASA Astrophysics Data System (ADS)

    Zhang, Mingming; Fu, Zhongguang; Li, Jinyao; Wang, Mingfei

    2018-03-01

    This article focuses on modeling and simulation of a U-tube natural circulation steam generator. The research is based on the simuworks system simulation software platform. By analyzing the structural characteristics and operating principle of the U-tube steam generator, a model with 14 control volumes is built, covering the primary side, secondary side, down channel, and steam plenum, among others. The model rests entirely on conservation laws and is applied in a series of simulation tests. The results show that the model properly simulates the dynamic response of a U-tube steam generator.

  6. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.

  7. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    NASA Technical Reports Server (NTRS)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.

  8. Model-free simulations of turbulent reactive flows

    NASA Technical Reports Server (NTRS)

    Givi, Peyman

    1989-01-01

    The current computational methods for solving transport equations of turbulent reacting single-phase flows are critically reviewed, with primary attention given to those methods that lead to model-free simulations. In particular, consideration is given to direct numerical simulations using spectral (Galerkin) and pseudospectral (collocation) methods, spectral element methods, and Lagrangian methods. The discussion also covers large eddy simulations and turbulence modeling.

  9. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
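
    The Monte Carlo approach described above can be illustrated with a minimal single-server wait-time model of the kind used for patient flow studies; the arrival and service parameters below are invented for illustration.

    ```python
    import random

    def simulate_clinic(n_patients, mean_interarrival, mean_service, seed=1):
        """Single-server Monte Carlo queue: draw arrival gaps and service
        times from exponential distributions and accumulate each patient's
        wait for the server."""
        rng = random.Random(seed)
        clock = 0.0           # current arrival time
        server_free_at = 0.0  # when the server finishes the previous patient
        waits = []
        for _ in range(n_patients):
            clock += rng.expovariate(1.0 / mean_interarrival)
            start = max(clock, server_free_at)   # wait if the server is busy
            waits.append(start - clock)
            server_free_at = start + rng.expovariate(1.0 / mean_service)
        return sum(waits) / len(waits)

    # Average wait grows sharply as mean service time approaches the mean
    # arrival gap -- the classic congestion effect staffing studies probe.
    w_light = simulate_clinic(50_000, mean_interarrival=10.0, mean_service=5.0)
    w_heavy = simulate_clinic(50_000, mean_interarrival=10.0, mean_service=9.0)
    print(w_light, w_heavy)
    ```

    Commercial discrete event packages wrap exactly this kind of sampling loop behind graphical process maps; the probability distributions are the part the analyst must estimate from facility data.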

  10. Present-day Circum-Antarctic Simulations using the POPSICLES Coupled Ice Sheet-Ocean Model

    NASA Astrophysics Data System (ADS)

    Asay-Davis, X.; Martin, D. F.; Price, S. F.; Maltrud, M. E.; Collins, W.

    2014-12-01

    We present POPSICLES simulation results covering the full Antarctic Ice Sheet and the Southern Ocean spanning the period 1990 to 2010. Simulations are performed at 0.1o (~5 km) ocean resolution and with adaptive ice-sheet model resolution as fine as 500 m. We compare time-averaged melt rates below a number of major ice shelves with those reported by Rignot et al. (2013) as well as other recent studies. We also present seasonal variability and decadal trends in submarine melting from several Antarctic regions. Finally, we explore the influence on basal melting and system dynamics resulting from two different choices of climate forcing: a "normal-year" climatology and the CORE v. 2 forcing data (Large and Yeager 2008).POPSICLES couples the POP2x ocean model, a modified version of the Parallel Ocean Program (Smith and Gent, 2002), and the BISICLES ice-sheet model (Cornford et al., 2012). POP2x includes sub-ice-shelf circulation using partial top cells (Losch, 2008) and boundary layer physics following Holland and Jenkins (1999), Jenkins (2001), and Jenkins et al. (2010). Standalone POP2x output compares well with standard ice-ocean test cases (e.g., ISOMIP; Losch, 2008) and other continental-scale simulations and melt-rate observations (Kimura et al., 2013; Rignot et al., 2013). BISICLES makes use of adaptive mesh refinement and a 1st-order accurate momentum balance similar to the L1L2 model of Schoof and Hindmarsh (2009) to accurately model regions of dynamic complexity, such as ice streams, outlet glaciers, and grounding lines. Results of BISICLES simulations have compared favorably to comparable simulations with a Stokes momentum balance in both idealized tests (MISMIP-3D; Pattyn et al., 2013) and realistic configurations (Favier et al. 2014).A companion presentation, "Response of the Antarctic Ice Sheet to ocean forcing using the POPSICLES coupled ice sheet-ocean model" in session C024 covers the ice-sheet response to these melt rates in the coupled simulation

  11. Multicriteria evaluation of discharge simulation in Dynamic Global Vegetation Models

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Piao, Shilong; Zeng, Zhenzhong; Ciais, Philippe; Yin, Yi; Friedlingstein, Pierre; Sitch, Stephen; Ahlström, Anders; Guimberteau, Matthieu; Huntingford, Chris; Levis, Sam; Levy, Peter E.; Huang, Mengtian; Li, Yue; Li, Xiran; Lomas, Mark R.; Peylin, Philippe; Poulter, Ben; Viovy, Nicolas; Zaehle, Soenke; Zeng, Ning; Zhao, Fang; Wang, Lei

    2015-08-01

    In this study, we assessed the performance of discharge simulations by coupling the runoff from seven Dynamic Global Vegetation Models (DGVMs; LPJ, ORCHIDEE, Sheffield-DGVM, TRIFFID, LPJ-GUESS, CLM4CN, and OCN) to one river routing model for 16 large river basins. The results show that the seasonal cycle of river discharge is generally modeled well in the low and middle latitudes but not in the high latitudes, where the peak discharge (due to snow and ice melting) is underestimated. The annual mean discharge is underestimated by the DGVMs chained with the routing model. Furthermore, the 30-year trend of discharge is also underestimated. For the interannual variability of discharge, a skill score based on the overlap of probability density functions (PDFs) suggests that most models correctly reproduce the observed variability (correlation coefficient higher than 0.5; i.e., models account for 50% of observed interannual variability) except for the Lena, Yenisei, Yukon, and Congo river basins. In addition, we compared the simulated runoff from different simulations where models were forced with either fixed or varying land use. This suggests that both seasonal and annual mean runoff have been little affected by land use change but that the trend of runoff is sensitive to land use change. No single model, considered individually, shows significantly better performance than the others across all basins. This suggests that based on current modeling capability, a regionally weighted average of multimodel ensemble projections might be appropriate to reduce the bias in future projections of global river discharge.
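
    A PDF-overlap skill score of the kind mentioned above is commonly computed by binning both series into relative frequencies and summing the bin-wise minima (a Perkins-style score: 1.0 for identical distributions, near 0 for disjoint ones). The implementation below is our sketch of that construction, not the study's code.

    ```python
    from collections import Counter

    def pdf_overlap_score(model, obs, bin_width):
        """Sum over bins of min(model frequency, obs frequency)."""
        def freqs(xs):
            counts = Counter(int(x // bin_width) for x in xs)
            return {b: c / len(xs) for b, c in counts.items()}
        fm, fo = freqs(model), freqs(obs)
        return sum(min(fm.get(b, 0.0), fo.get(b, 0.0)) for b in set(fm) | set(fo))

    obs  = [1.0, 1.4, 2.1, 2.2, 3.0, 3.3]
    good = [1.1, 1.5, 2.0, 2.4, 2.9, 3.4]   # similar spread -> high score
    bad  = [6.0, 6.5, 7.0, 7.5, 8.0, 8.5]   # disjoint support -> score 0
    print(pdf_overlap_score(good, obs, 1.0), pdf_overlap_score(bad, obs, 1.0))
    ```

    The score deliberately ignores the time ordering of the series, so it measures whether the simulated variability has the right distribution, not whether individual years line up.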

  12. A survey on hair modeling: styling, simulation, and rendering.

    PubMed

    Ward, Kelly; Bertails, Florence; Kim, Tae-Yong; Marschner, Stephen R; Cani, Marie-Paule; Lin, Ming C

    2007-01-01

    Realistic hair modeling is a fundamental part of creating virtual humans in computer graphics. This paper surveys the state of the art in the major topics of hair modeling: hairstyling, hair simulation, and hair rendering. Because of the difficult, often unsolved problems that arise in all these areas, a broad diversity of approaches are used, each with strengths that make it appropriate for particular applications. We discuss each of these major topics in turn, presenting the unique challenges facing each area and describing solutions that have been presented over the years to handle these complex issues. Finally, we outline some of the remaining computational challenges in hair modeling.

  13. ENSO Simulation in Coupled Ocean-Atmosphere Models: Are the Current Models Better?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AchutaRao, K; Sperber, K R

    Maintaining a multi-model database over a generation or more of model development provides an important framework for assessing model improvement. Using control integrations, we compare the simulation of the El Nino/Southern Oscillation (ENSO), and its extratropical impact, in models developed for the 2007 Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report with models developed in the late 1990s (the so-called Coupled Model Intercomparison Project-2 [CMIP2] models). The IPCC models tend to be more realistic in representing the frequency with which ENSO occurs, and they are better at locating enhanced temperature variability over the eastern Pacific Ocean. When compared with reanalyses, the IPCC models have larger pattern correlations of tropical surface air temperature than do the CMIP2 models during the boreal winter peak phase of El Nino. However, for sea-level pressure and precipitation rate anomalies, a clear separation in performance between the two vintages of models is not as apparent. The strongest improvement occurs for the modeling groups whose CMIP2 model tended to have the lowest pattern correlations with observations. This has been checked by subsampling the multi-century IPCC simulations in a manner consistent with the single 80-year time segment available from CMIP2. Our results suggest that multi-century integrations may be required to statistically assess model improvement of ENSO. The quality of the El Nino precipitation composite is directly related to the fidelity of the boreal winter precipitation climatology, highlighting the importance of reducing systematic model error. Over North America, distinct improvement of El Nino forced boreal winter surface air temperature, sea-level pressure, and precipitation rate anomalies occurs in the IPCC models. This improvement is directly proportional to the skill of the tropical El Nino forced precipitation anomalies.

  14. Fire dynamics during the 20th century simulated by the Community Land Model

    NASA Astrophysics Data System (ADS)

    Kloster, S.; Mahowald, N. M.; Randerson, J. T.; Thornton, P. E.; Hoffman, F. M.; Levis, S.; Lawrence, P. J.; Feddema, J. J.; Oleson, K. W.; Lawrence, D. M.

    2010-01-01

    Fire is an integral Earth System process that interacts with climate in multiple ways. Here we assessed the parametrization of fires in the Community Land Model (CLM-CN) and improved the ability of the model to reproduce contemporary global patterns of burned areas and fire emissions. In addition to wildfires, we extended CLM-CN to account for fires related to deforestation. We compared contemporary fire carbon emissions predicted by the model to satellite-based estimates in terms of magnitude and spatial extent as well as interannual and seasonal variability. Long-term trends during the 20th century were compared with historical estimates. Overall, we found the best agreement between simulation and observations for the fire parametrization based on the work by Arora and Boer (2005). We obtained substantial improvement when we explicitly considered human-caused ignition and fire suppression as a function of population density. Simulated fire carbon emissions ranged between 2.0 and 2.4 Pg C/year for the period 1997-2004. Regionally, the simulations had a low bias over Africa and a high bias over South America when compared to satellite-based products. The net terrestrial carbon source due to land use change for the 1990s was 1.2 Pg C/year with 11% stemming from deforestation fires. During 2000-2004 this flux decreased to 0.85 Pg C/year with a similar relative contribution from deforestation fires. Between 1900 and 1960 we simulated a slight downward trend in global fire emissions, which is explained by reduced fuels as a consequence of wood harvesting and partly by increasing fire suppression. The model predicted an upward trend in the last three decades of the 20th century caused by climate variations and large burning events associated with ENSO-induced drought conditions.
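
    The population-density dependence of ignition and suppression described above can be sketched with simple saturation curves. The functional forms and constants below are illustrative assumptions only, not the actual CLM-CN or Arora and Boer (2005) parametrization.

    ```python
    import math

    def human_ignition(pop_density, k=0.3):
        """Probability that humans supply an ignition in a grid cell,
        saturating toward 1 as population density rises (illustrative)."""
        return 1.0 - math.exp(-k * pop_density)

    def fraction_not_suppressed(pop_density, s=0.05, floor=0.1):
        """Fraction of ignitions escaping suppression: decays toward a
        floor as population density rises (illustrative)."""
        return floor + (1.0 - floor) * math.exp(-s * pop_density)

    def effective_ignition(natural_rate, pop_density):
        """Net ignition rate: natural + human sources, scaled by
        population-dependent suppression."""
        return (natural_rate + human_ignition(pop_density)) * \
               fraction_not_suppressed(pop_density)

    # Sparse cells gain ignitions from people; dense cells lose fires to
    # suppression despite abundant ignition sources, peaking in between.
    for rho in (0.0, 1.0, 10.0, 100.0):
        print(rho, round(effective_ignition(0.05, rho), 3))
    ```

    The qualitative shape, fire activity peaking at intermediate population density, is the behavior such parametrizations aim to capture.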

  15. Simulation Model of A Ferroelectric Field Effect Transistor

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Ho, Fat Duen; Russell, Larry W. (Technical Monitor)

    2002-01-01

    An electronic simulation model has been developed of a ferroelectric field effect transistor (FFET). This model can be used in standard electrical circuit simulation programs to simulate the main characteristics of the FFET. The model uses a previously developed algorithm that incorporates partial polarization as a basis for the design. The model captures the main characteristics of the FFET: the drain current hysteresis with different gate voltages and the decay of the drain current when the gate voltage is off. The drain current has values matching those measured experimentally on actual FFETs. The input and output resistance in the model is similar to that of the FFET. The model is valid for all frequencies below RF levels. A variety of different ferroelectric material characteristics can be modeled. The model can be used to design FFET circuits, such as non-volatile memory and logic circuits, with standard electrical simulation packages and is compatible with all SPICE-based circuit analysis programs. The model is a drop-in library that integrates seamlessly into a SPICE simulation. A comparison is made between the model and experimental data measured from an actual FFET.

  16. Simulation of large-scale rule-based models

    PubMed Central

    Colvin, Joshua; Monine, Michael I.; Faeder, James R.; Hlavacek, William S.; Von Hoff, Daniel D.; Posner, Richard G.

    2009-01-01

    Motivation: Interactions of molecules, such as signaling proteins, with multiple binding sites and/or multiple sites of post-translational covalent modification can be modeled using reaction rules. Rules comprehensively, but implicitly, define the individual chemical species and reactions that molecular interactions can potentially generate. Although rules can be automatically processed to define a biochemical reaction network, the network implied by a set of rules is often too large to generate completely or to simulate using conventional procedures. To address this problem, we present DYNSTOC, a general-purpose tool for simulating rule-based models. Results: DYNSTOC implements a null-event algorithm for simulating chemical reactions in a homogeneous reaction compartment. The simulation method does not require that a reaction network be specified explicitly in advance, but rather takes advantage of the availability of the reaction rules in a rule-based specification of a network to determine if a randomly selected set of molecular components participates in a reaction during a time step. DYNSTOC reads reaction rules written in the BioNetGen language which is useful for modeling protein–protein interactions involved in signal transduction. The method of DYNSTOC is closely related to that of StochSim. DYNSTOC differs from StochSim by allowing for model specification in terms of BNGL, which extends the range of protein complexes that can be considered in a model. DYNSTOC enables the simulation of rule-based models that cannot be simulated by conventional methods. We demonstrate the ability of DYNSTOC to simulate models accounting for multisite phosphorylation and multivalent binding processes that are characterized by large numbers of reactions. Availability: DYNSTOC is free for non-commercial use. The C source code, supporting documentation and example input files are available at http://public.tgen.org/dynstoc/. Contact: dynstoc@tgen.org
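
    A null-event algorithm advances time in fixed steps and accepts or rejects a candidate reaction against its propensity; steps where the draw fails are "null events" in which nothing but the clock changes. The toy dimerization below is our sketch of that idea, not DYNSTOC itself.

    ```python
    import random

    def null_event_dimerization(n_a0, k, t_end, seed=0):
        """Null-event stochastic simulation of A + A -> B (illustrative).
        Each fixed time step, the reaction fires with probability
        propensity * dt; otherwise the step is a null event."""
        rng = random.Random(seed)
        max_prop = k * n_a0 * (n_a0 - 1) / 2.0  # largest possible propensity
        dt = 0.1 / max_prop                     # keeps acceptance prob <= 0.1
        t, n_a, n_b = 0.0, n_a0, 0
        while t < t_end and n_a >= 2:
            prop = k * n_a * (n_a - 1) / 2.0    # current reaction propensity
            if rng.random() < prop * dt:        # reaction fires
                n_a -= 2
                n_b += 1
            # else: null event -- no state change, only time advances
            t += dt
        return n_a, n_b

    n_a, n_b = null_event_dimerization(100, 0.01, 10.0)
    print(n_a, n_b)  # mass conservation: n_a + 2*n_b == 100
    ```

    The appeal for rule-based models is visible even here: the update only evaluates the propensity of the currently selected event, so the full reaction network never has to be enumerated in advance.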

  17. Modeling and Simulation with INS.

    ERIC Educational Resources Information Center

    Roberts, Stephen D.; And Others

    INS, the Integrated Network Simulation language, puts simulation modeling into a network framework and automatically performs such programming activities as placing the problem into a next event structure, coding events, collecting statistics, monitoring status, and formatting reports. To do this, INS provides a set of symbols (nodes and branches)…

  18. Simulation as a vehicle for enhancing collaborative practice models.

    PubMed

    Jeffries, Pamela R; McNelis, Angela M; Wheeler, Corinne A

    2008-12-01

    Clinical simulation used in a collaborative practice approach is a powerful tool to prepare health care providers for shared responsibility for patient care. Clinical simulations are being used increasingly in professional curricula to prepare providers for quality practice. Little is known, however, about how these simulations can be used to foster collaborative practice across disciplines. This article provides an overview of what simulation is, what collaborative practice models are, and how to set up a model using simulations. An example of a collaborative practice model is presented, and nursing implications of using a collaborative practice model in simulations are discussed.

  19. Charge transfer in model peptides: obtaining Marcus parameters from molecular simulation.

    PubMed

    Heck, Alexander; Woiczikowski, P Benjamin; Kubař, Tomáš; Giese, Bernd; Elstner, Marcus; Steinbrecher, Thomas B

    2012-02-23

    Charge transfer within and between biomolecules remains a highly active field of biophysics. Due to the complexities of real systems, model compounds are a useful alternative to study the mechanistic fundamentals of charge transfer. In recent years, such model experiments have been underpinned by molecular simulation methods as well. In this work, we study electron hole transfer in helical model peptides by means of molecular dynamics simulations. A theoretical framework to extract Marcus parameters of charge transfer from simulations is presented. We find that the peptides form stable helical structures with sequence dependent small deviations from ideal PPII helices. We identify direct exposure of charged side chains to solvent as a cause of high reorganization energies, significantly larger than typical for electron transfer in proteins. This, together with small direct couplings, makes long-range superexchange electron transport in this system very slow. In good agreement with experiment, direct transfer between the terminal amino acid side chains can be discounted in favor of a two-step hopping process if appropriate bridging groups exist. © 2012 American Chemical Society

  20. A Dynamic Bayesian Network model for long-term simulation of clinical complications in type 1 diabetes.

    PubMed

    Marini, Simone; Trifoglio, Emanuele; Barbarini, Nicola; Sambo, Francesco; Di Camillo, Barbara; Malovini, Alberto; Manfrini, Marco; Cobelli, Claudio; Bellazzi, Riccardo

    2015-10-01

    The increasing prevalence of diabetes and its related complications is raising the need for effective methods to predict patient evolution and for stratifying cohorts in terms of risk of developing diabetes-related complications. In this paper, we present a novel approach to the simulation of a type 1 diabetes population, based on Dynamic Bayesian Networks, which combines literature knowledge with data mining of a rich longitudinal cohort of type 1 diabetes patients, the DCCT/EDIC study. In particular, in our approach we simulate the patient health state and complications through discretized variables. Two types of models are presented, one entirely learned from the data and the other partially driven by literature-derived knowledge. The whole cohort is simulated for fifteen years, and the simulation error (i.e., for each variable, the percentage of patients predicted in the wrong state) is calculated every year on independent test data. For each variable, the fraction of the population predicted in the wrong state stays below 10% for both models over time. Furthermore, the distributions of real vs. simulated patients greatly overlap. Thus, the proposed models are viable tools to support decision making in type 1 diabetes. Copyright © 2015 Elsevier Inc. All rights reserved.
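    The simulation error defined in this record is simple to state precisely: for one variable at one time point, it is the fraction of patients whose discretized state is predicted incorrectly. A minimal sketch (the state encodings are hypothetical):

```python
def simulation_error(real, simulated):
    """Fraction of patients predicted in the wrong discretized state,
    matching the per-variable, per-year error metric described above."""
    assert len(real) == len(simulated)
    wrong = sum(r != s for r, s in zip(real, simulated))
    return wrong / len(real)
```

    Applied per variable and per simulated year, values below 0.10 correspond to the "below 10%" threshold the abstract reports.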

  1. Impact of soil moisture on regional spectral model simulations for South America

    Treesearch

    Shyh-Chin Chen; John Roads

    2005-01-01

    A regional simulation using the regional spectral model (RSM) with 50-km grid spacing over South America is described. NCEP/NCAR reanalyses (T62 spectral resolution, 28 vertical levels) were used to initialize and force the regional model for a two-year period from March 1997 through March 1999. Initially, the RSM had a severe drying trend in the soil moisture...

  2. Integrated surface/subsurface permafrost thermal hydrology: Model formulation and proof-of-concept simulations

    DOE PAGES

    Painter, Scott L.; Coon, Ethan T.; Atchley, Adam L.; ...

    2016-08-11

    The need to understand potential climate impacts and feedbacks in Arctic regions has prompted recent interest in modeling of permafrost dynamics in a warming climate. A new fine-scale integrated surface/subsurface thermal hydrology modeling capability is described and demonstrated in proof-of-concept simulations. The new modeling capability combines a surface energy balance model with recently developed three-dimensional subsurface thermal hydrology models and new models for nonisothermal surface water flows and snow distribution in the microtopography. Surface water flows are modeled using the diffusion wave equation extended to include energy transport and phase change of ponded water. Variation of snow depth in the microtopography, physically the result of wind scour, is also modeled heuristically with a diffusion wave equation. The multiple surface and subsurface processes are implemented by leveraging highly parallel community software. Fully integrated thermal hydrology simulations on the tilted open book catchment, an important test case for integrated surface/subsurface flow modeling, are presented. Fine-scale 100-year projections of the integrated permafrost thermal hydrological system on an ice wedge polygon at Barrow, Alaska in a warming climate are also presented. Finally, these simulations demonstrate the feasibility of microtopography-resolving, process-rich simulations as a tool to help understand possible future evolution of the carbon-rich Arctic tundra in a warming climate.
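    The heuristic diffusion treatment of snow redistribution mentioned above can be illustrated with a single explicit finite-difference step of a 1-D diffusion equation: depth is smoothed toward its neighbors, mimicking wind scour moving snow off high spots. The discretization, fixed boundaries, and parameter names are illustrative only, not the paper's actual numerics.

```python
def diffuse_snow(depth, kappa, dt, dx):
    """One explicit time step of 1-D diffusion applied to a snow-depth
    profile, as a toy stand-in for the heuristic diffusion-wave smoothing
    of snow across the microtopography. Endpoints are held fixed."""
    new = depth[:]
    for i in range(1, len(depth) - 1):
        new[i] = depth[i] + kappa * dt / dx**2 * (
            depth[i - 1] - 2 * depth[i] + depth[i + 1])
    return new
```

    A flat profile is unchanged, while an isolated drift spreads into its neighbors, which is the qualitative behavior the heuristic is meant to capture (stability requires kappa*dt/dx² ≤ 0.5 for this explicit scheme).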

  3. Modifying a dynamic global vegetation model for simulating large spatial scale land surface water balances

    NASA Astrophysics Data System (ADS)

    Tang, G.; Bartlein, P. J.

    2012-08-01

    Satellite-based data, such as vegetation type and fractional vegetation cover, are widely used in hydrologic models to prescribe the vegetation state in a study region. Dynamic global vegetation models (DGVMs) also simulate land surface hydrology. Incorporation of satellite-based data into a DGVM may enhance a model's ability to simulate land surface hydrology by reducing the task of model parameterization and providing distributed information on land characteristics. The objectives of this study are to (i) modify a DGVM for simulating land surface water balances; (ii) evaluate the modified model in simulating actual evapotranspiration (ET), soil moisture, and surface runoff at regional or watershed scales; and (iii) gain insight into the ability of both the original and modified model to simulate large spatial scale land surface hydrology. To achieve these objectives, we introduce the "LPJ-hydrology" (LH) model, which incorporates satellite-based data into the Lund-Potsdam-Jena (LPJ) DGVM. To evaluate the model we ran LH using historical (1981-2006) climate data and satellite-based land covers at 2.5 arc-min grid cells for the conterminous US and for the entire world using coarser climate and land cover data. We evaluated the simulated ET, soil moisture, and surface runoff using a set of observed or simulated data at different spatial scales. Our results demonstrate that spatial patterns of LH-simulated annual ET and surface runoff are in accordance with previously published data for the US; LH-modeled monthly stream flow for 12 major rivers in the US was consistent with observed values during the years 1981-2006 (R2 > 0.46, p < 0.01; Nash-Sutcliffe Coefficient > 0.52). The modeled mean annual discharges for 10 major rivers worldwide also agreed well (differences < 15%) with observed values for these rivers.
Compared to a degree-day method for snowmelt computation, the addition of the solar radiation effect on snowmelt enabled LH to better simulate monthly

  4. Simulation of river stage using artificial neural network and MIKE 11 hydrodynamic model

    NASA Astrophysics Data System (ADS)

    Panda, Rabindra K.; Pramanik, Niranjan; Bala, Biplab

    2010-06-01

    Simulation of water levels at different sections of a river using physically based flood routing models is quite cumbersome, because it requires many types of data such as hydrologic time series, river geometry, hydraulics of existing control structures, and channel roughness coefficients. Normally, in developing countries like India it is not easy to collect these data because of poor monitoring and record keeping. Therefore, an artificial neural network (ANN) technique is used as an effective alternative in hydrologic simulation studies. The present study aims at comparing the performance of the ANN technique with a widely used physically based hydrodynamic model in the MIKE 11 environment. The MIKE 11 hydrodynamic model was calibrated and validated for the monsoon periods (June-September) of the years 2006 and 2001, respectively. A feed-forward neural network architecture with the Levenberg-Marquardt (LM) back propagation training algorithm was used to train the neural network model using hourly water level data for the period June-September 2006. The trained ANN model was tested using data for the same period of the year 2001. Water levels simulated by MIKE 11HD were compared with the corresponding water levels predicted by the ANN model. The results obtained from the ANN model were found to be much better than the MIKE 11HD results, as indicated by the values of the goodness-of-fit indices used in the study. The Nash-Sutcliffe index (E) and root mean square error (RMSE) obtained for the ANN model were 0.8419 and 0.8939 m, respectively, during model testing, whereas for MIKE 11HD, the values of E and RMSE were 0.7836 and 1.00 m, respectively, during model validation. The difference between the observed and simulated peak water levels obtained from the ANN model was found to be much lower than that of MIKE 11HD.
The study reveals that the use of Levenberg-Marquardt algorithm with eight hidden neurons in the hidden layer
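    The two goodness-of-fit indices quoted in this record have standard definitions, and computing them is a few lines of pure Python; the list-based inputs below are illustrative:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency E: 1 minus the ratio of squared model
    error to the variance of the observations about their mean.
    E = 1 is a perfect fit; E = 0 means no better than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))
```

    On these scales, the record's ANN result (E = 0.8419, RMSE = 0.8939 m) outperforms MIKE 11HD (E = 0.7836, RMSE = 1.00 m): higher E and lower RMSE are both better.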

  5. Possibilities of rock constitutive modelling and simulations

    NASA Astrophysics Data System (ADS)

    Baranowski, Paweł; Małachowski, Jerzy

    2018-01-01

    The paper deals with the problem of rock finite element modelling and simulation. The authors' main intention was to present the possibilities of different approaches to rock constitutive modelling. Granite was selected for this purpose because its mechanical properties are well characterized and widely reported in the literature. Two significantly different constitutive material models were implemented to simulate granite fracture in various configurations: the Johnson-Holmquist ceramic model, which is very often used for predicting the behavior of rock and other brittle materials, and a simple linear elastic model with brittle failure, which can be used to simulate glass fracturing. Four cases with different loading conditions were chosen to compare the aforementioned constitutive models: a uniaxial compression test, a notched three-point-bending test, a copper ball impacting a block, and a small-scale blasting test.
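    Of the two constitutive models compared, the linear elastic model with brittle failure is simple enough to state in a few lines: stress grows linearly with strain until a failure threshold, after which the material carries no load (element erosion). The Johnson-Holmquist model, with its pressure- and damage-dependent strength, is far more involved and is not sketched here; all parameter values below are illustrative.

```python
def brittle_elastic_stress(strain, E, failure_stress):
    """Uniaxial stress for a linear elastic material with brittle failure:
    sigma = E * strain below the failure stress, zero once it is exceeded.
    E and failure_stress are illustrative material parameters (Pa)."""
    stress = E * strain
    return stress if abs(stress) < failure_stress else 0.0
```

    The instantaneous drop to zero is what makes the model suitable for glass-like fracture but too crude for pressure-sensitive rock behavior, which motivates the Johnson-Holmquist comparison in the paper.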

  6. [Surgical Simulation Models for Sialendoscopy].

    PubMed

    Geisthoff, U; Volk, G F; Finkensieper, M; Wittekindt, C; Guntinas-Lichius, O

    2015-09-01

    Different simulation models are in use to teach the technique of sialendoscopy. Only a few reports in the literature deal with this topic, and no comparison has yet been published. We therefore asked sialendoscopy training course participants about our models using a questionnaire. Material and Methods: A tube-, a pepper-, a porcine kidney-, and a pig head-model were developed as training models and used during 6 consecutive practical sialendoscopy courses from 2012 to 2014. Participants were asked to answer a questionnaire specifically designed to assess the value of the different training models. All respondents (n=61) rated all training models positively. However, the porcine kidney- and pig head-models were described as superior, especially with respect to realistic simulation. Intubation of the papilla can be practised sufficiently only in the pig head-model. The tube- and pepper-models have the advantage of being less expensive, easier to handle, and cleaner. The models described are all useful in learning the sialendoscopy technique. However, they have distinct advantages and disadvantages, making a combination of different models useful. © Georg Thieme Verlag KG Stuttgart · New York.

  7. Modeling the "Year without summer 1816" with the CCM SOCOL

    NASA Astrophysics Data System (ADS)

    Arfeuille, Florian; Rozanov, Eugene; Peter, Thomas; Fischer, Andreas M.; Weisenstein, Debra; Brönnimann, Stefan

    2010-05-01

    The "Year without summer" 1816 had profound social and environmental effects, and although the cataclysmic eruption of Mt. Tambora is now commonly known to have largely contributed to the negative temperature anomalies of the summer of 1816 in Europe and North America, many uncertainties remain. The eruption of Mt. Tambora in April 1815 was the largest within the last 500 years. A crucial parameter to assess in order to simulate this eruption is the aerosol size distribution, which strongly influences the radiative impact of the aerosols (changes in albedo and residence time in the stratosphere, among others) and the impacts on dynamics and chemistry. This major forcing is represented using the AER-2D aerosol model, which calculates the size distribution of the aerosols formed after the eruption. The modeling of the climatic impacts is then done by the state-of-the-art Chemistry-Climate Model (CCM) SOCOL. The importance of stratospheric processes for the study of the "Year without summer" 1816 justifies the choice of a CCM, which allows a precise analysis of the radiative, dynamical and chemical impacts of the Tambora eruption. The 1810s are an interesting period as they combine both a strong signal-to-noise ratio for the study of the impacts of the volcanic forcing and the availability of several high-resolution climate proxies, allowing a credible reconstruction of interesting climatic components like Sea Surface Temperatures (SST), which are forced in the CCM. This can particularly provide a realistic description of the inter-annual variability linked to the major atmosphere/ocean coupled oscillations such as ENSO. Reconstructions based on inland natural proxies and early instrumental records can then be used to validate the simulated climate. I will present the characteristics of the Tambora eruption and show some results from simulations made using the aerosol model/CCM, with an emphasis on the radiative and chemical implications of the

  8. Mars Exploration Rover Terminal Descent Mission Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.

    2004-01-01

    Because of NASA's added reliance on simulation for successful interplanetary missions, the MER mission has developed a detailed Entry, Descent, and Landing (EDL) trajectory modeling and simulation capability. This paper summarizes how the MER EDL sequence of events is modeled, the verification of the methods used, and the inputs. The simulation is built upon a multibody parachute trajectory simulation tool developed in POST II that accurately simulates the trajectory of multiple vehicles in flight with interacting forces. In this model the parachute and the suspended bodies are treated as 6 Degree-of-Freedom (6 DOF) bodies. The terminal descent phase of the mission consists of several EDL events, such as parachute deployment, heatshield separation, deployment of the lander from the backshell, deployment of the airbags, RAD firings, TIRS firings, etc. For an accurate, reliable simulation these events need to be modeled seamlessly and robustly so that the simulations remain numerically stable during Monte-Carlo runs. This paper also summarizes how the events have been modeled, the numerical issues, and the modeling challenges.

  9. Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krause, E.; et al.

    We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt DES Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood Δχ² ≤ 0.045 with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering (8 Mpc h⁻¹) and galaxy-galaxy lensing (12 Mpc h⁻¹) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.
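    The Δχ² quoted above measures how differently two independent pipelines score the same data vector under the same covariance. As a toy illustration only (a diagonal covariance stands in for the full analytic multi-probe covariance; all names are hypothetical):

```python
def chi2(data, model, sigma):
    """Chi-square of a model vector against a data vector, assuming a
    diagonal covariance with per-point standard deviations sigma."""
    return sum(((d - m) / s) ** 2 for d, m, s in zip(data, model, sigma))

def delta_chi2(data, model_a, model_b, sigma):
    """Difference in assigned chi-square between two pipelines' model
    vectors for the same data, analogous to the pipeline comparison
    described above (the real analysis uses a dense covariance matrix)."""
    return chi2(data, model_a, sigma) - chi2(data, model_b, sigma)
```

    A |Δχ²| much smaller than 1 (the paper reports ≤ 0.045) means the two code bases assign essentially the same likelihood, so pipeline differences cannot bias the inferred parameters at a level comparable to the statistical error.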

  10. Closed loop models for analyzing engineering requirements for simulators

    NASA Technical Reports Server (NTRS)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model, incorporating a model for the human pilot (namely, the optimal control model), was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, compared to an ideal continuous simulation, a discrete simulation can result in significant performance and/or workload penalties.

  11. Simulating reservoir leakage in ground-water models

    USGS Publications Warehouse

    Fenske, J.P.; Leake, S.A.; Prudic, David E.

    1997-01-01

    Leakage to ground water resulting from the expansion and contraction of reservoirs cannot be easily simulated by most ground-water flow models. An algorithm, entitled the Reservoir Package, was developed for the United States Geological Survey (USGS) three-dimensional finite-difference modular ground-water flow model MODFLOW. The Reservoir Package automates the process of specifying head-dependent boundary cells, eliminating the need to divide a simulation into many stress periods while improving accuracy in simulating changes in ground-water levels resulting from transient reservoir stage. Leakage between the reservoir and the underlying aquifer is simulated for each model cell corresponding to the inundated area by multiplying the head difference between the reservoir and the aquifer by the hydraulic conductance of the reservoir-bed sediments.
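    The leakage rule in the last sentence is a one-line formula per cell: leakage = conductance × (reservoir stage − aquifer head), applied only where the cell is inundated. A sketch of that bookkeeping (the flat cell layout and sign convention are illustrative, not the Reservoir Package source code):

```python
def reservoir_leakage(res_stage, aquifer_head, conductance, inundated):
    """Cell-by-cell leakage as described above: for each inundated model
    cell, multiply the head difference between reservoir stage and
    aquifer head by the hydraulic conductance of the reservoir-bed
    sediments. Positive values denote flow into the aquifer; dry cells
    contribute nothing."""
    return [(res_stage - h) * c if wet else 0.0
            for h, c, wet in zip(aquifer_head, conductance, inundated)]
```

    Recomputing the inundated-cell set as the stage changes is what lets the package track reservoir expansion and contraction without splitting the run into many stress periods.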

  12. Cross-Paradigm Simulation Modeling: Challenges and Successes

    DTIC Science & Technology

    2011-12-01

    is also highlighted. 2.1 Discrete-Event Simulation Discrete-event simulation (DES) is a modeling method for stochastic, dynamic models where... which almost anything can be coded; models can be incredibly detailed. Most commercial DES software has a graphical interface which allows the user to... results. Although the above definition is the commonly accepted definition of DES, there are two different worldviews that dominate DES modeling today: a

  13. Coupled Global-Regional Climate Model Simulations of Future Changes in Hydrology over Central America

    NASA Astrophysics Data System (ADS)

    Oglesby, R. J.; Erickson, D. J.; Hernandez, J. L.; Irwin, D.

    2005-12-01

    Central America covers a relatively small area, but is topographically very complex, has long coastlines, large inland bodies of water, and very diverse land cover which is both natural and human-induced. As a result, Central America is plagued by hydrologic extremes, especially major flooding and drought events, in a region where many people still barely manage to eke out a living through subsistence. Therefore, considerable concern exists about whether these extreme events will change, either in magnitude or in number, as climate changes in the future. To address this concern, we have used global climate model simulations of future climate change to drive a regional climate model centered on Central America. We use the IPCC 'business as usual' scenario 21st century run made with the NCAR CCSM3 global model to drive the regional model MM5 at 12 km resolution. We chose the 'business as usual' scenario to focus on the largest possible changes that are likely to occur. Because we are most interested in near-term changes, our simulations are for the years 2010, 2015, and 2025. A long 'present-day' run (for 2005) allows us to distinguish between climate variability and any signal due to climate change. Furthermore, a multi-year run with MM5 forced by NCEP reanalyses allows an assessment of how well the coupled global-regional model performs over Central America. Our analyses suggest that the coupled model does a credible job simulating the current climate and hydrologic regime, though lack of sufficient observations strongly complicates this comparison. The suite of model runs for the future years is currently nearing completion, and key results will be presented at the meeting.

  14. Fear of missing a lesion: use of simulated breast models to decrease student anxiety when learning clinical breast examinations.

    PubMed

    Pugh, Carla M; Salud, Lawrence H

    2007-06-01

    Medical students experience a considerable amount of discomfort during their training. The purpose of the current study was to identify sources of student anxiety when learning clinical breast examinations (CBEs) and to evaluate the effects of simulated breast models on student comfort. Simulated breast models were introduced into the curriculum for 175 second-year medical students. Using surveys, students identified sources of anxiety and rated their comfort levels when learning CBE skills. "Fear of missing a lesion" and the "Intimate/personal nature of the exam" accounted for 73.8% of student anxiety when learning CBEs. In addition, there were significant improvements (P < .05) in student comfort levels when using simulated breast models to learn CBE skills. We have identified 2 of the top causes of anxiety for second-year medical students learning CBE. In addition, we found simulated breast models to be effective in increasing student comfort levels when learning CBEs.

  15. Computer simulation models as tools for identifying research needs: A black duck population model

    USGS Publications Warehouse

    Ringelman, J.K.; Longcore, J.R.

    1980-01-01

    Existing data on the mortality and production rates of the black duck (Anas rubripes) were used to construct a WATFIV computer simulation model. The yearly cycle was divided into 8 phases: hunting, wintering, reproductive, molt, post-molt, and juvenile dispersal mortality, and production from original and renesting attempts. The program computes population changes for sex and age classes during each phase. After completion of a standard simulation run with all variable default values in effect, a sensitivity analysis was conducted by changing each of 50 input variables, 1 at a time, to assess the responsiveness of the model to changes in each variable. Thirteen variables resulted in a substantial change in population level. Adult mortality factors were important during hunting and wintering phases. All production and mortality associated with original nesting attempts were sensitive, as was juvenile dispersal mortality. By identifying those factors which invoke the greatest population change, and providing an indication of the accuracy required in estimating these factors, the model helps to identify those variables which would be most profitable topics for future research.
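    The one-at-a-time sensitivity screen described in this record (change each input individually, rerun the model, compare the output to the baseline) generalizes to any simulation model. A minimal sketch, with a hypothetical `model` function and a ±10% perturbation standing in for the study's variable-by-variable changes:

```python
def one_at_a_time(model, defaults, delta=0.1):
    """One-at-a-time sensitivity screen, as in the black duck study:
    rerun the model with each input perturbed individually and record
    the resulting change in output relative to the baseline run.
    `model` maps a dict of inputs to a scalar output; the fractional
    perturbation `delta` is an illustrative choice."""
    base = model(defaults)
    sensitivity = {}
    for name, value in defaults.items():
        perturbed = dict(defaults)
        perturbed[name] = value * (1.0 + delta)
        sensitivity[name] = model(perturbed) - base
    return sensitivity
```

    Ranking the absolute changes identifies the influential inputs, which is how the study narrowed 50 variables down to the 13 that substantially moved the simulated population.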

  16. System Simulation Modeling: A Case Study Illustration of the Model Development Life Cycle

    Treesearch

    Janice K. Wiedenbeck; D. Earl Kline

    1994-01-01

    Systems simulation modeling techniques offer a method of representing the individual elements of a manufacturing system and their interactions. By developing and experimenting with simulation models, one can obtain a better understanding of the overall physical system. Forest products industries are beginning to understand the importance of simulation modeling to help...

  17. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinement are discussed. Math model computer programs are defined and listed.

  18. Simulation Models for Socioeconomic Inequalities in Health: A Systematic Review

    PubMed Central

    Speybroeck, Niko; Van Malderen, Carine; Harper, Sam; Müller, Birgit; Devleesschauwer, Brecht

    2013-01-01

    Background: The emergence and evolution of socioeconomic inequalities in health involves multiple factors interacting with each other at different levels. Simulation models are suitable for studying such complex and dynamic systems and have the ability to test the impact of policy interventions in silico. Objective: To explore how simulation models were used in the field of socioeconomic inequalities in health. Methods: An electronic search of studies assessing socioeconomic inequalities in health using a simulation model was conducted. Characteristics of the simulation models were extracted and distinct simulation approaches were identified. As an illustration, a simple agent-based model of the emergence of socioeconomic differences in alcohol abuse was developed. Results: We found 61 studies published between 1989 and 2013. Ten different simulation approaches were identified. The agent-based model illustration showed that multilevel, reciprocal and indirect effects of social determinants on health can be modeled flexibly. Discussion and Conclusions: Based on the review, we discuss the utility of using simulation models for studying health inequalities, and refer to good modeling practices for developing such models. The review and the simulation model example suggest that the use of simulation models may enhance the understanding and debate about existing and new socioeconomic inequalities of health frameworks. PMID:24192788

  19. Introduction of a simulation model for choledocho- and pancreaticojejunostomy.

    PubMed

    Narumi, Shunji; Toyoki, Yoshikazu; Ishido, Keinosuke; Kudo, Daisuke; Umehara, Minoru; Kimura, Norihisa; Miura, Takuya; Muroya, Takahiro; Hakamada, Kenichi

    2012-10-01

    Pancreaticoduodenectomy includes choledochojejunostomy and pancreaticojejunostomy, which require hand-sewn anastomoses. Educational simulation models for choledochojejunostomy and pancreaticojejunostomy have not been designed. We introduce a simulation model for choledochojejunostomy and pancreaticojejunostomy created with a skin closure pad and a vascular model. A wound closure pad and a vein model (4 mm diameter) were used as a stump model of the pancreas. Pancreaticojejunostomy was simulated with a stump model of the pancreas and a double-layer bowel model; these models were stabilized in an end-to-side fashion on a magnetic board using magnetic clips. In addition, vein (6 or 8 mm diameter) and bowel models were used to simulate choledochojejunostomy. Pancreatic and hepatobiliary operations are relatively rare, particularly in community hospitals, although surgical residents wish to practice these procedures. Our simulator enables surgeons and surgical residents to practice choledocho- and pancreaticojejunostomy through open or laparoscopic approaches.

  20. Enhancement of Pyrometallurgical Teaching Using Excel Simulation Models

    NASA Astrophysics Data System (ADS)

    Grimsey, Eric J.

    Steady state Excel models for a copper flash smelter and an iron blast furnace are used to enhance the teaching of pyrometallurgical smelting principles within a fourth-year process engineering unit delivered at the Western Australian School of Mines. A lecture/workshop approach has been adopted in which student teams undertake process simulation assignments that illustrate the multifaceted responses of process outputs to variation of inputs, the objective being to reinforce their understanding of smelting principles. The approach has proven to be popular with students, as evidenced by the consistently high ratings the unit has received through student feedback. This paper provides an overview of the teaching approach and the process models used.

  1. Theory, modeling, and simulation of structural and functional materials: Micromechanics, microstructures, and properties

    NASA Astrophysics Data System (ADS)

    Jin, Yongmei

    In recent years, theoretical modeling and computational simulation of microstructure evolution and materials properties have been attracting much attention. While significant advances have been made, two major challenges remain. One is the integration of multiple physical phenomena for simulation of complex materials behavior; the other is the bridging of multiple length and time scales in materials modeling and simulation. The research presented in this Thesis is focused mainly on tackling the first major challenge. In this Thesis, a unified Phase Field Microelasticity (PFM) approach is developed. This approach is an advanced version of the phase field method that takes into account the exact elasticity of arbitrarily anisotropic, elastically and structurally inhomogeneous systems. The proposed theory and models are applicable to infinite solids, elastic half-spaces, and finite bodies with arbitrarily shaped free surfaces, which may undergo various concomitant physical processes. The Phase Field Microelasticity approach is employed to formulate the theories and models of martensitic transformation, dislocation dynamics, and crack evolution in single-crystal and polycrystalline solids. It is also used to study strain relaxation in heteroepitaxial thin films through misfit dislocations and surface roughening. Magnetic domain evolution in nanocrystalline thin films is also investigated. Numerous simulation studies are performed. Comparisons with analytical predictions and experimental observations are presented. Agreement verifies the theory and models as realistic simulation tools for computational materials science and engineering. The same Phase Field Microelasticity formalism of individual models of different physical phenomena makes it easy to integrate multiple physical processes into one unified simulation model, where multiple phenomena are treated as various relaxation modes that together act as one common cooperative phenomenon.
The model does not impose a priori

  2. Black carbon and trace gases over South Asia: Measurements and Regional Climate model simulations

    NASA Astrophysics Data System (ADS)

    Bhuyan, Pradip; Pathak, Binita; Parottil, Ajay

    2016-07-01

    Trace gases and aerosols are simulated at 50 km spatial resolution over the South Asian CORDEX domain, enclosing the Indian subcontinent and North-East India, for the year 2012 using two regional climate models: RegCM4 coupled with CLM4.5, and WRF-Chem 3.5. Both models capture the seasonality of simulated O3, its precursors NOx and CO, and black carbon concentrations, together with the meteorological variables, over the Indian subcontinent as well as over the sub-Himalayan North-Eastern region of India including Bangladesh. The model simulations are compared with measurements made at Dibrugarh (27.3°N, 94.6°E, 111 m amsl). Both models capture the observed diurnal and seasonal variations in O3 concentrations, with a maximum in spring and a minimum in the monsoon, the correlation being better for WRF-Chem (R~0.77) than RegCM (R~0.54). Simulated NOx and CO are underestimated in all seasons by both models, the performance being better for WRF-Chem. The observed difference may arise from bias in the estimation of the O3 precursors NOx and CO in the emission inventories, or from error in the simulation of the meteorological variables that influence O3 concentration in both models. For example, in the pre-monsoon and winter seasons, the WRF-Chem-simulated shortwave flux overestimates the observation by ~500 Wm-2, while in the monsoon and post-monsoon seasons the simulated shortwave flux is equivalent to the observation. The model predicts higher wind speed in all seasons, especially during night-time. In the post-monsoon and winter seasons, the simulated wind pattern is the reverse of the observation, with daytime low and night-time high values. Rainfall is overestimated in all seasons. RegCM-CLM4.5 is found to underestimate rainfall and other meteorological parameters. The WRF-Chem model closely captured the observed black carbon mass concentrations during the pre-monsoon and summer monsoon seasons, but

  3. Robust three-body water simulation model

    NASA Astrophysics Data System (ADS)

    Tainter, C. J.; Pieniazek, P. A.; Lin, Y.-S.; Skinner, J. L.

    2011-05-01

    The most common potentials used in classical simulations of liquid water assume a pairwise additive form. Although these models have been very successful in reproducing many properties of liquid water at ambient conditions, none is able to describe accurately water throughout its complicated phase diagram. The primary reason for this is the neglect of many-body interactions. To this end, a simulation model with explicit three-body interactions was introduced recently [R. Kumar and J. L. Skinner, J. Phys. Chem. B 112, 8311 (2008), 10.1021/jp8009468]. This model was parameterized to fit the experimental O-O radial distribution function and diffusion constant. Herein we reparameterize the model, fitting to a wider range of experimental properties (diffusion constant, rotational correlation time, density for the liquid, liquid/vapor surface tension, melting point, and the ice Ih density). The robustness of the model is then verified by comparing simulation to experiment for a number of other quantities (enthalpy of vaporization, dielectric constant, Debye relaxation time, temperature of maximum density, and the temperature-dependent second and third virial coefficients), with good agreement.
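    The multi-property reparameterization described above amounts to minimizing a weighted objective over the model parameters. A minimal sketch of that idea, with entirely made-up targets, weights, and a stand-in `model_predict` in place of actual molecular dynamics runs:

```python
import numpy as np

# Hypothetical experimental targets and fitting weights (illustrative only).
targets = {"diffusion": 2.3, "surface_tension": 71.7, "melting_T": 273.15}
weights = {"diffusion": 1.0, "surface_tension": 0.5, "melting_T": 2.0}

def model_predict(theta):
    """Stand-in for running a full simulation with parameters theta = (a, b)."""
    a, b = theta
    return {"diffusion": 2.0 * a,
            "surface_tension": 70.0 * b,
            "melting_T": 270.0 * a * b}

def objective(theta):
    """Weighted sum of squared relative errors against the targets."""
    pred = model_predict(theta)
    return sum(weights[k] * ((pred[k] - targets[k]) / targets[k]) ** 2
               for k in targets)

# Crude grid search over the two parameters (a real fit would rerun the MD
# simulation for each candidate parameter set).
grid = np.linspace(0.9, 1.3, 81)
score, best = min((objective((a, b)), (a, b)) for a in grid for b in grid)
print(best, round(score, 5))
```

    In practice each objective evaluation is an expensive simulation, which is why such fits use only a handful of carefully chosen target properties.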

  4. Interannual Rainfall Variability in North-East Brazil: Observation and Model Simulation

    NASA Astrophysics Data System (ADS)

    Harzallah, A.; Rocha de Aragão, J. O.; Sadourny, R.

    1996-08-01

    The relationship between interannual variability of rainfall in north-east Brazil and tropical sea-surface temperature is studied using observations and model simulations. The simulated precipitation is the average of seven independent realizations performed using the Laboratoire de Météorologie Dynamique atmospheric general circulation model forced by the 1970-1988 observed sea-surface temperature. The model reproduces the rainfall anomalies very well (correlation of 0.91 between observed and modelled anomalies). The study confirms that precipitation in north-east Brazil is highly correlated with the sea-surface temperature in the tropical Atlantic and Pacific oceans. Using the singular value decomposition method, we find that Nordeste rainfall is modulated by two independent oscillations, both governed by the Atlantic dipole, but one also involving the Pacific, the other having a period of about 10 years. Correlations between precipitation in north-east Brazil during February-May and the sea-surface temperature 6 months earlier indicate that both modes are essential to estimate the quality of the rainy season.

  5. Developing Cognitive Models for Social Simulation from Survey Data

    NASA Astrophysics Data System (ADS)

    Alt, Jonathan K.; Lieberman, Stephen

    The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances, and action choice models provides a compelling means of developing models and simulations grounded in empirical data. Using these types of data to populate social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.

  6. Future Modelling and Simulation Challenges (Defis futurs pour la modelisation et la simulation)

    DTIC Science & Technology

    2002-11-01

    Language School Figure 2: Location of the simulation center within the MEC Military operations research section - simulation lab Military operations... language. This logic can be probabilistic (branching is randomised, which is useful for modelling error), tactical (a branch goes to the task with the... language and a collection of simulation tools that can be used to create human and team behaviour models to meet users’ needs. Hence, different ways of

  7. Protein Simulation Data in the Relational Model.

    PubMed

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server.
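    The dimensional ("star schema") design the abstract refers to can be illustrated in miniature: a fact table of per-frame measurements keyed to dimension tables. This is a hypothetical sketch using Python's built-in sqlite3 rather than SQL Server, with invented table and column names:

```python
import sqlite3

# Toy star schema: dimensions describe the simulation and its residues;
# the fact table holds one row per (simulation, residue, frame).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_simulation (sim_id INTEGER PRIMARY KEY, protein TEXT,
                             temperature_k REAL);
CREATE TABLE dim_residue    (residue_id INTEGER PRIMARY KEY, sim_id INTEGER,
                             name TEXT, seq_pos INTEGER);
CREATE TABLE fact_frame     (sim_id INTEGER, residue_id INTEGER, frame INTEGER,
                             rmsd REAL, sasa REAL,
                             PRIMARY KEY (sim_id, residue_id, frame));
""")
cur.execute("INSERT INTO dim_simulation VALUES (1, '1ENH', 298.0)")
cur.execute("INSERT INTO dim_residue VALUES (10, 1, 'ALA', 5)")
cur.execute("INSERT INTO fact_frame VALUES (1, 10, 0, 0.0, 88.5)")
cur.execute("INSERT INTO fact_frame VALUES (1, 10, 1, 1.2, 90.1)")

# Typical analytical query: mean RMSD per simulation.
row = cur.execute("""
    SELECT s.protein, AVG(f.rmsd)
    FROM fact_frame f JOIN dim_simulation s ON s.sim_id = f.sim_id
    GROUP BY s.sim_id
""").fetchone()
print(row)
```

    The design choice is the usual warehouse trade-off: a wide, append-only fact table keeps trajectory ingest fast, while the small dimension tables carry the descriptive attributes queries filter on.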

  8. Protein Simulation Data in the Relational Model

    PubMed Central

    Simms, Andrew M.; Daggett, Valerie

    2011-01-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost—significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server. PMID:23204646

  9. Recent Advances in the Theory and Simulation of Model Colloidal Microphase Formers.

    PubMed

    Zhuang, Yuan; Charbonneau, Patrick

    2016-08-18

    This mini-review synthesizes our understanding of the equilibrium behavior of particle-based models with short-range attractive and long-range repulsive (SALR) interactions. These models, which can form stable periodic microphases, aim to reproduce the essence of colloidal suspensions with competing interparticle interactions. Ordered structures, however, have yet to be obtained in experiments. In order to better understand the hurdles to periodic microphase assembly, marked theoretical and simulation advances have been made over the past few years. Here, we present recent progress in the study of microphases in models with SALR interactions using liquid-state theory and density-functional theory as well as numerical simulations. Combining these various approaches provides a description of periodic microphases, and gives insights into the rich phenomenology of the surrounding disordered regime. Ongoing research directions in the thermodynamics of models with SALR interactions are also presented.

  10. Uncertainty Propagation of Non-Parametric-Derived Precipitation Estimates into Multi-Hydrologic Model Simulations

    NASA Astrophysics Data System (ADS)

    Bhuiyan, M. A. E.; Nikolopoulos, E. I.; Anagnostou, E. N.

    2017-12-01

    Quantifying the uncertainty of global precipitation datasets is beneficial when using these products in hydrological applications, because precipitation uncertainty propagating through hydrologic modeling can significantly affect the accuracy of the simulated hydrologic variables. In this research the Iberian Peninsula is used as the study area, with a study period spanning eleven years (2000-2010). This study evaluates the performance of multiple hydrologic models forced with combined global rainfall estimates derived using a Quantile Regression Forests (QRF) technique. The QRF technique utilizes three satellite precipitation products (CMORPH, PERSIANN, and 3B42 (V7)); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. A high-resolution precipitation dataset driven by ground-based observations (SAFRAN), available at 5 km/1 h resolution, is used as reference. Through the QRF blending framework, the stochastic error model produces error-adjusted ensemble precipitation realizations, which are used to force four global hydrological models (JULES (Joint UK Land Environment Simulator), WaterGAP3 (Water-Global Assessment and Prognosis), ORCHIDEE (Organizing Carbon and Hydrology in Dynamic Ecosystems), and SURFEX (Surface Externalisée)) to simulate three hydrologic variables (surface runoff, subsurface runoff, and evapotranspiration). The models are also forced with the reference precipitation to generate reference-based hydrologic simulations. This study presents a comparative analysis of multiple hydrologic model simulations for different hydrologic variables and the impact of the blending algorithm on the simulated hydrologic variables. Results show how precipitation uncertainty propagates through the different hydrologic model structures, manifesting as reduced error in the simulated hydrologic variables.
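    The error-adjusted ensemble idea can be sketched generically: a quantile-prediction technique yields a per-day quantile function, ensemble members are drawn from it, and each member is pushed through a hydrologic model. The quantile values, the interpolation scheme, and the toy runoff function below are all illustrative assumptions, not the QRF blending framework itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quantile predictions: for each of three days, precipitation
# at the 10th, 50th, and 90th percentiles (mm/day).
quantiles = np.array([[1.0, 3.0, 8.0],
                      [0.0, 1.0, 2.5],
                      [4.0, 9.0, 20.0]])
levels = [0.1, 0.5, 0.9]

def sample_realizations(q, n):
    """Draw n ensemble members by interpolating each day's quantile function."""
    u = rng.uniform(size=(n, q.shape[0]))
    cols = [np.interp(u[:, d], levels, q[d]) for d in range(q.shape[0])]
    return np.stack(cols, axis=1)

def toy_runoff(precip, coeff=0.4):
    """Trivial linear rainfall-runoff stand-in for a real hydrologic model."""
    return coeff * precip.sum(axis=-1)

ens = sample_realizations(quantiles, n=500)  # 500 precipitation realizations
runoff = toy_runoff(ens)                     # propagate through the "model"
print(runoff.mean(), runoff.std())           # ensemble spread reflects uncertainty
```

    The spread of the resulting runoff ensemble is the propagated precipitation uncertainty; with a real hydrologic model the propagation is nonlinear and depends on model structure, which is the point of the multi-model comparison.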

  11. FARSITE: Fire Area Simulator-model development and evaluation

    Treesearch

    Mark A. Finney

    1998-01-01

    A computer simulation model, FARSITE, includes existing fire behavior models for surface, crown, spotting, point-source fire acceleration, and fuel moisture. The model's components and assumptions are documented. Simulations were run for simple conditions that illustrate the effect of individual fire behavior models on two-dimensional fire growth.

  12. An Ecosystem Simulation Model for Methane Production and Emission from Wetlands

    NASA Technical Reports Server (NTRS)

    Potter, C. S.; Peterson, David L. (Technical Monitor)

    1997-01-01

    Previous experimental studies suggest that methane emission from wetlands is influenced by multiple interactive pathways of gas production and transport through soil and sediment layers to the atmosphere. The objective of this study is to evaluate a new simulation model of methane production and emission in wetland soils, developed initially to help identify key processes that regulate methanogenesis and the net flux of CH4 to the air, but designed ultimately for regional simulation using remotely sensed inputs for land cover characteristics. The foundation for these computer simulations is a well-documented model (CASA) of ecosystem production and carbon cycling in the terrestrial biosphere. Modifications to represent flooded wetland soils and anaerobic decomposition include three new sub-models for: (1) layered soil temperature and water table depth (WTD) as a function of daily climate drivers, (2) CH4 production within the anoxic soil layer as a function of WTD and CO2 production under poorly drained conditions, and (3) CH4 gaseous transport pathways (molecular diffusion, ebullition, and plant vascular transport) as a function of WTD and ecosystem type. The model was applied and tested using climate and ecological data to characterize tundra wetland sites near Fairbanks, Alaska, studied previously by Whalen and Reeburgh. Comparison of model predictions to measurements of soil temperature and thaw depth, water-table depth, and CH4 emissions over a two-year period suggests that inter-site differences in soil physical conditions and methane fluxes could be reproduced accurately for selected periods. Day-to-day comparison of predicted emissions to measured CH4 flux rates reveals good agreement during the early part of the thaw season, but the model tends to underestimate production of CH4 during the months of July and August in both test years. Important seasonal effects, including that of falling WTD during these periods, are apparently

  13. CMIP5 Historical Simulations (1850-2012) with GISS ModelE2

    NASA Technical Reports Server (NTRS)

    Miller, Ronald Lindsay; Schmidt, Gavin A.; Nazarenko, Larissa S.; Tausnev, Nick; Bauer, Susanne E.; DelGenio, Anthony D.; Kelley, Max; Lo, Ken K.; Ruedy, Reto; Shindell, Drew T.

    2014-01-01

    Observations of climate change during the CMIP5 extended historical period (1850-2012) are compared to trends simulated by six versions of the NASA Goddard Institute for Space Studies ModelE2 Earth System Model. The six models are constructed from three versions of the ModelE2 atmospheric general circulation model, distinguished by their treatment of atmospheric composition and the aerosol indirect effect, combined with two ocean general circulation models, HYCOM and Russell. Forcings that perturb the model climate during the historical period are described. Five-member ensemble averages from each of the six versions of ModelE2 simulate trends of surface air temperature, atmospheric temperature, sea ice and ocean heat content that are in general agreement with observed trends, although simulated warming is slightly excessive within the past decade. Only simulations that include increasing concentrations of long-lived greenhouse gases match the warming observed during the twentieth century. Differences in twentieth-century warming among the six model versions can be attributed to differences in climate sensitivity, aerosol and ozone forcing, and heat uptake by the deep ocean. Coupled models with HYCOM export less heat to the deep ocean, associated with reduced surface warming in regions of deepwater formation, but greater warming elsewhere at high latitudes along with reduced sea ice. All ensembles show twentieth-century annular trends toward reduced surface pressure at southern high latitudes and a poleward shift of the midlatitude westerlies, consistent with observations.

  14. A simulation model to estimate cost-offsets for a disease-management program for chronic kidney disease.

    PubMed

    Gandjour, Afschin; Tschulena, Ulrich; Steppan, Sonja; Gatti, Emanuele

    2015-04-01

    The aim of this paper is to develop a simulation model that analyzes cost-offsets of a hypothetical disease management program (DMP) for patients with chronic kidney disease (CKD) in Germany compared to no such program. A lifetime Markov model with simulated 65-year-old patients with CKD was developed using published data on costs and health status and simulating the progression to end-stage renal disease (ESRD), cardiovascular disease and death. A statutory health insurance perspective was adopted. This modeling study shows considerable potential for cost-offsets from a DMP for patients with CKD. The potential for cost-offsets increases with relative risk reduction by the DMP and baseline glomerular filtration rate. Results are most sensitive to the cost of dialysis treatment. This paper presents a general 'prototype' simulation model for the prevention of ESRD. The model allows for further modification and adaptation in future applications.
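    A Markov cohort model of this kind can be sketched in a few lines. All states, transition probabilities, and costs below are invented for illustration; they are not the published model's inputs:

```python
def lifetime_cost(p_progress, cycles=35, c_ckd=2000.0, c_esrd=60000.0,
                  p_die_ckd=0.05, p_die_esrd=0.15):
    """Expected per-patient cost over annual cycles for a toy
    CKD -> ESRD -> death cohort (all numbers are placeholders)."""
    ckd, esrd, total = 1.0, 0.0, 0.0   # cohort fractions and accrued cost
    for _ in range(cycles):
        total += ckd * c_ckd + esrd * c_esrd   # cost accrued this cycle
        progressed = ckd * p_progress          # CKD -> ESRD transitions
        ckd = (ckd - progressed) * (1 - p_die_ckd)
        esrd = (esrd + progressed) * (1 - p_die_esrd)
    return total

cost_usual = lifetime_cost(p_progress=0.04)
cost_dmp = lifetime_cost(p_progress=0.04 * 0.8)   # DMP: 20% relative risk cut
print(f"cost-offset per patient: {cost_usual - cost_dmp:.0f}")
```

    Because the ESRD state is far more expensive than the CKD state, slowing progression lowers lifetime cost, and the offset grows with the relative risk reduction, which mirrors the sensitivity pattern the abstract reports.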

  15. Pressurized storm sewer simulation : model enhancement.

    DOT National Transportation Integrated Search

    1991-01-01

    A modified Pressurized Flow Simulation Model, PFSM, was developed and attached to the Federal Highway Administration, FHWA, Pool Funded PFP-HYDRA Package. Four hydrograph options are available for simulating inflow to a sewer system under surcharge o...

  16. Incorporation of RAM techniques into simulation modeling

    NASA Astrophysics Data System (ADS)

    Nelson, S. C., Jr.; Haire, M. J.; Schryver, J. C.

    1995-01-01

    This work concludes that reliability, availability, and maintainability (RAM) analytical techniques can be incorporated into computer network simulation modeling to yield an important new analytical tool. This paper describes the incorporation of failure and repair information into network simulation to build a stochastic computer model representing the RAM performance of two vehicles being developed for the US Army: the Advanced Field Artillery System (AFAS) and the Future Armored Resupply Vehicle (FARV). The AFAS is the US Army's next-generation self-propelled cannon artillery system. The FARV is a resupply vehicle for the AFAS. Both vehicles utilize automation technologies to improve operational performance and reduce manpower. The network simulation model used in this work is task based. The model programmed in this application represents a typical battle mission and the failures and repairs that occur during that battle. Each task that the FARV performs - upload, travel to the AFAS, refuel, perform tactical/survivability moves, return to logistic resupply, etc. - is modeled. Such a model reproduces operational phenomena (e.g., failures and repairs) that are likely to occur in actual performance. Simulation tasks are modeled as discrete chronological steps; after the completion of each task, decisions are programmed that determine the next path to be followed. The result is a complex logic diagram or network. The network simulation model is developed within a hierarchy of vehicle systems, subsystems, and equipment, and includes failure management subnetworks. RAM information and other performance measures that have impact on design requirements are collected. Design changes are evaluated through 'what if' questions, sensitivity studies, and battle scenario changes.
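    A task-based simulation with failure and repair events can be sketched as follows. Task names loosely follow the mission steps listed in the abstract, but all durations, failure rates, and repair times are invented placeholders for real RAM data:

```python
import random

random.seed(2)

# Hypothetical task network for one mission: (task, duration h, failures/h).
TASKS = [("upload", 1.0, 0.02),
         ("travel_to_AFAS", 2.0, 0.05),
         ("refuel", 0.5, 0.01),
         ("tactical_move", 1.5, 0.04),
         ("return_to_resupply", 2.0, 0.05)]
MEAN_REPAIR_H = 3.0  # placeholder for RAM repair-time data

def run_mission():
    """One pass through the task network; each failure inserts a repair delay."""
    clock, failures = 0.0, 0
    for _name, duration, rate in TASKS:
        clock += duration
        if random.random() < rate * duration:          # failure during task
            failures += 1
            clock += random.expovariate(1.0 / MEAN_REPAIR_H)
    return clock, failures

results = [run_mission() for _ in range(10000)]
avail = sum(1 for _, f in results if f == 0) / len(results)
print(f"fraction of missions completed without failure: {avail:.2f}")
```

    Collecting mission time and failure counts over many replications is how such a model answers 'what if' questions: change a failure rate or repair time and rerun.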

  17. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  18. Evaluating the Credibility of Transport Processes in Simulations of Ozone Recovery using the Global Modeling Initiative Three-dimensional Model

    NASA Technical Reports Server (NTRS)

    Strahan, Susan E.; Douglass, Anne R.

    2004-01-01

    The Global Modeling Initiative (GMI) has integrated two 36-year simulations of an ozone recovery scenario with an offline chemistry and transport model using two different meteorological inputs. Physically based diagnostics, derived from satellite and aircraft data sets, are described and then used to evaluate the realism of temperature and transport processes in the simulations. Processes evaluated include barrier formation in the subtropics and polar regions, and extratropical wave-driven transport. Some diagnostics are especially relevant to simulation of lower stratospheric ozone, but most are applicable to any stratospheric simulation. The global temperature evaluation, which is relevant to gas phase chemical reactions, showed that both sets of meteorological fields have near climatological values at all latitudes and seasons at 30 hPa and below. Both simulations showed weakness in upper stratospheric wave driving. The simulation using input from a general circulation model (GMI(GCM)) showed a very good residual circulation in the tropics and Northern Hemisphere. The simulation with input from a data assimilation system (GMI(DAS)) performed better in the midlatitudes than it did at high latitudes. Neither simulation forms a realistic barrier at the vortex edge, leading to uncertainty in the fate of ozone-depleted vortex air. Overall, tracer transport in the offline GMI(GCM) has greater fidelity throughout the stratosphere than it does in the GMI(DAS).

  19. Challenges of forest landscape modeling - simulating large landscapes and validating results

    Treesearch

    Hong S. He; Jian Yang; Stephen R. Shifley; Frank R. Thompson

    2011-01-01

    Over the last 20 years, we have seen rapid development in the field of forest landscape models (FLMs), fueled by both technological and theoretical advances. Two fundamental challenges have persisted since the inception of FLMs: (1) balancing realistic simulation of ecological processes at broad spatial and temporal scales with computing capacity, and (2) validating...

  20. Indentation experiments and simulation of ovine bone using a viscoelastic-plastic damage model

    PubMed Central

    Zhao, Yang; Wu, Ziheng; Turner, Simon; MacLeay, Jennifer; Niebur, Glen L.; Ovaert, Timothy C.

    2015-01-01

    Indentation methods have been widely used to study bone at the micro- and nanoscales. It has been shown that bone exhibits viscoelastic behavior with permanent deformation during indentation. At the same time, damage due to microcracks is induced due to the stresses beneath the indenter tip. In this work, a simplified viscoelastic-plastic damage model was developed to more closely simulate indentation creep data, and the effect of the model parameters on the indentation curve was investigated. Experimentally, baseline and 2-year postovariectomized (OVX-2) ovine (sheep) bone samples were prepared and indented. The damage model was then applied via finite element analysis to simulate the bone indentation data. The mechanical properties of yielding, viscosity, and damage parameter were obtained from the simulations. The results suggest that damage develops more quickly for OVX-2 samples under the same indentation load conditions as the baseline data. PMID:26136623

  1. A comparison among observations and earthquake simulator results for the allcal2 California fault model

    USGS Publications Warehouse

    Tullis, Terry. E.; Richards-Dinger, Keith B.; Barall, Michael; Dieterich, James H.; Field, Edward H.; Heien, Eric M.; Kellogg, Louise; Pollitz, Fred F.; Rundle, John B.; Sachs, Michael K.; Turcotte, Donald L.; Ward, Steven N.; Yikilmaz, M. Burak

    2012-01-01

    model, allcal2. With the exception of ViscoSim, which ran for 10,000 years, all the simulators ran for 30,000 years. Presentations containing content similar to this paper can be found at http://scec.usc.edu/research/eqsims/.

  2. Wedge Experiment Modeling and Simulation for Reactive Flow Model Calibration

    NASA Astrophysics Data System (ADS)

    Maestas, Joseph T.; Dorgan, Robert J.; Sutherland, Gerrit T.

    2017-06-01

    Wedge experiments are a typical method for generating pop-plot data (run-to-detonation distance versus input shock pressure), which are used to assess an explosive material's initiation behavior. Such data can be used to calibrate reactive flow models by running hydrocode simulations and successively adjusting model parameters until a match with experiment is achieved. Simulations are typically performed in 1D and use a flyer impact to achieve the prescribed shock loading pressure. In this effort, a wedge experiment performed at the Army Research Lab (ARL) was modeled using CTH (an SNL hydrocode) in 1D, 2D, and 3D space in order to determine whether there is any justification for using simplified models. A simulation was also performed using the BCAT code (a CTH companion tool) that assumes a plate-impact shock loading. Results from the simulations were compared to experimental data and show that the shock imparted into an explosive specimen is accurately captured with 2D and 3D simulations, but changes significantly in 1D space and with the BCAT tool. The difference in shock profile is shown to affect numerical predictions only for large run distances. This is attributed to incorrectly capturing the energy fluence for detonation waves versus flat shock loading. Portions of this work were funded through the Joint Insensitive Munitions Technology Program.
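    Pop-plot data are conventionally summarized as a straight line in log-log space (run distance versus input pressure). A least-squares sketch with fabricated data points, purely to illustrate the form of the fit:

```python
import math

# Fabricated (pressure GPa, run-to-detonation mm) pairs for illustration.
data = [(3.0, 12.0), (5.0, 5.5), (8.0, 2.4), (12.0, 1.1)]

# Pop plots are straight lines in log-log space: log10(x_run) = a + b*log10(P).
xs = [math.log10(p) for p, _ in data]
ys = [math.log10(x) for _, x in data]
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n
b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     / sum((x - mx) ** 2 for x in xs))
a = my - b * mx
print(f"log10(x_run) = {a:.2f} + {b:.2f} * log10(P)")
```

    The negative slope b encodes the sensitivity of the explosive: a higher input pressure gives a shorter run to detonation. Calibrating a reactive flow model amounts to tuning its parameters until simulated run distances fall on this line.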

  3. Simulating cancer growth with multiscale agent-based modeling.

    PubMed

    Wang, Zhihui; Butner, Joseph D; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S

    2015-02-01

    There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. Copyright © 2014 Elsevier Ltd. All rights reserved.
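    The agent-based idea, at its most minimal, is a lattice of cell agents that divide into empty neighbor sites. The sketch below is a toy Eden-type growth model for illustration only; it captures none of the multiscale biology the review discusses:

```python
import random

random.seed(0)

# Toy 2-D lattice tumor: each occupied site may divide into an empty
# von Neumann neighbor each step (division probability is invented).
N = 21
grid = [[0] * N for _ in range(N)]
grid[N // 2][N // 2] = 1  # seed a single tumor cell at the center

def step(p_div=0.3):
    """One sweep over the lattice (simple in-place asynchronous update)."""
    for x in range(N):
        for y in range(N):
            if grid[x][y] and random.random() < p_div:
                empty = [(x + dx, y + dy)
                         for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                         if 0 <= x + dx < N and 0 <= y + dy < N
                         and not grid[x + dx][y + dy]]
                if empty:
                    nx, ny = random.choice(empty)
                    grid[nx][ny] = 1  # daughter cell occupies the empty site

for _ in range(15):
    step()
cells = sum(map(sum, grid))
print(f"tumor size after 15 steps: {cells} cells")
```

    Real multiscale agent-based models layer onto this skeleton the features the review lists: per-agent phenotype state, nutrient and oxygen fields coupled by diffusion solvers, and rules for death, mutation, and migration.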

  4. Simulating Cancer Growth with Multiscale Agent-Based Modeling

    PubMed Central

    Wang, Zhihui; Butner, Joseph D.; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S.

    2014-01-01

    There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. PMID:24793698

  5. Multi-criteria Evaluation of Discharge Simulation in Dynamic Global Vegetation Models

    NASA Astrophysics Data System (ADS)

    Yang, H.; Piao, S.; Zeng, Z.; Ciais, P.; Yin, Y.; Friedlingstein, P.; Sitch, S.; Ahlström, A.; Guimberteau, M.; Huntingford, C.; Levis, S.; Levy, P. E.; Huang, M.; Li, Y.; Li, X.; Lomas, M.; Peylin, P. P.; Poulter, B.; Viovy, N.; Zaehle, S.; Zeng, N.; Zhao, F.; Wang, L.

    2015-12-01

    In this study, we assessed the performance of discharge simulations obtained by coupling the runoff from seven Dynamic Global Vegetation Models (DGVMs; LPJ, ORCHIDEE, Sheffield-DGVM, TRIFFID, LPJ-GUESS, CLM4CN, and OCN) to one river routing model for 16 large river basins. The results show that the seasonal cycle of river discharge is generally modelled well in the low and mid latitudes, but not in the high latitudes, where the peak discharge (due to snow and ice melting) is underestimated. The DGVMs chained with the routing model underestimate the annual mean discharge, and the 30-year trend of discharge is also underestimated. For the inter-annual variability of discharge, a skill score based on the overlap of probability density functions (PDFs) suggests that most models correctly reproduce the observed variability (correlation coefficient higher than 0.5; i.e. models account for 50% of observed inter-annual variability), except for the Lena, Yenisei, Yukon, and Congo river basins. In addition, we compared the simulated runoff from simulations in which models were forced with either fixed or varying land use. This comparison suggests that both seasonal and annual mean runoff have been little affected by land use change, but that the trend of runoff is sensitive to it. No single model, considered individually, performs significantly better than the others across all basins. This suggests that, given current modelling capability, a regionally weighted average of multi-model ensemble projections may be appropriate to reduce the bias in future projections of global river discharge.
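    A skill score based on overlapping PDFs can be implemented as the shared area under two empirical histograms. The following is a generic sketch, not the score used in the study; the bin count and test data are arbitrary:

```python
import numpy as np

def pdf_overlap_skill(obs, sim, bins=20):
    """Shared area under two empirical PDFs: 1.0 = identical, 0.0 = disjoint."""
    lo = min(obs.min(), sim.min())
    hi = max(obs.max(), sim.max())
    p_obs, edges = np.histogram(obs, bins=bins, range=(lo, hi), density=True)
    p_sim, _ = np.histogram(sim, bins=bins, range=(lo, hi), density=True)
    width = edges[1] - edges[0]
    return float(np.minimum(p_obs, p_sim).sum() * width)

rng = np.random.default_rng(1)
obs = rng.normal(10.0, 2.0, 30)      # 30 "years" of observed annual discharge
good = rng.normal(10.0, 2.0, 30)     # model with realistic variability
biased = rng.normal(14.0, 2.0, 30)   # model with a strong positive bias
print(pdf_overlap_skill(obs, good), pdf_overlap_skill(obs, biased))
```

    A score of this form penalizes both bias and mismatched spread at once, which is why it is convenient for grading inter-annual variability across many basins with one number.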

  6. Validation of a model to investigate the effects of modifying cardiovascular disease (CVD) risk factors on the burden of CVD: the rotterdam ischemic heart disease and stroke computer simulation (RISC) model.

    PubMed

    van Kempen, Bob J H; Ferket, Bart S; Hofman, Albert; Steyerberg, Ewout W; Colkesen, Ersen B; Boekholdt, S Matthijs; Wareham, Nicholas J; Khaw, Kay-Tee; Hunink, M G Myriam

    2012-12-06

    We developed a Monte Carlo Markov model designed to investigate the effects of modifying cardiovascular disease (CVD) risk factors on the burden of CVD. Internal, predictive, and external validity of the model have not yet been established. The Rotterdam Ischemic Heart Disease and Stroke Computer Simulation (RISC) model was developed using data covering 5 years of follow-up from the Rotterdam Study. To prove 1) internal and 2) predictive validity, the incidences of coronary heart disease (CHD), stroke, CVD death, and non-CVD death simulated by the model over a 13-year period were compared with those recorded for 3,478 participants in the Rotterdam Study with at least 13 years of follow-up. 3) External validity was verified using 10 years of follow-up data from the European Prospective Investigation of Cancer (EPIC)-Norfolk study of 25,492 participants, for whom CVD and non-CVD mortality was compared. At year 5, the observed incidences (with simulated incidences in brackets) of CHD, stroke, and CVD and non-CVD mortality for the 3,478 Rotterdam Study participants were 5.30% (4.68%), 3.60% (3.23%), 4.70% (4.80%), and 7.50% (7.96%), respectively. At year 13, these percentages were 10.60% (10.91%), 9.90% (9.13%), 14.20% (15.12%), and 24.30% (23.42%). After recalibrating the model for the EPIC-Norfolk population, the 10-year observed (simulated) incidences of CVD and non-CVD mortality were 3.70% (4.95%) and 6.50% (6.29%). All observed incidences fell well within the 95% credibility intervals of the simulated incidences. We have confirmed the internal, predictive, and external validity of the RISC model. These findings provide a basis for analyzing the effects of modifying cardiovascular disease risk factors on the burden of CVD with the RISC model.
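    The validation logic, checking that an observed incidence falls inside the 95% credibility interval of simulated incidences, can be sketched with a toy binomial simulator standing in for the RISC model (the sample size and incidence are taken from the abstract; everything else is illustrative):

```python
import random

random.seed(3)

# Toy stand-in for the model: each simulated run draws 13-year CHD events
# for n patients with a fixed per-patient probability (n = 3,478 and the
# 10.60% observed incidence come from the abstract).
def simulate_incidence(n=3478, p_event=0.106):
    events = sum(1 for _ in range(n) if random.random() < p_event)
    return events / n

runs = sorted(simulate_incidence() for _ in range(1000))
lo, hi = runs[24], runs[974]      # approximate 95% credibility interval
observed = 0.1060                 # observed CHD incidence at year 13
print(lo <= observed <= hi)       # True -> observation consistent with model
```

    The real model's credibility intervals also absorb parameter and structural uncertainty, so they are wider than this pure sampling interval; the containment check, however, has the same form.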

  7. Competency-based learning in an ambulatory care setting: Implementation of simulation training in the Ambulatory Care Rotation during the final year of the MaReCuM model curriculum.

    PubMed

    Dusch, Martin; Narciß, Elisabeth; Strohmer, Renate; Schüttpelz-Brauns, Katrin

    2018-01-01

    Aim: As part of the MaReCuM model curriculum at the Medical Faculty Mannheim of Heidelberg University, a final-year rotation in ambulatory care was implemented and augmented to include ambulatory care simulation. In this paper we describe this ambulatory care simulation and its designated competency-based learning objectives, and evaluate the educational effect of the ambulatory care simulation training. Method: Seventy-five final-year medical students participated in the survey (response rate: 83%). The control group completed the ambulatory care rotation prior to the implementation of the ambulatory care simulation. The experimental group was required to participate in the simulation at the beginning of the final-year rotation in ambulatory care. A survey of both groups was conducted at the beginning and at the end of the rotation. The learning objectives were taken from the National Competency-based Catalogue of Learning Objectives for Undergraduate Medical Education (NKLM). Results: The ambulatory care simulation had no measurable influence on students' subjectively perceived learning progress, the evaluation of the ambulatory care rotation, or working in an ambulatory care setting. At the end of the rotation, participants in both groups reported having gained better insight into treating outpatients. At the beginning of the rotation, members of both groups assessed their competencies to be at the same level. The simulated ambulatory scenarios were evaluated by the participating students as well structured and easy to understand. The scenarios successfully created a sense of time pressure for those confronted with them. The ability to correctly fill out a narcotic prescription form as required was rated significantly higher by those who participated in the simulation. Participation in the ambulatory care simulation had no effect on the other competencies covered by the survey.
Discussion: The effect of the four instructional units comprising the ambulatory care simulation

  8. New Approaches to Quantifying Transport Model Error in Atmospheric CO2 Simulations

    NASA Technical Reports Server (NTRS)

    Ott, L.; Pawson, S.; Zhu, Z.; Nielsen, J. E.; Collatz, G. J.; Gregg, W. W.

    2012-01-01

    In recent years, much progress has been made in observing CO2 distributions from space. However, the use of these observations to infer source/sink distributions in inversion studies continues to be complicated by the difficulty of quantifying atmospheric transport model errors. We will present results from several different experiments designed to quantify different aspects of transport error using the Goddard Earth Observing System, Version 5 (GEOS-5) Atmospheric General Circulation Model (AGCM). In the first set of experiments, an ensemble of simulations is constructed using perturbations to parameters in the model's moist physics and turbulence parameterizations that control sub-grid scale transport of trace gases. Analysis of the ensemble spread and of the scales of temporal and spatial variability among the simulations allows insight into how parameterized, small-scale transport processes influence simulated CO2 distributions. In the second set of experiments, atmospheric tracers representing model error are constructed using observation-minus-analysis statistics from NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA). The goal of these simulations is to understand how errors in large-scale dynamics are distributed, and how they propagate in space and time, affecting trace gas distributions. These simulations will also be compared to results from NASA's Carbon Monitoring System Flux Pilot Project, which quantified the impact of uncertainty in satellite-constrained CO2 flux estimates on atmospheric mixing ratios, to assess the major factors governing uncertainty in global and regional trace gas distributions.

  9. A numerical model for simulation of bioremediation of hydrocarbons in aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munoz, J.F.; Irarrazaval, M.J.

    1998-03-01

    A numerical model was developed to describe the bioremediation of hydrocarbons in ground water aquifers considering aerobic degradation. The model solves the independent transport of three solutes (oxygen, hydrocarbons, and microorganisms) in ground water flow using the method of characteristics. Interactions between the three solutes, in which oxygen and hydrocarbons are consumed by microorganisms, are represented by Monod kinetics, solved using a Runge-Kutta method. Model simulations showed good agreement with the results of soil column experiments. The model was used to estimate the time needed to remediate the columns, which varied from one to two years.
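The reaction step of such a model can be sketched on its own: dual-Monod kinetics for hydrocarbon (S) and oxygen (O) consumption by biomass (X), integrated with a classic fourth-order Runge-Kutta scheme. All rate constants below are illustrative placeholders, not the paper's calibrated values.

```python
# Dual-Monod aerobic degradation kinetics, integrated with RK4.
# All parameter values are illustrative (assumed, not from the paper).
MU_MAX = 0.5        # 1/day, maximum specific growth rate
KS, KO = 2.0, 0.5   # mg/L, half-saturation constants for S and O
Y, F = 0.4, 3.0     # biomass yield; mass of O2 consumed per mass of S

def rates(y):
    """Right-hand side: (dS/dt, dO/dt, dX/dt) for state y = (S, O, X)."""
    S, O, X = y
    mu = MU_MAX * S / (KS + S) * O / (KO + O)   # dual-Monod growth rate
    dX = mu * X
    dS = -dX / Y           # substrate consumed to build biomass
    dO = F * dS            # oxygen consumed in proportion
    return (dS, dO, dX)

def rk4_step(y, dt):
    """One classic Runge-Kutta 4 step."""
    k1 = rates(y)
    k2 = rates(tuple(yi + dt / 2 * ki for yi, ki in zip(y, k1)))
    k3 = rates(tuple(yi + dt / 2 * ki for yi, ki in zip(y, k2)))
    k4 = rates(tuple(yi + dt * ki for yi, ki in zip(y, k3)))
    return tuple(yi + dt / 6 * (a + 2 * b + 2 * c + d)
                 for yi, a, b, c, d in zip(y, k1, k2, k3, k4))

y = (10.0, 8.0, 0.1)     # initial S, O, X in mg/L (assumed)
for _ in range(200):     # 20 days at dt = 0.1 day
    y = rk4_step(y, 0.1)
```

With these numbers the run is oxygen-limited: degradation stalls as O approaches zero, which is exactly the coupling between solutes that the full transport model resolves in space.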

  10. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  11. Modelling radiative transfer through ponded first-year Arctic sea ice with a plane-parallel model

    NASA Astrophysics Data System (ADS)

    Taskjelle, Torbjørn; Hudson, Stephen R.; Granskog, Mats A.; Hamre, Børge

    2017-09-01

    Under-ice irradiance measurements were made on ponded first-year pack ice along three transects during the ICE12 expedition north of Svalbard. Bulk transmittances (400-900 nm) were found to be on average 0.15-0.20 under bare ice, and 0.39-0.46 under ponded ice. Radiative transfer modelling was done with a plane-parallel model. While simulated transmittances deviate significantly from measured transmittances close to the edge of ponds, spatially averaged bulk transmittances agree well. That is, transect-average bulk transmittances, calculated using typical simulated transmittances for ponded and bare ice weighted by the fractional coverage of the two surface types, are in good agreement with the measured values. Radiative heating rates calculated from model output indicate that about 20 % of the incident solar energy is absorbed in bare ice, and 50 % in ponded ice (35 % in the pond itself, 15 % in the underlying ice). This large difference is due to the highly scattering surface scattering layer (SSL), which increases the albedo of the bare ice.
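The transect averaging described above is an area-weighted mean over the two surface types. A sketch using the midpoints of the reported transmittance ranges and an assumed melt-pond fraction (the pond fraction is illustrative, not a value from the paper):

```python
# Transect-average bulk transmittance: per-surface-type transmittances
# weighted by fractional coverage. Pond fraction is assumed.
T_bare, T_pond = 0.175, 0.425   # midpoints of 0.15-0.20 and 0.39-0.46
f_pond = 0.3                    # assumed melt-pond fraction
f_bare = 1.0 - f_pond

T_transect = f_bare * T_bare + f_pond * T_pond
```

For these inputs the transect average is 0.25, lying between the bare-ice and pond values as expected.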

  12. Comparison of Model and Observed Regional Temperature Changes During the Past 40 Years

    NASA Technical Reports Server (NTRS)

    Russell, Gary L.; Miller, James R.; Rind, David; Ruedy, Reto A.; Schmidt, Gavin A.; Sheth, Sukeshi

    1999-01-01

    Results are presented for six simulations of the Goddard Institute for Space Studies (GISS) global atmosphere-ocean model for the years 1950 to 2099. There are two control simulations with constant 1950 atmospheric composition from different initial states, two GHG experiments with observed greenhouse gases up to 1990 and compounded 0.5% annual CO2 increases thereafter, and two GHG+SO4 experiments with the same varying greenhouse gases plus varying tropospheric sulfate aerosols. Surface air temperature trends in the two GHG experiments are compared between themselves and with the observed temperature record between 1960 and 1998. All comparisons show high positive spatial correlation in the northern hemisphere except in summer, when the greenhouse signal is weakest. The GHG+SO4 experiments show weaker correlations. In the southern hemisphere, correlations are either weak or negative, which is in part due to the model's unrealistic interannual variability of southern sea ice cover. The model results imply that temperature changes due to forcing by increased greenhouse gases have risen above the level of regional interannual temperature variability in the northern hemisphere over the past 40 years. This period is thus an important test of the reliability of coupled climate models.

  13. Performance evaluation of an agent-based occupancy simulation model

    DOE PAGES

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing; ...

    2017-01-17

    Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types were first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.
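A stochastic occupant-presence model of the general kind evaluated here can be sketched as a two-state Markov chain stepped at 15-minute intervals, with arrival and departure probabilities that vary by hour. The probabilities below are illustrative, not the Occupancy Simulator's calibrated parameters.

```python
import random

# Two-state (absent/present) inhomogeneous Markov chain for one
# occupant's daily schedule. Probabilities are illustrative.
def p_arrive(hour):   # probability of absent -> present per 15-min step
    return 0.3 if 8 <= hour < 10 else 0.02

def p_depart(hour):   # probability of present -> absent per 15-min step
    return 0.3 if 17 <= hour < 19 else 0.02

def daily_schedule(rng):
    """Return 96 quarter-hour presence flags (1 = present)."""
    present, schedule = False, []
    for step in range(96):
        hour = step // 4
        if present:
            present = rng.random() >= p_depart(hour)
        else:
            present = rng.random() < p_arrive(hour)
        schedule.append(1 if present else 0)
    return schedule

rng = random.Random(7)
schedule = daily_schedule(rng)
occupied_hours = sum(schedule) / 4.0
```

Repeating the draw per occupant and per day yields the diverse, room-level stochastic schedules the abstract contrasts with static homogeneous ones.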

  15. Modeling and Simulation of Explosively Driven Electromechanical Devices

    NASA Astrophysics Data System (ADS)

    Demmie, Paul N.

    2002-07-01

    Components that store electrical energy in ferroelectric materials and produce currents when their permittivity is explosively reduced are used in a variety of applications. The modeling and simulation of such devices is a challenging problem since one has to represent the coupled physics of detonation, shock propagation, and electromagnetic field generation. The high fidelity modeling and simulation of complicated electromechanical devices was not feasible prior to having the Accelerated Strategic Computing Initiative (ASCI) computers and the ASCI developed codes at Sandia National Laboratories (SNL). The EMMA computer code is used to model such devices and simulate their operation. In this paper, I discuss the capabilities of the EMMA code for the modeling and simulation of one such electromechanical device, a slim-loop ferroelectric (SFE) firing set.

  16. Design, modeling, simulation and evaluation of a distributed energy system

    NASA Astrophysics Data System (ADS)

    Cultura, Ambrosio B., II

    This dissertation presents the design, modeling, simulation and evaluation of distributed energy resources (DER) consisting of photovoltaics (PV), wind turbines, batteries, a PEM fuel cell and supercapacitors. The distributed energy resources installed at UMass Lowell before the year 2000 consist of the following: 2.5 kW PV, 44 kWh of lead-acid batteries, and 1500 W, 500 W and 300 W wind turbines. Recently added are the following: a 10.56 kW PV array, a 2.4 kW wind turbine, 29 kWh of lead-acid batteries, a 1.2 kW PEM fuel cell and four 140 F supercapacitors. Each newly added energy resource was designed, modeled, simulated and evaluated before its integration into the existing PV/wind grid-connected system. The mathematical and Simulink model of each system was derived and validated by comparing the simulated and experimental results. The simulated results of energy generated from the 10.56 kW PV system are in good agreement with the experimental results. A detailed electrical model of the 2.4 kW wind turbine system equipped with a permanent magnet generator, diode rectifier, boost converter and inverter is presented. The analysis of the results demonstrates the effectiveness of the constructed Simulink model, which can be used to predict the performance of the wind turbine. It was observed that the PEM fuel cell has a very fast response to load changes. Moreover, the model has validated the actual operation of the PEM fuel cell, showing that the simulated results in Matlab Simulink are consistent with the experimental results. The equivalent mathematical equation, derived from an electrical model of the supercapacitor, is used to simulate its voltage response. The model is fully capable of simulating the supercapacitor's voltage behavior, and can predict the charge and discharge times of the voltage on the supercapacitor. A bi-directional dc-dc converter was designed to connect the 48 V battery bank storage to the 24 V battery bank storage. This connection was
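The supercapacitor voltage response mentioned above can be sketched with a first-order RC model discharging through a constant resistance, V(t) = V0·exp(−t/RC). The capacitance matches the 140 F units cited; R and V0 are assumed values for illustration.

```python
import math

# First-order RC sketch of supercapacitor discharge voltage.
# C matches the cited 140 F capacitors; R and V0 are assumed.
C = 140.0    # farads
R = 10.0     # ohms, assumed discharge resistance
V0 = 24.0    # volts, assumed initial voltage

def voltage(t):
    """Terminal voltage t seconds into a constant-R discharge."""
    return V0 * math.exp(-t / (R * C))

half_life = R * C * math.log(2)   # time for the voltage to halve
```

The same closed form gives the discharge-time prediction the abstract describes: solve V(t) = V_target for t.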

  17. Modeling and Simulation of Nanoindentation

    NASA Astrophysics Data System (ADS)

    Huang, Sixie; Zhou, Caizhi

    2017-11-01

    Nanoindentation is a hardness test method applied to small volumes of material; it can reveal unique small-scale effects and has sparked many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with these computational approaches because of their length scales, predictive capability, and accuracy. This article reviews recent progress and challenges in the modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.

  18. SpaceNet: Modeling and Simulating Space Logistics

    NASA Technical Reports Server (NTRS)

    Lee, Gene; Jordan, Elizabeth; Shishko, Robert; de Weck, Olivier; Armar, Nii; Siddiqi, Afreen

    2008-01-01

    This paper summarizes the current state of the art in interplanetary supply chain modeling and discusses SpaceNet as one particular method and tool to address space logistics modeling and simulation challenges. Fundamental upgrades to the interplanetary supply chain framework such as process groups, nested elements, and cargo sharing, enabled SpaceNet to model an integrated set of missions as a campaign. The capabilities and uses of SpaceNet are demonstrated by a step-by-step modeling and simulation of a lunar campaign.

  19. A Simulation Model Articulation of the REA Ontology

    NASA Astrophysics Data System (ADS)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  20. Development of a Precipitation-Runoff Model to Simulate Unregulated Streamflow in the Salmon Creek Basin, Okanogan County, Washington

    USGS Publications Warehouse

    van Heeswijk, Marijke

    2006-01-01

    Surface water has been diverted from the Salmon Creek Basin for irrigation purposes since the early 1900s, when the Bureau of Reclamation built the Okanogan Project. Spring snowmelt runoff is stored in two reservoirs, Conconully Reservoir and Salmon Lake Reservoir, and gradually released during the growing season. As a result of the out-of-basin streamflow diversions, the lower 4.3 miles of Salmon Creek typically has been a dry creek bed for almost 100 years, except during the spring snowmelt season during years of high runoff. To continue meeting the water needs of irrigators but also leave water in lower Salmon Creek for fish passage and to help restore the natural ecosystem, changes are being considered in how the Okanogan Project is operated. This report documents development of a precipitation-runoff model for the Salmon Creek Basin that can be used to simulate daily unregulated streamflows. The precipitation-runoff model is a component of a Decision Support System (DSS) that includes a water-operations model the Bureau of Reclamation plans to develop to study the water resources of the Salmon Creek Basin. The DSS will be similar to the DSS that the Bureau of Reclamation and the U.S. Geological Survey developed previously for the Yakima River Basin in central southern Washington. The precipitation-runoff model was calibrated for water years 1950-89 and tested for water years 1990-96. The model was used to simulate daily streamflows that were aggregated on a monthly basis and calibrated against historical monthly streamflows for Salmon Creek at Conconully Dam. Additional calibration data were provided by the snowpack water-equivalent record for a SNOTEL station in the basin. Model input time series of daily precipitation and minimum and maximum air temperatures were based on data from climate stations in the study area. Historical records of unregulated streamflow for Salmon Creek at Conconully Dam do not exist for water years 1950-96. Instead, estimates of

  1. Coupling of Large Eddy Simulations with Meteorological Models to simulate Methane Leaks from Natural Gas Storage Facilities

    NASA Astrophysics Data System (ADS)

    Prasad, K.

    2017-12-01

    Atmospheric transport is usually performed with weather models, e.g., the Weather Research and Forecasting (WRF) model, which employs a parameterized turbulence model and does not resolve the fine-scale dynamics generated by the flow around the buildings and features comprising a large city. The NIST Fire Dynamics Simulator (FDS) is a computational fluid dynamics model that utilizes large eddy simulation methods to model flow around buildings at length scales much smaller than is practical with models like WRF. FDS has the potential to evaluate the impact of complex topography on near-field dispersion and mixing that is difficult to simulate with a mesoscale atmospheric model. A methodology has been developed to couple the FDS model with WRF mesoscale transport models. The coupling is based on nudging the FDS flow field towards that computed by WRF, and is currently limited to one-way coupling performed in an off-line mode. This approach allows the FDS model to operate as a sub-grid scale model within a WRF simulation. To test and validate the coupled FDS-WRF model, the methane leak from the Aliso Canyon underground storage facility was simulated. Large eddy simulations were performed over the complex topography of several natural gas storage facilities, including Aliso Canyon, Honor Rancho and MacDonald Island, at 10 m horizontal and vertical resolution. The goals of these simulations included improving and validating transport models as well as testing leak hypotheses. Forward simulation results were compared with aircraft- and tower-based in-situ measurements as well as methane plumes observed using the NASA Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) and the next-generation instrument AVIRIS-NG. Comparison of simulation results with measurement data demonstrates the capability of the coupled FDS-WRF models to accurately simulate the transport and dispersion of methane plumes over urban domains.
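The nudging that ties the LES field to the mesoscale field is a Newtonian relaxation: at each step the LES value is pulled toward the WRF value with a relaxation timescale. A toy single-variable sketch (the real coupling relaxes full 3-D fields; all numbers here are illustrative):

```python
# Newtonian relaxation (nudging) of an LES wind toward a mesoscale
# driver value. Values are illustrative.
def nudge(u_les, u_wrf, dt, tau):
    """One nudging step: relax u_les toward u_wrf with timescale tau."""
    return u_les + dt * (u_wrf - u_les) / tau

u_les, u_wrf = 2.0, 5.0   # m/s: LES wind and WRF wind at the same point
dt, tau = 1.0, 60.0       # s: time step and relaxation timescale
for _ in range(600):      # 10 minutes of simulated time
    u_les = nudge(u_les, u_wrf, dt, tau)
```

The LES field approaches the driver value exponentially with e-folding time tau, while shorter-lived LES-resolved fluctuations remain free to develop, which is the intent of the one-way coupling.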
Simulated integrated methane enhancements will be presented and

  2. Turbulence modeling for Francis turbine water passages simulation

    NASA Astrophysics Data System (ADS)

    Maruzewski, P.; Hayashi, H.; Munch, C.; Yamaishi, K.; Hashii, T.; Mombelli, H. P.; Sugow, Y.; Avellan, F.

    2010-08-01

    Applications of Computational Fluid Dynamics (CFD) to hydraulic machines require the ability to handle turbulent flows and to take into account the effects of turbulence on the mean flow. Nowadays, Direct Numerical Simulation (DNS) is still not a good candidate for hydraulic machine simulations because of its prohibitive computational cost. Large Eddy Simulation (LES), although in the same category as DNS, could be an alternative whereby only the small-scale turbulent fluctuations are modeled and the larger-scale fluctuations are computed directly. Nevertheless, Reynolds-Averaged Navier-Stokes (RANS) models have become the widespread standard basis for numerous hydraulic machine design procedures. However, for many applications involving wall-bounded flows and attached boundary layers, various hybrid combinations of LES and RANS are being considered, such as Detached Eddy Simulation (DES), whereby the RANS approximation is kept in the regions where the boundary layers are attached to solid walls. Furthermore, the accuracy of CFD simulations is highly dependent on grid quality, in terms of grid uniformity in complex configurations. Moreover, any successful structured or unstructured CFD code has to offer a wide range of turbulence models, from the classic RANS models to complex hybrid models. The aim of this study is to compare the behavior of turbulent simulations on both structured and unstructured grid topologies with two different CFD codes applied to the same Francis turbine. Hence, the study is intended to outline the discrepancies encountered in predicting the wake of the turbine blades when using either the standard k-epsilon model or the SST shear stress transport model in a steady CFD simulation. Finally, comparisons are made with experimental data from reduced-scale model measurements at the EPFL Laboratory for Hydraulic Machines.

  3. Enabling co-simulation of tokamak plant models and plasma control systems

    DOE PAGES

    Walker, M. L.

    2017-12-22

    A system for connecting the Plasma Control System and a model of the tokamak Plant in closed-loop co-simulation for plasma control development has been in routine use at DIII-D for more than 20 years, and at other fusion labs that use variants of the DIII-D PCS for approximately the last decade. Here, co-simulation refers to the simultaneous execution of two independent codes with the exchange of data - Plant actuator commands and tokamak diagnostic data - between them during execution. Interest in this type of PCS-Plant simulation technology has also been growing recently at other fusion facilities. In fact, use of such closed-loop control simulations is assumed to play an even larger role in the development of both the ITER Plasma Control System (PCS) and the experimental operation of the ITER device, where they will be used to support verification/validation of the PCS and also for ITER pulse schedule development and validation. We describe the key use cases that motivate the co-simulation capability and the features that must be provided by the Plasma Control System to support it. These features could be provided by the PCS itself or by a model of the PCS. If the PCS itself is chosen to provide them, there are requirements imposed on its architecture. If a PCS model is chosen, there are requirements imposed on the initial implementation of this simulation as well as long-term consequences for its continued development and maintenance. We describe these issues for each use case and discuss the relative merits of the two choices. Several examples are given illustrating uses of the co-simulation method to address problems of plasma control during the operation of DIII-D and of other devices that use the DIII-D PCS.
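The co-simulation loop described above, actuator commands out and diagnostic data back on every cycle, can be sketched with a toy proportional controller and a first-order plant. All names and numbers are illustrative, not DIII-D parameters.

```python
# Toy closed-loop co-simulation: controller and plant exchange data
# once per cycle. Gains, timescales, and units are illustrative.
def pcs(diagnostic_ip, target_ip, gain=0.5):
    """Control side: read diagnostic data, return an actuator command."""
    return gain * (target_ip - diagnostic_ip)

def plant(ip, command, dt=0.01, tau=0.05):
    """Plant side: first-order plasma-current response to the command."""
    return ip + dt * (command - ip) / tau

ip, target = 0.0, 1.0          # plasma current in MA (illustrative)
for _ in range(1000):          # one data exchange per cycle
    cmd = pcs(ip, target)      # PCS receives diagnostics, sends command
    ip = plant(ip, cmd)        # plant executes command, returns new data
```

With a purely proportional controller the loop settles at ip = gain/(1+gain)·target (here 1/3 MA), a steady-state offset of exactly the kind that closed-loop co-simulation is used to expose before a control scheme runs on the machine.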

  5. State-and-transition simulation models: a framework for forecasting landscape change

    USGS Publications Warehouse

    Daniel, Colin; Frid, Leonardo; Sleeter, Benjamin M.; Fortin, Marie-Josée

    2016-01-01

    A wide range of spatially explicit simulation models have been developed to forecast landscape dynamics, including models for projecting changes in both vegetation and land use. While these models have generally been developed as separate applications, each with a separate purpose and audience, they share many common features. We present a general framework, called a state-and-transition simulation model (STSM), which captures a number of these common features, accompanied by a software product, called ST-Sim, to build and run such models. The STSM method divides a landscape into a set of discrete spatial units and simulates the discrete state of each cell forward as a discrete-time inhomogeneous stochastic process. The method differs from a spatially interacting Markov chain in several important ways, including the ability to add discrete counters such as age and time-since-transition as state variables, to specify one-step transition rates as either probabilities or target areas, and to represent multiple types of transitions between pairs of states. We demonstrate the STSM method using a model of land-use/land-cover (LULC) change for the state of Hawai'i, USA. Processes represented in this example include expansion/contraction of agricultural lands, urbanization, wildfire, shrub encroachment into grassland and harvest of tree plantations; the model also projects shifts in moisture zones due to climate change. Key model output includes projections of the future spatial and temporal distribution of LULC classes and moisture zones across the landscape over the next 50 years. State-and-transition simulation models can be applied to a wide range of landscapes, including questions of both land-use change and vegetation dynamics. Because the method is inherently stochastic, it is well suited for characterizing uncertainty in model projections. When combined with the ST-Sim software, STSMs offer a simple yet powerful means for developing a wide range of models of
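A minimal STSM in the spirit described: each cell carries a discrete state plus a time-since-transition counter, and steps forward with annual transition probabilities. The states and rates below are illustrative placeholders, not the Hawai'i model's.

```python
import random

# Minimal state-and-transition simulation: discrete cell states with
# a time-since-transition counter. States and rates are illustrative.
TRANSITIONS = {
    "grassland":   [("shrubland", 0.02)],   # shrub encroachment
    "shrubland":   [("grassland", 0.01)],   # e.g., wildfire resets
    "agriculture": [("urban", 0.005)],      # urbanization
    "urban":       [],                      # absorbing in this sketch
}

def step_cell(state, age, rng):
    """Advance one cell one year; reset the counter on a transition."""
    for target, p in TRANSITIONS[state]:
        if rng.random() < p:
            return target, 0
    return state, age + 1

rng = random.Random(3)
cells = [("grassland", 0)] * 500 + [("agriculture", 0)] * 500
for year in range(50):                       # 50-year projection
    cells = [step_cell(s, a, rng) for s, a in cells]
urban = sum(1 for s, _ in cells if s == "urban")
```

Running many such stochastic replicates, rather than a single deterministic projection, is what lets an STSM characterize uncertainty in the projected LULC distribution.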

  6. Modeling human response errors in synthetic flight simulator domain

    NASA Technical Reports Server (NTRS)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling to integrate the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. The models will be verified experimentally in a flight handling-qualities simulation.

  7. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The model is informed by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate its features.

  8. Modeling of Protection in Dynamic Simulation Using Generic Relay Models and Settings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samaan, Nader A.; Dagle, Jeffery E.; Makarov, Yuri V.

    This paper shows how generic protection relay models available in planning tools can be augmented with settings that are based on NERC standards or best engineering practice. Selected generic relay models in Siemens PSS®E have been used in dynamic simulations in the proposed approach. Undervoltage, overvoltage, underfrequency, and overfrequency relays have been modeled for each generating unit. Distance-relay protection was modeled for transmission system protection. Two types of load-shedding schemes were modeled: underfrequency (frequency-responsive non-firm load shedding) and underfrequency and undervoltage firm load shedding. Several case studies are given to show the impact of protection devices on dynamic simulations. This is useful for simulating cascading outages.

  9. Towards Automatic Processing of Virtual City Models for Simulations

    NASA Astrophysics Data System (ADS)

    Piepereit, R.; Schilling, A.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2016-10-01

    Especially in the field of numerical simulations, such as flow and acoustic simulations, interest in using virtual 3D models to optimize urban systems is increasing. The few instances in which such simulations have already been carried out in practice involved an extremely high, and therefore uneconomical, manual effort to process the models. The different ways models are captured in Geographic Information Systems (GIS) and Computer Aided Engineering (CAE) further increase the already very high complexity of the processing. To obtain virtual 3D models suitable for simulation, we developed a tool for automatic processing with the goal of establishing ties between the worlds of GIS and CAE. In this paper we introduce a way to use Coons surfaces for the automatic processing of building models in LoD2, and investigate ways to simplify LoD3 models in order to remove information unnecessary for a numerical simulation.

  10. Multi-time scale Climate Informed Stochastic Hybrid Simulation-Optimization Model (McISH model) for Multi-Purpose Reservoir System

    NASA Astrophysics Data System (ADS)

    Lu, M.; Lall, U.

    2013-12-01

    In order to mitigate the impacts of climate change, proactive management strategies to operate reservoirs and dams are needed. A multi-time scale climate informed stochastic model is developed to optimize the operations of a multi-purpose single reservoir by simulating decadal, interannual, seasonal and sub-seasonal variability. We apply the model to a setting motivated by the largest multi-purpose dam in N. India, the Bhakhra reservoir on the Sutlej River, a tributary of the Indus. This leads to a focus on the timing and amplitude of flows in the monsoon and snowmelt periods. The flow simulations are constrained by multiple sources of historical data and GCM future projections being developed through an NSF-funded project titled 'Decadal Prediction and Stochastic Simulation of Hydroclimate Over Monsoon Asia'. The model presented is a multilevel, nonlinear programming model that aims to optimize the reservoir operating policy on a decadal horizon and the operation strategy on an annually updated basis. The model is hierarchical: two optimization models designated for different time scales are nested like matryoshka dolls. The two optimization models have similar mathematical formulations, with some modifications to meet the constraints within each time frame. The first level of the model provides an optimization solution for policy makers to determine contracted annual releases to different uses with a prescribed reliability; the second level is a within-the-period (e.g., year) operation optimization scheme that allocates the contracted annual releases on a subperiod (e.g., monthly) basis, with additional benefit for extra release and penalty for failure. The model maximizes the net benefit of irrigation, hydropower generation and flood control in each of the periods. The model design thus facilitates the consistent application of weather and climate forecasts to improve operations of reservoir systems. The

  11. Evolution Model and Simulation of Profit Model of Agricultural Products Logistics Financing

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Wu, Yan

    2018-03-01

    Agricultural products logistics warehouse financing mainly involves three parties: agricultural production and processing enterprises, third-party logistics enterprises, and financial institutions. To enable the three parties to achieve a win-win situation, the article first derives the replicator dynamics and evolutionarily stable strategies governing the three parties' business participation. It then uses the NetLogo simulation platform and the overall Multi-Agent modeling and simulation method to establish an evolutionary game simulation model, runs the model under different revenue parameters, and finally analyzes the simulation results. The aim is a mutually beneficial, win-win arrangement among the three parties in the warehouse financing business, thereby promoting the smooth flow of agricultural products logistics.
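The replicator-dynamics core of such an evolutionary game can be sketched briefly. For compactness this uses a single population with two strategies and an invented 2x2 payoff matrix, a hypothetical stand-in for the paper's three-party game:

```python
# Illustrative discrete replicator dynamics: the share x playing strategy 0
# grows when strategy 0's fitness exceeds the population average.
# Payoffs below are made up, not taken from the warehousing-business game.
def replicator_step(x, payoff, dt=0.1):
    """x: share playing strategy 0; payoff: 2x2 matrix payoff[i][j]."""
    f0 = payoff[0][0] * x + payoff[0][1] * (1 - x)   # fitness of strategy 0
    f1 = payoff[1][0] * x + payoff[1][1] * (1 - x)   # fitness of strategy 1
    favg = x * f0 + (1 - x) * f1                     # population average
    return x + dt * x * (f0 - favg)                  # dx/dt = x * (f0 - favg)

def evolve(x0, payoff, steps=500):
    x = x0
    for _ in range(steps):
        x = replicator_step(x, payoff)
    return x
```

With a prisoner's-dilemma payoff such as `[[3, 0], [5, 1]]`, cooperation (strategy 0) is driven out regardless of the starting share, which is the kind of evolutionarily stable outcome the abstract's three-party analysis searches for under different revenue parameters.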

  12. Evaluation of air traffic control models and simulations.

    DOT National Transportation Integrated Search

    1971-06-01

    Approximately two hundred reports were identified as describing Air Traffic Control (ATC) modeling and simulation efforts. Of these, about ninety analytical and simulation models dealing with virtually all aspects of ATC were formally evaluated. The ...

  13. Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dali; Yuan, Fengming; Hernandez, Benjamin

    Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.

  14. Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations

    DOE PAGES

    Wang, Dali; Yuan, Fengming; Hernandez, Benjamin; ...

    2017-01-01

    Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.

  15. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.

  16. PSH Transient Simulation Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muljadi, Eduard

    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  17. The millennium water vapour drop in chemistry-climate model simulations

    NASA Astrophysics Data System (ADS)

    Brinkop, Sabine; Dameris, Martin; Jöckel, Patrick; Garny, Hella; Lossow, Stefan; Stiller, Gabriele

    2016-07-01

    This study investigates the abrupt and severe water vapour decline in the stratosphere beginning in the year 2000 (the "millennium water vapour drop") and other similarly strong stratospheric water vapour reductions by means of various simulations with the state-of-the-art Chemistry-Climate Model (CCM) EMAC (ECHAM/MESSy Atmospheric Chemistry Model). The model simulations differ with respect to the prescribed sea surface temperatures (SSTs) and whether nudging is applied or not. The CCM EMAC most closely reproduces the signature and pattern of the water vapour drop, in agreement with those derived from satellite observations, when the model is nudged. Model results confirm that this extraordinary water vapour decline is particularly obvious in the tropical lower stratosphere and is related to a large decrease in cold point temperature. The drop signal propagates under dilution to the higher stratosphere and to the poles via the Brewer-Dobson circulation (BDC). We found that the driving forces for this significant decline in water vapour mixing ratios are tropical sea surface temperature (SST) changes due to a coincidence with a preceding strong El Niño-Southern Oscillation event (1997/1998) followed by a strong La Niña event (1999/2000) and supported by the change of the westerly to the easterly phase of the equatorial stratospheric quasi-biennial oscillation (QBO) in 2000. Correct (observed) SSTs are important for triggering the strong decline in water vapour. There are indications that, at least partly, SSTs contribute to the long period of low water vapour values from 2001 to 2006. For this period, the specific dynamical state of the atmosphere (overall atmospheric large-scale wind and temperature distribution) is important as well, as it causes the observed persistent low cold point temperatures. These are induced by a period of increased upwelling, which, however, has no corresponding pronounced signature in SST anomalies in the tropics. Our free

  18. Simulated permafrost soil thermal dynamics during 1960-2009 in eight offline process-based models

    NASA Astrophysics Data System (ADS)

    Peng, S.; Gouttevin, I.; Krinner, G.; Ciais, P.

    2013-12-01

    Permafrost soil thermal dynamics not only determine the status of permafrost, but also have large impacts on permafrost organic carbon decomposition. Here, we used eight process-based models that participated in the Vulnerability of Permafrost Carbon Research Coordination Network (RCN) project to investigate: (1) the trends in soil temperature at different depths over the Northern Hemisphere permafrost region during the past five decades, and (2) which factors drive the trends and inter-annual variability of permafrost soil temperature. The simulated annual soil temperature at 20 cm increases by ~0.02 °C per year from 1960 to 2009 (ranging from 0.00 °C per year in CoLM to 0.04 °C per year in ISBA). Most models simulated more warming of soil in spring and winter than in summer and autumn, although the seasonal trends differed among models. Trends in soil temperature decrease with soil depth in all models. To quantify the contributions of various factors (air temperature, precipitation, downward longwave radiation, etc.) to trends and inter-annual variation in soil temperature, we ran the offline models with detrended air temperature, precipitation, and downward longwave radiation, respectively. Our results suggest that both annual air temperature and downward longwave radiation significantly correlate with annual soil temperature. Moreover, the trends in air temperature and downward longwave radiation contribute about 30% and 60%, respectively, to the trend in soil temperature (0-200 cm) during the period 1960-2009. (Figure: spatial distributions of the trend in annual soil temperature at 20 cm from R01 simulations of (a) CLM4, (b) CoLM, (c) ISBA, (d) JULES, (e) LPJ_GUESS, (f) ORCHIDEE, (g) UVic and (h) UW-VIC during the period 1960-2009.)
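The attribution experiment in this record rests on one operation: removing the linear trend from a forcing series while preserving its mean, so that a re-run with the detrended forcing isolates that driver's contribution. A minimal sketch, using a generic least-squares fit and synthetic data rather than the RCN forcing:

```python
# Fit and remove a linear trend from an annual series, preserving the mean.
def linear_trend(y):
    """Return (slope, intercept) of the ordinary least-squares line."""
    n = len(y)
    xbar, ybar = (n - 1) / 2, sum(y) / n
    sxy = sum((x - xbar) * (v - ybar) for x, v in enumerate(y))
    sxx = sum((x - xbar) ** 2 for x in range(n))
    slope = sxy / sxx
    return slope, ybar - slope * xbar

def detrend(y):
    """Subtract the fitted line, then add the mean back."""
    slope, intercept = linear_trend(y)
    ybar = sum(y) / len(y)
    return [v - (slope * i + intercept) + ybar for i, v in enumerate(y)]
```

Running the model once with the observed forcing and once with `detrend(forcing)` and differencing the simulated soil-temperature trends is the kind of contribution estimate the 30%/60% figures above summarize.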

  19. Historical Development of Simulation Models of Recreation Use

    Treesearch

    Jan W. van Wagtendonk; David N. Cole

    2005-01-01

    The potential utility of modeling as a park and wilderness management tool has been recognized for decades. Romesburg (1974) explored how mathematical decision modeling could be used to improve decisions about regulation of wilderness use. Cesario (1975) described a computer simulation modeling approach that utilized GPSS (General Purpose Systems Simulator), a...

  20. Modeling 100,000-year climate fluctuations in pre-Pleistocene time series

    NASA Technical Reports Server (NTRS)

    Crowley, Thomas J.; Kim, Kwang-Yul; Mengel, John G.; Short, David A.

    1992-01-01

    A number of pre-Pleistocene climate records exhibit significant fluctuations at the 100,000-year (100-ky) eccentricity period, before the time of such fluctuations in global ice volume. The origin of these fluctuations has been obscure. Results reported here from a modeling study suggest that such a response can occur over low-altitude land areas involved in monsoon fluctuations. The twice yearly passage of the sun across the equator and the seasonal timing of perihelion interact to increase both 100-ky and 400-ky power in the modeled temperature field. The magnitude of the temperature response is sufficiently large to leave an imprint on the geologic record, and simulated fluctuations resemble those found in records of Triassic lake levels.

  1. Applying dynamic simulation modeling methods in health care delivery research-the SIMULATE checklist: report of the ISPOR simulation modeling emerging good practices task force.

    PubMed

    Marshall, Deborah A; Burgos-Liz, Lina; IJzerman, Maarten J; Osgood, Nathaniel D; Padula, William V; Higashi, Mitchell K; Wong, Peter K; Pasupathy, Kalyan S; Crown, William

    2015-01-01

    Health care delivery systems are inherently complex, consisting of multiple tiers of interdependent subsystems and processes that are adaptive to changes in the environment and behave in a nonlinear fashion. Traditional health technology assessment and modeling methods often neglect the wider health system impacts that can be critical for achieving desired health system goals and are often of limited usefulness when applied to complex health systems. Researchers and health care decision makers can either underestimate or fail to consider the interactions among the people, processes, technology, and facility designs. Health care delivery system interventions need to incorporate the dynamics and complexities of the health care system context in which the intervention is delivered. This report provides an overview of common dynamic simulation modeling methods and examples of health care system interventions in which such methods could be useful. Three dynamic simulation modeling methods are presented to evaluate system interventions for health care delivery: system dynamics, discrete event simulation, and agent-based modeling. In contrast to conventional evaluations, a dynamic systems approach incorporates the complexity of the system and anticipates the upstream and downstream consequences of changes in complex health care delivery systems. This report assists researchers and decision makers in deciding whether these simulation methods are appropriate to address specific health system problems through an eight-point checklist referred to as the SIMULATE (System, Interactions, Multilevel, Understanding, Loops, Agents, Time, Emergence) tool. It is a primer for researchers and decision makers working in health care delivery and implementation sciences who face complex challenges in delivering effective and efficient care that can be addressed with system interventions. On reviewing this report, the readers should be able to identify whether these simulation modeling

  2. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time durations of the activities are modeled using the function "rand()". Patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
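A minimal Python analogue of the spreadsheet mechanics, with uniform random durations standing in for `rand()` and a first-in-first-served resource queue standing in for the nested `if()` tracking, might look like this (the activity names and durations are invented, not the paper's 28 modeling elements):

```python
import random

# Each patient passes through a fixed sequence of activities; each activity
# has one resource, released in first-in-first-served order, and a uniform
# random duration in hours (illustrative ranges only).
ACTIVITIES = [("admission", 0.5, 1.0), ("surgery", 2.0, 5.0), ("recovery", 24.0, 72.0)]

def simulate_patients(n, seed=1):
    rng = random.Random(seed)
    free_at = {name: 0.0 for name, _, _ in ACTIVITIES}  # when each resource frees up
    finish = []
    for arrival in range(n):                 # patients arrive one per hour
        t = float(arrival)
        for name, lo, hi in ACTIVITIES:
            t = max(t, free_at[name])        # FIFO: wait until the resource is free
            t += rng.uniform(lo, hi)         # stochastic activity duration
            free_at[name] = t
        finish.append(t)                     # discharge time of this patient
    return finish
```

As in the spreadsheet, no event calendar is needed: tracking only each resource's next-free time is enough to reproduce a FIFO flow.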

  3. Temperature responses to the 11 year solar cycle in the mesosphere from the 31 year (1979-2010) extended Canadian Middle Atmosphere Model simulations and a comparison with the 14 year (2002-2015) TIMED/SABER observations

    NASA Astrophysics Data System (ADS)

    Gan, Quan; Du, Jian; Fomichev, Victor I.; Ward, William E.; Beagley, Stephen R.; Zhang, Shaodong; Yue, Jia

    2017-04-01

    A recent 31 year simulation (1979-2010) by the extended Canadian Middle Atmosphere Model (eCMAM30) and the 14 year (2002-2015) observations by the Thermosphere Ionosphere Mesosphere Energetics and Dynamics/Sounding of the Atmosphere using Broadband Emission Radiometry (TIMED/SABER) instrument are utilized to investigate the temperature response to the 11 year solar cycle in the mesosphere. Overall, the zonal mean responses tend to increase with height, and the amplitudes are on the order of 1-2 K/100 solar flux units (1 sfu = 10^-22 W m^-2 Hz^-1) below 80 km and 2-4 K/100 sfu in the mesopause region (80-100 km) in the eCMAM30, comparatively weaker than those from SABER except in the midlatitude lower mesosphere. Good agreement is found at around 75-80 km, with a response of 1.5 K/100 sfu within 10°S/N. A symmetric pattern of the responses about the equator also agrees reasonably well between the two. It is noteworthy that the eCMAM30 displays an alternating structure of upper stratospheric cooling and lower mesospheric warming at midlatitudes of the winter hemisphere, consistent with the long-term Rayleigh lidar observations reported in previous studies. Diagnosis of multiple dynamical parameters shows that this localized feature is induced by the anomalous residual circulation resulting from wave-mean flow interaction during solar maximum years.

  4. Seven years of surface ozone in a coastal city of central Italy: Observations and models

    NASA Astrophysics Data System (ADS)

    Biancofiore, Fabio; Verdecchia, Marco; Di Carlo, Piero; Tomassetti, Barbara; Aruffo, Eleonora; Busilacchio, Marcella; Bianco, Sebastiano; Di Tommaso, Sinibaldo; Colangeli, Carlo

    2014-05-01

    Hourly concentrations of ozone (O3) and nitrogen dioxide (NO2) were measured for seven years, from 1998 to 2005, in a seaside town in central Italy. Seasonal trends of O3 and NO2 recorded over these years are studied. Furthermore, we focus on the data collected during 2005, analyzing them using two different methods: a regression model and a neural network model. Both models are used to simulate the hourly ozone concentration using several sets of input. To evaluate the performance of the models, four statistical criteria are used: correlation coefficient (R), fractional bias (FB), normalized mean squared error (NMSE), and factor of two (FA2). All the criteria show that the neural network outperforms the regression model in all the simulations. In addition, we tested some improvements to the neural network model; the results of these tests are discussed. Finally, we used the neural network to forecast the hourly ozone concentrations a day ahead and 1, 3, 6, and 12 hours ahead. The model's performance in predicting ozone levels is discussed.
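For reference, the four criteria named in the abstract can be computed as below. These are the definitions commonly used in air-quality model evaluation; treat them as assumptions, since the paper may use slight variants (o = observed, p = predicted):

```python
from statistics import mean

def pearson_r(o, p):
    """Correlation coefficient R."""
    ob, pb = mean(o), mean(p)
    num = sum((x - ob) * (y - pb) for x, y in zip(o, p))
    den = (sum((x - ob) ** 2 for x in o) * sum((y - pb) ** 2 for y in p)) ** 0.5
    return num / den

def fractional_bias(o, p):
    """FB: 0 for unbiased predictions; bounded in (-2, 2)."""
    return 2 * (mean(o) - mean(p)) / (mean(o) + mean(p))

def nmse(o, p):
    """Normalized mean squared error."""
    return mean([(x - y) ** 2 for x, y in zip(o, p)]) / (mean(o) * mean(p))

def fa2(o, p):
    """Fraction of predictions within a factor of two of the observations."""
    return mean([1.0 if 0.5 <= y / x <= 2.0 else 0.0 for x, y in zip(o, p)])
```

A perfect model gives R = 1, FB = 0, NMSE = 0, and FA2 = 1, which is the direction in which the neural network's scores improve on the regression model's.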

  5. Large-scale tropospheric transport in the Chemistry-Climate Model Initiative (CCMI) simulations

    NASA Astrophysics Data System (ADS)

    Orbe, Clara; Yang, Huang; Waugh, Darryn W.; Zeng, Guang; Morgenstern, Olaf; Kinnison, Douglas E.; Lamarque, Jean-Francois; Tilmes, Simone; Plummer, David A.; Scinocca, John F.; Josse, Beatrice; Marecal, Virginie; Jöckel, Patrick; Oman, Luke D.; Strahan, Susan E.; Deushi, Makoto; Tanaka, Taichu Y.; Yoshida, Kohei; Akiyoshi, Hideharu; Yamashita, Yousuke; Stenke, Andreas; Revell, Laura; Sukhodolov, Timofei; Rozanov, Eugene; Pitari, Giovanni; Visioni, Daniele; Stone, Kane A.; Schofield, Robyn; Banerjee, Antara

    2018-05-01

    Understanding and modeling the large-scale transport of trace gases and aerosols is important for interpreting past (and projecting future) changes in atmospheric composition. Here we show that there are large differences in the global-scale atmospheric transport properties among the models participating in the IGAC SPARC Chemistry-Climate Model Initiative (CCMI). Specifically, we find up to 40 % differences in the transport timescales connecting the Northern Hemisphere (NH) midlatitude surface to the Arctic and to Southern Hemisphere high latitudes, where the mean age ranges between 1.7 and 2.6 years. We show that these differences are related to large differences in vertical transport among the simulations, in particular to differences in parameterized convection over the oceans. While stronger convection over NH midlatitudes is associated with slower transport to the Arctic, stronger convection in the tropics and subtropics is associated with faster interhemispheric transport. We also show that the differences among simulations constrained with fields derived from the same reanalysis products are as large as (and in some cases larger than) the differences among free-running simulations, most likely due to larger differences in parameterized convection. Our results indicate that care must be taken when using simulations constrained with analyzed winds to interpret the influence of meteorology on tropospheric composition.

  6. Discovering mental models and frames in learning of nursing ethics through simulations.

    PubMed

    Díaz Agea, J L; Martín Robles, M R; Jiménez Rodríguez, D; Morales Moreno, I; Viedma Viedma, I; Leal Costa, C

    2018-05-15

    The acquisition of ethical competence is necessary in nursing. The aims of the study were to analyse students' perceptions of the process of learning ethics through simulations and to describe the underlying frames that inform the decision making of nursing students. A qualitative study was performed, based on the analysis of simulated experiences and debriefings of six simulated scenarios with ethical content in three different groups of fourth-year nursing students (n = 30). The simulated situations were designed to contain ethical dilemmas. The students' perspective regarding their learning and acquisition of ethical competence through simulations was positive. A total of 15 mental models were identified that underlie the ethical decision making of the students. The students' opinions reinforce the use of simulations as a tool for learning ethics. Thus, putting into practice knowledge of the frames that guide ethical actions is a suitable pedagogical strategy. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Lessons Learned from a Cross-Model Validation between a Discrete Event Simulation Model and a Cohort State-Transition Model for Personalized Breast Cancer Treatment.

    PubMed

    Jahn, Beate; Rochau, Ursula; Kurzthaler, Christina; Paulden, Mike; Kluibenschädl, Martina; Arvandi, Marjan; Kühne, Felicitas; Goehler, Alexander; Krahn, Murray D; Siebert, Uwe

    2016-04-01

    Breast cancer is the most common malignancy among women in developed countries. We developed a model (the Oncotyrol breast cancer outcomes model) to evaluate the cost-effectiveness of a 21-gene assay when used in combination with Adjuvant! Online to support personalized decisions about the use of adjuvant chemotherapy. The goal of this study was to perform a cross-model validation. The Oncotyrol model evaluates the 21-gene assay by simulating a hypothetical cohort of 50-year-old women over a lifetime horizon using discrete event simulation. Primary model outcomes were life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). We followed the International Society for Pharmacoeconomics and Outcomes Research-Society for Medical Decision Making (ISPOR-SMDM) best practice recommendations for validation and compared modeling results of the Oncotyrol model with the state-transition model developed by the Toronto Health Economics and Technology Assessment (THETA) Collaborative. Both models were populated with Canadian THETA model parameters, and outputs were compared. The differences between the models varied among the different validation end points. The smallest relative differences were in costs, and the greatest were in QALYs. All relative differences were less than 1.2%. The cost-effectiveness plane showed that small differences in the model structure can lead to different sets of nondominated test-treatment strategies with different efficiency frontiers. We faced several challenges: distinguishing between differences in outcomes due to different modeling techniques and initial coding errors, defining meaningful differences, and selecting measures and statistics for comparison (means, distributions, multivariate outcomes). Cross-model validation was crucial to identify and correct coding errors and to explain differences in model outcomes. 
In our comparison, small differences in either QALYs or costs led to changes in

  8. Smoothing inpatient discharges decreases emergency department congestion: a system dynamics simulation model.

    PubMed

    Wong, Hannah J; Wu, Robert C; Caesar, Michael; Abrams, Howard; Morra, Dante

    2010-08-01

    Timely access to emergency patient care is an important quality and efficiency issue. Reduced discharges of inpatients at weekends are a reality to many hospitals and may reduce hospital efficiency and contribute to emergency department (ED) congestion. To evaluate the daily number of ED beds occupied by inpatients after evenly distributing inpatient discharges over the course of the week using a computer simulation model. Simulation modelling study from an academic care hospital in Toronto, Canada. Daily historical data from the general internal medicine (GIM) department between 15 January and 15 December for two years, 2005 and 2006, were used for model building and validation, respectively. There was good agreement between model simulations and historical data for both ED and ward censuses and their respective lengths of stay (LOS), with the greatest difference being +7.8% for GIM ward LOS (model: 9.3 days vs historical: 8.7 days). When discharges were smoothed across the 7 days, the number of ED beds occupied by GIM patients decreased by approximately 27-57% while ED LOS decreased 7-14 hours. The model also demonstrated that patients occupying hospital beds who no longer require acute care have a considerable impact on ED and ward beds. Smoothing out inpatient discharges over the course of a week had a positive effect on decreasing the number of ED beds occupied by inpatients. Despite the particular challenges associated with weekend discharges, simulation experiments suggest that discharges evenly spread across the week may significantly reduce bed requirements and ED LOS.
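The qualitative mechanism, concentrating a fixed weekly discharge volume on weekdays versus spreading it evenly across all seven days, can be illustrated with a toy stock-and-flow sketch in the spirit of system dynamics (all numbers are invented, not the GIM data):

```python
# Toy ward-census model: constant daily admissions flow into a census stock;
# discharges either stop at weekends (baseline) or are spread evenly (smoothed).
def ward_census(days, admissions=10, weekly_discharges=70, smoothed=False):
    census, history = 60.0, []
    for d in range(days):
        weekday = d % 7                      # 0-4 weekdays, 5-6 weekend
        if smoothed:
            discharges = weekly_discharges / 7
        else:
            discharges = weekly_discharges / 5 if weekday < 5 else 0
        census = max(0.0, census + admissions - discharges)
        history.append(census)
    return history
```

With identical weekly totals, the smoothed policy flattens the weekly census cycle, which is the mechanism by which the paper's model frees ED beds otherwise occupied while inpatients wait for a ward bed.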

  9. Impact of high resolution land surface initialization in Indian summer monsoon simulation using a regional climate model

    NASA Astrophysics Data System (ADS)

    Unnikrishnan, C. K.; Rajeevan, M.; Rao, S. Vijaya Bhaskara

    2016-06-01

    The direct impact of high resolution land surface initialization on the forecast bias of a regional climate model over the Indian summer monsoon region in recent years is investigated. Two sets of regional climate model simulations are performed: one with coarse resolution land surface initial conditions, and a second with high resolution land surface data for the initial condition. The results show that all monsoon years respond differently to the high resolution land surface initialization. The drought monsoon year 2009 and extended break periods were more sensitive to the high resolution land surface initialization. These results suggest that predictions for drought monsoon years can be improved with high resolution land surface initialization. The results also show differences in the response to the land surface initialization within the monsoon season. Case studies of a heat wave and a monsoon depression show that the model biases were also reduced with high resolution land surface initialization. These results show the need for a better land surface initialization strategy in high resolution regional models for monsoon forecasting.

  10. Integrated Turbine-Based Combined Cycle Dynamic Simulation Model

    NASA Technical Reports Server (NTRS)

    Haid, Daniel A.; Gamble, Eric J.

    2011-01-01

    A Turbine-Based Combined Cycle (TBCC) dynamic simulation model has been developed to demonstrate all modes of operation, including mode transition, for a turbine-based combined cycle propulsion system. The High Mach Transient Engine Cycle Code (HiTECC) is a highly integrated tool comprised of modules for modeling each of the TBCC systems whose interactions and controllability affect the TBCC propulsion system thrust and operability during its modes of operation. By structuring the simulation modeling tools around the major TBCC functional modes of operation (Dry Turbojet, Afterburning Turbojet, Transition, and Dual Mode Scramjet) the TBCC mode transition and all necessary intermediate events over its entire mission may be developed, modeled, and validated. The reported work details the use of the completed model to simulate a TBCC propulsion system as it accelerates from Mach 2.5, through mode transition, to Mach 7. The completion of this model and its subsequent use to simulate TBCC mode transition significantly extends the state-of-the-art for all TBCC modes of operation by providing a numerical simulation of the systems, interactions, and transient responses affecting the ability of the propulsion system to transition from turbine-based to ramjet/scramjet-based propulsion while maintaining constant thrust.

  11. Fault diagnosis based on continuous simulation models

    NASA Technical Reports Server (NTRS)

    Feyock, Stefan

    1987-01-01

    The results of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems are described, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can the model be made to behave like the observed, faulty system? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest-precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.
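
    The weakest-precondition calculus applied here is Dijkstra's standard one; a minimal worked instance (not drawn from the report itself) shows how the transformer runs a postcondition backwards through an assignment:

```latex
% Assignment rule: wp(x := E,\; Q) \;=\; Q[E/x]
wp(x := x + 1,\; x > 0) \;=\; (x + 1 > 0) \;\equiv\; (x > -1)
```

    For diagnosis the direction is inverted: taking the observed (faulty) behavior as the postcondition, wp yields the weakest assumption on the model's parameters under which the model reproduces that behavior.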

  12. High resolution global climate modelling; the UPSCALE project, a large simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-01-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Environmental Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE dataset. This dataset is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  13. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-08-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Environmental Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  14. Comparison of tropical cyclogenesis processes in climate model and cloud-resolving model simulations using moist static energy budget analysis

    NASA Astrophysics Data System (ADS)

    Wing, Allison; Camargo, Suzana; Sobel, Adam; Kim, Daehyun; Murakami, Hiroyuki; Reed, Kevin; Vecchi, Gabriel; Wehner, Michael; Zarzycki, Colin; Zhao, Ming

    2017-04-01

    In recent years, climate models have improved such that high-resolution simulations are able to reproduce the climatology of tropical cyclone activity with some fidelity and show some skill in seasonal forecasting. However, biases remain in many models, motivating a better understanding of the factors that control the representation of tropical cyclone activity in climate models. We explore tropical cyclogenesis processes in five high-resolution climate models, including both coupled and uncoupled configurations. Our analysis framework focuses on how convection, moisture, clouds and related processes are coupled, and employs budgets of column moist static energy and of the spatial variance of column moist static energy. The latter was originally developed to study the mechanisms of tropical convective organization in idealized cloud-resolving models, and allows us to quantify the different feedback processes responsible for the amplification of moist static energy anomalies associated with the organization of convection and cyclogenesis. We track the formation and evolution of tropical cyclones in the climate model simulations and apply our analysis both along individual tracks and composited over many tropical cyclones. We then compare the genesis processes, in particular the role of cloud-radiation interactions, to those of spontaneous tropical cyclogenesis in idealized cloud-resolving model simulations.
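
    The spatial-variance budget referred to above, in the form introduced by Wing and Emanuel (2014) for idealized self-aggregation studies (the symbols below follow that work and are not necessarily the notation of the study summarized here), decomposes the growth of column moist static energy anomalies into feedback terms:

```latex
\frac{1}{2}\,\frac{\partial \hat{h}'^{2}}{\partial t}
  \;=\; \hat{h}'\,\mathrm{SEF}'
  \;+\; \hat{h}'\,\mathrm{NetSW}'
  \;+\; \hat{h}'\,\mathrm{NetLW}'
  \;-\; \hat{h}'\,\nabla_{h}\!\cdot\widehat{\mathbf{u}h}
```

    Here $\hat{h}$ is the mass-weighted column integral of moist static energy, primes denote deviations from the domain mean, $\mathrm{SEF}$ is the surface enthalpy flux, and $\mathrm{NetSW}$/$\mathrm{NetLW}$ are the column shortwave and longwave flux convergences. Positive covariance terms on the right amplify moist static energy anomalies and thus the organization of convection.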

  15. Reusable Component Model Development Approach for Parallel and Distributed Simulation

    PubMed Central

    Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng

    2014-01-01

    Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diverse interfaces, are tightly coupled, and are closely bound to particular simulation platforms. As a result, they are difficult to reuse across different simulation platforms and applications. To address this problem, this paper first proposes a reusable component model framework. Based on this framework, our reusable model development approach is then elaborated, which contains two phases: (1) domain experts create simulation computational modules, observing three principles to achieve their independence; (2) a model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. A case study of a radar model indicates that a model developed using our approach has good reusability and is easy to use in different simulation platforms and applications. PMID:24729751
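
    The encapsulation idea can be sketched as a platform-neutral service interface. The six method names below are illustrative guesses (the paper defines its own six service interfaces), and the radar module is a toy stand-in for the paper's case study:

```python
from abc import ABC, abstractmethod

class ReusableComponent(ABC):
    """Wraps a domain computation module behind a fixed service interface.

    The simulation platform talks only to these methods, never to the
    module's internals, so the same module can run on any platform that
    understands the interface. Method names here are hypothetical.
    """
    @abstractmethod
    def initialize(self, config: dict) -> None: ...
    @abstractmethod
    def receive_event(self, event: dict) -> None: ...
    @abstractmethod
    def advance(self, dt: float) -> None: ...      # step internal state
    @abstractmethod
    def output(self) -> dict: ...                  # publish results
    @abstractmethod
    def save_state(self) -> bytes: ...             # checkpoint
    @abstractmethod
    def finalize(self) -> None: ...

class RadarModel(ReusableComponent):
    """Toy radar detection module placed behind the standard interface."""
    def initialize(self, config):
        self.range_km, self.tracks = config["range_km"], []
    def receive_event(self, event):
        # Track any contact that falls inside the detection range.
        if event.get("dist_km", float("inf")) <= self.range_km:
            self.tracks.append(event["id"])
    def advance(self, dt):
        pass  # no time-dependent state in this toy
    def output(self):
        return {"tracks": list(self.tracks)}
    def save_state(self):
        return repr(self.tracks).encode()
    def finalize(self):
        self.tracks.clear()

radar = RadarModel()
radar.initialize({"range_km": 50})
radar.receive_event({"id": "a1", "dist_km": 30})
radar.receive_event({"id": "b2", "dist_km": 80})
print(radar.output())   # → {'tracks': ['a1']}
```

    Because the platform only ever calls the six interface methods, swapping platforms (or swapping the radar module for another domain module) requires no change on either side of the boundary.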

  16. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models: the optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
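
    As a rough illustration of the chance-constraint formulation, the sketch below is entirely hypothetical: a toy multi-server job model stands in for the launch-vehicle simulation, and the time limit, confidence level, and replication count are invented. It searches for the smallest resource level whose estimated probability of meeting a completion-time limit reaches the required level:

```python
import random

random.seed(42)

def simulate_once(servers, n_jobs=50):
    """One replication of a terminating simulation: n_jobs jobs with
    exponential service times are assigned greedily to `servers`
    parallel servers; the makespan is the stochastic output measure."""
    free_at = [0.0] * servers
    for _ in range(n_jobs):
        k = free_at.index(min(free_at))           # earliest-free server
        free_at[k] += random.expovariate(1.0)     # mean service time 1
    return max(free_at)

def meets_chance_constraint(servers, limit, alpha=0.95, reps=400):
    """Estimate P(makespan <= limit) from independent replications and
    test it against the required probability alpha (the chance
    constraint).  A fuller treatment would also attach a confidence
    interval to the estimated proportion."""
    hits = sum(simulate_once(servers) <= limit for _ in range(reps))
    return hits / reps >= alpha

# Minimize the resource level subject to the chance constraint:
best = next(s for s in range(1, 51)
            if meets_chance_constraint(s, limit=12.0))
print(best)
```

    The key point of the formulation is that the constraint is probabilistic: a resource level is feasible only if the simulated probability of meeting the deadline is at least alpha, which directly accounts for the randomness of the model's output.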

  17. Simulation of hydrodynamics using large eddy simulation-second-order moment model in circulating fluidized beds

    NASA Astrophysics Data System (ADS)

    Juhui, Chen; Yanjia, Tang; Dan, Li; Pengfei, Xu; Huilin, Lu

    2013-07-01

    Flow behavior of gas and particles in circulating fluidized beds (CFBs) is predicted by a large eddy simulation of the gas coupled with a second-order moment model for the solids (the LES-SOM model). This study shows that the solid volume fractions along the bed height simulated with a two-dimensional model are in agreement with experiments. The velocity, volume fraction and second-order moments of the particles are computed, and the second-order moments of clusters are calculated. The solid volume fraction, velocity and second-order moments are compared for three different model constants.

  18. A two-year experience of an integrated simulation residency curriculum.

    PubMed

    Wittels, Kathleen A; Takayesu, James K; Nadel, Eric S

    2012-07-01

    Human Patient Simulation (HPS) is increasingly used in medical education, but its role in Emergency Medicine (EM) residency education is uncertain. The objective of this study was to evaluate the perceived effectiveness of HPS when fully integrated into an EM residency didactic curriculum. The study design was a cross-sectional survey performed in 2006, 2 years after the implementation of an integrated simulation curriculum. Fifty-four residents (postgraduate year [PGY] 1-4) of a 4-year EM residency were surveyed with demographic and curricular questions on the perceived value of simulation relative to other teaching formats. Survey items were rated on a bipolar linear numeric scale of 1 (strongly disagree) to 9 (strongly agree), with 5 being neutral. Data were analyzed using Student t-tests. Forty residents responded to the survey (74% response rate). The perceived effectiveness of HPS was higher for junior residents than senior residents (8.0 vs. 6.2, respectively, p<0.001). There were no differences in perceived effectiveness of lectures (7.8 vs. 7.9, respectively, p=0.1), morbidity and mortality conference (8.5 vs. 8.7, respectively, p=0.3), and trauma conference (8.4 vs. 8.8, respectively, p=0.2) between junior and senior residents. Scores for perceptions of improvement in residency training (knowledge acquisition and clinical decision-making) after the integration of HPS into the curriculum were positive for all residents. Residents' perceptions of HPS integration into an EM residency curriculum are positive for both improving knowledge acquisition and learning clinical decision-making. HPS was rated as more effective during junior years than senior years, while the perceived efficacy of more traditional educational modalities remained constant throughout residency training. Copyright © 2012. Published by Elsevier Inc.

  19. Water-Balance Model to Simulate Historical Lake Levels for Lake Merced, California

    NASA Astrophysics Data System (ADS)

    Maley, M. P.; Onsoy, S.; Debroux, J.; Eagon, B.

    2009-12-01

    Lake Merced is a freshwater lake located in southwestern San Francisco, California. In the late 1980s and early 1990s, an extended, severe drought impacted the area, resulting in significant declines in Lake Merced lake levels that raised concerns about the long-term health of the lake. In response to these concerns, the Lake Merced Water Level Restoration Project was developed to evaluate an engineered solution to increase and maintain Lake Merced lake levels. The Lake Merced Lake-Level Model was developed to support the conceptual engineering design to restore lake levels. It is a spreadsheet-based water-balance model that performs monthly water-balance calculations based on the hydrological conceptual model. The model independently calculates each water-balance component from available climate and hydrological data. The model objective was to develop a practical, rule-based approach for the water balance and to calibrate the model results to measured lake levels. The advantage of a rule-based approach is that once the rules are defined, they enhance the ability to adapt the model for use in future-case simulations. The model was calibrated to historical lake levels over a 70-year period from 1939 to 2009. Calibrating the model over this long historical range tested the model over a variety of hydrological conditions, including wet, normal and dry precipitation years, flood events, and periods of high and low lake levels. The historical lake-level range was over 16 feet. The calibration of historical to simulated lake levels had a residual mean of 0.02 feet and an absolute residual mean of 0.42 feet. More importantly, the model demonstrated the ability to simulate both long-term and short-term trends, with a strong correlation in magnitude for both annual and seasonal fluctuations in lake levels.
The calibration results demonstrate an improved conceptual understanding of the key hydrological factors that control lake levels, reduce uncertainty
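
    A monthly water-balance step of the kind described above can be sketched as follows. All component names and values are hypothetical; the real model derives each term from climate and hydrological data and applies its calibrated rules:

```python
# Minimal monthly water-balance bookkeeping for a lake stage model.
# Inflow terms (precipitation on the lake, runoff, engineered inflow)
# and outflow terms (evaporation, seepage) are in acre-feet; dividing
# the net volume change by the lake surface area gives the stage change.

LAKE_AREA_ACRES = 300.0   # hypothetical surface area

def step_lake_level(level_ft, precip_af, runoff_af, inflow_af,
                    evap_af, seepage_af):
    """Advance the lake stage by one month."""
    net_af = (precip_af + runoff_af + inflow_af) - (evap_af + seepage_af)
    return level_ft + net_af / LAKE_AREA_ACRES

level = 20.0  # starting stage, feet
for month in range(3):  # three illustrative months of identical forcing
    level = step_lake_level(level, precip_af=60, runoff_af=45,
                            inflow_af=30, evap_af=90, seepage_af=15)
print(round(level, 2))   # → 20.3  (net +30 af/month over 300 acres)
```

    Chaining the monthly steps is what lets a calibration over 1939-2009 test both the seasonal signal (evaporation-dominated months versus wet months) and the long-term trend at once.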

  20. Operative and diagnostic hysteroscopy: A novel learning model combining new animal models and virtual reality simulation.

    PubMed

    Bassil, Alfred; Rubod, Chrystèle; Borghesi, Yves; Kerbage, Yohan; Schreiber, Elie Servan; Azaïs, Henri; Garabedian, Charles

    2017-04-01

    Hysteroscopy is one of the most common gynaecological procedures. Training for diagnostic and operative hysteroscopy can be achieved through numerous previously described models, such as animal models or virtual reality simulation. We present our novel combined model associating virtual reality with bovine uteruses and bladders. Final-year residents in obstetrics and gynaecology attended a full-day workshop. The workshop was divided into theoretical courses from senior surgeons and hands-on training in operative hysteroscopy and virtual reality Essure® procedures using the EssureSim™ and Pelvicsim™ simulators with multiple scenarios. Theoretical and operative knowledge was evaluated before and after the workshop, and General Points Averages (GPAs) were calculated and compared using a Student's t-test. GPAs were significantly higher after the workshop was completed. The biggest difference was observed in operative knowledge (GPA 0.28 before the workshop versus 0.55 after, p<0.05). All 25 residents who completed the workshop applauded the realism and efficiency of this type of training. The force feedback provided by the bovine uteruses gives residents the ability to manage the thickness of resection as in real-time surgery. Furthermore, the two-horned bovine uteruses allowed septa resection to be reproduced in conditions close to human surgery. CONCLUSION: Teaching operative and diagnostic hysteroscopy is essential. Managing this training through a full-day workshop using a combined animal model and virtual reality simulation is an efficient model not described before. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Water-budgets and recharge-area simulations for the Spring Creek and Nittany Creek Basins and parts of the Spruce Creek Basin, Centre and Huntingdon Counties, Pennsylvania, Water Years 2000–06

    USGS Publications Warehouse

    Fulton, John W.; Risser, Dennis W.; Regan, R. Steve; Walker, John F.; Hunt, Randall J.; Niswonger, Richard G.; Hoffman, Scott A.; Markstrom, Steven

    2015-08-17

    This report describes the results of a study by the U.S. Geological Survey in cooperation with ClearWater Conservancy and the Pennsylvania Department of Environmental Protection to develop a hydrologic model to simulate a water budget and identify areas of greater than average recharge for the Spring Creek Basin in central Pennsylvania. The model was developed to help policy makers, natural resource managers, and the public better understand and manage the water resources in the region. The Groundwater and Surface-water FLOW model (GSFLOW), which is an integration of the Precipitation-Runoff Modeling System (PRMS) and the Modular Groundwater Flow Model (MODFLOW-NWT), was used to simulate surface water and groundwater in the Spring Creek Basin for water years 2000–06. Because the groundwater and surface-water divides for the Spring Creek Basin do not coincide, the study area includes the Nittany Creek Basin and headwaters of the Spruce Creek Basin. The hydrologic model was developed using a stepwise process: (1) develop and calibrate a PRMS model and a steady-state MODFLOW-NWT model; (2) re-calibrate the steady-state MODFLOW-NWT model using potential recharge estimates simulated by the PRMS model; and (3) integrate the PRMS and MODFLOW-NWT models into GSFLOW. The individually calibrated PRMS and MODFLOW-NWT models were used as a starting point for the calibration of the fully coupled GSFLOW model. The GSFLOW model calibration was done by comparing observations and corresponding simulated values of streamflow from 11 streamgages and groundwater levels from 16 wells. The cumulative water budget and individual water budgets for water years 2000–06 were simulated by using GSFLOW. The largest source and sink terms are represented by precipitation and evapotranspiration, respectively. For the period simulated, a net surplus in the water budget was computed where inflows exceeded outflows by about 1.7 billion cubic feet (0.47 inches per year over the basin area

  2. Mars Smart Lander Parachute Simulation Model

    NASA Technical Reports Server (NTRS)

    Queen, Eric M.; Raiszadeh, Ben

    2002-01-01

    A multi-body flight simulation for the Mars Smart Lander has been developed that includes six degree-of-freedom rigid-body models for both the supersonically-deployed and subsonically-deployed parachutes. This simulation is designed to be incorporated into a larger simulation of the entire entry, descent and landing (EDL) sequence. The complete end-to-end simulation will provide attitude history predictions of all bodies throughout the flight as well as loads on each of the connecting lines. Other issues such as recontact with jettisoned elements (heat shield, back shield, parachute mortar covers, etc.), design of parachute and attachment points, and desirable line properties can also be addressed readily using this simulation.

  3. Simulation of crop yield variability by improved root-soil-interaction modelling

    NASA Astrophysics Data System (ADS)

    Duan, X.; Gayler, S.; Priesack, E.

    2009-04-01

    Understanding the processes and factors that govern within-field variability in crop yield has attracted great interest owing to applications in precision agriculture. Crop response to the environment at field scale is a complex dynamic process involving the interactions of soil characteristics, weather conditions and crop management. The numerous static factors combined with temporal variations make it very difficult to identify and manage the variability pattern. Therefore, crop simulation models are considered useful tools for separately analyzing the effects of changes in soil or weather conditions on the spatial variability, in order to identify the causes of yield variability and to quantify the spatial and temporal variation. However, tests showed that common crop models such as CERES-Wheat and CERES-Maize were not able to quantify the observed within-field yield variability, although their performance on crop growth simulation under more homogeneous and mainly non-limiting conditions was sufficient to simulate average yields at the field scale. On a study site in southern Germany, within-field variability in crop growth has been documented for years. After detailed analysis and classification of the soil patterns, two site-specific factors, plant-available water and O2 deficiency, were considered the main causes of the crop growth variability in this field. Based on our measurements of root distribution in the soil profile, we hypothesize that in our case the insufficiency of the applied crop models to simulate the yield variability may be due to the oversimplification of the root models involved, which fail to be sensitive to different soil conditions. In this study, the root growth model described by Jones et al. (1991) was adapted by using data on root distributions in the field and linking the adapted root model to the CERES crop model. 
    The ability of the new root model to increase the sensitivity of the CERES crop models to different environmental

  4. Human Centered Modeling and Simulation

    Science.gov Websites

    Thrust Area 2: Human Centered Modeling and Simulation. Thrust Area Leader: Dr. Matthew. The performance of human occupants and operators is paramount in the achievement of ground vehicle design objectives, but these occupants are also the most variable components of the human-machine system. Modeling

  5. Model Refinement and Simulation of Groundwater Flow in Clinton, Eaton, and Ingham Counties, Michigan

    USGS Publications Warehouse

    Luukkonen, Carol L.

    2010-01-01

    potential declines in water levels in both the upper glacial aquifer and the upper sandstone bedrock aquifer under steady-state and transient conditions when recharge was reduced by 20 and 50 percent in urban areas. Transient simulations were done to investigate reduced recharge due to low rainfall and increased pumping to meet anticipated future demand with 24 months (2 years) of modified recharge or modified recharge and pumping rates. During these two simulation years, monthly recharge rates were reduced by about 30 percent, and monthly withdrawal rates for Lansing area production wells were increased by 15 percent. The reduction in the amount of water available to recharge the groundwater system affects the upper model layers representing the glacial aquifers more than the deeper bedrock layers. However, with a reduction in recharge and an increase in withdrawals from the bedrock aquifer, water levels in the bedrock layers are affected more than those in the glacial layers. Differences in water levels between simulations with reduced recharge and reduced recharge with increased pumping are greatest in the Lansing area and least away from pumping centers, as expected. Additionally, the increases in pumping rates had minimal effect on most simulated streamflows. Additional simulations included updating the estimated 10-year wellhead-contributing areas for selected Lansing-area wells under 2006-7 pumping conditions. Optimization of groundwater withdrawals with a water-resource management model was done to determine withdrawal rates while minimizing operational costs and to determine withdrawal locations to achieve additional capacity while meeting specified head constraints. In these optimization scenarios, the desired groundwater withdrawals are achieved by simulating managed wells (where pumping rates can be optimized) and unmanaged wells (where pumping rates are not optimized) and by using various combinations of existing and proposed well locations.

  6. Advances in edge-diffraction modeling for virtual-acoustic simulations

    NASA Astrophysics Data System (ADS)

    Calamia, Paul Thomas

    In recent years there has been growing interest in modeling sound propagation in complex, three-dimensional (3D) virtual environments. With diverse applications for the military, the gaming industry, psychoacoustics researchers, architectural acousticians, and others, advances in computing power and 3D audio-rendering techniques have driven research and development aimed at closing the gap between the auralization and visualization of virtual spaces. To this end, this thesis focuses on improving the physical and perceptual realism of sound-field simulations in virtual environments through advances in edge-diffraction modeling. To model sound propagation in virtual environments, acoustical simulation tools commonly rely on geometrical-acoustics (GA) techniques that assume asymptotically high frequencies, large flat surfaces, and infinitely thin ray-like propagation paths. Such techniques can be augmented with diffraction modeling to compensate for the effect of surface size on the strength and directivity of a reflection, to allow for propagation around obstacles and into shadow zones, and to maintain soundfield continuity across reflection and shadow boundaries. Using a time-domain, line-integral formulation of the Biot-Tolstoy-Medwin (BTM) diffraction expression, this thesis explores various aspects of diffraction calculations for virtual-acoustic simulations. Specifically, we first analyze the periodic singularity of the BTM integrand and describe the relationship between the singularities and higher-order reflections within wedges with open angle less than 180°. Coupled with analytical approximations for the BTM expression, this analysis allows for accurate numerical computations and a continuous sound field in the vicinity of an arbitrary wedge geometry insonified by a point source. Second, we describe an edge-subdivision strategy that allows for fast diffraction calculations with low error relative to a numerically more accurate solution. Third, to address

  7. Heat waves over Central Europe in regional climate model simulations

    NASA Astrophysics Data System (ADS)

    Lhotka, Ondřej; Kyselý, Jan

    2014-05-01

    Regional climate models (RCMs) have become a powerful tool for exploring impacts of global climate change on a regional scale. The aim of this study is to evaluate the capability of RCMs to reproduce characteristics of major heat waves over Central Europe in their simulations of the recent climate (1961-2000), with a focus on the most severe and longest Central European heat wave, which occurred in 1994. We analyzed 7 high-resolution (0.22°) RCM simulations from the ENSEMBLES project, driven by the ERA-40 reanalysis. In observed data (the E-OBS 9.0 dataset), heat waves were defined on the basis of deviations of daily maximum temperature (Tmax) from the 95% quantile of the summer Tmax distribution in grid points over Central Europe. The same methodology was applied to the RCM simulations; we used corresponding 95% quantiles (calculated for each RCM and grid point) in order to remove the bias of modelled Tmax. While climatological characteristics of heat waves are reproduced reasonably well in the RCM ensemble, we found major deficiencies in the simulation of heat waves in individual years. For example, METNOHIRHAM simulated very severe heat waves in 1996, when no heat wave was observed. Focusing on the major 1994 heat wave, considerable differences in simulated temperature patterns were found among the RCMs. The differences in the temperature patterns were clearly linked to the simulated amount of precipitation during this event. The 1994 heat wave was almost absent in all RCMs that did not capture the observed precipitation deficit, while it was by far the most pronounced in KNMI-RACMO, which simulated virtually no precipitation over Central Europe during the 15-day period of the heat wave. In contrast to precipitation, values of the evaporative fraction in the RCMs were not linked to the severity of the simulated 1994 heat wave. This suggests a possible major contribution of other factors, such as cloud cover and associated downward shortwave radiation. Therefore, a more detailed
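
    The threshold-and-spell logic described above can be sketched for a single grid point. The synthetic series, the three-day spell criterion, and all numbers are illustrative assumptions; the study's exact heat-wave definition may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily summer Tmax series for one grid point (deg C);
# the real analysis uses E-OBS observations and bias-free RCM
# quantiles over many Central European grid points.
tmax = rng.normal(25.0, 3.0, size=920)   # ten 92-day summers

q95 = np.quantile(tmax, 0.95)            # grid-point 95% threshold
hot = tmax > q95                         # exceedance days (~5% of days)

def heat_wave_spells(hot_days, min_len=3):
    """Return (start_index, length) for every run of at least min_len
    consecutive exceedance days -- one common spell criterion."""
    spells, start = [], None
    for i, h in enumerate(hot_days):
        if h and start is None:
            start = i                     # run begins
        elif not h and start is not None:
            if i - start >= min_len:
                spells.append((start, i - start))
            start = None                  # run ends
    if start is not None and len(hot_days) - start >= min_len:
        spells.append((start, len(hot_days) - start))
    return spells

# Demonstrate the spell finder on a hand-made exceedance mask:
demo = np.array([0, 1, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1, 1], dtype=bool)
print(heat_wave_spells(demo))   # → [(1, 4), (10, 3)]
```

    Computing the threshold per model and per grid point, as the study does, is what removes each RCM's mean Tmax bias before heat waves are compared.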

  8. Concurrent heterogeneous neural model simulation on real-time neuromimetic hardware.

    PubMed

    Rast, Alexander; Galluppi, Francesco; Davies, Sergio; Plana, Luis; Patterson, Cameron; Sharp, Thomas; Lester, David; Furber, Steve

    2011-11-01

    Dedicated hardware is becoming increasingly essential to simulate emerging very-large-scale neural models. Equally, however, it needs to be able to support multiple models of the neural dynamics, possibly operating simultaneously within the same system. This may be necessary either to simulate large models with heterogeneous neural types, or to simplify simulation and analysis of detailed, complex models in a large simulation by isolating the new model to a small subpopulation of a larger overall network. The SpiNNaker neuromimetic chip is a dedicated neural processor able to support such heterogeneous simulations. Implementing these models on-chip uses an integrated library-based tool chain incorporating the emerging PyNN interface that allows a modeller to input a high-level description and use an automated process to generate an on-chip simulation. Simulations using both LIF and Izhikevich models demonstrate the ability of the SpiNNaker system to generate and simulate heterogeneous networks on-chip, while illustrating, through the network-scale effects of wavefront synchronisation and burst gating, methods that can provide effective behavioural abstractions for large-scale hardware modelling. SpiNNaker's asynchronous virtual architecture permits greater scope for model exploration, with scalable levels of functional and temporal abstraction, than conventional (or neuromorphic) computing platforms. The complete system illustrates a potential path to understanding the neural model of computation, by building (and breaking) neural models at various scales, connecting the blocks, then comparing them against the biology: computational cognitive neuroscience. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Fire dynamics during the 20th century simulated by the Community Land Model

    NASA Astrophysics Data System (ADS)

    Kloster, S.; Mahowald, N. M.; Randerson, J. T.; Thornton, P. E.; Hoffman, F. M.; Levis, S.; Lawrence, P. J.; Feddema, J. J.; Oleson, K. W.; Lawrence, D. M.

    2010-06-01

    Fire is an integral Earth System process that interacts with climate in multiple ways. Here we assessed the parametrization of fires in the Community Land Model (CLM-CN) and improved the ability of the model to reproduce contemporary global patterns of burned areas and fire emissions. In addition to wildfires we extended CLM-CN to account for fires related to deforestation. We compared contemporary fire carbon emissions predicted by the model to satellite-based estimates in terms of magnitude and spatial extent as well as interannual and seasonal variability. Long-term trends during the 20th century were compared with historical estimates. Overall we found the best agreement between simulation and observations for the fire parametrization based on the work by Arora and Boer (2005). We obtained substantial improvement when we explicitly considered human-caused ignition and fire suppression as a function of population density. Simulated fire carbon emissions ranged between 2.0 and 2.4 Pg C/year for the period 1997-2004. Regionally the simulations had a low bias over Africa and a high bias over South America when compared to satellite-based products. The net terrestrial carbon source due to land use change for the 1990s was 1.2 Pg C/year with 11% stemming from deforestation fires. During 2000-2004 this flux decreased to 0.85 Pg C/year with a similar relative contribution from deforestation fires. Between 1900 and 1960 we predicted a slight downward trend in global fire emissions caused by reduced fuels as a consequence of wood harvesting and also by increases in fire suppression. The model predicted an upward trend during the last three decades of the 20th century as a result of climate variations and large burning events associated with ENSO-induced drought conditions.

  10. Fire dynamics during the 20th century simulated by the Community Land Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kloster, Silvia; Mahowald, Natalie; Randerson, Jim

    2011-01-01

    Fire is an integral Earth System process that interacts with climate in multiple ways. Here we assessed the parametrization of fires in the Community Land Model (CLM-CN) and improved the ability of the model to reproduce contemporary global patterns of burned areas and fire emissions. In addition to wildfires we extended CLM-CN to account for fires related to deforestation. We compared contemporary fire carbon emissions predicted by the model to satellite-based estimates in terms of magnitude and spatial extent as well as interannual and seasonal variability. Long-term trends during the 20th century were compared with historical estimates. Overall we found the best agreement between simulation and observations for the fire parametrization based on the work by Arora and Boer (2005). We obtained substantial improvement when we explicitly considered human-caused ignition and fire suppression as a function of population density. Simulated fire carbon emissions ranged between 2.0 and 2.4 Pg C/year for the period 1997-2004. Regionally the simulations had a low bias over Africa and a high bias over South America when compared to satellite-based products. The net terrestrial carbon source due to land use change for the 1990s was 1.2 Pg C/year with 11% stemming from deforestation fires. During 2000-2004 this flux decreased to 0.85 Pg C/year with a similar relative contribution from deforestation fires. Between 1900 and 1960 we predicted a slight downward trend in global fire emissions caused by reduced fuels as a consequence of wood harvesting and also by increases in fire suppression. The model predicted an upward trend during the last three decades of the 20th century as a result of climate variations and large burning events associated with ENSO-induced drought conditions.

  11. PSPICE Hybrid Modeling and Simulation of Capacitive Micro-Gyroscopes

    PubMed Central

    Su, Yan; Tong, Xin; Liu, Nan; Han, Guowei; Si, Chaowei; Ning, Jin; Li, Zhaofeng; Yang, Fuhua

    2018-01-01

    With the aim of reducing the cost of prototype development, this paper establishes a PSPICE hybrid model for the simulation of capacitive microelectromechanical systems (MEMS) gyroscopes. This is achieved by modeling the gyroscope in separate modules and then connecting them in accordance with the corresponding principle diagram. Systematic simulations of this model are implemented along with a consideration of details of MEMS gyroscopes, including a capacitance model without approximation, mechanical thermal noise, and the effect of ambient temperature. A temperature compensation scheme and optimization of the interface circuits are achieved based on the hybrid closed-loop simulation of the MEMS gyroscope. The simulation results show that the final output voltage is proportional to the angular rate input, which verifies the validity of the model. PMID:29597284

  12. Intercomparison of Streamflow Simulations between WRF-Hydro and Hydrology Laboratory-Research Distributed Hydrologic Model Frameworks

    NASA Astrophysics Data System (ADS)

    KIM, J.; Smith, M. B.; Koren, V.; Salas, F.; Cui, Z.; Johnson, D.

    2017-12-01

    The National Oceanic and Atmospheric Administration (NOAA)-National Weather Service (NWS) developed the Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM) framework as an initial step towards spatially distributed modeling at River Forecast Centers (RFCs). Recently, the NOAA/NWS worked with the National Center for Atmospheric Research (NCAR) to implement the National Water Model (NWM) for nationally consistent water resources prediction. The NWM is based on the WRF-Hydro framework and is run at a 1-km spatial resolution and 1-hour time step over the contiguous United States (CONUS) and contributing areas in Canada and Mexico. In this study, we compare streamflow simulations from HL-RDHM and WRF-Hydro with observations from 279 USGS stations. For the streamflow simulations, HL-RDHM is run on 4-km grids with a 1-hour time step for a 5-year period (Water Years 2008-2012), using a priori parameters provided by NOAA-NWS. The WRF-Hydro streamflow simulations for the same period are extracted from NCAR's 23-year retrospective run of the NWM (version 1.0) over CONUS on 1-km grids. We chose the 279 USGS stations, located in the domains of six different RFCs, because they are relatively little affected by dams or reservoirs. Daily average values of simulations and observations are used for ease of comparison. The main purpose of this research is to evaluate how HL-RDHM and WRF-Hydro perform at USGS gauge stations. We compare daily time series of observations and both simulations, and calculate error values using a variety of error functions. Using these plots and error values, we evaluate the performance of the HL-RDHM and WRF-Hydro models. Our results show a mix of model performance across geographic regions.
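The record mentions "a variety of error functions" without naming them; a minimal sketch of three metrics commonly used for daily streamflow comparison (Nash-Sutcliffe efficiency, RMSE, and percent bias) is shown below. The metric choice and the synthetic flow values are illustrative assumptions, not the study's actual setup.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; below 0, the model is a
    worse predictor than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

def rmse(obs, sim):
    """Root-mean-square error, in the units of the flow series."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

def pbias(obs, sim):
    """Percent bias: positive values indicate average underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(100.0 * np.sum(obs - sim) / np.sum(obs))

# Synthetic daily-mean flows for one gauge (m^3/s).
obs = np.array([10.0, 12.0, 30.0, 22.0, 15.0, 11.0])
sim = np.array([11.0, 13.0, 25.0, 20.0, 16.0, 12.0])
print(nse(obs, sim), rmse(obs, sim), pbias(obs, sim))
```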

  13. Modeling and simulating industrial land-use evolution in Shanghai, China

    NASA Astrophysics Data System (ADS)

    Qiu, Rongxu; Xu, Wei; Zhang, John; Staenz, Karl

    2018-01-01

    This study proposes a cellular automata-based Industrial and Residential Land Use Competition Model to simulate the dynamic spatial transformation of industrial land use in Shanghai, China. In the proposed model, land development activities in a city are delineated as competitions among different land-use types. The Hedonic Land Pricing Model is adopted to implement the competition framework. To improve simulation results, the Land Price Agglomeration Model was devised to simulate and adjust classic land price theory. A new evolutionary algorithm-based parameter estimation method was devised in place of traditional methods. Simulation results show that the proposed model closely resembles actual land transformation patterns and that the model can simulate not only land development but also redevelopment processes in metropolitan areas.

  14. The influence of atmospheric grid resolution in a climate model-forced ice sheet simulation

    NASA Astrophysics Data System (ADS)

    Lofverstrom, Marcus; Liakka, Johan

    2018-04-01

    Coupled climate-ice sheet simulations have been growing in popularity in recent years. Experiments of this type are, however, challenging, as ice sheets evolve over multi-millennial timescales, which is beyond the practical integration limit of most Earth system models. A common method to increase model throughput is to trade resolution for computational efficiency (compromise accuracy for speed). Here we analyze how the resolution of an atmospheric general circulation model (AGCM) influences the simulation quality in a stand-alone ice sheet model. Four identical AGCM simulations of the Last Glacial Maximum (LGM) were run at different horizontal resolutions: T85 (1.4°), T42 (2.8°), T31 (3.8°), and T21 (5.6°). These simulations were subsequently used as forcing for an ice sheet model. While the T85 climate forcing reproduces the LGM ice sheets with high accuracy, the intermediate resolution cases (T42 and T31) fail to build the Eurasian ice sheet. The T21 case fails in both Eurasia and North America. Sensitivity experiments using different surface mass balance parameterizations improve the simulations of the Eurasian ice sheet in the T42 case, but the compromise is a substantial ice buildup in Siberia. The T31 and T21 cases do not improve in the same way in Eurasia, though the latter simulates the continent-wide Laurentide ice sheet in North America. The difficulty in reproducing the LGM ice sheets in the T21 case is in broad agreement with previous studies using low-resolution atmospheric models, and is caused by a substantial deterioration of the model climate between the T31 and T21 resolutions. It is speculated that this deficiency may demonstrate a fundamental problem with using low-resolution atmospheric models in these types of experiments.

  15. Cross-Scale Modelling of Subduction from Minute to Million of Years Time Scale

    NASA Astrophysics Data System (ADS)

    Sobolev, S. V.; Muldashev, I. A.

    2015-12-01

    Subduction is an essentially multi-scale process, with time scales spanning from the geological scale to the earthquake scale, with the seismic cycle in between. Modelling such a process constitutes one of the largest challenges in geodynamic modelling today. Here we present a cross-scale thermomechanical model capable of simulating the entire subduction process from rupture (1 min) to geological time (millions of years) that employs elasticity, mineral-physics-constrained non-linear transient viscous rheology, and rate-and-state friction plasticity. The model generates spontaneous earthquake sequences. The adaptive time-step algorithm recognizes the moment of instability and drops the integration time step to its minimum value of 40 sec during the earthquake. The time step is then gradually increased to its maximum value of 5 yr, following the decreasing displacement rates during postseismic relaxation. Efficient implementation of numerical techniques allows long-term simulations with a total time of millions of years. This technique makes it possible to follow the deformation process in detail during the entire seismic cycle and over multiple seismic cycles. We observe various deformation patterns during the modelled seismic cycle that are consistent with surface GPS observations, and we demonstrate that, contrary to conventional ideas, postseismic deformation may be controlled by viscoelastic relaxation in the mantle wedge, starting within only a few hours after great (M>9) earthquakes. Interestingly, in our model the average slip velocity at the fault closely follows a hyperbolic decay law. In natural observations, such deformation is interpreted as afterslip, while in our model it is caused by the viscoelastic relaxation of the mantle wedge, whose viscosity varies strongly with time. We demonstrate that our results are consistent with the postseismic surface displacement after the Great Tohoku Earthquake for the day-to-year time range.
We will also present results of the modeling of deformation of the
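The adaptive time-stepping rule described in this record (collapse to a 40 s step at the moment of instability, then grow back toward a 5 yr cap during postseismic relaxation) can be sketched as follows. The instability test and the geometric growth factor are illustrative assumptions, not the paper's actual criteria.

```python
# Adaptive time-step sketch: the step drops to its 40 s minimum when an
# earthquake is detected and then grows geometrically toward the 5 yr
# maximum as displacement rates decay.
YEAR = 365.25 * 24 * 3600.0
DT_MIN = 40.0             # seconds: coseismic resolution
DT_MAX = 5.0 * YEAR       # seconds: interseismic resolution
GROWTH = 1.5              # assumed step-growth factor per step

def next_dt(dt, unstable):
    if unstable:                          # instability: resolve the rupture
        return DT_MIN
    return min(dt * GROWTH, DT_MAX)       # relax back toward the cap

dt = DT_MAX
steps = []
for t in range(12):
    unstable = (t == 3)                   # one synthetic earthquake
    dt = next_dt(dt, unstable)
    steps.append(dt)
print(steps)
```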

  16. Surrogate model approach for improving the performance of reactive transport simulations

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2016-04-01

    Reactive transport models serve a large number of important geoscientific applications involving underground resources in industry and scientific research. A reactive transport simulation commonly consists of at least two coupled simulation models. The first is a hydrodynamics simulator responsible for simulating the flow of groundwater and the transport of solutes. Hydrodynamics simulators are a well-established technology and can be very efficient: when hydrodynamics simulations are performed without coupled geochemistry, their spatial geometries can span millions of elements even when running on desktop workstations. The second is a geochemical simulation model coupled to the hydrodynamics simulator. Geochemical simulation models are much more computationally costly, which makes reactive transport simulations spanning millions of spatial elements very difficult to achieve. To address this problem we propose to replace the coupled geochemical simulation model with a surrogate model. A surrogate is a statistical model created to include only the necessary subset of simulator complexity for a particular scenario. To demonstrate the viability of this approach, we tested it on a popular published benchmark problem for reactive transport simulation involving 1D calcite transport (Kolditz, 2012). We tried a number of statistical models available through the caret and DiceEval packages for R as surrogate models. These were trained on a randomly sampled subset of the input-output data from the geochemical simulation model used in the original reactive transport simulation. For validation, we used the surrogate model to predict the simulator output on the part of the sampled input data that was not used for training the statistical model.
For this scenario we find that the multivariate adaptive regression splines
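The train-on-a-sample, validate-on-the-holdout workflow described above can be sketched in a few lines. This is a Python analogue of the R caret/DiceEval workflow mentioned in the record: the `geochem_sim` function is a hypothetical stand-in for the expensive geochemistry solver, and the polynomial surrogate is an illustrative assumption (the study evaluated models such as multivariate adaptive regression splines).

```python
import numpy as np

# Hypothetical stand-in for an expensive geochemistry solver; in practice
# the input-output pairs come from the coupled simulator itself.
def geochem_sim(x):
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 2.0, 400)
y = geochem_sim(x)

# Random split: train the surrogate on a sampled subset, hold out the rest.
idx = rng.permutation(x.size)
train, valid = idx[:300], idx[300:]

# Surrogate: a cheap polynomial fit standing in for a trained statistical model.
coeffs = np.polyfit(x[train], y[train], deg=7)
pred = np.polyval(coeffs, x[valid])

# Validate by predicting simulator output on data unseen during training.
rmse = float(np.sqrt(np.mean((pred - y[valid]) ** 2)))
print(f"validation RMSE: {rmse:.4f}")
```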

  17. Flight Simulation Model Exchange. Volume 1

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce

    2011-01-01

    The NASA Engineering and Safety Center Review Board sponsored an assessment of the draft Standard, Flight Dynamics Model Exchange Standard, BSR/ANSI-S-119-201x (S-119) that was conducted by simulation and guidance, navigation, and control engineers from several NASA Centers. The assessment team reviewed the conventions and formats spelled out in the draft Standard and the actual implementation of two example aerodynamic models (a subsonic F-16 and the HL-20 lifting body) encoded in the Extensible Markup Language grammar. During the implementation, the team kept records of lessons learned and provided feedback to the American Institute of Aeronautics and Astronautics Modeling and Simulation Technical Committee representative. This document contains the results of the assessment.

  18. Wind Energy Modeling and Simulation | Wind | NREL

    Science.gov Websites

    Wind Energy Modeling and Simulation. NREL's wind turbine modeling and simulation tools enable the analysis of a range of wind turbine configurations. The Simulator for Wind Farm Applications (SOWFA) employs computational fluid dynamics to allow users to investigate wind turbine and wind power plant behavior.

  19. Selecting a dynamic simulation modeling method for health care delivery research-part 2: report of the ISPOR Dynamic Simulation Modeling Emerging Good Practices Task Force.

    PubMed

    Marshall, Deborah A; Burgos-Liz, Lina; IJzerman, Maarten J; Crown, William; Padula, William V; Wong, Peter K; Pasupathy, Kalyan S; Higashi, Mitchell K; Osgood, Nathaniel D

    2015-03-01

    In a previous report, the ISPOR Task Force on Dynamic Simulation Modeling Applications in Health Care Delivery Research Emerging Good Practices introduced the fundamentals of dynamic simulation modeling and identified the types of health care delivery problems for which dynamic simulation modeling can be used more effectively than other modeling methods. The hierarchical relationship between the health care delivery system, providers, patients, and other stakeholders exhibits a level of complexity that ought to be captured using dynamic simulation modeling methods. As a tool to help researchers decide whether dynamic simulation modeling is an appropriate method for modeling the effects of an intervention on a health care system, we presented the System, Interactions, Multilevel, Understanding, Loops, Agents, Time, Emergence (SIMULATE) checklist consisting of eight elements. This report builds on the previous work, systematically comparing each of the three most commonly used dynamic simulation modeling methods: system dynamics, discrete-event simulation, and agent-based modeling. We review criteria for selecting the most suitable method depending on 1) the purpose (the type of problem and research questions being investigated), 2) the object (the scope of the model), and 3) the method used to model the object to achieve the purpose. Finally, we provide guidance for emerging good practices for dynamic simulation modeling in the health sector, covering all aspects, from the engagement of decision makers in the model design through model maintenance and upkeep. We conclude by providing some recommendations about the application of these methods to add value to informed decision making, with an emphasis on stakeholder engagement, starting with the problem definition. Finally, we identify areas in which further methodological development will likely occur given the growing "volume, velocity and variety" and availability of "big data" to provide empirical evidence and techniques

  20. A Simulation Model to Determine Sensitivity and Timeliness of Surveillance Strategies.

    PubMed

    Schulz, J; Staubach, C; Conraths, F J; Schulz, K

    2017-12-01

    Animal surveillance systems need regular evaluation. We developed an easily applicable simulation model of the German wild boar population to investigate two evaluation attributes: the sensitivity and timeliness (i.e. the ability to detect a disease outbreak rapidly) of a surveillance system. Classical swine fever (CSF) was used as an example for the model. CSF is an infectious disease that may lead to massive economic losses. It can affect wild boar as well as domestic pigs, and CSF outbreaks in domestic pigs have been linked to infections in wild boar. Awareness of the CSF status in wild boar is therefore vital. Our non-epidemic simulation model is based on real data and evaluates the currently implemented German surveillance system for CSF in wild boar. The results show that active surveillance for CSF fulfils the requirements of detecting an outbreak with 95% confidence within one year after the introduction of CSF into the wild boar population. Nevertheless, there is room for improved performance and efficiency by more homogeneous (active and passive) sampling of wild boar over the year. Passive surveillance alone is not sufficient to meet the requirements for detecting the infection. Although CSF was used as example to develop the model, it may also be applied to the evaluation of other surveillance systems for viral diseases in wild boar. It is also possible to compare sensitivity and timeliness across hypothetical alternative or risk-based surveillance strategies. © 2016 Blackwell Verlag GmbH.
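The "95% confidence of detection within one year" requirement in this record is commonly analyzed with the standard freedom-from-disease approximation, sketched below. This is a textbook formula for illustration, not the paper's simulation model, and the 1% design prevalence in the example is an assumed value.

```python
import math

def sample_size(design_prevalence, confidence=0.95, sensitivity=1.0):
    """Samples needed to detect at least one infected animal with the given
    confidence, assuming a large population, independent sampling, and the
    stated design prevalence (standard freedom-from-disease approximation)."""
    p = design_prevalence * sensitivity
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

def detection_probability(n, design_prevalence, sensitivity=1.0):
    """Probability that a sample of n animals contains >= 1 detected positive."""
    p = design_prevalence * sensitivity
    return 1.0 - (1.0 - p) ** n

# Example: 1% design prevalence, perfect test sensitivity.
n = sample_size(0.01)
print(n, detection_probability(n, 0.01))
```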

  1. Polar Processes in a 50-year Simulation of Stratospheric Chemistry and Transport

    NASA Technical Reports Server (NTRS)

    Kawa, S.R.; Douglass, A. R.; Patrick, L. C.; Allen, D. R.; Randall, C. E.

    2004-01-01

    The unique chemical, dynamical, and microphysical processes that occur in the winter polar lower stratosphere are expected to interact strongly with changing climate and trace gas abundances. Significant changes in ozone have been observed, and prediction of future ozone and climate interactions depends on modeling these processes successfully. We have conducted an off-line model simulation of the stratosphere for trace gas conditions representative of 1975-2025, using meteorology from the NASA finite-volume general circulation model. The objective of this simulation is to examine the sensitivity of stratospheric ozone and chemical change to varying meteorology and trace gas inputs. This presentation will examine the dependence of ozone and related processes in polar regions on the climatological and trace gas changes in the model. The model's past performance is baselined against available observations, and a future ozone recovery scenario is forecast. Overall the model ozone simulation is quite realistic, but initial analysis of the detailed evolution of some observable processes suggests systematic shortcomings in our description of the polar chemical rates and/or mechanisms. Model sensitivities, strengths, and weaknesses will be discussed, with implications for uncertainty and confidence in coupled climate chemistry predictions.

  2. Simulation of Ultra-Small MOSFETs Using a 2-D Quantum-Corrected Drift-Diffusion Model

    NASA Technical Reports Server (NTRS)

    Biegal, Bryan A.; Rafferty, Connor S.; Yu, Zhiping; Ancona, Mario G.; Dutton, Robert W.; Saini, Subhash (Technical Monitor)

    1998-01-01

    The continued down-scaling of electronic devices, in particular the commercially dominant MOSFET, will force a fundamental change in the process of new electronics technology development in the next five to ten years. The cost of developing new technology generations is soaring along with the price of new fabrication facilities, even as competitive pressure intensifies to bring this new technology to market faster than ever before. To reduce cost and time to market, device simulation must become a more fundamental, indeed dominant, part of the technology development cycle. In order to produce these benefits, simulation accuracy must improve markedly. At the same time, device physics will become more complex, with the rapid increase in various small-geometry and quantum effects. This work describes both an approach to device simulator development and a physical model which advance the effort to meet the tremendous electronic device simulation challenge described above. The device simulation approach is to specify the physical model at a high level to a general-purpose (but highly efficient) partial differential equation solver (in this case PROPHET, developed by Lucent Technologies), which then simulates the model in 1-D, 2-D, or 3-D for a specified device and test regime. This approach allows for the rapid investigation of a wide range of device models and effects, which is certainly essential for device simulation to catch up with, and then stay ahead of, electronic device technology of the present and future. The physical device model used in this work is the density-gradient (DG) quantum correction to the drift-diffusion model [Ancona, Phys. Rev. B 35(5), 7959 (1987)]. This model adds tunneling and quantum smoothing of carrier density profiles to the drift-diffusion model. We used the DG model in 1-D and 2-D (for the first time) to simulate both bipolar and unipolar devices. Simulations of heavily-doped, short-base diodes indicated that the DG quantum

  3. Automatic mathematical modeling for real time simulation system

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1988-01-01

    A methodology for automatic mathematical modeling and the generation of simulation models is described. The models are verified by running in a test environment using standard profiles, with the results compared against known results. The major objective is to create a user-friendly environment in which engineers can design, maintain, and verify their models, and to automatically convert the mathematical model into conventional code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp machine. The program provides a friendly and well-organized environment in which engineers can build a knowledge base of base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. The system is then able to automatically generate the model and FORTRAN code. A future goal, currently under construction, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.

  4. Process Modeling and Dynamic Simulation for EAST Helium Refrigerator

    NASA Astrophysics Data System (ADS)

    Lu, Xiaofei; Fu, Peng; Zhuang, Ming; Qiu, Lilong; Hu, Liangbing

    2016-06-01

    In this paper, the process modeling and dynamic simulation of the EAST helium refrigerator is presented. The cryogenic process model is described and the main components are customized in detail. The process model is controlled by the PLC simulator, and real-time communication between the process model and the controllers is achieved by a customized interface. Validation of the process model has been confirmed against EAST experimental data during the 300-80 K cool-down process. Simulation results indicate that this process simulator reproduces the dynamic behavior of the EAST helium refrigerator very well for the operation of long-pulsed plasma discharge. The cryogenic process simulator based on the control architecture is available for operation optimization and control design of EAST cryogenic systems to cope with the long-pulsed heat loads in the future. supported by National Natural Science Foundation of China (No. 51306195) and Key Laboratory of Cryogenics, Technical Institute of Physics and Chemistry, CAS (No. CRYO201408)

  5. Integration of high-fidelity simulator in third-year paediatrics clerkship.

    PubMed

    Ortiz, Nerian; Pedrogo, Yasmin; Bonet, Nydia

    2011-06-01

    Simulation in medicine is a useful tool for assessing clinical competencies. The Liaison Committee on Medical Education expects students to have simulation experiences in the curriculum, and the integration of simulators has been encouraged for clinical clerkships. The use of the human simulator in a safe environment should result in enhanced teamworking, communication, and critical thinking skills. During the academic year 2007-08, a formative activity using the simulator was implemented in the paediatrics clerkship. The objectives included exposing students to an emergent general paediatric medical scenario using the human simulator; it was imperative that students adequately work through the critical thinking process. The paediatrics clerkship has incorporated a formative activity using the high-fidelity simulator: a faculty member debriefed the students, and feedback was offered. A total of 124 students participated in the activity. Ninety-eight percent agreed that the use of the simulator in a scenario such as the one presented allowed for a better understanding of the clinical issues studied in the clerkship. More than 85 percent of the students recommended the integration of the simulator in other major clinical clerkships. Performance in the objective structured clinical exam (OSCE) at the end of the clerkship has improved since the implementation of this formative activity. The use of the high-fidelity simulator during the paediatrics clerkship has been identified as an excellent teaching tool. This formative activity has been deemed successful by the students, who feel that it serves as an extra tool to strengthen learned concepts and skills. © Blackwell Publishing Ltd 2011.

  6. Development, Validation and Parametric study of a 3-Year-Old Child Head Finite Element Model

    NASA Astrophysics Data System (ADS)

    Cui, Shihai; Chen, Yue; Li, Haiyan; Ruan, ShiJie

    2015-12-01

    Traumatic brain injury caused by falls and traffic accidents is an important cause of death and disability in children. Recently, computer finite element (FE) head models have been developed to investigate brain injury mechanisms and biomechanical responses. Based on CT data of a healthy 3-year-old child head, an FE head model with detailed anatomical structure was developed. Deep brain structures such as the white matter, gray matter, cerebral ventricles, and hippocampus were created for the first time in this FE model. The FE model was validated by comparing simulation results with those of reconstructed child and adult cadaver experiments. In addition, the effects of skull stiffness on the dynamic responses of the child head were further investigated. All simulation results confirmed the good biofidelity of the FE model.

  7. Simulated Students and Classroom Use of Model-Based Intelligent Tutoring

    NASA Technical Reports Server (NTRS)

    Koedinger, Kenneth R.

    2008-01-01

    Two educational uses of models and simulations: 1) students create models and use simulations; and 2) researchers create models of learners to guide the development of reliably effective materials. Cognitive tutors simulate and support tutoring; data are crucial to creating an effective model. The Pittsburgh Science of Learning Center offers resources for modeling, authoring, and experimentation, along with a repository of data and theory. Examples of advanced modeling efforts: SimStudent learns a rule-based model; a help-seeking model tutors metacognition; Scooter uses machine-learned detectors of student engagement.

  8. Case studies of simulation models of recreation use

    Treesearch

    David N. Cole

    2005-01-01

    Computer simulation models can be usefully applied to many different outdoor recreation situations. Model outputs can also be used for a wide variety of planning and management purposes. The intent of this chapter is to use a collection of 12 case studies to illustrate how simulation models have been used in a wide range of recreation situations and for diverse...

  9. DoD Modeling and Simulation (M&S) Glossary

    DTIC Science & Technology

    1998-01-01

    modeling and simulation. It is the group responsible for establishing the need for the ...logical data grouping (in the logical data model ) to which it belongs. (DoD Publication 8320.1-M-l and NBS Pub 500-149, (references (q) and (u)) 399...Department of the Navy Modeling and Simulation Technical Support Group Demonstration of Dynamic Object Oriented Requirements System Disk

  10. Analyzing Strategic Business Rules through Simulation Modeling

    NASA Astrophysics Data System (ADS)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility, since it allows business processes to change to meet new customer demands or market needs without causing a cascade of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping to find a good configuration for strategic business objectives and IT parameters. The paper includes a case study in which a simulation model is built to support business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  11. Surrogate safety measures from traffic simulation models

    DOT National Transportation Integrated Search

    2003-01-01

    This project investigates the potential for deriving surrogate measures of safety from existing microscopic traffic simulation models for intersections. The process of computing the measures in the simulation, extracting the required data, and summar...

  12. A 2000-year European Mean Summer Temperature Reconstruction from the PAGES 2k Regional Network and Comparison to Millennium-Length Forced Model Simulations

    NASA Astrophysics Data System (ADS)

    Smerdon, J. E.; Büntgen, U.; Ljungqvist, F. C.; Esper, J.; Fernández-Donado, L.; Gonzalez-Rouco, F. J.; Luterbacher, J.; McCarroll, D.; Wagner, S.; Wahl, E. R.; Wanner, H.; Werner, J.; Zorita, E.

    2012-12-01

    A reconstruction of mean European summer (JJA) land temperatures from 138 B.C.E. to 2003 C.E. is presented and compared to 37 forced transient simulations of the last millennium from coupled General Circulation Models (CGCMs). Eleven annually resolved tree-ring and documentary records from ten European countries/regions were used for the reconstruction and compiled as part of the Euro_Med working group contribution to the PAGES 2k Regional Network. Records were selected based upon their summer temperature signal, annual resolution, and time-continuous sampling. All tree-ring data were detrended using the Regional Curve Standardization (RCS) method to retain low-frequency variance in the resulting mean chronologies. The calibration time series was the area-weighted JJA temperature computed from the CRUTEM4v dataset over a European land domain (35°-70°N, 10°W-40°E). A nested 'Composite-Plus-Scale' reconstruction was derived using nine nests reflecting the availability of predictors back in time. Each nest was calculated by standardizing the available predictor series over the calibration interval, and subsequently calculating a weighted composite in which each proxy was multiplied by its correlation with the target index. The CPS methodology was implemented using a resampling scheme that uses 104 years for calibration. The initial calibration period extended from 1850-1953 C.E. and was incremented by one year until reaching the final period of 1900-2003 C.E., yielding a total of 51 reconstructions for each nest. Within each calibration step, the 50 years excluded from calibration were used for validation. Validation statistics across all reconstruction ensemble members within each nest indicate skillful reconstructions (RE: 0.42-0.64; CE: 0.26-0.54) and are all above the maximum validation statistics achieved in an ensemble of red noise benchmarking experiments. Warm periods in the derived reconstruction during the 1st, 2nd, and 7th-12th centuries compare to
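    The composite-plus-scale steps described above can be sketched as follows (synthetic stand-in data; the series, weights, and window choices are illustrative, not the PAGES 2k predictors themselves):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for 11 proxy predictors and the instrumental target.
years = np.arange(1850, 2004)
target = rng.normal(0.0, 0.5, years.size)  # JJA temperature index
proxies = target[:, None] * rng.uniform(0.3, 0.9, 11) \
          + rng.normal(0.0, 0.5, (years.size, 11))

cal = slice(0, 104)     # e.g. a 104-year calibration window (1850-1953)
val = slice(104, None)  # remaining 50 years withheld for validation

# 1) Standardize each proxy over the calibration interval.
mu, sd = proxies[cal].mean(axis=0), proxies[cal].std(axis=0)
z = (proxies - mu) / sd

# 2) Weight each proxy by its calibration-period correlation with the target.
w = np.array([np.corrcoef(z[cal, i], target[cal])[0, 1] for i in range(z.shape[1])])
composite = z @ w / np.abs(w).sum()

# 3) Scale the composite to the target's calibration mean and variance.
recon = (composite - composite[cal].mean()) / composite[cal].std() \
        * target[cal].std() + target[cal].mean()

# Reduction of Error (RE) skill score over the withheld validation years.
re = 1.0 - ((target[val] - recon[val]) ** 2).sum() \
         / ((target[val] - target[cal].mean()) ** 2).sum()
```

    Sliding the calibration window forward one year at a time, as in the paper, would simply repeat these three steps for each window and collect the resulting ensemble of reconstructions.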

  13. Modeling, simulation, and analysis of optical remote sensing systems

    NASA Technical Reports Server (NTRS)

    Kerekes, John Paul; Landgrebe, David A.

    1989-01-01

    Remote sensing of the Earth's resources from space-based sensors has evolved over the past 20 years from a scientific experiment to a commonly used technological tool. The scientific applications and engineering aspects of remote sensing systems have been studied extensively. However, most of these studies have been aimed at understanding individual aspects of the remote sensing process, while relatively few have studied their interrelations. A motivation for studying these interrelationships has arisen with the advent of highly sophisticated configurable sensors as part of the Earth Observing System (EOS) proposed by NASA for the 1990s. Two approaches to investigating remote sensing systems are developed. In one approach, detailed models of the scene, the sensor, and the processing aspects of the system are implemented in a discrete simulation. This approach is useful in creating simulated images with desired characteristics for use in sensor or processing algorithm development. A less complete, but computationally simpler, method based on a parametric model of the system is also developed. In this analytical model the various informational classes are parameterized by their spectral mean vector and covariance matrix. These class statistics are modified by models for the atmosphere, the sensor, and processing algorithms, and an estimate is made of the resulting classification accuracy among the informational classes. These models are applied to the study of the proposed High Resolution Imaging Spectrometer (HRIS). The interrelationships among observational conditions, sensor effects, and processing choices are investigated with several interesting results.

  14. Simulating hydrodynamics and ice cover in Lake Erie using an unstructured grid model

    NASA Astrophysics Data System (ADS)

    Fujisaki-Manome, A.; Wang, J.

    2016-02-01

    An unstructured grid Finite-Volume Coastal Ocean Model (FVCOM) is applied to Lake Erie to simulate seasonal ice cover. The model is coupled with an unstructured-grid, finite-volume version of the Los Alamos Sea Ice Model (UG-CICE). We replaced the original two-time-level forward Euler time-integration scheme with the central-difference (i.e., leapfrog) scheme to ensure neutral inertial stability. The modified version of FVCOM coupled with the ice model is applied to the shallow freshwater lake in this study, using unstructured grids to represent the complicated coastline of the Laurentian Great Lakes and to refine the spatial resolution locally. We conducted multi-year simulations in Lake Erie from 2002 to 2013. The results were compared with the observed ice extent, water surface temperature, ice thickness, currents, and water temperature profiles. Seasonal and interannual variations of ice extent and water temperature were captured reasonably well, while the modeled thermocline was somewhat diffusive. The modeled ice thickness tends to be systematically thinner than the observed values. The modeled lake currents compared well with measurements obtained from an Acoustic Doppler Current Profiler located in the deep part of the lake, whereas the simulated currents deviated from measurements near the surface, possibly due to the model's inability to reproduce the sharp summer thermocline and the lack of detailed representation of offshore wind fields in the interpolated meteorological forcing.
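    The stability property motivating that change can be illustrated with an inertial oscillation, for which the leapfrog scheme is neutral (a two-level forward scheme slowly amplifies such oscillations). A minimal sketch with illustrative parameters, not FVCOM code:

```python
import numpy as np

f = 1.0e-4   # Coriolis parameter (s^-1)
dt = 600.0   # time step (s); f*dt = 0.06 << 1
n = 2000

# Inertial oscillation: du/dt = f*v, dv/dt = -f*u.
u, v = np.empty(n), np.empty(n)
u[0], v[0] = 1.0, 0.0
# One forward Euler step to start the three-time-level scheme.
u[1] = u[0] + dt * f * v[0]
v[1] = v[0] - dt * f * u[0]
for k in range(1, n - 1):
    # Leapfrog: centred difference over 2*dt; |amplification| = 1
    # for oscillatory solutions, so the amplitude is preserved.
    u[k + 1] = u[k - 1] + 2.0 * dt * f * v[k]
    v[k + 1] = v[k - 1] - 2.0 * dt * f * u[k]

energy = u**2 + v**2  # remains close to its initial value of 1
```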

  15. A View on Future Building System Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  16. RAM simulation model for SPH/RSV systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, J.C.; Primm, A.H.; Nelson, S.C.

    1995-12-31

    The US Army's Project Manager, Crusader is sponsoring the development of technologies that apply to the Self-Propelled Howitzer (SPH), formerly the Advanced Field Artillery System (AFAS), and Resupply Vehicle (RSV), formerly the Future Armored Resupply Vehicle (FARV), weapon system. Oak Ridge National Laboratory (ORNL) is currently performing developmental work in support of the SPH/RSV Crusader system. Supportive analyses of reliability, availability, and maintainability (RAM) aspects were also performed for the SPH/RSV effort. During FY 1994 and FY 1995, ORNL conducted a feasibility study to demonstrate the application of simulation modeling for RAM analysis of the Crusader system. Following completion of the feasibility study, a full-scale RAM simulation model of the Crusader system was developed for both the SPH and RSV. This report provides documentation for the simulation model as well as instructions in the proper execution and utilization of the model for the conduct of RAM analyses.

  17. Assessing Model Data Fit of Unidimensional Item Response Theory Models in Simulated Data

    ERIC Educational Resources Information Center

    Kose, Ibrahim Alper

    2014-01-01

    The purpose of this paper is to give an example of how to assess the model-data fit of unidimensional IRT models in simulated data. The present research also aims to explain the importance of fit and the consequences of misfit using simulated data sets. Responses of 1000 examinees to a dichotomously scored 20-item test were simulated with 25…
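    Simulating such dichotomous responses is straightforward under, for example, a two-parameter logistic (2PL) model; the sketch below uses illustrative parameter ranges, since the paper's exact generating model is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(7)

# 1000 examinees responding to a dichotomously scored 20-item test.
n_persons, n_items = 1000, 20
theta = rng.normal(0.0, 1.0, n_persons)  # latent abilities
a = rng.uniform(0.5, 2.0, n_items)       # item discriminations
b = rng.normal(0.0, 1.0, n_items)        # item difficulties

# 2PL probability of a correct response, then Bernoulli draws.
p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))
responses = (rng.random((n_persons, n_items)) < p).astype(int)
```

    Fit assessment would then compare the observed item-response patterns in `responses` against those expected under the fitted model.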

  18. Estimating solar radiation for plant simulation models

    NASA Technical Reports Server (NTRS)

    Hodges, T.; French, V.; Leduc, S.

    1985-01-01

    Five algorithms producing daily solar radiation surrogates from daily temperatures and rainfall were evaluated against measured solar radiation data for seven U.S. locations. The algorithms were compared both in terms of the accuracy of the daily solar radiation estimates and in terms of the response when the estimates were used in a plant growth simulation model (CERES-wheat). Requirements for the accuracy of solar radiation inputs to plant growth simulation models are discussed. One algorithm is recommended as best suited for use in these models when neither measured nor satellite-estimated solar radiation values are available.
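    One widely used family of such surrogates relates daily solar radiation to the diurnal temperature range (clear days warm more and cool more at night). A Bristow-Campbell-style sketch with illustrative coefficients, not necessarily one of the five algorithms the paper evaluates:

```python
import math

# Illustrative Bristow-Campbell-style coefficients.
A, B, C = 0.7, 0.004, 2.4

def solar_fraction(tmax_c, tmin_c):
    """Estimated fraction of extraterrestrial radiation reaching the surface,
    as a function of the diurnal temperature range (deg C)."""
    dtr = tmax_c - tmin_c
    return A * (1.0 - math.exp(-B * dtr ** C))

# A small temperature range (overcast/rainy day) implies less radiation
# than a large range (clear day):
hazy = solar_fraction(24.0, 20.0)    # 4 C range
clear = solar_fraction(32.0, 15.0)   # 17 C range
```

    Multiplying the fraction by the top-of-atmosphere radiation for the site and day of year would give the daily surrogate value.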

  19. A Simulation and Modeling Framework for Space Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  20. Evaluation of articulation simulation system using artificial maxillectomy models.

    PubMed

    Elbashti, M E; Hattori, M; Sumita, Y I; Taniguchi, H

    2015-09-01

    Acoustic evaluation is valuable for guiding the treatment of maxillofacial defects and determining the effectiveness of rehabilitation with an obturator prosthesis. Model simulations are important in terms of pre-surgical planning and pre- and post-operative speech function. This study aimed to evaluate the acoustic characteristics of voice generated by an articulation simulation system using a vocal tract model with or without artificial maxillectomy defects. More specifically, we aimed to establish a speech simulation system for maxillectomy defect models that both surgeons and maxillofacial prosthodontists can use in guiding treatment planning. Artificially simulated maxillectomy defects were prepared according to Aramany's classification (Classes I-VI) in a three-dimensional vocal tract plaster model of a subject uttering the vowel /a/. Formant and nasalance acoustic data were analysed using Computerized Speech Lab and the Nasometer, respectively. Formants and nasalance of simulated /a/ sounds were successfully detected and analysed. Values of Formants 1 and 2 for the non-defect model were 675.43 and 976.64 Hz, respectively. Median values of Formants 1 and 2 for the defect models were 634.36 and 1026.84 Hz, respectively. Nasalance was 11% in the non-defect model, whereas median nasalance was 28% in the defect models. The results suggest that an articulation simulation system can be used to help surgeons and maxillofacial prosthodontists to plan post-surgical defects that will facilitate maxillofacial rehabilitation. © 2015 John Wiley & Sons Ltd.

  1. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    NASA Astrophysics Data System (ADS)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which, when applied to SPF, provides a host of benefits, including accurate prediction of strain levels in a part and the presence of wrinkles, as well as pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform has applied, and continues to apply, to successfully superplastically form complex-shaped parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  2. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially the optimization of signal flow in a security company. Simul8 was selected as the modeling tool; it performs process modeling based on discrete event simulation and enables the creation of visual models of production and distribution processes.

  3. Documenting Climate Models and Their Simulations

    DOE PAGES

    Guilyardi, Eric; Balaji, V.; Lawrence, Bryan; ...

    2013-05-01

    The results of climate models are of increasing and widespread importance. No longer is climate model output of sole interest to climate scientists and researchers in the climate change impacts and adaptation fields. Now nonspecialists such as government officials, policy makers, and the general public all have an increasing need to access climate model output and understand its implications. For this host of users, accurate and complete metadata (i.e., information about how and why the data were produced) is required to document the climate modeling results. We describe a pilot community initiative to collect and make available documentation of climate models and their simulations. In an initial application, a metadata repository is being established to provide information of this kind for a major internationally coordinated modeling activity known as CMIP5 (Coupled Model Intercomparison Project, Phase 5). We expect that for a wide range of stakeholders, this and similar community-managed metadata repositories will spur development of analysis tools that facilitate discovery and exploitation of Earth system simulations.

  4. Facebook's personal page modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we try to assess the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The model has been developed for a social media marketing agent/company, oriented to the Facebook platform and tested in real circumstances. The model was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are authenticated by the management of the company. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the company's total profit: bringing in new customers, keeping the interest of existing customers, and delivering traffic to its website.

  5. Core discrete event simulation model for the evaluation of health care technologies in major depressive disorder.

    PubMed

    Vataire, Anne-Lise; Aballéa, Samuel; Antonanzas, Fernando; Roijen, Leona Hakkaart-van; Lam, Raymond W; McCrone, Paul; Persson, Ulf; Toumi, Mondher

    2014-03-01

    A review of existing economic models in major depressive disorder (MDD) highlighted the need for models with longer time horizons that also account for heterogeneity in treatment pathways between patients. A core discrete event simulation model was developed to estimate health and cost outcomes associated with alternative treatment strategies. This model simulated short- and long-term clinical events (partial response, remission, relapse, recovery, and recurrence), adverse events, and treatment changes (titration, switch, addition, and discontinuation) over up to 5 years. Several treatment pathways were defined on the basis of fictitious antidepressants with three levels of efficacy, tolerability, and price (low, medium, and high) from first line to third line. The model was populated with input data from the literature for the UK setting. Model outputs include time in different health states, quality-adjusted life-years (QALYs), and costs from National Health Service and societal perspectives. The codes are open source. Predicted costs and QALYs from this model are within the range of results from previous economic evaluations. The largest cost components from the payer perspective were physician visits and hospitalizations. Key parameters driving the predicted costs and QALYs were utility values, effectiveness, and frequency of physician visits. Differences in QALYs and costs between two strategies with different effectiveness increased approximately twofold when the time horizon increased from 1 to 5 years. The discrete event simulation model can provide a more comprehensive evaluation of different therapeutic options in MDD, compared with existing Markov models, and can be used to compare a wide range of health care technologies in various groups of patients with MDD. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
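    The event-scheduling logic of such a discrete event simulation can be sketched as follows (a minimal two-state remission/relapse patient with illustrative rates, not the published model's pathways or parameters):

```python
import heapq
import random

random.seed(1)

# Illustrative daily transition rates (not the published model's values).
REMISSION_RATE = 1 / 90   # mean 90 days from depression to remission
RELAPSE_RATE = 1 / 400    # mean 400 days from remission to relapse

def simulate_patient(horizon_days=5 * 365):
    """Return days spent in remission over a 5-year horizon for one patient."""
    events = [(random.expovariate(REMISSION_RATE), "remission")]
    state, t_prev, days_remitted = "depressed", 0.0, 0.0
    while events:
        t, name = heapq.heappop(events)  # next scheduled clinical event
        t = min(t, horizon_days)
        if state == "remitted":
            days_remitted += t - t_prev  # accumulate time in remission
        t_prev = t
        if t >= horizon_days:
            break
        if name == "remission":
            state = "remitted"
            heapq.heappush(events, (t + random.expovariate(RELAPSE_RATE), "relapse"))
        else:  # relapse
            state = "depressed"
            heapq.heappush(events, (t + random.expovariate(REMISSION_RATE), "remission"))
    return days_remitted

days = simulate_patient()
```

    A full model of this kind would attach utilities and costs to the time spent in each state, and add competing events such as discontinuation, switching, and recurrence.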

  6. Update of global TC simulations using a variable resolution non-hydrostatic model

    NASA Astrophysics Data System (ADS)

    Park, S. H.

    2017-12-01

    Tropical cyclone (TC) forecasts were simulated using variable-resolution meshes in MPAS during the summer of 2017. Two physics suites were tested to explore the performance and bias of each for TC forecasting. A WRF physics suite was selected based on experience in weather forecasting, and a CAM (Community Atmosphere Model) physics suite was taken from an AMIP-type climate simulation. Based on last year's results from the CAM5 physical parameterization package and comparison with the WRF physics, we investigated an intensity bias using an updated version of the CAM physics (CAM6). We also compared these results with a coupled version of the TC simulations. In this talk, TC structure will be compared, especially around the boundary layer, to investigate the relationship between TC intensity and the different physics packages.

  7. The effects of simulated patients and simulated gynecologic models on student anxiety in providing IUD services.

    PubMed

    Khadivzadeh, Talat; Erfanian, Fatemeh

    2012-10-01

    Midwifery students experience high levels of stress during their initial clinical practice. Addressing the learner's sources of anxiety and discomfort can ease the learning experience and lead to better outcomes. The aim of this study was to determine the effect of a simulation-based course, using simulated patients and simulated gynecologic models, on student anxiety and comfort while practicing to provide intrauterine device (IUD) services. Fifty-six eligible midwifery students were randomly allocated into simulation-based and traditional training groups. They participated in a 12-hour workshop on providing IUD services. The simulation group was trained through an educational program that included simulated gynecologic models and simulated patients. The students in both groups then practiced IUD consultation and insertion with real patients in the clinic. Student anxiety about IUD insertion was assessed using the "Spielberger anxiety test" and the "comfort in providing IUD services" questionnaire. There were significant differences between the simulation and traditional groups in two aspects of anxiety, state (P < 0.001) and trait (P = 0.024), and in the level of comfort (P < 0.001) in providing IUD services. "Fear of uterine perforation during insertion," reported by 74.34% of students, was the most important cause of student anxiety in providing IUD services. Simulated patients and simulated gynecologic models are effective in reducing students' anxiety levels when practicing to deliver IUD services. Therefore, it is recommended that simulated patients and simulated gynecologic models be used before engaging students in real clinical practice.

  8. RuleMonkey: software for stochastic simulation of rule-based models

    PubMed Central

    2010-01-01

    Background The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions RuleMonkey enables the simulation of
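    The stochastic logic underlying such simulators resembles Gillespie's direct method, which RuleMonkey applies at the level of rules and molecular sites rather than an enumerated network. A minimal direct-method sketch for a single reaction A + B → C (illustrative rate and counts, not RuleMonkey's API):

```python
import random

random.seed(42)

def gillespie(a, b, k=0.001, t_end=50.0):
    """Gillespie direct method for the single reaction A + B -> C."""
    t, c = 0.0, 0
    while t < t_end:
        propensity = k * a * b          # rate of the next reaction firing
        if propensity == 0:             # a reactant is exhausted
            break
        t += random.expovariate(propensity)  # exponential waiting time
        if t >= t_end:
            break
        a, b, c = a - 1, b - 1, c + 1   # fire the reaction
    return a, b, c

a, b, c = gillespie(100, 80)
```

    With many reaction rules, the direct method samples which rule fires next in proportion to its propensity; a network-free simulator computes those propensities from rule patterns on the fly instead of from a pre-generated reaction list.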

  9. Integrating meso- and micro-simulation models to evaluate traffic management strategies - year 1 : final report.

    DOT National Transportation Integrated Search

    2016-06-01

    In this project the researchers developed a hierarchical multi-resolution traffic simulation system for metropolitan areas, referred to as MetroSim. Categorically, the focus is on integrating two types of simulation: microscopic simulation in which i...

  10. Combining Simulation and Optimization Models for Hardwood Lumber Production

    Treesearch

    G.A. Mendoza; R.J. Meimban; W.G. Luppold; Philip A. Araman

    1991-01-01

    Published literature contains a number of optimization and simulation models dealing with the primary processing of hardwood and softwood logs. Simulation models have been developed primarily as descriptive models for characterizing the general operations and performance of a sawmill. Optimization models, on the other hand, were developed mainly as analytical tools for...

  11. Model-based surgical planning and simulation of cranial base surgery.

    PubMed

    Abe, M; Tabuchi, K; Goto, M; Uchino, A

    1998-11-01

    Plastic skull models of seven individual patients were fabricated by stereolithography from three-dimensional data based on computed tomography bone images. Skull models were utilized for neurosurgical planning and simulation in the seven patients with cranial base lesions that were difficult to remove. Surgical approaches and areas of craniotomy were evaluated using the fabricated skull models. In preoperative simulations, hand-made models of the tumors, major vessels and nerves were placed in the skull models. Step-by-step simulation of surgical procedures was performed using actual surgical tools. The advantages of using skull models to plan and simulate cranial base surgery include a better understanding of anatomic relationships, preoperative evaluation of the proposed procedure, increased understanding by the patient and family, and improved educational experiences for residents and other medical staff. The disadvantages of using skull models include the time and cost of making the models. The skull models provide a more realistic tool that is easier to handle than computer-graphic images. Surgical simulation using models facilitates difficult cranial base surgery and may help reduce surgical complications.

  12. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error by anywhere from 50 to 90 percent relative to existing approaches. The purpose of the methodology is to outline how accurate system models can be created from sensor (telemetry) data; the purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models, as well as the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: it creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior; it automatically updates/calibrates system models using the latest streaming sensor data; it creates device-specific models that capture the exact behavior of devices of the same type; it adapts to evolving systems; and it can reduce computational complexity (faster simulations).

  13. Twitter's tweet method modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper seeks to propose the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper leverages the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The models have been developed for a Twitter marketing agent/company and tested in real circumstances and with real numbers. These models were finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The paper also addresses the methods best suited to organized promotion through targeting on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are authenticated by the management of the company. The paper implements system dynamics concepts of Twitter marketing method modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.

  14. A Regional Model for Malaria Vector Developmental Habitats Evaluated Using Explicit, Pond-Resolving Surface Hydrology Simulations.

    PubMed

    Asare, Ernest Ohene; Tompkins, Adrian Mark; Bomblies, Arne

    2016-01-01

    Dynamical malaria models can relate precipitation to the availability of vector breeding sites using simple models of surface hydrology. Here, a revised scheme is developed for the VECTRI malaria model, which is evaluated alongside the default scheme using a two-year simulation by HYDREMATS, a 10-metre-resolution, village-scale model that explicitly simulates individual ponds. Despite the simplicity of the two VECTRI surface hydrology parametrization schemes, they can reproduce the sub-seasonal evolution of fractional water coverage. Calibration of the model parameters is required to simulate the mean pond fraction correctly. The default VECTRI model tended to overestimate water fraction in periods subject to light rainfall events and underestimate it during periods of intense rainfall. This systematic error was improved in the revised scheme by including a parametrization for surface run-off, such that light rainfall below the initial abstraction threshold does not contribute to ponds. After calibration of the pond model, the VECTRI model was able to simulate vector densities that compared well to the detailed agent-based model contained in HYDREMATS without further parameter adjustment. Substituting local rain-gauge data with satellite-retrieved precipitation gave a reasonable approximation, raising the prospects for regional malaria simulations even in data-sparse regions. However, further improvements could be made if a method can be derived to calibrate the key hydrology parameters of the pond model in each grid cell location, possibly also incorporating slope and soil texture.
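    The initial-abstraction idea of the revised scheme, where light rain below a threshold does not contribute to ponding, can be sketched as follows (parameter values and function names are illustrative, not the calibrated VECTRI scheme):

```python
# Illustrative parameters (not VECTRI's calibrated values).
IA_MM = 5.0          # initial abstraction: rain below this infiltrates
K_FILL = 0.01        # pond-fraction growth per mm of effective rain
LOSS_PER_DAY = 0.05  # daily decay from evaporation and infiltration

def update_pond_fraction(frac, daily_rain_mm):
    """One daily update of the gridcell fractional water coverage."""
    effective = max(0.0, daily_rain_mm - IA_MM)  # light rain -> no run-off
    frac += K_FILL * effective * (1.0 - frac)    # fill toward saturation
    frac -= LOSS_PER_DAY * frac                  # drying between rain events
    return min(max(frac, 0.0), 1.0)

frac = 0.0
for rain in [0, 2, 20, 0, 35, 1, 0]:  # mm/day
    frac = update_pond_fraction(frac, rain)
```

    Under this formulation a sequence of light rainfall days leaves the pond fraction unchanged, which is exactly the behavior that corrected the default scheme's overestimate during light-rain periods.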

  15. A Regional Model for Malaria Vector Developmental Habitats Evaluated Using Explicit, Pond-Resolving Surface Hydrology Simulations

    PubMed Central

    Asare, Ernest Ohene; Tompkins, Adrian Mark; Bomblies, Arne

    2016-01-01

    Dynamical malaria models can relate precipitation to the availability of vector breeding sites using simple models of surface hydrology. Here, a revised scheme is developed for the VECTRI malaria model, which is evaluated alongside the default scheme using a two-year simulation by HYDREMATS, a 10-metre-resolution, village-scale model that explicitly simulates individual ponds. Despite the simplicity of the two VECTRI surface hydrology parametrization schemes, they can reproduce the sub-seasonal evolution of fractional water coverage. Calibration of the model parameters is required to simulate the mean pond fraction correctly. The default VECTRI model tended to overestimate water fraction in periods subject to light rainfall events and underestimate it during periods of intense rainfall. This systematic error was improved in the revised scheme by including a parametrization for surface run-off, such that light rainfall below the initial abstraction threshold does not contribute to ponds. After calibration of the pond model, the VECTRI model was able to simulate vector densities that compared well to the detailed agent-based model contained in HYDREMATS without further parameter adjustment. Substituting local rain-gauge data with satellite-retrieved precipitation gave a reasonable approximation, raising the prospects for regional malaria simulations even in data-sparse regions. However, further improvements could be made if a method can be derived to calibrate the key hydrology parameters of the pond model in each grid cell location, possibly also incorporating slope and soil texture. PMID:27003834
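    The initial-abstraction rule described in the revised scheme (light rainfall below a threshold contributes nothing to ponds) can be illustrated in a few lines. The threshold and run-off coefficient values below are placeholder assumptions for illustration, not VECTRI's calibrated parameters.

```python
# Hypothetical sketch of an initial-abstraction run-off rule of the kind
# described for the revised pond scheme. The parameter values and the
# linear run-off rule are illustrative assumptions, not the actual
# VECTRI formulation.
def pond_input(rain_mm, initial_abstraction_mm=5.0, runoff_coeff=0.8):
    """Water depth (mm) contributing to ponds for one rainfall event."""
    effective = max(rain_mm - initial_abstraction_mm, 0.0)
    return runoff_coeff * effective

# Light rainfall below the abstraction threshold contributes nothing:
light = pond_input(3.0)   # 0.0
heavy = pond_input(15.0)  # 0.8 * (15 - 5) = 8.0
```

    This reproduces the qualitative behaviour the abstract reports: the default scheme's overestimation under light rain disappears once sub-threshold events are discarded.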

  16. Using Computational Simulations to Confront Students' Mental Models

    ERIC Educational Resources Information Center

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  17. Simulation of the global ocean thermohaline circulation with an eddy-resolving INMIO model configuration

    NASA Astrophysics Data System (ADS)

    Ushakov, K. V.; Ibrayev, R. A.

    2017-11-01

    This paper presents the first results of a simulation of mean World Ocean thermohaline characteristics obtained with the INMIO ocean general circulation model, configured at 0.1 degree resolution, in a 5-year numerical experiment following the CORE-II protocol. The horizontal and zonal mean distributions of the solution bias against the WOA09 data are analyzed. The seasonal cycle of heat content at a specified site in the North Atlantic is also discussed. The simulation results demonstrate a clear improvement in the quality of representation of the upper ocean compared with the results of experiments with 0.5 and 0.25 degree model configurations. Some remaining biases of the model solution and possible ways of overcoming them are highlighted.

  18. Project Shuttle simulation math model coordination catalog, revision 1

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A catalog is presented of subsystem and environment math models used or planned for space shuttle simulations. The purpose is to facilitate sharing of similar math models between shuttle simulations. It provides information on math model requirements, formulations, schedules, and contact persons for further information.

  19. 3D printed simulation models based on real patient situations for hands-on practice.

    PubMed

    Kröger, E; Dekiff, M; Dirksen, D

    2017-11-01

    During the last few years, the curriculum of many dentistry schools in Germany has been reorganised. Two key aspects of the applied changes are the integration of up-to-date teaching methods and the promotion of interdisciplinarity. To support these efforts, an approach to fabricating individualised simulation models for hands-on courses employing 3D printing is presented. The models are based on real patients, thus providing students a more realistic preparation for real clinical situations. As a wide variety of dental procedures can be implemented, the simulation models can also contribute to a more interdisciplinary dental education. The data used for the construction of the models were acquired by 3D surface scanning. The data were further processed with 3D modelling software. Afterwards, the models were fabricated by 3D printing with the PolyJet technique. Three models serve as examples: a prosthodontic model for training veneer preparation, a conservative model for practicing dental bonding and an interdisciplinary model featuring carious teeth and an insufficient crown. The third model was evaluated in a hands-on course with 22 fourth-year dental students. The students answered a questionnaire and gave their personal opinion. Whilst the concept of the model received very positive feedback, some aspects of the implementation were criticised. We discuss these observations and suggest ways for further improvement. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. An Open Simulation System Model for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Williams, Anthony D.

    1995-01-01

    A model for a generic and open environment for running multi-code or multi-application simulations - called the Open Simulation System Model (OSSM) - is proposed and defined. This model attempts to meet the requirements of complex systems like the Numerical Propulsion Simulator System (NPSS). OSSM places no restrictions on the types of applications that can be integrated at any stage of its evolution. This includes applications of different disciplines, fidelities, etc. An implementation strategy is proposed that starts with a basic prototype and evolves over time to accommodate an increasing number of applications. Potential (standard) software is also identified which may aid in the design and implementation of the system.

  1. Transient climate simulations of the deglaciation 21-9 thousand years before present (version 1) - PMIP4 Core experiment design and boundary conditions

    NASA Astrophysics Data System (ADS)

    Ivanovic, Ruza F.; Gregoire, Lauren J.; Kageyama, Masa; Roche, Didier M.; Valdes, Paul J.; Burke, Andrea; Drummond, Rosemarie; Peltier, W. Richard; Tarasov, Lev

    2016-07-01

    The last deglaciation, which marked the transition between the last glacial and present interglacial periods, was punctuated by a series of rapid (centennial and decadal) climate changes. Numerical climate models are useful for investigating mechanisms that underpin the climate change events, especially now that some of the complex models can be run for multiple millennia. We have set up a Paleoclimate Modelling Intercomparison Project (PMIP) working group to coordinate efforts to run transient simulations of the last deglaciation, and to facilitate the dissemination of expertise between modellers and those engaged with reconstructing the climate of the last 21 000 years. Here, we present the design of a coordinated Core experiment over the period 21-9 thousand years before present (ka) with time-varying orbital forcing, greenhouse gases, ice sheets and other geographical changes. A choice of two ice sheet reconstructions is given, and we make recommendations for prescribing ice meltwater (or not) in the Core experiment. Additional focussed simulations will also be coordinated on an ad hoc basis by the working group, for example to investigate more thoroughly the effect of ice meltwater on climate system evolution, and to examine the uncertainty in other forcings. Some of these focussed simulations will target shorter durations around specific events in order to understand them in more detail and allow for the more computationally expensive models to take part.

  2. Using Radiation Risk Models in Cancer Screening Simulations: Important Assumptions and Effects on Outcome Projections

    PubMed Central

    Lee, Janie M.; McMahon, Pamela M.; Lowry, Kathryn P.; Omer, Zehra B.; Eisenberg, Jonathan D.; Pandharipande, Pari V.; Gazelle, G. Scott

    2012-01-01

    Purpose: To evaluate the effect of incorporating radiation risk into microsimulation (first-order Monte Carlo) models for breast and lung cancer screening to illustrate effects of including radiation risk on patient outcome projections. Materials and Methods: All data used in this study were derived from publicly available or deidentified human subject data. Institutional review board approval was not required. The challenges of incorporating radiation risk into simulation models are illustrated with two cancer screening models (Breast Cancer Model and Lung Cancer Policy Model) adapted to include radiation exposure effects from mammography and chest computed tomography (CT), respectively. The primary outcome projected by the breast model was life expectancy (LE) for BRCA1 mutation carriers. Digital mammographic screening beginning at ages 25, 30, 35, and 40 years was evaluated in the context of screenings with false-positive results and radiation exposure effects. The primary outcome of the lung model was lung cancer–specific mortality reduction due to annual screening, comparing two diagnostic CT protocols for lung nodule evaluation. The Metropolis-Hastings algorithm was used to estimate the mean values of the results with 95% uncertainty intervals (UIs). Results: Without radiation exposure effects, the breast model indicated that annual digital mammography starting at age 25 years maximized LE (72.03 years; 95% UI: 72.01 years, 72.05 years) and had the highest number of screenings with false-positive results (2.0 per woman). When radiation effects were included, annual digital mammography beginning at age 30 years maximized LE (71.90 years; 95% UI: 71.87 years, 71.94 years) with a lower number of screenings with false-positive results (1.4 per woman). For annual chest CT screening of 50-year-old females with no follow-up for nodules smaller than 4 mm in diameter, the lung model predicted lung cancer–specific mortality reduction of 21.50% (95% UI: 20.90%, 22
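    The Metropolis-Hastings algorithm named above for estimating means with 95% uncertainty intervals can be sketched generically. The Gaussian target, step size, and seed below are illustrative choices, not the screening models' actual setup.

```python
import math
import random

# Sketch of a random-walk Metropolis-Hastings sampler. The target density,
# proposal step and seed are illustrative assumptions for demonstration.
def metropolis_hastings(log_target, x0, n_samples=10000, step=0.5, seed=1):
    """Draw n_samples from exp(log_target) via a symmetric random walk."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)            # symmetric proposal
        lp_prop = log_target(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject step
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Example: sample a standard normal; mean and an empirical 95% interval
# then come directly from the draws, as in the abstract's usage.
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(draws) / len(draws)
lo, hi = sorted(draws)[250], sorted(draws)[9750]
```

    In the screening models, the same machinery is applied to simulated outcome distributions (e.g. life expectancy) rather than a closed-form target.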

  3. Standard for Models and Simulations

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2016-01-01

    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure that essential requirements are applied to the design, development, and use of models and simulations (M&S), while ensuring that acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which M&S may be developed, accepted, and used in support of NASA activities. As the M&S disciplines employed and application areas involved are broad, the common aspects of M&S across all NASA activities are addressed. The discipline-specific details of a given M&S should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with M&S-influenced decisions by ensuring the complete communication of the credibility of M&S results.

  4. Multi-year assessment of soil-vegetation-atmosphere transfer (SVAT) modeling uncertainties over a Mediterranean agricultural site

    NASA Astrophysics Data System (ADS)

    Garrigues, S.; Olioso, A.; Calvet, J.-C.; Lafont, S.; Martin, E.; Chanzy, A.; Marloie, O.; Bertrand, N.; Desfonds, V.; Renard, D.

    2012-04-01

    Vegetation productivity and water balance of Mediterranean regions will be particularly affected by climate and land-use changes. In order to analyze and predict these changes through land surface models, a critical step is to quantify the uncertainties associated with these models (processes, parameters) and their implementation over a long period of time. In addition, uncertainties in the data used to force these models (atmospheric forcing, vegetation and soil characteristics, crop management practices...), which are generally available at coarse spatial resolution (>1-10 km) and for a limited number of plant functional types, need to be evaluated. This paper aims at assessing the uncertainties in water (evapotranspiration) and energy fluxes estimated from a Soil Vegetation Atmosphere Transfer (SVAT) model over a Mediterranean agricultural site. While similar past studies focused on particular crop types and limited periods of time, the originality of this paper lies in implementing the SVAT model and assessing its uncertainties over a long period (10 years), encompassing several cycles of distinct crops (wheat, sorghum, sunflower, peas). The impacts on the SVAT simulations of the following sources of uncertainty are characterized: - Uncertainties in atmospheric forcing are assessed by comparing simulations forced with local meteorological measurements and simulations forced with a re-analysis atmospheric dataset (SAFRAN database). - Uncertainties in key surface characteristics (soil, vegetation, crop management practices) are tested by comparing simulations fed with standard values from global databases (e.g. ECOCLIMAP) and simulations based on in situ or site-calibrated values. - Uncertainties due to the implementation of the SVAT model over a long period of time are analyzed with regard to crop rotation. The SVAT model analyzed in this paper is ISBA in its a-gs version, which simulates photosynthesis and its coupling with the stomata

  5. Piloted Evaluation of a UH-60 Mixer Equivalent Turbulence Simulation Model

    NASA Technical Reports Server (NTRS)

    Lusardi, Jeff A.; Blanken, Chris L.; Tischeler, Mark B.

    2002-01-01

    A simulation study of a recently developed hover/low speed Mixer Equivalent Turbulence Simulation (METS) model for the UH-60 Black Hawk helicopter was conducted in the NASA Ames Research Center Vertical Motion Simulator (VMS). The experiment was a continuation of previous work to develop a simple, but validated, turbulence model for hovering rotorcraft. To validate the METS model, two experienced test pilots replicated precision hover tasks that had been conducted in an instrumented UH-60 helicopter in turbulence. Objective simulation data were collected for comparison with flight test data, and subjective data were collected that included handling qualities ratings and pilot comments for increasing levels of turbulence. Analyses of the simulation results show good analytic agreement between the METS model and flight test data, with favorable pilot perception of the simulated turbulence. Precision hover tasks were also repeated using the more complex rotating-frame SORBET (Simulation Of Rotor Blade Element Turbulence) model to generate turbulence. Comparisons of the empirically derived METS model with the theoretical SORBET model show good agreement providing validation of the more complex blade element method of simulating turbulence.

  6. Year-long simulation of gaseous and particulate air pollutants in India

    NASA Astrophysics Data System (ADS)

    Kota, Sri Harsha; Guo, Hao; Myllyvirta, Lauri; Hu, Jianlin; Sahu, Shovan Kumar; Garaga, Rajyalakshmi; Ying, Qi; Gao, Aifang; Dahiya, Sunil; Wang, Yuan; Zhang, Hongliang

    2018-05-01

    Severe pollution events occur frequently in India, but few studies have investigated the characteristics, sources, and control strategies for the whole country. A year-long simulation was carried out for India to provide detailed information on the spatial and temporal distribution of gas species and particulate matter (PM). The concentrations of O3, NO2, SO2 and CO, as well as PM2.5 and its components, in 2015 were predicted using the Weather Research and Forecasting (WRF) and Community Multiscale Air Quality (CMAQ) models. Model performance was validated against available observations from ground-based national ambient air quality monitoring stations in major cities. Model performance for O3 does not always meet the criteria suggested by the US Environmental Protection Agency (EPA), but that for PM2.5 meets the criteria suggested by previous studies. Model performance was better on days with high O3 and PM2.5 levels. Concentrations of PM2.5, NO2, CO and SO2 were highest in the Indo-Gangetic region, including northern and eastern India. PM2.5 concentrations were higher during winter and lower during the monsoon season. Winter nitrate concentrations were 160-230% higher than the yearly average. In contrast, the fraction of sulfate in total PM2.5 was highest in the monsoon season and lowest in winter, due to the decrease in temperature and solar radiation intensity in winter. Except in southern India, where sulfate was the major component of PM2.5, the primary organic aerosol (POA) fraction in PM2.5 was highest in all regions of the country. Fractions of secondary components were higher on bad days than on good days, indicating the importance of controlling precursors of secondary pollutants in India.
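    Performance statements of the kind made above (meeting or missing suggested criteria) are usually expressed with normalised metrics. A minimal sketch of two common ones follows; the actual pollutant-by-pollutant thresholds from the EPA guidance are not reproduced here.

```python
# Normalised mean bias (NMB) and normalised mean error (NME), two metrics
# commonly compared against benchmark criteria when evaluating air quality
# model predictions at monitoring stations.
def nmb_nme(obs, pred):
    """Return (NMB, NME) for paired observed and predicted concentrations."""
    denom = sum(obs)
    nmb = sum(p - o for o, p in zip(obs, pred)) / denom
    nme = sum(abs(p - o) for o, p in zip(obs, pred)) / denom
    return nmb, nme

# Example with made-up concentrations (units cancel in both metrics):
nmb, nme = nmb_nme([10.0, 20.0, 30.0], [12.0, 18.0, 33.0])
```

    A model is then judged by whether |NMB| and NME fall below the benchmark values for the pollutant in question.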

  7. Global Climate Model Simulated Hydrologic Droughts and Floods in the Nelson-Churchill Watershed

    NASA Astrophysics Data System (ADS)

    Vieira, M. J. F.; Stadnyk, T. A.; Koenig, K. A.

    2014-12-01

    There is uncertainty surrounding the duration, magnitude and frequency of historical hydroclimatic extremes such as hydrologic droughts and floods prior to the observed record. In regions where paleoclimatic studies are less reliable, Global Climate Models (GCMs) can provide useful information about past hydroclimatic conditions. This study evaluates the use of Coupled Model Intercomparison Project 5 (CMIP5) GCMs to enhance the understanding of historical droughts and floods across the Canadian Prairie region in the Nelson-Churchill Watershed (NCW). The NCW is approximately 1.4 million km2 in size and drains into Hudson Bay in Northern Manitoba, Canada. One hundred years of observed hydrologic records show extended dry and wet periods in this region; however paleoclimatic studies suggest that longer, more severe droughts have occurred in the past. In Manitoba, where hydropower is the primary source of electricity, droughts are of particular interest as they are important for future resource planning. Twenty-three GCMs with daily runoff are evaluated using 16 metrics for skill in reproducing historic annual runoff patterns. A common 56-year historic period of 1950-2005 is used for this evaluation to capture wet and dry periods. GCM runoff is then routed at a grid resolution of 0.25° using the WATFLOOD hydrological model storage-routing algorithm to develop streamflow scenarios. Reservoir operation is naturalized and a consistent temperature scenario is used to determine ice-on and ice-off conditions. These streamflow simulations are compared with the historic record to remove bias using quantile mapping of empirical distribution functions. GCM runoff data from pre-industrial and future projection experiments are also bias corrected to obtain extended streamflow simulations. GCM streamflow simulations of more than 650 years include a stationary (pre-industrial) period and future periods forced by radiative forcing scenarios. Quantile mapping adjusts for magnitude
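    The bias removal by quantile mapping of empirical distribution functions described above can be sketched in plain Python. The linear interpolation between sorted observed values is an implementation choice for illustration; the study's actual scheme is not specified here.

```python
from bisect import bisect_right

# Empirical quantile mapping: find a value's quantile within the simulated
# distribution, then return the value at the same quantile of the observed
# distribution (linearly interpolated between sorted observations).
def quantile_map(sim, obs, value):
    """Map one simulated value onto the observed distribution."""
    s, o = sorted(sim), sorted(obs)
    q = bisect_right(s, value) / len(s)        # empirical quantile in sim
    pos = min(q * (len(o) - 1), len(o) - 1.0)  # same quantile position in obs
    i = int(pos)
    frac = pos - i
    j = min(i + 1, len(o) - 1)
    return o[i] + frac * (o[j] - o[i])

# A simulation biased high by 10 units maps back onto the observed range:
obs = [float(x) for x in range(100)]
sim = [x + 10.0 for x in obs]
corrected = quantile_map(sim, obs, 60.0)
```

    Applied pointwise to a GCM streamflow series, this removes systematic magnitude bias while preserving the ranking of wet and dry periods.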

  8. XCAT/DRASIM: a realistic CT/human-model simulation package

    NASA Astrophysics Data System (ADS)

    Fung, George S. K.; Stierstorfer, Karl; Segars, W. Paul; Taguchi, Katsuyuki; Flohr, Thomas G.; Tsui, Benjamin M. W.

    2011-03-01

    The aim of this research is to develop a complete CT/human-model simulation package by integrating the 4D eXtended CArdiac-Torso (XCAT) phantom, a computer generated NURBS surface based phantom that provides a realistic model of human anatomy and respiratory and cardiac motions, and the DRASIM (Siemens Healthcare) CT-data simulation program. Unlike other CT simulation tools which are based on simple mathematical primitives or voxelized phantoms, this new simulation package has the advantages of utilizing a realistic model of human anatomy and physiological motions without voxelization and with accurate modeling of the characteristics of clinical Siemens CT systems. First, we incorporated the 4D XCAT anatomy and motion models into DRASIM by implementing a new library which consists of functions to read-in the NURBS surfaces of anatomical objects and their overlapping order and material properties in the XCAT phantom. Second, we incorporated an efficient ray-tracing algorithm for line integral calculation in DRASIM by computing the intersection points of the rays cast from the x-ray source to the detector elements through the NURBS surfaces of the multiple XCAT anatomical objects along the ray paths. Third, we evaluated the integrated simulation package by performing a number of sample simulations of multiple x-ray projections from different views followed by image reconstruction. The initial simulation results were found to be promising by qualitative evaluation. In conclusion, we have developed a unique CT/human-model simulation package which has great potential as a tool in the design and optimization of CT scanners, and the development of scanning protocols and image reconstruction methods for improving CT image quality and reducing radiation dose.

  9. A Review of Numerical Simulation and Analytical Modeling for Medical Devices Safety in MRI

    PubMed Central

    Kabil, J.; Belguerras, L.; Trattnig, S.; Pasquier, C.; Missoffe, A.

    2016-01-01

    Objectives: To review past and present challenges and ongoing trends in numerical simulation for MRI (Magnetic Resonance Imaging) safety evaluation of medical devices. Methods: A wide literature review on numerical and analytical simulation of simple or complex medical devices in MRI electromagnetic fields shows the evolution through time and a growing concern for MRI safety over the years. Major issues and achievements are described, as well as current trends and perspectives in this research field. Results: Numerical simulation of medical devices is constantly evolving, supported by now well-established calculation methods. Implants with simple geometry can often be simulated in a computational human model, but one issue remaining today is the experimental validation of these human models. A great concern is to assess RF heating of implants too complex to be simulated traditionally, like pacemaker leads. Thus, ongoing research focuses on alternative hybrid methods, both numerical and experimental, with, for example, a transfer function method. For the static and gradient fields, analytical models can be used for dimensioning simple implant shapes, but are limited for complex geometries that cannot be studied with simplifying assumptions. Conclusions: Numerical simulation is an essential tool for MRI safety testing of medical devices. The main issues remain the accuracy of simulations compared to real life and the study of complex devices; but as the research field is constantly evolving, some promising ideas are now under investigation to take up these challenges. PMID:27830244
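    The transfer function method mentioned for pacemaker leads relates tip heating to the tangential electric field sampled along the lead path. A schematic sketch, with the proportionality constant omitted and the discretisation chosen for illustration:

```python
# Schematic transfer-function estimate of lead-tip RF heating: the heating
# scales as |sum over z of S(z) * E_tan(z) * dz|^2 along the lead path,
# where S is the lead's (complex) transfer function and E_tan the tangential
# incident field. The overall proportionality constant is omitted here.
def tip_heating(S, E_tan, dz):
    """Relative tip heating for sampled S and E_tan (same length), step dz."""
    integral = sum(s * e for s, e in zip(S, E_tan)) * dz
    return abs(integral) ** 2

# Uniform field and flat transfer function along a 10-sample lead:
relative = tip_heating([1 + 0j] * 10, [1 + 0j] * 10, 0.01)
```

    The attraction of the approach is that S is measured once per lead model, after which heating can be predicted for any incident field without re-simulating the full device.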

  10. Extended behavioural device modelling and circuit simulation with Qucs-S

    NASA Astrophysics Data System (ADS)

    Brinson, M. E.; Kuznetsov, V.

    2018-03-01

    Current trends in circuit simulation suggest a growing interest in open source software that allows access to more than one simulation engine while simultaneously supporting schematic drawing tools, behavioural Verilog-A and XSPICE component modelling, and output data post-processing. This article introduces a number of new features recently implemented in the 'Quite universal circuit simulator - SPICE variant' (Qucs-S), including structure and fundamental schematic capture algorithms, at the same time highlighting their use in behavioural semiconductor device modelling. Particular importance is placed on the interaction between Qucs-S schematics, equation-defined devices, SPICE B behavioural sources and hardware description language (HDL) scripts. The multi-simulator version of Qucs is a freely available tool that offers extended modelling and simulation features compared to those provided by legacy circuit simulators. The performance of a number of Qucs-S modelling extensions is demonstrated with a GaN HEMT compact device model and data obtained from tests using the Qucs-S/Ngspice/Xyce/SPICE OPUS multi-engine circuit simulator.

  11. Assessment of Flood Mitigation Solutions Using a Hydrological Model and Refined 2D Hydrodynamic Simulations

    NASA Astrophysics Data System (ADS)

    Khuat Duy, B.; Archambeau, P.; Dewals, B. J.; Erpicum, S.; Pirotton, M.

    2009-04-01

    Following recurrent inundation problems on the Berwinne catchment, in Belgium, a combined hydrologic and hydrodynamic study has been carried out in order to find adequate solutions for flood mitigation. Thanks to detailed 2D simulations, the effectiveness of the solutions can be assessed not only in terms of discharge and height reductions in the river, but also in terms of other aspects such as the reduction of inundated surfaces and the decrease in inundated buildings and roads. The study is carried out in successive phases. First, the hydrological runoffs are generated using a physically based and spatially distributed multi-layer model solving depth-integrated equations for overland flow, subsurface flow and baseflow. Real flood events are simulated using rainfall series collected at 8 stations (over 20 years of available data). The hydrological inputs are routed through the river network (and through the sewage network if relevant) with the 1D component of the modelling system, which solves the Saint-Venant equations for both free-surface and pressurized flows in a unified way. On the main part of the river, the measured river cross-sections are included in the modelling, and existing structures along the river (such as bridges, sluices or pipes) are modelled explicitly with specific cross-sections. Two gauging stations with over 15 years of continuous measurements allow the calibration of both the hydrologic and hydrodynamic models. Second, the flood mitigation solutions are tested in the simulations in the case of an extreme flooding event, and their effects are assessed using detailed 2D simulations of a few selected sensitive areas. The digital elevation model comes from an airborne laser survey with a spatial resolution of 1 point per square metre and is completed in the river bed with a bathymetry interpolated from cross-section data. The upstream discharge is extracted from the 1D simulation for the selected rainfall event. The study carried out with this

  12. Mid-Holocene and last glacial maximum climate simulations with the IPSL model: part II: model-data comparisons

    NASA Astrophysics Data System (ADS)

    Kageyama, Masa; Braconnot, Pascale; Bopp, Laurent; Mariotti, Véronique; Roy, Tilla; Woillez, Marie-Noëlle; Caubel, Arnaud; Foujols, Marie-Alice; Guilyardi, Eric; Khodri, Myriam; Lloyd, James; Lombard, Fabien; Marti, Olivier

    2013-05-01

    The climates of the mid-Holocene (MH, 6,000 years ago) and the Last Glacial Maximum (LGM, 21,000 years ago) have been extensively documented and, as such, have become targets for the evaluation of climate models in climate contexts very different from the present. In Part 1 of the present work, we studied the MH and LGM simulations performed with the last two versions of the IPSL model: IPSL_CM4, run for the PMIP2/CMIP3 (Coupled Model Intercomparison Project) projects, and IPSL_CM5A, run for the most recent PMIP3/CMIP5 projects. We showed that not only are these models different in their simulations of the PI climate, but also in their simulations of the climatic anomalies for the MH and LGM. In Part 2 of this paper, we first examine whether palaeo-data can help discriminate between the model performances. This is indeed the case for the African monsoon for the MH, or for North America south of the Laurentide ice sheet, the South Atlantic or the southern Indian Ocean for the LGM. For the LGM, off-line vegetation modelling appears to offer good opportunities to distinguish climate model results because glacial vegetation proves to be very sensitive to even small differences in LGM climate. For other cases, such as the LGM North Atlantic or the LGM equatorial Pacific, the large uncertainty in the SST reconstructions prevents model discrimination. We have examined the use of other proxy data for model evaluation, which has become possible with the inclusion of the biogeochemistry model PISCES in the IPSL_CM5A model. We show a broad agreement of the LGM-PI export production changes with reconstructions. These changes are related to the mixed-layer depth in most regions and to sea-ice variations in the high latitudes. We have also modelled foraminifer abundances with the FORAMCLIM model and shown that the changes in foraminifer abundance in the equatorial Pacific are mainly forced by changes in SSTs, hence confirming the SST-foraminifer abundance relationship

  13. Throwing the Uncertainty Toolbox at Antarctica: Multi-model Ensemble Simulation, Emulation and Bayesian Calibration of Marine Ice Sheet Instability

    NASA Astrophysics Data System (ADS)

    Edwards, T.

    2015-12-01

    Modelling Antarctic marine ice sheet instability (MISI) - the potential for sustained grounding-line retreat along downsloping bedrock - is very challenging because high resolution at the grounding line is required for reliable simulation. Assessing modelling uncertainties is even more difficult, because such models are very computationally expensive, restricting the number of simulations that can be performed. Quantifying uncertainty in future Antarctic instability has therefore so far been limited. There are several ways to tackle this problem, including: (1) simulating a small domain, to reduce expense and allow the use of ensemble methods; (2) parameterising the response of the grounding line to the onset of MISI, for the same reasons; (3) emulating the simulator with a statistical model, to explore the impacts of uncertainties more thoroughly; and (4) substituting physical models with expert-elicited statistical distributions. Methods 2-4 require rigorous testing against observations and high-resolution models to have confidence in their results. We use all four to examine the dependence of MISI in the Amundsen Sea Embayment (ASE) on uncertain model inputs, including bedrock topography, ice viscosity, basal friction, model structure (sliding law and treatment of grounding-line migration) and MISI triggers (including basal melting and risk of ice shelf collapse). We compare simulations from a 3000-member ensemble with GRISLI (methods 2, 4) with a 284-member ensemble from BISICLES (method 1) and also use emulation (method 3). Results from the two ensembles show similarities, despite very different model structures and ensemble designs. Basal friction and topography have a large effect on the extent of grounding-line retreat, and the sliding law strongly modifies sea level contributions through changes in the rate and extent of grounding-line retreat and the rate of ice thinning. Over 50 years, MISI in the ASE gives up to 1.1 mm/year (95% quantile) SLE in GRISLI (calibrated with ASE

  14. Longitudinal train dynamics model for a rail transit simulation system

    DOE PAGES

    Wang, Jinghui; Rakha, Hesham A.

    2018-01-01

    The paper develops a longitudinal train dynamics model in support of microscopic railway transportation simulation. The model can be calibrated without any mechanical data, making it ideal for implementation in transportation simulators. The calibration and validation work is based on data collected from the Portland light rail train fleet. The calibration procedure is mathematically formulated as a constrained non-linear optimization problem. The validity of the model is assessed by comparing instantaneous model predictions against field observations, and also evaluated in the domains of acceleration/deceleration versus speed and acceleration/deceleration versus distance. A test is conducted to investigate the adequacy of the model in simulation implementation. The results demonstrate that the proposed model can adequately capture instantaneous train dynamics, and provides good performance in the simulation test. Thus, the model provides a simple theoretical foundation for microscopic simulators and will significantly support the planning, management and control of railway transportation systems.

  16. Analog quantum simulation of generalized Dicke models in trapped ions

    NASA Astrophysics Data System (ADS)

    Aedo, Ibai; Lamata, Lucas

    2018-04-01

    We propose the analog quantum simulation of generalized Dicke models in trapped ions. By combining bichromatic laser interactions on multiple ions, we can generate all regimes of light-matter coupling in these models, where the light mode is mimicked by a motional mode. We present numerical simulations of the three-qubit Dicke model both in the weak field (WF) regime, where the Jaynes-Cummings behavior arises, and the ultrastrong coupling (USC) regime, where a rotating-wave approximation cannot be considered. We also simulate the two-qubit biased Dicke model in the WF and USC regimes and the two-qubit anisotropic Dicke model in the USC regime and the deep-strong coupling regime. The agreement between the mathematical models and the ion system convinces us that these quantum simulations can be implemented in the laboratory with current or near-future technology. This formalism establishes an avenue for the quantum simulation of many-spin Dicke models in trapped ions.

  17. Age- and sex-specific thorax finite element model development and simulation.

    PubMed

    Schoell, Samantha L; Weaver, Ashley A; Vavalle, Nicholas A; Stitzel, Joel D

    2015-01-01

    The shape, size, bone density, and cortical thickness of the thoracic skeleton vary significantly with age and sex, which can affect injury tolerance, especially in at-risk populations such as the elderly. Computational modeling has emerged as a powerful and versatile tool to assess injury risk, but current computational models represent only certain ages and sexes in the population. The purpose of this study was to morph an existing finite element (FE) model of the thorax to depict thorax morphology for males and females aged 30 and 70 years (YO) and to investigate the effect on injury risk. Age- and sex-specific FE models were developed using thin-plate spline interpolation, which requires homologous landmarks on the reference, target, and FE model. An image segmentation and registration algorithm was used to collect homologous rib and sternum landmark data from males and females aged 0-100 years. Generalized Procrustes Analysis was applied to the homologous landmark data to quantify age- and sex-specific isolated shape changes in the thorax. The Global Human Body Models Consortium (GHBMC) 50th percentile male occupant model was morphed to create age- and sex-specific thoracic shape change models (scaled to a 50th percentile male size). To evaluate the thoracic response, two loading cases (frontal hub impact and lateral impact) were simulated to assess the importance of geometric and material property changes with age and sex. Due to these geometric and material property changes, differences were observed in the response of the thorax in both the frontal and lateral impacts. Material property changes alone had little to no effect on the maximum thoracic force or the maximum percent compression. With age, the thorax becomes stiffer due to superior rotation of the ribs, which can result in increased bone strain and an increased risk of fracture. 
For the 70-YO models
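
The thin-plate spline interpolation used for such morphing can be illustrated in two dimensions (the study's morphing operates on 3-D rib and sternum landmarks; the landmark coordinates below are synthetic stand-ins):

```python
import numpy as np

def _U(r):
    # Thin-plate spline radial basis U(r) = r^2 log r, with U(0) = 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(r > 0.0, r ** 2 * np.log(r), 0.0)

def tps_fit(src, dst):
    # Solve the standard TPS linear system [[K, P], [P.T, 0]] for the
    # warp taking the source landmarks exactly onto the targets.
    n = len(src)
    K = _U(np.linalg.norm(src[:, None] - src[None, :], axis=-1))
    P = np.hstack([np.ones((n, 1)), src])        # affine part: 1, x, y
    L = np.zeros((n + 3, n + 3))
    L[:n, :n], L[:n, n:], L[n:, :n] = K, P, P.T
    rhs = np.zeros((n + 3, 2))
    rhs[:n] = dst
    return np.linalg.solve(L, rhs)               # RBF weights, then affine

def tps_apply(coef, src, pts):
    # Warp arbitrary points (e.g. FE mesh nodes) with the fitted spline.
    U = _U(np.linalg.norm(pts[:, None] - src[None, :], axis=-1))
    P = np.hstack([np.ones((len(pts), 1)), pts])
    return U @ coef[: len(src)] + P @ coef[len(src):]

# "Reference" landmarks and a resized "target" configuration
# (purely synthetic stand-ins for homologous rib landmarks).
rng = np.random.default_rng(3)
src = rng.uniform(0.0, 1.0, size=(12, 2))
dst = src * 1.1 + 0.05 * np.sin(2.0 * np.pi * src[:, ::-1])

coef = tps_fit(src, dst)
warped = tps_apply(coef, src, src)               # landmarks map exactly
```

Because the TPS system is solved exactly, the fitted warp reproduces the target landmarks at the source points while interpolating smoothly in between, which is what lets a baseline FE mesh be carried along with the landmark shape change.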

  18. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
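
In this spirit, a minimal object-oriented sketch might represent cellular entities as interacting objects whose collective behaviour emerges over simulated time steps; the receptor-binding toy model below is purely illustrative and not taken from the paper.

```python
# Objects stand for biological entities; stepping them forward lets
# system-level behaviour (here, receptor occupancy) emerge.
import random

class Receptor:
    def __init__(self):
        self.bound = False

class Cell:
    def __init__(self, n_receptors, binding_prob):
        self.receptors = [Receptor() for _ in range(n_receptors)]
        self.binding_prob = binding_prob

    def step(self, rng):
        # Each free receptor may bind a ligand during this time step.
        for r in self.receptors:
            if not r.bound and rng.random() < self.binding_prob:
                r.bound = True

    def fraction_bound(self):
        return sum(r.bound for r in self.receptors) / len(self.receptors)

rng = random.Random(42)
cell = Cell(n_receptors=1000, binding_prob=0.05)
for _ in range(50):
    cell.step(rng)
# Occupancy approaches the analytic saturation level 1 - (1 - p)^steps.
```

The emergent quantity (fraction bound) is never coded directly; it arises from many independent object interactions, which is the point the abstract makes about observing emergent properties.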

  19. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis.

    PubMed

    Tran-Duy, An; Boonen, Annelies; van de Laar, Mart A F J; Franke, Angelinus C; Severens, Johan L

    2011-12-01

    To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). The discrete event simulation paradigm was selected for model development. Drug efficacy was modelled as changes in disease activity (Bath Ankylosing Spondylitis Disease Activity Index (BASDAI)) and functional status (Bath Ankylosing Spondylitis Functional Index (BASFI)), which were linked to costs and health utility using statistical models fitted to an observational AS cohort. Published clinical data were used to estimate drug efficacy and time to events. Two strategies were compared: (1) five available non-steroidal anti-inflammatory drugs (strategy 1) and (2) the same as strategy 1 plus two tumour necrosis factor α inhibitors (strategy 2). 13,000 patients were followed up individually until death. For probabilistic sensitivity analysis, Monte Carlo simulations were performed with 1,000 sets of parameters sampled from the appropriate probability distributions. The models successfully generated valid data on treatments, BASDAI, BASFI, utility, quality-adjusted life years (QALYs) and costs at time points with intervals of 1-3 months over the simulation length of 70 years. The incremental cost per QALY gained in strategy 2 compared with strategy 1 was €35,186. At a willingness-to-pay threshold of €80,000, it was 99.9% certain that strategy 2 was cost-effective. The modelling framework provides great flexibility to implement complex algorithms representing treatment selection, disease progression and changes in costs and utilities over time for patients with AS. Results obtained from the simulation are plausible.
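
A skeletal discrete event simulation of sequential treatment switching, in the spirit of such a framework, might look like the following. Drug names, the exponential event-time assumption, and all rates are illustrative placeholders, not the model's calibrated values.

```python
# One patient advances through an event queue: each drug line fails
# after a random time; an independent "death" event competes with it.
import heapq
import random

def simulate_patient(lines, rng, mortality_rate=1 / 40.0, horizon=70.0):
    t, line, history = 0.0, 0, []
    events = [(rng.expovariate(mortality_rate), "death"),
              (rng.expovariate(lines[0][1]), "failure")]
    heapq.heapify(events)
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon or kind == "death":
            break
        history.append((t, f"switch from {lines[line][0]}"))
        line += 1
        if line == len(lines):
            break
        heapq.heappush(events, (t + rng.expovariate(lines[line][1]), "failure"))
    return history

rng = random.Random(7)
# (drug name, failure rate per year) -- hypothetical treatment sequence.
lines = [("NSAID-1", 1 / 2.0), ("NSAID-2", 1 / 2.0), ("TNFi", 1 / 8.0)]
hist = simulate_patient(lines, rng)
```

Running many such patients with parameter sets drawn from probability distributions, and attaching cost/utility accrual between events, yields exactly the kind of probabilistic cost-effectiveness output the abstract describes.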

  20. Modeling the Emergent Impacts of Harvesting Acadian Forests over 100+ Years

    NASA Astrophysics Data System (ADS)

    Luus, K. A.; Plug, L. J.

    2007-12-01

    Harvesting strategies and policies for Acadian forest in Nova Scotia, Canada, presently are set using Decision Support Models (DSMs) that aim to maximize the long-term (>100 y) value of forests through decisions implemented over short time horizons (5-80 years). However, DSMs typically are aspatial, lack ecological processes and do not treat erosion, so the long-term (>100 y) emergent impacts of the prescribed forestry decisions on erosion and vegetation in Acadian forests remain poorly known. To better understand these impacts, we created an equation-based model that simulates the evolution of a ≥4 km2 forest in time steps of 1 y and at a spatial resolution of 3 m2, the footprint of a single mature tree. The model combines 1) ecological processes of recruitment, competition, and mortality; 2) geomorphic processes of hillslope erosion; and 3) anthropic processes of tree harvesting, replanting, and road construction under constraints imposed by regulations and cost/benefit ratios. The model uses digital elevation models, parameters (where available), and calibration (where measurements are not available) for conditions presently found in central Cape Breton, Nova Scotia. The model is unique in that it 1) addresses the impacts of harvesting on an Acadian forest and 2) couples vegetation and erosion. The model was tested by comparing the species-specific biomass of long-term (40 y) forest plot data to simulated results. At the spatial scale of individual 1 ha plots, model predictions presently account for approximately 50% of observed biomass changes through time, but predictions are hampered by the effects of serendipitous "random" events such as single tree windfall. Harvesting increases the cumulative erosion over 3000 years by 240% when compared to an old growth forest and significantly suppresses the growth of Balsam Fir and Sugar Maple. We discuss further tests of the model, and how it might be used to investigate the long-term sustainability of the

  1. Validation of Model Simulations of Anvil Cirrus Properties During TWP-ICE: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zipser, Edward J.

    2013-05-20

    This 3-year grant, with two extensions, resulted in a successful 5-year effort, led by Ph.D. student Adam Varble, to compare cloud-resolving model (CRM) simulations with the excellent database obtained during the TWP-ICE field campaign. The objective, largely achieved, was to undertake these comparisons comprehensively and quantitatively, informing the community in ways that go beyond pointing out errors in the models by pointing out ways to improve both cloud dynamics and microphysics parameterizations in future modeling efforts. Under DOE support, Adam Varble, with considerable assistance from Dr. Ann Fridlind and others, entrained scientists who ran some 10 different CRMs and 4 different limited-area models (LAMs) using a variety of microphysics parameterizations, to ensure that the conclusions of the study will have considerable generality.

  2. Simulation-based modeling of building complexes construction management

    NASA Astrophysics Data System (ADS)

    Shepelev, Aleksandr; Severova, Galina; Potashova, Irina

    2018-03-01

    The study reported here examines the experience in the development and implementation of business simulation games based on network planning and management of high-rise construction. Appropriate network models of different types and levels of detail have been developed; a simulation model including 51 blocks (11 stages combined in 4 units) is proposed.

  3. Regional fuel load modeled for two contrasting years in central and southern Africa

    NASA Astrophysics Data System (ADS)

    Hely, C.; Dowty, P. R.; Alleaume, S.; Caylor, K. K.; Shugart, H. H.

    2001-12-01

    Fuel load has been modeled for southern hemisphere Africa for the 1991-92 and 1999-2000 growing seasons. The 1991-92 year was generally dry due to a strong El Niño event, in contrast to the particularly wet year of 1999-2000. The method integrates site-level process modeling with 15-day AVHRR NDVI data. The site model was used to simulate landscape light-use efficiency (LUE) at a series of sites in the Kalahari region ranging from evergreen woodland to arid shrubland. This site-level LUE is extrapolated over the southern African region with gridded tree cover data and gridded rainfall. The predicted net primary production (NPP) is allocated into the different fuel types (grass, litter, twigs) using empirically based relationships. The model results are compared with field measurements of fuel load at a number of sites. The results will be used for modeling of biomass burning emissions.

  4. Tsunami simulation using submarine displacement calculated from simulation of ground motion due to seismic source model

    NASA Astrophysics Data System (ADS)

    Akiyama, S.; Kawaji, K.; Fujihara, S.

    2013-12-01

    Since fault fracturing due to an earthquake can simultaneously cause ground motion and a tsunami, it is appropriate to evaluate both by a single fault model. However, separate source models are typically used in ground motion simulation and in tsunami simulation, because of the difficulty of evaluating both phenomena simultaneously. Many source models for the 2011 off the Pacific coast of Tohoku Earthquake have been proposed from inversion analyses of seismic observations or of tsunami observations. Most of these models show similar features, in which a large amount of slip is located at the shallower part of the fault area near the Japan Trench. This indicates that the ground motion and the tsunami can be evaluated by a single source model. Therefore, we examine the possibility of tsunami prediction using a fault model estimated from seismic observation records. In this study, we carry out a tsunami simulation using the displacement field of oceanic crustal movements, which is calculated from a ground motion simulation of the 2011 off the Pacific coast of Tohoku Earthquake. We use two fault models by Yoshida et al. (2011), based on the teleseismic body wave and on strong ground motion records, respectively. Although the fault models share a common feature, the amount of slip near the Japan Trench is larger in the fault model from the strong ground motion records than in that from the teleseismic body wave. First, large-scale ground motion simulations applying those fault models, using a voxel-type finite element method, are performed for the whole of eastern Japan. The synthetic waveforms computed from the simulations are generally consistent with the observation records of the K-NET (Kinoshita (1998)) and KiK-net stations (Aoi et al. (2000)) deployed by the National Research Institute for Earth Science and Disaster Prevention (NIED). Next, the tsunami simulations are performed by the finite

  5. University Research in Support of TREAT Modeling and Simulation, FY 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark David

    Idaho National Laboratory is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multi-physics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. In support of this research, INL is working with four universities to explore advanced solution methods that will complement or augment capabilities in MAMMOTH. This report consists of a collection of year-end summaries of research from the universities performed in support of TREAT modeling and simulation. This research was led by Prof. Sedat Goluoglu at the University of Florida, Profs. Jim Morel and Jean Ragusa at Texas A&M University, Profs. Benoit Forget and Kord Smith at Massachusetts Institute of Technology, Prof. Leslie Kerby of Idaho State University and Prof. Barry Ganapol of University of Arizona. A significant number of students were supported at various levels through the projects and, for some, also as interns at INL.

  6. Verifying and Validating Simulation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
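
The statistical sampling mentioned above, propagating input variability through a model by Monte Carlo, can be sketched as follows; the cantilever-beam response and the input distributions are illustrative assumptions, not from the presentation.

```python
# Monte Carlo propagation of variability: sample uncertain inputs, run
# the model for each sample, summarize the induced spread in the output.
import numpy as np

rng = np.random.default_rng(123)

def deflection(E, L, I, F):
    # Tip deflection of a cantilever beam: d = F L^3 / (3 E I).
    return F * L ** 3 / (3.0 * E * I)

n = 100_000
E = rng.normal(200e9, 10e9, n)      # Young's modulus [Pa], aleatoric spread
F = rng.normal(1000.0, 50.0, n)     # applied load [N], aleatoric spread
L, I = 2.0, 8e-6                    # length [m] and moment [m^4], known here

d = deflection(E, L, I, F)
mean, p95 = float(d.mean()), float(np.percentile(d, 95))
```

The resulting mean and 95th percentile characterize how experimental variability alone spreads the prediction; numerical and model-form uncertainty would have to be assessed separately, as the presentation distinguishes.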

  7. Arctic Ocean Freshwater: How Robust are Model Simulations

    NASA Technical Reports Server (NTRS)

    Jahn, A.; Aksenov, Y.; deCuevas, B. A.; deSteur, L.; Haekkinen, S.; Hansen, E.; Herbaut, C.; Houssais, M.-N.; Karcher, M.; Kauker, F.; hide

    2012-01-01

    The Arctic freshwater (FW) has been the focus of many modeling studies, due to the potential impact of Arctic FW on the deep water formation in the North Atlantic. A comparison of the hindcasts from ten ocean-sea ice models shows that the simulation of the Arctic FW budget is quite different in the investigated models. While they agree on the general sink and source terms of the Arctic FW budget, the long-term means as well as the variability of the FW export vary among models. The best model-to-model agreement is found for the interannual and seasonal variability of the solid FW export and the solid FW storage, which also agree well with observations. For the interannual and seasonal variability of the liquid FW export, the agreement among models is better for the Canadian Arctic Archipelago (CAA) than for Fram Strait. The reason for this is that models are more consistent in simulating volume flux anomalies than salinity anomalies and volume-flux anomalies dominate the liquid FW export variability in the CAA but not in Fram Strait. The seasonal cycle of the liquid FW export generally shows a better agreement among models than the interannual variability, and compared to observations the models capture the seasonality of the liquid FW export rather well. In order to improve future simulations of the Arctic FW budget, the simulation of the salinity field needs to be improved, so that model results on the variability of the liquid FW export and storage become more robust.

  8. Benchmark simulation model no 2: general protocol and exploratory case studies.

    PubMed

    Jeppsson, U; Pons, M-N; Nopens, I; Alex, J; Copp, J B; Gernaey, K V; Rosen, C; Steyer, J-P; Vanrolleghem, P A

    2007-01-01

    Over a decade ago, the concept of objectively evaluating the performance of control strategies by simulating them using a standard model implementation was introduced for activated sludge wastewater treatment plants. The resulting Benchmark Simulation Model No 1 (BSM1) has been the basis for a significant new development that is reported on here: Rather than only evaluating control strategies at the level of the activated sludge unit (bioreactors and secondary clarifier) the new BSM2 now allows the evaluation of control strategies at the level of the whole plant, including primary clarifier and sludge treatment with anaerobic sludge digestion. In this contribution, the decisions that have been made over the past three years regarding the models used within the BSM2 are presented and argued, with particular emphasis on the ADM1 description of the digester, the interfaces between activated sludge and digester models, the included temperature dependencies and the reject water storage. BSM2-implementations are now available in a wide range of simulation platforms and a ring test has verified their proper implementation, consistent with the BSM2 definition. This guarantees that users can focus on the control strategy evaluation rather than on modelling issues. Finally, for illustration, twelve simple operational strategies have been implemented in BSM2 and their performance evaluated. Results show that it is an interesting control engineering challenge to further improve the performance of the BSM2 plant (which is the whole idea behind benchmarking) and that integrated control (i.e. acting at different places in the whole plant) is certainly worthwhile to achieve overall improvement.

  9. Multiphase flow modeling and simulation of explosive volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Neri, Augusto

    Recent worldwide volcanic activity, such as the eruptions at Mt. St. Helens, Washington, in 1980 and Mt. Pinatubo, Philippines, in 1991, as well as the ongoing eruption at Montserrat, West Indies, highlighted again the complex nature of explosive volcanic eruptions and the tremendous risk associated with them. In the year 2000, about 500 million people are expected to live under the shadow of an active volcano. The understanding of pyroclastic dispersion processes produced by explosive eruptions is, therefore, of primary interest, not only from the scientific point of view, but also for the huge worldwide risk associated with them. The thesis presents interdisciplinary research aimed at the modeling and simulation of explosive volcanic eruptions by using multiphase thermo-fluid-dynamic models. The first part of the work was dedicated to the understanding and validation of the recently developed kinetic theory of two-phase flow. The hydrodynamics of fluid catalytic cracking particles in the IIT riser were simulated and compared with lab experiments. Simulation results confirm the validity of the kinetic theory approach. Transport of solids in the riser is due to dense clusters. On a time-average basis the bottom of the riser and the walls are dense, in agreement with IIT experimental data. The low frequency of oscillation (about 0.2 Hz) is also in agreement with the data. The second part of the work was devoted to the development of transient two-dimensional multiphase and multicomponent flow models of pyroclastic dispersion processes. In particular, the dynamics of ground-hugging high-speed and high-temperature pyroclastic flows generated by the collapse of volcanic columns or by impulsive discrete explosions was investigated. The model accounts for the mechanical and thermal non-equilibrium between a multicomponent gas phase and N different solid phases representative of pyroclastic particles of different sizes. 
Pyroclastic dispersion dynamics describes the formation

  10. A comparison of emission calculations using different modeled indicators with 1-year online measurements.

    PubMed

    Lengers, Bernd; Schiefler, Inga; Büscher, Wolfgang

    2013-12-01

    The overall measurement of farm-level greenhouse gas (GHG) emissions in dairy production is not feasible, from either an engineering or administrative point of view. Instead, computational model systems are used to generate emission inventories, demanding validation by measurement data. This paper tests the GHG calculation of the dairy farm-level optimization model DAIRYDYN, including methane (CH₄) from enteric fermentation and managed manure. The model involves four emission calculation procedures (indicators), differing in the aggregation level of relevant input variables. The corresponding emission factors used by the indicators range from default per-cow (activity level) emissions up to emission factors based on feed intake, manure amount, and milk production intensity. For validation of the CH₄ accounting of the model, 1-year CH₄ measurements of an experimental free-stall dairy farm in Germany are compared to model simulation results. An advantage of this interdisciplinary study is the correspondence of the model parameterization and simulation horizon with the experimental farm's characteristics and measurement period. The results clarify that the modeled emission inventories (2,898, 4,637, 4,247, and 3,600 kg CO₂-eq. cow(-1) year(-1)) approximate the online measurements (average 3,845 kg CO₂-eq. cow(-1) year(-1) (±275 owing to manure management)) with varying accuracy depending on the indicator utilized. The more farm-specific characteristics are used by the GHG indicator, the lower the bias of the modeled emissions. Results underline that an accurate emission calculation procedure should capture differences in energy intake, owing to milk production intensity, as well as manure storage time. Despite the differences between indicator estimates, the deviation of modeled GHGs using detailed indicators in DAIRYDYN from on-farm measurements is relatively low (between -6.4% and 10.5%), compared with findings from the literature.

  11. Benchmark Data Set for Wheat Growth Models: Field Experiments and AgMIP Multi-Model Simulations.

    NASA Technical Reports Server (NTRS)

    Asseng, S.; Ewert, F.; Martre, P.; Rosenzweig, C.; Jones, J. W.; Hatfield, J. L.; Ruane, A. C.; Boote, K. J.; Thorburn, P.J.; Rotter, R. P.

    2015-01-01

    The data set includes a current representative management treatment from detailed, quality-tested sentinel field experiments with wheat from four contrasting environments including Australia, The Netherlands, India and Argentina. Measurements include local daily climate data (solar radiation, maximum and minimum temperature, precipitation, surface wind, dew point temperature, relative humidity, and vapor pressure), soil characteristics, frequent growth, nitrogen in crop and soil, crop and soil water and yield components. Simulations include results from 27 wheat models and a sensitivity analysis with 26 models and 30 years (1981-2010) for each location, for elevated atmospheric CO2 and temperature changes, a heat stress sensitivity analysis at anthesis, and a sensitivity analysis with soil and crop management variations and a Global Climate Model end-century scenario.

  12. Development of a precipitation-runoff model to simulate unregulated streamflow in the South Fork Flathead River Basin, Montana

    USGS Publications Warehouse

    Chase, K.J.

    2011-01-01

    This report documents the development of a precipitation-runoff model for the South Fork Flathead River Basin, Mont. The Precipitation-Runoff Modeling System model, developed in cooperation with the Bureau of Reclamation, can be used to simulate daily mean unregulated streamflow upstream and downstream from Hungry Horse Reservoir for water-resources planning. Two input files are required to run the model. The time-series data file contains daily precipitation data and daily minimum and maximum air-temperature data from climate stations in and near the South Fork Flathead River Basin. The parameter file contains values of parameters that describe the basin topography, the flow network, the distribution of the precipitation and temperature data, and the hydrologic characteristics of the basin soils and vegetation. A primary-parameter file was created for simulating streamflow during the study period (water years 1967-2005). The model was calibrated for water years 1991-2005 using the primary-parameter file. This calibration was further refined using snow-covered area data for water years 2001-05. The model then was tested for water years 1967-90. Calibration targets included mean monthly and daily mean unregulated streamflow upstream from Hungry Horse Reservoir, mean monthly unregulated streamflow downstream from Hungry Horse Reservoir, basin mean monthly solar radiation and potential evapotranspiration, and daily snapshots of basin snow-covered area. Simulated streamflow generally was in better agreement with observed streamflow at the upstream gage than at the downstream gage. Upstream from the reservoir, simulated mean annual streamflow was within 0.0 percent of observed mean annual streamflow for the calibration period and was about 2 percent higher than observed mean annual streamflow for the test period. Simulated mean April-July streamflow upstream from the reservoir was about 1 percent lower than observed streamflow for the calibration period and about 4

  13. Pre-operative simulation of pediatric mastoid surgery with 3D-printed temporal bone models.

    PubMed

    Rose, Austin S; Webster, Caroline E; Harrysson, Ola L A; Formeister, Eric J; Rawal, Rounak B; Iseli, Claire E

    2015-05-01

    As the process of additive manufacturing, or three-dimensional (3D) printing, has become more practical and affordable, a number of applications for the technology in the field of pediatric otolaryngology have been considered. One area of promise is temporal bone surgical simulation. Having previously developed a model for temporal bone surgical training using 3D printing, we sought to produce a patient-specific model for pre-operative simulation in pediatric otologic surgery. Our hypothesis was that the creation and pre-operative dissection of such a model was possible, and would demonstrate potential benefits in cases of abnormal temporal bone anatomy. In the case presented, an 11-year-old boy underwent a planned canal-wall-down (CWD) tympano-mastoidectomy for recurrent cholesteatoma preceded by a pre-operative surgical simulation using 3D-printed models of the temporal bone. The models were based on the child's pre-operative clinical CT scan and printed using multiple materials to simulate both bone and soft tissue structures. To help confirm the models as accurate representations of the child's anatomy, distances between various anatomic landmarks were measured and compared to the temporal bone CT scan and the 3D model. The simulation allowed the surgical team to appreciate the child's unusual temporal bone anatomy as well as any challenges that might arise in the safety of the temporal bone laboratory, prior to actual surgery in the operating room (OR). There was minimal variability, in terms of absolute distance (mm) and relative distance (%), in measurements between anatomic landmarks obtained from the patient intra-operatively, the pre-operative CT scan and the 3D-printed models. Accurate 3D temporal bone models can be rapidly produced based on clinical CT scans for pre-operative simulation of specific challenging otologic cases in children, potentially reducing medical errors and improving patient safety.

  14. Modeling, Simulation, and Forecasting of Subseasonal Variability

    NASA Technical Reports Server (NTRS)

    Waliser, Duane; Schubert, Siegfried; Kumar, Arun; Weickmann, Klaus; Dole, Randall

    2003-01-01

    A planning workshop on "Modeling, Simulation and Forecasting of Subseasonal Variability" was held in June 2003. This workshop was the first of a number of meetings planned to follow the NASA-sponsored workshop entitled "Prospects For Improved Forecasts Of Weather And Short-Term Climate Variability On Sub-Seasonal Time Scales" held in April 2002. The 2002 workshop highlighted a number of key sources of unrealized predictability on subseasonal time scales, including tropical heating, soil wetness, the Madden-Julian Oscillation (MJO) [a.k.a. Intraseasonal Oscillation (ISO)], the Arctic Oscillation (AO) and the Pacific/North American (PNA) pattern. The overarching objective of the 2003 follow-up workshop was to proceed with a number of recommendations made at the 2002 workshop, as well as to set an agenda and collate efforts in the areas of modeling, simulation and forecasting of intraseasonal and short-term climate variability. More specifically, the aims of the 2003 workshop were to: 1) develop a baseline of the "state of the art" in subseasonal prediction capabilities, 2) implement a program to carry out experimental subseasonal forecasts, and 3) develop strategies for tapping the above sources of predictability by focusing research, model development, and the development/acquisition of new observations on the subseasonal problem. The workshop was held over two days and was attended by over 80 scientists, modelers, forecasters and agency personnel. The agenda of the workshop focused on issues related to the MJO and tropical-extratropical interactions as they relate to the subseasonal simulation and prediction problem. This included the development of plans for a coordinated set of GCM hindcast experiments to assess current model subseasonal prediction capabilities and shortcomings, an emphasis on developing a strategy to rectify shortcomings associated with tropical intraseasonal variability, namely diabatic processes, and continuing the implementation of an

  15. Predictions of Cockpit Simulator Experimental Outcome Using System Models

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Goka, T.

    1984-01-01

    This study involved predicting the outcome of a cockpit simulator experiment in which pilots used cockpit displays of traffic information (CDTI) to establish and maintain in-trail spacing behind a lead aircraft during approach. The experiments were run on the NASA Ames Research Center multicab cockpit simulator facility. Prior to the experiments, a mathematical model of the pilot/aircraft/CDTI flight system was developed that included relative in-trail and vertical dynamics between aircraft in the approach string. This model was used to construct a digital simulation of the string dynamics, including response to initial position errors. The model was then used to predict the outcome of the in-trail following cockpit simulator experiments. Outcomes included performance and sensitivity to different separation criteria. The experimental results were then used to evaluate the model and its prediction accuracy. Lessons learned in this modeling and prediction study are noted.

  16. Systems modeling and simulation applications for critical care medicine

    PubMed Central

    2012-01-01

    Critical care delivery is a complex, expensive, and error-prone medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) a pathophysiological model of acute lung injury, b) a process model of critical care delivery, and c) an agent-based model to study the interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718

  17. Design of a framework for modeling, integration and simulation of physiological models.

    PubMed

    Erson, E Zeynep; Cavuşoğlu, M Cenk

    2012-09-01

    Multiscale modeling and integration of physiological models carry challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. To deal with such challenges in a systematic way, there is a significant need for an information technology framework, together with related analytical and computational tools, that will facilitate the integration of models and simulations of complex biological systems. The Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate development, integration, and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation with new approaches such as information flow and modular representation of physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches of multiscale physiological processes, so this paper focuses on the design approaches that would achieve that goal. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  18. eShopper modeling and simulation

    NASA Astrophysics Data System (ADS)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as the Blue Martini server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross-selling are arriving. The key problem is what kind of information we need to achieve these goals, or in other words, how we model the customer. The paper is devoted to customer modeling and simulation. The focus is on modeling an individual customer. The model is based on the customer's transaction data, click stream data, and demographics. The model includes a hierarchical profile of the customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, price sensitivity, etc. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast the shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for on-line direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating the customer's features.
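    The stochastic time-interval models mentioned above can be illustrated with a minimal sketch. The mean-interval predictor and the purchase dates below are hypothetical illustrations, not the actual models used by the Blue Martini server:

```python
from datetime import date, timedelta

def predict_next_visit(purchase_dates):
    """Predict the next visit date from past purchase dates using the
    mean inter-purchase interval -- a simple stand-in for the stochastic
    interval models described in the abstract."""
    dates = sorted(purchase_dates)
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    mean_gap = sum(gaps) / len(gaps)  # average days between purchases
    return dates[-1] + timedelta(days=round(mean_gap))

# Hypothetical transaction history for one stable customer
history = [date(2001, 1, 3), date(2001, 1, 10), date(2001, 1, 18), date(2001, 1, 24)]
print(predict_next_visit(history))  # gaps 7, 8, 6 -> mean 7 days
```

    A production model would replace the plain mean with a fitted distribution over intervals (and condition on product type), but the prediction step has the same shape.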

  19. Conceptual Design of Simulation Models in an Early Development Phase of Lunar Spacecraft Simulator Using SMP2 Standard

    NASA Astrophysics Data System (ADS)

    Lee, Hoon Hee; Koo, Cheol Hea; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

    The conceptual study for a Korean lunar orbiter/lander prototype has been performed at the Korea Aerospace Research Institute (KARI). Across diverse space programs in European countries, a variety of simulation applications have been developed using the SMP2 (Simulation Model Portability 2) standard, which addresses portability and reuse of simulation models by various model users. KARI has not only first-hand experience in developing an SMP-compatible simulation environment but also an ongoing study to apply the SMP2 development process for simulation models to a simulator development project for lunar missions. KARI has tried to extend the coverage of the development domain based on the SMP2 standard across the whole simulation model life-cycle, from software design to validation, through a lunar exploration project. Figure 1 shows a snapshot from a visualization tool for the simulation of lunar lander motion. In reality, a demonstrator prototype on the right-hand side of the image was made and tested in 2012. In an early phase of simulator development, prior to a kick-off in the near future, the targeted hardware to be modelled was investigated and identified at the end of 2012. The architectural breakdown of the lunar simulator at system level was performed, and an architecture with a hierarchical tree of models from the system down to parts at lower levels was established. Finally, SMP documents such as Catalogue, Assembly, Schedule and so on were converted using an XML (eXtensible Markup Language) converter. To obtain the benefits of the suggested approaches and design mechanisms in the SMP2 standard as far as possible, object-oriented and component-based design concepts were strictly followed throughout the whole model development process.

  20. The Serpent Strikes: Simulation in a Large First-Year Course.

    ERIC Educational Resources Information Center

    Schrag, Philip G.

    1989-01-01

    A year-long simulation of a single case supplements a traditional civil procedure course at Georgetown University. Experience with the approach suggests that design features can reduce the burdens on the instructor without reducing course effectiveness, making the approach feasible even with larger classes. (MSE)

  1. Shoulder Arthroscopy Simulator Training Improves Shoulder Arthroscopy Performance in a Cadaver Model

    PubMed Central

    Henn, R. Frank; Shah, Neel; Warner, Jon J.P.; Gomoll, Andreas H.

    2013-01-01

    Purpose The purpose of this study was to quantify the benefits of shoulder arthroscopy simulator training with a cadaver model of shoulder arthroscopy. Methods Seventeen first-year medical students with no prior experience in shoulder arthroscopy were enrolled and completed this study. Each subject completed a baseline proctored arthroscopy on a cadaveric shoulder, which included controlling the camera and completing a standard series of tasks using the probe. The subjects were randomized, and nine of the subjects received training on a virtual reality simulator for shoulder arthroscopy. All subjects then repeated the same cadaveric arthroscopy. The arthroscopic videos were analyzed in a blinded fashion for time to task completion and subjective assessment of technical performance. The two groups were compared with Student's t-tests, and change over time within groups was analyzed with paired t-tests. Results There were no observed differences between the two groups on the baseline evaluation. The simulator group improved significantly from baseline with respect to time to completion and subjective performance (p<0.05). Time to completion was significantly faster in the simulator group compared to controls at final evaluation (p<0.05). No difference was observed between the groups on the subjective scores at final evaluation (p=0.98). Conclusions Shoulder arthroscopy simulator training resulted in significant benefits in clinical shoulder arthroscopy time to task completion in this cadaver model. This study provides important additional evidence of the benefit of simulators in orthopaedic surgical training. Clinical Relevance There may be a role for simulator training in shoulder arthroscopy education. PMID:23591380

  2. Modeling ground-based timber harvesting systems using computer simulation

    Treesearch

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  3. Simulation and analysis of a model dinoflagellate predator-prey system

    NASA Astrophysics Data System (ADS)

    Mazzoleni, M. J.; Antonelli, T.; Coyne, K. J.; Rossi, L. F.

    2015-12-01

    This paper analyzes the dynamics of a model dinoflagellate predator-prey system and uses simulations to validate theoretical and experimental studies. A simple model for predator-prey interactions is derived by drawing upon analogies from chemical kinetics. This model is then modified to account for inefficiencies in predation. Simulation results are shown to closely match the model predictions. Additional simulations are then run which are based on experimental observations of predatory dinoflagellate behavior, and this study specifically investigates how the predatory dinoflagellate Karlodinium veneficum uses toxins to immobilize its prey and increase its feeding rate. These simulations account for complex dynamics that were not included in the basic models, and the results from these computational simulations closely match the experimentally observed predatory behavior of K. veneficum and reinforce the notion that predatory dinoflagellates utilize toxins to increase their feeding rate.
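    The kind of mass-action predator-prey model described above, with an explicit inefficiency term on predation, can be sketched as follows. All rate constants and initial populations here are illustrative choices, not the paper's values:

```python
import numpy as np

def simulate(prey0, pred0, k_growth, k_enc, eff, k_death, dt=0.001, steps=20000):
    """Forward-Euler integration of a mass-action predator-prey model.
    Encounters deplete prey at rate k_enc*x*y, but only a fraction `eff`
    of encounters contributes to predator growth (the inefficiency term)."""
    x, y = prey0, pred0
    traj = []
    for _ in range(steps):
        dx = k_growth * x - k_enc * x * y          # prey growth minus predation
        dy = eff * k_enc * x * y - k_death * y     # inefficient conversion, mortality
        x, y = x + dt * dx, y + dt * dy
        traj.append((x, y))
    return np.array(traj)

traj = simulate(prey0=40.0, pred0=9.0, k_growth=1.0, k_enc=0.1, eff=0.5, k_death=0.5)
```

    With eff=1 this reduces to the classical Lotka-Volterra system; lowering eff shifts the predator equilibrium, which is the kind of modification the abstract describes before adding toxin-mediated immobilization.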

  4. A Simple Evacuation Modeling and Simulation Tool for First Responders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Daniel B; Payne, Patricia W

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
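    The time-to-evacuate statistic under a collision-avoidance rule can be illustrated with a toy corridor model. The one-agent-per-cell rule and the 1-D geometry are assumptions for this sketch, far simpler than IMPACT's multi-agent module:

```python
def time_to_evacuate(start_positions):
    """Tiny 1-D corridor evacuation: agents step toward the exit at cell 0,
    but each cell may hold only one agent (a crude collision-avoidance rule).
    Returns the number of time steps until everyone has left."""
    occupied = set(start_positions)
    t = 0
    while occupied:
        for x in sorted(occupied):       # agents nearest the exit move first
            if x == 0:
                occupied.discard(x)      # reaches the exit and leaves
            elif x - 1 not in occupied:
                occupied.discard(x)      # cell ahead is free: advance
                occupied.add(x - 1)
            # else: blocked by the agent ahead, wait one step
        t += 1
    return t

print(time_to_evacuate([0, 1, 2]))  # packed queue: one agent exits per step
```

    Even this toy shows the core evaluation loop: change the starting layout (an "evacuation alternative") and compare the resulting time-to-evacuate values.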

  5. Predicting the difficulty of pure, strict, epistatic models: metrics for simulated model selection.

    PubMed

    Urbanowicz, Ryan J; Kiralis, Jeff; Fisher, Jonathan M; Moore, Jason H

    2012-09-26

    Algorithms designed to detect complex genetic disease associations are initially evaluated using simulated datasets. Typical evaluations vary constraints that influence the correct detection of underlying models (i.e. number of loci, heritability, and minor allele frequency). Such studies neglect to account for model architecture (i.e. the unique specification and arrangement of penetrance values comprising the genetic model), which alone can influence the detectability of a model. In order to design a simulation study which efficiently takes architecture into account, a reliable metric is needed for model selection. We evaluate three metrics as predictors of relative model detection difficulty derived from previous works: (1) Penetrance table variance (PTV), (2) customized odds ratio (COR), and (3) our own Ease of Detection Measure (EDM), calculated from the penetrance values and respective genotype frequencies of each simulated genetic model. We evaluate the reliability of these metrics across three very different data search algorithms, each with the capacity to detect epistatic interactions. We find that a model's EDM and COR are each stronger predictors of model detection success than heritability. This study formally identifies and evaluates metrics which quantify model detection difficulty. We utilize these metrics to intelligently select models from a population of potential architectures. This allows for an improved simulation study design which accounts for differences in detection difficulty attributed to model architecture. We implement the calculation and utilization of EDM and COR into GAMETES, an algorithm which rapidly and precisely generates pure, strict, n-locus epistatic models.
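    A frequency-weighted variance of a penetrance table is one way to picture metrics like those above. This is an illustrative stand-in only: the paper's exact formulas for PTV, COR, and EDM may differ:

```python
import numpy as np

def penetrance_table_variance(penetrance, geno_freq):
    """Frequency-weighted variance of a penetrance table -- an illustrative
    proxy for model detectability (flat tables carry no detectable signal)."""
    p = np.asarray(penetrance, dtype=float).ravel()
    w = np.asarray(geno_freq, dtype=float).ravel()
    w = w / w.sum()                       # normalize genotype frequencies
    mean = np.dot(w, p)                   # population prevalence
    return float(np.dot(w, (p - mean) ** 2))

# A flat 3x3 table (no signal) should score ~zero; a contrasting table scores higher.
flat = penetrance_table_variance([0.1] * 9, [1 / 9] * 9)
contrast = penetrance_table_variance([0.0, 0.2] * 4 + [0.1], [1 / 9] * 9)
```

    The abstract's point is that two models matched on heritability and allele frequency can still differ on a metric like this, and that difference tracks detection difficulty.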

  6. Evaluation of regional climate simulations for air quality modelling purposes

    NASA Astrophysics Data System (ADS)

    Menut, Laurent; Tripathi, Om P.; Colette, Augustin; Vautard, Robert; Flaounas, Emmanouil; Bessagnet, Bertrand

    2013-05-01

    In order to evaluate the future potential benefits of emission regulation on regional air quality, while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCM) and economic scenarios. Uncertainties and biases in climate models introduce an additional "climate modeling" source of uncertainty that is to be added to all other types of uncertainties in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalysis forcing with GCM forcing in regional climate simulations. As an example we use GCM simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry: an overestimation by GCM-driven weather due to a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalysis-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.

  7. Hot-bench simulation of the active flexible wing wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Buttrill, Carey S.; Houck, Jacob A.

    1990-01-01

    Two simulations, one batch and one real-time, of an aeroelastically-scaled wind-tunnel model were developed. The wind-tunnel model was a full-span, free-to-roll model of an advanced fighter concept. The batch simulation was used to generate and verify the real-time simulation and to test candidate control laws prior to implementation. The real-time simulation supported hot-bench testing of a digital controller, which was developed to actively control the elastic deformation of the wind-tunnel model. Time scaling was required for hot-bench testing. The wind-tunnel model, the mathematical models for the simulations, the techniques employed to reduce the hot-bench time-scale factors, and the verification procedures are described.

  8. Modeling and Simulation of Shuttle Launch and Range Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge; Thirumalainambi, Rajkumar

    2004-01-01

    The simulation and modeling test bed is based on a mockup of a space flight operations control center suitable for experimenting with the physical, procedural, software, hardware, and psychological aspects of space flight operations. The test bed includes a weather expert system to advise on the effect of weather on launch operations. It also simulates a toxic gas dispersion model, the resulting human health risk, and a debris dispersion model with 3D visualization. Since all modeling and simulation is web-based, it could reduce the cost of launch and range safety operations by enabling extensive research before a particular launch. Each model has an independent decision-making module to derive the best decision for launch.

  9. Modeling and simulation of different and representative engineering problems using Network Simulation Method.

    PubMed

    Sánchez-Pérez, J F; Marín, F; Morales, J L; Cánovas, M; Alhama, F

    2018-01-01

    Mathematical models simulating different and representative engineering problems (atomic dry friction, moving front problems, and elastic and solid mechanics) are presented in the form of sets of non-linear, coupled or uncoupled differential equations. For different values of the parameters that influence the solution, each problem is numerically solved by the network method, which provides all the variables of the problem. Although the models are extremely sensitive to these parameters, no assumptions are made regarding linearization of the variables. The design of the models, which run on standard electrical circuit simulation software, is explained in detail. The network model results are compared with common numerical methods or experimental data published in the scientific literature to show the reliability of the models.
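    The electrical-analogy idea behind the network method can be illustrated with its simplest case: a single RC loop standing in for a first-order physical process (e.g., lumped cooling). This sketch integrates the loop directly in Python; the paper instead hands such networks to standard circuit simulation software, and the component values below are illustrative:

```python
import math

def rc_network_response(v0, r, c, t_end, dt=1e-4):
    """Explicit integration of a single RC loop, the electrical analogue of a
    first-order relaxation process: C dV/dt = -V/R."""
    v, t = v0, 0.0
    while t < t_end:
        i = v / r            # current through the resistor (Ohm's law)
        v -= dt * i / c      # capacitor discharge step
        t += dt
    return v

v_num = rc_network_response(v0=1.0, r=2.0, c=0.5, t_end=1.0)
v_exact = math.exp(-1.0 / (2.0 * 0.5))  # analytic solution exp(-t/RC)
```

    The method's appeal is that once a physical problem is mapped onto such a network, the circuit solver handles the non-linear coupled system without explicit linearization.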

  10. Climate and marine biogeochemistry during the Holocene from transient model simulations

    NASA Astrophysics Data System (ADS)

    Segschneider, Joachim; Schneider, Birgit; Khon, Vyacheslav

    2018-06-01

    Climate and marine biogeochemistry changes over the Holocene are investigated based on transient global climate and biogeochemistry model simulations over the last 9500 years. The simulations are forced by accelerated and non-accelerated orbital parameters, respectively, and atmospheric pCO2, CH4, and N2O. The analysis focusses on key climatic parameters of relevance to the marine biogeochemistry, and on the physical and biogeochemical processes that drive atmosphere-ocean carbon fluxes and changes in the oxygen minimum zones (OMZs). The simulated global mean ocean temperature is characterized by a mid-Holocene cooling and a late Holocene warming, a common feature among Holocene climate simulations which, however, contradicts a proxy-derived mid-Holocene climate optimum. As the most significant result, and only in the non-accelerated simulation, we find a substantial increase in volume of the OMZ in the eastern equatorial Pacific (EEP) continuing into the late Holocene. The concurrent increase in apparent oxygen utilization (AOU) and age of the water mass within the EEP OMZ can be attributed to a weakening of the deep northward inflow into the Pacific. This results in a large-scale mid-to-late Holocene increase in AOU in most of the Pacific and hence the source regions of the EEP OMZ waters. The simulated expansion of the EEP OMZ raises the question of whether the deoxygenation that has been observed over the last 5 decades could be a - perhaps accelerated - continuation of an orbitally driven decline in oxygen. Changes in global mean biological production and export of detritus remain of the order of 10 %, with generally lower values in the mid-Holocene. The simulated atmosphere-ocean CO2 flux would result in atmospheric pCO2 changes of similar magnitudes to those observed for the Holocene, but with different timing. More technically, as the increase in EEP OMZ volume can only be simulated with the non-accelerated model simulation, non-accelerated model

  11. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification

    PubMed Central

    Sager, Jennifer E.; Yu, Jingjing; Ragueneau-Majlessi, Isabelle

    2015-01-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms “PBPK” and “physiologically based pharmacokinetic model” to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines. PMID:26296709

  12. The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model.

    PubMed

    Fabian, M Patricia; Stout, Natasha K; Adamkiewicz, Gary; Geggel, Amelia; Ren, Cizao; Sandel, Megan; Levy, Jonathan I

    2012-09-18

    In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma but in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. 
Our model simulates the effect of environmental factors (combustion pollutants and allergens), medication compliance, seasonality, and medical history on asthma outcomes.
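    A toy version of the exposure-to-outcome chain can be sketched in a few lines. The per-step exacerbation risks below are assumed illustrative numbers standing in for the FEV1%-based risk equations of the full model:

```python
import random

def count_exacerbations(per_step_risk, steps, seed=42):
    """Monthly-step toy cohort simulation: at each step a child has an
    exacerbation with probability `per_step_risk` (in the full model this
    risk is driven by FEV1%, which in turn responds to pollutant exposures)."""
    rng = random.Random(seed)
    draws = [rng.random() for _ in range(steps)]
    return sum(d < per_step_risk for d in draws)

# Same random seed so both scenarios see identical chance events
baseline = count_exacerbations(per_step_risk=0.10, steps=120)      # higher exposure
intervention = count_exacerbations(per_step_risk=0.06, steps=120)  # improved exhaust fans
```

    Comparing scenarios on shared random draws mirrors how the exhaust-fan intervention example is evaluated: only the exposure pathway changes, so outcome differences are attributable to it.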

  13. The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model

    PubMed Central

    2012-01-01

    Background In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma but in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. Methods We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Results Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. Conclusions We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. 
Our model simulates the effect of environmental factors (combustion pollutants and allergens), medication compliance, seasonality

  14. A quasi-biennial oscillation signal in general circulation model simulations.

    PubMed

    Cariolle, D; Amodei, M; Déqué, M; Mahfouf, J F; Simon, P; Teyssédre, H

    1993-09-03

    The quasi-biennial oscillation (QBO) is a free atmospheric mode that affects the equatorial lower stratosphere. With a quasi-regular frequency, the mean equatorial zonal wind alternates from easterly to westerly regimes. This oscillation is zonally symmetric about the equator, has its largest amplitude in the latitudinal band from 20 degrees S to 20 degrees N, and has a mean period of about 27 months. The QBO appears to originate in the momentum deposition produced by the damping in the stratosphere of equatorial waves excited by diabatic thermal processes in the troposphere. The results of three 10-year simulations obtained with three general circulation models are reported, all of which show the development in the stratosphere of a QBO signal with a period and a spatial propagating structure that are in good agreement with observations without any ad hoc parameterization of equatorial wave forcing. Although the amplitude of the oscillation in the simulations is still less than the observed value, the result is promising for the development of global climate models.

  15. A simulation of water pollution model parameter estimation

    NASA Technical Reports Server (NTRS)

    Kibler, J. F.

    1976-01-01

    A parameter estimation procedure for a water pollution transport model is elaborated. A two-dimensional instantaneous-release shear-diffusion model serves as representative of a simple transport process. Pollution concentration levels are obtained via modeling of a remote-sensing system: the remote-sensed data are simulated by adding Gaussian noise to the concentration values generated by the transport model. Model parameters are estimated from the simulated data using a least-squares batch processor. The required resolution, sensor array size, and number and location of sensor readings can be determined from the accuracies of the parameter estimates.
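    The simulate-then-estimate loop can be sketched as follows: generate noisy "remote-sensed" concentrations from the 2-D instantaneous-release diffusion solution, then recover the diffusivity by least squares. A grid search stands in for the paper's batch processor, the shear term is omitted, and all parameter values are illustrative:

```python
import numpy as np

def concentration(x, y, t, mass, d):
    """2-D instantaneous-release diffusion solution (in the frame moving
    with the release): C = M/(4*pi*D*t) * exp(-r^2/(4*D*t))."""
    return mass / (4 * np.pi * d * t) * np.exp(-(x**2 + y**2) / (4 * d * t))

rng = np.random.default_rng(0)
xs, ys = np.meshgrid(np.linspace(-5, 5, 21), np.linspace(-5, 5, 21))
true_d, mass, t = 2.0, 1.0, 1.0

# Simulated sensor readings: model output plus Gaussian measurement noise
observed = concentration(xs, ys, t, mass, true_d) + rng.normal(0, 1e-4, xs.shape)

# Least-squares fit over a grid of candidate diffusivities
candidates = np.arange(0.5, 4.01, 0.05)
errors = [np.sum((observed - concentration(xs, ys, t, mass, d))**2) for d in candidates]
d_hat = candidates[int(np.argmin(errors))]
```

    Repeating this with different noise levels, grid resolutions, or sensor layouts gives exactly the accuracy-versus-design trade-offs the abstract describes.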

  16. Great Britain Storm Surge Modeling for a 10,000-Year Stochastic Catalog with the Effect of Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Keshtpoor, M.; Carnacina, I.; Blair, A.; Yablonsky, R. M.

    2017-12-01

    Storm surge caused by extratropical cyclones (ETCs) has significantly impacted not only the lives of private citizens but also the insurance and reinsurance industry in Great Britain. Storm surge risk assessment requires a larger dataset of storms than the limited historical record of ETCs provides. Thus, historical ETCs were perturbed to generate a 10,000-year stochastic catalog that accounts for surge-generating ETCs in the study area with return periods from one year to 10,000 years. The Delft3D Flexible Mesh hydrodynamic model was used to numerically simulate storm surge along the Great Britain coastline. A nested-grid technique was used to increase the simulation grid resolution to 200 m near highly populated coastal areas. The coarse and fine mesh models were calibrated and validated using historical recorded water elevations, and numerical simulations were then performed on the 10,000-year stochastic catalog. The 50-, 100-, and 500-year return period maps were generated for Great Britain coastal areas. The corresponding events with return periods of 50, 100, and 500 years in the Humber Bay and Thames River coastal areas were identified and simulated with projected sea level rise, to reveal its effect on the inundation return period maps in these two highly populated coastal areas. Finally, the return period of Storm Xaver (2013) was determined with and without the effect of rising sea levels.
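    Return period maps rest on estimates like the following: reading return levels off a stochastic catalog of annual maxima. The Weibull plotting position used here is an illustrative choice, and catalog studies typically fit extreme value distributions rather than the raw empirical curve:

```python
import numpy as np

def empirical_return_levels(annual_maxima, return_periods):
    """Empirical return levels from a catalog of annual maximum surges,
    using the plotting position T = (N + 1) / rank (rank 1 = largest)."""
    x = np.sort(np.asarray(annual_maxima, dtype=float))[::-1]  # descending
    n = len(x)
    levels = {}
    for t in return_periods:
        rank = (n + 1) / t                           # rank whose exceedance period is T
        idx = min(n - 1, max(0, int(round(rank)) - 1))
        levels[t] = x[idx]
    return levels

# Synthetic stand-in for a 10,000-year catalog of annual maximum surge heights (m)
rng = np.random.default_rng(1)
catalog = rng.gumbel(loc=1.0, scale=0.3, size=10_000)
levels = empirical_return_levels(catalog, [50, 100, 500])
```

    The value of a long stochastic catalog is visible here: a 500-year level read from 10,000 synthetic years is far more stable than one extrapolated from a few decades of observations.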

  17. Simulations of Tropospheric NO2 by the Global Modeling Initiative (GMI) Model Utilizing Assimilated and Forecast Meteorological Fields: Comparison to Ozone Monitoring Instrument (OMI) Measurements

    NASA Technical Reports Server (NTRS)

    Rodriguez, J. M.; Yoshida, Y.; Duncan, B. N.; Bucsela, E. J.; Gleason, J. F.; Allen, D.; Pickering, K. E.

    2007-01-01

    We present simulations of tropospheric composition for the years 2004 and 2005, carried out by the GMI Combined Stratosphere-Troposphere (Combo) model at a resolution of 2deg x 2.5deg. The model includes a new parameterization of lightning sources of NO(x), which is coupled to the cloud mass fluxes in the adopted meteorological fields. These simulations use two different sets of input meteorological fields: (a) late-look assimilated fields from the Global Modeling and Assimilation Office (GMAO) GEOS-4 system, and (b) 12-hour forecast fields initialized with the assimilated data. Comparison of the forecast to the assimilated fields indicates that the forecast fields exhibit less vigorous convection and yield tropical precipitation fields in better agreement with observations. Since these simulations include a complete representation of the stratosphere, they provide realistic stratosphere-troposphere fluxes of O3 and NO(y). Furthermore, the stratospheric contribution to total columns of different tropospheric species can be subtracted in a consistent fashion, and the lightning production of NO(y) will depend on the adopted meteorological field. We concentrate here on the simulated tropospheric columns of NO2 and compare them to observations by the OMI instrument for the years 2004 and 2005. The comparison is used to address these questions: (a) is there a significant difference in the agreement/disagreement between simulations for these two different meteorological fields, and if so, what causes these differences? (b) how do the simulations compare to OMI observations, and does this comparison indicate an improvement in simulations with the forecast fields? (c) what are the implications of these simulations for our understanding of the NO2 emissions over continental polluted regions?

  18. A future Outlook: Web based Simulation of Hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Islam, A. S.; Piasecki, M.

    2003-12-01

    Despite recent advances in presenting simulation results as 3D graphs or animated contours, the modeling user community still faces several shortcomings when trying to move around and analyze data. Typical problems include the lack of common platforms with a standard vocabulary to exchange simulation results from different numerical models, insufficient descriptions of data (metadata), the lack of robust search and retrieval tools for data, and difficulties in reusing simulation domain knowledge. This research demonstrates how to create a shared simulation domain on the WWW and run a number of models through multi-user interfaces. First, meta-datasets have been developed to describe hydrodynamic model data based on the geographic metadata standard (ISO 19115), which has been extended to satisfy the needs of the hydrodynamic modeling community. The Extensible Markup Language (XML) is used to publish this metadata via the Resource Description Framework (RDF). A specific domain ontology for Web Based Simulation (WBS) has been developed to explicitly define the vocabulary for the knowledge-based simulation system. Subsequently, this knowledge-based system is converted into an object model using the Meta Object Facility (MOF). The knowledge-based system acts as a meta model for the object-oriented system, which aids in reusing the domain knowledge. Specific simulation software has been developed based on the object-oriented model. Finally, all model data are stored in an object relational database; database back-ends help store, retrieve, and query information efficiently. This research uses open source software and technology such as Java Servlets and JSP, the Apache web server, the Tomcat servlet engine, PostgreSQL databases, the Protégé ontology editor, RDQL and RQL for querying RDF at the semantic level, and the Jena Java API for RDF. It also uses international standards such as the ISO 19115 metadata standard, and specifications such as XML, RDF, OWL, XMI, and UML.
The final web based simulation product is deployed as

  19. An electrical circuit model for simulation of indoor radon concentration.

    PubMed

    Musavi Nasab, S M; Negarestani, A

    2013-01-01

    In this study, a new model based on electric circuit theory was introduced to simulate the behaviour of indoor radon concentration. In this model, a voltage source simulates radon generation in the walls, conductivity simulates migration through the walls, and the voltage across a capacitor simulates the radon concentration in a room. The simulation considers migration of radon through the walls by a diffusion mechanism in one-dimensional geometry. Data reported for a typical Greek house were employed to examine the application of this simulation technique to the behaviour of radon.
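    The circuit analogy maps radon entry into the room onto charging of a capacitor through a resistor, so the indoor concentration follows the familiar RC charging curve. A minimal sketch of that analogy with illustrative (not measured) rates:

```python
import math

# Circuit analogy with illustrative values, not data from the study:
#   E        -> source "voltage": radon entry rate into room air, Bq m^-3 h^-1
#   LAM_EFF  -> leak through the "resistor": decay + ventilation removal, h^-1
#   radon(t) -> "voltage across the capacitor": indoor concentration, Bq m^-3
E = 10.0
LAM_EFF = 0.7

def radon(t, c0=0.0):
    """solution of dC/dt = E - LAM_EFF*C, the same form as an RC charging curve"""
    c_inf = E / LAM_EFF                          # steady-state concentration
    return c_inf + (c0 - c_inf) * math.exp(-LAM_EFF * t)

steady = radon(100.0)                            # effectively the long-time limit
```

    The time constant 1/LAM_EFF plays the role of RC: after a few multiples of it the room concentration saturates at E/LAM_EFF, just as a capacitor saturates at the source voltage.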

  20. Integrated modeling and heat treatment simulation of austempered ductile iron

    NASA Astrophysics Data System (ADS)

    Hepp, E.; Hurevich, V.; Schäfer, W.

    2012-07-01

    The integrated modeling and simulation of the casting and heat treatment processes for producing austempered ductile iron (ADI) castings is presented. The focus is on describing different models to simulate the austenitization, quenching and austempering steps during ADI heat treatment. The starting point for the heat treatment simulation is the simulated microstructure after solidification and cooling. The austenitization model considers the transformation of the initial ferrite-pearlite matrix into austenite as well as the dissolution of graphite in austenite to attain a uniform carbon distribution. The quenching model is based on measured CCT diagrams. Measurements have been carried out to obtain these diagrams for different alloys with varying Cu, Ni and Mo contents. The austempering model includes nucleation and growth kinetics of the ADI matrix. The model of ADI nucleation is based on experimental measurements made for varied Cu, Ni, Mo contents and austempering temperatures. The ADI kinetic model uses a diffusion controlled approach to model the growth. The models have been integrated in a tool for casting process simulation. Results are shown for the optimization of the heat treatment process of a planetary carrier casting.

  1. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S environments and infrastructure.

  2. Land surface modeling in convection permitting simulations

    NASA Astrophysics Data System (ADS)

    van Heerwaarden, Chiel; Benedict, Imme

    2017-04-01

    The next generation of weather and climate models permits convection, albeit at a grid spacing that is not sufficient to resolve all details of the clouds. Whereas much attention is being devoted to the correct simulation of convective clouds and the associated precipitation, the role of the land surface has received far less interest. In our view, convection permitting simulations pose a set of problems that need to be solved before accurate weather and climate prediction is possible. The heart of the problem lies in the direct runoff and in the nonlinearity of the surface stress as a function of soil moisture. In coarse resolution simulations, where convection is not permitted, precipitation that reaches the land surface is uniformly distributed over the grid cell. Subsequently, a fraction of this precipitation is intercepted by vegetation or leaves the grid cell via direct runoff, whereas the remainder infiltrates into the soil. As soon as we move to convection permitting simulations, this precipitation often falls locally in large amounts. If the same land-surface model is used as in simulations with parameterized convection, this leads to an increase in direct runoff. Furthermore, spatially non-uniform infiltration leads to a very different surface stress when scaled up to the coarse resolution of simulations without convection. Based on large-eddy simulations of realistic convection events over a large domain, this study presents a quantification of the errors made at the land surface in convection permitting simulations. It compares the magnitude of these errors to those made in the convection itself due to the coarse resolution of the simulation. We find that convection permitting simulations have less evaporation than simulations with parameterized convection, resulting in an unrealistic drying of the atmosphere. We present solutions to resolve this problem.

  3. Models and Simulations as a Service: Exploring the Use of Galaxy for Delivering Computational Models

    PubMed Central

    Walker, Mark A.; Madduri, Ravi; Rodriguez, Alex; Greenstein, Joseph L.; Winslow, Raimond L.

    2016-01-01

    We describe the ways in which Galaxy, a web-based reproducible research platform, can be used for web-based sharing of complex computational models. Galaxy allows users to seamlessly customize and run simulations on cloud computing resources, a concept we refer to as Models and Simulations as a Service (MaSS). To illustrate this application of Galaxy, we have developed a tool suite for simulating a high spatial-resolution model of the cardiac Ca2+ spark that requires supercomputing resources for execution. We also present tools for simulating models encoded in the SBML and CellML model description languages, thus demonstrating how Galaxy's reproducible research features can be leveraged by existing technologies. Finally, we demonstrate how the Galaxy workflow editor can be used to compose integrative models from constituent submodules. This work represents what is, to our knowledge, a novel approach to making computational simulations more accessible to the broader scientific community. PMID:26958881

  4. Circulation and rainfall climatology of a 10-year (1979 - 1988) integration with the Goddard Laboratory for atmospheres general circulation model

    NASA Technical Reports Server (NTRS)

    Kim, J.-H.; Sud, Y. C.

    1993-01-01

    A 10-year (1979-1988) integration of the Goddard Laboratory for Atmospheres (GLA) general circulation model (GCM) under the Atmospheric Model Intercomparison Project (AMIP) is analyzed and compared with observations. The first-moment fields of circulation variables and also hydrological variables, including precipitation, evaporation, and soil moisture, are presented. Our goals are (1) to produce a benchmark documentation of the GLA GCM for future model improvements; (2) to examine systematic errors between the simulated and the observed circulation, precipitation, and hydrologic cycle; (3) to examine the interannual variability of the simulated atmosphere and compare it with observations; and (4) to examine the ability of the model to capture the major climate anomalies in response to events such as El Nino and La Nina. The 10-year mean seasonal and annual simulated circulation is quite reasonable compared to the analyzed circulation, except in the polar regions and areas of high orography. Precipitation over the tropics is quite well simulated, and the signal of El Nino/La Nina episodes can be easily identified. The time series of evaporation and soil moisture in the 12 biomes of the biosphere also show reasonable patterns compared to the estimated evaporation and soil moisture.

  5. Evaluation of the Simulation of Arctic and Antarctic Sea Ice Coverages by Eleven Major Global Climate Models

    NASA Technical Reports Server (NTRS)

    Parkinson, Claire; Vinnikov, Konstantin Y.; Cavalieri, Donald J.

    2005-01-01

    Comparison of polar sea ice results from 11 major global climate models and satellite-derived observations for 1979-2004 reveals that each of the models is simulating seasonal cycles that are phased at least approximately correctly in both hemispheres. Each is also simulating various key aspects of the observed ice cover distributions, such as winter ice not only throughout the central Arctic basin but also throughout Hudson Bay, despite its relatively low latitudes. However, some of the models simulate too much ice, others too little ice (in some cases varying depending on hemisphere and/or season), and some match the observations better in one season versus another. Several models do noticeably better in the Northern Hemisphere than in the Southern Hemisphere, and one does noticeably better in the Southern Hemisphere. In the Northern Hemisphere all simulate monthly average ice extents to within +/-5.1 x 10(exp 6) sq km of the observed ice extent throughout the year; and in the Southern Hemisphere all except one simulate the monthly averages to within +/-6.3 x 10(exp 6) sq km of the observed values. All the models properly simulate a lack of winter ice to the west of Norway; however, most do not obtain as much absence of ice immediately north of Norway as the observations show, suggesting an undersimulation of the North Atlantic Current. The spread in monthly averaged ice extents amongst the 11 model simulations is greater in the Southern Hemisphere than in the Northern Hemisphere, and greatest in the Southern Hemisphere winter and spring.

  6. The atmospheric boundary layer in the CSIRO global climate model: simulations versus observations

    NASA Astrophysics Data System (ADS)

    Garratt, J. R.; Rotstayn, L. D.; Krummel, P. B.

    2002-07-01

    A 5-year simulation of the atmospheric boundary layer in the CSIRO global climate model (GCM) is compared with detailed boundary-layer observations at six locations, two over the ocean and four over land. Field observations, in the form of surface fluxes and vertical profiles of wind, temperature and humidity, are generally available for each hour over periods of one month or more in a single year. GCM simulations are for specific months corresponding to the field observations, for each of five years. At three of the four land sites (two in Australia, one in south-eastern France), modelled rainfall was close to the observed climatological values, but was significantly in deficit at the fourth (Kansas, USA). Observed rainfall during the field expeditions was close to climatology at all four sites. At the Kansas site, modelled screen temperatures (Tsc), diurnal temperature amplitude and sensible heat flux (H) were significantly higher than observed, with modelled evaporation (E) much lower. At the other three land sites, there is excellent correspondence between the diurnal amplitude and phase and absolute values of each variable (Tsc, H, E). Mean monthly vertical profiles for specific times of the day show strong similarities: over land and ocean in vertical shape and absolute values of variables, and in the mixed-layer and nocturnal-inversion depths (over land) and the height of the elevated inversion or height of the cloud layer (over the sea). Of special interest is the climatological presence of early-morning humidity inversions related to dewfall, and of nocturnal low-level jets; such features are found in the GCM simulations. The observed day-to-day variability in vertical structure is captured well in the model for most sites, including, over a whole month, the temperature range at all levels in the boundary layer, and the mix of shallow and deep mixed layers. Weaknesses or unrealistic structure include the following: (a) unrealistic model mixed

  7. Putting aquifers into atmospheric simulation models: An example from the Mill Creek Watershed, Northeastern Kansas

    USGS Publications Warehouse

    York, J.P.; Person, M.; Gutowski, W.J.; Winter, T.C.

    2002-01-01

    Aquifer-atmosphere interactions can be important in regions where the water table is shallow (<2 m). A shallow water table provides moisture for the soil and vegetation and thus acts as a source term for evapotranspiration to the atmosphere. A coupled aquifer-land surface-atmosphere model has been developed to study aquifer-atmosphere interactions in watersheds on decadal timescales. A single-column, vertically discretized atmospheric model is linked to a distributed soil-vegetation-aquifer model. This physically based model was able to reproduce monthly and yearly trends in precipitation, stream discharge, and evapotranspiration for a catchment in northeastern Kansas. However, the calculated soil moisture tended to drop to levels lower than were observed in drier years. The evapotranspiration varies spatially and seasonally and was highest in cells situated in topographic depressions where the water table is in the root zone. Annually, simulation results indicate that 5-20% of evapotranspiration is drawn from the aquifer. The groundwater supported fraction of evapotranspiration is higher in drier years, when evapotranspiration exceeds precipitation. A long-term (40 year) simulation of extended drought conditions indicated that water table position is a function of groundwater hydrodynamics and cannot be predicted solely on the basis of topography. The response time of the aquifer to drought conditions was on the order of 200 years, indicating that feedbacks between these two water reservoirs act on disparate time scales. With recent advances in the computational power of massively parallel supercomputers, it may soon become possible to incorporate physically based representations of aquifer hydrodynamics into general circulation model (GCM) land surface parameterization schemes. © 2002 Elsevier Science Ltd. All rights reserved.

  8. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
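    The core loop of such a toolset (sample the simulator's input space, run the simulator, regress a cheap surrogate, quantify its accuracy on held-out runs) can be shown in miniature. This is a hypothetical sketch, not REVEAL's actual API: the "expensive" simulator is a one-variable quadratic, and the ROM is a polynomial fitted by normal-equation least squares:

```python
import random

def simulator(x):
    """stand-in for an expensive high-fidelity HPC run (here a cheap quadratic)"""
    return 1.0 + 2.0 * x - 0.5 * x * x

def fit_quadratic(xs, ys):
    """least-squares fit of y ~ c0 + c1*x + c2*x^2 via the normal equations"""
    A = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    for col in range(3):                  # Gaussian elimination, partial pivoting
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                   # back substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
    return coef

random.seed(2)
design = [random.uniform(0.0, 4.0) for _ in range(20)]      # sampled design points
coef = fit_quadratic(design, [simulator(x) for x in design])

def rom(x):
    """the reduced-order model: evaluates instantly instead of as an HPC job"""
    return coef[0] + coef[1] * x + coef[2] * x * x

# quantify ROM accuracy on held-out samples; a builder like REVEAL automates
# this step and iterates sampling until the error is acceptable
held_out = [random.uniform(0.0, 4.0) for _ in range(50)]
max_err = max(abs(rom(x) - simulator(x)) for x in held_out)
```

    Swapping in a different `fit_*` routine or sampling scheme is the "mix and match" extensibility the paper describes.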

  9. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    NASA Technical Reports Server (NTRS)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

    On February 1st, 2010 U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes to the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved to be controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty regarding the Constellation program presented a major challenge to the DES team: how to continue developing this program-of-record simulation while remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make it flexible enough to support possible future vehicles while at the same time being specific enough to support the program-of-record. This challenge was compounded by the fact that the model was being developed through the traditional DES process-orientation, which lacked the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure by identifying what was generic, finding natural logic break points, and standardizing the interlogic numbering system. The outcome of this work was a model that not only was ready to be easily modified to support any future rocket programs, but also a model that was extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides builders of traditional process-oriented discrete

  10. A Five-Year CMAQ PM2.5 Model Performance for Wildfires and Prescribed Fires

    NASA Astrophysics Data System (ADS)

    Wilkins, J. L.; Pouliot, G.; Foley, K.; Rappold, A.; Pierce, T. E.

    2016-12-01

    Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. Two components of the biomass burning inventory, wildfires and prescribed fires, are routinely estimated in the national emissions inventory. However, there is a large amount of uncertainty in the development of these emission inventory sectors. We have completed a 5-year set of CMAQ model simulations (2008-2012) in which we have simulated regional air quality with and without the wildfire and prescribed fire inventory. We will examine CMAQ model performance over regions with significant PM2.5 and ozone contributions from prescribed fires and wildfires. We will also review plume rise to see how it affects model bias, and compare CMAQ's current fire emissions input to an hourly dataset from FLAMBE.

  11. Modeling a maintenance simulation of the geosynchronous platform

    NASA Technical Reports Server (NTRS)

    Kleiner, A. F., Jr.

    1980-01-01

    A modeling technique used to conduct a simulation study comparing various maintenance routines for a space platform is discussed. A system model is described and illustrated, the basic concepts of a simulation pass are detailed, and sections on failures and maintenance are included. The operation of the system across time is best modeled by a discrete event approach with two basic events - failure and maintenance of the system. Each overall simulation run consists of introducing a particular model of the physical system, together with a maintenance policy, demand function, and mission lifetime. The system is then run through many passes, each pass corresponding to one mission, and the model is re-initialized before each pass. Statistics are compiled at the end of each pass, and after the last pass a report is printed. Items of interest typically include the time to first maintenance, total number of maintenance trips for each pass, average capability of the system, etc.
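    The pass structure described above, failure events interleaved with maintenance and statistics compiled per pass, is easy to prototype. A sketch assuming exponentially distributed failures and instantaneous maintenance, with the mission lifetime and failure rate chosen purely for illustration:

```python
import random

random.seed(3)
MISSION = 10.0   # mission lifetime (years); illustrative value
MTBF = 3.0       # mean time between failures (years); illustrative value

def one_pass():
    """one mission: exponential failure times, each followed by an
    (assumed instantaneous) maintenance trip that restores the system"""
    t, trips, first = 0.0, 0, None
    while True:
        t += random.expovariate(1.0 / MTBF)   # schedule the next failure event
        if t >= MISSION:
            return trips, first
        trips += 1                            # dispatch a maintenance trip
        if first is None:
            first = t                         # time to first maintenance

# many passes, re-initializing before each, then compile the pass statistics
results = [one_pass() for _ in range(5000)]
avg_trips = sum(trips for trips, _ in results) / len(results)
first_times = [first for _, first in results if first is not None]
avg_first_maintenance = sum(first_times) / len(first_times)
```

    Comparing `avg_trips` and `avg_first_maintenance` across alternative maintenance policies is the comparison the study's report stage performs.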

  12. Defense Modeling and Simulation Initiative

    DTIC Science & Technology

    1992-05-01

    project solicitation and priority ranking process, and reviewing policy issues. The activities of the DMSO and MSWG are also supported by a series of... issues have been raised for discussion, including: * Promulgation of standards for the interoperability of models and simulations * Modeling and... have been completed or will be completed in the near term. The policy issues should be defined at a high level in the near term, although their

  13. Expected lifetime numbers, risks, and burden of osteoporotic fractures for 50-year old Chinese women: a discrete event simulation incorporating FRAX.

    PubMed

    Jiang, Yawen; Ni, Weiyi

    2016-11-01

    This work was undertaken to provide an estimation of the expected lifetime numbers, risks, and burden of fractures for 50-year-old Chinese women. A discrete event simulation model was developed to simulate the lifetime fractures of 50-year-old Chinese women at average risk of osteoporotic fracture. Main events in the model included hip fracture, clinical vertebral fracture, wrist fracture, humerus fracture, and other fracture. Fracture risks were calculated using the FRAX® tool. Simulations of 50-year-old Chinese women without fracture risks were also carried out as a comparison to determine the burden of fractures. A 50-year-old Chinese woman at average risk of fracture is expected to experience 0.135 (95% CI: 0.134-0.137) hip fractures, 0.120 (95% CI: 0.119-0.122) clinical vertebral fractures, 0.095 (95% CI: 0.094-0.096) wrist fractures, 0.079 (95% CI: 0.078-0.080) humerus fractures, and 0.407 (95% CI: 0.404-0.410) other fractures over the remainder of her life. The residual lifetime risk of any fracture, hip fracture, clinical vertebral fracture, wrist fracture, humerus fracture, and other fracture for a 50-year-old Chinese woman is 37.36%, 11.77%, 10.47%, 8.61%, 7.30%, and 27.80%, respectively. The fracture-attributable excess quality-adjusted life year (QALY) loss and lifetime costs are estimated at 0.11 QALYs (95% CI: 0.00-0.22 QALYs) and US $714.61 (95% CI: US $709.20-720.02), totaling a net monetary benefit loss of US $1,104.43 (95% CI: US $904.09-1,304.78). Chinese women 50 years of age are at high risk of osteoporotic fracture, and the expected economic and quality-of-life burden attributable to osteoporotic fractures among Chinese women is substantial.
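    The simulation logic can be illustrated with a much-simplified Monte Carlo version: follow each woman year by year from age 50 until death and count fracture events. The flat annual hip-fracture probability and the mortality distribution below are invented placeholders; the actual model draws competing event times from FRAX-based, age-dependent risks:

```python
import random

random.seed(4)
P_HIP = 0.004   # flat annual hip-fracture probability; invented, NOT a FRAX output

def lifetime_hip_fractures():
    """follow one 50-year-old woman until death, counting hip fractures;
    mortality is a clipped normal here, purely for illustration"""
    death_age = min(max(random.gauss(80.0, 8.0), 51.0), 100.0)
    count, age = 0, 50
    while age < death_age:
        if random.random() < P_HIP:   # one fracture draw per simulated year
            count += 1
        age += 1
    return count

N = 20_000
mean_hip = sum(lifetime_hip_fractures() for _ in range(N)) / N
```

    Even this toy version lands near the paper's 0.135 expected hip fractures, since the expected count is roughly the annual probability times the expected remaining years.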

  14. Methodology of modeling and measuring computer architectures for plasma simulations

    NASA Technical Reports Server (NTRS)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties encountered on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA - the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  15. Trained simulated ultrasound patients: medical students as models, learners, and teachers.

    PubMed

    Blickendorf, J Matthew; Adkins, Eric J; Boulger, Creagh; Bahner, David P

    2014-01-01

    Medical educators must develop ultrasound education programs to ensure that future physicians are prepared to face the changing demands of clinical practice. It can be challenging to find human models for hands-on scanning sessions. This article outlines an educational model from a large university medical center that uses medical students to fulfill the need for human models. During the 2011-2012 academic year, medical students from The Ohio State University College of Medicine served as trained simulated ultrasound patients (TSUP) for hands-on scanning sessions held by the college and many residency programs. The extracurricular program is voluntary and coordinated by medical students with faculty supervision. Students receive a longitudinal didactic and hands-on ultrasound education program as an incentive for serving as a TSUP. The College of Medicine and 7 residency programs used the program, which included 47 second-year and 7 first-year student volunteers. Participation has increased annually because of the program's ease, reliability, and cost savings in providing normal anatomic models for ultrasound education programs. A key success of this program is its inherent reproducibility, as a new class of eager students constitutes the volunteer pool each year. The TSUP program is a feasible and sustainable method of fulfilling the need for normal anatomic ultrasound models while serving as a valuable extracurricular ultrasound education program for medical students. The program facilitates the coordination of ultrasound education programs by educators at the undergraduate and graduate levels.

  16. Intercomparison of Model Simulations of the Impact of 1997/98 El Nino on South American Summer Monsoon

    NASA Technical Reports Server (NTRS)

    Zhou, Jiayu; Lau, K.-M.; Lau, William K. M. (Technical Monitor)

    2002-01-01

    The simulations of the climatology and response of the South American summer monsoon (SASM) to the 1997/98 El Nino are investigated using six atmospheric general circulation models. Results show that all models simulate the large-scale features of the SASM reasonably well. However, both the stationary and seasonal components of the surface pressure are overestimated, resulting in an excessively strong SASM in the model climatology. The low-level northwesterly jet over the eastern foothills of the Andes is not well resolved because of the coarse resolution of the models. Large rainfall simulation biases are found in association with the Andes and the Atlantic ITCZ, indicating model problems in handling steep mountains and the parameterization of convective processes. The simulation of the 1997/98 El Nino impact on the SASM is examined based on an ensemble of ten two-year (September 1996 - August 1998) integrations. Results show that most models can simulate the large-scale tropospheric warming response over the tropical central Pacific, including the dynamic response of Rossby wave propagation of the Pacific-South America (PSA) pattern that influences remote areas. Deficiencies are found in simulating the regional impacts over South America. The model simulations fail to capture the southeastward expansion of anomalously warm tropospheric air. As a result, the upper tropospheric anomalous high over the subtropical Andes is less pronounced, and the enhancement of the subtropical westerly jet is displaced 5deg-10deg equatorward compared to observations. Over the Amazon basin, the shift of the Walker cell induced by El Nino is not well represented, showing anomalous easterlies in both the upper and lower troposphere.

  17. Propulsion simulation for magnetically suspended wind tunnel models

    NASA Technical Reports Server (NTRS)

    Joshi, Prakash B.; Beerman, Henry P.; Chen, James; Krech, Robert H.; Lintz, Andrew L.; Rosen, David I.

    1990-01-01

    The feasibility of simulating propulsion-induced aerodynamic effects on scaled aircraft models in wind tunnels employing Magnetic Suspension and Balance Systems (MSBS) was investigated. The investigation concerned itself with techniques for generating exhaust jets of appropriate characteristics. The objectives were to: (1) define the thrust and mass flow requirements of the jets; (2) evaluate techniques for generating propulsive gas within the volume limitations imposed by magnetically suspended models; (3) conduct simple diagnostic experiments for techniques involving new concepts; and (4) recommend experiments for demonstration of propulsion simulation techniques. Various techniques for generating exhaust jets of appropriate characteristics were evaluated on scaled aircraft models in wind tunnels with MSBS. Four concepts of remotely operated propulsion simulators were examined. Three conceptual designs involving innovative adaptation of convenient technologies (compressed gas cylinders, liquid, and solid propellants) were developed. The fourth innovative concept, namely the laser-assisted thruster, which can potentially simulate both inlet and exhaust flows, was found to require very high power levels for small thrust levels.

  18. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenwood, Michael S; Cetiner, Mustafa S; Fugate, David L

    Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary functionality to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  19. One-year simulation of ozone and particulate matter in China using WRF/CMAQ modeling system

    NASA Astrophysics Data System (ADS)

    Hu, Jianlin; Chen, Jianjun; Ying, Qi; Zhang, Hongliang

    2016-08-01

    China has been experiencing severe air pollution in recent decades. Although an ambient air quality monitoring network for criteria pollutants has been constructed in over 100 cities since 2013 in China, the temporal and spatial characteristics of some important pollutants, such as particulate matter (PM) components, remain unknown, limiting further studies that investigate potential air pollution control strategies to improve air quality and that associate human health outcomes with air pollution exposure. In this study, a yearlong (2013) air quality simulation using the Weather Research and Forecasting (WRF) model and the Community Multi-scale Air Quality (CMAQ) model was conducted to provide detailed temporal and spatial information on ozone (O3), total PM2.5, and its chemical components. The Multi-resolution Emission Inventory for China (MEIC) was used for anthropogenic emissions, and observation data obtained from the national air quality monitoring network were collected to validate model performance. The model successfully reproduces the O3 and PM2.5 concentrations at most cities for most months, with model performance statistics meeting the performance criteria. However, the model generally overpredicts O3 at the low end of the concentration range and underpredicts PM2.5 at the low end of the range in summer. Spatially, the model performs better in southern China than in northern China, central China, and the Sichuan Basin. Strong seasonal variations of PM2.5 exist, and wind speed and direction play important roles in high-PM2.5 events. Secondary components have a broader distribution than primary components. Sulfate (SO42-), nitrate (NO3-), ammonium (NH4+), and primary organic aerosol (POA) are the most important PM2.5 components. All components have their highest concentrations in winter except secondary organic aerosol (SOA). This study proves the ability of the CMAQ model to reproduce severe air pollution in China and identifies the directions where improvements are needed.
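    The performance criteria referred to above are typically expressed through statistics such as mean fractional bias (MFB) and mean fractional error (MFE). As a minimal illustration (the formulas are the standard definitions; the paired observation/model values below are invented, not from this study):

```python
def mean_fractional_bias(obs, mod):
    # MFB = (2/N) * sum((M_i - O_i) / (M_i + O_i)), expressed as a fraction
    return 2.0 / len(obs) * sum((m - o) / (m + o) for o, m in zip(obs, mod))

def mean_fractional_error(obs, mod):
    # MFE = (2/N) * sum(|M_i - O_i| / (M_i + O_i))
    return 2.0 / len(obs) * sum(abs(m - o) / (m + o) for o, m in zip(obs, mod))

# invented daily PM2.5 values (ug/m3): observed vs. modeled
obs = [35.0, 80.0, 120.0, 60.0]
mod = [30.0, 90.0, 100.0, 70.0]
print(mean_fractional_bias(obs, mod))   # slightly negative: net underprediction
print(mean_fractional_error(obs, mod))
```

    Against the commonly cited PM criteria of Boylan and Russell (MFB within ±60%, MFE below 75%), these hypothetical values would pass.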

  20. Simulating ectomycorrhiza in boreal forests: implementing ectomycorrhizal fungi model MYCOFON in CoupModel (v5)

    NASA Astrophysics Data System (ADS)

    He, Hongxing; Meyer, Astrid; Jansson, Per-Erik; Svensson, Magnus; Rütting, Tobias; Klemedtsson, Leif

    2018-02-01

    The symbiosis between plants and Ectomycorrhizal fungi (ECM) is shown to considerably influence the carbon (C) and nitrogen (N) fluxes between the soil, rhizosphere, and plants in boreal forest ecosystems. However, ECM are either neglected or presented as an implicit, undynamic term in most ecosystem models, which can potentially reduce the predictive power of models.

    In order to investigate the necessity of an explicit consideration of ECM in ecosystem models, we implement the previously developed MYCOFON model into a detailed process-based, soil-plant-atmosphere model, Coup-MYCOFON, which explicitly describes the C and N fluxes between ECM and roots. This new Coup-MYCOFON model approach (ECM explicit) is compared with two simpler model approaches: one containing ECM implicitly as a dynamic uptake of organic N considering the plant roots to represent the ECM (ECM implicit), and the other a static N approach in which plant growth is limited to a fixed N level (nonlim). Parameter uncertainties are quantified using Bayesian calibration in which the model outputs are constrained to current forest growth and soil C / N ratio for four forest sites along a climate and N deposition gradient in Sweden and simulated over a 100-year period.

    The nonlim approach could not describe the soil C / N ratio, due to a large overestimation of soil N sequestration, but simulated the forest growth reasonably well. The ECM implicit and explicit approaches both describe the soil C / N ratio well but slightly underestimate the forest growth. The implicit approach simulated lower litter production and soil respiration than the explicit approach. The ECM explicit Coup-MYCOFON model provides a more detailed description of internal ecosystem fluxes and feedbacks of C and N between plants, soil, and ECM. Our modeling highlights the need to incorporate ECM and organic N uptake into ecosystem models; the nonlim approach is not recommended for future

  1. Virtual milk for modelling and simulation of dairy processes.

    PubMed

    Munir, M T; Zhang, Y; Yu, W; Wilson, D I; Young, B R

    2016-05-01

    The modeling of dairy processing using a generic process simulator suffers from shortcomings, given that many simulators do not contain milk components in their component libraries. Recently, pseudo-milk components for a commercial process simulator were proposed for simulation and the current work extends this pseudo-milk concept by studying the effect of both total milk solids and temperature on key physical properties such as thermal conductivity, density, viscosity, and heat capacity. This paper also uses expanded fluid and power law models to predict milk viscosity over the temperature range from 4 to 75°C and develops a succinct regressed model for heat capacity as a function of temperature and fat composition. The pseudo-milk was validated by comparing the simulated and actual values of the physical properties of milk. The milk thermal conductivity, density, viscosity, and heat capacity showed differences of less than 2, 4, 3, and 1.5%, respectively, between the simulated results and actual values. This work extends the capabilities of the previously proposed pseudo-milk and of a process simulator to model dairy processes, processing different types of milk (e.g., whole milk, skim milk, and concentrated milk) with different intrinsic compositions, and to predict correct material and energy balances for dairy processes. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  2. A School Finance Computer Simulation Model

    ERIC Educational Resources Information Center

    Boardman, Gerald R.

    1974-01-01

    Presents a description of the computer simulation model developed by the National Educational Finance Project for use by States in planning and evaluating alternative approaches for State support programs. Provides a general introduction to the model, a program operation overview, a sample run, and some conclusions. (Author/WM)

  3. An ocular biomechanic model for dynamic simulation of different eye movements.

    PubMed

    Iskander, J; Hossny, M; Nahavandi, S; Del Porto, L

    2018-04-11

    Simulating and analysing eye movement is useful for assessing the visual system's contribution to discomfort with respect to body movements, especially in virtual environments where simulation sickness might occur. It can also be used in the design of eye prostheses or humanoid robot eyes. In this paper, we present two biomechanic ocular models that are easily integrated into available musculoskeletal models; the modeling approach was previously used to simulate eye-head coordination. The models are used to simulate and analyse eye movements. The proposed models are based on physiological and kinematic properties of the human eye. They incorporate an eye-globe, orbital suspension tissues, and six muscles with their connective tissues (pulleys). Pulleys were incorporated in the rectus and inferior oblique muscles. The two proposed models are the passive pulleys model and the active pulleys model. Dynamic simulations of different eye movements, including fixation, saccade, and smooth pursuit, are performed to validate both models. The resultant force-length curves of the models were similar to the experimental data. The simulation results show that the proposed models are suitable for generating eye movement simulations with results comparable to other musculoskeletal models. The maximum kinematic root mean square error (RMSE) is 5.68° and 4.35° for the passive and active pulley models, respectively. The analysis of the muscle forces showed realistic muscle activation, with increased muscle synergy in the active pulley model. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Utility of Emulation and Simulation Computer Modeling of Space Station Environmental Control and Life Support Systems

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    Over the years, computer modeling has been used extensively in many disciplines to solve engineering problems. A set of computer program tools is proposed to assist the engineer in the various phases of the Space Station program from technology selection through flight operations. The development and application of emulation and simulation transient performance modeling tools for life support systems are examined. The results of the development and the demonstration of the utility of three computer models are presented. The first model is a detailed computer model (emulation) of a solid amine water desorbed (SAWD) CO2 removal subsystem combined with much less detailed models (simulations) of a cabin, crew, and heat exchangers. This model was used in parallel with the hardware design and test of this CO2 removal subsystem. The second model is a simulation of an air revitalization system combined with a wastewater processing system to demonstrate the capabilities to study subsystem integration. The third model is that of a Space Station total air revitalization system. The station configuration consists of a habitat module, a lab module, two crews, and four connecting nodes.

  5. Multiscale Modeling and Simulation of Material Processing

    DTIC Science & Technology

    2006-07-01

    Final Performance Report, 21-07-2006, covering the period 05-01-2003 to 04-30-2006, for the project "Multiscale Modeling and Simulation of Material Processing". The recoverable abstract fragments indicate that MPM using a single mesh tends to induce early contact, that a contact algorithm for contact pairs is illustrated alongside GIMP simulations, and that the work develops scaling laws for multiscale simulations from atomistic to continuum using material testing techniques such as tension and indentation.

  6. Creating Simulated Microgravity Patient Models

    NASA Technical Reports Server (NTRS)

    Hurst, Victor; Doerr, Harold K.; Bacal, Kira

    2004-01-01

    The Medical Operational Support Team (MOST) has been tasked by the Space and Life Sciences Directorate (SLSD) at the NASA Johnson Space Center (JSC) to integrate medical simulation into 1) medical training for ground and flight crews and into 2) evaluations of medical procedures and equipment for the International Space Station (ISS). To do this, the MOST requires patient models that represent the physiological changes observed during spaceflight. Despite the presence of physiological data collected during spaceflight, there is no defined set of parameters that illustrate or mimic a 'space normal' patient. Methods: The MOST culled space-relevant medical literature and data from clinical studies performed in microgravity environments. The areas of focus for data collection were in the fields of cardiovascular, respiratory and renal physiology. Results: The MOST developed evidence-based patient models that mimic the physiology believed to be induced by human exposure to a microgravity environment. These models have been integrated into space-relevant scenarios using a human patient simulator and ISS medical resources. Discussion: Despite the lack of a set of physiological parameters representing 'space normal,' the MOST developed space-relevant patient models that mimic microgravity-induced changes in terrestrial physiology. These models are used in clinical scenarios that will medically train flight surgeons, biomedical flight controllers (biomedical engineers; BME) and, eventually, astronaut-crew medical officers (CMO).

  7. High resolution simulations of aerosol microphysics in a global and regionally nested chemical transport model

    NASA Astrophysics Data System (ADS)

    Adams, P. J.; Marks, M.

    2015-12-01

    The aerosol indirect effect is the largest source of forcing uncertainty in current climate models. This effect arises from the influence of aerosols on the reflective properties and lifetimes of clouds, and its magnitude depends on how many particles can serve as cloud droplet formation sites. Assessing levels of this subset of particles (cloud condensation nuclei, or CCN) requires knowledge of aerosol levels and their global distribution, size distributions, and composition. A key tool for advancing our understanding of CCN is the use of global aerosol microphysical models, which simulate the processes that control aerosol size distributions: nucleation, condensation/evaporation, and coagulation. Previous studies have found important differences in CO (Chen, D. et al., 2009) and ozone (Jang, J., 1995) modeled at different spatial resolutions, and it is reasonable to believe that short-lived, spatially variable aerosol species will be similarly, or more, susceptible to model resolution effects. The goal of this study is to determine how CCN levels and spatial distributions change as simulations are run at higher spatial resolution; specifically, to evaluate how sensitive the model is to grid size and how this affects comparisons against observations. Higher-resolution simulations are needed to support model/measurement synergy. Simulations were performed using the global chemical transport model GEOS-Chem (v9-02). The years 2008 and 2009 were simulated at 4° × 5° and 2° × 2.5° globally and at 0.5° × 0.667° over Europe and North America. Results were evaluated against surface-based particle size distribution measurements from the European Supersites for Atmospheric Aerosol Research project. The fine-resolution model simulates more spatial and temporal variability in ultrafine levels and better resolves topography. Results suggest that the coarse model predicts systematically lower ultrafine levels than does the fine-resolution model. Significant

  8. Assessment of effectiveness of geologic isolation systems. Geologic-simulation model for a hypothetical site in the Columbia Plateau. Volume 2: results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, M.G.; Petrie, G.M.; Baldwin, A.J.

    1982-06-01

    This report contains the input data and computer results for the Geologic Simulation Model. This model is described in detail in the following report: Petrie, G.M., et al. 1981. Geologic Simulation Model for a Hypothetical Site in the Columbia Plateau, Pacific Northwest Laboratory, Richland, Washington. The Geologic Simulation Model is a quasi-deterministic process-response model which simulates, for a million years into the future, the development of the geologic and hydrologic systems of the ground-water basin containing the Pasco Basin. Effects of natural processes on the ground-water hydrologic system are modeled principally by rate equations. The combined effects and synergistic interactions of different processes are approximated by linear superposition of their effects during discrete time intervals in a stepwise-integration approach.

  9. Simplified energy-balance model for pragmatic multi-dimensional device simulation

    NASA Astrophysics Data System (ADS)

    Chang, Duckhyun; Fossum, Jerry G.

    1997-11-01

    To pragmatically account for non-local carrier heating and hot-carrier effects such as velocity overshoot and impact ionization in multi-dimensional numerical device simulation, a new simplified energy-balance (SEB) model is developed and implemented in FLOODS[16] as a pragmatic option. In the SEB model, the energy-relaxation length is estimated from a pre-process drift-diffusion simulation using the carrier-velocity distribution predicted throughout the device domain, and is used without change in a subsequent simpler hydrodynamic (SHD) simulation. The new SEB model was verified by comparison of two-dimensional SHD and full HD DC simulations of a submicron MOSFET. The SHD simulations yield detailed distributions of carrier temperature, carrier velocity, and impact-ionization rate, which agree well with the full HD simulation results obtained with FLOODS. The most noteworthy feature of the new SEB/SHD model is its computational efficiency, which results from reduced Newton iteration counts caused by the enhanced linearity. Relative to full HD, SHD simulation times can be shorter by as much as an order of magnitude since larger voltage steps for DC sweeps and larger time steps for transient simulations can be used. The improved computational efficiency can enable pragmatic three-dimensional SHD device simulation as well, for which the SEB implementation would be straightforward as it is in FLOODS or any robust HD simulator.

  10. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    NASA Astrophysics Data System (ADS)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2015-03-01

    We perform a land surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies between six modern stand-alone land surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indexes). There is good agreement (99-135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the best current observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1-128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and from the surface frost index), while permafrost distribution using the air temperature based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations, likely related to soil texture specification and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0°C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and the seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in permafrost distribution can be made for the Tibetan Plateau.
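    The most demanding diagnostic noted above, soil temperature at or below 0°C for 24 consecutive months, reduces to a simple scan over the simulated monthly ground temperature series. A minimal sketch, with invented temperature series rather than model output:

```python
def is_permafrost(monthly_ground_temp_c):
    """True if ground temperature stays at or below 0 deg C for at
    least 24 consecutive months (one model-based diagnostic)."""
    run = 0
    for t in monthly_ground_temp_c:
        run = run + 1 if t <= 0.0 else 0
        if run >= 24:
            return True
    return False

# hypothetical 3-year monthly series
always_frozen = [-2.0] * 36
seasonal_thaw = ([-2.0] * 10 + [1.0] * 2) * 3   # thaws two months each year
print(is_permafrost(always_frozen))   # True
print(is_permafrost(seasonal_thaw))   # False
```

    The seasonal-thaw case shows why this diagnostic is demanding: a model only slightly too warm in summer breaks the 24-month run and misclassifies the grid cell.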

  11. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    NASA Astrophysics Data System (ADS)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-02-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and from the surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations, likely related to soil texture specification, vegetation types, and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and the seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost distribution can be made for the Tibetan Plateau.

  12. Simulations of NLC formation using a microphysical model driven by three-dimensional dynamics

    NASA Astrophysics Data System (ADS)

    Kirsch, Annekatrin; Becker, Erich; Rapp, Markus; Megner, Linda; Wilms, Henrike

    2014-05-01

    Noctilucent clouds (NLCs) are an optical phenomenon occurring in the polar summer mesopause region. These clouds have been known since the late 19th century. The current physical understanding of NLCs is based on numerous observational and theoretical studies, in recent years especially observations from satellites and by lidars from the ground. Theoretical studies based on numerical models that simulate NLCs with the underlying microphysical processes are uncommon. To date, no three-dimensional numerical simulations of NLCs exist that take all relevant dynamical scales into account, i.e., from the planetary scale down to gravity waves and turbulence. Rather, modeling is usually restricted to certain flow regimes. In this study we make a more rigorous attempt and simulate NLC formation in the environment of the general circulation of the mesopause region by explicitly including gravity-wave motions. For this purpose we couple the Community Aerosol and Radiation Model for Atmospheres (CARMA) to gravity-wave-resolving dynamical fields simulated beforehand with the Kuehlungsborn Mechanistic Circulation Model (KMCM). In our case, the KMCM is run with a horizontal resolution of T120, which corresponds to a minimum horizontal wavelength of 350 km. This restriction causes the resolved gravity waves to be somewhat biased toward larger scales. The simulated general circulation is dynamically controlled by these waves in a self-consistent fashion and provides realistic temperatures and wind fields for July conditions. Assuming a water vapor mixing ratio profile in agreement with current observations results in reasonable supersaturations of up to 100. In a first step, CARMA is applied to a horizontal section covering the Northern Hemisphere. The vertical resolution is 120 levels ranging from 72 to 101 km. In this paper we present initial results of this coupled dynamical-microphysical model, focusing on the interaction of waves and turbulent diffusion with NLC microphysics.

  13. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-01-01

    Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at certain combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
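    The AUC metric used above can be computed directly from classifier scores, since it equals the normalized Mann-Whitney U statistic: the probability that a randomly chosen failed run receives a higher failure score than a randomly chosen successful run. A minimal sketch with invented crash labels and decision scores (not the study's data):

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U statistic; ties count half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical outcomes (1 = simulation crashed) and classifier scores
labels = [1, 1, 1, 0, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2, 0.1, 0.05]
print(roc_auc(labels, scores))   # 14/15, about 0.933
```

    An AUC of 0.5 would mean the classifier ranks crashed and successful runs no better than chance; values above 0.96, as reported, indicate a nearly perfect ranking.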

  14. Physically-Based Modelling and Real-Time Simulation of Fluids.

    NASA Astrophysics Data System (ADS)

    Chen, Jim Xiong

    1995-01-01

    Simulating physically realistic complex fluid behaviors presents an extremely challenging problem for computer graphics researchers. Such behaviors include the effects of driving boats through water, blending differently colored fluids, rain falling and flowing on a terrain, fluids interacting in a Distributed Interactive Simulation (DIS), etc. Such capabilities are useful in computer art, advertising, education, entertainment, and training. We present a new method for physically-based modeling and real-time simulation of fluids in computer graphics and dynamic virtual environments. By solving the 2D Navier -Stokes equations using a CFD method, we map the surface into 3D using the corresponding pressures in the fluid flow field. This achieves realistic real-time fluid surface behaviors by employing the physical governing laws of fluids but avoiding extensive 3D fluid dynamics computations. To complement the surface behaviors, we calculate fluid volume and external boundary changes separately to achieve full 3D general fluid flow. To simulate physical activities in a DIS, we introduce a mechanism which uses a uniform time scale proportional to the clock-time and variable time-slicing to synchronize physical models such as fluids in the networked environment. Our approach can simulate many different fluid behaviors by changing the internal or external boundary conditions. It can model different kinds of fluids by varying the Reynolds number. It can simulate objects moving or floating in fluids. It can also produce synchronized general fluid flows in a DIS. Our model can serve as a testbed to simulate many other fluid phenomena which have never been successfully modeled previously.

  15. Modeling, Simulation and Analysis of Public Key Infrastructure

    NASA Technical Reports Server (NTRS)

    Liu, Yuan-Kwei; Tuey, Richard; Ma, Paul (Technical Monitor)

    1998-01-01

    Security is an essential part of network communication. Advances in cryptography have provided solutions to many network security requirements. Public Key Infrastructure (PKI) is the foundation of cryptography applications. The main objective of this research is to design a model to simulate a reliable, scalable, manageable, and high-performance public key infrastructure. We build a model to simulate the NASA public key infrastructure using SimProcess and MATLAB software. The simulation spans from the top level all the way down to the computation needed for encryption, decryption, digital signatures, and a secure web server. The secure web server application could be utilized in wireless communications. The results of the simulation are analyzed and confirmed using queueing theory.
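    The queueing-theory confirmation mentioned above typically rests on closed-form results such as the M/M/1 formulas. A hedged sketch, with invented arrival and service rates for a hypothetical secure web server handling signing requests:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 queue metrics (Poisson arrivals,
    exponential service times, single server)."""
    rho = arrival_rate / service_rate          # server utilization
    assert rho < 1.0, "queue is unstable"
    l = rho / (1.0 - rho)                      # mean number in system
    w = 1.0 / (service_rate - arrival_rate)    # mean time in system
    return rho, l, w

# hypothetical load: 80 signing requests/s arriving, server completes 100/s
rho, l, w = mm1_metrics(80.0, 100.0)
print(rho, l, w)   # 0.8 utilization, 4 requests in system, 0.05 s response
```

    Comparing such analytic values against the simulated throughput and latency is one way a discrete-event simulation of a PKI service can be validated.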

  16. Detached-Eddy Simulation Based on the v2-f Model

    NASA Technical Reports Server (NTRS)

    Jee, Sol Keun; Shariff, Karim

    2012-01-01

    Detached eddy simulation (DES) based on the v2-f RANS model is proposed. This RANS model incorporates the anisotropy of near-wall turbulence which is absent in other RANS models commonly used in the DES community. In LES mode, the proposed DES formulation reduces to a transport equation for the subgrid-scale kinetic energy. The constant, CDES, required by this model was calibrated by simulating isotropic turbulence. In the final paper, DES simulations of canonical separated flows will be presented.

  17. Deep Drawing Simulations With Different Polycrystalline Models

    NASA Astrophysics Data System (ADS)

    Duchêne, Laurent; de Montleau, Pierre; Bouvier, Salima; Habraken, Anne Marie

    2004-06-01

    The goal of this research is to study anisotropic material behavior during forming processes, represented by both complex yield loci and kinematic-isotropic hardening models. The first part of this paper describes the main concepts of the 'Stress-strain interpolation' model that has been implemented in the non-linear finite element code Lagamine. This model consists of a local description of the yield locus based on the texture of the material through the full-constraints Taylor model. The texture evolution due to plastic deformations is computed throughout the FEM simulations. This 'local yield locus' approach was initially linked to the classical isotropic Swift hardening law. Recently, a more complex hardening model was implemented: the physically based microstructural model of Teodosiu. It takes into account intergranular heterogeneity due to the evolution of dislocation structures, which affects isotropic and kinematic hardening. The influence of the hardening model is compared to the influence of the texture evolution through deep drawing simulations.

  18. A Simulation Model for Measuring Customer Satisfaction through Employee Satisfaction

    NASA Astrophysics Data System (ADS)

    Zondiros, Dimitris; Konstantopoulos, Nikolaos; Tomaras, Petros

    2007-12-01

    Customer satisfaction is defined as a measure of how a firm's product or service performs compared to customers' expectations. It has long been a subject of research due to its importance for measuring marketing and business performance. Many models have been developed for its measurement. This paper proposes a simulation model using employee satisfaction as one of the most important factors leading to customer satisfaction (the others being expectations and disconfirmation of expectations). Data obtained from a two-year survey of bank customers in Greece were used. The application of three approaches regarding employee satisfaction resulted in greater customer satisfaction when there is a serious effort to keep employees satisfied.

  19. Modeling and simulation of different and representative engineering problems using Network Simulation Method

    PubMed Central

    2018-01-01

    Mathematical models simulating different and representative engineering problems (atomic dry friction, moving-front problems, and elastic and solid mechanics) are presented in the form of sets of non-linear differential equations, coupled or uncoupled. For the different parameter values that influence the solution, each problem is numerically solved by the network method, which provides all the variables of the problem. Although the models are extremely sensitive to these parameters, no assumptions are made regarding linearization of the variables. The design of the models, which are run on standard electrical circuit simulation software, is explained in detail. The network model results are compared with common numerical methods or experimental data published in the scientific literature to show the reliability of the models. PMID:29518121
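The network method itself is only outlined in the abstract; as an illustration of the class of problems it targets, below is a minimal sketch (not the authors' code, with made-up parameter values) of a Coulomb dry-friction oscillator integrated by explicit time stepping. In the network analogy each term of the balance equation would map to a circuit element; here the same equation is integrated directly.

```python
def simulate_dry_friction(m=1.0, k=4.0, f_c=0.3, x0=1.0, v0=0.0,
                          dt=1e-3, t_end=10.0):
    """Time-step m*x'' + f_c*sign(x') + k*x = 0 (Coulomb dry friction).

    In the network method the inertial term maps to a capacitor, the
    stiffness to an inductive branch, and the Coulomb friction to a
    nonlinear resistor; the integration below is a plain semi-implicit
    Euler scheme over the same balance equation.
    """
    x, v = x0, v0
    history = [(0.0, x)]
    for n in range(1, int(t_end / dt) + 1):
        friction = f_c * (1 if v > 0 else -1 if v < 0 else 0)
        a = (-k * x - friction) / m
        v += dt * a          # update velocity first (semi-implicit Euler)
        x += dt * v
        history.append((n * dt, x))
    return history

traj = simulate_dry_friction()
amp_start = abs(traj[0][1])
amp_end = max(abs(x) for _, x in traj[-2000:])   # amplitude over last 2 s
```

Coulomb friction removes a fixed amount of energy per cycle, so the amplitude decays roughly linearly before the mass sticks, which the time series reproduces.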

  20. Simulation Speed Analysis and Improvements of Modelica Models for Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jorissen, Filip; Wetter, Michael; Helsen, Lieve

    This paper presents an approach for speeding up Modelica models. Insight is provided into how Modelica models are solved and what determines the tool’s computational speed. Aspects such as algebraic loops, code efficiency and integrator choice are discussed. This is illustrated using simple building simulation examples and Dymola. The generality of the work is in some cases verified using OpenModelica. Using this approach, a medium sized office building including building envelope, heating ventilation and air conditioning (HVAC) systems and control strategy can be simulated at a speed five hundred times faster than real time.

  1. Influence of wheel-rail contact modelling on vehicle dynamic simulation

    NASA Astrophysics Data System (ADS)

    Burgelman, Nico; Sichani, Matin Sh.; Enblom, Roger; Berg, Mats; Li, Zili; Dollevoet, Rolf

    2015-08-01

    This paper presents a comparison of four models of rolling contact used for online contact force evaluation in rail vehicle dynamics. Until now only a few wheel-rail contact models have been used for online simulation in multibody software (MBS). Many more models exist and their behaviour has been studied offline, but a comparative study of the mutual influence between the calculation of the creep forces and the simulated vehicle dynamics seems to be missing. Such a comparison would help researchers with the assessment of accuracy and calculation time. The contact methods investigated in this paper are FASTSIM, Linder, Kik-Piotrowski and Stripes. They are compared through a coupling between an MBS for the vehicle simulation and Matlab for the contact models. This way the influence of the creep force calculation on the vehicle simulation is investigated. More specifically this study focuses on the influence of the contact model on the simulation of the hunting motion and on the curving behaviour.

  2. Hydrodynamic modeling of petroleum reservoirs using simulator MUFITS

    NASA Astrophysics Data System (ADS)

    Afanasyev, Andrey

    2015-04-01

    MUFITS is a new noncommercial software package for numerical modeling of subsurface processes in various applications (www.mufits.imec.msu.ru). To date, the simulator has been used for modeling nonisothermal flows in geothermal reservoirs and for modeling underground carbon dioxide storage. In this work, we present a recent extension of the code to petroleum reservoirs. The simulator can be applied in conventional black oil modeling, but it also provides more complex models for volatile oil and gas condensate reservoirs as well as for oil rim fields. We give a brief overview of the code, describing the internal representation of reservoir models, which are constructed of grid blocks, interfaces, and stock tanks, as well as pipe segments and pipe junctions for modeling wells and surface networks. For the conventional black oil approach, we present simulation results for the SPE comparative tests. We propose an accelerated compositional modeling method for sub- and supercritical flows subject to various phase equilibria, particularly three-phase equilibria of vapour-liquid-liquid type. The method is based on calculating the thermodynamic potential of the reservoir fluid as a function of pressure, total enthalpy and total composition, and storing its values as a spline table, which is used in the hydrodynamic simulation for accelerated prediction of PVT properties. We describe both the spline calculation procedure and the flashing algorithm. We evaluate the thermodynamic potential for a mixture of two pseudo-components modeling the heavy and light hydrocarbon fractions. We develop a technique for converting black oil PVT tables to the potential, which can be used for in-situ multiphase equilibria prediction of hydrocarbons under sub- and supercritical conditions, particularly in gas condensate and volatile oil reservoirs. We simulate recovery from a reservoir subject to near-critical initial conditions for the hydrocarbon mixture.
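The accelerated PVT approach above amounts to precomputing the fluid description on a (pressure, enthalpy) grid and interpolating during the hydrodynamic run. A minimal sketch of that tabulate-then-interpolate pattern follows; the toy linear density correlation stands in for a real equation-of-state flash, and MUFITS itself uses spline tables of a thermodynamic potential rather than this bilinear lookup:

```python
import bisect

def density(p, h):
    """Toy stand-in for an expensive EoS flash: rho(p [MPa], h [kJ/kg])."""
    return 800.0 - 0.5 * h + 2.0 * p

# Tabulation phase: evaluate the expensive function once on a coarse grid.
ps = [float(p) for p in range(1, 31)]            # pressure nodes, MPa
hs = [float(h) for h in range(100, 1101, 50)]    # enthalpy nodes, kJ/kg
table = [[density(p, h) for h in hs] for p in ps]

def lookup(p, h):
    """Bilinear interpolation in the precomputed table (p, h assumed in range)."""
    i = min(max(bisect.bisect_right(ps, p) - 1, 0), len(ps) - 2)
    j = min(max(bisect.bisect_right(hs, h) - 1, 0), len(hs) - 2)
    tp = (p - ps[i]) / (ps[i + 1] - ps[i])
    th = (h - hs[j]) / (hs[j + 1] - hs[j])
    return ((1 - tp) * (1 - th) * table[i][j] + tp * (1 - th) * table[i + 1][j]
            + (1 - tp) * th * table[i][j + 1] + tp * th * table[i + 1][j + 1])
```

Because the toy correlation is linear in p and h, bilinear interpolation reproduces it exactly; for a real equation of state the node spacing controls the accuracy/speed trade-off.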

  3. A 3-D ecosystem model in the Pacific Ocean and its simulations

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Ba, Q.

    2011-12-01

    A simple 3-D ecosystem model with nutrient, phytoplankton, zooplankton and detritus compartments is coupled into a basin-wide ocean general circulation model (OGCM) of the Pacific Ocean that has been validated with passive tracers such as tritium. The model was integrated for 500 years under the forcing of climatological monthly mean fields. The model generates distribution patterns of ecosystem variables similar to estimates based on satellite-derived chlorophyll maps from the vertically generalized production model, with low water-column NPP values in the subtropical region and high values in the subarctic and equatorial upwelling regions. However, the area and strength of the oligotrophic gyre are much larger than indicated by the observations. Compared with the observations, seasonal variations of surface chlorophyll concentrations and top 200-m average zooplankton biomass in the mid-to-high latitude regions are well simulated by the model. Because of the restoring term near the northern boundary used in the model, a false phytoplankton bloom can occur near 50°N during winter. An unrealistic maximum in the vertical profile of chlorophyll near ocean weather station Papa is also generated by the model. Through modification of the model structure and sensitivity tests of the associated parameters, the simulated results can be substantially improved. Although dividing nutrient into nitrate and ammonium and including DON in the model can alleviate the low-NPP problem in the subtropical region, modifying the sinking rate and decomposition rate of detritus is more effective. Introducing the influence of the mixed layer on the ecosystem processes and modifying the nutrient restoring near the northern boundary can overcome, to some extent, the shortcomings in the simulation of both the spring bloom near 50°N and the vertical profile of chlorophyll at Papa.
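The nutrient-phytoplankton-zooplankton-detritus (NPZD) structure described above can be sketched as a 0-D box model; the rate constants below are illustrative assumptions, not the paper's values. In a closed box the four tendencies must sum to zero, which gives a built-in conservation check:

```python
def npzd_step(N, P, Z, D, dt,
              mu=1.0, kN=1.0, g=0.8, kP=0.5,
              beta=0.7, mP=0.05, mZ=0.1, r=0.1):
    """One explicit Euler step of a closed-box NPZD model (illustrative rates, 1/day)."""
    uptake = mu * N / (kN + N) * P          # nutrient-limited phytoplankton growth
    grazing = g * P / (kP + P) * Z          # zooplankton grazing on P
    dP = uptake - grazing - mP * P
    dZ = beta * grazing - mZ * Z            # beta = assimilation efficiency
    dD = (1 - beta) * grazing + mP * P + mZ * Z - r * D
    dN = r * D - uptake                     # remineralization closes the loop
    return N + dt * dN, P + dt * dP, Z + dt * dZ, D + dt * dD

state = (8.0, 1.0, 0.5, 0.5)                # N, P, Z, D in mmol N / m^3
total0 = sum(state)
for _ in range(5000):                       # 100 days at dt = 0.02 day
    state = npzd_step(*state, dt=0.02)
```

Because every loss term in one compartment appears as a gain elsewhere, total nitrogen is conserved to rounding error, which is a useful sanity check before the compartments are split further (e.g. nitrate vs. ammonium, as the abstract discusses).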

  4. Simulation Model Development for Icing Effects Flight Training

    NASA Technical Reports Server (NTRS)

    Barnhart, Billy P.; Dickes, Edward G.; Gingras, David R.; Ratvasky, Thomas P.

    2003-01-01

    A high-fidelity simulation model for icing effects flight training was developed from wind tunnel data for the DeHavilland DHC-6 Twin Otter aircraft. First, a flight model of the un-iced airplane was developed and then modifications were generated to model the icing conditions. The models were validated against data records from the NASA Twin Otter Icing Research flight test program with only minimal refinements being required. The goals of this program were to demonstrate the effectiveness of such a simulator for training pilots to recognize and recover from icing situations and to establish a process for modeling icing effects to be used for future training devices.

  5. Testing simulations of intra- and inter-annual variation in the plant production response to elevated CO(2) against measurements from an 11-year FACE experiment on grazed pasture.

    PubMed

    Li, Frank Yonghong; Newton, Paul C D; Lieffering, Mark

    2014-01-01

    Ecosystem models play a crucial role in understanding and evaluating the combined impacts of rising atmospheric CO2 concentration and changing climate on terrestrial ecosystems. However, we are not aware of any studies where the capacity of models to simulate intra- and inter-annual variation in responses to elevated CO2 has been tested against long-term experimental data. Here we tested how well the ecosystem model APSIM/AgPasture was able to simulate the results from a free air carbon dioxide enrichment (FACE) experiment on grazed pasture. At this FACE site, during 11 years of CO2 enrichment, a wide range in annual plant production response to CO2 (-6 to +28%) was observed. As well as running the full model, which includes three plant CO2 response functions (plant photosynthesis, nitrogen (N) demand and stomatal conductance), we also tested the influence of these three functions on model predictions. Model/data comparisons showed that: (i) overall the model over-predicted the mean annual plant production response to CO2 (18.5% cf. 13.1%) largely because years with small or negative responses to CO2 were not well simulated; (ii) in general seasonal and inter-annual variation in plant production responses to elevated CO2 were well represented by the model; (iii) the observed CO2 enhancement in overall mean legume content was well simulated but year-to-year variation in legume content was poorly captured by the model; (iv) the best fit of the model to the data required all three CO2 response functions to be invoked; (v) using actual legume content and reduced N fixation rate under elevated CO2 in the model provided the best fit to the experimental data. We conclude that in temperate grasslands the N dynamics (particularly the legume content and N fixation activity) play a critical role in pasture production responses to elevated CO2, and are key processes for model improvement. © 2013 John Wiley & Sons Ltd.

  6. Effect of land model ensemble versus coupled model ensemble on the simulation of precipitation climatology and variability

    NASA Astrophysics Data System (ADS)

    Wei, Jiangfeng; Dirmeyer, Paul A.; Yang, Zong-Liang; Chen, Haishan

    2017-10-01

    Through a series of model simulations with an atmospheric general circulation model coupled to three different land surface models, this study investigates the impacts of a land model ensemble versus a coupled model ensemble on precipitation simulation. It is found that coupling an ensemble of land models to an atmospheric model has only a minor impact on the improvement of precipitation climatology and variability, whereas a simple ensemble average of the precipitation from the three individually coupled land-atmosphere models produces better results, especially for precipitation variability. The generally weak impact of land processes on precipitation is likely the main reason that the land model ensemble does not improve the precipitation simulation. However, if there are large biases in the land surface model or the land surface data set, correcting them could improve the simulated climate, especially for well-constrained regional climate simulations.
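The benefit of simple ensemble averaging reported above can be illustrated with synthetic data: three "members" sharing a common signal but carrying independent errors, where the ensemble mean has a smaller RMSE than any single member. All numbers below are made up for illustration:

```python
import math
import random

random.seed(0)
n = 2000
truth = [math.sin(0.01 * t) for t in range(n)]       # the "observed" series

# Three ensemble members: truth plus independent, equally sized errors.
members = [[x + random.gauss(0.0, 1.0) for x in truth] for _ in range(3)]
ens_mean = [sum(m[t] for m in members) / 3 for t in range(n)]

def rmse(series):
    """Root-mean-square error of a series against the truth."""
    return math.sqrt(sum((s, x) == () or (s - x) ** 2
                         for s, x in zip(series, truth)) / n)

def rmse(series):
    return math.sqrt(sum((s - x) ** 2 for s, x in zip(series, truth)) / n)

member_rmses = [rmse(m) for m in members]
mean_rmse = rmse(ens_mean)
```

With independent errors of standard deviation sigma, the mean's error shrinks toward sigma/sqrt(3), which is why averaging the three coupled simulations helps even when no single land model is better than the others.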

  7. Contextualizing the relevance of basic sciences: small-group simulation with debrief for first- and second-year medical students in an integrated curriculum.

    PubMed

    Ginzburg, Samara B; Brenner, Judith; Cassara, Michael; Kwiatkowski, Thomas; Willey, Joanne M

    2017-01-01

    There has been a call for increased integration of basic and clinical sciences during preclinical years of undergraduate medical education. Despite the recognition that clinical simulation is an effective pedagogical tool, little has been reported on its use to demonstrate the relevance of basic science principles to the practice of clinical medicine. We hypothesized that simulation with an integrated science and clinical debrief used with early learners would illustrate the importance of basic science principles in clinical diagnosis and management of patients. Small groups of first- and second-year medical students were engaged in a high-fidelity simulation followed by a comprehensive debrief facilitated by a basic scientist and clinician. Surveys including anchored and open-ended questions were distributed at the conclusion of each experience. The majority of the students agreed that simulation followed by an integrated debrief illustrated the clinical relevance of basic sciences (mean ± standard deviation: 93.8% ± 2.9% of first-year medical students; 96.7% ± 3.5% of second-year medical students) and its importance in patient care (92.8% of first-year medical students; 90.4% of second-year medical students). In a thematic analysis of open-ended responses, students felt that these experiences provided opportunities for direct application of scientific knowledge to diagnosis and treatment, improving student knowledge, simulating real-world experience, and developing clinical reasoning, all of which specifically helped them understand the clinical relevance of basic sciences. Small-group simulation followed by a debrief that integrates basic and clinical sciences is an effective means of demonstrating the relationship between scientific fundamentals and patient care for early learners. As more medical schools embrace integrated curricula and seek opportunities for integration, our model is a novel approach that can be utilized.

  8. STEPS: Modeling and Simulating Complex Reaction-Diffusion Systems with Python

    PubMed Central

    Wils, Stefan; Schutter, Erik De

    2008-01-01

    We describe how the use of the Python language improved the user interface of the program STEPS. STEPS is a simulation platform for modeling and stochastic simulation of coupled reaction-diffusion systems with complex 3-dimensional boundary conditions. Setting up such models is a complicated process that consists of many phases. Initial versions of STEPS relied on a static input format that did not cleanly separate these phases, limiting modelers in how they could control the simulation and becoming increasingly complex as new features and new simulation algorithms were added. We solved all of these problems by tightly integrating STEPS with Python, using SWIG to expose our existing simulation code. PMID:19623245

  9. Simulation model of a twin-tail, high performance airplane

    NASA Technical Reports Server (NTRS)

    Buttrill, Carey S.; Arbuckle, P. Douglas; Hoffler, Keith D.

    1992-01-01

    The mathematical model and associated computer program to simulate a twin-tailed high-performance fighter airplane (McDonnell Douglas F/A-18) are described. The simulation program is written in the Advanced Continuous Simulation Language. The simulation math model includes the nonlinear six-degree-of-freedom rigid-body equations, an engine model, sensors, and first-order actuators with rate and position limiting. A simplified form of the F/A-18 digital control laws (version 8.3.3) is implemented. The simulated control law includes only inner-loop augmentation in the up-and-away flight mode. The aerodynamic forces and moments are calculated from a wind-tunnel-derived database using table look-ups with linear interpolation. The aerodynamic database has an angle-of-attack range of -10 to +90 degrees and a sideslip range of -20 to +20 degrees. The effects of elastic deformation are incorporated in a quasi-static-elastic manner. Elastic degrees of freedom are not actively simulated. In the engine model, the throttle-commanded steady-state thrust level and the dynamic response characteristics of the engine are based on airflow rate as determined from a table look-up. Afterburner dynamics are switched in at a threshold based on the engine airflow and commanded thrust.
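First-order actuators with rate and position limiting, as mentioned above, follow a standard pattern; here is a minimal sketch (the time constant and limits are illustrative values, not the F/A-18 numbers):

```python
def actuator_step(x, cmd, dt, tau=0.05, rate_lim=60.0, pos_lim=25.0):
    """One step of a first-order lag actuator with rate and position limits.

    x, cmd, pos_lim in degrees; rate_lim in deg/s; tau in seconds.
    """
    rate = (cmd - x) / tau                       # unlimited first-order response
    rate = max(-rate_lim, min(rate_lim, rate))   # rate limiting
    x = x + rate * dt
    return max(-pos_lim, min(pos_lim, x))        # position limiting

# Step command of 30 deg: the surface slews at the rate limit,
# then saturates at the 25 deg position limit.
x = 0.0
trace = []
for k in range(1000):                            # 1 s at dt = 1 ms
    x = actuator_step(x, 30.0, 0.001)
    trace.append(x)
```

The order of operations matters: the rate limit is applied to the commanded rate before integration, and the position limit clamps the integrated state, so a command beyond the position limit never winds the surface past its stop.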

  10. Scale-Similar Models for Large-Eddy Simulations

    NASA Technical Reports Server (NTRS)

    Sarghini, F.

    1999-01-01

    Scale-similar models employ multiple filtering operations to identify the smallest resolved scales, which have been shown to be the most active in the interaction with the unresolved subgrid scales. They do not assume that the principal axes of the strain-rate tensor are aligned with those of the subgrid-scale stress (SGS) tensor, and allow the explicit calculation of the SGS energy. They can provide backscatter in a numerically stable and physically realistic manner, and predict SGS stresses in regions that are well correlated with the locations where large Reynolds stress occurs. In this paper, eddy viscosity and mixed models, which include an eddy-viscosity part as well as a scale-similar contribution, are applied to the simulation of two flows, a high Reynolds number plane channel flow, and a three-dimensional, nonequilibrium flow. The results show that simulations without models or with the Smagorinsky model are unable to predict nonequilibrium effects. Dynamic models provide an improvement of the results: the adjustment of the coefficient results in more accurate prediction of the perturbation from equilibrium. The Lagrangian-ensemble approach [Meneveau et al., J. Fluid Mech. 319, 353 (1996)] is found to be very beneficial. Models that included a scale-similar term and a dissipative one, as well as the Lagrangian ensemble averaging, gave results in the best agreement with the direct simulation and experimental data.
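The core of a scale-similar model is a second, explicit "test" filter applied to the resolved field: the modeled SGS stress is the generalized Leonard term formed from filtered products. A minimal 1-D sketch with a three-point box filter follows (an illustration of the idea, not the paper's formulation):

```python
import math

def box_filter(u, width=3):
    """Three-point top-hat test filter with periodic wrap-around."""
    n, half = len(u), width // 2
    return [sum(u[(i + k) % n] for k in range(-half, half + 1)) / width
            for i in range(n)]

def scale_similar_stress(u):
    """Scale-similar diagonal SGS stress: tau_11 ~ bar(u*u) - bar(u)*bar(u)."""
    uu_bar = box_filter([x * x for x in u])
    u_bar = box_filter(u)
    return [a - b * b for a, b in zip(uu_bar, u_bar)]

n = 64
u = [math.sin(2 * math.pi * i / n) + 0.3 * math.sin(14 * math.pi * i / n)
     for i in range(n)]
tau = scale_similar_stress(u)
```

For a box filter the diagonal term bar(uu) - bar(u)^2 is a local variance, so it is nonnegative, vanishes for a constant field, and is largest where the smallest resolved scales are most active, which is what makes the model well correlated with the true SGS stress.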

  11. An accurate behavioral model for single-photon avalanche diode statistical performance simulation

    NASA Astrophysics Data System (ADS)

    Xu, Yue; Zhao, Tingchen; Li, Ding

    2018-01-01

    An accurate behavioral model is presented to simulate important statistical characteristics of single-photon avalanche diodes (SPADs), such as dark count and after-pulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate the dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in the after-pulsing model, and a simple analytical expression is derived to estimate the after-pulsing probability. In particular, the key model parameters of avalanche triggering probability and electric field dependence of excess bias voltage are extracted from Geiger-mode TCAD simulation, and this behavioral simulation model does not include any empirical parameters. The developed SPAD model is implemented in the Verilog-A behavioral hardware description language and runs successfully on the commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in good agreement with the test data, validating the high simulation accuracy.
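The after-pulsing mechanism (a carrier trapped during an avalanche is released after the dead time and retriggers the diode) can be checked with a small Monte Carlo model against the corresponding closed-form probability. All parameter values below are illustrative assumptions, not values extracted in the paper:

```python
import math
import random

random.seed(1)

# Illustrative parameters (not from the paper)
p_trap = 0.1         # probability a carrier is trapped during an avalanche
tau_release = 100.0  # trap release time constant, ns
t_dead = 50.0        # detector dead time, ns
p_trigger = 0.8      # probability a released carrier retriggers an avalanche

def afterpulse_occurs():
    """One avalanche: maybe trap a carrier, release it exponentially, retrigger?"""
    if random.random() >= p_trap:
        return False
    t_release = random.expovariate(1.0 / tau_release)
    return t_release > t_dead and random.random() < p_trigger

n = 50000
mc_prob = sum(afterpulse_occurs() for _ in range(n)) / n

# Closed-form counterpart for a single trap level:
# P_ap = p_trap * exp(-t_dead / tau_release) * p_trigger
analytic = p_trap * math.exp(-t_dead / tau_release) * p_trigger
```

With one trap level and an exponential release time, the probability that the release falls outside the dead time is exp(-t_dead/tau), which is where the closed-form expression comes from; the Monte Carlo estimate converges to it.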

  12. On Improving 4-km Mesoscale Model Simulations

    NASA Astrophysics Data System (ADS)

    Deng, Aijun; Stauffer, David R.

    2006-03-01

    A previous study showed that use of analysis-nudging four-dimensional data assimilation (FDDA) and improved physics in the fifth-generation Pennsylvania State University National Center for Atmospheric Research Mesoscale Model (MM5) produced the best overall performance on a 12-km-domain simulation, based on the 18-19 September 1983 Cross-Appalachian Tracer Experiment (CAPTEX) case. However, reducing the simulated grid length to 4 km had detrimental effects. The primary cause was likely the explicit representation of convection accompanying a cold-frontal system. Because no convective parameterization scheme (CPS) was used, the convective updrafts were forced on coarser-than-realistic scales, and the rainfall and the atmospheric response to the convection were too strong. The evaporative cooling and downdrafts were too vigorous, causing widespread disruption of the low-level winds and spurious advection of the simulated tracer. In this study, a series of experiments was designed to address this general problem involving 4-km model precipitation and gridpoint storms and associated model sensitivities to the use of FDDA, planetary boundary layer (PBL) turbulence physics, grid-explicit microphysics, a CPS, and enhanced horizontal diffusion. Some of the conclusions include the following: 1) Enhanced parameterized vertical mixing in the turbulent kinetic energy (TKE) turbulence scheme has shown marked improvements in the simulated fields. 2) Use of a CPS on the 4-km grid improved the precipitation and low-level wind results. 3) Use of the Hong and Pan Medium-Range Forecast PBL scheme showed larger model errors within the PBL and a clear tendency to predict much deeper PBL heights than the TKE scheme. 4) Combining observation-nudging FDDA with a CPS produced the best overall simulations. 5) Finer horizontal resolution does not always produce better simulations, especially in convectively unstable environments, and a new CPS suitable for 4-km resolution is needed.

  13. Numerical Modeling Studies of Wake Vortices: Real Case Simulations

    NASA Technical Reports Server (NTRS)

    Shen, Shao-Hua; Ding, Feng; Han, Jongil; Lin, Yuh-Lang; Arya, S. Pal; Proctor, Fred H.

    1999-01-01

    A three-dimensional large-eddy simulation model, TASS, is used to simulate the behavior of aircraft wake vortices in a real atmosphere. The purpose for this study is to validate the use of TASS for simulating the decay and transport of wake vortices. Three simulations are performed and the results are compared with the observed data from the 1994-1995 Memphis field experiments. The selected cases have an atmospheric environment of weak turbulence and stable stratification. The model simulations are initialized with appropriate meteorological conditions and a post roll-up vortex system. The behavior of wake vortices as they descend within the atmospheric boundary layer and interact with the ground is discussed.

  14. COSP: Satellite simulation software for model assessment

    DOE PAGES

    Bodas-Salcedo, A.; Webb, M. J.; Bony, S.; ...

    2011-08-01

    Errors in the simulation of clouds in general circulation models (GCMs) remain a long-standing issue in climate projections, as discussed in the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report. This highlights the need for developing new analysis techniques to improve our knowledge of the physical processes at the root of these errors. The Cloud Feedback Model Intercomparison Project (CFMIP) pursues this objective, and under that framework the CFMIP Observation Simulator Package (COSP) has been developed. COSP is a flexible software tool that enables the simulation of several satellite-borne active and passive sensor observations from model variables. The flexibility of COSP and a common interface for all sensors facilitates its use in any type of numerical model, from high-resolution cloud-resolving models to the coarser-resolution GCMs assessed by the IPCC, and the scales in between used in weather forecast and regional models. The diversity of model parameterization techniques makes the comparison between model and observations difficult, as some parameterized variables (e.g., cloud fraction) do not have the same meaning in all models. The approach followed in COSP permits models to be evaluated against observations and compared against each other in a more consistent manner. This thus permits a more detailed diagnosis of the physical processes that govern the behavior of clouds and precipitation in numerical models. The World Climate Research Programme (WCRP) Working Group on Coupled Modelling has recommended the use of COSP in a subset of climate experiments that will be assessed by the next IPCC report. Here we describe COSP, present some results from its application to numerical models, and discuss future work that will expand its capabilities.

  15. Rule-based simulation models

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Seraphine, Kathleen M.

    1991-01-01

    Procedural modeling systems, rule based modeling systems, and a method for converting a procedural model to a rule based model are described. Simulation models are used to represent real time engineering systems. A real time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling system languages are based on FORTRAN or some other procedural language. Therefore, they must be enhanced with a reaction capability. Rule based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule based system can be generated by a knowledge acquisition tool or a source level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule based system. Neural models can provide the high-capacity data manipulation required by the most complex real-time models.
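The knowledge network described above (calculations decomposed into simple operations that fire reactively when their inputs change) can be sketched as a tiny forward-chaining evaluator. The class and rules below are hypothetical illustrations of the idea, not the paper's system:

```python
class RuleNetwork:
    """Minimal reactive knowledge network: rules re-fire when inputs change."""

    def __init__(self):
        self.values = {}
        self.rules = []          # list of (target, inputs, fn)

    def add_rule(self, target, inputs, fn):
        self.rules.append((target, inputs, fn))

    def assert_fact(self, name, value):
        self.values[name] = value
        self._propagate()

    def _propagate(self):
        changed = True
        while changed:           # fire rules until the network is quiescent
            changed = False
            for target, inputs, fn in self.rules:
                if all(i in self.values for i in inputs):
                    new = fn(*(self.values[i] for i in inputs))
                    if self.values.get(target) != new:
                        self.values[target] = new
                        changed = True

net = RuleNetwork()
net.add_rule("force", ["mass", "accel"], lambda m, a: m * a)
net.add_rule("momentum", ["mass", "vel"], lambda m, v: m * v)
net.assert_fact("mass", 2.0)
net.assert_fact("accel", 3.0)    # rule fires reactively
net.assert_fact("vel", 4.0)
```

Changing any asserted fact (e.g. re-asserting "accel") re-fires only the rules whose outputs actually change, which is the reactive behavior the abstract contrasts with procedural FORTRAN-style evaluation.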

  16. Crowd Simulation Incorporating Agent Psychological Models, Roles and Communication

    DTIC Science & Technology

    2005-01-01

    ... system (PMFserv) that implements human behavior models from a range of ability, stress, emotion, decision-theoretic and motivation sources. ... Keywords: autonomous agents, human behavior models, culture and emotions. ... There are many applications of computer animation and simulation where ... We describe a new architecture to integrate a psychological model into a crowd simulation system in order to obtain believable emergent behaviors

  17. A Modular Simulation Framework for Assessing Swarm Search Models

    DTIC Science & Technology

    2014-09-01

    Wanier, Blake M. Numerical studies demonstrate the ability to leverage the developed simulation and analysis framework to investigate three canonical swarm search models ... as benchmarks for future exploration of more sophisticated swarm search scenarios. Subject terms: swarm search, search theory, modeling framework

  18. Mechanism of ENSO influence on the South Asian monsoon rainfall in global model simulations

    NASA Astrophysics Data System (ADS)

    Joshi, Sneh; Kar, Sarat C.

    2018-02-01

    Coupled ocean-atmosphere global climate models are increasingly being used for seasonal-scale simulation of the South Asian monsoon. In these models, sea surface temperatures (SSTs) evolve through coupled air-sea interaction processes. However, sensitivity experiments with various SST forcings can only be done in an atmosphere-only model. In this study, the Global Forecast System (GFS) model at T126 horizontal resolution has been used to examine the mechanism of El Niño-Southern Oscillation (ENSO) forcing on the monsoon circulation and rainfall. The model has been integrated (as an ensemble) with observed, climatological and ENSO SST forcings to document how the South Asian monsoon responds to basin-wide SST variations in the Indian and Pacific Oceans. The model simulations indicate that the internal variability is modulated by the SSTs, with warming in the Pacific enhancing the ensemble spread over the monsoon region compared to cooling conditions. Anomalous easterly wind anomalies cover the Indian region at both the 850 and 200 hPa levels during El Niño years. The locations and intensity of the Walker and Hadley circulations are altered by the ENSO SST forcing. These changes lead to a reduction of monsoon rainfall over most parts of India during El Niño events compared to La Niña conditions. However, internally generated variability is a major source of uncertainty in the model-simulated climate.

  19. Visual performance modeling in the human operator simulator

    NASA Technical Reports Server (NTRS)

    Strieb, M. I.

    1979-01-01

    A brief description of the history of the development of the human operator simulator (HOS) model is presented. Features of the HOS micromodels that impact on the obtainment of visual performance data are discussed along with preliminary details on a HOS pilot model designed to predict the results of visual performance workload data obtained through oculometer studies on pilots in real and simulated approaches and landings.

  20. Shoulder arthroscopy simulator training improves shoulder arthroscopy performance in a cadaveric model.

    PubMed

    Henn, R Frank; Shah, Neel; Warner, Jon J P; Gomoll, Andreas H

    2013-06-01

    The purpose of this study was to quantify the benefits of shoulder arthroscopy simulator training with a cadaveric model of shoulder arthroscopy. Seventeen first-year medical students with no prior experience in shoulder arthroscopy were enrolled and completed this study. Each subject completed a baseline proctored arthroscopy on a cadaveric shoulder, which included controlling the camera and completing a standard series of tasks using the probe. The subjects were randomized, and 9 of the subjects received training on a virtual reality simulator for shoulder arthroscopy. All subjects then repeated the same cadaveric arthroscopy. The arthroscopic videos were analyzed in a blinded fashion for time to task completion and subjective assessment of technical performance. The 2 groups were compared by use of Student t tests, and change over time within groups was analyzed with paired t tests. There were no observed differences between the 2 groups on the baseline evaluation. The simulator group improved significantly from baseline with respect to time to completion and subjective performance (P < .05). Time to completion was significantly faster in the simulator group compared with controls at the final evaluation (P < .05). No difference was observed between the groups on the subjective scores at the final evaluation (P = .98). Shoulder arthroscopy simulator training resulted in significant benefits in clinical shoulder arthroscopy time to task completion in this cadaveric model. This study provides important additional evidence of the benefit of simulators in orthopaedic surgical training. There may be a role for simulator training in shoulder arthroscopy education. Copyright © 2013 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
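The within-group comparisons above use paired t-tests, which have a short closed form worth making explicit. A sketch with made-up task-completion times (seconds, baseline vs. final; not the study's data):

```python
import math

def paired_t(before, after):
    """Paired Student t statistic for the mean of the differences."""
    d = [b - a for b, a in zip(before, after)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance (ddof=1)
    return mean / math.sqrt(var / n)

# Hypothetical task times (s): a positive t means times improved.
baseline = [300.0, 280.0, 350.0, 310.0, 295.0]
final = [299.0, 278.0, 347.0, 306.0, 290.0]
t_stat = paired_t(baseline, final)
```

The statistic is then compared against the t-distribution with n - 1 degrees of freedom to obtain the p-value; pairing each subject with their own baseline is what lets a small cohort like this study's detect a within-subject improvement.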

  1. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Kevin M.; Smith, Brennan T.; Witt, Adam M.

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  2. A practical laboratory study simulating the percutaneous lumbar transforaminal epidural injection: training model in fresh cadaveric sheep spine.

    PubMed

    Suslu, Husnu

    2012-01-01

    Laboratory training models are essential for developing and refining treatment skills before the clinical application of surgical and invasive procedures. A simple simulation model is needed for young trainees to learn how to handle instruments and to perform safe lumbar transforaminal epidural injections. Our aim is to present a fresh cadaveric sheep lumbar spine model that simulates the lumbar transforaminal epidural injection. The material consists of a 2-year-old fresh cadaveric sheep spine. A 4-step approach was designed for lumbar transforaminal epidural injection under C-arm fluoroscopy. The fluoroscope was adjusted to obtain a proper oblique view while the specimen was stabilized in a prone position, and the procedure was then performed under C-arm guidance. The model simulates the steps of standard lumbar transforaminal epidural injections in the human spine well. The cadaveric sheep spine is a good training method that simulates fluoroscopic lumbar transforaminal epidural steroid injection procedures performed in the human spine.

  3. Modeling and Simulation of Quenching and Tempering Process in steels

    NASA Astrophysics Data System (ADS)

    Deng, Xiaohu; Ju, Dongying

    Quenching and tempering (Q&T) is a combined heat treatment process used to achieve maximum toughness and ductility at a specified hardness and strength. It is important to develop a mathematical model of the quenching and tempering process that satisfies mechanical property requirements at low cost. This paper presents a modified model to predict structural evolution and hardness distribution during the quenching and tempering of steels. The model takes into account tempering parameters, carbon content, and isothermal and non-isothermal transformations. Moreover, precipitation of transition carbides, decomposition of retained austenite, and precipitation of cementite can each be simulated. Hardness distributions of the quenched and tempered workpiece are predicted by an experimental regression equation. To validate the model, it was employed to predict the tempering of 80MnCr5 steel. The predicted precipitation dynamics of transition carbides and cementite are consistent with previous experimental and simulated results from the literature. The model was then implemented within the framework of the simulation code COSMAP to simulate microstructure, stress, and distortion in the heat-treated component, and applied to simulate the Q&T process of J55 steel. The calculated results show good agreement with the experimental ones, indicating that the model is effective for simulating the Q&T process of steels.
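
    The hardness regression mentioned above is not reproduced in the abstract. As a purely illustrative sketch (assumed constants, not the paper's equation), tempering temperature and time are often collapsed into a single master variable, the Hollomon-Jaffe parameter HP = T(C + log10 t), before regressing hardness against it:

```python
import math

def hollomon_jaffe(temp_c: float, time_h: float, c: float = 20.0) -> float:
    """Hollomon-Jaffe tempering parameter HP = T*(C + log10(t)) / 1000,
    with T in kelvin and t in hours; C ~ 20 is a typical (assumed) value."""
    t_kelvin = temp_c + 273.15
    return t_kelvin * (c + math.log10(time_h)) / 1000.0

# Two schedules with similar HP, hence roughly similar tempered hardness:
hp_short = hollomon_jaffe(600.0, 1.0)   # 1 h at 600 C  -> ~17.5
hp_long = hollomon_jaffe(580.0, 4.0)    # 4 h at 580 C  -> ~17.6
```

    Schedules with similar HP values are expected to soften the steel by a similar amount, which is what makes the parameter useful as a regression input.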

  4. Computer model to simulate testing at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Owens, Lewis R., Jr.; Wahls, Richard A.; Hannon, Judith A.

    1995-01-01

    A computer model has been developed to simulate the processes involved in the operation of the National Transonic Facility (NTF), a large cryogenic wind tunnel at the Langley Research Center. The simulation was verified by comparing the simulated results with previously acquired data from three experimental wind tunnel test programs in the NTF. The comparisons suggest that the computer model simulates reasonably well the processes that determine the liquid nitrogen (LN2) consumption, electrical consumption, fan-on time, and the test time required to complete a test plan at the NTF. From these limited comparisons, it appears that the results from the simulation model are generally within about 10 percent of the actual NTF test results. The use of actual data acquisition times in the simulation produced better estimates of the LN2 usage, as expected. Additional comparisons are needed to refine the model constants. The model will typically produce optimistic results since the times and rates included in the model are typically the optimum values. Any deviation from the optimum values will lead to longer times or increased LN2 and electrical consumption for the proposed test plan. Computer code operating instructions and listings of sample input and output files have been included.

  5. Modeling and simulation: A key to future defense technology

    NASA Technical Reports Server (NTRS)

    Muccio, Anthony B.

    1993-01-01

    The purpose of this paper is to express the rationale for continued technological and scientific development of the modeling and simulation process for the defense industry. The defense industry, along with a variety of other industries, is currently being forced to make sacrifices in response to current economic hardships. These sacrifices, which must neither compromise the safety of our nation nor jeopardize its current standing as the world's peacekeeper, must be concentrated in areas that can withstand the needs of a changing world. Therefore, cost-effective alternatives for addressing defense issues must be examined. This paper argues that modeling and simulation is an economically feasible process that will ensure our nation's safety and keep pace with the future technological developments and demands of the defense industry. The outline of this paper is as follows: the introduction defines and describes the modeling and simulation process; the discussion details the purpose and benefits of modeling and simulation and provides specific examples of how the process has been successful; and the conclusion summarizes the specifics of modeling and simulation of defense issues and supports their continued use in the defense arena.

  6. Simulations and model of the nonlinear Richtmyer–Meshkov instability

    DOE PAGES

    Dimonte, Guy; Ramaprabhu, P.

    2010-01-21

    The nonlinear evolution of the Richtmyer-Meshkov (RM) instability is investigated using numerical simulations with the FLASH code in two dimensions (2D). The purpose of the simulations is to develop an empirical nonlinear model of the RM instability that is applicable to inertial confinement fusion (ICF) and ejecta formation, namely, at large Atwood number A and scaled initial amplitude kh_0 (k ≡ wavenumber) of the perturbation. The FLASH code is first validated with a variety of RM experiments that evolve well into the nonlinear regime. They reveal that bubbles stagnate when they grow by an increment of 2/k and that spikes accelerate for A > 0.5 due to higher harmonics that focus them. These results are then compared with a variety of nonlinear models that are based on potential flow. We find that the models agree with simulations for moderate values of A < 0.9 and kh_0 < 1, but not for the larger values that characterize ICF and ejecta formation. We thus develop a new nonlinear empirical model that captures the simulation results consistent with potential flow for a broader range of A and kh_0. Our hope is that such empirical models concisely capture the RM simulations and inspire more rigorous solutions.

  7. LISP based simulation generators for modeling complex space processes

    NASA Technical Reports Server (NTRS)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  8. Subgrid Scale Modeling in Solar Convection Simulations using the ASH Code

    NASA Technical Reports Server (NTRS)

    Young, Y.-N.; Miesch, M.; Mansour, N. N.

    2003-01-01

    The turbulent solar convection zone has remained one of the most challenging and important subjects in physics. Understanding the complex dynamics in the solar convection zone is crucial for gaining insight into the solar dynamo problem. Many solar observatories have generated revealing data with great detail on large-scale motions in the solar convection zone. For example, a strong differential rotation is observed: the angular rotation is faster at the equator than near the poles, not only near the solar surface but also deep in the convection zone. On the other hand, due to the wide range of dynamical scales of turbulence in the solar convection zone, both theory and simulation have had limited success. Thus, cutting-edge solar models and numerical simulations of the solar convection zone have focused more narrowly on a few key features, such as the time-averaged differential rotation. For example, Brun & Toomre (2002) report computational findings of differential rotation in an anelastic model of solar convection. A critical shortcoming of this model is that the viscous dissipation is based on an application of mixing length theory to stellar dynamics with some ad hoc parameter tuning. The goal of our work is to implement the subgrid scale model developed at CTR into the solar simulation code and examine how the differential rotation is affected as a result. Specifically, we implement a Smagorinsky-Lilly subgrid scale model into the ASH (anelastic spherical harmonic) code developed over the years by various authors. This paper is organized as follows. In Section 2 we briefly formulate the anelastic system that describes solar convection. In Section 3 we formulate the Smagorinsky-Lilly subgrid scale model for unstably stratified convection. We then present some preliminary results in Section 4, where we also provide some conclusions and future directions.
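
    The Smagorinsky-Lilly closure named above computes a local eddy viscosity nu_t = (C_s*Delta)^2 |S| from the resolved strain rate. A toy finite-difference sketch on a uniform 2D Cartesian grid (the ASH code itself is spectral and spherical; C_s = 0.17 is an assumed constant, not the value used in the paper):

```python
import numpy as np

def smagorinsky_nu_t(u, v, dx, cs=0.17):
    """Eddy viscosity nu_t = (cs*dx)**2 * |S| on a uniform 2D grid,
    where |S| = sqrt(2 S_ij S_ij) and S is the resolved strain-rate tensor."""
    dudy, dudx = np.gradient(u, dx)   # axis 0 is y, axis 1 is x
    dvdy, dvdx = np.gradient(v, dx)
    s11, s22 = dudx, dvdy
    s12 = 0.5 * (dudy + dvdx)
    s_mag = np.sqrt(2.0 * (s11**2 + s22**2 + 2.0 * s12**2))
    return (cs * dx) ** 2 * s_mag

# Uniform shear u = y, v = 0 has |S| = 1, so nu_t = (cs*dx)**2 everywhere
n, dx = 32, 0.1
y = np.arange(n) * dx
u = np.tile(y[:, None], (1, n))   # u varies along axis 0
v = np.zeros((n, n))
nu = smagorinsky_nu_t(u, v, dx)
```

    The uniform-shear case is a convenient sanity check because the analytic strain-rate magnitude is exactly 1.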

  9. A cable-driven parallel robots application: modelling and simulation of a dynamic cable model in Dymola

    NASA Astrophysics Data System (ADS)

    Othman, M. F.; Kurniawan, R.; Schramm, D.; Ariffin, A. K.

    2018-05-01

    Modeling a cable in a multibody dynamics simulation tool such that it dynamically varies in length, mass and stiffness is a challenging task. Simulation of cable-driven parallel robots (CDPR), for instance, requires a cable model that can dynamically change in length for every desired pose of the platform. Thus, in this paper, a detailed procedure for modeling and simulation of a dynamic cable model in Dymola is proposed. The approach is also applicable to other Modelica simulation environments. The cable is modeled using standard mechanical elements such as mass, spring, damper and joint. The parameters of the cable model are based on the manufacturer's factsheet and experimental results. Its dynamic ability is tested by applying it to a complete planar CDPR model whose parameters are based on a prototype named CABLAR, developed at the Chair of Mechatronics, University of Duisburg-Essen. The prototype has been developed to demonstrate an application of CDPR as a goods storage and retrieval machine. The performance of the cable model during the simulation is analyzed and discussed.
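
    The lumped modeling approach described here (mass, spring and damper elements in series) can be sketched outside Modelica as well. Below is a minimal Python stand-in: a chain of point masses joined by spring-damper elements, with hypothetical parameters rather than the CABLAR factsheet values:

```python
import numpy as np

def simulate_cable(n=5, total_mass=1.0, k=500.0, c=10.0, load=10.0,
                   dt=1e-4, steps=60000):
    """Cable as n point masses in series joined by spring-damper
    elements; the top end is fixed and a constant load pulls on the
    bottom node. Semi-implicit Euler; returns node displacements."""
    m = total_mass / n
    x = np.zeros(n)                # displacements of the free nodes
    v = np.zeros(n)
    for _ in range(steps):
        # element 0 connects the fixed anchor to node 0
        stretch = np.diff(np.concatenate(([0.0], x)))
        rate = np.diff(np.concatenate(([0.0], v)))
        f_el = k * stretch + c * rate      # tension in each element
        f = -f_el                          # element above pulls node up
        f[:-1] += f_el[1:]                 # element below pulls node down
        f[-1] += load                      # external load at the tip
        v += dt * f / m
        x += dt * v
    return x

x = simulate_cable()
# Statics check: every element carries the full load and stretches by
# load/k, so displacements grow linearly toward the tip.
```

    A real CDPR cable model would additionally vary the number or length of elements as the cable is wound in and out, which is exactly the difficulty the paper addresses.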

  10. SIMYAR: a cable-yarding simulation model.

    Treesearch

    R.J. McGaughey; R.H. Twito

    1987-01-01

    A skyline-logging simulation model designed to help planners evaluate potential yarding options and alternative harvest plans is presented. The model, called SIMYAR, uses information about the timber stand, yarding equipment, and unit geometry to estimate yarding cost and productivity for a particular operation. The costs of felling, bucking, loading, and hauling are...

  11. Nursing students' perceptions of learning after high fidelity simulation: Effects of a Three-step Post-simulation Reflection Model.

    PubMed

    Lestander, Örjan; Lehto, Niklas; Engström, Åsa

    2016-05-01

    High-fidelity simulation (HFS) has become a bridge between theoretical knowledge and practical skills. A safe and realistic environment is commonly used in nursing education to improve cognitive, affective and psychomotor abilities. Debriefing following a simulation experience provides opportunities for students to analyze and begin to reflect upon their decisions, actions and results. The nursing literature highlights the need to promote the concept of reflective practice and to assist students in reflection, and research indicates the need to refine and develop debriefing strategies, which is the focus of the current paper. To explore the value of reflections after HFS by investigating nursing students' perceptions of their learning when a Three-step Post-simulation Reflection Model is used. A qualitative descriptive research approach was applied. A Three-step Post-simulation Reflection Model that combined written and verbal reflections was used after an HFS experience in a second-year course in the Bachelor Program in Nursing at Luleå University of Technology, Sweden. Reflective texts written before and after a verbal group reflection were subjected to qualitative content analysis. The main theme in the first written reflections was identified as "Starting to act as a nurse", with the following categories: feeling stressed, inadequate and inexperienced; developing an awareness of the importance of never compromising patient safety; planning the work and prioritizing; and beginning to understand and implement nursing knowledge. The main theme in the second written reflections was identified to be "Maturing in the profession", with the following categories: appreciating colleagues, good communication and thoughtfulness; gaining increased self-awareness and confidence; and beginning to understand the profession. The Three-step Post-simulation Reflection Model fostered an appreciation of clear and effective communication. Having time for thoughtfulness and

  12. Discrete event simulation modelling of patient service management with Arena

    NASA Astrophysics Data System (ADS)

    Guseva, Elena; Varfolomeyeva, Tatyana; Efimova, Irina; Movchan, Irina

    2018-05-01

    This paper describes a simulation modeling methodology intended to aid in solving practical problems in the research and analysis of complex systems. The paper reviews simulation platforms and gives an example of simulation model development with Arena 15.0 (Rockwell Automation). The provided example of a simulation model for patient service management helps to evaluate the workload of the clinic's doctors; determine the number of general practitioners, surgeons, traumatologists and other specialized doctors required for patient service; and develop recommendations to ensure timely delivery of medical care and improve the efficiency of the clinic's operation.
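
    The kind of model Arena builds graphically (entities arriving, queueing, and being served by scarce resources) can be sketched in a few lines of plain Python. The parameters below are hypothetical, not those of the clinic in the paper:

```python
import heapq
import random

def simulate_clinic(n_doctors=3, n_patients=200, mean_interarrival=5.0,
                    mean_service=12.0, seed=42):
    """Discrete event sketch of a clinic: Poisson arrivals, exponential
    service times, n_doctors identical servers, FIFO queue. Returns the
    mean patient waiting time (minutes)."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    free_at = [0.0] * n_doctors        # time each doctor next becomes free
    heapq.heapify(free_at)
    total_wait = 0.0
    for arr in arrivals:
        earliest = heapq.heappop(free_at)   # first doctor to free up
        start = max(arr, earliest)
        total_wait += start - arr
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_service))
    return total_wait / n_patients

mean_wait = simulate_clinic()   # utilization here is 12 / (3 * 5) = 0.8
```

    Sweeping `n_doctors` against the resulting mean wait is the same staffing question the Arena model answers, just without the graphical front end.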

  13. Integrated Modeling, Mapping, and Simulation (IMMS) framework for planning exercises.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Plantenga, Todd D.

    2010-06-01

    The Integrated Modeling, Mapping, and Simulation (IMMS) program is designing and prototyping a simulation and collaboration environment for linking together existing and future modeling and simulation tools to enable analysts, emergency planners, and incident managers to more effectively, economically, and rapidly prepare, analyze, train, and respond to real or potential incidents. When complete, the IMMS program will demonstrate an integrated modeling and simulation capability that supports emergency managers and responders with (1) conducting 'what-if' analyses and exercises to address preparedness, analysis, training, operations, and lessons learned, and (2) effectively, economically, and rapidly verifying response tactics, plans and procedures.

  14. Volcano Modelling and Simulation gateway (VMSg): A new web-based framework for collaborative research in physical modelling and simulation of volcanic phenomena

    NASA Astrophysics Data System (ADS)

    Esposti Ongaro, T.; Barsotti, S.; de'Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.

    2009-12-01

    Physical and numerical modelling is becoming of increasing importance in volcanology and volcanic hazard assessment. However, new interdisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms and their implementations on modern computer architectures. Therefore new frameworks are needed for sharing knowledge, software codes, and datasets among scientists. Here we present the Volcano Modelling and Simulation gateway (VMSg, accessible at http://vmsg.pi.ingv.it), a new electronic infrastructure for promoting knowledge growth and transfer in the field of volcanological modelling and numerical simulation. The new web portal, developed in the framework of former and ongoing national and European projects, is based on a dynamic Content Manager System (CMS) and was developed to host and present numerical models of the main volcanic processes and relationships including magma properties, magma chamber dynamics, conduit flow, plume dynamics, pyroclastic flows, lava flows, etc. Model applications, numerical code documentation, simulation datasets as well as model validation and calibration test-cases are also part of the gateway material.

  15. Development of a Model for the Night Side Magnetopause using Global Simulations

    NASA Technical Reports Server (NTRS)

    Raeder, Joachim

    1998-01-01

    During the final year of this investigation we have finished several event studies that we considered necessary for the development of a tail magnetopause model and for the calibration of our simulation code. We have not reached the ultimate goal of the project, i.e., the development of an analytical tail magnetopause model. In the course of the investigation we have learned that such a model would be much more complex than we had anticipated. However, the investigations that we conducted towards this goal have led to significant results and discoveries that are of considerable value for understanding the tail magnetopause. These are summarized in the following sections.

  16. Gstat: a program for geostatistical modelling, prediction and simulation

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ascii and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
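
    The starting point of variogram modelling in gstat is the sample (empirical) semivariogram, gamma(h) = mean of 0.5*(z_i - z_j)^2 over point pairs separated by roughly distance h. A minimal numpy sketch of that computation (binning by pair distance; toy data, not a gstat interface):

```python
import numpy as np

def empirical_variogram(coords, values, n_bins=10, max_dist=None):
    """Sample semivariogram: for each distance bin, the mean of
    0.5*(z_i - z_j)**2 over point pairs whose separation falls in it.
    This is the quantity a variogram model is then fitted to."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)   # count each pair once
    d, sq = d[iu], sq[iu]
    if max_dist is None:
        max_dist = d.max()
    edges = np.linspace(0.0, max_dist, n_bins + 1)
    idx = np.digitize(d, edges) - 1
    gamma = np.array([sq[idx == b].mean() if np.any(idx == b) else np.nan
                      for b in range(n_bins)])
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, gamma

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 100.0, size=(80, 2))
z = np.sin(pts[:, 0] / 20.0) + 0.1 * rng.standard_normal(80)
h, g = empirical_variogram(pts, z)   # gamma rises with lag for correlated data
```

    Fitting a spherical or exponential model to (h, gamma) pairs like these is the interactive step gstat supports through its gnuplot interface.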

  17. Flight Simulation Model Exchange. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce

    2011-01-01

    The NASA Engineering and Safety Center Review Board sponsored an assessment of the draft Standard, Flight Dynamics Model Exchange Standard, BSR/ANSI-S-119-201x (S-119) that was conducted by simulation and guidance, navigation, and control engineers from several NASA Centers. The assessment team reviewed the conventions and formats spelled out in the draft Standard and the actual implementation of two example aerodynamic models (a subsonic F-16 and the HL-20 lifting body) encoded in the Extensible Markup Language grammar. During the implementation, the team kept records of lessons learned and provided feedback to the American Institute of Aeronautics and Astronautics Modeling and Simulation Technical Committee representative. This document contains the appendices to the main report.

  18. Flight Testing an Iced Business Jet for Flight Simulation Model Validation

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas P.; Barnhart, Billy P.; Lee, Sam; Cooper, Jon

    2007-01-01

    A flight test of a business jet aircraft with various ice accretions was performed to obtain data to validate flight simulation models developed through wind tunnel tests. Three types of ice accretions were tested: pre-activation roughness, runback shapes that form downstream of the thermal wing ice protection system, and a wing ice protection system failure shape. The high fidelity flight simulation models of this business jet aircraft were validated using a software tool called "Overdrive." Through comparisons of flight-extracted aerodynamic forces and moments to simulation-predicted forces and moments, the simulation models were successfully validated. Only minor adjustments in the simulation database were required to obtain adequate match, signifying the process used to develop the simulation models was successful. The simulation models were implemented in the NASA Ice Contamination Effects Flight Training Device (ICEFTD) to enable company pilots to evaluate flight characteristics of the simulation models. By and large, the pilots confirmed good similarities in the flight characteristics when compared to the real airplane. However, pilots noted pitch up tendencies at stall with the flaps extended that were not representative of the airplane and identified some differences in pilot forces. The elevator hinge moment model and implementation of the control forces on the ICEFTD were identified as a driver in the pitch ups and control force issues, and will be an area for future work.

  19. Budget impact analysis of thrombolysis for stroke in Spain: a discrete event simulation model.

    PubMed

    Mar, Javier; Arrospide, Arantzazu; Comas, Mercè

    2010-01-01

    Thrombolysis within the first 3 hours after the onset of symptoms of a stroke has been shown to be a cost-effective treatment because treated patients are 30% more likely than nontreated patients to have no residual disability. The objective of this study was to calculate by means of a discrete event simulation model the budget impact of thrombolysis in Spain. The budget impact analysis was based on stroke incidence rates and the estimation of the prevalence of stroke-related disability in Spain and its translation to hospital and social costs. A discrete event simulation model was constructed to represent the flow of patients with stroke in Spain. If 10% of patients with stroke from 2000 to 2015 would receive thrombolytic treatment, the prevalence of dependent patients in 2015 would decrease from 149,953 to 145,922. For the first 6 years, the cost of intervention would surpass the savings. Nevertheless, the number of cases in which patient dependency was avoided would steadily increase, and after 2006 the cost savings would be greater, with a widening difference between the cost of intervention and the cost of nonintervention, until 2015. The impact of thrombolysis on society's health and social budget indicates a net benefit after 6 years, and the improvement in health grows continuously. The validation of the model demonstrates the adequacy of the discrete event simulation approach in representing the epidemiology of stroke to calculate the budget impact.
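
    The core budget-impact logic (a constant yearly treatment cost versus savings that accumulate as avoided-dependency cases pile up) can be sketched deterministically. All parameter values below are hypothetical placeholders, not the figures from the Spanish study:

```python
def budget_impact(years, strokes_per_year, treat_rate, nnt,
                  cost_per_treatment, annual_care_cost_dependent):
    """Yearly net budget impact of thrombolysis (positive = extra cost,
    negative = saving). One in `nnt` treated patients avoids long-term
    dependency, and each avoided case saves its yearly care cost in
    every subsequent year. Illustrative logic only."""
    avoided_prevalent = 0.0
    net = []
    for _ in range(years):
        treated = strokes_per_year * treat_rate
        avoided_prevalent += treated / nnt   # dependency cases avoided so far
        cost = treated * cost_per_treatment
        savings = avoided_prevalent * annual_care_cost_dependent
        net.append(cost - savings)
    return net

# Hypothetical inputs: costs start above savings, then the growing pool
# of avoided-dependency cases flips the balance after a few years.
net = budget_impact(years=16, strokes_per_year=1000.0, treat_rate=0.10,
                    nnt=10.0, cost_per_treatment=2000.0,
                    annual_care_cost_dependent=6000.0)
```

    This reproduces the qualitative pattern the study reports, intervention costs exceeding savings early on and a widening net saving later, without the patient-level discrete event machinery.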

  20. Modelling and Simulation as a Recognizing Method in Education

    ERIC Educational Resources Information Center

    Stoffa, Veronika

    2004-01-01

    Computer animation and simulation models of complex processes and events, used as a method of instruction, can be an effective didactic device. Gaining deeper knowledge about the modelled objects helps in planning simulation experiments oriented toward the processes and events under study. Animation experiments realized on multimedia computers can aid easier…

  1. Cardiovascular Surgery Residency Program: Training Coronary Anastomosis Using the Arroyo Simulator and UNIFESP Models.

    PubMed

    Maluf, Miguel Angel; Gomes, Walter José; Bras, Ademir Massarico; Araújo, Thiago Cavalcante Vila Nova de; Mota, André Lupp; Cardoso, Caio Cesar; Coutinho, Rafael Viana dos S

    2015-01-01

    Engage the UNIFESP Cardiovascular Surgery residents in coronary anastomosis, assess their skills and certify results, using the Arroyo Anastomosis Simulator and UNIFESP surgical models. First- to sixth-year residents attended a weekly program of technical training in coronary anastomosis using 4 simulation models: 1. the Arroyo simulator; 2. a dummy with a plastic heart; 3. a dummy with a bovine heart; and 4. a dummy with a beating pig heart. The assessment test comprised 10 items, each scored on a scale from 1 to 5 points, for a maximum global score of 50 points. The technical performance of the candidates improved in all items, especially manual skill and technical progress, critical sense of the work performed, confidence in the procedure, and reduction of the time needed to perform the anastomosis after 12 weeks of practice. In response to the multiplicity of factors that currently influence cardiovascular surgeon training, there have been combined efforts to reform the practices of surgical training. 1 - The four simulator models offer a considerable contribution to the field of cardiovascular surgery, improving the skill and dexterity of the surgeon in training. 2 - Residents showed interest in the training and cooperated in the development of innovative procedures for surgical training.

  2. Development of the BIOME-BGC model for the simulation of managed Moso bamboo forest ecosystems.

    PubMed

    Mao, Fangjie; Li, Pingheng; Zhou, Guomo; Du, Huaqiang; Xu, Xiaojun; Shi, Yongjun; Mo, Lufeng; Zhou, Yufeng; Tu, Guoqing

    2016-05-01

    Numerical models are the most appropriate instrument for the analysis of the carbon balance of terrestrial ecosystems and their interactions with changing environmental conditions. The process-based model BIOME-BGC is widely used in simulation of carbon balance within vegetation, litter and soil of unmanaged ecosystems. For Moso bamboo forests, however, simulations with BIOME-BGC are inaccurate in terms of the growing season and the carbon allocation, due to the oversimplified representation of phenology. Our aim was to improve the applicability of BIOME-BGC for managed Moso bamboo forest ecosystem by implementing several new modules, including phenology, carbon allocation, and management. Instead of the simple phenology and carbon allocation representations in the original version, a periodic Moso bamboo phenology and carbon allocation module was implemented, which can handle the processes of Moso bamboo shooting and high growth during "on-year" and "off-year". Four management modules (digging bamboo shoots, selective cutting, obtruncation, fertilization) were integrated in order to quantify the functioning of managed ecosystems. The improved model was calibrated and validated using eddy covariance measurement data collected at a managed Moso bamboo forest site (Anji) during 2011-2013 years. As a result of these developments and calibrations, the performance of the model was substantially improved. Regarding the measured and modeled fluxes (gross primary production, total ecosystem respiration, net ecosystem exchange), relative errors were decreased by 42.23%, 103.02% and 18.67%, respectively. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-08-01

    Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models are subject to fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
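
    The classification step can be sketched with scikit-learn on synthetic data standing in for the 18-parameter ensemble (an assumed stand-in dataset, not the POP2 runs; the study additionally used a committee of classifiers rather than a single SVM):

```python
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in: 18 "parameter" features, ~8.5% failure rate
X, y = make_classification(n_samples=600, n_features=18, n_informative=8,
                           weights=[0.915, 0.085], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

clf = SVC(kernel="rbf", class_weight="balanced").fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.decision_function(X_te))  # area under ROC
```

    The `class_weight="balanced"` setting matters here because crashes are rare (about 8.5% of runs), so an unweighted classifier would be biased toward predicting success.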

  4. Historical droughts in Mediterranean regions during the last 500 years: a data/model approach

    NASA Astrophysics Data System (ADS)

    Brewer, S.; Alleaume, S.; Guiot, J.; Nicault, A.

    2007-06-01

    We present here a new method for comparing the output of General Circulation Models (GCMs) with proxy-based reconstructions, using time series of reconstructed and simulated climate parameters. The method uses k-means clustering to allow comparison between different periods that have similar spatial patterns, and a fuzzy logic-based distance measure in order to take reconstruction errors into account. The method has been used to test two coupled ocean-atmosphere GCMs over the Mediterranean region for the last 500 years, using an index of drought stress, the Palmer Drought Severity Index. The results showed that, whilst no model exactly simulated the reconstructed changes, all simulations were an improvement over using the mean climate, and a good match was found after 1650 with a model run that took into account changes in volcanic forcing, solar irradiance, and greenhouse gases. A more detailed investigation of the output of this model showed the existence of a set of atmospheric circulation patterns linked to the patterns of drought stress: 1) a blocking pattern over northern Europe linked to dry conditions in the south prior to the Little Ice Age (LIA) and during the 20th century; 2) a NAO-positive like pattern with increased westerlies during the LIA; 3) a NAO-negative like period shown in the model prior to the LIA, but that occurs most frequently in the data during the LIA. The results of the comparison show the improvement in simulated climate as various forcings are included and help to understand the atmospheric changes that are linked to the observed reconstructed climate changes.
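
    The k-means step (grouping years by their spatial pattern of drought stress) can be sketched in pure numpy, treating each flattened map as one sample; the toy two-pattern data below is illustrative only:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain k-means: each sample (here one year's drought field,
    flattened to a vector) is assigned to the nearest of k centroids,
    which are then recomputed as cluster means until convergence."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Toy "maps": 500 years of 25 grid cells drawn from two base patterns
rng = np.random.default_rng(1)
base = np.vstack([np.ones(25), -np.ones(25)])
fields = base[rng.integers(0, 2, size=500)] + 0.3 * rng.standard_normal((500, 25))
labels, centers = kmeans(fields, k=2)
```

    Comparing which cluster each reconstructed and simulated year falls into, rather than comparing year-by-year values, is what lets the method match periods with similar spatial patterns even when their timing differs.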

  5. NEAMS FPL M2 Milestone Report: Development of a UO₂ Grain Size Model using Multicale Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tonks, Michael R; Zhang, Yongfeng; Bai, Xianming

    2014-06-01

    This report summarizes development work funded by the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program's Fuels Product Line (FPL) to develop a mechanistic model for the average grain size in UO₂ fuel. The model is developed using a multiscale modeling and simulation approach involving atomistic simulations, as well as mesoscale simulations using INL's MARMOT code.

  6. SMI Compatible Simulation Scheduler Design for Reuse of Model Complying with Smp Standard

    NASA Astrophysics Data System (ADS)

    Koo, Cheol-Hea; Lee, Hoon-Hee; Cheon, Yee-Jin

    2010-12-01

    Software reusability is one of the key factors that impact cost and schedule in a software development project. It is also crucial in satellite simulator development, since many commercial simulator models related to satellites and dynamics exist. If these models can be reused on another simulator platform, a great deal of confidence, along with cost and schedule reductions, can be achieved. Simulation model portability (SMP) is maintained by the European Space Agency, and many models compatible with SMP and the simulation model interface (SMI) are available. The Korea Aerospace Research Institute (KARI) is developing a satellite simulator with a hardware abstraction layer (HAL) to verify satellite on-board software, and therefore wants to port these SMI-compatible models to it. To this end, a simulation scheduler has been preliminarily designed according to the SMI standard.

  7. Improving sea level simulation in Mediterranean regional climate models

    NASA Astrophysics Data System (ADS)

    Adloff, Fanny; Jordà, Gabriel; Somot, Samuel; Sevault, Florence; Arsouze, Thomas; Meyssignac, Benoit; Li, Laurent; Planton, Serge

    2017-08-01

    Estimating future sea level change in the Mediterranean remains a challenge, and previous climate modelling attempts have not reached a consensus. The low resolution of CMIP-type models prevents an accurate representation of important small-scale processes acting over the Mediterranean region. For this reason, among others, the literature has recommended high-resolution regional ocean modelling to address ongoing and future Mediterranean sea level change in response to climate change and greenhouse gas emissions. It has also been shown that eastern Atlantic sea level variability is the dominant driver of Mediterranean variability at interannual and interdecadal scales. However, long-term regional simulations of the Mediterranean Sea have so far not integrated the full sea level information from the Atlantic, which is a substantial shortcoming when analysing the Mediterranean sea level response. In the present study we analyse the different approaches followed by state-of-the-art regional climate models to simulate Mediterranean sea level variability. Additionally, we present a new simulation that incorporates improved Atlantic sea level forcing at the lateral boundary. We evaluate the skill of the different simulations in the frame of long-term hindcasts spanning 1980 to 2012, analysing sea level variability from seasonal to multidecadal scales. Results from the new simulation show a substantial improvement in the modelled Mediterranean sea level signal. This confirms that Mediterranean mean sea level is strongly influenced by Atlantic conditions, and thus suggests that the quality of the information in the lateral boundary conditions (LBCs) is crucial for good modelling of Mediterranean sea level. We also found that regional differences inside the basin, which are induced by circulation changes, are model-dependent and thus not

  8. Equation-oriented specification of neural models for simulations

    PubMed Central

    Stimberg, Marcel; Goodman, Dan F. M.; Benichoux, Victor; Brette, Romain

    2013-01-01

    Simulating biological neuronal networks is a core method of research in computational neuroscience. A full specification of such a network model includes a description of the dynamics and state changes of neurons and synapses, as well as the synaptic connectivity patterns and the initial values of all parameters. A standard approach in neuronal modeling software is to build network models based on a library of pre-defined components and mechanisms; if a model component does not yet exist, it has to be defined in a special-purpose or general low-level language and potentially be compiled and linked with the simulator. Here we propose an alternative approach that allows flexible definition of models by writing textual descriptions based on mathematical notation. We demonstrate that this approach allows the definition of a wide range of models with minimal syntax. Furthermore, such explicit model descriptions allow the generation of executable code for various target languages and devices, since the description is not tied to an implementation. Finally, this approach also has advantages for readability and reproducibility, because the model description is fully explicit, and because it can be automatically parsed and transformed into formatted descriptions. The presented approach has been implemented in the Brian2 simulator. PMID:24550820
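    The core idea of equation-oriented specification, defining a model by a textual `dx/dt = expr` description rather than a pre-compiled component, can be illustrated with a toy parser and Euler integrator. This is only a sketch: Brian2's actual syntax, unit checking, and code generation are far more sophisticated than this.

```python
import math

def simulate(eqs, params, init, dt=0.001, t_max=0.1):
    """Euler-integrate state variables whose dynamics are given as
    'dx/dt = expr [: unit]' strings; unit annotations are ignored here."""
    state = dict(init)
    rules = {}
    for line in eqs.strip().splitlines():
        lhs, rhs = line.split("=", 1)
        var = lhs.strip()[1:].split("/")[0]     # 'dv/dt' -> 'v'
        rules[var] = rhs.split(":")[0].strip()  # drop the unit annotation
    for _ in range(int(round(t_max / dt))):
        # evaluate every right-hand side in a namespace of parameters and state
        ns = {**params, **state, "exp": math.exp}
        deriv = {v: eval(e, {"__builtins__": {}}, ns) for v, e in rules.items()}
        for v in rules:
            state[v] += dt * deriv[v]
    return state
```

    For example, a leaky integrator written as `dv/dt = (I - v)/tau : 1` relaxes toward I, with no simulator-specific component library involved.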

  9. Freezing Transition Studies Through Constrained Cell Model Simulation

    NASA Astrophysics Data System (ADS)

    Nayhouse, Michael; Kwon, Joseph Sang-Il; Heng, Vincent R.; Amlani, Ankur M.; Orkoulas, G.

    2014-10-01

    In the present work, a simulation method based on cell models is used to deduce the fluid-solid transition of a system of particles that interact via a pair potential of a given functional form. The simulations are implemented under constant-pressure conditions on a generalized version of the constrained cell model. The constrained cell model is constructed by dividing the volume into Wigner-Seitz cells and confining each particle in a single cell. This model is a special case of a more general cell model, which is formed by introducing an additional field variable that controls the number of particles per cell and, thus, the relative stability of the solid against the fluid phase. High field values force configurations with one particle per cell and thus favor the solid phase. Fluid-solid coexistence on the isotherm that corresponds to a reduced temperature of 2 is determined from constant-pressure simulations of the generalized cell model using tempering and histogram reweighting techniques. The entire fluid-solid phase boundary is determined through a thermodynamic integration technique based on histogram reweighting, using the previous coexistence point as a reference point. The vapor-liquid phase diagram is obtained from constant-pressure simulations of the unconstrained system using tempering and histogram reweighting. The phase diagram of the system is found to contain a stable critical point and a triple point, as is the phase diagram of the corresponding constrained cell model.
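    Single-histogram reweighting, which underlies the techniques mentioned above, can be sketched in a few lines: data collected at inverse temperature beta0 are reweighted by exp(-(beta1 - beta0) E) to estimate averages at a nearby beta1. This is a schematic illustration of the principle, not the authors' production workflow.

```python
import math

def reweight_mean_energy(energies, counts, beta0, beta1):
    """Single-histogram reweighting: shift an energy histogram measured at
    beta0 to estimate the mean energy at beta1."""
    w = [c * math.exp(-(beta1 - beta0) * e) for e, c in zip(energies, counts)]
    z = sum(w)  # reweighted normalization
    return sum(e * wi for e, wi in zip(energies, w)) / z
```

    Because the weights grow exponentially with |beta1 - beta0|, the estimate is only reliable near the sampled temperature, which is why tempering across many state points is combined with reweighting in practice.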

  10. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats differ from heavier-than-air (HTA) vehicles in some important aspects. To account for these differences, modifications are required to the standard design tools to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper presents the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available are surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools are also presented.

  11. Pedestrians’ behavior in emergency evacuation: Modeling and simulation

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zheng, Jie-Hui; Zhang, Xiao-Shuang; Zhang, Jian-Lin; Wang, Qiu-Zhen; Zhang, Qian

    2016-11-01

    The social force model has been widely used to simulate pedestrian evacuation by analyzing the attractive, repulsive, driving, and fluctuating forces among pedestrians. Many researchers have addressed its limitations in simulating the behavior of large-scale populations. This study modifies the well-accepted social force model by considering the impact of interactions among companions, and further develops a comprehensive model by combining it with a multi-exit utility function. Numerical simulations of evacuations based on the comprehensive model are then implemented for the waiting hall of the Wulin Square subway station in Hangzhou, China. The results provide safety thresholds of pedestrian density and panic levels in different operation situations. Regardless of the operation situation and the panic level, a larger friend-group size results in lower evacuation efficiency. Our study makes important contributions to building a comprehensive multi-exit social force model and to applying it to actual scenarios, producing data that facilitate decision making in contingency planning and emergency treatment. Project supported by the National Natural Science Foundation of China (Grant No. 71471163).
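    A minimal version of the social force update, with a driving term toward an exit and an exponential inter-pedestrian repulsion, can be sketched as follows. The parameter values (v0, tau, A, B) are illustrative placeholders, and the paper's companion-interaction and multi-exit utility terms are omitted.

```python
import math

def step(pos, vel, exit_pt, dt=0.05, v0=1.4, tau=0.5, A=2.0, B=0.3):
    """One synchronous step of a minimal social-force model: each pedestrian
    relaxes toward desired speed v0 aimed at the exit, plus A*exp(-d/B)
    repulsion from every other pedestrian (uncalibrated sketch values)."""
    new_pos, new_vel = [], []
    for i, (p, v) in enumerate(zip(pos, vel)):
        ex = (exit_pt[0] - p[0], exit_pt[1] - p[1])
        d = math.hypot(*ex) or 1e-9
        # driving force: relax toward the desired velocity pointing at the exit
        f = [(v0 * ex[0] / d - v[0]) / tau, (v0 * ex[1] / d - v[1]) / tau]
        # repulsion from the other pedestrians
        for j, q in enumerate(pos):
            if j == i:
                continue
            r = (p[0] - q[0], p[1] - q[1])
            rd = math.hypot(*r) or 1e-9
            mag = A * math.exp(-rd / B)
            f[0] += mag * r[0] / rd
            f[1] += mag * r[1] / rd
        nv = (v[0] + dt * f[0], v[1] + dt * f[1])
        new_vel.append(nv)
        new_pos.append((p[0] + dt * nv[0], p[1] + dt * nv[1]))
    return new_pos, new_vel
```

    Iterating this update over a crowd, and counting how many agents have reached an exit over time, is the basic mechanism behind the evacuation-efficiency measurements described above.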

  12. Validating clustering of molecular dynamics simulations using polymer models.

    PubMed

    Phillips, Joshua L; Colvin, Michael E; Newsam, Shawn

    2011-11-14

    Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. 
To our knowledge, our framework is the first to utilize model polymers
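    The core of spectral clustering, as applied above to conformations, can be sketched on point data: build a Gaussian similarity graph, form the graph Laplacian, and split points by the sign of an estimate of the Fiedler vector. This is a toy illustration; practical pipelines for biopolymer conformations use structural distance measures and robust eigensolvers.

```python
import math

def fiedler_split(points, sigma=1.0, iters=200):
    """Toy spectral bipartition: Gaussian similarity graph -> graph Laplacian
    -> power-iteration estimate of the Fiedler vector -> sign split."""
    n = len(points)
    W = [[0.0 if i == j else
          math.exp(-sum((a - b) ** 2 for a, b in zip(p, q)) / (2 * sigma ** 2))
          for j, q in enumerate(points)] for i, p in enumerate(points)]
    deg = [sum(row) for row in W]
    L = [[(deg[i] if i == j else 0.0) - W[i][j] for j in range(n)] for i in range(n)]
    # B = c*I - L turns the second-smallest eigenvector of L (the Fiedler
    # vector) into the dominant one, once the constant vector is projected out.
    c = 2.0 * max(deg)
    v = [(-1.0) ** i for i in range(n)]
    for _ in range(iters):
        mean = sum(v) / n
        v = [x - mean for x in v]  # stay orthogonal to the constant vector
        v = [c * v[i] - sum(L[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in v)) or 1.0
        v = [x / norm for x in v]
    return [1 if x >= 0 else 0 for x in v]
```

    For two well-separated groups the similarity graph is nearly disconnected, and the sign pattern of the Fiedler vector recovers the two components.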

  13. Validating clustering of molecular dynamics simulations using polymer models

    PubMed Central

    2011-01-01

    Background Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. Results We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. Conclusions We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our knowledge, our framework is the

  14. Stochastic model for simulating Souris River Basin precipitation, evapotranspiration, and natural streamflow

    USGS Publications Warehouse

    Kolars, Kelsey A.; Vecchia, Aldo V.; Ryberg, Karen R.

    2016-02-24

    A water-balance model was developed for simulating monthly natural (unregulated) mean streamflow based on precipitation, temperature, and potential evapotranspiration at select streamflow-gaging stations. The model was calibrated using streamflow data from the U.S. Geological Survey and Environment Canada, along with natural (unregulated) streamflow data from the U.S. Army Corps of Engineers. Correlation coefficients between simulated and natural (unregulated) flows generally were high (greater than 0.8), and the seasonal means and standard deviations of the simulated flows closely matched the means and standard deviations of the natural (unregulated) flows. After calibrating the model for a monthly time step, monthly streamflow for each subbasin was disaggregated into three values per month, or an approximately 10-day time step, and a separate routing model was developed for simulating 10-day streamflow for downstream gages. The stochastic climate simulation model for precipitation, temperature, and potential evapotranspiration was combined with the water-balance model to simulate potential future sequences of 10-day mean streamflow for each of the streamflow-gaging station locations. Flood risk, as determined by equilibrium flow-frequency distributions for the dry (1912–69) and wet (1970–2011) climate states, was considerably higher for the wet state compared to the dry state. Future flood risk will remain high until the wet climate state ends, and for several years after that, because there may be a long lag time between the return of drier conditions and the onset of a lower soil-moisture storage equilibrium.
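    The water-balance idea can be illustrated with a single-bucket monthly model: storage fills with precipitation, loses evapotranspiration limited by available water, and spills runoff above capacity. This is a schematic sketch, not the calibrated Souris River Basin model; the capacity and initial storage below are arbitrary example values.

```python
def water_balance(precip, pet, capacity=100.0, s0=50.0):
    """Minimal monthly water-balance bucket: returns runoff per month.
    All units are consistent depths (e.g. mm); parameters are illustrative."""
    s, runoff = s0, []
    for p, e in zip(precip, pet):
        s += p
        aet = min(e, s)              # actual ET cannot exceed available water
        s -= aet
        r = max(0.0, s - capacity)   # storage above capacity becomes streamflow
        s -= r
        runoff.append(r)
    return runoff
```

    Feeding such a bucket with stochastically simulated precipitation and evapotranspiration sequences is what turns the climate simulation into streamflow sequences for flood-risk analysis.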

  15. Review of Real-Time Simulator and the Steps Involved for Implementation of a Model from MATLAB/SIMULINK to Real-Time

    NASA Astrophysics Data System (ADS)

    Mikkili, Suresh; Panda, Anup Kumar; Prattipati, Jayanthi

    2015-06-01

    Nowadays researchers want to develop their models in a real-time environment. Simulation tools have been widely used for the design and improvement of electrical systems since the mid-twentieth century, and their evolution has progressed in step with the evolution of computing technologies. In recent years, computing technologies have improved dramatically in performance and become widely available at a steadily decreasing cost. Consequently, simulation tools have also seen dramatic performance gains and steady cost decreases. Researchers and engineers now have access to affordable, high-performance simulation tools that were previously cost-prohibitive for all but the largest manufacturers. This work introduces a specific class of digital simulator known as a real-time simulator by answering the questions "what is real-time simulation", "why is it needed" and "how does it work". The latest trend in real-time simulation consists of exporting simulation models to FPGAs. In this article, the steps involved in implementing a model from MATLAB/SIMULINK in real-time are provided in detail.

  16. iCrowd: agent-based behavior modeling and crowd simulator

    NASA Astrophysics Data System (ADS)

    Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.

    2016-05-01

    Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos, has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high-performance and stability. Its primary goal is to deliver an abstract platform to facilitate implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) Crowd behavior simulation during [in/out] door evacuation. (ii) Non-Player Character AI for Game-oriented applications and Gamification activities. (iii) Vessel traffic modeling and simulation for Maritime Security and Surveillance applications. (iv) Urban and Highway Traffic and Transportation Simulations. (v) Social Behavior Simulation and Modeling.

  17. Analysis of the impact of simulation model simplifications on the quality of low-energy buildings simulation results

    NASA Astrophysics Data System (ADS)

    Klimczak, Marcin; Bojarski, Jacek; Ziembicki, Piotr; Kęskiewicz, Piotr

    2017-11-01

    The requirements concerning the energy performance of buildings and their internal installations, particularly HVAC systems, have been growing continuously in Poland and all over the world. The existing traditional calculation methods, which follow from a static heat-exchange model, are frequently not sufficient for a reasonable heating design of a building. Both in Poland and elsewhere, methods and software are employed that allow a detailed simulation of the heating and moisture conditions in a building, as well as an analysis of the performance of HVAC systems within it. However, these tools are usually complex and difficult to use, and developing a simulation model that adequately represents the real building is time-consuming and laborious for the designer. Simplifying the simulation model of a building makes it possible to reduce the cost of computer simulations. The paper analyses in detail the effect of introducing a number of different variants of a simulation model developed in DesignBuilder on the quality of the final results. The objective of this analysis is to find simplifications that yield simulation results with an acceptable level of deviation from the detailed model, thus facilitating a quick energy performance analysis of a given building.

  18. The cost effectiveness of tacrolimus versus microemulsified cyclosporin: a 10-year model of renal transplantation outcomes.

    PubMed

    Orme, Michelle E; Jurewicz, Wieslaw A; Kumar, Nagappan; McKechnie, Tracy L

    2003-01-01

    A Monte Carlo simulation of the model (10,000 simulations) gave an average cost at 10 years of £23,279 (SD £3,457) for cyclosporin ME and £22,841 (SD £3,590) for tacrolimus. A (second-order) probabilistic sensitivity analysis was also performed. The average cost at 10 years from a simulated cohort of 1,000 was £23,473 (SD £2,154) for cyclosporin ME and £24,087 (SD £2,025) for tacrolimus. Renal transplant recipients maintained on tacrolimus have better short- and long-term outcomes than patients maintained on cyclosporin ME. The long-term use of tacrolimus is the more cost-effective option in terms of the number of survivors, patients with a functioning graft, and rejection-free patients.
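    A Monte Carlo cost model of this kind can be sketched with a toy three-state Markov chain (functioning graft / dialysis / dead). Every probability and cost below is a hypothetical placeholder, not an input from the study; the sketch only shows how per-patient 10-year costs are accumulated over simulated transitions.

```python
import random

def ten_year_cost(p_fail=0.04, p_death=0.02, cost_graft=5000.0,
                  cost_dialysis=25000.0, n=2000, seed=1):
    """Mean 10-year cost per patient from a toy Markov cohort simulation.
    Transition probabilities and annual costs are hypothetical placeholders."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        state = "graft"
        for _ in range(10):  # ten annual cycles
            if state == "graft":
                total += cost_graft
                u = rng.random()
                if u < p_death:
                    state = "dead"
                elif u < p_death + p_fail:
                    state = "dialysis"
            elif state == "dialysis":
                total += cost_dialysis
                # assume (arbitrarily) higher mortality on dialysis
                if rng.random() < 3 * p_death:
                    state = "dead"
    return total / n
```

    Comparing two such cohorts, one per immunosuppressant arm with arm-specific transition probabilities, yields the average-cost and sensitivity figures of the kind reported above.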

  19. Photometric Modeling of Simulated Surface-Resolved Bennu Images

    NASA Astrophysics Data System (ADS)

    Golish, D.; DellaGiustina, D. N.; Clark, B.; Li, J. Y.; Zou, X. D.; Bennett, C. A.; Lauretta, D. S.

    2017-12-01

    The Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) is a NASA mission to study and return a sample of asteroid (101955) Bennu. Imaging data from the mission will be used to develop empirical surface-resolved photometric models of Bennu at a series of wavelengths. These models will be used to photometrically correct panchromatic and color base maps of Bennu, compensating for variations due to shadows and photometric angle differences, thereby minimizing seams in mosaicked images. Well-corrected mosaics are critical to the generation of a global hazard map and a global 1064-nm reflectance map that predicts LIDAR response. These data products directly feed into the selection of a site from which to safely acquire a sample. We also require photometric correction for the creation of color ratio maps of Bennu. Color ratio maps provide insight into the composition and geological history of the surface and allow for comparison to other Solar System small bodies. In advance of OSIRIS-REx's arrival at Bennu, we use simulated images to judge the efficacy of both the photometric modeling software and the mission observation plan. Our simulation software is based on USGS's Integrated Software for Imagers and Spectrometers (ISIS) and uses a synthetic shape model, a camera model, and an empirical photometric model to generate simulated images. This approach gives us the flexibility to create simulated images of Bennu based on analog surfaces from other small Solar System bodies and to test our modeling software under those conditions. Our photometric modeling software fits image data to several conventional empirical photometric models and produces the best fit model parameters. The process is largely automated, which is crucial to the efficient production of data products during proximity operations. The software also produces several metrics on the quality of the observations themselves, such as surface coverage and the
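    One conventional empirical photometric model of the kind being fitted is the Lommel-Seeliger disk function; a sketch of using it for photometric correction follows. This is illustrative only (the mission software fits several such models), and the reference angles chosen here are arbitrary examples.

```python
import math

def lommel_seeliger(i_deg, e_deg):
    """Lommel-Seeliger disk function mu0/(mu0 + mu), a conventional empirical
    limb-darkening model for dark, airless surfaces."""
    mu0 = math.cos(math.radians(i_deg))  # cosine of incidence angle
    mu = math.cos(math.radians(e_deg))   # cosine of emission angle
    return mu0 / (mu0 + mu)

def photometric_correction(i_obs, e_obs, i_ref=30.0, e_ref=0.0):
    """Multiplicative factor mapping observed reflectance to a standard
    viewing geometry, so mosaicked frames share a consistent brightness."""
    return lommel_seeliger(i_ref, e_ref) / lommel_seeliger(i_obs, e_obs)
```

    Applying such a factor pixel-by-pixel, with angles from the shape and camera models, is what removes geometry-driven brightness differences across mosaic seams.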

  20. Simulation on Poisson and negative binomial models of count road accident modeling

    NASA Astrophysics Data System (ADS)

    Sapuan, M. S.; Razali, A. M.; Zamzuri, Z. H.; Ibrahim, K.

    2016-11-01

    Accident count data have often been shown to exhibit overdispersion, and the data may also contain excess zero counts. A simulation study was conducted to create scenarios in which accidents happen at a T-junction, with the assumption that the dependent variable of the generated data follows a certain distribution, namely the Poisson or negative binomial distribution, for sample sizes from n = 30 to n = 500. The study objective was accomplished by fitting Poisson regression, negative binomial regression, and hurdle negative binomial models to the simulated data. Model validity was compared, and the simulation results show that, for each sample size, not all models fit the data well even when the data were generated from the model's own distribution, especially when the sample size is larger. Furthermore, larger sample sizes produce more zero accident counts in the dataset.
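    The data-generating step can be sketched as follows: Poisson counts via Knuth's algorithm, and negative binomial counts as a gamma-Poisson mixture, which makes the overdispersion (Var = mu + mu²/r) directly visible. This is an illustrative sketch of the simulation design, not the authors' code.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's algorithm for a Poisson(lam) variate."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_counts(n, mu, r=None, seed=0):
    """Accident counts: Poisson(mu) when r is None; otherwise a gamma-Poisson
    mixture, i.e. negative binomial with mean mu and Var = mu + mu**2/r."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        lam = mu if r is None else rng.gammavariate(r, mu / r)
        draws.append(poisson(lam, rng))
    return draws
```

    With mu = 3 and r = 1 the mixture's variance is roughly four times its mean, which is exactly the overdispersion that motivates negative binomial and hurdle models over plain Poisson regression.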

  1. An Object Model for a Rocket Engine Numerical Simulator

    NASA Technical Reports Server (NTRS)

    Mitra, D.; Bhalla, P. N.; Pratap, V.; Reddy, P.

    1998-01-01

    The Rocket Engine Numerical Simulator (RENS) is a software package that numerically simulates the behavior of a rocket engine. Different parameters of the components of an engine are the inputs to these programs; given these parameters, the programs output the behaviors of those components. These behavioral values are then used to guide the design of, or to diagnose, a model of a rocket engine "built" by composing these programs simulating different components of the engine system. To use this software package effectively, one needs a flexible model of a rocket engine into which the programs simulating different components can be plugged. Our project is to develop an object-based model of such an engine system. We are following an iterative and incremental approach in developing the model, as is standard practice in object-oriented analysis and design of software. This process involves three stages: object modeling to represent the components and sub-components of a rocket engine, dynamic modeling to capture the temporal and behavioral aspects of the system, and functional modeling to represent the transformational aspects. This article reports on the first phase of our activity under a grant (RENS) from the NASA Lewis Research Center. We have utilized Rumbaugh's object modeling technique and the UML notation for this purpose. The classes of a rocket engine propulsion system are developed and some of them are presented in this report. The next step, developing a dynamic model for RENS, is also touched upon here. In this paper we also discuss the advantages of using object-based modeling for developing this type of integrated simulator over other tools such as an expert system shell or a procedural language, e.g., FORTRAN; attempts have been made in the past to use such techniques.

  2. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

    A simulation model developed to provide a probabilistic analysis tool for studying the various space transportation system abort mode situations is presented. The model is based on Monte Carlo simulation of an event-tree diagram that accounts for events during the space transportation system's ascent and its abort modes. It considers just the propulsion elements of the shuttle system (i.e., the external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
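    The event-tree Monte Carlo approach can be sketched as below. Like the paper's demonstration runs, every number here (failure probability, abort windows, abort success rates) is an illustrative placeholder, not an official estimate.

```python
import random

def abort_outcomes(n=10000, p_engine_fail=0.01, seed=2):
    """Toy event-tree Monte Carlo for ascent aborts: draw whether a failure
    occurs, when in the ascent it occurs, and whether the abort mode active
    in that window succeeds. All parameter values are hypothetical."""
    rng = random.Random(seed)
    counts = {"nominal": 0, "RTLS": 0, "TAL": 0, "ATO": 0, "LOV": 0}
    # (window start, window end, abort mode, P(abort succeeds)) as fractions
    # of the ascent timeline -- placeholder values only.
    windows = [(0.0, 0.3, "RTLS", 0.90),
               (0.3, 0.7, "TAL", 0.95),
               (0.7, 1.0, "ATO", 0.99)]
    for _ in range(n):
        if rng.random() >= p_engine_fail:
            counts["nominal"] += 1
            continue
        t = rng.random()  # failure time as a fraction of ascent
        for lo, hi, mode, p_ok in windows:
            if lo <= t < hi:
                counts[mode if rng.random() < p_ok else "LOV"] += 1
                break
    return counts
```

    Tallying outcomes over many trials estimates both how often each abort mode is invoked and how often it completes successfully, which is the quantity of interest in the study.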

  3. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation

    PubMed Central

    Sherfey, Jason S.; Soplata, Austin E.; Ardid, Salva; Roberts, Erik A.; Stanley, David A.; Pittman-Polletta, Benjamin R.; Kopell, Nancy J.

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community. PMID:29599715

  4. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation.

    PubMed

    Sherfey, Jason S; Soplata, Austin E; Ardid, Salva; Roberts, Erik A; Stanley, David A; Pittman-Polletta, Benjamin R; Kopell, Nancy J

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community.

  5. Progress in Modeling and Simulation of Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, John A

    2016-01-01

    Modeling and simulation of batteries, in conjunction with theory and experiment, are important research tools that offer opportunities for advancement of technologies that are critical to electric vehicles. The development of data from the application of these tools can provide the basis for managerial and technical decision-making. Together, these will continue to transform batteries for electric vehicles. This collection of nine papers presents the modeling and simulation of batteries and the continuing contribution being made to this impressive progress, including topics that cover: thermal behavior and characteristics; battery management system design and analysis; moderately high-fidelity 3D capabilities; and optimization techniques and durability. As electric vehicles continue to gain interest from manufacturers and consumers alike, improvements in economy and affordability, as well as adoption of alternative fuel sources to meet government mandates, are driving battery research and development. Progress in modeling and simulation will continue to contribute to battery improvements that deliver increased power, energy storage, and durability to further enhance the appeal of electric vehicles.

  6. Further Investigations of Gravity Modeling on Surface-Interacting Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2009-01-01

    A vehicle simulation is "surface-interacting" if the state of the vehicle (position, velocity, and acceleration) relative to the surface is important. Surface-interacting simulations perform ascent, entry, descent, landing, surface travel, or atmospheric flight. The dynamics of surface-interacting simulations are influenced by the modeling of gravity. Gravity is the sum of gravitation and the centrifugal acceleration due to the world's rotation. Both components are functions of position relative to the world's center, and that position for a given set of geodetic coordinates (latitude, longitude, and altitude) depends on the world model (world shape and dynamics). Thus, gravity fidelity depends on the fidelities of the gravitation model and the world model and on their interaction. A surface-interacting simulation cannot treat gravitation separately from the world model. This paper examines the actual performance of different pairs of world and gravitation models (or direct gravity models) on the travel of a subsonic civil transport in level flight under various starting conditions.
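    The gravity decomposition described above (gravitation plus a centrifugal term from the world's rotation) can be sketched for a spherical Earth; the constants are standard published values, not figures from the paper, and the spherical approximation is deliberately cruder than the world models the paper compares.

```python
# Illustrative sketch: effective gravity = gravitation - centrifugal relief,
# for a rotating spherical Earth. Constants are standard reference values.
import math

GM = 3.986004418e14     # m^3/s^2, Earth's gravitational parameter
R = 6.378137e6          # m, equatorial radius (spherical approximation)
OMEGA = 7.2921159e-5    # rad/s, Earth's rotation rate

def effective_gravity(lat_deg, alt_m=0.0):
    """Radial component of gravity at latitude lat_deg (spherical Earth)."""
    r = R + alt_m
    lat = math.radians(lat_deg)
    gravitation = GM / r**2
    centrifugal = OMEGA**2 * r * math.cos(lat)**2  # component along -r
    return gravitation - centrifugal

print(round(effective_gravity(0.0), 3))   # equator: maximum centrifugal relief
print(round(effective_gravity(90.0), 3))  # pole: no centrifugal relief
```

    Even this toy model shows why the pairing matters: changing the world model (here, `R` and `OMEGA`) changes gravity even if the gravitation model is held fixed.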

  7. Combining experimental and simulation data of molecular processes via augmented Markov models.

    PubMed

    Olsson, Simon; Wu, Hao; Paul, Fabian; Clementi, Cecilia; Noé, Frank

    2017-08-01

    Accurate mechanistic description of structural changes in biomolecules is an increasingly important topic in structural and chemical biology. Markov models have emerged as a powerful way to approximate the molecular kinetics of large biomolecules while keeping full structural resolution in a divide-and-conquer fashion. However, the accuracy of these models is limited by that of the force fields used to generate the underlying molecular dynamics (MD) simulation data. Whereas the quality of classical MD force fields has improved significantly in recent years, remaining errors in the Boltzmann weights are still on the order of a few [Formula: see text], which may lead to significant discrepancies when comparing to experimentally measured rates or state populations. Here we take the view that simulations using a sufficiently good force-field sample conformations that are valid but have inaccurate weights, yet these weights may be made accurate by incorporating experimental data a posteriori. To do so, we propose augmented Markov models (AMMs), an approach that combines concepts from probability theory and information theory to consistently treat systematic force-field error and statistical errors in simulation and experiment. Our results demonstrate that AMMs can reconcile conflicting results for protein mechanisms obtained by different force fields and correct for a wide range of stationary and dynamical observables even when only equilibrium measurements are incorporated into the estimation process. This approach constitutes a unique avenue to combine experiment and computation into integrative models of biomolecular structure and dynamics.
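    As a rough sketch of the reweighting idea (not the authors' AMM estimator, which also treats kinetics), the following Python toy biases simulation state weights so that the reweighted average of an observable matches an experimental value, in the maximum-entropy spirit the abstract describes. The `reweight` helper and its bisection scheme are inventions for illustration.

```python
# Conceptual sketch: exponentially bias state weights, w_i ∝ exp(lam * obs_i),
# choosing the Lagrange multiplier lam so the weighted mean hits the target.
import math

def reweight(obs, target, lam_lo=-50.0, lam_hi=50.0, iters=100):
    """Return normalized weights whose observable average equals target."""
    def avg(lam):
        w = [math.exp(lam * o) for o in obs]
        z = sum(w)
        return sum(wi * oi for wi, oi in zip(w, obs)) / z
    for _ in range(iters):  # bisection on the monotone map lam -> avg(lam)
        mid = 0.5 * (lam_lo + lam_hi)
        if avg(mid) < target:
            lam_lo = mid
        else:
            lam_hi = mid
    lam = 0.5 * (lam_lo + lam_hi)
    w = [math.exp(lam * o) for o in obs]
    z = sum(w)
    return [wi / z for wi in w]

obs = [0.0, 1.0, 2.0]          # observable value in each simulation state
w = reweight(obs, target=1.5)  # experiment says the mean should be 1.5
```

    This mirrors the paper's premise: the sampled states are kept as-is, and only their weights are adjusted a posteriori to respect the measurement.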

  8. USMC Inventory Control Using Optimization Modeling and Discrete Event Simulation

    DTIC Science & Technology

    2016-09-01

    release. Distribution is unlimited. USMC Inventory Control Using Optimization Modeling and Discrete Event Simulation, by Timothy A. Curling. …optimization and discrete-event simulation. This construct can potentially provide an effective means of improving order management decisions. However…

  9. Influence of plasticity models upon the outcome of simulated hypervelocity impacts

    NASA Astrophysics Data System (ADS)

    Thomas, John N.

    1994-07-01

    This paper describes the results of numerical simulations of aluminum-upon-aluminum impacts, performed with the CTH hydrocode to determine the effect of plasticity formulations upon the final perforation size in the targets. The targets were 1 mm and 5 mm thick plates, and the projectiles were 10 mm by 10 mm right circular cylinders. Both targets and projectiles were represented as 2024 aluminum alloy. The hydrocode simulations were run in a two-dimensional cylindrical geometry. Normal impacts at velocities between 5 and 15 km/s were simulated. Three isotropic yield stress models were explored in the simulations: an elastic-perfectly-plastic model and the Johnson-Cook and Steinberg-Guinan-Lund viscoplastic models. The fracture behavior was modeled by a simple tensile pressure criterion. The simulations show that the three strength models resulted in only minor differences in the final perforation diameter. The simulation results were used to construct an equation to predict the final hole size resulting from impacts on thin targets.
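    The Johnson-Cook model named in the abstract has a standard multiplicative flow-stress form, sketched below in Python. The constants are illustrative values typical of a 2024-like aluminum alloy, not the parameters used in the paper's CTH runs.

```python
# Hedged sketch of the standard Johnson-Cook flow-stress form:
# sigma = (A + B*eps^n) * (1 + C*ln(eps_rate/eps0)) * (1 - T*^m)
# Constants below are illustrative, not the paper's calibration.
import math

def johnson_cook(eps_p, eps_rate, T, A=265e6, B=426e6, n=0.34,
                 C=0.015, m=1.0, eps0=1.0, T_room=293.0, T_melt=775.0):
    """Yield stress (Pa) vs plastic strain, strain rate, and temperature."""
    strain_term = A + B * eps_p**n
    rate_term = 1.0 + C * math.log(max(eps_rate / eps0, 1e-12))
    T_star = (T - T_room) / (T_melt - T_room)  # homologous temperature
    thermal_term = 1.0 - max(T_star, 0.0)**m
    return strain_term * rate_term * thermal_term

s1 = johnson_cook(0.05, 1.0, 293.0)  # modest strain, room temperature
s2 = johnson_cook(0.20, 1.0, 293.0)  # hardens with strain
s3 = johnson_cook(0.20, 1.0, 700.0)  # softens approaching the melt point
```

    The competing Steinberg-Guinan-Lund model has a different rate-dependence structure; the paper's finding is that, for perforation diameter, these differences mattered little.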

  10. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    USGS Publications Warehouse

    Wang, A.; Moore, J.C.; Cui, Xingquan; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D.M.; McGuire, A.D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-01-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indices). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types, and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperature and its seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost can be made.
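    The strictest diagnostic the abstract mentions (ground temperature at or below 0 °C for 24 consecutive months) is simple to state precisely. A small Python sketch, with made-up temperature series rather than model output from the study:

```python
# Sketch of the 24-consecutive-month permafrost diagnostic: a grid cell
# counts as permafrost if any run of 24 straight months stays <= 0 °C.

def is_permafrost(monthly_temps_c, window=24):
    """True if any `window` consecutive months are all at or below 0 °C."""
    run = 0
    for t in monthly_temps_c:
        run = run + 1 if t <= 0.0 else 0
        if run >= window:
            return True
    return False

cold = [-5.0] * 36                        # permanently frozen cell
seasonal = ([-10.0] * 9 + [2.0] * 3) * 3  # thaws for three months each year
print(is_permafrost(cold), is_permafrost(seasonal))  # → True False
```

    The demand this places on the models is clear from the code: a single warm-biased summer month anywhere in the window resets the run, so both the annual mean and the seasonal cycle of ground temperature must be right.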

  11. A Variable Resolution Stretched Grid General Circulation Model: Regional Climate Simulation

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.; Takacs, Lawrence L.; Govindaraju, Ravi C.; Suarez, Max J.

    2000-01-01

    The development of, and results obtained with, a variable-resolution stretched-grid GCM for the regional climate simulation mode are presented. The global variable-resolution stretched grid used in the study has enhanced horizontal resolution over the U.S. as the area of interest. The stretched-grid approach is an ideal tool for representing regional-to-global scale interactions. It is an alternative to the widely used nested-grid approach introduced over a decade ago as a pioneering step in regional climate modeling. The major results of the study are presented for the successful stretched-grid GCM simulation of the anomalous climate event of the 1988 U.S. summer drought. The straightforward (with no updates) two-month simulation is performed with 60 km regional resolution. The major drought fields, patterns, and characteristics, such as the time-averaged 500 hPa heights, precipitation, and the low-level jet over the drought area, appear to be close to the verifying analyses for the stretched-grid simulation. In other words, the stretched-grid GCM provides an efficient downscaling over the area of interest with enhanced horizontal resolution. It is also shown that the GCM skill is sustained throughout the simulation extended to one year. The stretched-grid GCM, developed and tested in a simulation mode, is a viable tool for regional and subregional climate studies and applications.
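    The stretched-grid idea (fine spacing over the area of interest, smoothly coarsening away from it) can be illustrated in one dimension; the blending function and spacing values below are inventions for this sketch, not the GCM's actual grid generator.

```python
# Toy 1D stretched grid: spacing blends from `fine` near the region of
# interest to `coarse` far away, via a Gaussian weight. Illustrative only.
import math

def stretched_grid(max_nodes, center=0.5, fine=0.01, coarse=0.05, width=0.2):
    """Return nodes on [0, 1] with finer spacing near `center`."""
    x, nodes = 0.0, [0.0]
    while x < 1.0 and len(nodes) <= max_nodes:
        d = abs(x - center)
        w = math.exp(-(d / width) ** 2)   # 1 near the center, -> 0 far away
        dx = w * fine + (1.0 - w) * coarse
        x = min(1.0, x + dx)
        nodes.append(x)
    return nodes

g = stretched_grid(200)
near = min(b - a for a, b in zip(g, g[1:]))  # finest spacing on the grid
far = max(b - a for a, b in zip(g, g[1:]))   # coarsest spacing on the grid
```

    Unlike a nested grid, there is no abrupt resolution jump: the spacing varies continuously, which is the property the abstract credits for clean regional-to-global interaction.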

  12. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    USDA-ARS?s Scientific Manuscript database

    When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  13. Uterus models for use in virtual reality hysteroscopy simulators.

    PubMed

    Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias

    2009-05-01

    Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.

  14. A Geostationary Earth Orbit Satellite Model Using Easy Java Simulation

    ERIC Educational Resources Information Center

    Wee, Loo Kang; Goh, Giam Hwee

    2013-01-01

    We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic…
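    The "simple constant angular velocity" physics behind a geostationary orbit reduces to balancing gravity against the centripetal requirement at Earth's rotation rate. A short Python sketch (the EJS model itself is Java; constants are standard reference values):

```python
# Geostationary radius from GM / r^2 = omega^2 * r, i.e. r = (GM/omega^2)^(1/3).
import math

GM = 3.986004418e14      # m^3/s^2, Earth's gravitational parameter
T_SIDEREAL = 86164.0905  # s, Earth's sidereal rotation period

omega = 2.0 * math.pi / T_SIDEREAL
r_geo = (GM / omega**2) ** (1.0 / 3.0)  # orbital radius from Earth's center
altitude_km = (r_geo - 6.371e6) / 1e3   # height above mean Earth radius

print(round(r_geo / 1e3))  # → 42164 (km from Earth's center)
```

    Note the sidereal day, not the 86400 s solar day, is what keeps the satellite fixed over one longitude.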

  15. Analyzing Interaction Patterns to Verify a Simulation/Game Model

    ERIC Educational Resources Information Center

    Myers, Rodney Dean

    2012-01-01

    In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…

  16. Proposed best practice for projects that involve modelling and simulation.

    PubMed

    O'Kelly, Michael; Anisimov, Vladimir; Campbell, Chris; Hamilton, Sinéad

    2017-03-01

    Modelling and simulation has been used in many ways when developing new treatments. To be useful and credible, it is generally agreed that modelling and simulation should be undertaken according to some kind of best practice. A number of authors have suggested elements required for best practice in modelling and simulation. Elements that have been suggested include the pre-specification of goals, assumptions, methods, and outputs. However, a project that involves modelling and simulation could be simple or complex and could be of relatively low or high importance to the project. It has been argued that the level of detail and the strictness of pre-specification should be allowed to vary, depending on the complexity and importance of the project. This best practice document does not prescribe how to develop a statistical model. Rather, it describes the elements required for the specification of a project and requires that the practitioner justify in the specification the omission of any of the elements and, in addition, justify the level of detail provided about each element. This document is an initiative of the Special Interest Group for modelling and simulation. The Special Interest Group for modelling and simulation is a body open to members of Statisticians in the Pharmaceutical Industry and the European Federation of Statisticians in the Pharmaceutical Industry. Examples of a very detailed specification and a less detailed specification are included as appendices. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Population models and simulation methods: The case of the Spearman rank correlation.

    PubMed

    Astivia, Oscar L Olvera; Zumbo, Bruno D

    2017-11-01

    The purpose of this paper is to highlight the importance of a population model in guiding the design and interpretation of simulation studies used to investigate the Spearman rank correlation. The Spearman rank correlation has been known for over a hundred years to applied researchers and methodologists alike and is one of the most widely used non-parametric statistics. Still, certain misconceptions can be found, either explicitly or implicitly, in the published literature because a population definition for this statistic is rarely discussed within the social and behavioural sciences. By relying on copula distribution theory, a population model is presented for the Spearman rank correlation, and its properties are explored both theoretically and in a simulation study. Through the use of the Iman-Conover algorithm (which allows the user to specify the rank correlation as a population parameter), simulation studies from previously published articles are explored, and it is found that many of the conclusions purported in them regarding the nature of the Spearman correlation would change if the data-generation mechanism better matched the simulation design. More specifically, issues such as small sample bias and lack of power of the t-test and r-to-z Fisher transformation disappear when the rank correlation is calculated from data sampled where the rank correlation is the population parameter. A proof for the consistency of the sample estimate of the rank correlation is shown as well as the flexibility of the copula model to encompass results previously published in the mathematical literature. © 2017 The British Psychological Society.
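    The sample statistic at issue is just the Pearson correlation computed on ranks. A minimal Python sketch (no tie handling, which the paper's copula-based population treatment does consider):

```python
# Spearman rank correlation as the Pearson correlation of ranks (ties ignored).

def ranks(xs):
    """Rank each value 1..n by sort order (assumes no ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mean = (n + 1) / 2.0  # mean of the ranks 1..n
    num = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    den = sum((a - mean) ** 2 for a in rx)  # rank variance is the same for x and y
    return num / den

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 100.0, 1000.0, 10000.0]  # monotone in x
print(spearman(x, y))  # → 1.0
```

    The example also shows why the statistic is invariant under monotone transformations of either variable, which is central to defining its population counterpart via copulas.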

  18. Development of full regeneration establishment models for the forest vegetation simulator

    Treesearch

    John D. Shaw

    2015-01-01

    For most simulation modeling efforts, the goal of model developers is to produce simulations that represent reality as faithfully as possible. Achieving this goal commonly requires a considerable amount of data to set the initial parameters, followed by validation and model improvement – both of which require even more data. The Forest Vegetation Simulator (FVS...

  19. cellGPU: Massively parallel simulations of dynamic vertex models

    NASA Astrophysics Data System (ADS)

    Sussman, Daniel M.

    2017-10-01

    Vertex models represent confluent tissue by polygonal or polyhedral tilings of space, with the individual cells interacting via force laws that depend on both the geometry of the cells and the topology of the tessellation. This dependence on the connectivity of the cellular network introduces several complications to performing molecular-dynamics-like simulations of vertex models, and in particular makes parallelizing the simulations difficult. cellGPU addresses this difficulty and lays the foundation for massively parallelized, GPU-based simulations of these models. This article discusses its implementation for a pair of two-dimensional models, and compares the typical performance that can be expected between running cellGPU entirely on the CPU versus its performance when running on a range of commercial and server-grade graphics cards. By implementing the calculation of topological changes and forces on cells in a highly parallelizable fashion, cellGPU enables researchers to simulate time- and length-scales previously inaccessible via existing single-threaded CPU implementations. Program files DOI: http://dx.doi.org/10.17632/6j2cj29t3r.1. Licensing provisions: MIT. Programming language: CUDA/C++. Nature of problem: simulations of off-lattice "vertex models" of cells, in which the interaction forces depend on both the geometry and the topology of the cellular aggregate. Solution method: highly parallelized GPU-accelerated dynamical simulations in which the force calculations and the topological features can be handled on either the CPU or GPU. Additional comments: the code is hosted at https://gitlab.com/dmsussman/cellGPU, with documentation additionally maintained at http://dmsussman.gitlab.io/cellGPUdocumentation.
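    The per-cell energy in standard 2D vertex models penalizes deviations of cell area and perimeter from targets, E = K_A(A - A0)² + K_P(P - P0)². A Python sketch of that geometry-dependent part (cellGPU itself is CUDA/C++; parameter values here are illustrative):

```python
# Per-cell vertex-model energy: E = KA*(A - A0)^2 + KP*(P - P0)^2,
# with cell area from the shoelace formula. Illustrative parameters.
import math

def cell_energy(vertices, A0=1.0, P0=3.8, KA=1.0, KP=1.0):
    """Energy of one polygonal cell given its vertices in order."""
    n = len(vertices)
    area = 0.0
    perim = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += 0.5 * (x1 * y2 - x2 * y1)  # shoelace formula
        perim += math.hypot(x2 - x1, y2 - y1)
    return KA * (area - A0) ** 2 + KP * (perim - P0) ** 2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]  # area 1, perimeter 4
print(round(cell_energy(square), 3))  # → 0.04
```

    What this sketch omits is exactly what makes parallelization hard: the forces also depend on the network topology, which changes discretely as cells exchange neighbors.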

  20. Simulation of Mercury's magnetosheath with a combined hybrid-paraboloid model

    NASA Astrophysics Data System (ADS)

    Parunakian, David; Dyadechkin, Sergey; Alexeev, Igor; Belenkaya, Elena; Khodachenko, Maxim; Kallio, Esa; Alho, Markku

    2017-08-01

    In this paper we introduce a novel approach for modeling planetary magnetospheres that involves a combination of the hybrid model and the paraboloid magnetosphere model (PMM); we further refer to it as the combined hybrid model. While both of these individual models have been successfully applied in the past, their combination enables us both to overcome the traditional difficulty hybrid models have in developing a self-consistent magnetic field and to compensate for the lack of plasma simulation in the PMM. We then use this combined model to simulate Mercury's magnetosphere and investigate the geometry and configuration of Mercury's magnetosheath controlled by various conditions in the interplanetary medium. The developed approach provides a unique comprehensive view of Mercury's magnetospheric environment for the first time. Using this setup, we compare the locations of the bow shock and the magnetopause as determined by simulations with the locations predicted by stand-alone PMM runs and also verify the magnetic and dynamic pressure balance at the magnetopause. We also compare the results produced by these simulations with observational data obtained by the magnetometer on board the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft along a dusk-dawn orbit and discuss the signatures of the magnetospheric features that appear in these simulations. Overall, our analysis suggests that combining the semiempirical PMM with a self-consistent global kinetic model creates new modeling possibilities which individual models cannot provide on their own.
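    The magnetopause pressure balance the authors verify equates solar-wind dynamic pressure ρv² with magnetic pressure B²/(2μ₀). A Python sketch of that balance; the upstream density and speed below are generic illustrative values for the inner heliosphere, not MESSENGER measurements.

```python
# Magnetopause pressure balance: rho * v^2 = B^2 / (2 * mu0).
# Upstream values are illustrative, not data from the paper.
import math

MU0 = 4.0e-7 * math.pi   # T*m/A, vacuum permeability
M_P = 1.67262192e-27     # kg, proton mass

def dynamic_pressure(n_cm3, v_kms):
    """Solar-wind ram pressure (Pa) from proton density and bulk speed."""
    rho = n_cm3 * 1e6 * M_P          # protons/cm^3 -> kg/m^3
    return rho * (v_kms * 1e3) ** 2

def magnetic_pressure(B_nT):
    """Magnetic pressure (Pa) of a field given in nanotesla."""
    return (B_nT * 1e-9) ** 2 / (2.0 * MU0)

p_dyn = dynamic_pressure(40.0, 400.0)            # illustrative upstream values
B_balance = math.sqrt(2.0 * MU0 * p_dyn) * 1e9   # nT needed to stand off flow
```

    Solving the balance for B gives the field strength a simulated magnetopause must carry at the standoff point, which is one of the consistency checks applied to the combined model.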